Verb Errors in 5th-Grade English Learners’ Written Responses: Relation to Writing Quality

The ability to express oneself through written language is a critically important skill for long-term educational, emotional, and social success. Despite the importance of writing, however, English Learner students continue to perform at or below basic levels, which warrants additional efforts to identify specific areas of weakness that affect writing quality. To that end, this study describes the effect of verb accuracy on writing quality ratings of 5th-grade written expository samples. The study examines the responses of 243 5th-grade students who differed in English proficiency: 112 English Learners and 131 English-proficient students. Verb error patterns in the written samples of English Learner students are described and compared to those of their monolingual English-proficient peers. Group differences were examined in verb accuracy, types of verb errors, and overall grammaticality, and a regression analysis was used to examine verb accuracy as a predictor of writing quality. Findings showed that English Learner students produced more verb errors than their English-proficient peers and that the total number of verb errors was a significant predictor of writing quality ratings.


Introduction
The ability to express oneself through written language is a critically important skill, not only for academic writing but also for successful social and professional endeavors. Written language comprises several components and draws on a combination of conceptual and linguistic skills (Penner-Williams et al. 2009; Pierangelo and Giuliani 2006). Kellogg (2001) describes writing as a cognitive process that requires memory, thinking, and verbal capacity in order to think intellectually and linguistically at the same time while expressing and elaborating on one's ideas. As students mature, the complexity of their writing increases; they transition from focusing on the mechanics of the process (e.g., letter formation and use of conventions of Standard English) in the early elementary grades to focusing on cohesively conveying ideas in the upper elementary grades. However, despite the importance of writing development, students continue to perform at or below basic levels in academic writing, a trend more common in English Learner (EL) populations.
The Institute of Education Sciences uses the term English Learners to refer to students whose home language is not English and whose English language proficiency hinders their ability to meet grade-level expectations. In many countries around the world, competence in two or more languages has steadily become the norm; in fact, evidence shows that bilingual or multilingual individuals outnumber monolinguals worldwide (Bhatia and Ritchie 2012). In the United States, for example, there has been a 1.5 percent increase in the number of ELs, which, as of 2016, stood at 4.9 million students across the nation (NCES 2016). The number of students from culturally and linguistically diverse (CLD) backgrounds continues to rise as changes in the modern world present new demands for learning additional languages (Genesee 2004).

Writing in a Second Language
It has been argued that learning to write proficiently may be one of the most challenging aspects of learning a second language (L2) (Arfé et al. 2016; Celaya 2019; Crossley and McNamara 2012; Figueiredo 2019). Figueiredo (2019) examined differences in writing performance among 99 immigrant English as a foreign language (EFL) students in Portugal and found that the most commonly observed difficulties L2 learners face when writing relate to vocabulary and grammar. In light of this finding, poor writing may result from challenges with lexical retrieval in the L2, overlaps between the first language (L1) and the L2, oral grammar skills, and spelling skills (Arfé et al. 2016; Figueiredo 2019). Additionally, Matuchniak et al. (2014) note that, when attempting to write proficiently, ELs must not only distribute attention across the cognitive, linguistic, communicative, contextual, textual, and affective demands all writers face, but are also challenged by cultural influences and the limits of their developing second language. Although the writing process may be challenging for ELs, Cumming (1990) noted the importance of writing for L2 acquisition, as it draws learners' attention to linguistic forms while they create meaning in their text. Cumming's work elaborates on how writing requires the learner to attend to form-meaning relations that may prompt revision of their linguistic expression, demonstrating control over their linguistic knowledge. Aligned with Cumming's views, Williams (2012) adds that written measures of language provide a unique view of ELs' proficiency levels, affording the learner time to pause and revise their output. Therefore, evaluating EL written language samples may provide valuable information about L2 proficiency, as the task affords ELs the opportunity to apply and practice language skills at all stages of language learning.
Additionally, proficient writing necessitates learning the syntactic and grammatical rules of a language. Syntactic and grammatical accuracy in writing indicate successful L2 acquisition, as they measure the student's ability to appropriately and effectively express their ideas in the L2 (Polio and Lee 2017). Grammatical accuracy in writing represents control and dominance of the rules and structures of a language and warrants comprehensible output and appropriate language usage.
Grammatical accuracy has also been found to affect quality ratings of written responses. Generally, students with higher L2 proficiency are more likely to produce high-quality writing. Studies have found that L2 essays with higher quality ratings contain more complex grammar, also referred to as morphosyntactic complexity (Crossley and McNamara 2014; Ortega 2015), and more sophisticated lexical items (Crossley and McNamara 2012; Kyle and Crossley 2016). Schoonen et al. (2011) reported that ELs with more grammatical knowledge express themselves more clearly and accurately in their writing, yielding higher-quality writing. While quality ratings can vary across tasks and rubrics, Guo et al. (2013) found text length, the use of past participle verbs, lexical sophistication, and third-person singular verbs to be strong predictors of high-quality writing. They also found that essays containing grammatical errors, specifically errors in verb forms, are more likely to receive low quality ratings. Additionally, Figueiredo (2019) found that written samples produced by young L2 learners were simply structured, with limited complex sentences and poor command of grammar. Altogether, these findings substantiate that grammatical errors undermine coherence and consequently hinder the ability to communicate ideas effectively through text (Fareed et al. 2016; Guo et al. 2013).

Grammatical Accuracy
The majority of studies exploring the grammatical accuracy of monolingual and EL students compare their morphosyntactic acquisition rates. Morphosyntax refers to grammatical categories that have both morphological and syntactic properties. Studies have provided evidence that ELs often fall behind their monolingual counterparts in terms of morphosyntax (Gathercole 2002a; Gutiérrez-Clellen et al. 2008; Paradis 2005; Pearson 2002). Differences between monolinguals and ELs have been explored, but results have varied depending on language skills and tasks (e.g., Gathercole 2002a, 2002b; Gutiérrez-Clellen et al. 2008; Paradis and Genesee 1996). Studies have explored aspects of grammar related to the use of complex syntax (Pearson 2002), noun quantity (Gathercole 2002b), subject omissions (Yip and Matthews 2000), grammatical gender (Gathercole et al. 2002; Gathercole and Thomas 2005), morphosyntactic accuracy (Pearson 2002), and tense-marking morphemes (Jacobson and Schwartz 2005; Paradis 2010). Overall, ELs have been shown to demonstrate higher rates of grammatical errors for a longer period of time than their monolingual, English-proficient counterparts (Döpke 2000; Gutiérrez-Clellen et al. 2008).
To examine grammatical accuracy in EL writing, the grammatical structures of interest were narrowed, and verb errors became the particular focus of this study. Errors in grammatical morphology involving verb usage have not been comprehensively researched in the school-age population. In the seminal work of Dulay and Burt (1974), more than two hundred 6- to 8-year-old ELs were evaluated on their accuracy in the use of 14 grammatical morphemes, including tense and non-tense morphemes. They found that certain tense morphemes, such as third-person singular -s, were used less accurately by EL students. In alignment with these findings, longitudinal case studies exploring grammatical morphology in ELs have also found errors with third-person singular -s and past-tense -ed (Hakuta 1978; Lakshmanan 1994) and omission of the BE auxiliary (Haznedar 2001). Research also suggests that ELs tend to overgeneralize tense markers (e.g., growed for grew) more than their similar-aged monolingual peers (Jacobson and Schwartz 2005). In a case study of a 12-year-old Spanish-speaking student, Mellow (2008) found that the learner gradually produced different English constructions of increasing complexity. These results, along with a limited number of others (e.g., Crossley and McNamara 2014), may imply that students eventually produce correct forms in their writing, but the evidence on acquisition appears to vary by learner and task. Watcharapunyawong and Usaha (2013) explored the writing errors of Thai EFL adult students across different text types and found verb errors, specifically subject-verb agreement and tense errors, to be the most frequent. Similarly, several researchers investigating the quality of EFL students' L2 writing have found the most frequent errors to involve determiners, subject-verb agreement, and tenses (Nayan and Jusoff 2009; Wu and Garza 2014).

Research Aims
Despite this knowledge, limited research specifically examines the verb error patterns of school-aged ELs and how verb errors may directly affect the quality of their writing. If more were known about these patterns, instructional decisions aimed at improving ELs' writing skills could be better informed earlier in their education, helping to avoid long-term effects. In response to these gaps in the literature, the current study uses authentic classroom-based assessments of writing administered to all 5th-grade students to answer the following research questions:

1. What verb error patterns do 5th-grade EL students demonstrate in their writing in English?
2. Are there differences in error patterns between EL 5th-grade students and their English-proficient monolingual peers?
3. How does the rate of verb errors relate to quality ratings of 5th-grade written samples? Specifically, do the proportion of verb errors and overall grammaticality predict ratings of writing quality?

Materials and Methods
For this study, the investigators used data gathered as part of a larger study examining the writing skills of students in the 5th grade. This project was approved by the university human subjects committee (HCS # 2018.25857). Due to the purpose of this study and the time-intensive nature of transcribing and coding written samples, this study included writing samples for a subset of randomly selected ELs and English-proficient monolingual students.

Participants
This sample of 243 5th-grade students consisted of 131 girls and 112 boys from 65 inclusive classrooms in 34 elementary schools in a large school district. Descriptive information on participants, including race, eligibility for free and reduced-price lunch, and home language, is provided in Table 1. This study focused on English Learner (EL) students from varied linguistic backgrounds. The term EL was used by the participating district to refer to students for whom a language other than English is most relied upon for communication at home. Of 662 students who spoke another language at home and had proficiency data, 335 (51%) were classified as EL and were currently enrolled in ESOL support services. ELs with identified exceptionalities, as reported by the school districts, were excluded from this study, as were students who were absent on the day the writing sample was completed or who completed the writing sample in their home language (e.g., Spanish). Consequently, 112 EL writing samples met the criteria for this study. A comparison group of English-proficient monolinguals was created through the random selection of 131 monolingual students who participated in the writing task. Students in this group spoke only English at home and had no identified exceptionalities.

English Vocabulary and Reading Comprehension Skills
Subtests of the iReady assessment battery were administered to participants to provide descriptive information on their English vocabulary and reading comprehension skills (e.g., comprehension of informational texts, comprehension of literature, and reading combined). iReady is a computer-delivered, interim adaptive assessment for students in kindergarten through 12th grade, built on College- and Career-Ready Standards (Curriculum Associates 2018). Items follow a four-option multiple-choice design and are scored dichotomously as correct or incorrect using a Rasch model. Ability estimates are updated based on students' performance on a randomly selected set of five items at a starting difficulty level, and subsequent items are adjusted higher or lower in difficulty accordingly. Estimated reliability for overall reading is 0.97, and test-retest reliability is reported to be 0.85-0.86 for the 3rd-5th grades. See Table 2 for participant scores.

Data Collection
Language and literacy skills were assessed during the first eight weeks of the school year as part of the district assessment procedures. Classroom teachers administered the computer-adaptive standardized assessment in the computer center or media center of their school. English Language Arts instructors administered the expository writing task classroom-wide as part of the district's mandatory curriculum-based assessment measure. For the writing task used in this study, classroom teachers distributed a packet in October of the school year containing two written passages about the benefits of exercise, directions for the task, a planning sheet, and lined paper. Students were given 120 min to read the passages and write a response in English. Researchers collected students' written language samples from classroom teachers and requested data from the participating district on students' reading comprehension and vocabulary performance, including scores on iReady (Curriculum Associates 2018), a computer-adaptive assessment administered district-wide.

Writing Instrument
The current study used a curriculum-based measure of writing that was used by the partnering school district and administered to all students. The writing prompt set writers a dual purpose: to inform the reader about the benefits of fitness and to persuade the reader to consider those benefits. The directions instructed students to read two passages, plan a response explaining how fitness can contribute to unexpected outcomes, write the response, and revise and edit it. The first passage pertained to unexpected outcomes of fitness. The second passage, about the benefits of fitness for an individual who was blind, was seven paragraphs long (one and a half pages double-spaced). Students were given 120 min to compose a written response to the prompt. Since writing in response to a passage is common practice in district and statewide assessments, the use of such a measure (as opposed to a researcher-created measure) offers the advantage of being readily recognizable and interpretable by general educators familiar with this practice. Additionally, the construct validity of the measure is supported by its alignment with accountability and classroom progress monitoring measures currently used in all 3rd-5th-grade classrooms throughout the district.

Writing Measures

Grammatical Accuracy
The decision to include a measure of grammatical accuracy was based on a number of previous findings suggesting that measures of correct writing sequences were sensitive to student achievement and progress over time (Dockrell et al. 2015;Malecki and Jewell 2003). As such, a broad measure of accuracy based on the proportion of errors was included in the current study. This is consistent with other sources that have reported grammaticality as a proportion of utterances with grammatical errors (Eisenberg and Guo 2013). Grammatical accuracy was calculated by adding the total number of grammatical errors and dividing by the number of sentences in the written response.
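As a minimal sketch of the accuracy measure described above (with made-up counts, not the study's data; the study computed this ratio per writing sample by hand-coding, not with code):

```python
def grammatical_error_rate(total_errors: int, num_sentences: int) -> float:
    """Broad grammatical accuracy measure: total grammatical errors
    divided by the number of sentences in the written response.

    Hypothetical helper for illustration only."""
    if num_sentences == 0:
        raise ValueError("a sample must contain at least one sentence")
    return total_errors / num_sentences

# e.g., a response with 6 grammatical errors across 12 sentences
grammatical_error_rate(6, 12)  # 0.5 errors per sentence
```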

Verb Errors
To narrow down the specific grammatical error patterns of interest for this study, three error codes were created: VE:O, for verb omissions; VE:T, for verb tense errors; and VE:A, for verb agreement errors. Instances where a student omitted a verb from a sentence, such as "he overweight", were coded as [VE:O]. If the student used a verb but not the correct tense, for example, "yesterday he run fast" instead of "yesterday he ran fast," the sentence was coded as [VE:T]. Finally, if a sentence did not follow subject-verb agreement rules, for example, "he like running" instead of "he likes running," it was coded as [VE:A]. All other errors (e.g., spelling) were ignored for this study.

Writing Quality
Quality rating scores were based on the rubric adopted by the district, which was consistent with the state assessment. The rubric was used to score the written samples on three categories of quality: (a) purpose, focus, and organization; (b) evidence and elaboration; and (c) conventions of Standard English. These elements are consistent with components found in established scoring systems such as the Wechsler Objective Language Dimensions (Rust 1996; Wechsler 2005) and previous studies (e.g., Williams et al. 2013).
For the first category of writing quality, students' writing samples were scored on purpose, focus, and organization. To achieve the maximum of 4 points in this category, the student's written response demonstrated a strong idea with little or no loosely related material; skillful use of transitions; use of grade-level vocabulary to convey ideas; and a logical progression of ideas, including an introduction and conclusion. For the second category, investigators scored the writing samples on the inclusion of evidence and elaboration. To obtain the maximum of 4 points, students integrated evidence thoroughly and smoothly using appropriate vocabulary and sentence structure. It is important to note that scores on both categories 1 and 2 are influenced by students' vocabulary knowledge and their ability to cohesively elaborate and convey main ideas. Finally, for the third category, investigators rated students' writing on use of conventions of Standard English. To receive a full 2-point rating in this category, students' responses demonstrated appropriate use of punctuation, capitalization, sentence formation, and spelling.
Finally, a composite score was calculated as the sum of the three components. This overall quality of writing rating was aligned with state assessment procedures. As a composite score, the total writing quality rubric score (on a 10-point scale) is purported to reflect original thought, use of text evidence, inferences, implicit understanding, and synthesizing across texts.
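The composite described above is simply the sum of the three category scores. A minimal sketch (hypothetical helper and scores, not the study's data):

```python
def composite_quality(purpose_focus_org: int, evidence_elab: int,
                      conventions: int) -> int:
    """Sum the three rubric categories into the 10-point composite:
    categories 1 and 2 are scored 0-4 and category 3 is scored 0-2,
    as described in the rubric above. Illustrative only."""
    assert 0 <= purpose_focus_org <= 4
    assert 0 <= evidence_elab <= 4
    assert 0 <= conventions <= 2
    return purpose_focus_org + evidence_elab + conventions

composite_quality(3, 4, 2)  # 9 out of a possible 10
```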

Procedures
All writing samples were transcribed verbatim (maintaining errors) by the investigators and trained research assistants. The Systematic Analysis of Language Transcripts (SALT) software was used to code and analyze the writing samples. Each sample underwent two rounds of error coding. For the first round, the samples were divided among five undergraduate students who had completed training on the codes of interest and demonstrated proficiency in coding each sample for errors. For the second round, each transcript was coded by the first author and a research assistant (RA); during this round, the samples were checked for coding accuracy, and any missing or incorrect codes were adjusted. SALT was then used to provide descriptive measures for the frequency of each error code, the total number of errors per student, and the total number of T-units and words per writing sample. To ensure coding reliability, 25% of the final coded writing samples were randomly selected and double coded by the first author and the RA. Intraclass correlation (ICC) estimates for each error code and their 95% confidence intervals were calculated using SPSS Statistics software. For the individual verb error codes of omission, tense, and agreement, inter-rater agreement was 89%, 88%, and 91%, respectively. As reported by Koo and Li (2016), ICC values between 0.75 and 0.9 indicate good reliability.
For human ratings of writing quality, two raters who were blind to the students' characteristics scored the written samples on each of the three categories of writing quality. The two raters were certified teachers employed by the partnering school district. They held graduate degrees and had worked as writing resource teachers and writing academic coaches for the partnering district. The raters had completed extensive district-provided training on the writing rubric, passed an assessment of writing training, and attended monthly training meetings including regular online scoring courses to recalibrate.
A randomly selected subsample was independently double rated by both raters, blind to each other's scores. When any point difference was treated as a disagreement, inter-rater agreement was 70%, 77.5%, and 67.5% for the three quality subcomponents, respectively. This was above the 60% criterion for ratings of writing quality in published reviews (Graham and Perin 2007). When agreement was defined as a point difference of no greater than 1, similar to previous studies (e.g., Koutsoftas and Gray 2012; Koutsoftas 2016), inter-rater agreement of 100%, 100%, and 97.5% was attained.
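The two agreement criteria above (exact match versus within one point) can be sketched as follows; the rater scores below are hypothetical, not the study's data:

```python
def percent_agreement(rater_a, rater_b, tolerance=0):
    """Percentage of double-rated samples on which two raters agree.

    tolerance=0 counts only exact matches as agreement; tolerance=1
    treats scores within one point of each other as agreement,
    mirroring the within-1-point criterion described above."""
    if len(rater_a) != len(rater_b):
        raise ValueError("raters must score the same samples")
    agree = sum(abs(a - b) <= tolerance for a, b in zip(rater_a, rater_b))
    return 100 * agree / len(rater_a)

a = [3, 4, 2, 3, 1]  # hypothetical scores from rater A
b = [3, 3, 2, 4, 3]  # hypothetical scores from rater B
percent_agreement(a, b)               # exact agreement: 40.0
percent_agreement(a, b, tolerance=1)  # within one point: 80.0
```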

Verb Error Patterns
To answer the first research question, which examined verb error patterns of ELs in the 5th grade, we report descriptive statistics on students' rates of verb errors specifically. Table 3 shows descriptive statistics on the types of verb error patterns made by EL students and their English-proficient monolingual peers. The verb errors are presented as the ratio of errors per T-unit, as transcribed in the SALT software, to account for differing sample lengths. As displayed in Table 3, ELs demonstrated a higher frequency of verb errors across all types, with tense errors being the most common verb error type. The descriptive statistics also show that over 50% of the grammatical errors made by ELs were verb errors, compared with less than 25% for English-proficient monolinguals.

Group Differences in Verb Errors
To answer the second research question, we conducted an analysis of variance (ANOVA) comparing the frequency of verb errors in the writing samples of the two language-proficiency groups (i.e., monolinguals and ELs). The frequency of verb errors differed significantly between the two groups, F(1, 154) = 29.218, p < 0.0001, with a Cohen's d effect size of 0.69. On average, EL students produced more verb errors per T-unit in their writing than their monolingual peers. To further examine the second question, we analyzed differences between verb error types across the two groups. As displayed in Table 4, there were significant differences between the two groups in the frequency of verb agreement errors.
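The effect size reported above can be sketched with the standard pooled-SD formula for Cohen's d; the group values below are illustrative, not the study's data:

```python
from statistics import mean, stdev

def cohens_d(group1, group2):
    """Cohen's d: difference in group means divided by the pooled
    sample standard deviation. Illustrative implementation only."""
    n1, n2 = len(group1), len(group2)
    pooled_sd = (((n1 - 1) * stdev(group1) ** 2 +
                  (n2 - 1) * stdev(group2) ** 2) / (n1 + n2 - 2)) ** 0.5
    return (mean(group1) - mean(group2)) / pooled_sd

# hypothetical verb-errors-per-T-unit values for two groups
el = [0.4, 0.5, 0.6, 0.7]
mono = [0.1, 0.2, 0.3, 0.4]
cohens_d(el, mono)  # large positive d; identical groups give d = 0
```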

Verb Errors and Writing Quality
To answer research question three, we first analyzed descriptive data related to student writing quality outcomes by group (see Table 4). We then calculated Pearson correlation coefficients to assess the relationship between the rate and type of verb errors present in 5th-grade writing samples and composite writing quality scores. There was a significant negative relationship between overall verb errors and composite scores (r = −0.25, p < 0.0001). Each type of verb error also demonstrated a significant negative relationship with composite writing quality scores, as displayed in Table 5. To further examine the third question, Pearson correlation coefficients were calculated to assess the relationship between the type of verb errors and each category of quality used in the composite writing scores. Verb tense errors demonstrated significant negative relationships with all three components of writing quality ratings, r = −0.22 (p = 0.001); r = −0.20 (p = 0.002); and r = −0.16 (p = 0.02), respectively. Verb agreement errors had a significant negative relationship with ratings of purpose, focus, and organization (r = −0.13, p = 0.05). Further, verb omission errors had a significant negative relationship with ratings of conventions (r = −0.15, p = 0.02). See Table 6 for additional correlations.

Note. Categories 1, 2, and 3 are the individual scores used for the composite writing quality score. ** Correlation is significant at the 0.01 level. * Correlation is significant at the 0.05 level.
To further examine the third research question, measures of grammatical accuracy in writing (the proportion of verb errors to the number of T-units per sample, and overall grammaticality) were entered into a regression analysis. The ratio of verb errors to length was a significant predictor of students' writing quality (refer to Table 7): for every unit increase in the number of errors, the composite writing quality score is predicted to be lower by 0.80 points. With both factors entered (proportion of verb errors to length, and overall grammaticality), the model accounted for a significant but small amount (6%) of the variance in composite writing quality scores, F(2, 228) = 8.78, p < 0.0001.
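The correlation and slope estimates above follow the textbook formulas sketched below; the paired values are illustrative toy data, not the study's dataset (where the corresponding slope was −0.80):

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

def ols_slope(x, y):
    """Least-squares slope of y on x: the predicted change in
    composite quality per unit increase in the verb-error ratio."""
    mx, my = mean(x), mean(y)
    return (sum((a - mx) * (b - my) for a, b in zip(x, y)) /
            sum((a - mx) ** 2 for a in x))

# hypothetical error ratios and quality scores with a negative relation
errors = [0.0, 0.5, 1.0, 1.5]
quality = [8.0, 7.0, 6.0, 5.0]
pearson_r(errors, quality)  # -1.0 (perfectly negative in this toy data)
ols_slope(errors, quality)  # -2.0 points per unit increase in errors
```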

Discussion
The purpose of this study was to examine and describe verb errors in 5th-grade students' written responses and to determine whether verb errors in writing significantly influenced overall ratings of writing quality. Data analyses revealed three key findings. First, EL students made verb errors in their written responses more often than their English-proficient monolingual peers; in particular, they made significantly more verb tense errors, and their overall grammatical accuracy was also lower. Second, the proportion of overall grammatical errors attributable to verb errors also differed significantly between the two groups, with verb errors accounting for a larger share of ELs' grammatical errors. Third, with regard to writing quality, there was a small but significant negative correlation between overall verb errors and writing quality ratings, and the total number of verb errors per T-unit was a significant predictor of writing quality ratings.

Verb Errors in Writing
The current findings align with previous studies indicating persistent gaps in grammatical accuracy in the written language of EL students. As expected from previous results (e.g., Haznedar 2001; Watcharapunyawong and Usaha 2013), the presence of correct, varied tense-marking morphemes was limited in the EL writing samples. For example, several verb errors in EL students' writing samples involved irregular past tense verbs (e.g., "he win many medals" instead of "he won . . . "). Consistent with previous research (e.g., Jacobson and Schwartz 2005), EL students were more prone than English-proficient students to overgeneralize the past tense -ed (e.g., "run-ed" for "ran" and "go-ed" for "went"), which is common in younger typically developing English-speaking children (Pence Turnbull and Justice 2016). The overgeneralization of past tense -ed may also indicate insufficient knowledge of, or access to, irregular past tense forms. Several samples also omitted BE auxiliary verbs, as in "he skiing all day long" (omission of "was"). Several EL students also demonstrated inconsistent use of third-person singular -s in their written responses, being more likely to omit the -s from their verbs (e.g., "he play basketball").

Group Differences in Verb Errors
In line with previous literature examining the accuracy and acquisition of different grammatical aspects of language (e.g., Döpke 2000; Gathercole 2002a, 2002b; Gutiérrez-Clellen et al. 2008), ELs in the current study also demonstrated higher rates of grammatical errors than their English-proficient peers. This study highlights that more than half (54%) of the grammatical errors demonstrated by ELs were attributable to verb accuracy. While errors have been found to be related to knowledge gaps in the language (Ellis 1997; Polio and Lee 2017), the current study outlines the specific gaps between ELs and English-proficient students pertaining to verb knowledge. Although the cause of the observed group differences cannot be determined by the current design, the higher rates of verb errors in ELs' writing may be partially attributable to the increased cognitive demands of writing in a second language (Kellogg 2008; Ortega 2009) and of negotiating the grammatical rules of each language.

Relation between Verb Errors and Writing Quality Ratings
The findings pertaining to the relation between writing quality ratings and grammaticality are in line with previous studies (e.g., Schoonen et al. 2011; Ortega 2015). Similar to the findings of Guo et al. (2013), high occurrences of grammatical errors in written responses were associated with low quality ratings in the present study, which may reflect reduced clarity and coherence of the writing. The current findings on verb errors as predictors of writing quality align closely with previous studies (e.g., Guo et al. 2013; Kim and Crossley 2018), substantiating that grammatical errors related to verb form contribute to low quality ratings. This finding is concerning considering that rubric-based writing quality ratings are commonly used in many school districts, state assessments, and high-stakes testing as a core metric in the evaluation of writing achievement and advancement (Wagner et al. 2011). In light of the essential role of writing quality in academic success, the current findings substantiate the importance of addressing verb errors and support the need for intensified instructional efforts focused on them.

Limitations
Results should be interpreted cautiously, recognizing that only one written response was collected per student. It cannot be assumed that the same findings would result with additional samples or if the task elicited specific verb forms. Because the task was an open-ended written response, students may have avoided using specific verbs or verb forms they were uncertain about. Although it was beyond the scope of the current study to examine factors that influenced verb selection, prior research supports the notion that ELs are likely to rely on high-frequency verbs (Durrant and Brenchley 2019) and early developing or nonspecific verbs (Hasselgren 1994).
It should be noted that although this study focused on three verb error types, the measures used in the current study were not intended to serve as a proxy for all types of errors or grammatical knowledge broadly. We recognize that other types of grammatical errors, not categorized in the current study, may have been present and/or prevalent. Although the measures selected may account for only a portion of the grammatical errors demonstrated, it was not feasible in this exploratory study to categorize every potential type of grammatical error. As such, further study is planned to examine other types of errors within the students' responses.
Finally, due to the limited information related to EL students' English language skills at the time of the writing assessment, these findings should be interpreted cautiously. Additional information related to length of English exposure for each student, current English speaking, writing, and reading skills, and socioeconomic status would substantiate the findings of this study. It is important to note that the abovementioned factors may influence students' writing experience and consequently the number of grammatical errors present in student writing.

Implications
Despite the limitations, the results of the current study offer insights to inform instruction. The findings point to areas that educational personnel may target for intensive instructional support, including grammatical accuracy, explicit instruction of grammatical markers related to verb tenses and subject-verb agreement rules, and the importance of high-quality writing. Additionally, the group differences suggest that we cannot assume that all students have sufficient supports to acquire the grammatical rules of English and construct grammatically correct sentences in academic writing. As such, explicit instructional activities addressing verb tenses and subject-verb agreement may be necessary in the context of academic writing to further support effective use of written English in the classroom and in real-life settings, although further empirical study is needed.
As noted in previous literature, adult EL students have expressed a desire for further grammatical knowledge and skills relating to verb tenses and subject-verb agreement. Addressing these needs earlier in second language learning, through explicit instruction and multiple practice opportunities in the classroom, may offset future difficulties in writing. Given that the rubric used in the current study is based on common classroom writing assessments, the finding that verb errors significantly predict lower writing quality ratings supports incorporating a more explicit focus on verb knowledge in the classroom.
Although unsurprising, the negative relation found between verb errors and overall writing quality underscores the importance of addressing these errors early in language learning. As high-quality writing samples tend to feature more complex grammar and morphosyntax, mastering tense markers and overall grammatical accuracy earlier should positively impact writing quality. Given that academic writing strongly influences one's future academic, social, and emotional well-being, addressing factors that affect the quality of writing, such as verb accuracy, early and explicitly should be paramount when developing classroom instruction.

Future Research
Further studies are needed to expand on this research by examining and describing a greater range of grammatical errors. It is also important to consider cultural factors that may influence how certain errors manifest in writing. For example, the omission of BE verbs could be attributable to dialect or language transfer. The types of errors demonstrated by ELs may also be partially influenced by differences in the grammatical rules of students' heritage language, specifically the rules for verb conjugation and agreement. In addition, the use of different writing tasks could provide a fuller depiction of writing capabilities. While argumentative writing warrants more complex syntactic structures than narrative tasks do, it would still be beneficial to understand student abilities across tasks. Comparing students' written and oral language errors may also better illustrate language proficiency. Finally, future studies should consider providing descriptive information on the types of verbs, verb frequency, and the number of verbs produced by students to further quantify student verb knowledge and accuracy.

Conclusions
Given the importance of becoming a skilled writer, the findings of the current study provide evidence to support the need to identify effective instructional strategies to improve English writing and grammar acquisition for EL students. In light of the fact that EL students made significantly more verb tense errors and demonstrated lower grammatical accuracy than their peers, it is paramount to focus instructional efforts on promoting EL students' academic success. Finally, as academic standards underscore the importance of early writing proficiency, instruction should focus on ensuring that EL students have a grasp of English grammatical rules. Specific focus is needed on tense marking, which was shown to influence ratings of overall writing quality in the current study. While quality ratings take several aspects of writing into consideration, the negative impact of verb errors on writing quality ratings in this study highlights the need to identify and address knowledge gaps that EL students may have pertaining to verb form and function.
Author Contributions: Conceptualization, K.F. and C.W.; investigation, C.W.; writing, original draft preparation, K.F.; writing, review and editing, K.F. and C.W.; funding acquisition, K.F. All authors have read and agreed to the published version of the manuscript.

Informed Consent Statement: Informed consent was obtained from all subjects involved in the study.

Data Availability Statement: The data presented in this study are not available due to the nature of this research. Participants did not agree to have their data shared publicly.