Article

Verb Errors in 5th-Grade English Learners’ Written Responses: Relation to Writing Quality

School of Communication Science and Disorders, Florida State University, Tallahassee, FL 32303, USA
*
Author to whom correspondence should be addressed.
Languages 2021, 6(2), 71; https://doi.org/10.3390/languages6020071
Submission received: 6 January 2021 / Revised: 30 March 2021 / Accepted: 6 April 2021 / Published: 9 April 2021

Abstract
The ability to express oneself through written language is a critically important skill for long-term educational, emotional, and social success. However, despite the importance of writing, English Learner students continue to perform at or below basic levels, which warrants additional efforts to identify specific areas of weakness that impact writing quality. To that end, this study aims to describe the effect of verb accuracy on writing quality ratings of 5th-grade written expository samples. This study examines the responses of 243 students in the 5th grade who differed in English proficiency. The sample included 112 English Learners and 131 English-proficient students. Verb error patterns in written samples by English Learner students are described and compared to the patterns of their monolingual English-proficient peers. Group differences were examined in verb accuracy, types of verb errors, and overall grammaticality. A regression analysis was used to examine verb accuracy as a predictor of writing quality. Findings showed that English Learner students demonstrated more verb errors than their English-speaking peers, and the total number of verb errors was a significant predictor of writing quality ratings.

1. Introduction

The ability to express oneself through written language is a critically important skill, not only for the purpose of academic writing, but also for successful social and professional endeavors. Written language draws on several components as well as a combination of conceptual and linguistic skills (Penner-Williams et al. 2009; Pierangelo and Giuliani 2006). Kellogg (2001) describes writing as a cognitive process that requires memory, thinking, and verbal capacity, engaging the writer intellectually and linguistically at the same time in order to express and elaborate on ideas. As students mature, the complexity of their writing increases; they transition from focusing on the mechanics of the process (e.g., letter formation and use of conventions of Standard English) in the early elementary grades to focusing on cohesively conveying ideas in the upper elementary grades. However, despite the importance of writing development, we continue to see trends in academic writing where students perform at or below basic levels, a trend more common among English Learner (EL) populations.
The Institute of Education Sciences has established use of the term English Learners to refer to students whose home language is not English and whose English language proficiency hinders their ability to meet expectations for students at their grade level. In many countries around the world, competence in two or more languages has steadily become the norm. In fact, evidence shows that there are more bilingual or multilingual individuals around the world than there are monolinguals (Bhatia and Ritchie 2012). In the United States, for example, there has been a 1.5 percent increase in the number of ELs, which reached 4.9 million students nationwide as of 2016 (NCES 2016). The number of students from culturally and linguistically diverse (CLD) backgrounds continues to rise as changes in the modern world present new demands for learning additional languages (Genesee 2004). Such demands arise from, but are not limited to, the rise in global communication through the internet, the growing number of global businesses and organizations, and voluntary immigration of people seeking new opportunities in different countries.
In the United States, ELs have historically demonstrated gaps in academic performance and achievement when compared to their monolingual, English-proficient counterparts. Students’ writing achievement in upper elementary grades warrants further study because core competencies and universal standards for grade-level expectations focus more heavily on academic writing skills in those grades. Although there is wide variability in the rate of writing development, students are expected to have attained proficiency in academic writing by the 5th grade. For example, under Florida’s Benchmarks for Excellent Student Thinking (B.E.S.T.) Standards for English Language Arts (ELA), students should be able to effectively compose narrative, argumentative, and expository writing types by the 5th grade. Florida’s B.E.S.T. Standards for ELA were developed by Florida literacy experts, educators, and stakeholders to outline expected student achievement. ELs, in particular, are at risk for poor writing achievement, especially given the increasingly challenging standards of successive grade levels in upper elementary school. According to a nationally representative measure of academic achievement trends in U.S. elementary and secondary schools, EL students face considerable challenges in writing, trailing their English-proficient peers by 56 points at grade 12 (NCES 2012). In that assessment, only 1% of ELs performed at or above the proficient level of writing, compared to the 31% of English-proficient students who scored at or above proficient levels. Finally, 80% of ELs scored below basic writing skills compared to 19% of their English-proficient peers. Given these disparities, it is paramount that we focus our efforts on addressing EL academic writing skills in order to improve chances of long-term success for all students.
Writing is seen as a gatekeeper for college admissions, as most colleges require essays in their applications, and as a skill relevant to hiring and promotion in most career paths (Olson et al. 2017). Therefore, failure to identify and address performance gaps, specifically in academic writing, early on can have detrimental long-term consequences not only for ELs’ academic success but also for their social and economic well-being (National Commission on Writing for America’s Families, Schools, and Colleges 2004; Olson et al. 2017).

1.1. Writing in a Second Language

It has been argued that learning to write proficiently may be one of the most challenging aspects of learning a second language (L2) (Arfé et al. 2016; Celaya 2019; Crossley and McNamara 2012; Figueiredo 2019). Figueiredo (2019) examined differences in writing performance among 99 immigrant English as a foreign language (EFL) students in Portugal and found that the most commonly observed difficulties L2 learners face when writing relate to vocabulary and grammar. Considering this finding, poor writing may result from challenges with lexical retrieval in the L2, overlaps between the first language (L1) and the L2, oral grammar skills, and spelling skills (Arfé et al. 2016; Figueiredo 2019). Additionally, Matuchniak et al. (2014) note that, when attempting to write proficiently, ELs not only have to distribute attention across multiple demands (the cognitive, linguistic, communicative, contextual, textual, and affective challenges all writers face) but are also challenged by cultural influences and the limitations of second language development. Although the writing process may be challenging for ELs, Cumming (1990) noted the importance of writing for L2 acquisition, as it draws learners’ attention to linguistic forms while they create meaning in their text. Cumming’s work elaborates on how writing requires the learner to pay attention to form-meaning relations that may prompt them to revise their linguistic expression, demonstrating control over their linguistic knowledge. Aligned with Cumming’s views, Williams (2012) adds that written measures of language provide a unique view of ELs’ proficiency levels, affording the learner the time to pause and revise their output. Therefore, evaluating EL written language samples may provide valuable information regarding L2 proficiency, as the task affords ELs the opportunity to apply and practice language skills at all stages of language learning.
Additionally, proficient writing necessitates learning the syntactic and grammatical rules of a language. Syntactic and grammatical accuracy in writing indicate successful L2 acquisition, as they measure the student’s ability to appropriately and effectively express their ideas in the L2 (Polio and Lee 2017). Grammatical accuracy in writing represents control and mastery of the rules and structures of a language and supports comprehensible output and appropriate language usage.
Grammatical accuracy has also been found to impact quality ratings of written responses. Generally, students with higher L2 proficiency are more likely to produce high-quality writing. Studies have found that L2 essays with higher quality ratings contain more complex grammar, also referred to as morphosyntactic complexity (Crossley and McNamara 2014; Ortega 2015), and sophisticated lexical items (Crossley and McNamara 2012; Kyle and Crossley 2016). Schoonen et al. (2011) reported that ELs with more grammatical knowledge express themselves more clearly and accurately in their writing, yielding higher-quality writing. While quality ratings can vary across tasks and rubrics, Guo et al. (2013) found text length, the use of past participle verbs, lexical sophistication, and third-person singular verbs to be strong predictors of high-quality writing. They also found that essays containing grammatical errors, specifically related to verb forms, are more likely to receive low quality ratings. Additionally, Figueiredo (2019) found that the written samples provided by young L2 learners were simply structured, contained few complex sentences, and showed poor command of grammar. Altogether, the findings substantiate that grammatical errors negatively impact coherence and consequently affect the ability to effectively communicate ideas through text (Fareed et al. 2016; Guo et al. 2013).

1.2. Grammatical Accuracy

The majority of the studies exploring grammatical accuracy of monolingual and EL students compare their morphosyntactic acquisition rates. Morphosyntax refers to grammatical categories that have both morphological and syntactic properties. Studies have provided evidence that ELs often fall behind their monolingual counterparts in terms of morphosyntax (Gathercole 2002a; Gutiérrez-Clellen et al. 2008; Paradis 2005; Pearson 2002). Differences between monolinguals and ELs have been explored, but results have varied depending on language skills and tasks (e.g., Gathercole 2002a, 2002b; Gutiérrez-Clellen et al. 2008; Paradis and Genesee 1996). Studies have explored aspects of grammar related to use of complex syntax (Pearson 2002), noun quantity (Gathercole 2002b), subject omissions (Yip and Matthews 2000), grammatical gender (Gathercole et al. 2002; Gathercole and Thomas 2005), morphosyntactic accuracy (Pearson 2002), or tense-marking morphemes (Jacobson and Schwartz 2005; Paradis 2010). Overall, ELs have been known to demonstrate higher rates of grammatical errors for a longer period of time when compared to their monolingual, English-proficient counterparts (Döpke 2000; Gutiérrez-Clellen et al. 2008).
In order to examine grammatical accuracy in EL writing, the set of grammatical structures under study was narrowed, and verb errors became of particular interest. Errors in grammatical morphology involving verb usage have not been comprehensively researched in the school-age population. In the seminal work of Dulay and Burt (1974), more than two hundred 6- to 8-year-old ELs were evaluated on their accuracy in the use of 14 grammatical morphemes, including tense and non-tense morphemes. They found that certain tense morphemes, such as third-person singular -s, were used less accurately by EL students. In alignment with these findings, some longitudinal case studies exploring grammatical morphology in ELs have also found errors with third-person singular -s and past-tense -ed (Hakuta 1978; Lakshmanan 1994) and omission of the BE auxiliary (Haznedar 2001). Research also suggests that ELs tend to overgeneralize tense markers (e.g., growed for grew) more than their similar-aged monolingual peers (Jacobson and Schwartz 2005). In a case study of a 12-year-old Spanish-speaking student, Mellow (2008) found that the learner gradually produced different English constructions of increasing complexity. These results, along with a limited number of others (e.g., Crossley and McNamara 2014), may imply that students eventually produce correct forms in their writing, but the evidence on acquisition appears to vary by learner and task. Watcharapunyawong and Usaha (2013) explored the writing errors in different texts of Thai EFL adult students and found verb errors, specifically subject–verb agreement errors and tense errors, to be the most frequent errors made. Similarly, several researchers have investigated the quality of L2 writing of EFL students and found the most frequent errors to involve determiners, subject–verb agreement, and tense (Nayan and Jusoff 2009; Wu and Garza 2014).

1.3. Research Aims

Despite this knowledge, limited research specifically examines the verb error patterns of school-aged ELs and how verb errors may directly impact the quality of their writing. Knowing more about these patterns could better inform instructional decisions aimed at improving writing skills for ELs earlier in their education, helping to avoid long-term effects. In response to these gaps in the literature, the current study uses authentic classroom-based assessments of writing administered to all 5th-grade students to answer the following research questions:
  • What verb error patterns do 5th-grade EL students demonstrate in their writing in English?
  • Are there differences in error patterns between EL 5th-grade students and their English-proficient monolingual peers?
  • How does the rate of verb errors relate to quality rating of 5th-grade written samples? Specifically, does the proportion of verb errors and overall grammaticality predict ratings of writing quality?

2. Materials and Methods

For this study, the investigators used data gathered as part of a larger study examining the writing skills of students in the 5th grade. This project was approved by the university human subjects committee (HCS # 2018.25857). Due to the purpose of this study and the time-intensive nature of transcribing and coding written samples, this study included writing samples for a subset of randomly selected ELs and English-proficient monolingual students.

2.1. Participants

This sample of 243 5th-grade students consisted of 131 girls and 112 boys from 65 inclusive classrooms in 34 elementary schools in a large school district. Descriptive information on participants, including race, eligibility for free and reduced lunch status, and home language, is provided in Table 1. This study focused on English Learner (EL) students from varied linguistic backgrounds. The term EL was used by the participating district to refer to students for whom a language other than English is most relied upon for communication at home. Of 662 students who spoke another language at home and had proficiency data, 335 (51%) were classified as EL and were currently enrolled in ESOL support services. ELs who had identified exceptionalities, as reported by the school districts, were excluded from this study. Students who were absent on the day the writing sample was completed or completed the writing sample in their home language (e.g., Spanish) were also excluded from this study. Consequently, 112 EL writing samples met the criteria for this study. A comparison group of English-proficient monolinguals was created through the random selection of 131 monolingual students who participated in the writing task. Students in this group only spoke English at home and had no identified exceptionalities.

English Vocabulary and Reading Comprehension Skills

Subtests of the iReady assessment battery were administered to participants to provide descriptive information regarding their English vocabulary and reading comprehension skills (e.g., comprehension of informational texts, comprehension of literature, and reading combined). iReady is a computer-delivered, interim adaptive assessment for students in kindergarten through the 12th grade built on the College- and Career-Ready Standards (Curriculum Associates 2018). Students are given four choices in a multiple-choice design, and items are scored dichotomously as correct or incorrect using a Rasch model. Ability estimates are updated based on students’ performance on a randomly selected set of five items at a starting difficulty level, and subsequent items are adjusted higher or lower in difficulty accordingly. Estimated reliability for overall reading is 0.97. Test–retest reliability is reported to be 0.85–0.86 for the 3rd–5th grades. See Table 2 for participant scores.

2.2. Data Collection

Language and literacy skills were assessed during the first eight weeks of the school year as part of the district assessment procedures. Classroom teachers administered the computer-adaptive standardized assessment in the computer center or media center of their school. English Language Arts instructors administered the expository writing task classroom-wide as part of the district’s mandatory curriculum-based assessment measure. For the writing task used in this study, classroom teachers distributed a packet in October of the school year containing two written passages about the benefits of exercise, directions for the task, a planning sheet, and lined paper. Students were given 120 min to read the passages and write a response in English. Researchers collected students’ written language samples from classroom teachers and requested data from the participating district regarding students’ performance in reading comprehension and vocabulary, which included scores on iReady (Curriculum Associates 2018), a computer-adaptive assessment administered district-wide.

2.3. Writing Instrument

The current study used a curriculum-based measure of writing that was used by the partnering school district and administered to all students. The writing prompt presented writers with a dual purpose: to inform the reader about the benefits of fitness and to persuade the reader to consider those benefits. The directions instructed students to read two passages, plan a response explaining how fitness can contribute to unexpected outcomes, write the response, and revise and edit the response. The first passage pertained to unexpected outcomes of fitness. The second passage, seven paragraphs long (one and a half pages double-spaced), was about the benefits of fitness for an individual who was blind. Students were given 120 min to compose a written response to the prompt. Since the task of writing in response to a passage is common practice in district and statewide assessments, the use of such a measure (as opposed to a researcher-created measure) offers the advantage of being readily recognizable and interpretable by general educators familiar with this common practice. Additionally, the construct validity of the measure is supported by the fact that it aligns with accountability and classroom progress monitoring measures currently used in all 3rd–5th-grade classrooms throughout the district.

2.4. Writing Measures

2.4.1. Grammatical Accuracy

The decision to include a measure of grammatical accuracy was based on a number of previous findings suggesting that measures of correct writing sequences were sensitive to student achievement and progress over time (Dockrell et al. 2015; Malecki and Jewell 2003). As such, a broad measure of accuracy based on the proportion of errors was included in the current study. This is consistent with other sources that have reported grammaticality as a proportion of utterances with grammatical errors (Eisenberg and Guo 2013). Grammatical accuracy was calculated by adding the total number of grammatical errors and dividing by the number of sentences in the written response.
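For illustration, the accuracy computation described above can be sketched as follows; the function name and the sample figures are invented for this sketch and are not drawn from the study’s data:

```python
def grammatical_error_rate(total_errors: int, num_sentences: int) -> float:
    """Grammatical accuracy measure: total grammatical errors divided by
    the number of sentences in the written response."""
    if num_sentences <= 0:
        raise ValueError("a sample must contain at least one sentence")
    return total_errors / num_sentences

# Hypothetical sample: 6 grammatical errors across 12 sentences
rate = grammatical_error_rate(6, 12)
print(rate)  # 0.5
```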

2.4.2. Verb Errors

In order to narrow down the specific grammatical error patterns of particular interest for this study, three error codes were created: VE:O, referring to verb omissions; VE:T, referring to verb tense errors; and VE:A, referring to verb agreement errors. Instances where students omitted a verb from their sentences, such as “he overweight”, were coded as [VE:O]. If the student used a verb but not the correct tense, for example, “yesterday he run fast” instead of “yesterday he ran fast,” this was coded as [VE:T]. Finally, if a sentence did not follow subject–verb agreement rules, for example, “he like running” instead of “he likes running,” the sentence was coded as [VE:A]. All other errors (e.g., spelling) were ignored for this study.
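A minimal sketch of how inline codes of this kind could be tallied is shown below. The transcript lines are invented examples mirroring the error types above, and the tallying code is illustrative; it is not the SALT software’s own analysis:

```python
import re
from collections import Counter

# Hypothetical SALT-style transcript lines with inline error codes;
# the code scheme (VE:O, VE:T, VE:A) follows the study, the sentences are invented.
transcript = [
    "he overweight [VE:O].",
    "yesterday he run[VE:T] fast.",
    "he like[VE:A] running.",
    "he likes to exercise.",
]

CODE_PATTERN = re.compile(r"\[(VE:[OTA])\]")

def tally_verb_errors(lines):
    """Count each verb-error code across a coded transcript."""
    counts = Counter()
    for line in lines:
        counts.update(CODE_PATTERN.findall(line))
    return counts

counts = tally_verb_errors(transcript)
# Treating each line as one T-unit for this toy example
errors_per_tunit = sum(counts.values()) / len(transcript)
print(errors_per_tunit)  # 0.75
```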

2.4.3. Writing Quality

Quality rating scores were based on the rubric adopted by the district, which was consistent with the state assessment. The rubric was used to score the written samples on three categories of quality: (a) purpose, focus, and organization; (b) evidence and elaboration; and (c) conventions of Standard English. These elements are consistent with components found in established scoring systems such as the Wechsler Objective Language Dimensions (Rust 1996; Wechsler 2005) and previous studies (e.g., Williams et al. 2013).
For the first category of writing quality, students’ writing samples were scored on purpose, focus, and organization. To achieve the maximum 4 points in this category, the student’s written response demonstrated a strong central idea with little or no loosely related material; skillful use of transitions; use of grade-level vocabulary to convey ideas; and a logical progression of ideas including an introduction and conclusion. For the second category of writing quality, investigators scored the writing samples on the inclusion of evidence and elaboration. To obtain the maximum of 4 points, students integrated evidence thoroughly and smoothly using appropriate vocabulary and sentence structure. It is important to note that scores on both categories 1 and 2 are influenced by students’ vocabulary knowledge and their ability to cohesively elaborate and convey main ideas. Finally, for the third quality category, investigators rated students’ writing on use of conventions of Standard English. To receive the full 2-point rating in this category, students’ responses demonstrated appropriate use of punctuation, capitalization, sentence formation, and spelling.
Finally, a composite score was calculated as the sum of the three components. This overall quality of writing rating was aligned with state assessment procedures. As a composite score, the total writing quality rubric score (on a 10-point scale) is purported to reflect original thought, use of text evidence, inferences, implicit understanding, and synthesizing across texts.
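Assuming each category can score from 0 up to its stated maximum, the composite described above reduces to a simple sum; this sketch is illustrative and is not the district’s scoring procedure:

```python
def composite_quality(purpose_focus_org: int, evidence_elab: int, conventions: int) -> int:
    """10-point composite: purpose/focus/organization (max 4) +
    evidence/elaboration (max 4) + conventions (max 2)."""
    if not (0 <= purpose_focus_org <= 4 and 0 <= evidence_elab <= 4 and 0 <= conventions <= 2):
        raise ValueError("category score out of range")
    return purpose_focus_org + evidence_elab + conventions

print(composite_quality(3, 3, 2))  # 8
print(composite_quality(4, 4, 2))  # 10, the maximum possible rating
```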

2.5. Procedures

All writing samples were transcribed verbatim (maintaining errors) by the investigators and trained research assistants. The Systematic Analysis of Language Transcripts (SALT) software was used to code and analyze the writing samples. Each sample underwent two rounds of error coding. For the first round, the samples were divided among five undergraduate students who had completed training on the codes of interest and demonstrated proficiency in how to code each sample for errors. For the second round of error coding, each transcript was coded by the first author and a research assistant (RA). During this round, the samples were checked for coding accuracy and any missing or incorrect codes were adjusted. SALT was then used to provide descriptive measures for the frequency of each error code, the total number of errors per student, and the total number of T-units and words per writing sample. To ensure coding reliability, 25% of the final coded writing samples were randomly selected and double coded by the first author and the RA. Intraclass correlation coefficient (ICC) estimates for each error code and their 95% confidence intervals were calculated using SPSS Statistics software. For the individual verb error codes, omission, tense, and agreement, inter-rater agreement was 89%, 88%, and 91%, respectively. As reported by Koo and Li (2016), ICC values between 0.75 and 0.9 indicate good reliability.
For human ratings of writing quality, two raters who were blind to the students’ characteristics scored the written samples on each of the three categories of writing quality. The two raters were certified teachers employed by the partnering school district. They held graduate degrees and had worked as writing resource teachers and writing academic coaches for the partnering district. The raters had completed extensive district-provided training on the writing rubric, passed an assessment of writing training, and attended monthly training meetings including regular online scoring courses to recalibrate.
A randomly selected subsample was independently double rated by both raters. When any point difference was treated as a disagreement, inter-rater agreement was 70%, 77.5%, and 67.5% for the three quality subcomponents, respectively. This was above the 60% criterion for ratings of writing quality in published reviews (Graham and Perin 2007). When agreement was defined as a point difference of no more than 1, similar to previous studies (e.g., Koutsoftas and Gray 2012; Koutsoftas 2016), inter-rater agreement of 100%, 100%, and 97.5% was attained.
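The two agreement criteria reported above (exact agreement, and agreement within one point) can be expressed as a single function with a tolerance parameter. The ratings below are invented for illustration and do not reproduce the study’s percentages:

```python
def inter_rater_agreement(scores_a, scores_b, tolerance=0):
    """Percent of double-rated samples whose two ratings differ
    by at most `tolerance` points."""
    if len(scores_a) != len(scores_b):
        raise ValueError("rating lists must be the same length")
    agree = sum(abs(a - b) <= tolerance for a, b in zip(scores_a, scores_b))
    return 100 * agree / len(scores_a)

# Hypothetical ratings for one quality category on eight double-rated samples
r1 = [3, 2, 4, 3, 1, 2, 3, 4]
r2 = [3, 3, 4, 2, 1, 2, 4, 2]
print(inter_rater_agreement(r1, r2))     # exact agreement: 50.0
print(inter_rater_agreement(r1, r2, 1))  # within one point: 87.5
```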

3. Results

3.1. Verb Error Patterns

To answer the first research question, which examined the verb error patterns of ELs in the 5th grade, we report descriptive statistics on students’ rates of verb errors. Table 3 shows descriptive statistics on the types of verb error patterns made by EL students and their English-proficient monolingual peers. The verb errors are presented as the ratio of errors per T-unit, as transcribed in the SALT software, to account for differences in sample length across students. As displayed in Table 3, ELs demonstrated a higher frequency of verb errors across all types, with the most common verb error type being tense errors. The descriptive statistics also demonstrate that over 50% of the grammatical errors made by ELs were related to verbs, while less than 25% of the grammatical errors made by English-proficient monolinguals were.

3.2. Group Differences in Verb Errors

To answer the second research question, we conducted an analysis of variance (ANOVA) comparing the frequency of verb errors in the writing samples of the two language-proficiency groups (i.e., monolinguals and ELs). The frequency of verb errors differed between the two groups, F(1, 154) = 29.218, p < 0.0001, with a Cohen’s d effect size of 0.69. On average, EL students produced more verb errors per T-unit in their writing than their monolingual peers. To further examine the second question, we analyzed the differences in verb error types between the two groups. As displayed in Table 4, there were significant differences between the two groups in the frequency of verb agreement errors.
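For readers unfamiliar with these statistics, the two-group ANOVA F and Cohen’s d reported above can be computed as sketched below. The errors-per-T-unit values are invented and do not reproduce the study’s results:

```python
from statistics import mean

def _ss(values, center):
    """Sum of squared deviations from a given center."""
    return sum((v - center) ** 2 for v in values)

def one_way_anova_f(group1, group2):
    """F statistic for a two-group one-way ANOVA
    (between-group mean square / within-group mean square; df_between = 1)."""
    m1, m2 = mean(group1), mean(group2)
    grand = mean(list(group1) + list(group2))
    ss_between = len(group1) * (m1 - grand) ** 2 + len(group2) * (m2 - grand) ** 2
    ss_within = _ss(group1, m1) + _ss(group2, m2)
    df_within = len(group1) + len(group2) - 2
    return ss_between / (ss_within / df_within)

def cohens_d(group1, group2):
    """Cohen's d using the pooled standard deviation."""
    m1, m2 = mean(group1), mean(group2)
    df = len(group1) + len(group2) - 2
    pooled_sd = ((_ss(group1, m1) + _ss(group2, m2)) / df) ** 0.5
    return (m1 - m2) / pooled_sd

# Hypothetical verb errors per T-unit; invented values, not the study's data
el_errors = [0.6, 0.5, 0.7, 0.4, 0.8]
mono_errors = [0.2, 0.3, 0.1, 0.25, 0.2]
print(round(one_way_anova_f(el_errors, mono_errors), 2))
print(round(cohens_d(el_errors, mono_errors), 2))
```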

3.3. Verb Errors and Writing Quality

To answer research question three, we first analyzed descriptive data on student writing quality outcomes by group (see Table 4). We then calculated Pearson correlation coefficients to assess the relationship between the rate and type of verb errors present in 5th-grade writing samples and composite writing quality scores. There was a significant negative relationship between overall verb errors and composite scores (r = −0.25, p < 0.0001). Each type of verb error also demonstrated a significant negative relationship with composite writing quality scores, as displayed in Table 5. To further examine the third question, Pearson correlation coefficients were calculated to assess the relationship between the type of verb errors and each category of quality used in the composite writing scores. Verb tense errors demonstrated significant negative relationships with all three components of writing quality ratings, r = −0.22 (p = 0.001); r = −0.20 (p = 0.002); and r = −0.16 (p = 0.02), respectively. Verb agreement errors had a significant negative relationship with ratings of purpose, focus, and organization (r = −0.13, p = 0.05). Further, verb omission errors had a significant negative relationship with ratings of conventions (r = −0.15, p = 0.02). See Table 6 for additional correlations.
To further examine the third research question, measures of grammatical accuracy in writing (the proportion of verb errors to the number of T-units per sample, and overall grammaticality) were entered into a regression analysis. The ratio of verb errors to length was a significant predictor of students’ writing quality (refer to Table 7). For every unit increase in the number of errors, the composite score for writing quality is predicted to be 0.80 points lower. With the two factors entered (proportion of verb errors to length, and overall grammaticality), the model accounted for a significant but small amount (6%) of the variance in composite writing quality scores, F(2, 228) = 8.78, p < 0.0001.
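The interpretation of the regression coefficient (each unit increase in the error rate predicting a lower composite score) can be illustrated with a single-predictor least-squares sketch. The data points are invented, and the study’s actual model included a second predictor (overall grammaticality):

```python
from statistics import mean

def ols_slope_intercept(x, y):
    """Least-squares slope and intercept for a single predictor."""
    mx, my = mean(x), mean(y)
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    return slope, my - slope * mx

# Invented data: verb errors per T-unit (x) vs. composite quality rating (y)
errors = [0.0, 0.2, 0.4, 0.6, 0.8]
quality = [8, 7, 7, 6, 5]
slope, intercept = ols_slope_intercept(errors, quality)
# In this toy data, each unit increase in the error rate predicts a
# 3.5-point lower composite score.
print(round(slope, 2), round(intercept, 2))  # -3.5 8.0
```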

4. Discussion

The purpose of this study was to examine and describe verb errors in 5th-grade students’ written responses. In addition, this study aimed to determine whether verb errors in writing significantly influenced overall ratings of writing quality. Data analyses revealed three key findings. First, EL students made verb errors in their written responses more often than their English-proficient monolingual peers. Specifically, they made significantly more verb tense errors than their peers, and overall grammatical accuracy was also lower for ELs. Second, the ratio of verb errors to total grammatical errors was also significantly different between the two groups, with verb errors accounting for a larger share of the grammatical errors ELs demonstrated. With regard to writing quality, there was a small but significant negative correlation between overall verb errors and writing quality ratings. Lastly, the total number of verb errors per T-unit was also a significant predictor of writing quality ratings.

4.1. Verb Errors in Writing

The current findings align with previous studies indicating persistent gaps in grammatical accuracy in the written language of EL students. As expected from previous results (e.g., Haznedar 2001; Watcharapunyawong and Usaha 2013), the presence of correct, varied tense-marking morphemes was limited in the EL writing samples. For example, several verb errors in EL students’ writing samples involved irregular past tense verbs (e.g., “he win many medals” instead of “he won…”). Consistent with previous research (e.g., Jacobson and Schwartz 2005), EL students were more prone than English-proficient students to overgeneralize the past tense -ed (e.g., “run-ed” for “ran” and “go-ed” for “went”), which is common in younger typically developing English-speaking children (Pence Turnbull and Justice 2016). The overgeneralization of the past tense -ed may also indicate insufficient knowledge of, or access to, irregular past tense forms. Several samples also omitted BE auxiliary verbs, as in “he skiing all day long” (omission of “was”). Several EL students also demonstrated inconsistent use of third-person singular -s in their written responses and were more likely to omit the -s from their verbs (e.g., “he play basketball”).

4.2. Group Differences in Verb Errors

In line with previous literature examining the accuracy and acquisition of different grammatical aspects of language (e.g., Döpke 2000; Gathercole 2002a, 2002b; Gutiérrez-Clellen et al. 2008), ELs in the current study demonstrated higher rates of grammatical errors than their English-proficient peers. Notably, more than half (54%) of the grammatical errors made by ELs were attributable to verb accuracy. While errors have been found to be related to knowledge gaps in the language (Ellis 1997; Polio and Lee 2017), the current study outlines the specific gaps between ELs and English-proficient students pertaining to verb knowledge. Although the cause of the observed group differences cannot be determined by the current design, the higher rates of verb errors in ELs’ writing may be partially attributed to the increased cognitive demands of writing in a second language (Kellogg 2008; Ortega 2009) and of negotiating between the grammatical rules of the two languages.

4.3. Relation between Verb Errors and Writing Quality Ratings

The findings pertaining to the relation between writing quality ratings and grammaticality are in line with previous studies (e.g., Schoonen et al. 2011; Ortega 2015). Similar to the findings of Guo et al. (2013), high occurrences of grammatical errors in written responses were associated with low quality ratings, possibly because such errors impacted the clarity and coherence of the writing. The current findings related to verb errors as predictors of writing quality align closely with previous studies (e.g., Guo et al. 2013; Kim and Crossley 2018), substantiating that grammatical errors related to verb form contribute to low quality ratings. This finding is concerning considering that rubric-based writing quality ratings are commonly used in many school districts, state assessments, and high-stakes testing as a core metric in the evaluation of writing achievement and advancement (Wagner et al. 2011). In light of the essential role of writing quality in academic success, the current findings substantiate the importance of addressing verb errors and lend support to the need for intensified instructional efforts focused on them.

4.4. Limitations

Results should be interpreted cautiously, recognizing that only one written response was collected per student. It cannot be assumed that the same findings would result with additional samples or if the task elicited specific verb forms. Because it was an open-ended written response, it is possible that students may have avoided using specific verbs or verb forms that they were uncertain about. Although it was beyond the scope of the current study to examine factors that influenced verb selection, prior research supports the notion that ELs are likely to rely on high frequency verbs (Durrant and Brenchley 2019) and early developing or nonspecific verbs (Hasselgren 1994).
It should be noted that although this study focused on three verb error types, the measures used in the current study were not intended to serve as a proxy for all types of errors or grammatical knowledge broadly. We recognize that other types of grammatical errors, not categorized in the current study, may have been present and/or prevalent. Although the measures selected may account for only a portion of the grammatical errors demonstrated, it was not feasible in this exploratory study to categorize every potential type of grammatical error. As such, further study is planned to examine other types of errors within the students’ responses.
Finally, due to the limited information related to EL students’ English language skills at the time of the writing assessment, these findings should be interpreted cautiously. Additional information related to length of English exposure for each student, current English speaking, writing, and reading skills, and socioeconomic status would substantiate the findings of this study. It is important to note that the abovementioned factors may influence students’ writing experience and consequently the number of grammatical errors present in student writing.

4.5. Implications

Despite these limitations, the results of the current study offer insights to inform instruction. The findings can help educational personnel identify areas to target for intensive instructional support, including grammatical accuracy, explicit instruction of grammatical markers related to verb tense, subject–verb agreement rules, and the importance of high-quality writing. Additionally, the group differences suggest that we cannot assume all students have sufficient support to acquire the grammatical rules of English and construct grammatically correct sentences in academic writing. As such, explicit instructional activities addressing verb tenses and subject–verb agreement may be necessary in academic writing contexts to support effective use of written English in the classroom and in real-life settings, although further empirical study is needed.
As noted in previous literature, adult EL students have expressed a desire for further grammatical knowledge and skills relating to verb tenses and subject–verb agreement. Addressing these needs earlier in second language learning, through explicit instruction and multiple practice opportunities in the classroom, may offset future difficulties in writing. Given that the rubric used in the current study is based on common classroom writing assessments, the finding that verb errors significantly predict lower writing quality ratings supports incorporating a more explicit focus on verb knowledge in the classroom.
Although not surprising, the negative relation between verb errors and overall writing quality underscores the importance of addressing these errors early in language learning. Because high-quality writing samples tend to feature more complex grammar and morphosyntax, mastering tense markers and overall grammatical accuracy earlier should positively impact writing quality. Given the large influence of academic writing on future academic, social, and emotional well-being, addressing factors that impact writing quality, such as verb accuracy, early and explicitly should be paramount when developing classroom instruction.

4.6. Future Research

Further studies are needed to expand on this research by examining and describing a greater range of grammatical errors. It is also important to consider cultural factors that may influence how certain errors manifest in writing. For example, the omission of BE verbs could be attributable to dialect or language transfer. The types of errors demonstrated by ELs may be partially influenced by differences in grammatical rules between English and students’ heritage languages, specifically the rules for verb conjugation and agreement. In addition, different writing tasks could elicit a fuller depiction of writing capabilities; although argumentative writing warrants more complex syntactic structures than narrative tasks, it would still be beneficial to understand student abilities across tasks. Comparing students’ written and oral language errors may also better illustrate language proficiency. Finally, future studies should consider providing descriptive information on the types of verbs, verb frequency, and the number of different verbs produced by students to further quantify verb knowledge and accuracy.

5. Conclusions

Given the importance of becoming a skilled writer, the findings of the current study support the need to identify effective instructional strategies to improve English writing and grammar acquisition for EL students. Because EL students made significantly more verb tense errors and showed lower grammatical accuracy than their peers, it is paramount to focus instructional efforts on promoting EL students’ academic success. As academic standards underscore the importance of early writing proficiency, instruction should ensure that EL students have a grasp of English grammatical rules, with specific focus on tense marking, which was shown to influence ratings of overall writing quality in the current study. While quality ratings take several aspects of writing into consideration, the negative impact of verb errors on writing quality ratings in this study highlights the need to identify and address knowledge gaps that EL students may have pertaining to verb form and function.

Author Contributions

Conceptualization, K.F. and C.W.; investigation, C.W.; writing—original draft preparation, K.F.; writing—review and editing, K.F. and C.W.; funding acquisition, K.F. All authors have read and agreed to the published version of the manuscript.

Funding

The first author (Fumero) is funded by the Bilingual Oral Language and Literacy Development personnel development grant, U.S. Department of Education [Grant H325D140068]. The research reported here was supported by the Institute of Education Sciences, U.S. Department of Education, through Grant R305L180019 to Florida State University. The opinions expressed are those of the authors and do not represent views of the Institute or the U.S. Department of Education.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki and approved by the Institutional Review Board (or Ethics Committee) of Florida State University (HCS # 2018.25857).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are not available due to the nature of this research. Participants did not agree to have their data shared publicly.

Acknowledgments

The authors are especially grateful to Jenna Gurklis for her assistance and Temetia Creed for her insights and assistance.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Arfé, Barbara, Julie E. Dockrell, and Bianca De Bernardi. 2016. The effect of language specific factors on early written composition: The role of spelling, oral language and text generation skills in a shallow orthography. Reading and Writing 29: 501–27. [Google Scholar] [CrossRef]
  2. Bhatia, Tej K., and William C. Ritchie, eds. 2012. The Handbook of Bilingualism and Multilingualism. Hoboken: John Wiley and Sons. [Google Scholar]
  3. Celaya, María Luz. 2019. The Emergence and Development of Syntactic Patterns in EFL Writing in a School Context: A Longitudinal Study. Languages 4: 41. [Google Scholar] [CrossRef] [Green Version]
  4. Crossley, Scott A., and Danielle S. McNamara. 2012. Predicting second language writing proficiency: The roles of cohesion and linguistic sophistication. Journal of Research in Reading 35: 115–35. [Google Scholar] [CrossRef]
  5. Crossley, Scott A., and Danielle S. McNamara. 2014. Does writing development equal writing quality? A computational investigation of syntactic complexity in L2 learners. Journal of Second Language Writing 26: 66–79. [Google Scholar] [CrossRef]
  6. Cumming, Alister. 1990. Metalinguistic and ideational thinking in second language composing. Written Communication 7: 482–511. [Google Scholar] [CrossRef]
  7. Curriculum Associates. 2018. I-Ready Teacher Training Guide Curriculum Associates. North Billerica: Curriculum Associates. [Google Scholar]
  8. Dockrell, Julie. E., Vincent Connelly, Kirsty Walter, and Sarah Critten. 2015. Assessing children’s writing products: The role of curriculum-based measures. British Educational Research Journal 41: 575–95. [Google Scholar] [CrossRef]
  9. Döpke, Susanne. 2000. Generation of and retraction from cross-linguistically motivated structures in bilingual first language acquisition. Bilingualism Language and Cognition 3: 209–26. [Google Scholar] [CrossRef] [Green Version]
  10. Dulay, Heidi C., and Marina K. Burt. 1974. Natural sequences in child second language acquisition. Language Learning 24: 37–53. [Google Scholar] [CrossRef]
  11. Durrant, Philip, and Mark Brenchley. 2019. Development of vocabulary sophistication across genres in English children’s writing. Reading and Writing 32: 1927–53. [Google Scholar] [CrossRef] [Green Version]
  12. Eisenberg, Sarita L., and Ling-Yu Guo. 2013. Differentiating children with and without language impairment based on grammaticality. Language, Speech, and Hearing Services in Schools 44: 20–31. [Google Scholar] [CrossRef] [Green Version]
  13. Ellis, Rod. 1997. Second Language Acquisition. Oxford: Oxford University Press. [Google Scholar]
  14. Fareed, Muhammad, Almas Ashraf, and Muhammad Bilal. 2016. ESL learners’ writing skills: Problems, factors and suggestions. Journal of Education and Social Sciences 4: 81–92. [Google Scholar] [CrossRef]
  15. Figueiredo, Sandra. 2019. Competition Strategies during Writing in a Second Language: Age and Levels of Complexity. Languages 4: 11. [Google Scholar] [CrossRef] [Green Version]
  16. Gathercole, Virginia C. Mueller. 2002a. Monolingual and bilingual acquisition: Learning different treatments of that-trace phenomena in English and Spanish. In Language and Literacy in Bilingual Children. Edited by D. Kimbrough Oller and Rebecca E. Eilers. Bristol: Multilingual Matters, pp. 220–54. [Google Scholar]
  17. Gathercole, Virginia C. Mueller. 2002b. Command of the mass/count distinction in bilingual and monolingual children: An English morphosyntactic distinction. In Language and Literacy in Bilingual Children. Edited by D. Kimbrough Oller and Rebecca E. Eilers. Bristol: Multilingual Matters, pp. 175–206. [Google Scholar]
  18. Gathercole, Virginia C. Mueller, and Enlli Môn Thomas. 2005. Minority language survival: Input factors influencing the acquisition of Welsh. In Proceedings of the 4th International Symposium on Bilingualism. Somerville: Cascadilla Press, pp. 852–74. [Google Scholar]
  19. Gathercole, Virginia C. Mueller, D. Kimbrough Oller, and Rebecca E. Eilers. 2002. Grammatical gender in bilingual and monolingual children: A Spanish morphosyntactic distinction. In Language and Literacy in Bilingual Children. Edited by D. Kimbrough Oller and Rebecca E. Eilers. Bristol: Multilingual Matters, pp. 207–19. [Google Scholar]
  20. Genesee, Fred. 2004. What do we know about bilingual education for majority language students? In The Handbook of Bilingualism. Edited by Tej K. Bhatia and William C. Ritchie. Oxford: Blackwell Publishing Ltd., pp. 547–76. [Google Scholar]
  21. Graham, Steve, and Dolores Perin. 2007. A meta-analysis of writing instruction for adolescent students. Journal of Educational Psychology 99: 445–86. [Google Scholar] [CrossRef] [Green Version]
  22. Guo, Liang, Scott A. Crossley, and Danielle S. McNamara. 2013. Predicting human judgments of essay quality in both integrated and independent second language writing samples: A comparison study. Assessing Writing 18: 218–38. [Google Scholar] [CrossRef]
  23. Gutiérrez-Clellen, Vera F., Gabriela Simon-Cereijido, and Christine Wagner. 2008. Bilingual children with language impairment: A comparison with monolinguals and second language learners. Applied Psycholinguistics 29: 3. [Google Scholar] [CrossRef] [Green Version]
  24. Hakuta, Kenji. 1978. A Report on the Development of the Grammatical Morphemes in a Japanese Girl Learning English as a Second Language. In Second Language Acquisition: A Book of Readings. Edited by Evelyn Marcussen Hatch. Newbury: Newbury House Publishers, pp. 132–47. [Google Scholar]
  25. Hasselgren, Angela. 1994. Lexical teddy bears and advanced learners: A study into the ways Norwegian students cope with English vocabulary. International Journal of Applied Linguistics 4: 237–58. [Google Scholar] [CrossRef]
  26. Haznedar, Belma. 2001. The acquisition of the IP system in child L2 English. Studies in Second Language Acquisition 23: 1–39. Available online: https://www.jstor.org/stable/44486530 (accessed on 12 June 2019). [CrossRef]
  27. Jacobson, Peggy F., and Richard G. Schwartz. 2005. English past tense use in bilingual children with language impairment. American Journal of Speech-Language Pathology 14: 313–23. [Google Scholar] [CrossRef]
  28. Kellogg, Ronald T. 2001. Competition for working memory among writing processes. The American Journal of Psychology 114: 175. [Google Scholar] [CrossRef]
  29. Kellogg, Ronald T. 2008. Training writing skills: A cognitive developmental perspective. Journal of Writing Research 1: 1–26. [Google Scholar] [CrossRef]
  30. Kim, Minkyung, and Scott A. Crossley. 2018. Modeling second language writing quality: A structural equation investigation of lexical, syntactic, and cohesive features in source-based and independent writing. Assessing Writing 37: 39–56. [Google Scholar] [CrossRef]
  31. Koo, Terry K., and Mae Y. Li. 2016. A guideline of selecting and reporting intraclass correlation coefficients for reliability research. Journal of Chiropractic Medicine 15: 155–63. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  32. Koutsoftas, Anthony D. 2016. Writing process products in intermediate-grade children with and without language-based learning disabilities. Journal of Speech, Language, and Hearing Research 59: 1471–83. [Google Scholar] [CrossRef] [PubMed]
  33. Koutsoftas, Anthony D., and Shelley Gray. 2012. Comparison of narrative and expository writing in students with and without language-learning disabilities. Language, Speech, and Hearing Services in Schools 43: 395–409. [Google Scholar] [CrossRef]
  34. Kyle, Kristopher, and Scott Crossley. 2016. The relationship between lexical sophistication and independent and source-based writing. Journal of Second Language Writing 34: 12–24. [Google Scholar] [CrossRef]
  35. Lakshmanan, Usha. 1994. Universal Grammar in Child Second Language Acquisition: Null Subjects and Morphological Uniformity. Amsterdam: John Benjamins. [Google Scholar]
  36. Malecki, Christine Kerres, and Jennifer Jewell. 2003. Developmental, gender, and practical considerations in scoring curriculum-based measurement writing probes. Psychology in the Schools 40: 379–90. [Google Scholar] [CrossRef]
  37. Matuchniak, Tina, Carol Booth Olson, and Robin Scarcella. 2014. Examining the text-based, on-demand, analytical writing of mainstreamed Latino English learners in a randomized field trial of the Pathway Project intervention. Reading and Writing 27: 973–94. [Google Scholar] [CrossRef]
  38. Mellow, J. Dean. 2008. The emergence of complex syntax: A longitudinal case study of the ESL development of dependency resolution. Lingua 118: 499–521. [Google Scholar] [CrossRef]
  39. National Commission on Writing for America’s Families, Schools, and Colleges. 2004. Writing: A Ticket to Work... or a Ticket out. A Survey of Business Leaders. Available online: http://www.writingcommission.org/prod downloads/writing.com/writing-ticket-to-work.pdf (accessed on 12 June 2019).
  40. Nayan, Surina, and Kamaruzaman Jusoff. 2009. A Study of Subject-Verb Agreement: From Novice Writers to Expert Writers. International Education Studies 2: 190–94. [Google Scholar] [CrossRef] [Green Version]
  41. NCES (U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics). 2012. The Nation’s Report Card: Writing 2011. National Center for Education Statistics. Available online: https://www.nationsreportcard.gov/writing_2011/g8_national.aspx?tab_id=tab2andsubtab_id=Tab_1#chart (accessed on 12 June 2019).
  42. NCES (U.S. Department of Education, National Center for Education Statistics, Institute of Education Sciences). 2016. English Language Learners in Public Schools. The Condition of Education. Available online: https://nces.ed.gov/programs/coe/indicator_cgf.asp (accessed on 12 June 2019).
  43. Olson, Carol Booth, Tina Matuchniak, Huy Q. Chung, Rachel Stumpf, and George Farkas. 2017. Reducing achievement gaps in academic writing for Latinos and English learners in Grades 7–12. Journal of Educational Psychology 109. [Google Scholar] [CrossRef]
  44. Ortega, Lourdes. 2009. Studying writing across EFL contexts: Looking back and moving forward. In Writing in Foreign Language Contexts: Learning, Teaching, and Research. Edited by R. Manchón. Bristol: Multilingual Matters, pp. 165–81. [Google Scholar]
  45. Ortega, Lourdes. 2015. Syntactic complexity in L2 writing: Progress and expansion. Journal of Second Language Writing 29: 82–94. [Google Scholar] [CrossRef]
  46. Paradis, Johanne. 2005. Grammatical morphology in children learning English as a second language. Language, Speech, and Hearing Services in Schools 36: 172–87. [Google Scholar] [CrossRef]
  47. Paradis, Johanne. 2010. Bilingual children’s acquisition of English verb morphology: Effects of language exposure, structure complexity, and task type. Language Learning 60: 651–80. [Google Scholar] [CrossRef]
  48. Paradis, Johanne, and Fred Genesee. 1996. Syntactic acquisition in bilingual children: Autonomous or interdependent? Studies in Second Language Acquisition, 1–25. [Google Scholar] [CrossRef]
  49. Pearson, Barbara Zurer. 2002. Narrative competence among monolingual and bilingual school children in Miami. In Language and Literacy in Bilingual Children. Edited by D. Kimbrough Oller and Rebecca E. Eilers. Bristol: Multilingual Matters, pp. 135–74. [Google Scholar]
  50. Pence Turnbull, Khara L., and Laura M. Justice. 2016. Language Development from Theory to Practice, 3rd ed. London: Pearson. [Google Scholar]
  51. Penner-Williams, Janet, Tom E. C. Smith, and Barbara C. Gartin. 2009. Written language expression: Assessment instruments and teacher tools. Assessment for Effective Intervention 34: 162–69. [Google Scholar] [CrossRef]
  52. Pierangelo, Roger, and George Giuliani. 2006. The Special Educator’s Comprehensive Guide to 301 Diagnostic Tests: Revised and Expanded Edition. Hoboken: Jossey-Bass. [Google Scholar]
  53. Polio, Charlene, and Jongbong Lee. 2017. Written language learning. In The Routledge Handbook of Instructed Second Language Acquisition. Edited by Shawn Loewen and Masatoshi Sato. Milton Park: Routledge, pp. 299–318. [Google Scholar]
  54. Rust, John. 1996. The Manual of the Wechsler Objective Language Dimensions (WOLD). New York: Psychological Corporation. [Google Scholar]
  55. Schoonen, Rob, Amos van Gelderen, Reinoud D. Stoel, Jan Hulstijn, and Kees de Glopper. 2011. Modeling the development of L1 and EFL writing proficiency of secondary school students. Language Learning 61: 31–79. [Google Scholar] [CrossRef]
  56. Wagner, Richard K., Cynthia S. Puranik, Barbara Foorman, Elizabeth Foster, Laura Gehron Wilson, Erika Tschinkel, and Patricia Thatcher Kantor. 2011. Modeling the development of written language. Reading and Writing 24: 203–20. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  57. Watcharapunyawong, Somchai, and Siriluck Usaha. 2013. Thai EFL Students’ Writing Errors in Different Text Types: The Interference of the First Language. English Language Teaching 6: 67–78. [Google Scholar] [CrossRef] [Green Version]
  58. Wechsler, David. 2005. Wechsler Individual Achievement Test (WIAT-II). London: Pearson. [Google Scholar]
  59. Williams, Jessica. 2012. The potential role(s) of writing in second language development. Journal of Second Language Writing 21: 321–31. [Google Scholar] [CrossRef]
  60. Williams, Gareth. J., Rebecca F. Larkin, and Samarita Blaggan. 2013. Written language skills in children with specific language impairment. International Journal of Language and Communication Disorders 48: 160–71. [Google Scholar] [CrossRef] [PubMed]
  61. Wu, Hsiao-Ping, and Esther V. Garza. 2014. Types and Attributes of English Writing Errors in the EFL Context—A Study of Error Analysis. Journal of Language Teaching and Research 5. [Google Scholar] [CrossRef]
  62. Yip, Virginia, and Stephen Matthews. 2000. Syntactic transfer in a Cantonese–English bilingual child. Bilingualism: Language and Cognition 3: 193–208. [Google Scholar] [CrossRef] [Green Version]
Table 1. Participant Demographics.

                                       English Learners (n = 112)    English Proficient (n = 131)
Characteristic                         n      Percent                n      Percent
Free/Reduced Lunch (FRL)
  Eligible                             94     84                     92     70
  Not Eligible                         18     16                     39     30
Race/Ethnicity
  Hispanic                             102    92                     40     30.5
  Black                                4      3.6                    53     40.5
  White                                1      0.9                    28     21.4
  Asian                                2      1.8                    0      0
  Multiracial                          2      1.8                    10     7.6
Home Language
  English                              15     13.4                   128    97.7
  Spanish                              92     82.1                   3      2.3
  Chinese                              1      0.9                    0      0
  Haitian Créole (French Creole)       1      0.9                    0      0
  Creole                               1      0.9                    0      0
  Vietnamese                           1      0.9                    0      0
  Other                                1      0.9                    0      0
Table 2. Participant English Vocabulary and Reading Comprehension Skills.

                                       English Learners (n = 93)    English Proficient (n = 102)
Characteristic                         M         SD                 M         SD
Comprehension: Informational Text      524.89    46.93              562.85    49.86
Comprehension: Literature              522.16    40.54              563.94    41.35
Vocabulary                             512.47    48.47              558.94    52.31
Note. Grade-level expectations for typical performance in the 5th grade are 630–640.
Table 3. Descriptive Statistics of Verb Use and Error Patterns by Group.

                                  EL Students (n = 112)    English-Proficient Students (n = 131)
                                  M       SD               M       SD
Overall Grammatical Errors        0.90    0.89             0.67    0.61
Total Verb Errors                 0.41    0.46             0.16    0.22
Verb Agreement Errors             0.08    0.12             0.04    0.09
Verb Tense Errors                 0.21    0.37             0.07    0.14
Verb Omission Errors              0.12    0.21             0.04    0.07
Verb Error and Grammaticality     0.56    0.68             0.24    0.25
Note. All ratios were calculated by dividing counts by the number of T-units per sample.
Table 4. EL and English-Proficient Students Verb Error Comparisons.

                                  Mean Difference    F         df        p
Verb Agreement Errors             0.02               5.835     1, 241    0.016
Verb Tense Errors                 0.14               16.643    1, 241    <0.0001
Verb Omission Errors              0.08               18.145    1, 241    <0.0001
Verb Error and Grammaticality     0.32               25.038    1, 241    <0.0001
Note. ANOVA was performed with Welch corrections.
Table 5. Writing Quality Scores for ELs and English-Proficient Students.

                                                  EL Students (n = 112)    English-Proficient Students (n = 131)
                                                  M       SD               M       SD
Composite Score                                   2.84    1.61             4.34    1.57
Category 1 (purpose, focus, and organization)     1.10    0.61             1.61    0.59
Category 2 (use of evidence and elaboration)      1.05    0.56             1.51    0.53
Category 3 (accurate English conventions)         0.70    0.62             1.22    0.66
Note. Categories 1, 2, and 3 are the individual scores used for the quality composite writing score.
Table 6. Verb Errors and Writing Quality Correlations.

Measure                   1           2           3           4           5          6          7          8          9
1. Category 1             1
2. Category 2             0.89 **     1
3. Category 3             0.69 **     0.67 **     1
4. Quality Composite      0.94 **     0.93 **     0.87 **     1
5. Total Verb Errors      −0.25 **    −0.23 **    −0.21 **    −0.25 **    1
6. Agreement Errors       −0.13 *     −0.11       −0.11       −0.13       0.57 **    1
7. Tense Errors           −0.22 **    −0.20 **    −0.16 *     −0.22 **    0.88 **    0.40 **    1
8. Omission Errors        −0.11       −0.10       −0.15 *     −0.14 *     0.43 **    −0.03      0.04       1
9. Grammaticality         −0.24 **    −0.20 **    −0.21 **    −0.24 **    0.67 **    0.49 **    0.52 **    0.34 **    1
Note. Categories 1, 2, and 3 are the individual scores used for the quality composite writing score. ** Correlation is significant at the 0.01 level. * Correlation is significant at the 0.05 level.
Table 7. Summary of Regression Results for Predictor Variables of Writing Quality.

Dependent Variable           R       R2      t Value              p       Beta
Number of Verb Errors        0.25    0.06    t(2, 228) = −2.01    0.05    −0.18
Overall Grammaticality       0.11    0.12    t(2, 228) = −1.36    0.18    −0.12
Note. Number of Verb Errors refers to the number of verb errors divided by the number of T-units in the written response. Overall Grammaticality refers to the number of grammatical errors divided by the number of T-units in the written response.
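The per-T-unit measures described in the table notes can be illustrated with a short computation. The sketch below uses entirely hypothetical counts and scores (not the study's data) to show how a verb-errors-per-T-unit ratio might be derived and regressed against quality ratings; the study's actual analysis procedure is not reproduced here.

```python
# Minimal sketch with hypothetical data: a verb-errors-per-T-unit ratio
# and a simple one-predictor regression on writing quality ratings.
# The counts and scores below are invented for illustration only.

# (verb_errors, t_units, quality_composite) per written sample
samples = [(3, 10, 2.0), (1, 12, 4.5), (5, 9, 1.5), (0, 11, 5.0), (2, 10, 3.5)]

# Ratio measure: number of verb errors divided by number of T-units
ratios = [errors / t_units for errors, t_units, _ in samples]
quality = [q for _, _, q in samples]

# Ordinary least-squares slope and intercept for a single predictor
n = len(samples)
mean_x = sum(ratios) / n
mean_y = sum(quality) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(ratios, quality))
         / sum((x - mean_x) ** 2 for x in ratios))
intercept = mean_y - slope * mean_x

# With these invented data, the slope is negative, mirroring the reported
# direction: more verb errors per T-unit, lower writing quality rating.
print(f"slope={slope:.2f}, intercept={intercept:.2f}")
```

With real data, the normalized ratio (rather than a raw error count) keeps longer responses from appearing more error-prone simply because they contain more T-units.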
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

