Proceeding Paper

Impact of Automated Writing Evaluation System and Revision Processes on College Students’ Writing Skills in English as a Foreign Language Course †

Department of Applied English, Chaoyang University of Technology, Taichung 413310, Taiwan
* Author to whom correspondence should be addressed.
† Presented at the 2024 IEEE 4th International Conference on Electronic Communications, Internet of Things and Big Data, Taipei, Taiwan, 19–21 April 2024.
Eng. Proc. 2024, 74(1), 52; https://doi.org/10.3390/engproc2024074052
Published: 5 September 2024

Abstract

Previous researchers established the foundation for future scholars to acknowledge the significance of the Automated Writing Evaluation (AWE) system by highlighting the crucial function of revision in enhancing writing skills. We examined the impact of the AWE system and of revision on the writing skills of 59 beginner-level students in an English as a Foreign Language (EFL) course. The participants were divided into two groups, and each group used the AWE system for ten assignments in total. The experimental group, consisting of 30 students, was required to revise its work based on the feedback provided by the AWE system. Data collection entailed assessing assignment grades and pre-test and post-test scores and administering a survey to gauge the efficacy of the AWE system. The t-tests conducted during the semester demonstrated a statistically significant improvement in the writing proficiency of the experimental group. Integrating the AWE system with revision activities significantly enhanced the writing proficiency of learners in the EFL course.

1. Introduction

In Taiwan, English education focuses heavily on reading, which hinders the development of independent expression in language learning. This academic emphasis makes it difficult for students to communicate their thoughts in English and causes considerable trouble in writing and speaking [1]. Despite years of English training in class, college students report that they still need assistance with English writing, which calls for more composition practice for learners in English as a Foreign Language (EFL) courses [2]. Consistent preparation and persistent practice are pivotal in English learning; as with reading, steady training can increase students’ capacity to communicate in English as a second language [3].
The importance of revision and feedback in refining one’s writing is central to language education. Abshir and Ebadi [4] proposed an iterative methodology of composing drafts, soliciting peer critique, reworking submissions based on peer input, and then repeating this cycle, which is widely regarded as an exceptionally effective means of honing composition abilities. Barrot’s research [5] highlights how this recursive practice sharpens the precision with which learners express themselves in writing, and revising in light of feedback yields substantial improvements in writing quality.
This feedback-and-revision loop advocated by scholars extends beyond merely enhancing linguistic accuracy. It functions as a catalyst for evolving learners’ ideas and expression, facilitating a deeper grasp of language nuances. The iterative process addresses difficulties faced in Taiwanese education by offering a customized remedy to strengthen language proficiency, critical thinking, and communication skills.
By adopting this methodology, language instructors gain a pedagogical tool that goes beyond surface-level corrections. The process becomes dynamic, enabling learners to actively cultivate their linguistic abilities and conceptual understanding. The interplay among drafting, receiving peer input, and reworking submissions augments linguistic precision and fosters holistic growth in students’ communication aptitude. This methodology aligns with evolving priorities in language education, emphasizing accuracy while cultivating individuals capable of proficient, lucid communication.

2. Literature Review

2.1. Automated Writing Evaluation (AWE)

In the academic community of Taiwan, college instructors dedicate considerable effort to elevating their students’ writing prowess. This entails meticulous criticism and guidance to refine students’ compositions. As Chen and Pan [6] emphasized, educators bear this responsibility, while also acknowledging the potential of the AWE system to lighten lecturers’ workload and enhance learners’ abilities. Buttressing this perspective, Fan and Ma’s exploration revealed that such technology not only trims the time spent on routine marking but also positively affects undergraduates’ writing expertise over the long term [7]. Consequently, the need for AWE in pedagogical settings is increasing because of its manifold advantages.
Miranty and Utami [8] emphasized that the primary goal of AWE is to facilitate consistent practice and provide appropriate feedback. These systems offer prompt, impartial, consistent assessments, ensuring learners receive well-timed and pertinent corrections. This method empowers students to actively engage in writing by taking advantage of immediate modifications and recommendations to refine their abilities. Integrating such evaluations is pivotal for modernizing Taiwan’s educational infrastructure and enhancing its higher-education institutions.
This viewpoint underscores the potent influence of AWE systems on pedagogy and scholarship. These technologies can transform essay instruction at Taiwanese universities by lightening instructors’ marking burden while offering learners additional chances to hone their craft through iterative drafting and feedback. The proliferating presence of AWE mirrors a broader movement toward adopting educational technology and improving educational productivity across the island. This shift promises to meaningfully strengthen the teaching of composition and undergraduates’ writing acumen through targeted practice and assessment.

2.2. Revision Process

Flower and Hayes [9] summarized the writing process in three crucial stages: planning, translating ideas to text, and reviewing one’s work. First, authors conceptualize their content and devise an organizational scheme. Next, thoughts are transformed into written language. Lastly, careful evaluation and alteration of the written piece improve its quality. Each step is indispensable for constructing a logical, cohesive work.
Geng and Razali [10] emphasized the importance of revision for enhancing proficiency, arguing that it allows students to scrutinize their mistakes and hone their skills. Proficiency grows through reworking text, understanding nuances, and implementing varied techniques. As Kuyyogsuy [11] claimed, revision demands deep engagement rather than surface edits, requiring comprehension beyond a basic level to identify and fix errors. Additionally, Zamel [12] spotlighted writing as an exploratory act for growth, as shaping ideas sometimes reshapes them. The all-encompassing process of planning, writing, re-examining, and reworking highlights writing’s intricate essence. It demands deft handling of each component and drives the overall augmentation of learning and abilities. Employing multiple methodologies underscores the complexity of cultivating mastery piece by piece.

2.3. Assistive Technology for Writing

Technological applications in foreign language instruction positively impact pupils’ educational experiences. Wang discussed the notable function of digital tools in facilitating language learning in “Technology Serving Linguistic Acquisition: Developments and Issues” [13]. This perspective was adopted by Miranty and Utami [8], who advocated for widespread adoption among language learners and Computer-Aided Language Learning (CALL) educators and underscored CALL’s transformative potential for enhancing the language learning process. Escalante, Pack, and Barrett [14] explored the distinct benefits of technology in aiding linguistic acquisition, specifically concerning writing, and found that employing diverse digital tools significantly boosted compositional skills. Such instruments include translation programs, editing software, word processors, spell checkers, word predictors, and speech recognition devices. Previous studies emphasized the significance of software design, asserting that it critically connects digital tools to learners’ needs; adeptly incorporating these specialized instruments into pedagogical approaches optimizes their effectiveness. Thi and Nikolov [15] further underscored technology’s importance in education, particularly for university students: a substantial number of college students showed a strong inclination to use technology for academic endeavors, especially for improving English composition abilities. Previous research showed that properly utilizing technology can greatly enhance language learning, especially writing skills, and can significantly boost students’ proficiency and learning outcomes.

3. Methodology

3.1. Participants

This research was carried out at a technological university in central Taiwan with 59 undergraduates enrolled in the Department of Practical English Studies. Selection requirements for these learners included a uniform educational background, no less than a decade’s worth of English acquisition, and proactive and steady participation evidenced by completing each of the written assignments during the semester in the Introductory English Composition class. The participants were divided into a control group comprising 29 participants focusing on drafting their submissions, and an experimental group consisting of 30 students who drafted their assignments and actively engaged in the modification process.
The purpose of grouping the participants was to assess the impact of revision on writing skills. The outcomes of the additional revision of the experimental group’s writing were compared with those of the control group, which was not involved in the procedure, to understand the improvement in the participants’ writing in an academic environment.

3.2. Instruments and Setting

QBook is a specialized digital writing tool that automates the grading of students’ assignments, allowing instructors to focus on providing feedback to improve students’ writing skills. Through an online platform accessible at https://writingcenter.qbook.org (accessed on 17 January 2017), the program automatically evaluates and re-evaluates submitted work, making the process significantly more efficient. QBook lightens instructors’ workloads and cultivates learners’ vocabulary, syntax, semantics, and critical thinking through thoughtful commentary. Moreover, QBook’s compatibility with various devices makes task completion more practical. We investigated the impact of drafting and revising on students’ writing development. Progress was monitored through regular homework assessments and standardized testing. QBook first evaluated assignments and issued preliminary scores, while teachers conducted additional manual reviews to ensure accuracy and to consider context-specific details that might otherwise be overlooked. This dual method guaranteed thorough individual analysis. The participants were split into two groups to measure the effectiveness of drafting and revision: one group only drafted, and the other group drafted and revised. Pre- and post-tests were conducted using passages from the Intermediate Level Writing Test of the General English Proficiency Test to gauge proficiency objectively. Scores from these evaluations provided the data to explore the importance of revision for strengthening writing abilities, and the results offer valuable perspectives on the differences between those who revised and those who did not.
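The paper does not document QBook’s interface, so the following minimal sketch only illustrates the draft, evaluate, and revise loop followed by the experimental group; the function names, the toy scoring rule, and the 80-point threshold are hypothetical assumptions, not QBook’s actual behavior.

```python
# Minimal sketch of the draft-evaluate-revise loop described above.
# QBook's real API is not documented here: evaluate_essay(), the toy scoring
# rule, and the 80-point cutoff are illustrative assumptions only.

def evaluate_essay(text: str) -> tuple[float, str]:
    """Stand-in for the AWE system: return a score and a feedback message."""
    word_count = len(text.split())
    score = min(100.0, 50.0 + 0.1 * word_count)   # toy scoring rule
    feedback = "Develop and support your main ideas in more detail."
    return score, feedback

def draft_and_revise(text: str, revise, max_rounds: int = 3) -> list[float]:
    """Experimental-group workflow: evaluate, revise on feedback, re-evaluate."""
    scores = []
    for _ in range(max_rounds):
        score, feedback = evaluate_essay(text)
        scores.append(score)
        if score >= 80.0:                          # assumed "good enough" cutoff
            break
        text = revise(text, feedback)              # student acts on the feedback
    return scores

# Example: a revision step that (here, trivially) expands the draft.
history = draft_and_revise("My first draft.", lambda t, fb: t + " More detail added.")
print(history)   # one score per evaluation round
```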

4. Results

4.1. Efficacy of Revision Process

An independent samples t-test was conducted to assess the English writing skills of the experimental and control groups. The experimental group exhibited a mean score of 58.433, whereas the control group displayed a mean score of 62.655. Table 1 demonstrates equivalent writing abilities at the pre-test stage. These results indicated no significant difference in the initial writing skills of the participants before using AWE. The similar early performance of the two groups served as the baseline for understanding the impact of the interventions on their English writing proficiency.
Table 2 displays the disparate writing performances of students in the experimental and control groups following task completion. The results revealed a discrepancy between these two groups. The experimental group, which revised, obtained a substantially higher mean score of 82.800, while the control group scored 64.552 on average. The performance of the two groups was significantly different, with a t-value of 3.84 and a p-value of less than 0.001.
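For readers who wish to check the group comparisons, the reported t-values can be re-derived from the summary statistics in Tables 1 and 2 alone. The paper does not state which t-test variant was used; the sketch below assumes the unequal-variance (Welch) form, which closely reproduces the reported values.

```python
# Re-deriving the between-group t-tests from the summary statistics reported
# in Tables 1 and 2. Whether the study used Student's or Welch's t-test is not
# stated; the Welch (unequal-variance) form is assumed here because it closely
# reproduces the reported values.
from scipy.stats import ttest_ind_from_stats

# Pre-test (Table 1): experimental vs. control
t_pre, p_pre = ttest_ind_from_stats(
    mean1=58.433, std1=32.724, nobs1=30,
    mean2=62.655, std2=43.077, nobs2=29,
    equal_var=False,
)
print(f"pre-test:  t = {t_pre:.3f}, p = {p_pre:.3f}")    # roughly -0.423, 0.67

# Post-test (Table 2): experimental vs. control
t_post, p_post = ttest_ind_from_stats(
    mean1=82.800, std1=11.124, nobs1=30,
    mean2=64.552, std2=23.133, nobs2=29,
    equal_var=False,
)
print(f"post-test: t = {t_post:.2f}, p = {p_post:.4f}")  # roughly 3.84, < 0.001
```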
The experimental group demonstrated significantly enhanced writing abilities, substantiating the advantageous effect of revision processes on compositional skills. The experimental group’s higher mean score indicated that revision improved the participants’ writing and highlighted the educational value of revision practice in language education.

4.2. Influence of Revision on Students’ Writing Performance

Table 3 shows that the experimental group made substantial progress, scoring 82.8 in the post-test compared with 58.433 in the pre-test. The standard deviation was significantly reduced from 32.724 in the pre-test to 11.124 in the post-test. The diminished variance indicated that the participants’ writing abilities converged toward the norm. The experimental group exhibited substantial enhancement in its achievement and reached a level closer to the course standard (p < 0.001). The results presented a meaningful and substantial improvement in the writing expertise of the experimental group and highlighted the effectiveness of the revision process in cultivating individual skill progress and facilitating the advancement of writing in English.
Table 4 summarizes the decline in writing mistakes and the increased number of words produced by individuals in the experimental group, with a substantial decrease in errors from the pre-test to the post-test underlining the advantageous influence of the training method. There was a significant reduction in the average number of mistakes, which diminished from 22.70 in the pre-test to 9.67 in the post-test. The decrease in errors demonstrated a noteworthy improvement in the writing abilities of the experimental group. The average number of words written by the experimental group increased from 240.77 in the pre-test to 273.3 in the post-test, illustrating a significant advancement in writing productivity (p = 0.001). These outcomes emphasized the positive effect of the revision process on decreasing mistakes and increasing output, resulting in overall enhancements in the writing abilities of the experimental group.
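To make the magnitude of these shifts concrete, the relative changes implied by Table 4 can be computed directly; the percentages below are derived here from the reported means and are not figures stated in the paper.

```python
# Relative changes implied by Table 4 (derived here; not reported in the paper).
errors_pre, errors_post = 22.70, 9.67
words_pre, words_post = 240.77, 273.3

error_reduction = (errors_pre - errors_post) / errors_pre * 100   # about 57.4%
word_increase = (words_post - words_pre) / words_pre * 100        # about 13.5%

print(f"mean writing errors fell by {error_reduction:.1f}%")
print(f"mean word count rose by {word_increase:.1f}%")
```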

5. Discussion and Conclusions

We explored the impacts of revision practices and AWE systems on the English writing proficiency of learners. The results demonstrated the pivotal role of integrating AWE into English learning in enhancing students’ writing abilities. The participants engaged in composition assignments to reduce repetitive errors. While AWE helped correct mistakes, guidance in interpreting machine feedback is still required. This underscores the role of educators in guiding learners and highlights the necessity of combining technological aids with conventional teacher-led instruction in teaching English writing. Similar to the result of Batchelor and King [16], the results of this study emphasized the significance of revision in increasing writing skills, leading to a noticeable advancement in writing abilities. This advancement is attributed to revision supported by AWE and its timely feedback, which aligns with previous research indicating the advantages of AWE in the revision stage [17,18]. AWE helps learners correct mistakes, thereby streamlining the revision process. Coinciding with the results of Geng and Razali [10], a preference for immediate feedback and the intricate interplay among AWE tools, student preferences, and the overall effectiveness of writing instruction were also confirmed in this study. AWE systems can meaningfully enhance the writing abilities of students, but educators must guide students in using automated feedback in the learning process. The participants in this study improved their compositions, indicating that while automated tools offer notable benefits, human intuition remains irreplaceable. By balancing technology and instruction, effective composition competence can be achieved, maximizing the efficiency of writing education in EFL courses in a technologically augmented learning environment.

Author Contributions

Conceptualization, R.-T.H. and K.-H.C.; methodology, R.-T.H.; software, K.T.-C.C.; validation, R.-T.H., K.-H.C. and Y.-S.C.; formal analysis, K.-H.C.; investigation, R.-T.H.; resources, K.-H.C.; data curation, K.T.-C.C.; writing—original draft preparation, R.-T.H.; writing—review and editing, K.-H.C.; visualization, K.T.-C.C.; supervision, R.-T.H.; project administration, R.-T.H.; funding acquisition, K.-H.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data sharing is not applicable to this article as no datasets were generated or analyzed during the study.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Cotos, E. Potential of AWE feedback. Calico J. 2011, 28, 420–459. [Google Scholar] [CrossRef]
  2. Chen, K.T.-C. Identifying barriers faced by university EFL instructors in integrating mobile-assisted language learning. Contemp. Educ. Technol. 2023, 15, ep467. [Google Scholar] [CrossRef] [PubMed]
  3. Alharbi, W. AI in the foreign language classroom: A pedagogical overview of automated writing assistance tools. Educ. Res. Int. 2023, 4253331. [Google Scholar] [CrossRef]
  4. Barrot, J.S. Integrating technology into ESL/EFL writing through Grammarly. RELC J. 2022, 53, 764–768. [Google Scholar] [CrossRef]
  5. Chen, H.; Pan, J. Computer or human: A comparative study of automated evaluation scoring and instructors’ feedback on Chinese college students’ English writing. Asian-Pac. J. Second Foreign Lang. Educ. 2022, 7, 34. [Google Scholar] [CrossRef]
  6. Fan, N.; Ma, Y. The Effects of Automated Writing Evaluation (AWE) Feedback on Students’ English Writing Quality: A Systematic Literature Review. Lang. Teach. Res. Q. 2022, 28, 53–73. Available online: https://cyut.idm.oclc.org/login?url=https://www.proquest.com/scholarly-journals/effects-automated-writing-evaluation-awe-feedback/docview/2860855474/se-2?accountid=10048 (accessed on 15 May 2023). [CrossRef]
  7. Miranty, D.; Utami, W. An Automated Writing Evaluation (AWE) in higher education. Pegem J. Educ. Instr. 2021, 11, 126–137. [Google Scholar] [CrossRef]
  8. Flower, L.; Hayes, J.R. A cognitive process theory of writing. Coll. Compos. Commun. 1981, 32, 365–387. [Google Scholar] [CrossRef]
  9. Geng, J.; Razali, A.B. Effectiveness of the AWE Program on Improving Undergraduates’ Writing Performance. Engl. Lang. Teach. 2022, 15, 49–60. [Google Scholar] [CrossRef]
  10. Kuyyogsuy, S. Promoting Peer Feedback in Developing Students’ English Writing Ability in L2 Writing Class. Int. Educ. Stud. 2019, 12, 76–90. [Google Scholar] [CrossRef]
  11. Zamel, V. The composing processes of advanced ESL students: Six case studies. TESOL Q. 1983, 17, 165–188. [Google Scholar] [CrossRef]
  12. Wang, Q. The use of semantic similarity tools in automated content scoring of fact-based essays written by EFL learners. Educ. Inf. Technol. 2022, 27, 13021–13049. [Google Scholar] [CrossRef]
  13. Escalante, J.; Pack, A.; Barrett, A. AI-generated feedback on writing: Insights into efficacy and ENL student preference: Revista de Universidad y Sociedad del Conocimiento. Int. J. Educ. Technol. High. Educ. 2023, 20, 57. [Google Scholar] [CrossRef]
  14. Wang, C.; Mirzaei, T.; Xu, T.; Lin, H. How learner engagement impacts non-formal online learning outcomes through value co-creation: An empirical analysis. Int. J. Educ. Technol. High. Educ. 2022, 19, 32. [Google Scholar] [CrossRef] [PubMed]
  15. Thi, N.K.; Nikolov, M. Effects of teacher, automated, and combined feedback on syntactic complexity in EFL students’ writing. Asian-Pac. J. Second Foreign Lang. Educ. 2023, 8, 6. [Google Scholar] [CrossRef]
  16. Batchelor, K.E.; King, A. Freshmen and five hundred words: Investigating flash fiction as a genre for high school writing. J. Adolesc. Adult Lit. 2014, 58, 111–121. [Google Scholar] [CrossRef]
  17. Nunes, A.; Cordeiro, C.; Limpo, T.; Castro, S.L. Effectiveness of AWE systems in school settings: A systematic review of studies from 2000 to 2020. J. Comput. Assist. Learn. 2022, 38, 599–620. [Google Scholar] [CrossRef]
  18. Parra, G.L.; Calero, S.X. AWE tools in the improvement of the writing skill. Int. J. Instr. 2019, 12, 209–226. [Google Scholar]
Table 1. Pre-test results on the writing performance of the two groups.

                     N    M        SD       SEM      t         p
Experimental group   30   58.433   32.724   5.974    −0.423    0.674
Control group        29   62.655   43.077   7.999
Note. p > 0.01. M = mean, SD = standard deviation, SEM = std. error mean.
Table 2. Results of the post-test on writing performance.

                     N    M        SD       SEM      t         p
Experimental group   30   82.800   11.124   2.031    3.840     0.001 ***
Control group        29   64.552   23.133   4.296
Note. *** p < 0.001. M = mean, SD = standard deviation, SEM = std. error mean.
Table 3. Pre- and post-test results of the experimental group.

            N    M        SD       SEM      t         p
Pre-test    30   58.433   32.724   5.975    −4.558    0.000 ***
Post-test   30   82.800   11.124   2.031
Note. *** p < 0.001. M = mean, SD = standard deviation, SEM = std. error mean.
Table 4. Pre-test and post-test results of the experimental group on writing errors and number of words.

                   Pre-Test M   Pre-Test SD   Post-Test M   Post-Test SD   t         p
Writing errors     22.70        20.012        9.67          6.645          −4.588    0.000 ***
Number of words    240.77       51.753        273.3         25.163                   0.001 ***
Note. *** p < 0.001. M = mean, SD = standard deviation.