Open Access Article
Enhancing Cross-Prompt Essay Trait Scoring via External Knowledge Similarity Transfer
by Tianbao Song 1, Jingbo Sun 2,3,* and Weiming Peng 4,5
1 School of Computer and Artificial Intelligence, Beijing Technology and Business University, Beijing 100048, China
2 School of Information Engineering, Minzu University of China, Beijing 100081, China
3 National Language Resource Monitoring and Research Center of Minority Language, Minzu University of China, Beijing 100081, China
4 Chinese Character Research and Application Laboratory, Beijing Normal University, Beijing 100875, China
5 Linguistic Data Consortium, University of Pennsylvania, Philadelphia, PA 19104, USA
* Author to whom correspondence should be addressed.
Symmetry 2025, 17(5), 739; https://doi.org/10.3390/sym17050739
Submission received: 18 April 2025 / Revised: 1 May 2025 / Accepted: 9 May 2025 / Published: 11 May 2025
(This article belongs to the Section Computer)
Abstract
Cross-prompt automated essay scoring (AES) is challenging because essays differ substantially across prompts, and recent research has moved beyond the overall score to evaluating distinct essay traits. The primary approaches improve cross-prompt AES either by learning a better shared representation or by transferring common knowledge between source and target prompts. However, existing studies concentrate only on transferring shared features within the essay representation itself, neglecting external knowledge, and measuring the degree of commonality across samples remains difficult. Indeed, higher similarity of external knowledge also yields a better shared essay representation. Motivated by this observation, we introduce extra-essay knowledge similarity transfer to assess sample commonality. In addition, prior work pays insufficient attention to the intrinsic meaning of the traits being evaluated and their varied impact on the model, so we incorporate extra-essay knowledge representation to deepen the model's understanding of both the essay under evaluation and the target of the scoring task. Experimental results show that our approach outperforms baseline models on the ASAP++ dataset, confirming its effectiveness.
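To make the idea of similarity-based transfer concrete, the following is a minimal sketch, not the authors' actual model: assuming each essay has an external-knowledge embedding (e.g., produced by some pretrained encoder, which is a hypothetical choice here), source samples can be weighted by their cosine similarity to the target-prompt knowledge centroid, so that samples whose external knowledge better matches the target prompt contribute more during training.

```python
import numpy as np

def transfer_weights(source_knowledge: np.ndarray,
                     target_knowledge: np.ndarray,
                     temperature: float = 1.0) -> np.ndarray:
    """Weight source samples by external-knowledge similarity to the target prompt.

    source_knowledge: (n_source, d) knowledge embeddings of source-prompt essays.
    target_knowledge: (n_target, d) knowledge embeddings of target-prompt essays.
    Returns a (n_source,) weight vector that sums to 1.
    """
    # L2-normalize embeddings so dot products become cosine similarities.
    s = source_knowledge / np.linalg.norm(source_knowledge, axis=1, keepdims=True)
    t = target_knowledge / np.linalg.norm(target_knowledge, axis=1, keepdims=True)

    # Similarity of each source sample to the target-prompt centroid.
    centroid = t.mean(axis=0)
    centroid /= np.linalg.norm(centroid)
    sims = s @ centroid  # shape (n_source,)

    # A temperature-scaled softmax turns similarities into transfer weights.
    z = np.exp(sims / temperature)
    return z / z.sum()
```

In such a scheme, the temperature controls how sharply training concentrates on the most target-like source samples; how the original paper actually combines these similarities with trait-specific scoring heads is described in the full text.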