Article

Enhanced Semantic BERT for Named Entity Recognition in Education

College of Computer and Artificial Intelligence, Nanjing University of Science and Technology Zijin College, Nanjing 210023, China
Electronics 2025, 14(19), 3951; https://doi.org/10.3390/electronics14193951
Submission received: 27 August 2025 / Revised: 3 October 2025 / Accepted: 4 October 2025 / Published: 7 October 2025

Abstract

To address the technical challenges of named entity recognition (NER) in the educational domain, such as ambiguous entity boundaries and difficulty identifying nested entities, this study proposes an enhanced semantic BERT model (ES-BERT). The model adopts an education-domain vocabulary-assisted semantic enhancement strategy that (1) applies the term frequency–inverse document frequency (TF-IDF) algorithm to weight domain-specific terms, and (2) fuses the weighted lexical information with character-level features, enabling BERT to generate enriched, domain-aware character–word hybrid representations. A complete bidirectional long short-term memory–conditional random field (BiLSTM-CRF) recognition framework was established, and a focal loss-based joint training method was introduced to optimize it. The experimental design employed a three-phase validation protocol: (1) in a comparative evaluation using 5-fold cross-validation on our proprietary computer-education dataset, ES-BERT achieved a precision of 90.38%, higher than all baseline models; (2) ablation studies confirmed the contribution of domain-vocabulary enhancement to this improvement; (3) cross-domain experiments on the 2016 knowledge base question answering and resume benchmark datasets yielded precisions of 98.41% and 96.75%, respectively, verifying the model's transfer-learning capability. These results substantiate that ES-BERT not only effectively resolves domain-specific NER challenges in education but also exhibits strong cross-domain adaptability.
Keywords: NER; BERT; semantic enhancement; BiLSTM; CRF; focal loss
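Two core ingredients named in the abstract, TF-IDF weighting of domain terms and focal loss, can be sketched as follows. This is a minimal illustration using a hypothetical toy corpus and a standalone scalar focal loss; the paper's actual dataset, character–word fusion layer, and hyperparameters are not shown here and are assumptions for illustration only.

```python
import math

# Hypothetical toy corpus of tokenized course descriptions; the paper's
# proprietary computer-education dataset is not public.
docs = [
    ["data", "structure", "stack", "queue"],
    ["operating", "system", "process", "thread"],
    ["data", "base", "relational", "model"],
]

def tf_idf(term, doc, corpus):
    """Classic TF-IDF: term frequency in `doc` times smoothed inverse
    document frequency over `corpus`."""
    tf = doc.count(term) / len(doc)
    df = sum(1 for d in corpus if term in d)          # documents containing the term
    idf = math.log(len(corpus) / (1 + df)) + 1.0      # smoothed IDF
    return tf * idf

# Weight each domain term appearing in the first document.
weights = {t: tf_idf(t, docs[0], docs) for t in set(docs[0])}

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Focal loss for one binary prediction p in (0, 1) with label y in {0, 1}.
    The (1 - pt)**gamma factor down-weights well-classified examples, so
    training focuses on hard (e.g. rare or boundary-ambiguous) entities."""
    pt = p if y == 1 else 1.0 - p
    a = alpha if y == 1 else 1.0 - alpha
    return -a * (1.0 - pt) ** gamma * math.log(pt)
```

A confident correct prediction (p = 0.9, y = 1) contributes far less loss than a badly misclassified one (p = 0.1, y = 1), which is the mechanism the joint training method exploits to counter class imbalance.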

Share and Cite

MDPI and ACS Style

Huang, P.; Zhu, H.; Wang, Y.; Dai, L.; Zheng, L. Enhanced Semantic BERT for Named Entity Recognition in Education. Electronics 2025, 14, 3951. https://doi.org/10.3390/electronics14193951


