Muhetaer, M.; Meng, X.; Zhu, J.; Aikebaier, A.; Zu, L.; Bai, Y. Symmetry and Asymmetry in Pre-Trained Transformer Models: A Comparative Study of TinyBERT, BERT, and RoBERTa for Chinese Educational Text Classification. Symmetry 2025, 17, 1812. https://doi.org/10.3390/sym17111812