Curriculum-Aware Cognitive Diagnosis via Graph Neural Networks
Abstract
1. Introduction
2. Related Works
2.1. Application Scenarios and Challenges
2.2. Mainstream Methodological Approaches
2.3. Closely Related Studies
2.4. Summary
3. Methodology
3.1. Problem Formulation
3.2. Overall Framework
- (1) Graph Representation Module: constructs concept and item embeddings by propagating information along the curriculum graph to capture inter-concept dependencies.
- (2) Knowledge-Prior Fusion Module: integrates learner–item interactions with the graph embeddings and aligns them with curriculum priors through constraint-based fusion.
- (3) Diagnostic Prediction Module: predicts learner performance while generating interpretable attention-based diagnostic reports. A minimal interface sketch of these three modules follows this list.
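To make the data flow between the three modules concrete, the following is a minimal interface sketch; the function names, signatures, and the specific propagation and fusion rules are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def graph_representation(H0: np.ndarray, A_hat: np.ndarray, T: int = 2) -> np.ndarray:
    """Propagate concept embeddings along a (normalized) curriculum adjacency for T steps."""
    H = H0
    for _ in range(T):
        H = np.tanh(A_hat @ H)           # placeholder propagation rule
    return H

def knowledge_prior_fusion(R_i: np.ndarray, E: np.ndarray, H: np.ndarray) -> np.ndarray:
    """Fuse one learner's responses R_i with item embeddings E into a concept-mastery vector."""
    z = (R_i @ E) @ H.T                  # responses (M,) x item embeddings (M, d) -> (d,) -> (K,)
    return 1.0 / (1.0 + np.exp(-z))      # squash to mastery probabilities

def diagnostic_prediction(z_i: np.ndarray, e_j: np.ndarray, H: np.ndarray) -> float:
    """Predict the probability that the learner answers item j correctly."""
    return float(1.0 / (1.0 + np.exp(-(z_i @ H) @ e_j)))
```

Here H0 is a (concepts × dim) embedding matrix, A_hat a normalized curriculum adjacency, E the (items × dim) item-embedding matrix, and R_i one learner's binary response vector.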
3.3. Module Descriptions
3.3.1. Graph Representation Module
3.3.2. Knowledge-Prior Fusion Module
3.3.3. Diagnostic Prediction Module
Algorithm 1: Curriculum-Aware Graph Neural Cognitive Diagnosis

```
Input:  interaction matrix R, Q-matrix Q, curriculum graph G
Output: predicted responses R̂, mastery profiles {z_i}

Initialize concept embeddings H(0)
for t = 1, …, T do
    H(t) ← GraphPropagation(H(t−1), G)
end for
for each item q_j do
    e_j ← AggregateConcepts(H(T), Q_j)
end for
for each learner u_i do
    z_i ← LearnerAggregation(R_i, {e_j})
    z_i ← ApplyCurriculumConstraints(z_i, G)
end for
for each interaction (i, j) do
    R̂_ij ← Prediction(z_i, e_j)
end for
Optimize parameters by minimizing the total loss
```
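For concreteness, below is a minimal runnable PyTorch sketch of Algorithm 1. The specific propagation, aggregation, constraint, and prediction functions, as well as the prerequisite penalty used as the prior constraint, are assumptions chosen for illustration; the paper's exact parameterization may differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CurriculumAwareGNCD(nn.Module):
    def __init__(self, n_learners: int, n_concepts: int, dim: int = 64, n_layers: int = 2):
        super().__init__()
        self.H0 = nn.Parameter(0.1 * torch.randn(n_concepts, dim))        # concept embeddings H(0)
        self.layers = nn.ModuleList([nn.Linear(dim, dim) for _ in range(n_layers)])
        self.learner_emb = nn.Embedding(n_learners, dim)                   # basis for mastery vectors z_i
        self.out = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, 1))

    def forward(self, A_hat, Q, learner_idx, item_idx):
        # Graph Representation Module: H(t) <- GraphPropagation(H(t-1), G)
        H = self.H0
        for layer in self.layers:
            H = torch.relu(layer(A_hat @ H))
        # Item embeddings: mean of the embeddings of the concepts each item requires.
        E = (Q @ H) / Q.sum(dim=1, keepdim=True).clamp(min=1.0)            # e_j = AggregateConcepts(H(T), Q_j)
        # Knowledge-Prior Fusion Module (simplified): concept-mastery probabilities z_i.
        z = torch.sigmoid(self.learner_emb(learner_idx) @ H.T)             # (batch, n_concepts)
        # Diagnostic Prediction Module: R̂_ij = Prediction(z_i, e_j)
        zi_proj = z @ H                                                    # project mastery back to embedding space
        logits = self.out(torch.cat([zi_proj, E[item_idx]], dim=-1)).squeeze(-1)
        return torch.sigmoid(logits), z

def total_loss(pred, target, z, A, lam_prior: float = 0.1):
    """BCE prediction loss plus a hypothetical prerequisite penalty: a concept's mastery
    should not exceed that of its prerequisites (A[p, c] = 1 means p is a prerequisite of c)."""
    l_pred = F.binary_cross_entropy(pred, target)
    prereq, concept = A.nonzero(as_tuple=True)
    l_prior = F.relu(z[:, concept] - z[:, prereq]).mean()
    return l_pred + lam_prior * l_prior
```

Here A_hat is a normalized curriculum adjacency matrix, Q a float Q-matrix, and learner_idx/item_idx index a batch of interactions; a training loop would call total_loss on the model outputs and backpropagate.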
3.4. Objective Function and Optimization
4. Experiment and Results
4.1. Experimental Setup
4.2. Baselines
4.3. Quantitative Results
4.4. Qualitative Results
4.5. Robustness
4.6. Ablation Study
4.7. Experimental Scene Illustration
5. Discussion
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A. Objective Function Details
References

| Symbol | Definition |
|---|---|
| {u_i} | Set of learners, indexed by i. |
| {q_j} | Set of items (questions or exercises), indexed by j. |
| {c_k} | Set of knowledge concepts in the curriculum, indexed by k. |
| R_ij | Learner–item response (1 if correct, 0 otherwise). |
| Q_jk | Q-matrix entry: 1 if item q_j requires concept c_k. |
| G | Curriculum graph with nodes as concepts and edges as prerequisite relations. |
| A | Adjacency matrix of the curriculum graph. |
| h_k(t) | Embedding of concept c_k at propagation step t. |
| z_i | Mastery vector of learner u_i, representing concept mastery probabilities. |
| e_j | Embedding vector of item q_j. |
| λ_1, λ_2 | Regularization coefficients for the prior and generalization constraints. |
| R̂_ij | Predicted probability that learner u_i answers item q_j correctly. |
| L_pred, L_prior, L_total | Prediction loss, prior constraint loss, and total objective. |
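The loss symbols above suggest that the overall objective combines the prediction loss with the prior-constraint and generalization terms; a hedged reconstruction is sketched below (the cross-entropy form of L_pred and the weighted-sum combination are assumptions; the exact objective is given in Appendix A).

```latex
\mathcal{L}_{\text{total}}
  = \underbrace{-\sum_{(i,j)} \Big[\, R_{ij}\log \hat{R}_{ij}
      + (1 - R_{ij})\log\big(1 - \hat{R}_{ij}\big) \Big]}_{\mathcal{L}_{\text{pred}}}
  \;+\; \lambda_{1}\,\mathcal{L}_{\text{prior}}
  \;+\; \lambda_{2}\,\lVert \Theta \rVert_{2}^{2}
```

where Θ denotes the model parameters and the λ_2 term stands in for the generalization constraint.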
| Dataset | Learners | Items | Concepts | Interactions | Sparsity (%) |
|---|---|---|---|---|---|
| ASSISTments2017 | 4162 | 17,740 | 123 | 942,812 | 98.7 |
| EdNet-KT1 | 7453 | 23,409 | 138 | 1,482,061 | 99.2 |
| Eedi | 9827 | 27,361 | 152 | 2,035,678 | 99.3 |
| Component | Specification |
|---|---|
| CPU | Intel Xeon Gold 6338 (2.0 GHz, 32 cores) |
| GPU | NVIDIA A100 80 GB (×4) |
| RAM | 512 GB DDR4 |
| Storage | 10 TB NVMe SSD |
| Software | PyTorch 2.1, CUDA 12.0, cuDNN 9.0 |
| Metric | Definition |
|---|---|
| ACC | Fraction of correctly predicted responses, reflecting overall prediction accuracy. |
| AUC | Area under the ROC curve, measuring the model’s discrimination ability between correct and incorrect responses. |
| F1 | Harmonic mean of precision and recall, balancing false positive and false negative predictions. |
| ECE (Calibration Error) | Expected Calibration Error computed over 10 equal-frequency bins, measuring the average absolute difference between predicted probabilities and empirical accuracies. Lower ECE indicates better probability calibration. |
| Align% | Percentage of attention weights that align with expert-defined prerequisite edges, indicating the consistency between model attention and curriculum structure. |
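As a reference for the ECE definition above, here is a minimal sketch using 10 equal-frequency bins; binning and tie-handling details are assumptions.

```python
import numpy as np

def expected_calibration_error(probs: np.ndarray, labels: np.ndarray, n_bins: int = 10) -> float:
    """ECE over equal-frequency bins: size-weighted mean of |mean confidence - empirical accuracy|."""
    order = np.argsort(probs)              # sort predictions by confidence
    bins = np.array_split(order, n_bins)   # 10 bins of (nearly) equal size
    n = len(probs)
    ece = 0.0
    for idx in bins:
        if len(idx) == 0:
            continue
        conf = probs[idx].mean()           # average predicted probability in the bin
        acc = labels[idx].mean()           # empirical accuracy in the bin
        ece += (len(idx) / n) * abs(conf - acc)
    return ece

# Example: expected_calibration_error(np.array([0.9, 0.2, 0.7]), np.array([1, 0, 1]))
```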
| Model | ASSISTments2017 (AUC) | EdNet-KT1 (AUC) | Eedi (AUC) | Avg. ACC (%) | Avg. F1 |
|---|---|---|---|---|---|
| BKT | 0.672 | 0.688 | 0.693 | 70.1 | 0.68 |
| IRT | 0.701 | 0.719 | 0.721 | 72.4 | 0.70 |
| DKT | 0.749 | 0.763 | 0.768 | 75.3 | 0.74 |
| AKT | 0.777 | 0.781 | 0.786 | 77.8 | 0.76 |
| NCDM | 0.803 | 0.811 | 0.816 | 80.9 | 0.79 |
| GCD | 0.825 | 0.831 | 0.838 | 82.4 | 0.81 |
| CA-GNCD | 0.873 | 0.881 | 0.887 | 86.2 | 0.85 |
| Variant | AUC |
|---|---|
| Full CA-GNCD | 0.873 |
|  | 0.829 |
|  | 0.842 |
|  | 0.851 |
