Creativity in Learning Analytics: A Systematic Literature Review
Abstract
1. Introduction
2. Materials and Methods
2.1. Project Team, Research Questions, and PICO Framework
- How are LA tools applied or developed to assess or foster creativity in educational settings?
- What theoretical frameworks and methodologies are used when applying LA tools to study creativity in education?
- What are the key challenges and limitations in integrating LA to support creativity in education?
- What gaps exist in the current literature, and what future research directions can be identified?
- Population (P): Learners in educational institutions, spanning both formal and informal learning settings.
- Intervention (I): Integration of LA to study, foster, or assess creativity.
- Comparison (C): Traditional educational methods or curricula without LA integration.
- Outcome (O): Insights into how LA supports creativity, including processes, outcomes, and personalized feedback.
2.2. Eligibility Criteria and Selection Process
2.3. Data Extraction and Management, Quality Assessment, Data Synthesis, and Reporting
3. Results
3.1. Research Approach and Methodology
3.2. Theoretical Framework
3.3. LA Tools and Techniques
3.4. Creativity Assessment Framework and Metrics
3.5. Findings from the Studies
- Enhanced Predictive Analytics: Predictive modeling techniques—such as machine learning, clustering, and regression—were used to identify at-risk students, model learner profiles, and optimize personalized support (Fontana et al. 2021; Peña-Ayala et al. 2017). Tools like SVM and Bayesian Knowledge Tracing (BKT) offered validated mechanisms for profiling and prediction (Ifenthaler and Widanapathirana 2014; Yan et al. 2021).
- Collaboration and Teamwork Dynamics: Several studies employed multimodal LA (MMLA) techniques such as statistical discourse analysis, gaze tracking, facial expression recognition, and peer ratings to understand group interaction patterns and enhance collaborative competencies (Koh et al. 2016; Moon et al. 2024). Creativity in group contexts was further examined using frameworks like the Assessment Scale for Creative Collaboration (Mavri et al. 2020).
- Technological Integration and Personalized Learning: Adaptive dashboards and personalized analytics interventions—such as BookRoll and face-tracking systems—helped deliver real-time feedback and increased behavioral engagement (Yang and Ogata 2023; Moon et al. 2024). Intelligent systems that tailored content based on learning profiles were positively associated with creative thinking (Wang et al. 2023).
- Learning Design and Visualization: Dashboards and data visualization tools like radar charts, scatterplots, and heatmaps were commonly used to facilitate metacognition and learner reflection (Charleer et al. 2016; Hernández-García et al. 2016). Analytics-enabled platforms like StepUp! supported self-regulated learning through time tracking and artefact production (Santos et al. 2013).
- Creativity Metrics and Assessment Approaches: Emerging approaches to assess creativity include:
- Product-based creativity metrics: Tools such as the Test of Creative Thinking–Drawing Production (TCT-DP) were used to automatically assess fluency, elaboration, and originality in outputs (Cropley et al. 2024).
- Behavioral and system-logged metrics: Query diversity (Olivares-Rodríguez et al. 2017), Scratch-based code complexity (Kovalkov et al. 2021), and biometric responses (Bender and Sung 2020) were explored as proxies for creative performance.
- Framework-based surveys and scales: Studies employed structured creativity assessment frameworks across domains like STEAM, engineering, and narrative writing (Bolden et al. 2020; Akdemir-Beveridge et al. 2025).
- AI-Supported Creativity Assessment: New research demonstrates how generative AI models, Natural Language Processing (NLP), and automatic scoring can assess creative ideas and processes across interventions (Hadas and Hershkovitz 2025; Marrone and Cropley 2022). These methods offer scalability but still require careful validation to ensure construct validity.
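The Bayesian Knowledge Tracing (BKT) approach mentioned under Enhanced Predictive Analytics can be illustrated with a minimal sketch of the standard BKT update rule. The parameter values below (prior mastery, learn, slip, and guess rates) are illustrative defaults, not figures from any of the cited studies.

```python
# Minimal Bayesian Knowledge Tracing (BKT) update: given one observed response,
# condition the mastery probability on the evidence, then apply the learning
# transition. Parameter values are illustrative, not drawn from the cited studies.

def bkt_update(p_mastery, correct, p_learn=0.1, p_slip=0.1, p_guess=0.2):
    """Return the updated probability that a skill is mastered after one response."""
    if correct:
        evidence = p_mastery * (1 - p_slip)
        cond = evidence / (evidence + (1 - p_mastery) * p_guess)
    else:
        evidence = p_mastery * p_slip
        cond = evidence / (evidence + (1 - p_mastery) * (1 - p_guess))
    # After conditioning on the observation, apply the learning transition.
    return cond + (1 - cond) * p_learn

# Trace mastery across a short response sequence (True = correct answer).
p = 0.3  # prior probability of mastery
for answer in [True, True, False, True]:
    p = bkt_update(p, answer)
```

Profiling tools of this kind underpin the at-risk identification and personalized-support use cases the reviewed studies report: a correct answer raises the mastery estimate, an incorrect one lowers it, and the learning-transition term lets mastery grow over time.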
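The behavioral proxy of query diversity (Olivares-Rodríguez et al. 2017) can likewise be sketched. The snippet below uses Shannon entropy over a learner's search terms as one plausible operationalization; this is an assumption for illustration, and the metric in the original study may be defined differently.

```python
# Illustrative "query diversity" proxy: Shannon entropy (in bits) over the
# distribution of search terms a learner issues. This operationalization is an
# assumption for illustration; the cited study's exact metric may differ.
import math
from collections import Counter

def query_diversity(queries):
    """Shannon entropy of the term distribution across a learner's queries."""
    terms = [term for query in queries for term in query.lower().split()]
    if not terms:
        return 0.0
    counts = Counter(terms)
    total = len(terms)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

narrow = query_diversity(["solar", "solar"])                 # 0.0: one repeated term
broad = query_diversity(["solar power", "wind turbines"])    # 2.0: four distinct terms
```

Under this reading, a learner who reformulates searches with varied vocabulary scores higher than one who repeats the same query, which is the intuition behind treating query behavior as a proxy for exploratory, creative problem solving.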
Challenges
- Definitional Ambiguity: Variability in the conceptualization of creativity (e.g., product- vs. process-oriented; domain-general vs. domain-specific) remains a major obstacle to standardized measurement (Henriksen et al. 2021; Bolden et al. 2020).
- Methodological Limitations: Many instruments lack robust validation across different educational contexts or fail to accommodate domain-specific demands. The scalability of creativity assessments, especially in online or automated environments, remains limited (Kovalkov et al. 2021; Wang et al. 2023).
- Ethical and Technological Barriers: Real-time monitoring through LA dashboards and biometric sensors raises ethical concerns regarding student consent, data privacy, and algorithmic bias (Hernández-García et al. 2016; Hadas and Hershkovitz 2025).
- Over-reliance on Self-report Instruments: While tools such as the Engineering Creativity Assessment Tool (Akdemir-Beveridge et al. 2025) and STEAM-based creativity scales (Yulianti et al. 2024) are widely used, they depend on subjective measures that may not fully capture creativity in action.
4. Discussion, Future Directions, and Conclusion
4.1. Self-Efficacy and Learning Analytics
4.2. Methodological Diversity and Innovations
4.3. Creativity in Learning Analytics
4.4. Evidence-Based Decision-Making
4.5. Conclusion and Future Directions
4.5.1. Standardizing Creativity Metrics
4.5.2. Expanding Multimodal and AI-Supported Analytics
4.5.3. Addressing Ethical and Privacy Concerns
4.5.4. Enhancing Scalability and Validation
4.5.5. Integrating Creativity Explicitly into Learning Analytics
4.5.6. Bolstering Self-Efficacy Through Targeted Interventions
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Correction Statement
Appendix A
| Population Terms | Exposure Terms 1 | Exposure Terms 2 | Exposure Terms 3 |
|---|---|---|---|
| “distance education” OR “digital education*” OR “online learning*” OR “virtual class*” | “learning analytic*” OR “learning metric*” OR “data analysis” | creativ* OR “creative thinking” OR “creative pedagogy” OR pedagog* OR innovativ* | “student engagement” OR “educational assessment” OR “academic performance” OR “learning outcomes” |
| Database | Syntax | Limits |
|---|---|---|
| Web of Science | TS = (“distance education” OR “digital education*” OR “online learning*” OR “virtual class*”) AND TS = (“learning analytic*” OR “learning metric*” OR “data analysis”) AND TS = (creativ* OR “creative thinking” OR “creative pedagogy” OR pedagog* OR innovativ*) AND TS = (“student engagement” OR “educational assessment” OR “academic performance” OR “learning outcomes”) | 1 September 2012 onwards English language |
References
- Akdemir-Beveridge, Zeynep G., Arash Zaghi, and Connie Syharat. 2025. Understanding and Evaluating Engineering Creativity: Development and Validation of the Engineering Creativity Assessment Tool (ECAT). arXiv arXiv:2504.12481. [Google Scholar] [CrossRef]
- Alexandron, Giora, Lisa Y. Yoo, José A. Ruipérez-Valiente, Sunbok Lee, and David E. Pritchard. 2019. Are MOOC Learning Analytics Results Trustworthy? With Fake Learners, They Might Not Be! International Journal of Artificial Intelligence in Education 29: 484–506. [Google Scholar] [CrossRef]
- Bandura, Albert, W. H. Freeman, and Richard Lightsey. 1999. Self-Efficacy: The Exercise of Control. Journal of Cognitive Psychotherapy 13: 158–66. [Google Scholar] [CrossRef]
- Bender, Stuart, and Billy Sung. 2020. Data-Driven Creativity for Screen Production Students: Developing and Testing Learning Materials Involving Audience Biometrics. Digital Creativity 31: 98–113. [Google Scholar] [CrossRef]
- Bolden, Benjamin, Christopher DeLuca, Tiina Kukkonen, Suparna Roy, and Judy Wearing. 2020. Assessment of Creativity in K-12 Education: A Scoping Review. Review of Education 8: 343–76. [Google Scholar] [CrossRef]
- Bown, Matt J., and Alex J. Sutton. 2010. Quality Control in Systematic Reviews and Meta-Analyses. European Journal of Vascular and Endovascular Surgery 40: 669–77. [Google Scholar] [CrossRef]
- Bulut, Okan, Güher Gorgun, Seyma N. Yildirim-Erbasli, Tarid Wongvorachan, Lia M. Daniels, Yizhu Gao, Ka Wing Lai, and Jinnie Shin. 2023. Standing on the Shoulders of Giants: Online Formative Assessments as the Foundation for Predictive Learning Analytics Models. British Journal of Educational Technology 54: 19–39. [Google Scholar] [CrossRef]
- Charleer, Sven, Joris Klerkx, Erik Duval, Tinne De Laet, and Katrien Verbert. 2016. Creating Effective Learning Analytics Dashboards: Lessons Learnt. In Adaptive and Adaptable Learning. Cham: Springer International Publishing, pp. 42–56. [Google Scholar] [CrossRef]
- Chou, Elijah, Davide Fossati, and Arnon Hershkovitz. 2024. A Code Distance Approach to Measure Originality in Computer Programming. Paper presented at 16th International Conference on Computer Supported Education, CSEDU, Angers, France, May 2–4; Setúbal: SciTePress, pp. 541–48. [Google Scholar] [CrossRef]
- Constapel, Manfred, Dorian Doberstein, H. Ulrich Hoppe, and Horst Hellbrück. 2019. IKARion: Enhancing a Learning Platform with Intelligent Feedback to Improve Team Collaboration and Interaction in Small Groups. Paper presented at 2019 18th International Conference on Information Technology Based Higher Education and Training (ITHET), Magdeburg, Germany, September 26–27; pp. 1–10. [Google Scholar] [CrossRef]
- Cropley, David H., Caroline Theurer, A. C. Sven Mathijssen, and Rebecca L. Marrone. 2024. Fit-for-Purpose Creativity Assessment: Automatic Scoring of the Test of Creative Thinking–Drawing Production (TCT-DP). Creativity Research Journal 37: 539–54. [Google Scholar] [CrossRef]
- da Cruz Alves, Nathalia, Christiane Gresse von Wangenheim, and Lúcia Helena Martins-Pacheco. 2021. Assessing Product Creativity in Computing Education: A Systematic Mapping Study. Informatics in Education 20: 19–45. [Google Scholar] [CrossRef]
- De Lorenzo, Aurelia, Alessandro Nasso, Viviana Bono, and Emanuela Rabaglietti. 2023. Introducing TCD-D for Creativity Assessment: A Mobile App for Educational Contexts. International Journal of Modern Education and Computer Science 1: 13–27. [Google Scholar] [CrossRef]
- Denson, Cameron D., Jennifer K. Buelin, Matthew D. Lammi, and Susan D’Amico. 2015. Developing Instrumentation for Assessing Creativity in Engineering Design. Journal of Technology Education 27: 23–40. [Google Scholar] [CrossRef]
- El Alfy, Shahira, and Mounir Kehal. 2024. Investigating the Factors Affecting Educators’ Adoption of Learning Analytics Using the UTAUT Model. International Journal of Information and Learning Technology 41: 280–303. [Google Scholar] [CrossRef]
- Elicit. 2023. Elicit: The AI Research Assistant. Available online: https://elicit.com (accessed on 1 January 2025).
- Ferguson, Rebecca. 2012. Learning Analytics: Drivers, Developments and Challenges. International Journal of Technology Enhanced Learning 4: 304–17. [Google Scholar] [CrossRef]
- Fontana, Luca, Chiara Masci, Francesca Ieva, and Anna Maria Paganoni. 2021. Performing Learning Analytics via Generalised Mixed-Effects Trees. Data 6: 74. [Google Scholar] [CrossRef]
- Gough, David, James Thomas, and Sandy Oliver. 2017. An Introduction to Systematic Reviews, 2nd ed. London: Sage. [Google Scholar]
- Hadas, Eran, and Arnon Hershkovitz. 2025. Assessing Creativity across Multi-Step Intervention Using Generative AI Models. Journal of Learning Analytics 12: 91–109. [Google Scholar] [CrossRef]
- Hasan, Bashar, Samer Saadi, Noora S. Rajjoub, Moustafa Hegazi, Mohammad Al-Kordi, Farah Fleti, Magdoleen Farah, Irbaz B. Riaz, Imon Banerjee, Zhen Wang, and et al. 2024. Integrating Large Language Models in Systematic Reviews: A Framework and Case Study Using ROBINS-I for Risk of Bias Assessment. BMJ Evidence-Based Medicine 29: 394–98. [Google Scholar] [CrossRef] [PubMed]
- Henriksen, Danah, Edwin Creely, Michael Henderson, and Punya Mishra. 2021. Creativity and Technology in Teaching and Learning: A Literature Review of the Uneasy Space of Implementation. Educational Technology Research and Development 69: 2091–108. [Google Scholar] [CrossRef]
- Hernández-García, Ángel, and Miguel Ángel Conde. 2014. Dealing with Complexity: Educational Data and Tools for Learning Analytics. Paper presented at Second International Conference on Technological Ecosystems for Enhancing Multiculturality, Salamanca, Spain, October 1–3; New York: Association for Computing Machinery, pp. 263–68. [Google Scholar] [CrossRef]
- Hernández-García, Ángel, Inés González-González, Ana Isabel Jiménez-Zarco, and Julián Chaparro-Peláez. 2016. Visualizations of Online Course Interactions for Social Network Learning Analytics. International Journal of Emerging Technologies in Learning 11: 6–15. [Google Scholar] [CrossRef]
- Hershkovitz, Arnon, Raquel Sitman, Rotem Israel-Fishelson, Andoni Eguíluz, Pablo Garaizar, and Mariluz Guenaga. 2019. Creativity in the Acquisition of Computational Thinking. Interactive Learning Environments 27: 628–44. [Google Scholar] [CrossRef]
- Ifenthaler, Dirk, and Clara Widanapathirana. 2014. Development and Validation of a Learning Analytics Framework: Two Case Studies Using Support Vector Machines. Technology, Knowledge and Learning 19: 221–40. [Google Scholar] [CrossRef]
- Israel-Fishelson, Rotem, and Arnon Hershkovitz. 2024. Log-Based Analysis of Creativity in the Context of Computational Thinking. Education Sciences 15: 3. [Google Scholar] [CrossRef]
- Kaliisa, Rogers, Anders I. Morch, and Anders Kluge. 2019. Exploring Social Learning Analytics to Support Teaching and Learning Decisions in Online Learning Environments. In Transforming Learning with Meaningful Technologies. Cham: Springer, pp. 209–23. [Google Scholar] [CrossRef]
- Karaoglan Yilmaz, Fatma Gizem. 2022. Utilizing Learning Analytics to Support Students’ Academic Self-Efficacy and Problem-Solving Skills. Asia-Pacific Education Researcher 31: 175–91. [Google Scholar] [CrossRef]
- Klašnja-Milićević, Aleksandra, Mirjana Ivanović, and Bojana Stantić. 2020. Designing Personalized Learning Environments—The Role of Learning Analytics. Vietnam Journal of Computer Science 7: 231–50. [Google Scholar] [CrossRef]
- Koh, Elizabeth, Antonette Shibani, Jennifer Pei-Ling Tan, and Helen Hong. 2016. A Pedagogical Framework for Learning Analytics in Collaborative Inquiry Tasks: An Example from a Teamwork Competency Awareness Program. Paper presented at Sixth International Conference on Learning Analytics & Knowledge, Edinburgh, UK, April 25–29; New York: Association for Computing Machinery, pp. 74–83. [Google Scholar] [CrossRef]
- Kovalkov, Anastasia, Avi Segal, and Kobi Gal. 2020. In the Eye of the Beholder? Detecting Creativity in Visual Programming Environments. arXiv arXiv:2004.05878. [Google Scholar] [CrossRef]
- Kovalkov, Anastasia, Benjamin Paaßen, Avi Segal, Niels Pinkwart, and Kobi Gal. 2021. Automatic Creativity Measurement in Scratch Programs across Modalities. IEEE Transactions on Learning Technologies 14: 740–53. [Google Scholar] [CrossRef]
- Kupers, Elisa, Marijn van Dijk, and Andreas Lehmann-Wermser. 2018. Creativity in the Here and Now: A Generic, Micro-Developmental Measure of Creativity. Frontiers in Psychology 9: 2095. [Google Scholar] [CrossRef]
- Lahza, Hatim, Hassan Khosravi, and Gianluca Demartini. 2023. Analytics of Learning Tactics and Strategies in an Online Learnersourcing Environment. Journal of Computer Assisted Learning 39: 94–112. [Google Scholar] [CrossRef]
- Li, Yun, Mirim Kim, and Jayant Palkar. 2022. Using Emerging Technologies to Promote Creativity in Education: A Systematic Review. International Journal of Educational Research Open 3: 100177. [Google Scholar] [CrossRef]
- Liñán, Laura Calvet, and Ángel Alejandro Juan Pérez. 2015. Educational Data Mining and Learning Analytics: Differences, Similarities, and Time Evolution. International Journal of Educational Technology in Higher Education 12: 98–112. [Google Scholar] [CrossRef]
- Long, Haiying, Barbara A. Kerr, Trina E. Emler, and Max Birdnow. 2022. A Critical Review of Assessments of Creativity in Education. Review of Research in Education 46: 288–323. [Google Scholar] [CrossRef]
- Long, Hannah A., David P. French, and Joanna M. Brooks. 2020. Optimising the Value of the Critical Appraisal Skills Programme (CASP) Tool for Quality Appraisal in Qualitative Evidence Synthesis. Research Methods in Medicine & Health Sciences 1: 31–42. [Google Scholar] [CrossRef]
- Marrone, Rebecca L., and David H. Cropley. 2022. The Role of Learning Analytics in Developing Creativity. In Social and Emotional Learning and Complex Skills Assessment: An Inclusive Learning Analytics Perspective. Cham: Springer, pp. 75–91. [Google Scholar] [CrossRef]
- Mavri, Aekaterini, Andri Ioannou, and Fernando Loizides. 2020. The Assessment Scale for Creative Collaboration (ASCC) Validation and Reliability Study. International Journal of Human–Computer Interaction 36: 1056–69. [Google Scholar] [CrossRef]
- McKenna, H. Patricia, Marilyn P. Arnone, Michelle L. Kaarst-Brown, Lee W. McKnight, and Sarah A. Chauncey. 2013. Application of the Consensual Assessment Technique in 21st Century Technology-Pervasive Learning Environments. Paper presented at 6th International Conference of Education, Research and Innovation (iCERi2013), Seville, Spain, November 18–20; pp. 6410–19. [Google Scholar]
- Moher, David, Alessandro Liberati, Jennifer Tetzlaff, Douglas G. Altman, and the PRISMA Group. 2009. Preferred Reporting Items for Systematic Reviews and Meta-Analyses: The PRISMA Statement. Annals of Internal Medicine 151: 264–69. [Google Scholar] [CrossRef] [PubMed]
- Moon, Jewoong, Sheunghyun Yeo, Seyyed Kazem Banihashem, and Omid Noroozi. 2024. Using Multimodal Learning Analytics as a Formative Assessment Tool: Exploring Collaborative Dynamics in Mathematics Teacher Education. Journal of Computer Assisted Learning 40: 2753–71. [Google Scholar] [CrossRef]
- Munn, Zachary, Micah D. J. Peters, Cindy Stern, Catalin Tufanaru, Alexa McArthur, and Edoardo Aromataris. 2018. Systematic Review or Scoping Review? Guidance for Authors When Choosing between a Systematic or Scoping Review Approach. BMC Medical Research Methodology 18: 143. [Google Scholar] [CrossRef]
- Noorkholisoh, Lulu, Yusi Riksa Yustiana, Nandang Budiman, and Dodi Suryana. 2024. Validity and Reliability Analysis Using the Rasch Model in Developing Creativity Tests Instruments for Elementary School Students. Jurnal Ilmiah Bimbingan Konseling Undiksha 15: 128–35. [Google Scholar] [CrossRef]
- Ochoa, Xavier, Arnon Hershkovitz, Alyssa Wise, and Simon Knight. 2017. Towards a Convergent Development of Learning Analytics. Journal of Learning Analytics 4: 1–6. [Google Scholar] [CrossRef]
- OECD. 2018. The Future of Education and Skills: Education 2030. Paris: OECD Publishing. Available online: https://www.oecd.org/education/2030-project/ (accessed on 20 February 2025).
- Olivares-Rodríguez, Cristian, Mariluz Guenaga, and Pablo Garaizar. 2017. Automatic Assessment of Creativity in Heuristic Problem Solving Based on Query Diversity. Dyna 92: 449–55. [Google Scholar]
- Page, Matthew J., Joanne E. McKenzie, Patrick M. Bossuyt, Isabelle Boutron, Tammy C. Hoffmann, Cynthia D. Mulrow, Larissa Shamseer, Jennifer M. Tetzlaff, Elie A. Akl, and Sue E. Brennan. 2021. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ 372: n71. [Google Scholar] [CrossRef]
- Park, Sunyoung, and Nam Hui Kim. 2022. University Students’ Self-Regulation, Engagement and Performance in Flipped Learning. European Journal of Training and Development 46: 22–40. [Google Scholar] [CrossRef]
- Peña-Ayala, Alejandro. 2014. Educational Data Mining: A Survey and a Data Mining-Based Analysis of Recent Works. Expert Systems with Applications 41: 1432–62. [Google Scholar] [CrossRef]
- Peña-Ayala, Alejandro, Leonor Adriana Cárdenas-Robledo, and Humberto Sossa. 2017. A Landscape of Learning Analytics: An Exercise to Highlight the Nature of an Emergent Field. In Learning Analytics: Fundaments, Applications, and Trends: A View of the Current State of the Art to Enhance E-Learning. Cham: Springer International Publishing, pp. 65–112. [Google Scholar] [CrossRef]
- Plucker, Jonathan A. 1999. Is the Proof in the Pudding? Reanalyses of Torrance’s (1958 to Present) Longitudinal Data. Creativity Research Journal 12: 103–14. [Google Scholar] [CrossRef]
- Ryu, Suna, Dagun Lee, and Beomjun Han. 2024. Potential for Game-Based Assessment of Creativity Using Biometric and Real-Time Data. Brain, Digital, & Learning 14: 141–65. [Google Scholar] [CrossRef]
- Saleeb, Noha. 2021. Closing the Chasm between Virtual and Physical Delivery for Innovative Learning Spaces Using Learning Analytics. International Journal of Information and Learning Technology 38: 209–29. [Google Scholar] [CrossRef]
- Santos, José Luis, Katrien Verbert, Sten Govaerts, and Erik Duval. 2013. Addressing Learner Issues with StepUp!: An Evaluation. Paper presented at Third International Conference on Learning Analytics and Knowledge, Leuven, Belgium, April 8–12; New York: Association for Computing Machinery, pp. 14–22. [Google Scholar] [CrossRef]
- Siemens, George, and Ryan S. J. D. Baker. 2012. Learning Analytics and Educational Data Mining: Towards Communication and Collaboration. Paper presented at 2nd International Conference on Learning Analytics and Knowledge, Vancouver, BC, Canada, April 29–May 2; New York: Association for Computing Machinery, pp. 252–54. [Google Scholar] [CrossRef]
- Smyrnaiou, Zacharoula, Eleni Georgakopoulou, and Sofoklis Sotiriou. 2020. Promoting a Mixed-Design Model of Scientific Creativity through Digital Storytelling—The CCQ Model for Creativity. International Journal of STEM Education 7: 25. [Google Scholar] [CrossRef]
- Spikol, Daniel, Emanuele Ruffaldi, Giacomo Dabisias, and Mutlu Cukurova. 2018. Supervised Machine Learning in Multimodal Learning Analytics for Estimating Success in Project-Based Learning. Journal of Computer Assisted Learning 34: 366–77. [Google Scholar] [CrossRef]
- Tan, Jennifer Pei-Ling, Imelda Santos Caleon, Christin Rekha Jonathan, and Elizabeth Koh. 2014. A Dialogic Framework for Assessing Collective Creativity in Computer-Supported Collaborative Problem-Solving Tasks. Research and Practice in Technology Enhanced Learning 9: 411–37. [Google Scholar] [CrossRef]
- The EndNote Team. 2013. EndNote (Version X7). Clarivate. Available online: https://endnote.com (accessed on 10 January 2025).
- Venckutė, Milda, Iselin Berg Mulvik, and Bill Lucas. 2020. Creativity—A transversal skill for lifelong learning. An overview of existing concepts and practices. Final Report. Research Papers in Economics. [Google Scholar] [CrossRef]
- Veritas Health Innovation. 2024. Covidence Systematic Review Software. Available online: www.covidence.org (accessed on 12 March 2025).
- Wang, Shaofeng, Zhuo Sun, and Ying Chen. 2023. Effects of Higher Education Institutes’ Artificial Intelligence Capability on Students’ Self-Efficacy, Creativity and Learning Performance. Education and Information Technologies 28: 4919–39. [Google Scholar] [CrossRef]
- Yan, Hongxin, Fuhua Lin, and Kinshuk. 2021. Including Learning Analytics in the Loop of Self-Paced Online Course Learning Design. International Journal of Artificial Intelligence in Education 31: 878–95. [Google Scholar] [CrossRef]
- Yang, Christopher C. Y., and Hiroaki Ogata. 2023. Personalized Learning Analytics Intervention Approach for Enhancing Student Learning Achievement and Behavioral Engagement in Blended Learning. Education and Information Technologies 28: 2509–28. [Google Scholar] [CrossRef]
- Yulianti, Erni, Hadi Suwono, Nor Farahwahidah Abd Rahman, and Fatin Aliah Phang. 2024. State-of-the-Art of STEAM Education in Science Classrooms: A Systematic Literature Review. Open Education Studies 6: 20240032. [Google Scholar] [CrossRef]
- Zaremohzzabieh, Zeinab, Seyedali Ahrari, Haslinda Abdullah, Rusli Abdullah, and Mahboobeh Moosivand. 2025. Effects of Educational Technology Intervention on Creative Thinking in Educational Settings: A Meta-Analysis. Interactive Technology and Smart Education 22: 235–65. [Google Scholar] [CrossRef]
- Zhang, Linjie, Xizhe Wang, Tao He, and Zhongmei Han. 2022. A Data-Driven Optimized Mechanism for Improving Online Collaborative Learning: Taking Cognitive Load into Account. International Journal of Environmental Research and Public Health 19: 6984. [Google Scholar] [CrossRef]

| Criteria | Inclusion | Exclusion |
|---|---|---|
| Population | Studies conducted in formal or informal educational settings, spanning K–12 (primary/secondary), higher education (undergraduate/postgraduate), and adult/professional learning, including schools, universities, and online learning platforms. | Studies conducted in non-educational contexts, such as business, healthcare, or non-academic organizations. |
| Intervention/Exposure | Studies explicitly exploring the application or development of LA tools or frameworks in educational contexts. | Studies not addressing the use of LA or unrelated to its application in education. |
| Outcome | Research examining creativity, including fostering creative thinking, capturing creative processes, or providing creativity-oriented feedback. | Studies not focusing on creativity, or where creativity is tangential to the primary research goals (i.e., creativity was treated peripherally or operationalized as engagement, innovation, or general problem-solving without clear creative constructs). |
| Study Design | Peer-reviewed journal articles, conference papers, book chapters, systematic reviews, or meta-analyses. | Non-peer-reviewed materials, such as blog posts, editorials, or unpublished dissertations. |
| Methodological Rigor | Studies employing robust quantitative, qualitative, or mixed methods with clearly defined and reproducible methodologies. | Studies lacking methodological rigor, transparency, or sufficient data to support their conclusions. |
| Publication Date | Articles published between September 2012 and September 2024. | Articles published before September 2012. |
| Language | Publications available in English. | Non-English publications without an available translation. |
| Full-Text Accessibility | Studies with full-text articles accessible for review. | Studies with inaccessible or unavailable full text. |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Mirzaei, S.; Nikmehr, H.; Liu, S.; Marmolejo-Ramos, F. Creativity in Learning Analytics: A Systematic Literature Review. J. Intell. 2025, 13, 153. https://doi.org/10.3390/jintelligence13120153

