Enhance College AI Course Learning Experience with Constructivism-Based Blog Assignments
Abstract
1. Introduction
2. Literature Review
2.1. Blog-Integrated Teaching
2.2. Constructivist Curriculum
2.2.1. Key Dimensions of Knowledge Construction
2.2.2. Integration of Constructivism and Technology
2.2.3. Application of PjBL
3. Hypotheses Development
3.1. NLP Course as a Representative
3.2. Research Questions
- (1) How do blogs influence students’ learning abilities and experience in AI courses?
- (2) What impact do blogs have on students’ learning outcomes in AI courses?
- (3) How satisfied are students with using blogs as a platform for assignment submission?
- (4) What are the effects of using blogs on students’ learning outcomes in an AI course?
4. Methods
4.1. Lecture Details
4.2. Learning Variables
- Course Design (CD): 18 questions that evaluated the overall structure of the course, including teaching methods, resource availability, and opportunities for practical application. Students rated these statements on a five-point Likert scale, ranging from “strongly agree” to “strongly disagree”. These questions aimed to gather comprehensive insights into students’ perceptions of course design. Key indicators within this dimension included content organization, platform interactivity, and course difficulty. An example question is the following: “Do you find the interactive features of the blog platform helpful for your learning?”
- Learning Ability (LA): 3 questions that assessed students’ perceived growth in learning, with a focus on their ability to learn autonomously and apply programming skills. This dimension included indicators such as independent learning ability, speed of knowledge acquisition, and improvements in programming skills. These data helped to analyze the relationship between learning ability, outcomes, and satisfaction. An example question is the following: “Can you quickly master the programming knowledge taught in this course?”
- Learning Experience (LE): 3 questions that explored students’ experiences with challenges, interests, and satisfaction throughout the course. This dimension included indicators such as learning methods (e.g., autonomous or collaborative learning), classroom interaction, and depth of knowledge application. These data shed light on how learning experiences affect students’ outcomes and satisfaction. An example question is the following: “Are you able to apply the programming knowledge learned in the course to real-world projects?”
- Learning Outcomes (LO): 3 questions that assessed the specific knowledge and skills students have acquired and their ability to apply these skills in real-world situations. This dimension included indicators such as mastery of theoretical knowledge, improvement in practical skills, and self-assessment of learning outcomes. An example question is the following: “How would you assess your overall learning outcomes in this course?”
- Learning Satisfaction (LS): 5 questions that evaluated students’ overall satisfaction with the course, including its content, assignments, teaching feedback, and interaction opportunities. The data collected in this dimension supported a comprehensive analysis of the relationships among course design, learning abilities, experiences, outcomes, and overall satisfaction. An example question is the following: “Are you satisfied with your overall learning experience in this course?”
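For readers who want to reproduce the scoring step, the sketch below shows one way to aggregate such five-point Likert responses into per-student dimension scores before any reliability analysis. It is a minimal Python illustration using pandas; the column names (CD1–CD18, LA1–LA3, and so on) and the 1–5 coding are assumptions made for the example, not the authors’ actual instrument or analysis pipeline.

```python
import pandas as pd

# Hypothetical item-to-dimension map; item counts follow the paper (CD = 18, LA = 3, LE = 3, LO = 3, LS = 5).
DIMENSIONS = {
    "CD": [f"CD{i}" for i in range(1, 19)],
    "LA": [f"LA{i}" for i in range(1, 4)],
    "LE": [f"LE{i}" for i in range(1, 4)],
    "LO": [f"LO{i}" for i in range(1, 4)],
    "LS": [f"LS{i}" for i in range(1, 6)],
}

def dimension_scores(responses: pd.DataFrame) -> pd.DataFrame:
    """Average each respondent's 1-5 Likert answers within every dimension.

    `responses` is assumed to hold one row per student and one column per item,
    coded so that 5 = "strongly agree" and 1 = "strongly disagree".
    """
    scores = {}
    for dim, items in DIMENSIONS.items():
        present = [col for col in items if col in responses.columns]
        scores[dim] = responses[present].mean(axis=1)  # row-wise mean per student
    return pd.DataFrame(scores)
```

The assumed item counts (18 + 3 + 3 + 3 + 5) add up to the 32 items reported in the overall reliability table later in the paper.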
4.3. Sample Collection
5. Data Analysis and Results
5.1. Measurement Model
5.2. Structural Model
6. Discussion
6.1. Constructivist Theory in NLP Course
6.2. Expanding the Reach of Blog-Based Teaching Across Disciplines
6.3. Student-Centered Course Design
“The programming tasks in the experimental course vary too widely in difficulty. Instead of having us write simple code on our own, requiring us to focus on adding comments would be more effective for learning.”
This emphasizes the necessity of practical operations in experimental courses and reflects students’ demands for changes in learning formats. Another student noted,
“I really like the real-world problem-based approach in practical aspects. It is highly praised and much more practical than other courses, and the results are more apparent. Please keep it up!”
This indicates broad recognition of real-world, project-based practical methods among students, representing a specific manifestation of dynamic learning. Additionally, students suggested,
“Lab source code should be permanently available before the course ends for easy reference.”
This suggests that students value ongoing access to learning resources, which facilitates deeper engagement and further supports a student-centered teaching design. Overall, positive feedback from students also confirms the effectiveness of our teaching methods:
“The course is very practical, learned a lot on the blog, and truly benefited.”
“Very good course, cutting-edge knowledge, excellent teacher. I hope the college does not cancel it and continues to improve.”
In summary, the student-centered course design framework proposed in this study received positive feedback in practical teaching. The course design emphasizes students’ feedback on their teaching and learning experiences, focuses on strengthening their self-directed learning abilities, encourages communication among peers and community members, and fosters mutual improvement among students. At the same time, teachers conduct multidimensional evaluations of student learning outcomes and help students establish effective self-evaluation systems.
6.4. Learning-Centered Course Design
“I received multiple prompts from the teacher while doing my assignments, which allowed me to make timely adjustments.”
This model not only helps students gain a deeper understanding of the course content but also avoids the limitations of traditional result-oriented teaching: by embedding assessment into the entire learning process, it encourages students to continuously improve their learning outcomes through ongoing reflection and self-evaluation.
“The course blog is not deleted, so I regularly review my blog and the feedback received, sometimes gaining new insights.”
This design not only supports students’ metacognitive development but also strengthens their long-term learning abilities. By reviewing and reflecting, students can identify shortcomings in their past learning methods and improve their future learning strategies, leading to better learning outcomes. Another student added,
“By reviewing other students’ projects, I learned many new ways of thinking, and this mutual learning approach gave me a deeper understanding of the course content.”
“I was thrilled when around 20 users liked my work!”
This interactive feedback mechanism, rooted in the learning community and, thanks to the blog’s public nature, combining peer discussion with expert guidance, helps students continuously enhance their abilities during the learning process. Through these diverse feedback channels, students’ learning capabilities and professional skills are effectively strengthened.
7. Conclusions
7.1. Contributions of the Study
7.2. Uniqueness of the Study
7.3. Limitations of the Study
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Nomenclature
References
| Dimension | Number of Items | Sample Size | Cronbach’s Alpha |
|---|---|---|---|
| Overall (ALL) | 32 | 146 | 0.965 |
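The overall reliability reported above (alpha = 0.965 across 32 items) can be computed from the raw item responses with the standard Cronbach’s alpha formula. The sketch below assumes the same respondent-by-item DataFrame as in the scoring example earlier; it is illustrative only and not necessarily the software the authors used.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of Likert items (rows = respondents, columns = items)."""
    items = items.dropna()
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)
```

Passing all 32 columns gives the overall coefficient, while passing only one dimension’s columns gives the construct-level alphas reported further below.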
| KMO and Bartlett’s Test | | |
|---|---|---|
| KMO Value | | 0.918 |
| Bartlett’s Test of Sphericity | Approx. Chi-Square | 3693.579 |
| | df | 465 |
| | p-value | 0.000 |
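KMO and Bartlett’s test of sphericity are standard checks of whether an item-correlation matrix is suitable for factor analysis. The paper does not state which statistical package produced these values; as an assumed reproduction in Python, the widely used factor_analyzer package exposes both statistics:

```python
import pandas as pd
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

def sampling_adequacy(items: pd.DataFrame) -> dict:
    """KMO measure and Bartlett's test of sphericity for a respondent-by-item matrix."""
    chi_square, p_value = calculate_bartlett_sphericity(items)
    _, kmo_overall = calculate_kmo(items)   # first return value is per-item KMO, second is the overall measure
    return {"kmo": kmo_overall, "bartlett_chi2": chi_square, "bartlett_p": p_value}
```

A KMO above 0.9 and a significant Bartlett statistic, as reported in the table, are the conventional indications that factor analysis is appropriate.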
| Dimension | CR | Cronbach’s Alpha | AVE | VIF |
|---|---|---|---|---|
| Course Design (CD) | 0.953 | 0.947 | 0.731 | |
| Learning Ability (LA) | 0.835 | 0.703 | 0.793 | 1.318 |
| Learning Experience (LE) | 0.876 | 0.787 | 0.838 | 1.339 |
| Learning Outcomes (LO) | 0.896 | 0.825 | 0.861 | 1.035 |
| Learning Satisfaction (LS) | 0.910 | 0.875 | 0.820 | 1.069 |
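Composite reliability (CR) and average variance extracted (AVE) in the table above are conventionally computed from the standardized indicator (outer) loadings of the measurement model, while VIF is obtained from regressing each predictor construct on the others. The following is a minimal sketch of the first two, assuming a vector of standardized loadings per construct is available; it illustrates the standard formulas rather than the authors’ exact software output.

```python
import numpy as np

def composite_reliability(loadings) -> float:
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of indicator error variances)."""
    loadings = np.asarray(loadings, dtype=float)
    error_vars = 1.0 - loadings ** 2            # error variance of each standardized indicator
    return float(loadings.sum() ** 2 / (loadings.sum() ** 2 + error_vars.sum()))

def average_variance_extracted(loadings) -> float:
    """AVE = mean of the squared standardized loadings."""
    loadings = np.asarray(loadings, dtype=float)
    return float(np.mean(loadings ** 2))
```

By the usual cut-offs (CR above 0.7, AVE above 0.5), all five constructs in the table meet the recommended thresholds; the blank VIF cell for Course Design is kept as in the source.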