Article

Investigating AI Chatbots’ Role in Online Learning and Digital Agency Development

1 Department of Languages, Literature, and Culture, Faculty of Teacher Education and Languages, Østfold University College, 1757 Halden, Norway
2 Department of Pedagogy, ICT, and Learning, Faculty of Teacher Education and Languages, Østfold University College, 1757 Halden, Norway
3 Centre for Teaching, Learning and Technology, The Arctic University of Norway, 9019 Tromsø, Norway
* Author to whom correspondence should be addressed.
Educ. Sci. 2025, 15(6), 674; https://doi.org/10.3390/educsci15060674
Submission received: 26 March 2025 / Revised: 14 May 2025 / Accepted: 27 May 2025 / Published: 29 May 2025

Abstract

The integration of artificial intelligence (AI) chatbots in online learning environments has transformed the way students engage with educational content, offering personalised learning experiences, instant feedback, and scalable support. This study investigates the role of AI-driven chatbots in the Pedagogical Information and Communication Technology (ICTPED) Massive Open Online Course (MOOC), a professional development course aimed at enhancing teachers’ Professional Digital Competence (PDC). The study pursues two connected aims: (1) to examine how chatbots support content comprehension, self-regulated learning, and engagement among pre- and in-service teachers, and (2) to explore, through a cultural-historical perspective, how chatbot use contributes to the development of students’ digital agency. Based on data from 46 students, collected through structured questionnaires and follow-up interviews, the findings show that chatbots functioned as interactive learning partners, helping students clarify complex concepts, generate learning resources, and engage in reflection—thereby supporting their PDC. At the same time, chatbot interactions mediated learners’ development of digital agency, enabling them to critically interact with digital tools and navigate online learning environments effectively. However, challenges such as over-reliance on AI-generated responses, inclusivity issues, and concerns regarding content accuracy were also identified. The results underscore the need for improved chatbot design, pedagogical scaffolding, and ethical considerations in AI-assisted learning. Future research should explore the long-term impact of chatbots on students’ learning and the implications of AI-driven tools for digital agency development in online education.

1. Introduction

The integration of chatbots into online learning represents a promising advancement in digital education, offering benefits such as personalised learning, immediate feedback, and enhanced engagement (Hew et al., 2023; Jeon et al., 2023). However, challenges related to chatbot comprehension, student over-reliance, and inclusivity must be addressed if their pedagogical potential is to be realised (Chang et al., 2023; Lee et al., 2020). Crucially, research is needed to examine how both students and educators, particularly those engaged in professional development, utilise chatbots in asynchronous online learning environments (Du et al., 2021; Han & Lee, 2022). A more comprehensive understanding of chatbot interactions in these contexts will inform the development of more effective and equitable online educational environments (Engeness & Nohr, 2020; Siddiq et al., 2024).
By leveraging artificial intelligence (AI) and natural language processing (NLP), chatbots provide scalable support for students, addressing challenges such as limited instructor availability in flexible learning environments (Hew et al., 2023; Hwang & Chang, 2023). The ability of chatbots to facilitate student engagement, assist with content navigation, and promote self-directed learning has led to their widespread adoption in various educational contexts, including Massive Open Online Courses (MOOCs) and teacher professional development programmes (Du et al., 2021; Han & Lee, 2022).
In MOOCs, where large numbers of participants require scalable and adaptive support, chatbots have proven particularly beneficial. They serve as interactive learning assistants by responding to student queries, guiding learners through course content, and providing timely feedback (Jeon et al., 2023). Research highlights the effectiveness of chatbots in STEM education, where they support problem-solving and conceptual understanding through step-by-step guidance (Atmosukarto et al., 2021; Cheng et al., 2024). Similarly, in language learning, chatbots act as conversational partners, aiding in vocabulary acquisition, fluency development, and self-regulated learning (Sáiz-Manzanares et al., 2023). Despite these advantages, concerns remain about chatbot reliability, students’ over-reliance on AI-generated responses, and the inclusivity of chatbot-based learning experiences (Chang et al., 2023; Lee et al., 2020).
A growing body of research suggests that chatbots not only enhance student engagement and content mastery but also play a role in fostering digital agency—the ability of learners to critically interact with digital tools, adapt their learning strategies, and engage in collaborative knowledge construction (Engeness & Nohr, 2020; Siddiq et al., 2024). From a cultural-historical perspective, digital agency emerges through the mediation of technological tools within socio-cultural learning environments (Engeness, 2020; Engeström, 1999). In this context, chatbots act as cognitive mediators, enabling learners to navigate digital learning spaces, develop metacognitive skills, and exercise autonomy in their learning processes. However, to maximise the pedagogical potential of chatbots, it is essential to understand how they shape students’ digital agency and learning experiences.
This study examines the role of AI chatbots in the ICTPED MOOC, an online course designed to develop Professional Digital Competence (PDC) among pre- and in-service teachers. As one of Norway’s longest running and most widely attended MOOCs, ICTPED offers a rich context for investigating chatbot integration in professional learning. The study pursues two interconnected aims: (1) to explore how participants engaged with chatbots to support learning, content comprehension, and self-regulated strategies that contribute to PDC; and (2) to apply a cultural-historical perspective to analyse how chatbot use mediated the development of students’ digital agency. These dual aims guide the investigation into participants’ motivations, usage patterns, and perceived benefits and limitations of AI-supported learning.
To address these questions, data were collected from 46 participants through a structured questionnaire and follow-up interviews with a subset of learners. The findings contribute to the growing discourse on chatbot use in professional development by exploring the extent to which chatbots influence learning engagement, content understanding, and the development of digital agency. The study also examines the challenges and limitations associated with chatbot-assisted learning, offering insights into how AI-driven educational tools can be improved to better support diverse learner needs. By situating chatbot interactions within a broader theoretical framework of digital agency, this research provides implications for the future design and implementation of AI chatbots in online teacher education and professional development programmes.

2. Students’ Use of Chatbots in Online Courses: What Do We Know?

Chatbots have emerged as a significant technological innovation in online learning, offering a range of functionalities designed to support student engagement, self-regulated learning, and academic achievement. Initially developed as simple rule-based systems, contemporary chatbots leverage artificial intelligence (AI) and natural language processing (NLP) to provide personalised and adaptive learning experiences (Hwang & Chang, 2023). The growing interest in chatbots as educational tools is reflected in the increasing number of studies examining their efficacy, particularly since 2017 (Hwang & Chang, 2023).
The literature identifies several categories of chatbots employed in online learning environments. Rule-based chatbots, such as those implemented in Massive Open Online Courses (MOOCs), primarily function as FAQ assistants, addressing student queries related to course logistics and content navigation (Han & Lee, 2022). More advanced AI-powered chatbots, such as those used in self-regulated learning contexts, facilitate goal setting, personalised feedback, and adaptive learning pathways (Du et al., 2021).
Empirical studies have consistently demonstrated the advantages of chatbot-assisted learning. One of the primary benefits is the provision of immediate feedback, which enhances students’ engagement, motivation, and self-efficacy (Fidan & Gencel, 2022; Sáiz-Manzanares et al., 2023). Chatbots also contribute to more flexible learning experiences by allowing students to access support outside traditional instructional hours (Deveci Topal et al., 2021). Moreover, they have been found to facilitate social presence in asynchronous learning environments, mitigating the sense of isolation often reported in fully online courses (Hew et al., 2023). Chatbots play a crucial role in self-regulated learning by supporting students in setting and achieving learning goals (Du et al., 2021). Studies indicate that chatbots used for goal setting, such as “Learning Buddy,” enhance students’ awareness of their learning objectives, leading to increased self-discipline and academic success (Hew et al., 2023). Additionally, chatbots serve as effective tutors in STEM disciplines, providing adaptive problem-solving guidance and improving student comprehension in complex subjects such as chemistry (Atmosukarto et al., 2021) and mathematics (Cheng et al., 2024). Furthermore, chatbots can enhance inclusivity in online education. By offering multilingual capabilities and accessibility features such as text-to-speech conversion, chatbots support diverse learners, including those with disabilities (Chang et al., 2023). Their ability to personalise learning experiences based on students’ performance data also allows for differentiated instruction, catering to varying levels of prior knowledge and learning paces (Sáiz-Manzanares et al., 2023). Another key advantage is the ability of chatbots to support peer collaboration and interaction in online courses. Studies show that chatbot-assisted discussions promote critical thinking and knowledge sharing, particularly in collaborative learning environments (Fidan & Gencel, 2022). Additionally, chatbots integrated into flipped learning models facilitate personalised student support and enable real-time clarification of complex concepts (Baskara, 2023).
Despite these benefits, several challenges limit the effectiveness of chatbots in educational settings. One recurring issue is the restricted contextual understanding of chatbots, which often leads to irrelevant or inadequate responses (Lee et al., 2020). Students have reported frustration with chatbots that fail to interpret nuanced queries or provide meaningful explanations beyond predefined responses (Cheng et al., 2024). Moreover, the initial enthusiasm often surrounding chatbot use may diminish over time—a phenomenon known as the novelty effect—resulting in reduced engagement and weakened long-term impact on learning (Wu & Yu, 2024). Another significant concern is the potential over-reliance on chatbots, which may reduce students’ critical thinking and problem-solving abilities by encouraging passive learning behaviours (Chang et al., 2023). Overuse of chatbots in problem-solving scenarios has been linked to decreased confidence in independent learning, as students tend to depend on AI-generated responses rather than developing their own reasoning skills (Cheng et al., 2024). While positive outcomes have been reported in STEM and language learning domains, other studies raise questions about the generalisability of these benefits. For example, Wu and Yu (2024) suggest that chatbots may be less effective in supporting interpretive, critical, or affective learning tasks typically found in humanities or higher-order reflective contexts, where AI-generated responses may oversimplify or obscure nuanced understanding. Additionally, while chatbots facilitate self-regulated learning, their effectiveness varies based on students’ prior knowledge and educational level, with master’s students demonstrating greater benefits compared to undergraduates (Sáiz-Manzanares et al., 2023).
Ethical concerns surrounding chatbot implementation in education also pose significant challenges. Issues related to data privacy, algorithmic bias, and potential breaches of academic integrity have been highlighted in the literature (Aguilera-Hermida, 2024). Students’ data privacy is a critical concern, particularly when chatbots collect and analyse large amounts of user data to personalise learning experiences (Chang et al., 2023). Moreover, there is an ongoing debate on the extent to which chatbots should be allowed in assessment environments, given their ability to generate responses that may compromise academic integrity (Ilieva et al., 2023). In addition to ethical and usability concerns, studies have highlighted the importance of pedagogical intentionality when integrating chatbots into learning environments. Ilieva et al. (2023) caution against the premature or overly generic use of AI tools in education, noting that without proper alignment to curricular goals, chatbots risk functioning as isolated or even disruptive elements within the learning process. These findings emphasise the need for thoughtful planning to ensure that chatbot interactions contribute meaningfully to educational objectives. Another limitation is chatbot usability across diverse student demographics. Research indicates that non-native English speakers and students from specific geographical regions experience greater difficulties in using chatbots due to language barriers and cultural differences in communication styles (Han & Lee, 2022). Furthermore, some students find chatbot interactions impersonal and prefer human instructor guidance, particularly in disciplines that require subjective interpretation, such as humanities and social sciences (Hwang & Chang, 2023). Scalability remains another challenge, as maintaining chatbot effectiveness in large-scale online courses can be complex. Studies suggest that chatbots require continuous updates and improvements to keep pace with evolving course content and student needs (Deng & Yu, 2023). Without thoughtful integration into the institutional, curricular, and pedagogical structures that shape teaching and learning—alongside collaboration between key stakeholders such as educators, instructional designers, IT developers, and policy-makers—chatbots risk being underutilised or misaligned with educational goals, ultimately reducing their pedagogical impact (Hwang & Chang, 2023). These findings collectively suggest that without thoughtful pedagogical integration and domain-sensitive adaptation, chatbots may not only be underutilised but also risk becoming pedagogically irrelevant or even counterproductive. Their value depends not merely on technical availability, but on the alignment between chatbot functionality, subject-specific learning goals, and instructional design.
Although research on chatbot-assisted learning has expanded considerably, several gaps remain. Most existing studies focus on short-term outcomes, with limited attention given to the long-term impact of chatbots on learning behaviours and academic success (Hwang & Chang, 2023). Additionally, while chatbots have been extensively examined in STEM and language learning contexts, their application in humanities and social sciences remains underexplored (Hwang & Chang, 2023). A particularly underdeveloped area of research concerns the use of chatbots in supporting teachers’ professional development within asynchronous online courses. While some studies suggest that chatbots can enhance teacher support and reduce administrative workload (Chiu et al., 2024), their role in fostering professional learning communities and facilitating pedagogical reflection has yet to be systematically investigated. This study addresses this gap by examining how teachers used chatbots in the ICTPED MOOC, a course aimed at enhancing teachers’ digital competence, professional development, and agency.

3. Cultural-Historical Perspective on Digital Agency in Online Learning

The cultural-historical theory, originally developed by Vygotsky, provides a theoretical lens through which human cognition, learning, and agency are understood as deeply embedded in social and cultural contexts. From this perspective, agency is not an inherent or static trait of the individual, but rather an emergent and relational phenomenon—shaped through participation in collective activities and mediated by cultural tools and signs (Vygotsky, 1978; Engeström, 1987). Galperin further advanced this perspective by conceptualising learning as the internalisation of actions mediated by material and symbolic resources, positioning learners as active agents who transform both their environment and themselves through goal-directed activity (Engeness, 2021a; Engeness & Gamlem, 2025; Engeström, 1987).
From this perspective, human activity is always socially and historically situated, and the development of agency is contingent on individuals’ ability to appropriate and transform available cultural tools. The interplay between human cognition and social context is further reinforced by Galperin’s contribution, which conceptualises learning as a dynamic interaction between subject, object, and mediating artefacts (Engeness, 2021a). This systemic approach underscores that agency is not merely about independent action but about the ways individuals participate in collective practices and reshape them through engagement. Furthermore, Galperin’s work suggests that cognitive development is deeply tied to material and semiotic tools, which shape learners’ abilities to act purposefully within their environments (Engeness, 2021a). These foundational premises reinforce that agency is a process of cultural participation rather than an individual trait, a notion that becomes increasingly significant in the digital age.
In contrast to traditional, transmission-oriented learning environments, socio-cultural learning environments are characterised by collaborative, tool-mediated meaning-making, where knowledge is co-constructed through interaction with both human and non-human mediators. In such contexts, AI chatbots can be understood as cognitive mediators—tools that shape how learners engage with knowledge, reflect on their understanding, and regulate their learning behaviour. These chatbots do not merely deliver information; they structure activity by enabling learners to question, reframe, and evaluate content.
Within the cultural-historical framework, agency is understood as the capacity of individuals to act within and transform their socio-cultural environment. Unlike perspectives that treat agency as an individual attribute, this approach emphasises that agency is relational and context-dependent, emerging through participation in social practices and mediated by cultural tools (Siddiq et al., 2024). Edwards (2015) highlights that agency involves the ability to interpret and respond to social contexts by mobilising resources and engaging in relational practices. This means that agency is not just about independent action but also about forming networks of support and using available tools effectively (Edwards, 2015). Mäkitalo (2016) extends this view by demonstrating how agency emerges through interactional and discursive practices in learning environments, emphasising that it is dynamically constructed rather than a static trait. Engeness (2020) further argues that digital agency is developed through iterative participation in online and digital learning spaces, where learners engage with tools that mediate knowledge acquisition and transformation. In digital settings, agency is not merely about the ability to act but also about learners’ capacity to engage in epistemic practices that facilitate meaning-making in technologically mediated environments (Engeness & Nohr, 2020). Digital agency, in this context, extends beyond technical competence to encompass learners’ ability to critically engage with digital tools, adapt learning strategies, and participate in collaborative knowledge construction. Within this framework, digital agency is not simply the ability to use technology, but the capacity to strategically appropriate digital tools for epistemic tasks—such as inquiry, reflection, and synthesis—in response to socially situated learning demands (Siddiq et al., 2024; Engeness & Nohr, 2020). This perspective informs our investigation into how participants in the ICTPED MOOC developed digital agency through their interaction with chatbots, and how the design of such tools affords or constrains transformative engagement.
By adopting the cultural-historical perspective, digital agency acquires its transformative nature and is conceptualised as an epistemic and ontological phenomenon where learners actively shape their learning environments through interactions with digital tools (Engeness, 2021b). This transformative process implies that digital agency evolves as learners develop strategies to navigate the demands of online learning. Transformative digital agency is closely linked to learners’ ability to engage with and repurpose digital tools for learning and the students’ capacity to use digital resources strategically (Engeness, 2020).
However, challenges arise when learners lack the necessary scaffolding to develop digital agency. While some students thrive in digitally mediated environments, others struggle with motivation, digital literacy, and self-regulation (Brevik et al., 2019). Therefore, fostering digital agency requires intentional pedagogical strategies that balance autonomy with structured support. From a cultural-historical perspective, this requires students to appropriate digital tools as mediational artifacts, transforming passive consumption into active knowledge construction (Aagaard & Lund, 2019).
This study employs the cultural-historical perspective to examine how students develop digital agency through their interactions with chatbots in asynchronous online learning environments. By analysing how learners engaged with AI-driven chatbots, the study aims to uncover the socio-cultural dynamics that influence digital agency development. This perspective provides a robust framework for understanding the interplay between AI technology, social interaction, and learner autonomy in digital education.

4. Method

4.1. Participants and Setting

Data were collected through a questionnaire administered online to 228 pre- and in-service teachers enrolled in the ICTPED MOOC during the spring semester of 2024 (response rate 46 out of 228—approximately 20.2%). In addition, interviews with eight students who had participated in the course were conducted. The study aimed to explore the participants’ learning experiences and their use of subject-specific chatbots in the ICTPED MOOC. The questionnaire was designed to capture: (a) general information about the participants, (b) how they used the chatbots in the ICTPED MOOC, and (c) how students’ interactions with chatbots contributed to achieving their learning outcomes.
The questionnaire consisted of 35 questions, employing a mix of formats, including a five-point Likert scale for quantitative measures and open-ended questions for qualitative insights. Table 1 presents the number of respondents to the questionnaire, their professional background, and general evaluation of the ICTPED MOOC.

4.2. ICTPED MOOC

The ICTPED MOOC was first introduced in Norway in 2016 as an xMOOC (Pedagogical Information and Communication Technology Massive Open Online Course) developed by researchers and development specialists at Østfold University College. It is one of Norway’s longest-running and most popular MOOCs, designed to enhance the Professional Digital Competence (PDC) of pre- and in-service teachers; since its launch in 2016, 1529 participants have passed the course and acquired 15 European Credit Transfer and Accumulation System (ECTS) credits (Engeness & Nohr, 2022). The course integrates pedagogical and technological elements, utilising digital tools to foster active, flexible, and self-regulated learning (Engeness & Nohr, 2020).
The ICTPED MOOC follows an xMOOC structure (Armellini & Padilla Rodriguez, 2016; Fidalgo-Blanco et al., 2016), hosted on the Canvas platform, and is characterised by an institutional focus, reliance on video resources, and automated assessments through quizzes. The course consists of eight modules completed over 20 weeks, each beginning with an introductory section outlining learning goals and expected outcomes. These modules include:
  • Textual information presented on webpages
  • Embedded research articles
  • Videos and audio resources
  • Individual tasks and reflection questions
  • Multiple-choice quizzes for both formative and summative assessments
Each module starts with an introduction video and textual information, followed by tasks, quizzes, and interactive discussions. Formative assessments are integrated through strategically placed small multiple-choice tests, while summative assessments include larger multiple-choice quizzes at the end of each module (Engeness & Nohr, 2020).
To ensure accessibility, Universal Design principles have been embedded into the ICTPED MOOC. Participants can download course materials in multiple formats, including audio files, podcasts, flat PDF files, and e-books. Each module page also includes embedded audio files to facilitate learning for diverse needs. Additionally, the course design enables flexible engagement, allowing participants to follow different learning pathways based on their preferences.
A key feature of the ICTPED MOOC is its emphasis on participant agency in digital environments. The course structure supports diverse engagement strategies, with some students following a sequential approach (reading text first, watching videos, then engaging with assignments), while others prioritise assignments or collaboration before reviewing instructional content. Analysis of previous course iterations indicates that participants engaged in different entry activities, including reading textual information (52.94%), watching videos (21.57%), engaging in assignments (9.80%), listening to audio materials (7.84%), and other activities such as collaboration and content conversion (7.86%) (Engeness & Nohr, 2020). This flexible approach allows participants to construct their own learning trajectories, aligning with the course’s goal of fostering digital competence and independent learning strategies.
The ICTPED MOOC follows a structured progress plan, detailed in Table 2, which outlines the modules and their corresponding timeline:
Upon successful completion of the course (evaluated as pass/fail), participants receive 15 ECTS credits (Engeness & Nohr, 2022).
In the final three modules, students were introduced to chatbots tailored to the curriculum of each module. These chatbots were built on OpenAI’s GPT-4 and grounded in module-specific course material through Retrieval-Augmented Generation (RAG) to ensure accurate, subject-specific responses. The chatbots were integrated into the course platform and prominently displayed on each module’s main page.
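To make the retrieval-augmented setup concrete, the following minimal sketch illustrates the general RAG pattern such a module chatbot can follow: curriculum passages are embedded, the passages most similar to a student’s question are retrieved, and GPT-4 is prompted to answer only from them. The module excerpts, function names, and use of the OpenAI Python client are illustrative assumptions, not a description of the course’s actual implementation.

```python
# Minimal, illustrative RAG sketch (not the ICTPED MOOC's actual implementation).
# Assumes the official OpenAI Python client and hypothetical module excerpts.
from openai import OpenAI
import numpy as np

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def embed(texts: list[str]) -> np.ndarray:
    """Return embedding vectors for a list of texts."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([item.embedding for item in resp.data])

# Hypothetical curriculum passages ("chunks") for one module.
chunks = [
    "Module 6 introduces digital assessment and feedback practices ...",
    "Module 7 addresses learning design and multimodal texts ...",
    "Module 8 discusses professional digital competence and ethics ...",
]
chunk_vectors = embed(chunks)

def answer(question: str, k: int = 2) -> str:
    """Retrieve the k most relevant chunks and ask GPT-4 to answer from them only."""
    q_vec = embed([question])[0]
    similarity = chunk_vectors @ q_vec / (
        np.linalg.norm(chunk_vectors, axis=1) * np.linalg.norm(q_vec)
    )
    context = "\n".join(chunks[i] for i in np.argsort(similarity)[-k:])
    resp = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system",
             "content": "Answer using only the course excerpts below.\n" + context},
            {"role": "user", "content": question},
        ],
    )
    return resp.choices[0].message.content

print(answer("How does the module describe formative digital assessment?"))
```

Grounding the model’s answers in retrieved course passages, rather than relying on its general training data alone, is what allows such a chatbot to give subject-specific responses aligned with the module curriculum.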
To guide students in using the chatbots, three instructional videos were provided, explaining their functionality and purpose. Access to the chatbots required participants to have their own subscription to the paid version of ChatGPT (GPT-4o).
The ICTPED MOOC has been highly successful, with over 80% of participants completing the course. The integration of structured learning paths, interactive digital tools, and accessibility features has made it a cornerstone of digital competence training for educators in Norway (Engeness & Nohr, 2020).

4.3. Data and Analysis

This study employed a mixed-methods approach (Creswell, 2012) to investigate two central research questions:
  • How did participants engage with AI chatbots to support their learning and the development of Professional Digital Competence (PDC)?
  • How did chatbot use contribute to the development of digital agency, viewed through a cultural-historical perspective?
Data were collected through a structured online questionnaire and semi-structured interviews with participants enrolled in the ICTPED MOOC. A total of 46 participants completed the questionnaire, and eight participants were selected for in-depth follow-up interviews. The questionnaire included a combination of Likert-scale items and open-ended questions, while the interviews explored participants’ experiences, motivations, and learning processes related to chatbot use.

4.3.1. Analysis for Research Question 1: Chatbot Use and Professional Digital Competence (PDC)

To explore how chatbots supported participants’ learning and contributed to the development of PDC, the questionnaire included three key items:
  • Have you used GPT/chatbots in Modules 6, 7, and 8? (Yes/No)
  • How do you assess the use of GPT/chatbots? (5-point Likert scale: very negative to very positive)
  • How did you use GPT/chatbots in Modules 6, 7, and 8? (Open-ended)
Quantitative data from items 1 and 2 were analysed using descriptive statistics to assess chatbot usage and perceived effectiveness. The results revealed patterns in adoption rates and satisfaction levels among learners.
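As a small illustration of this step, the descriptive statistics for the two closed items could be computed as sketched below; the CSV file and column names (used_chatbot, satisfaction) are hypothetical placeholders, since the paper does not specify how responses were stored.

```python
# Illustrative only: descriptive statistics for the two closed questionnaire items.
# The file and column names are hypothetical placeholders.
import pandas as pd

df = pd.read_csv("ictped_questionnaire_2024.csv")  # 46 responses, one row per participant

# Item 1: proportion of participants who used the chatbots (Yes/No).
usage_pct = df["used_chatbot"].value_counts(normalize=True).mul(100).round(1)

# Item 2: distribution of 5-point satisfaction ratings among chatbot users only.
ratings = df.loc[df["used_chatbot"] == "Yes", "satisfaction"]
rating_pct = ratings.value_counts(normalize=True).sort_index().mul(100).round(1)

print(usage_pct)                       # e.g. Yes: 52.2, No: 47.8
print(rating_pct)                      # share of each rating 1-5 among users
print(ratings.mean(), ratings.std())   # central tendency and spread
```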
Thematic analysis (Braun & Clarke, 2014) was applied to responses from item 3 to identify key ways participants used chatbots to support learning. Themes related to clarification of course content, task completion, study resource generation, and assessment preparation were central to understanding how chatbots contributed to PDC.

4.3.2. Analysis for Research Question 2: Chatbot Use and Digital Agency Development

To address the second research question, qualitative data from both the open-ended questionnaire responses and the eight follow-up interviews were analysed through a cultural-historical lens. The interviews focused on the following questions:
  4. How did you use the GPT/chatbots?
  5. Why did you use the GPT/chatbots?
  6. What support did the GPT/chatbots provide to you?
All interviews were transcribed and analysed in NVivo 12 using an inductive coding approach (Patton, 2015). The analysis focused on identifying patterns that reflected learners’ development of digital agency—such as metacognitive regulation, critical evaluation of chatbot responses, and strategic use of AI tools in learning.
Emerging themes were interpreted using a cultural-historical framework, with particular attention to how chatbot interactions acted as mediational tools enabling learners to engage in autonomous learning and meaning-making practices.

4.3.3. Integration of Findings

By triangulating the quantitative questionnaire data with the qualitative insights from open-ended responses and interviews, the study offers a comprehensive view of chatbot integration in the ICTPED MOOC. This mixed-methods approach allows for a deeper understanding of how chatbot use supported both the development of Professional Digital Competence and the emergence of digital agency among participants.

5. Findings

5.1. Analysis of the Questionnaire (Research Question 1: Chatbot Use and Professional Digital Competence)

This section presents findings related to Research Question 1: How did participants engage with AI chatbots to support their learning and the development of Professional Digital Competence (PDC)? The analysis draws on three questions from the structured questionnaire, which aimed to explore participants’ use of chatbots in Modules 6, 7, and 8 of the ICTPED MOOC.
  • Q1: Chatbot Usage Frequency
Participants were first asked, “Have you used GPT/chatbots in Modules 6, 7, and 8?” to determine the overall extent of chatbot integration into their learning process.
  • 52.2% of participants reported using GPT/chatbots.
  • 47.8% stated that they did not engage with chatbots.
These figures indicate that just over half of the participants adopted the chatbot feature. The remaining 47.8% of non-users may reflect a lack of awareness, uncertainty regarding functionality, technical access issues, or a preference for more traditional learning tools. Follow-up interviews confirmed that some learners were unaware of how to access or effectively use the chatbots, while others expressed scepticism about their pedagogical value.
  • Q2: Perceived Satisfaction with Chatbot Use
The second question asked participants to rate their experience using GPT/chatbots on a five-point Likert scale (1 = very negative; 5 = very positive). The distribution of responses was as follows:
  • 0% reported being very little satisfied (1/5)
  • 0% reported being little satisfied (2/5)
  • 22.2% were somewhat satisfied (3/5)
  • 55.6% were strongly satisfied (4/5)
  • 22.2% were very strongly satisfied (5/5)
These results reveal a strikingly positive response among users, with 77.8% reporting strong or very strong satisfaction. Importantly, no dissatisfaction was recorded. This high satisfaction rate suggests that those who used the chatbots found them beneficial for supporting content comprehension, navigating the course material, and developing key digital learning strategies.
However, 22.2% of participants who were only somewhat satisfied indicated that the experience, while generally helpful, was not without limitations. Some of these learners cited inconsistent or overly generic responses from the chatbot, while others wanted tighter integration of the chatbot’s feedback with course-specific language and tasks. These moderate ratings point to areas for improvement, particularly around alignment between chatbot output and course objectives, subject specificity, and clearer usage guidance.
  • Q3: How Participants Used Chatbots
In response to the open-ended question, “How did you use GPT/chatbots in Modules 6, 7, and 8?”, participants provided a wide range of reflections. A thematic analysis (Braun & Clarke, 2014) revealed five dominant themes, each relating to practices that support the development of Professional Digital Competence (PDC):
  • Clarification and Explanation: Many participants used chatbots to better understand course content by asking clarifying questions or requesting simplified explanations. The chatbot was described as “a tool for breaking down complex concepts” and “summarising difficult texts”. This function supported conceptual mastery and pedagogical understanding—core components of PDC.
  • Assessment Preparation and Study Support: Several participants used the chatbot as a study aid. They asked for example quiz questions, explanations of potential answers, and test preparation summaries. This use case reinforced participants’ self-directed learning and goal-setting strategies.
  • Generating Learning Resources: Participants used the chatbot to produce personalised study notes, concept summaries, and learning prompts. This creative use demonstrated higher-order application of digital tools for knowledge organisation and instructional planning, reinforcing PDC dimensions related to content structuring and knowledge dissemination.
  • Exploring Alternative Perspectives: Some respondents reported using the chatbot to request multiple explanations of the same concept or to relate theory to classroom practice. This experimentation supported critical reflection and pedagogical adaptability—key attributes of digitally competent educators.
  • Interactive and Dialogic Engagement: A smaller group of participants described engaging in sustained, back-and-forth conversations with the chatbot, using it as a dialogic partner to test hypotheses or explore what-if scenarios. These learners demonstrated more sophisticated interaction, bordering on exploratory learning and metacognitive reflection. Although this theme overlaps with digital agency, it also signals a high degree of digital competence in terms of strategy, tool use, and autonomy in learning.
In summary, the questionnaire analysis reveals that chatbot adoption among participants was moderate, with just over half reporting active use during the course. While this indicates promising uptake, the relatively high proportion of non-users points to potential gaps in chatbot design, accessibility, or integration into the learning experience. Among those who did use the chatbots, satisfaction levels were notably high, suggesting that the tool was perceived as effective in supporting learning. However, some participants expressed a desire for improved alignment between chatbot responses and course-specific content, highlighting the need for greater contextual relevance. The findings also show that participants engaged with chatbots in diverse and purposeful ways—using them to clarify concepts, generate study materials, and prepare for assessments—all of which support key aspects of Professional Digital Competence (PDC). Additionally, several patterns of use, particularly dialogic engagement and the exploration of alternative perspectives, suggest emerging forms of digital agency. These aspects are discussed in greater detail in Section 5.2.

5.2. Findings from Interviews (Research Question 2: Chatbot Use and Digital Agency)

This section presents findings related to Research Question 2: How did chatbot use contribute to the development of digital agency, as viewed through a cultural-historical perspective? Drawing primarily on the qualitative interview data, the analysis explores how participants appropriated chatbots as cognitive and cultural tools within their learning processes. Thematic analysis of the eight semi-structured interviews revealed three overarching patterns that illustrate different dimensions of digital agency development: (1) metacognitive awareness and self-regulation, (2) critical interaction and tool appropriation, and (3) epistemic engagement and reflective autonomy.
Metacognitive Awareness and Self-Regulation: Several participants described how chatbot interactions encouraged them to monitor their own understanding and learning progress. By prompting the chatbot to explain or rephrase difficult concepts, participants reported greater clarity and confidence in their grasp of the material. These interactions often served as checkpoints for self-evaluation, enabling learners to identify gaps in their knowledge and adjust their study strategies accordingly.
From a cultural-historical perspective, these practices illustrate self-regulated learning mediated by digital tools, in which the chatbot supports both the planning and monitoring of students’ learning. The chatbot, in this case, functioned as a mediational artifact that scaffolded metacognitive awareness, a key component of digital agency.
Critical Interaction and Tool Appropriation: Participants also demonstrated critical engagement with chatbot responses, frequently evaluating the relevance and trustworthiness of the output. Some described instances where they challenged the chatbot’s answers or asked for clarification until the response aligned with their expectations or course content.
Such engagement reflects epistemic agency: the learner’s ability to question, adapt, and repurpose digital tools to suit their own learning objectives. Rather than passively accepting AI-generated responses, these participants used the chatbot dynamically, taking control of the interaction to extract meaningful learning benefits. This aligns with the cultural-historical perspective on tool-mediated learning, where learners actively reshape their interaction with tools based on their evolving needs and socio-cultural context.
Epistemic Engagement and Reflective Autonomy: A smaller group of participants used the chatbot to explore educational content from multiple angles, often engaging in extended dialogue to test ideas or examine alternative interpretations. These learners described chatbot interactions not only as information retrieval, but as a space for conceptual exploration and reflective thinking.
This form of dialogic engagement mirrors the exploratory discourse found in collaborative learning environments and signals a high degree of reflective autonomy. Participants demonstrated the capacity to use the chatbot not just to answer questions but to pose new ones, extend discussion, and reframe their understanding—practices central to transformative digital agency.
In summary, the interview data suggest that chatbot use contributed meaningfully to the development of digital agency among participating teachers. Through scaffolded reflection, adaptive use of AI-generated responses, and epistemic experimentation, learners engaged with chatbots in ways that extended beyond instrumental use. From a cultural-historical perspective, the chatbot functioned as a mediational tool enabling learners to participate in technologically mediated meaning-making, enhance their autonomy, and exercise control over their learning processes.
While not all participants exhibited these behaviours to the same extent, the findings point to the potential of AI chatbots to foster critical, reflective, and agentive learning practices, provided that learners are supported in developing the skills and dispositions to engage with such tools productively. These insights have important implications for the design of professional learning environments that aim to strengthen both digital competence and digital agency.

6. Discussion

This study aimed to explore pre- and in-service teachers’ learning experiences with AI chatbots in the ICTPED MOOC, focusing on their usage patterns, perceived benefits, and challenges. The findings align with previous research on the integration of chatbots in online education, reinforcing their role in facilitating learning, enhancing engagement, and providing support across various educational contexts (Fidan & Gencel, 2022; Hew et al., 2023; Sáiz-Manzanares et al., 2023).

6.1. AI Chatbots as Interactive Learning Partners

Consistent with prior studies that highlight chatbots’ potential to support problem-solving and personalised learning in STEM and language education (Hew et al., 2023; Jeon et al., 2023), our findings demonstrate that participants used chatbots primarily for clarification and understanding of course content. Many students reported leveraging the chatbots to verify their understanding, clarify complex concepts, and receive tailored explanations. This reflects the adaptive and responsive nature of chatbots, similar to their role in STEM education, where they provide step-by-step guidance to enhance learning outcomes (Atmosukarto et al., 2021). Additionally, chatbots were utilised for content creation and task assistance, with participants generating assignments, quizzes, and instructional materials. These findings echo research by Chang et al. (2023), who emphasised the scalability of chatbots in MOOCs by assisting learners with course navigation and task completion.
Participants found chatbots beneficial for resource gathering and creating summaries, using them to locate relevant curriculum materials, academic references, and video content. This aligns with previous studies on chatbots’ ability to facilitate self-directed learning (Sáiz-Manzanares et al., 2023). Moreover, the ability of chatbots to provide real-time feedback contributed to a more interactive learning environment, supporting both cognitive and metacognitive processes in learning (Du et al., 2021).

6.2. Engagement and Motivation in Digital Learning Environments

The study findings highlight that participants were motivated to use chatbots for various reasons, including curiosity about the technology, positive past experiences, and the ease of communication offered by the tool. These findings resonate with research showing that chatbots enhance learner engagement by providing immediate feedback and promoting interactivity in online courses (Fidan & Gencel, 2022; Cheng et al., 2024). Several participants described their experience as opening “a new world for students,” indicating a sense of empowerment and excitement, which aligns with previous studies emphasising the motivational aspects of chatbots in online learning (Wu & Yu, 2024).
Additionally, the chatbot’s perceived reliability and accessibility contributed to students’ trust in the tool as a valuable resource for online learning. Similar to findings by Han and Lee (2022), who noted that chatbots improve accessibility in large-scale learning environments, participants in our study recognised chatbots as a convenient and efficient means of obtaining support, particularly when instructor availability was limited. The positive impact of chatbots on motivation suggests that AI-driven tools can help sustain student engagement over time, particularly in self-paced learning environments.

6.3. Supporting Self-Regulated Learning and Digital Competence

One of the key findings in this study is the chatbot’s role as a communication and brainstorming partner. Participants found chatbots valuable in exploring ideas, refining their understanding, and generating new insights. This aligns with previous research on chatbots’ ability to foster self-regulated learning by guiding learners through goal setting, progress monitoring, and reflection (Du et al., 2021; Sáiz-Manzanares et al., 2023). The ICTPED MOOC participants used chatbots to engage in reflective learning processes, confirming prior findings that these tools can act as intelligent learning companions that scaffold the learning process (Engeness & Nohr, 2020).
Furthermore, the chatbot’s ability to facilitate metacognitive strategies was evident, as participants used it to test their knowledge, analyse different viewpoints, and generate alternative explanations. This finding is in line with research by Edwards (2015) and Mäkitalo (2016), who emphasise the importance of interactive tools in fostering critical thinking and knowledge construction. However, the effectiveness of chatbot-assisted self-regulated learning depends on learners’ ability to critically assess chatbot responses, suggesting a need for explicit instruction on how to effectively engage with AI tools (Engeness & Gamlem, 2025).

6.4. Limitations and Challenges in Using AI-Driven Chatbots

Despite the reported benefits, several challenges were identified in participants’ responses. Some students noted concerns regarding the accuracy of chatbot-generated content, aligning with existing literature that highlights chatbots’ limitations in providing domain-specific, contextually accurate responses (Lee et al., 2020; Cheng et al., 2024). This suggests the need for ongoing improvements in chatbot training to enhance content precision and minimise misleading or superficial explanations.
Another key concern is the potential over-reliance on chatbots, which could impact students’ critical thinking and problem-solving skills, as highlighted in previous studies (Chang et al., 2023; Wu & Yu, 2024). Some participants expressed hesitation about using chatbots extensively, indicating a preference for traditional learning methods or instructor-led guidance. These findings suggest the importance of integrating chatbots as a supplementary rather than a primary learning tool to balance independent learning with expert guidance.
Additionally, chatbot usability across diverse learner demographics remains an issue. Research indicates that non-native English speakers and students from specific geographical regions experience greater difficulties in using chatbots due to language barriers and cultural differences in communication styles (Han & Lee, 2022). In our study, some participants reported struggling with phrasing their questions effectively, leading to inconsistent chatbot responses. Future research should explore ways to enhance chatbot usability by incorporating multilingual capabilities and adaptive learning features.

6.5. Chatbots as Mediators of Digital Agency Development

The findings of this study contribute to a deeper understanding of how AI chatbots mediate the development of students’ digital agency in online learning environments. Drawing on a cultural-historical perspective, we conceptualised agency not as a stable trait of the learner, but as an emergent and relational capacity, shaped through mediated interaction with cultural tools—in this case, chatbots (Vygotsky, 1978; Engeström, 1987; Engeness, 2021a).
In this socio-cultural learning environment (ICTPED MOOC), the chatbot functioned as a cognitive mediator, offering opportunities for learners to engage in reflective inquiry, strategic use of feedback, and iterative meaning-making. Participants’ ability to use the chatbot to question, reframe, and evaluate content represents a form of epistemic action mediated by digital technology. This contrasts with traditional learning contexts, where tools often serve only to deliver information passively.
Our findings show that while some learners used chatbots dialogically, testing hypotheses, exploring alternative interpretations, and reflecting on their own thinking, others engaged more instrumentally or superficially. These differences illustrate that digital agency does not arise automatically from tool use but must be supported through pedagogical design that encourages critical, reflective, and purposeful interaction.
Moreover, while several participants demonstrated metacognitive activity, such as monitoring comprehension or adapting their strategies, others expressed difficulty in phrasing effective prompts or assessing the quality of chatbot responses. These patterns suggest that uneven development of agency may result from differences in teachers’ PDC, confidence, or prior knowledge, and highlight the need for explicit scaffolding when integrating AI into professional learning contexts.
Although this study was situated in a MOOC designed to support professional digital competence (PDC), its socio-cultural orientation allowed for more than technical skills development; it enabled learners to participate in a community of digital inquiry, where tools such as chatbots shaped not only what learners knew, but how they came to know it. However, the extent to which this transformative potential was realised varied significantly among participants.
In summary, chatbots served not simply as functional support systems, but as mediational tools within a socio-cultural learning environment, influencing how learners engaged with knowledge and developed digital agency. The emergence of this agency was marked by variation: some learners actively appropriated the chatbot for critical and conceptual learning, while others exhibited more passive or surface-level engagement. These findings reinforce the need to design chatbot-integrated learning environments that are not only technically effective but also pedagogically intentional, supporting learners to interact meaningfully, reflect critically, and develop as autonomous digital agents.

7. Implications, Further Research and Conclusions

7.1. Implications and Further Research

The findings of this study have several implications for the implementation of AI chatbots in teacher education and professional development, particularly in socio-cultural learning environments such as the ICTPED MOOC. Both quantitative and qualitative data revealed how learners used chatbots for clarification, assessment preparation, and generation of study resources—practices that aligned with high satisfaction ratings (77.8% reporting strong or very strong satisfaction) and frequent engagement with the tools. These usage patterns informed the development of the implications below.
First, chatbot design should prioritise subject-specific accuracy and contextual sensitivity, particularly in complex domains such as pedagogy and teacher education. Participants’ difficulty in phrasing questions and receiving inconsistent responses suggests that current chatbot functionality may not always meet the nuanced needs of professional learners. To address this, developers should integrate domain-specific training data and design prompts that better reflect authentic learning tasks in professional education.
Second, instructional strategies should be designed to promote critical engagement with chatbot outputs. While chatbots were helpful in supporting comprehension and task completion, our findings also highlighted the risk of passive reliance on AI-generated answers. Educators should scaffold student use of chatbots through activities that require comparison of chatbot responses with course materials, prompting learners to reflect on accuracy, relevance, and applicability.
Third, issues related to accessibility and inclusivity must be addressed in chatbot design. The quantitative data showed a nearly even split between users and non-users, suggesting that not all students felt confident or motivated to engage with the tool. Language proficiency and digital literacy were identified as barriers by some participants. As a result, we recommend the development of chatbots with multilingual support, simplified user interfaces, and onboarding materials that support diverse learner profiles. Although participants did not directly request these features, these suggestions emerged from our interpretation of recurring usability issues and are supported by existing literature on inclusive and adaptive AI in education (Ilieva et al., 2023; Wu & Yu, 2024).
Fourth, institutions should provide structured pedagogical support for chatbot use. Our findings revealed that learners often engaged with chatbots independently, without consistent guidance on how to integrate them into their study routines. Educators should receive training on how to embed chatbot activities into their instructional design, and students should be encouraged to treat chatbots as reflective learning partners rather than automated answer providers.
The cultural-historical perspective adopted in this study has also informed our understanding of how digital agency is shaped through tool-mediated activity. Rather than viewing chatbots as static technologies, we interpreted them as cognitive mediators that helped learners externalise thought, test ideas, and regulate their own understanding. This theoretical lens added value by highlighting the process through which digital agency emerged—not just the outcomes—and underscores the need for future chatbot design to prioritise adaptive interaction, dialogic depth, and transparent scaffolding mechanisms that promote reflective autonomy.
While this study focused on a specific teacher education MOOC, the findings point to several broader directions for future research. First, it would be valuable to investigate how specific chatbot features—such as personalised feedback, conversational depth, multilingual capabilities, and adaptability—affect long-term learner engagement, knowledge retention, and the development of digital agency (Wu & Yu, 2024). Second, future studies could explore the potential of chatbots to foster collaborative learning, examining how these tools function not only in individual reflection but also in group-based digital learning environments. Third, research should consider how the integration of AI tools influences instructional design practices, particularly how educators balance pedagogical intentionality with automation in planning and delivery (Ilieva et al., 2023). Finally, continued attention is needed to address ethical concerns, including algorithmic bias, data privacy, and AI transparency, to ensure that the integration of chatbots into education remains equitable, responsible, and aligned with core educational values (Aguilera-Hermida, 2024; Ilieva et al., 2023).

7.2. Conclusions

By situating chatbot interactions within a cultural-historical framework of digital agency, this research makes a unique contribution to the field of AI in education. Specifically, it highlights how chatbots can serve as mediational tools that not only support Professional Digital Competence (PDC) but also foster transformative digital agency among educators. These findings provide actionable insights for the design of AI-driven tools and professional learning environments, offering a pathway for more equitable, reflective, and impactful integration of chatbots in online education.

Author Contributions

Conceptualization, methodology, formal analysis, investigation, resources, writing—original draft preparation, writing—review and editing, project administration, I.E.; investigation, data curation, M.N.; investigation, review and editing, T.F. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by The Norwegian Directorate for Shared ICT and Digital Services in Higher Education and Research on 22 May 2024, Ref. 888839.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data created in this project is stored at Østfold University College Service and is unavailable due to privacy and ethical restrictions.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Aagaard, T., & Lund, A. (2019). Digital agency in higher education: Transforming teaching and learning. Routledge. [Google Scholar]
  2. Aguilera-Hermida, A. P. (2024). AI chatbots in education: The importance of accuracy. International Forum of Teaching & Studies. [Google Scholar]
  3. Armellini, A., & Padilla Rodriguez, B. C. (2016). Are massive open online courses (MOOCs) pedagogically innovative? Journal of Interactive Online Learning, 14(1), 17–28. [Google Scholar]
  4. Atmosukarto, I., Sin, C. W., Iyer, P., Tong, N. H., & Yu, K. W. P. (2021, December 5–8). Enhancing adaptive online chemistry course with AI-chatbot. 2021 IEEE International Conference on Engineering, Technology & Education (TALE), Wuhan, China. [Google Scholar]
  5. Baskara, F. R. (2023). Chatbots and flipped learning: Enhancing student engagement and learning outcomes through personalised support and collaboration. IJORER: International Journal of Recent Educational Research, 4(2), 223–238. [Google Scholar] [CrossRef]
  6. Braun, V., & Clarke, V. (2014). What can “thematic analysis” offer health and wellbeing researchers? International Journal of Qualitative Studies on Health and Well-Being, 9, 26152. [Google Scholar] [CrossRef]
  7. Brevik, L. M., Gudmundsdottir, G. B., Lund, A., & Strømme, T. A. (2019). Transformative agency in teacher education: Fostering professional digital competence. Teaching and Teacher Education, 86, 102875. [Google Scholar] [CrossRef]
  8. Chang, D. H., Lin, M. P.-C., Hajian, S., & Wang, Q. Q. (2023). Educational design principles of using AI chatbot that supports self-regulated learning in education: Goal setting, feedback, and personalization. Sustainability, 15(17), 12921. [Google Scholar] [CrossRef]
  9. Cheng, L., Croteau, E., Baral, S., Heffernan, C., & Heffernan, N. (2024). Facilitating student learning with a chatbot in an online math learning platform. Journal of Educational Computing Research, 62(4), 907–937. [Google Scholar] [CrossRef]
  10. Chiu, T. K., Moorhouse, B. L., Chai, C. S., & Ismailov, M. (2024). Teacher support and student motivation to learn with Artificial Intelligence (AI) based chatbot. Interactive Learning Environments, 32(7), 3240–3256. [Google Scholar] [CrossRef]
  11. Creswell, J. W. (2012). Qualitative inquiry and research design: Choosing among five approaches. Sage. [Google Scholar]
  12. Deng, X., & Yu, Z. (2023). A meta-analysis and systematic review of the effect of chatbot technology use in sustainable education. Sustainability, 15(4), 2940. [Google Scholar] [CrossRef]
  13. Deveci Topal, A., Dilek Eren, C., & Kolburan Geçer, A. (2021). Chatbot application in a 5th grade science course. Education and Information Technologies, 26(5), 6241–6265. [Google Scholar] [CrossRef]
  14. Du, J., Huang, W., & Hew, K. F. (2021, December 5–8). Supporting students goal setting process using chatbot: Implementation in a fully online course. 2021 IEEE International Conference on Engineering, Technology & Education (TALE), Wuhan, China. [Google Scholar]
  15. Edwards, A. (2015). Designing tasks which engage learners with knowledge. In I. Thompson (Ed.), Designing tasks in secondary education: Enhancing subject understanding and student engagement (pp. 13–27). Routledge. [Google Scholar]
  16. Engeness, I. (2020). Developing teachers’ digital identity: Towards the pedagogic design principles of digital environments to enhance students’ learning in the 21st century. European Journal of Teacher Education, 44(1), 96–114. [Google Scholar] [CrossRef]
  17. Engeness, I. (2021a). PY Galperin’s development of human mental activity: Lectures in educational psychology (Vol. 14). Springer Nature. [Google Scholar] [CrossRef]
  18. Engeness, I. (2021b). Tools and signs in massive open online courses: Implications for learning and design. Human Development, 65(4), 221–233. [Google Scholar] [CrossRef]
  19. Engeness, I., & Gamlem, S. M. (2025). Exploring AI-driven feedback as a cultural tool: A cultural-historical perspective on design of AI environments to support students’ writing process. Integrative Psychological and Behavioral Science, 59(1), 23. [Google Scholar] [CrossRef]
  20. Engeness, I., & Nohr, M. (2020). Engagement in learning in the massive open online course: Implications for epistemic practices and development of transformative digital agency with pre-and in-service teachers in Norway. Cultural-Historical Psychology, 16(3), 71–82. [Google Scholar] [CrossRef]
  21. Engeness, I., & Nohr, M. (2022). Assessment as Learning: Use of reflection videos in the massive open online course to enhance learning and digital identity among pre-and in-service teachers in Norway. Forum Oświatowe. [Google Scholar]
  22. Engeström, Y. (1987). Learning by expanding: An activity-theoretical approach to developmental research. Cambridge University Press. [Google Scholar]
  23. Engeström, Y. (1999). Activity theory and individual and social transformation. In Perspectives on activity theory (pp. 19–38). Cambridge University Press. [Google Scholar]
  24. Fidalgo-Blanco, Á., Sein-Echaluce, M. L., & García-Peñalvo, F. J. (2016). From massive access to cooperation: Lessons learned and proven results of a hybrid xMOOC/cMOOC pedagogical approach to MOOCs. International Journal of Educational Technology in Higher Education, 13(1), 24. [Google Scholar] [CrossRef]
  25. Fidan, M., & Gencel, N. (2022). Supporting the instructional videos with chatbot and peer feedback mechanisms in online learning: The effects on learning performance and intrinsic motivation. Journal of Educational Computing Research, 60(7), 1716–1741. [Google Scholar] [CrossRef]
  26. Han, S., & Lee, M. K. (2022). FAQ chatbot and inclusive learning in massive open online courses. Computers & Education, 179, 104395. [Google Scholar] [CrossRef]
  27. Hew, K. F., Huang, W., Du, J., & Jia, C. (2023). Using chatbots to support student goal setting and social presence in fully online activities: Learner engagement and perceptions. Journal of Computing in Higher Education, 35(1), 40–68. [Google Scholar] [CrossRef]
  28. Hwang, G.-J., & Chang, C.-Y. (2023). A review of opportunities and challenges of chatbots in education. Interactive Learning Environments, 31(7), 4099–4112. [Google Scholar] [CrossRef]
  29. Ilieva, G., Yankova, T., Klisarova-Belcheva, S., Dimitrov, A., Bratkov, M., & Angelov, D. (2023). Effects of generative chatbots in higher education. Information, 14(9), 492. [Google Scholar] [CrossRef]
  30. Jeon, J., Lee, S., & Choe, H. (2023). Beyond ChatGPT: A conceptual framework and systematic review of speech-recognition chatbots for language learning. Computers & Education, 206, 104898. [Google Scholar]
  31. Lee, L. K., Fung, Y. C., Pun, Y. W., Wong, K. K., Yu, M. T. Y., & Wu, N. I. (2020, August 24–27). Using a multiplatform chatbot as an online tutor in a university course. 2020 International Symposium on Educational Technology (ISET), Bangkok, Thailand. [Google Scholar]
  32. Mäkitalo, Å. (2016). On the notion of agency in studies of interaction and learning. Learning, Culture and Social Interaction, 10, 64–67. [Google Scholar] [CrossRef]
  33. Patton, M. Q. (2015). Qualitative research and methods: Integrating theory and practice. SAGE Publications. [Google Scholar]
  34. Sáiz-Manzanares, M. C., Marticorena-Sánchez, R., Martín-Antón, L. J., Díez, I. G., & Almeida, L. (2023). Perceived satisfaction of university students with the use of chatbots as a tool for self-regulated learning. Heliyon, 9(1), e12843. [Google Scholar] [CrossRef] [PubMed]
  35. Siddiq, F., Røkenes, F. M., Lund, A., & Scherer, R. (2024). New kid on the block? A conceptual systematic review of digital agency. Education and Information Technologies, 29(5), 5721–5752. [Google Scholar] [CrossRef]
  36. Vygotsky, L. (1978). Interaction between learning and development. Readings on the Development of Children, 23(3), 34–41. [Google Scholar]
  37. Wu, R., & Yu, Z. (2024). Do AI chatbots improve students learning outcomes? Evidence from a meta-analysis. British Journal of Educational Technology, 55(1), 10–33. [Google Scholar] [CrossRef]
Table 1. The number of respondents to the questionnaire in 2023–24 and their general evaluation of the ICTPED MOOC.

| Years | Number of Respondents | Male/Female | Professional Background | General Evaluation of the ICTPED MOOC (Mean) |
| --- | --- | --- | --- | --- |
| 2023–2024 | 46 | Male = 5; Female = 41 | In-service teachers = 88%; Pre-service teachers = 9%; Other = 3% | Very little satisfied = 2.2%; Little satisfied = 2.2%; Somewhat satisfied = 6.5%; Strongly satisfied = 65.2%; Very strongly satisfied = 26.1% |
Table 2. Progress plan and the modules in the ICTPED MOOC.

| Module Number | Module Name | Progress Plan (Week) |
| --- | --- | --- |
| 0 | Pre-course | 2 |
| 1 | ICT and learning | 3–4 |
| 2 | Digital studying techniques | 5–6 |
| 3 | Multimodal texts (examination module) | 7–9 |
| 4 | Cyber ethics | 10–11 |
| 5 | Classroom management in digital learning environments | 12–13 |
| 6 | Assessment for learning | 14–15 |
| 7 | AI in education | 16–17 |
| 8 | Flipped classroom (examination module) | 18–21 |