Article

Voices from the Flip: Teacher Perspectives on Integrating AI Chatbots in Flipped English Classrooms

Yingxue Ling and Jariah Mohd Jan
1 Faculty of Social Sciences and Liberal Arts, UCSI University, Kuala Lumpur 56000, Malaysia
2 Department of English Language, Faculty of Languages and Linguistics, Universiti Malaya, Kuala Lumpur 50603, Malaysia
* Author to whom correspondence should be addressed.
Educ. Sci. 2025, 15(9), 1219; https://doi.org/10.3390/educsci15091219
Submission received: 15 July 2025 / Revised: 12 September 2025 / Accepted: 13 September 2025 / Published: 15 September 2025

Abstract

Drawing on the Technological Pedagogical Content Knowledge (TPACK) framework, this qualitative case study investigates how university English teachers integrate AI chatbots into flipped classrooms. Findings reveal that teachers employed chatbots across multiple pedagogical functions—including vocabulary support, grammar explanation, dialogue simulation, and creative content generation—embedded purposefully into both pre-class preparation and in-class collaboration. Rather than passively adopting these tools, teachers strategically positioned chatbots to enhance student autonomy, confidence, and interaction, while tailoring their use to suit specific flipped classroom designs. Meanwhile, teachers acknowledged the risks of over-reliance on AI chatbot content and the disruptions caused by vague or incorrect responses. They responded by developing structured guidance and reframing their roles as facilitators rather than content deliverers. This study contributes new insights into teacher agency in AI-mediated language education, highlighting the complex pedagogical negotiations required to meaningfully integrate emerging technologies into flipped learning environments.

1. Introduction

Artificial intelligence (AI) chatbots—computer programs designed to simulate human conversation through natural language processing—have rapidly gained attention across various industries, transforming how users interact with technology. In e-retailing, chatbots enhance online customer experience (Chen et al., 2021); in travel and hospitality, they assist with bookings and real-time updates (Pillai & Sivathanu, 2020); and in human resources, they streamline recruitment by answering questions and scheduling interviews (Majumder & Mondal, 2021). Together, these examples highlight the expanding role of AI chatbots in improving service delivery and user engagement across sectors.
In the education sector, the release of advanced conversational AI models such as ChatGPT has accelerated public and academic interest in AI’s potential to support learning and teaching. Despite concerns about bias and misinformation inherent in general-purpose chatbots, AI-driven conversational agents tailored for educational purposes offer promising opportunities for personalized, accessible support to learners anytime and anywhere (Wollny et al., 2021). In higher education, AI technologies, including chatbots, are increasingly explored as tools to enhance student engagement, provide instant feedback, and scaffold learning outside traditional classroom settings (Zawacki-Richter et al., 2019).
What happens when two technology-enhanced innovations—AI chatbots and flipped learning—collide in the language classroom? On their own, each offers transformative potential: flipped learning redesigns instructional time by shifting foundational learning outside the classroom (Bishop & Verleger, 2013), while AI chatbots provide scalable, real-time conversational support powered by artificial intelligence (Diwanji et al., 2018).
Recent studies highlight the educational value of AI chatbots in language learning. They enhance linguistic outcomes such as writing fluency (Boudouaia et al., 2024), grammatical accuracy (Kucuk, 2024), and conversational competence (Huang et al., 2022), while also supporting emotional development by boosting motivation, reducing anxiety, and fostering perseverance (Chiu et al., 2024; Ghafouri, 2023). At the same time, chatbots pose risks: they can generate inaccurate responses (Bašić et al., 2023; Guo et al., 2023) and may encourage plagiarism or over-reliance, undermining authentic practice and critical thinking (Yan, 2023; Cai et al., 2023). These issues call for careful teacher guidance to ensure chatbots support, rather than hinder, learning goals (Chiu et al., 2024; Ho, 2024). In this regard, the flipped classroom offers a strategic response: by shifting foundational learning to the pre-class phase and using class time for teacher-facilitated engagement, it positions teachers to shape how students use AI tools.
Likewise, a growing body of international research has demonstrated that flipped learning can improve academic performance, strengthen student motivation, and foster more active participation (Asef-Vaziri, 2015; Goedhart et al., 2019; Kurt, 2017; Tsai, 2021; Unal & Unal, 2017). However, the model is not without its challenges. Learners may struggle with increased preparation demands, digital access, and the need for greater self-regulation (Jakobsen & Knetemann, 2017; Paul et al., 2019). Teachers, in turn, must assume more complex roles as designers of active learning experiences, facilitators of in-class interaction, and supporters of diverse learner needs (Yıldız et al., 2022). In this context, AI chatbots have emerged as a promising tool that helps ease some of these difficulties and enrich the flipped learning experience (Diwanji et al., 2018).
Despite growing interest in flipped learning and AI chatbots, research combining the two remains limited. Integrating chatbots into flipped learning is still an emerging concept (Lo & Hew, 2023). While reviews have examined chatbot use in education (e.g., Wollny et al., 2021), few address flipped contexts. Wollny et al. (2021) identified only one such study (Huang et al., 2019), and Lo and Hew’s (2023) mini review found just ten. Most of these relied on quantitative or mixed methods and focused on students’ experiences. Some studies show chatbots can scaffold pre-class learning, enhance speaking fluency, and increase interaction (Huang et al., 2019; Jeon & Lee, 2024; Lin & Mubarok, 2021), while others highlight their role in promoting self-regulated learning, personalized feedback, or microlearning to foster higher-order thinking and motivation (Gonda & Chu, 2019; Han et al., 2025; Silitonga et al., 2024). Collectively, these works demonstrate pedagogical potential but share a key limitation: they overlook a central actor in the flipped classroom—the teacher. Little is known about how instructors implement and adapt chatbot technologies in flipped classrooms, a gap that is particularly significant in EFL contexts where teacher mediation is central to technology integration (Yıldız et al., 2022).
This gap is not new to flipped classroom research more broadly. As noted in previous studies, the majority of flipped classroom literature continues to center on students and content delivery, with very few studies explicitly examining the teacher’s role (Yıldız et al., 2022). Yet teachers are central to the flipped model: they plan learning sequences, develop materials, and guide students through each phase (Davies et al., 2013). The integration of AI chatbots may further reshape these responsibilities, requiring teachers to adapt strategies, redesign learning to fit chatbot functions, and make new decisions about AI use. Teacher attitudes strongly influence AI adoption (Rapti & Panagiotidis, 2024). Yavuz et al. (2025) also show that AI-enhanced flipped classrooms can improve discussion, interaction, and learning outcomes, though causal evidence is limited. Systematic reviews further indicate that AI chatbots can enhance speaking skills, confidence, and engagement (Du & Daniel, 2024), with benefits mediated by sociocultural and pedagogical factors (Li et al., 2025). Recent work on tailored AI chatbots in graduate-level flipped courses (Tang et al., 2025) emphasizes that thoughtful design and teacher facilitation are essential for realizing these gains. Collectively, these studies confirm the pedagogical potential of chatbots, yet teachers’ own accounts of implementing and adapting them in flipped classrooms remain scarce, a gap that is particularly pressing in EFL contexts where teacher mediation is crucial for effective technology integration (Yıldız et al., 2022).
In light of this, the present study focuses on university English teachers’ experiences with integrating AI chatbots into flipped classrooms. It aims to explore how teachers engage with this emerging technology, how it affects their practice, and what challenges they encounter during implementation. This study is guided by the following research questions:
  • RQ1: How do university English teachers experience and describe the integration of AI chatbots in flipped classrooms?
  • RQ2: What effects do teachers perceive AI chatbots have on students in flipped classrooms?
  • RQ3: What challenges do teachers encounter when implementing AI chatbots in their flipped teaching and how do teachers respond to these challenges?

2. Theoretical Framework

To understand how university English teachers engage with AI chatbots in flipped classroom settings, this study draws upon the Technological Pedagogical Content Knowledge (TPACK) framework (Mishra & Koehler, 2006). At its core, TPACK emphasizes that effective teaching with technology requires not only content knowledge (CK), pedagogical knowledge (PK), and technological knowledge (TK), but also their intersections: pedagogical content knowledge (PCK), technological content knowledge (TCK), and technological pedagogical knowledge (TPK). Together, these domains form an integrated model of teacher knowledge that is especially relevant in the flipped classroom, where teachers must transform subject expertise into accessible pre-class resources (PCK), employ digital tools for content delivery (TCK), and design in-class activities where technology enhances collaboration and communication (TPK). At the TPACK level, teachers integrate these domains holistically to create learning environments that are pedagogically sound, content-appropriate, and technologically effective.
The addition of AI chatbots introduces further complexity within this framework, as teachers must now consider how these tools influence both content delivery and student interaction in and outside the classroom. Teachers must mobilize TK to grasp the affordances and constraints of chatbot technologies, CK to evaluate the linguistic accuracy and appropriateness of chatbot outputs, and PK to scaffold student engagement with automated dialogue partners. These domains intersect in critical ways: TCK is evident when teachers consider how chatbots reshape access to authentic language practice, TPK emerges in the design of chatbot-mediated interaction tasks, and PCK underpins the alignment of chatbot use with English language learning goals. At the integrated TPACK level, teachers’ decision-making reflects how they balance pedagogical opportunities with concerns over reliability, bias, and ethical use. While TPACK has been critiqued as largely descriptive, applying it to AI-mediated flipped learning environments extends its analytical reach and provides a valuable lens for understanding the evolving role of teacher knowledge and mediation in technology-rich pedagogy.

3. Methodology

3.1. Design of the Study

This study employed a qualitative single-case study design to explore the experiences of English teachers integrating AI chatbots in flipped classrooms at a Chinese university. A qualitative approach was chosen to gain a deep, contextual understanding of how teachers interpret and respond to this pedagogical shift in their natural teaching environments (Creswell, 2013). The case involved a bounded system (Yin, 2014)—six English teachers engaged in chatbot-supported flipped teaching during the 2024–2025 academic year at a university in Chengdu, Sichuan Province, China. The university is equipped with smart classrooms that support online interactive instructional models and facilitate the use of chatbots in flipped learning. Given China’s internet governance policies, international chatbot tools like ChatGPT 5 are not commonly accessible in public university settings. Instead, participating teachers primarily used Baidu’s ERNIE Bot—a large language model developed and localized for Chinese users. ERNIE Bot is an advanced generative AI chatbot capable of real-time speech recognition, personalized learning support, classroom instruction assistance, and automated feedback provision. Recent studies have demonstrated its effectiveness in facilitating language learning, highlighting its potential as a valuable tool in language education (Wang & Xue, 2024).

3.2. Participants

Participants were selected through purposive sampling to ensure rich, relevant insights. All six teachers had at least five years of teaching experience and had received training in using AI for language teaching. They were recommended by the Director of the English Department and selected based on availability and willingness to contribute. All participants were required by the university to join the Flipped Program, in which foundational content was delivered online before class, while in-class time was devoted to interactive, teacher-facilitated activities such as discussions, debates, and collaborative tasks. AI chatbots were integrated as an instructional intervention to support both pre-class autonomous learning and in-class collaborative activities. Teachers received training on chatbot use and were guided to incorporate them strategically into their courses, allowing the study to examine how instructors mediated AI tools and adapted their pedagogical roles. To confirm these practices, a brief pre-screening survey was administered, and participants’ syllabi, lesson plans, and pre-class materials were reviewed. During interviews, participants also described their teaching practices, which were cross-checked against these materials. Each teacher taught one to two courses, with class sizes ranging from 25 to 35 students, over a one-semester period (16 weeks).
Ethical approval was obtained, and all participants gave informed consent. Anonymity was preserved using coded identifiers (T1–T6), which were used in reporting quotations. During classroom observations, students were informed of the study and that their activities might be observed for research purposes. No identifying data were collected, and all interactions were anonymized to protect student privacy. Participant demographic information is summarized in Table 1.

3.3. Data Collection

Data were collected in two phases: classroom observations and semi-structured interviews. Semi-structured observations were conducted in the third week of the fall 2024 semester. The researcher observed a total of eight class sessions as a non-participating observer, taking detailed field notes on the classroom environment, student–teacher interactions, and teaching practices to supplement the interview data and reduce self-report bias (Yin, 2014). No identifiable student data were recorded, and all observations were anonymized to ensure privacy.
Following the observations, one-on-one semi-structured interviews were conducted via Tencent Meeting, each lasting 30–45 min. Open-ended questions encouraged participants to reflect on their experiences. An interview protocol was followed to ensure consistency, and all questions were pilot-tested for clarity (see Table 2 for detailed interview questions). Interviews were recorded with consent and transcribed for analysis.

3.4. Data Analysis

Thematic analysis was used as the method of data analysis in this research. This method involves identifying, organizing, describing, analyzing, and reporting themes present in a data set. Although relatively simple to learn, it is nonetheless an effective method for examining the experiences of research participants (Braun & Clarke, 2014). Codes were generated inductively, grouped into categories, and refined into overarching themes representing shared experiences across participants. The data analysis process is presented in Figure 1.
As can be seen in Figure 1, the researcher first became familiar with the data by repeatedly reviewing the interview transcripts and observation notes, analyzing each source separately before combining them. In vivo coding was used to capture participants’ actual language and reduce researcher bias (Manning, 2017). Initial hand coding was conducted before importing the transcripts into MAXQDA (VERBI Software, 2024) to identify patterns and organize themes. Themes were generated by clustering codes into categories and subthemes, guided by the TPACK framework. Initial open coding generated numerous descriptive codes related to chatbot use, teaching strategies, and language content. These codes were then categorized under the TPACK domains as follows:
  • Technological Knowledge (TK): Teachers’ understanding and use of chatbot features, usability, and technical affordances.
  • Pedagogical Knowledge (PK): Teachers’ approaches to flipped learning, lesson design, scaffolding, and interaction facilitation.
  • Content Knowledge (CK): Teachers’ expertise in English language teaching, focusing on specific language skills and learning objectives.
Further, the intersections of these domains were identified to capture integrated knowledge areas:
  • Technological Pedagogical Knowledge (TPK): How teachers use chatbots as pedagogical tools within flipped classrooms.
  • Pedagogical Content Knowledge (PCK): How chatbots in flipped learning align with English language teaching goals.
  • Technological Content Knowledge (TCK): How chatbot technology supports specific language content.
  • Technological Pedagogical Content Knowledge (TPACK): The holistic integration of technology, pedagogy, and content in teachers’ chatbot-mediated flipped instruction.
Themes were reviewed and refined to ensure relevance, and a narrative of participants’ experiences in flipped English classes was constructed using quotes, visual data, and literature.
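To make this categorization step concrete, the minimal sketch below (in Python) illustrates how open codes of the kind generated here might be sorted under TPACK domains before theme refinement. The code labels and domain assignments are hypothetical examples for illustration only; the actual coding was carried out by hand and in MAXQDA as described above.

```python
# Illustrative sketch only: hypothetical open codes grouped under TPACK domains.
# The actual analysis was conducted by hand and in MAXQDA; these labels are
# examples, not the study's codebook.
from collections import defaultdict

# Hypothetical open codes paired with the TPACK domain they were sorted into.
coded_segments = [
    ("asks chatbot for extra grammar examples", "TK"),
    ("designs pre-class vocabulary task", "PK"),
    ("checks accuracy of chatbot idioms", "CK"),
    ("builds role-play around chatbot dialogue", "TPK"),
    ("aligns chatbot review with lesson objectives", "PCK"),
    ("uses chatbot to paraphrase reading texts", "TCK"),
    ("redesigns the flipped sequence around chatbot output", "TPACK"),
]

# Group codes by domain, mirroring the categorization step shown in Figure 1.
codes_by_domain = defaultdict(list)
for code, domain in coded_segments:
    codes_by_domain[domain].append(code)

for domain, codes in sorted(codes_by_domain.items()):
    print(f"{domain}: {codes}")
```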

3.5. Trustworthiness

Trustworthiness in qualitative research ensures that findings are credible, transferable, dependable, and confirmable (Lincoln & Guba, 1985). This study enhanced credibility through methodological triangulation of interviews and observations, providing a comprehensive understanding of the flipped classroom (Yin, 2014). Member checking was also used, allowing participants to review and validate the data and interpretations (Patton, 2015). An academic expert participated in the process as an interrater, with a calculated agreement of 84%, exceeding the 80% benchmark for reliability.
To ensure transferability, thick descriptions were provided, and participants were purposefully sampled from an art university with a high proportion of art majors. Dependability was addressed through detailed protocols for interviews, observations, independent review of the coding process, and re-analysis after a time gap. A pilot study was conducted to refine the interview design, improving data consistency (Patton, 2015). Confirmability was established by grounding interpretations in data and achieving credibility, transferability, and dependability, thereby ensuring the overall trustworthiness of the study (Lincoln & Guba, 1985).
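As a companion to the interrater check described above, the short sketch below shows one common way to compute simple percent agreement between two coders. The domain labels are invented for illustration; the 84% reported in this study was calculated over its own coded interview data.

```python
# Minimal sketch: simple percent agreement between two coders over the same
# segments. The labels below are invented for illustration; the 84% reported
# in this study was computed over its own coded interview data.

def percent_agreement(coder_a, coder_b):
    """Return the share of segments that both coders labeled identically."""
    if len(coder_a) != len(coder_b) or not coder_a:
        raise ValueError("Both coders must label the same non-empty set of segments.")
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

# Hypothetical TPACK domain labels from the researcher and the academic expert.
researcher = ["TPK", "TK", "PCK", "TPK", "CK", "TCK"]
expert = ["TPK", "TK", "PCK", "TPK", "PK", "TCK"]

print(f"Agreement: {percent_agreement(researcher, expert):.0%}")  # prints "Agreement: 83%"
```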

4. Findings

This section presents the main findings of this study, based on in-depth interviews with university English teachers who integrated AI chatbots into their flipped classroom practice. A thematic analysis was conducted using a hybrid approach: deductive coding informed by the TPACK framework and inductive coding emerging directly from the data. Initial codes were guided by constructs of technological, pedagogical, and content knowledge, as well as AI integration considerations. These codes were iteratively refined through repeated reading of interview transcripts and classroom observation notes, allowing themes and subthemes to emerge naturally from the data while remaining grounded in the theoretical framework.

4.1. Integration of Chatbots in Flipped Classroom

The first research question explores how teachers experience and describe the use of chatbots in the flipped classroom. Three interconnected themes emerged from the analysis: (1) the way teachers conceptualized the chatbot’s role, (2) the pedagogical functions assigned to chatbot use, and (3) the alignment of chatbot use with the principles of flipped learning. Figure 2 presents an overview of the integration.
As can be seen from Figure 2, teachers conceptualized the chatbot in multiple pedagogical roles, each informing their classroom use. Some described it as a practical tool for students to complete language exercises independently, while others emphasized its role as a feedback provider for writing or grammar tasks. Several teachers viewed the chatbot as a conversation partner to help students build fluency in a low-stress setting, and a few saw it as a content generator—a source of model texts or error-based prompts for in-class analysis. While most teachers identified one or two roles for chatbots, some (e.g., T1, T3) described using the chatbot in multiple ways. Some of the direct quotations from the statements of the participants were as follows:
“It’s like a pre-class helper. They use it to review language points on their own.”
(T5)
“Students ask the chatbot to check their grammar, and it gives instant corrections. This helps them reflect before class.”
(T3)
“It gives them a chance to practice speaking without fear. They’re more willing to try because no one is watching them.”
(T4)
“Sometimes I ask it to give examples and then we judge (critique) and revise it together.”
(T1)
Across both pre-class and in-class phases, teachers employed chatbots to support a wide range of language learning tasks. These included vocabulary building, dialogue generation, grammar explanation, creative language production, and reading comprehension support. Teachers reported that chatbot use helped students prepare foundational knowledge and rehearse communicative language in a more autonomous way. Some of the direct quotations from the statements of the participants were as follows:
“I give them vocabulary lists and tell them to ask the chatbot for examples, affixes, suffixes, meanings and so on.”
(T1)
“Before class, students ask the bot to generate dialogues about the topic. We use them in pairs during activities.”
(T6)
“If they don’t understand a grammar point in the video, they ask the chatbot to explain in Chinese or give more examples.”
(T5)
“They like to ‘chat’ with it to come up with funny dialogues or imaginative scenarios.”
(T4)
“They use it to paraphrase reading texts or explain difficult words.”
(T2)
Teachers integrated chatbot use in ways that strongly aligned with the flipped classroom model. During the pre-class phase, students were encouraged to use the chatbot for self-paced vocabulary practice, grammar review, or text analysis, fostering autonomous learning. In the in-class phase, teachers built collaborative activities around the chatbot outputs—such as role-plays using student-generated dialogues, group corrections of chatbot feedback, or discussions based on chatbot-created examples. Some of the direct quotations from the statements of the participants were as follows:
“I ask them to work with the chatbot before class, especially for vocabulary and grammar review. This way, they come in with some basic understanding.”
(T3)
“In class, we take the dialogues they made with the chatbot and act them out in groups or improve them together.”
(T6)

4.2. Effects of Integrating Chatbots in Flipped Classroom

Across the interviews, teachers expressed a wide range of perceptions regarding how AI chatbots influenced student learning and behavior in flipped English classrooms. These effects were organized into two major thematic categories: positive influences on learning and emerging concerns. Teachers’ reflections revealed that while chatbots often supported student engagement, confidence, and preparation, they also brought risks of over-reliance, confusion, and diminished interpersonal interaction. Figure 3 presents an overview of these effects.
As can be seen in Figure 3, integrating chatbots in the flipped classroom led to a range of observable effects on students, as perceived by teachers. These effects—both supportive and problematic—reflect how teachers perceive chatbot use reshaping students’ engagement, preparation, confidence, and interaction patterns in the flipped learning model.
The supportive effects were evident in four areas: enhanced autonomy during pre-class preparation, increased confidence in language use, more sustained interaction with learning content, and stronger motivation to experiment with language. Teachers observed that chatbots helped foster a stronger sense of learner autonomy. Students were able to complete vocabulary review, grammar explanations, and reading support on their own, often without prompting from the teacher. Several teachers also noted that students seemed to take more ownership of their learning when given access to an AI-based tool that could respond instantly to their questions. This supported one of the core intentions of the flipped model—shifting foundational learning outside the classroom. Some of the direct quotations from the statements of the participants were as follows:
“They explore language on their own before class. They don’t always need to wait for a teacher to explain something.”
(T2)
“They use it to test their own understanding, especially vocabulary and grammar.”
(T1)
“They use the chatbot to check grammar or confirm a meaning before coming to class. It helps them feel more prepared.”
(T4)
Teachers noted that chatbots played a role in reducing student anxiety and promoting greater confidence—especially among lower-proficiency or introverted students. The chatbot provided a non-judgmental, private space where students could rehearse language use without fear of embarrassment. This safe environment encouraged them to take more linguistic risks. Teachers also pointed out that chatbots often included positive, encouraging feedback—such as compliments on well-formed sentences or praise for creative output. This built students’ emotional engagement and contributed to their growing confidence. Some of the direct quotations from the statements of participants were as follows:
“I notice some shy students used to say nothing. But after practicing with the chatbot, they are more confident to speak in class.”
(T5)
“They told me that it gives them a chance to practice without fear. The bot doesn’t judge, so they’re more willing to try.”
(T3)
“One of my students said to me that the encouragement from the bot makes him feel successful, things like ‘Great job!’ or ‘You may have an accent but remember this is a natural thing. You’re doing great’. Students feel happy when they see that. Even if it is automatic, they feel motivated”
(T6)
Teachers also described how chatbots encouraged more sustained and meaningful engagement with learning materials during in-class activities. Instead of spending valuable class time on basic comprehension or initial practice, teachers and students could focus on deeper interaction, such as discussing nuances, refining language use, or applying concepts in collaborative tasks like role-plays or debates. Chatbots effectively served as a personalized pre-class tutor, enabling students to experiment and test their understanding in a low-pressure environment. This led to richer, more sustained peer-to-peer and teacher–student interactions during class. Some of the direct quotations from the statements of participants were as follows:
“They bring chatbot-generated dialogues to class, and we build on them. It’s a good starting point.”
(T4)
“One group had already done a role-play with the chatbot. So, in class, we focused on refining it rather than starting from zero.”
(T1)
“Chatbots give students a chance to practice before class, so in the classroom we can dive deeper into discussion and application. It really shifts the focus from basic comprehension to higher-level thinking.”
(T2)
Teachers observed that chatbots encouraged students to engage in playful and creative experimentation with language. Unlike traditional exercises that often prioritize accuracy and memorization, interactions with chatbots encouraged students to explore language more freely and creatively. Students did not just complete assigned tasks—they went beyond, using chatbots to rephrase sentences, adjust tone and style, or generate alternative expressions. This form of linguistic play made the learning process more engaging and personally meaningful. Some of the direct quotations from the statements of participants were as follows:
“They like playing with it. It’s more fun than just memorizing.”
(T3)
“One student asked the chatbot to rewrite his paragraph in different styles—he said it helped him understand tone.”
(T5)
“They enjoy experimenting, like making role-play scripts or asking for idioms. It makes them curious.”
(T6)
The negative effects were evident in two areas: over-reliance on chatbot answers and confusion due to inaccurate or vague responses. Several teachers expressed concern that students were becoming overly dependent on chatbot-generated content. Instead of using the chatbot as a support tool to enhance understanding or test ideas, some students treated it as a shortcut—copying answers directly without processing or critically engaging with the content. This undermined the goal of fostering independent thinking and active language production, particularly in a flipped classroom where pre-class preparation is intended to lay the groundwork for deeper engagement. Some of the direct quotations from the statements of participants were as follows:
“I noticed a few students just copied the chatbot’s answer without trying to understand it. They rely on it too much.”
(T3)
“There are students who tend to see the bot as a magic helper. When they get an answer, they stop thinking. It’s like, ‘Done, chatbot gave it to me.’”
(T4)
“Some treat it like a Translator—they don’t bother to write their own drafts anymore.”
(T1)
Teachers also observed that the chatbot occasionally produced vague, overly generic, or even incorrect responses—especially when students asked nuanced or open-ended questions. While more advanced learners could detect or question these inaccuracies, lower-proficiency students often accepted them at face value. This created confusion during in-class discussions and added to teachers’ workload, as they had to spend time correcting misconceptions or clarifying chatbot-generated content. Some of the direct quotations from the statements of participants were as follows:
“Sometimes the chatbot gives vague or strange examples, and students don’t know it’s wrong. I have to correct it in class.”
(T2)
“One student used an idiom the chatbot gave him, but it didn’t fit the context at all. He thought it sounded natural.”
(T5)
“The answers can be too general, and students don’t always realize that. It can confuse them more than help.”
(T1)

4.3. Challenges and Solutions of Integrating Chatbots in Flipped Classroom

While teachers generally acknowledged the potential of AI chatbots to enrich flipped English instruction, they also reported significant challenges that emerged during classroom implementation. These challenges were primarily cognitive and motivational in nature, disrupting established workflows and reshaping teacher–student dynamics. Two major themes were consistently identified: (1) the increased instructional complexity and workload caused by the unpredictability of chatbot use, and (2) growing ambiguity around the teacher’s role in relation to AI-mediated learning. Despite these pressures, teachers demonstrated a strong sense of adaptability by developing context-sensitive strategies to manage these tensions. Their solutions ranged from providing students with structured chatbot usage guidelines to reframing their own pedagogical roles—positioning themselves as critical mediators and discussion leaders, particularly for deeper interpretive tasks. Figure 4 presents an overview of this.
One of the most frequently reported challenges was the added complexity introduced by chatbot integration. In a traditional flipped classroom, pre-class tasks are predictable and teacher-designed. However, when students interact with AI chatbots independently, they often bring diverse, sometimes unexpected outputs to class—ranging from sophisticated drafts to problematic or incorrect examples. This unpredictability disrupted lesson planning and required teachers to exert additional cognitive effort to assess, validate, and integrate these varied materials in real time. In addition to real-time adjustments, teachers also reported an increased workload in reviewing chatbot-generated texts, identifying errors, and reorienting students during class discussions. This level of constant vigilance created stress and, in some cases, reduced motivation to adopt new technologies. Some of the direct quotations from the statements of participants were as follows:
“I don’t know what the chatbot has told them until I see their drafts. Sometimes I have to change my plan right before class.”
(T2)
“It’s like I have to teach the chatbot too—make sure it’s not confusing them.”
(T3)
“I spend more time checking what the bot gave them than checking their own ideas.”
(T4)
“It’s useful, but honestly, sometimes it makes my work harder, not easier.”
(T1)
A second major challenge concerned teachers’ evolving sense of professional identity and authority. In flipped classrooms supported by AI chatbots, students often turned to the chatbot for explanations, drafts, or revisions before seeking help from teachers. This shift led some educators to feel that their centrality in the learning process was being eroded, especially when students relied heavily on chatbot feedback without questioning its validity. This dynamic created motivational tension, as teachers had to navigate a new instructional role—less as direct knowledge providers and more as facilitators, evaluators, and mediators of chatbot–student interactions. Some of the direct quotations from the statements of participants were as follows:
“They show me what the chatbot said and expect me to just agree. Sometimes I have to explain why it’s wrong, and they’re surprised.”
(T2)
“It’s strange—before, they asked me first. Now they ask the chatbot, then maybe me.”
(T6)
In response to the increased complexity and shifting pedagogical dynamics introduced by chatbot integration, teachers demonstrated adaptive expertise by developing a set of strategies. These strategies helped them manage instructional workload, reassert their professional roles, and guide students toward more responsible and meaningful use of AI chatbots. Some teachers developed structured guidance for chatbot use to improve students’ chatbot literacy (an illustrative sketch of such scaffolding follows the quotations below). This guidance typically involved the following:
  • Providing model prompts to help students ask more precise, purposeful questions.
  • Teaching evaluation strategies so students could judge the relevance, accuracy, and appropriateness of chatbot responses.
  • Distributing checklists or reflection tasks that encouraged students to verify, revise, or supplement the chatbot’s suggestions with their own thinking before class.
Some of the direct quotations from the statements of participants were as follows:
“At first, they typed vague questions like ‘help me write this’, and the results weren’t useful. So I taught them to ask specific things like ‘Can you give me a formal version of this sentence? Could you explain more about it? That works better.”
(T3)
“I give them three questions to reflect on after using the chatbot: What did it help you with? What parts did you change? And why?”
(T2)
“If I don’t guide them, they just copy and paste. But when I give examples and ask them to evaluate, they start thinking more critically.”
(T6)
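The structured guidance described above (model prompts, evaluation questions, and post-use reflection) can be pictured as a simple template. The sketch below is a hypothetical illustration of such prompt scaffolding and a reflection checklist, loosely based on the practices teachers quoted; it does not call any real chatbot service, and the template wording is an assumption rather than the teachers’ actual materials.

```python
# Hypothetical illustration of prompt scaffolding and a reflection checklist,
# modeled loosely on the guidance teachers described. It does not call any
# real chatbot service; templates and questions are illustrative assumptions.

MODEL_PROMPTS = {
    "formal_rewrite": "Can you give me a formal version of this sentence: {text}",
    "grammar_explain": "Please explain the grammar of this sentence and give two more examples: {text}",
    "vocabulary": "For the word '{text}', list its meaning, common affixes, and two example sentences.",
}

REFLECTION_QUESTIONS = [
    "What did the chatbot help you with?",
    "Which parts of its answer did you change, and why?",
    "What will you still ask the teacher about in class?",
]

def build_prompt(task: str, text: str) -> str:
    """Turn a vague request into a purposeful prompt using a model template."""
    return MODEL_PROMPTS[task].format(text=text)

if __name__ == "__main__":
    print(build_prompt("formal_rewrite", "gimme more time for the homework"))
    print("\nAfter using the chatbot, reflect on:")
    for question in REFLECTION_QUESTIONS:
        print("-", question)
```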
In addition, rather than viewing chatbots as replacements for their instructional authority, teachers repositioned themselves as facilitators of critical engagement. They emphasized their role in helping students interpret, evaluate, and refine chatbot-generated content, especially in areas requiring nuance and judgment. By reframing their role, teachers maintained a sense of pedagogical control while fostering student agency and critical literacy. Some of the direct quotations from the statements of participants were as follows:
“I tell them—think of the chatbot as a classmate, not a teacher. It gives ideas, but you have to decide if they’re good ones.”
(T1)
“I’m not here to compete with the chatbot. I help them go deeper, ask better questions.”
(T3)
Additionally, in order to strike a balance between AI support and human-led learning, teachers encouraged a strategic division of cognitive labor. Students were advised to use chatbots for more surface-level tasks (e.g., definitions, examples, explanations), while deeper reasoning, contextual interpretation, and communicative tasks were reserved for classroom interaction. This approach not only lightened the teacher’s load but also helped students become more intentional and critical users of AI, reinforcing the value of human interaction in collaborative and higher-order learning processes. Some of the direct quotations from the statements of participants were as follows:
“Use the bot to check meanings or get examples but save the why-questions for us. That’s where the real thinking happens.”
(T5)
“The chatbot is fine for the ‘what’, but the ‘how’ and ‘why’ are still our job as teachers.”
(T4)

5. Discussion

This study explores university English teachers’ experiences and perceptions of integrating AI chatbots within flipped English classrooms. Specifically, it investigates how teachers describe chatbot use, the effects they perceive on student learning, and the challenges encountered during implementation. The findings offer three key empirical insights, two theoretical contributions, and several practical recommendations to inform effective AI chatbot integration in flipped language teaching contexts.

5.1. Empirical Insights in Relation to Literature

Teachers in this study utilized AI chatbots across multiple pedagogical functions—including vocabulary support, grammar explanation, dialogue generation, and creative content production—integrated purposefully into both pre-class autonomous learning and in-class collaborative activities. This aligns with prior research documenting chatbot use in flipped classrooms for real-time feedback, answering questions, and supporting self-regulated learning (Gonda & Chu, 2019; Huang et al., 2019; Jeon & Lee, 2024). Recent studies further show that AI-enhanced flipped classrooms improve discussion, interaction, and overall student outcomes, though causal evidence remains limited (Yavuz et al., 2025; Rapti & Panagiotidis, 2024). This study extends these findings by highlighting teacher agency: instructors actively interpret and shape chatbot integration according to evolving pedagogical goals and role conceptualizations, positioning chatbots as scaffolds that amplify pre-class autonomy and enrich in-class interactions (Li et al., 2025; Tang et al., 2025). Teachers also demonstrated understanding of chatbot limitations and mediated their use through structured guidance and role reframing. This complements systematic reviews showing that chatbots can enhance speaking skills, confidence, and engagement (Du & Daniel, 2024), but that their benefits are mediated by sociocultural and pedagogical factors (Li et al., 2025). By emphasizing teacher facilitation, our findings provide a teacher-centered perspective absent from much of the prior research, which often focused primarily on student experiences (Huang et al., 2019; Lo & Hew, 2023).
Additionally, teachers in this study reported four key effects of AI chatbots on students in flipped English classrooms: enhanced autonomy in pre-class preparation, increased confidence in language use, greater interaction with learning content in class, and stronger motivation to experiment with language. The first three effects echo those reported in earlier studies on chatbot-assisted language learning. Prior research has shown that chatbots can promote student autonomy by supporting pre-class readiness (Gonda & Chu, 2019), boost learner confidence through low-stress interaction (Timpe-Laughlin et al., 2022; Lin & Mubarok, 2021), and increase engagement with course materials (Huang et al., 2019; Hew et al., 2021). However, the finding that students were motivated to experiment with language—modifying, challenging, or creatively extending chatbot responses—has been less emphasized in prior research. This form of exploratory engagement suggests that chatbots, when integrated meaningfully, may not only reinforce existing content but also spark new linguistic risk-taking and personalization. Besides positive effects, teachers also expressed concern about student over-reliance on chatbot-generated content and occasional confusion due to vague or incorrect responses, reinforcing the need for teacher oversight. These mixed effects suggest that while chatbots can enhance engagement and autonomy, they also introduce new layers of complexity into the flipped learning process.
Moreover, teachers in this study identified two primary challenges in integrating AI chatbots into flipped English classrooms: increased instructional complexity and growing ambiguity around their professional roles. These challenges align in part with previous research documenting technical limitations of chatbots, such as restricted response options and difficulty handling unstructured or conceptual queries (Hew et al., 2021; Huang et al., 2019; Timpe-Laughlin et al., 2022). Similarly, concerns about the authenticity of chatbot-generated tasks and responses have been raised, with students preferring human guidance for more complex reasoning (Huang et al., 2019; Timpe-Laughlin et al., 2022). However, while earlier studies often emphasized student-side issues, such as reduced motivation or inconsistent chatbot usage (Varnavsky, 2022; Ito et al., 2021), the current study shifts the focus to the teacher’s perspective. This study emphasizes how such limitations manifest as pedagogical burdens for teachers, particularly in a flipped context where chatbot outputs become shared classroom resources. More importantly, this study offers new insights by showing how teachers actively responded to the challenges of using chatbots. Their solutions included giving students structured support, such as example prompts and checklists to enhance chatbot literacy. Teachers also redefined their roles—instead of just delivering knowledge, they acted as guides who helped students evaluate and build on chatbot outputs. Finally, they used a strategic division of labor: chatbots were mainly used for what questions, while teachers took charge of how and why questions in class to review, explain, and extend what students had generated with AI.

5.2. Theoretical Contributions

This study contributes to the growing body of research at the intersection of AI integration and flipped language learning by offering a teacher-centered perspective on the use of AI chatbots. The findings enrich our understanding of how teachers interpret, adapt, and mediate AI tools—not as isolated technologies, but as pedagogical resources embedded within evolving instructional designs.
First, this study extends research on technology integration in language education by highlighting how chatbot use is deeply shaped by teachers’ evolving pedagogical goals, conceptualizations of AI tools, and alignment with flipped learning principles. While previous studies often focused on student experiences or technical affordances (e.g., Huang et al., 2019; Lin & Mubarok, 2021), this study shows that teachers play an active role in structuring the conditions under which chatbots influence learning—emphasizing the human layer of mediation in AI-supported classrooms.
Second, the findings offer meaningful additions to the Technological Pedagogical Content Knowledge (TPACK) framework (Mishra & Koehler, 2006). Teachers in this study demonstrated not only content and pedagogical knowledge, but also flexible and evolving technological knowledge in response to emerging AI challenges. Their ability to strategically assign chatbot functions, guide student use, and adjust their own roles reflects dynamic TPACK enactment in real-world flipped classroom contexts.
Finally, this study also speaks to emerging discussions around teacher agency in AI-mediated education. Rather than being displaced or diminished by AI, teachers here assumed more complex, adaptive roles—as designers of learning flows, curators of AI outputs, and facilitators of human–AI interaction. This shifts the narrative from technology replacing teachers to one in which teacher expertise becomes even more central in guiding meaningful learning in hybrid human–AI environments.

6. Limitations and Recommendations

This study has several limitations. First, it is based on a relatively small sample of university English teachers from a single institution, which may limit the generalizability of the findings. Future research should include larger and more diverse samples across different educational contexts and disciplines to validate and extend these insights.
Second, the study relies primarily on teacher self-reports, which may be subject to bias or incomplete perspectives. Incorporating student voices and classroom observations could provide a more comprehensive understanding of AI chatbot integration and its effects.
Third, as AI technologies rapidly evolve, the findings reflect experiences with specific chatbot tools at a particular time. Ongoing research is needed to track how changes in AI capabilities and classroom practices influence teacher roles and student outcomes.
Based on these limitations, it is recommended that future studies adopt mixed-method designs, include multiple stakeholder perspectives, and investigate longitudinal impacts of AI chatbot use. Additionally, professional development programs should be developed to support teachers in effectively integrating AI tools, addressing both technical skills and pedagogical strategies.

Author Contributions

Conceptualization, Y.L. and J.M.J.; methodology, Y.L.; software, Y.L.; validation, Y.L. and J.M.J.; formal analysis, Y.L.; investigation, Y.L.; resources, Y.L.; data curation, Y.L.; writing—original draft preparation, Y.L.; writing—review and editing, Y.L. and J.M.J.; visualization, Y.L.; supervision, J.M.J.; project administration, Y.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Ethics Committee of the Sichuan Film and Television University (protocol code SFTU20240622) on 22 June 2024.

Informed Consent Statement

Informed consent was obtained from all subjects involved in this study.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Asef-Vaziri, A. (2015). The flipped classroom of operations management: A not-for-cost-reduction platform. Decision Sciences Journal of Innovative Education, 13(1), 71–89. [Google Scholar] [CrossRef]
  2. Bašić, Ž., Banovac, A., Kružić, I., & Jerković, I. (2023). ChatGPT-3.5 as writing assistance in students’ essays. Humanities and Social Sciences Communications, 10(1), 750. [Google Scholar] [CrossRef]
  3. Bishop, J., & Verleger, M. A. (2013, June 23–26). The flipped classroom: A survey of the research. 2013 ASEE Annual Conference & Exposition (pp. 23–1200), Atlanta, GA, USA. Available online: https://peer.asee.org/the-flipped-classroom-a-survey-of-the-research (accessed on 11 September 2025).
  4. Boudouaia, A., Mouas, S., & Kouider, B. (2024). A study on ChatGPT-4 as an innovative approach to enhancing English as a foreign language writing learning. Journal of Educational Computing Research, 62(6), 1289–1317. [Google Scholar] [CrossRef]
  5. Braun, V., & Clarke, V. (2014). What can “thematic analysis” offer health and wellbeing researchers? International Journal of Qualitative Studies on Health and Well-being, 9(1), 26152. [Google Scholar] [CrossRef]
  6. Cai, Q., Lin, Y., & Yu, Z. (2023). Factors influencing learner attitudes towards ChatGPT-assisted language learning in higher education. International Journal of Human–Computer Interaction, 40(22), 7112–7126. [Google Scholar] [CrossRef]
  7. Chen, J. S., Le, T. T. Y., & Florence, D. (2021). Usability and responsiveness of artificial intelligence chatbot on online customer experience in e-retailing. International Journal of Retail & Distribution Management, 49(11), 1512–1531. [Google Scholar] [CrossRef]
  8. Chiu, T. K. F., Moorhouse, B. L., Chai, C. S., & Ismailov, M. (2024). Teacher support and student motivation to learn with artificial intelligence (AI) chatbot. Interactive Learning Environments, 32(7), 3240–3256. [Google Scholar] [CrossRef]
  9. Creswell, J. W. (2013). Qualitative inquiry & research design: Choosing among five approaches (3rd ed.). SAGE Publications. [Google Scholar]
  10. Davies, R. S., Dean, D. L., & Ball, N. (2013). Flipping the classroom and instructional technology integration in a college-level information systems spreadsheet course. Educational Technology Research and Development, 61(4), 563–580. [Google Scholar] [CrossRef]
  11. Diwanji, P., Hinkelmann, K., & Witschel, H. F. (2018, March 21–24). Enhance classroom preparation for flipped classroom using AI and analytics. 20th International Conference on Enterprise Information Systems (ICEIS) (Vol. 1, pp. 477–483), Funchal, Portugal. [Google Scholar] [CrossRef]
  12. Du, J., & Daniel, B. K. (2024). Transforming language education: A systematic review of AI-powered chatbots for English as a foreign language speaking practice. Computers and Education: Artificial Intelligence, 6, 100230. [Google Scholar] [CrossRef]
  13. Ghafouri, M. (2023). ChatGPT: The catalyst for teacher-student rapport and grit development in L2 class. System, 120, 103209. [Google Scholar] [CrossRef]
  14. Goedhart, N. S., Blignaut-van Westrhenen, N., Moser, C., & Zweekhorst, M. B. M. (2019). The flipped classroom: Supporting a diverse group of students in their learning. Learning Environments Research, 22(2), 297–310. [Google Scholar] [CrossRef]
  15. Gonda, D. E., & Chu, B. (2019, December 10–13). Chatbot as a learning resource? Creating conversational bots as a supplement for teaching assistant training course. 2019 IEEE International Conference on Engineering, Technology and Education (TALE) (pp. 1–5), Yogyakarta, Indonesia. [Google Scholar] [CrossRef]
  16. Guo, K., Li, Y., Li, Y., & Chu, S. K. W. (2023). Understanding EFL students’ chatbot–assisted argumentative writing: An activity theory perspective. Education and Information Technologies, 29, 1–20. [Google Scholar] [CrossRef]
  17. Han, I., Ji, H., Jin, S., & Choi, K. (2025). Mobile-based artificial intelligence chatbot for self-regulated learning in a hybrid flipped classroom. Journal of Computing in Higher Education, 1–25. [Google Scholar] [CrossRef]
  18. Hew, K. F., Bai, S., Huang, W., Dawson, P., Du, J., Huang, G., Jia, C., & Thankrit, K. (2021). On the use of flipped classroom across various disciplines: Insights from a second-order meta-analysis. Australasian Journal of Educational Technology, 37(2), 132–151. [Google Scholar] [CrossRef]
  19. Ho, P. X. P. (2024). Using ChatGPT in English language learning: A study on I.T. students’ attitudes, habits, and perceptions. International Journal of TESOL & Education, 4(1), 55–68. [Google Scholar] [CrossRef]
  20. Huang, W. J., Hew, K. F., & Fryer, L. K. (2022). Chatbots for language learning—Are they really useful? A systematic review of chatbot-supported language learning. Journal of Computer Assisted Learning, 38(1), 237–257. [Google Scholar] [CrossRef]
  21. Huang, W. J., Hew, K. F., & Gonda, D. E. (2019). Designing and evaluating three chatbot-enhanced activities for a flipped graduate course. International Journal of Mechanical Engineering and Robotics Research, 8(5), 813–818. [Google Scholar] [CrossRef]
  22. Ito, T., Tanaka, M. S., Shin, M., & Miyazaki, K. (2021). The online PBL (project-based learning) education system using AI (artificial intelligence). In DS 110: Proceedings of the 23rd international conference on engineering and product design education. Design Society. [Google Scholar] [CrossRef]
  23. Jakobsen, K. V., & Knetemann, M. (2017). Putting structure to flipped classrooms using team-based learning. International Journal of Teaching and Learning in Higher Education, 29(1), 177–185. [Google Scholar]
  24. Jeon, J., & Lee, S. (2024). The impact of a chatbot-assisted flipped approach on EFL learner interaction. Educational Technology & Society, 27(4), 218–234. [Google Scholar] [CrossRef]
  25. Kucuk, T. (2024). ChatGPT integrated grammar teaching and learning in EFL classes: A study on Tishk International University students in Erbil, Iraq. Arab World English Journal, 1(1), 100–111. [Google Scholar] [CrossRef]
  26. Kurt, G. (2017). Implementing the flipped classroom in teacher education: Evidence from Turkey. Educational Technology and Society, 20(1), 211–221. [Google Scholar]
  27. Li, Y., Zhang, X., & Wang, L. (2025). Designing language learning with artificial intelligence (AI) chatbots based on activity theory: A systematic review. Smart Learning Environments, 12(1), 1–15. [Google Scholar] [CrossRef]
  28. Lin, C. J., & Mubarok, H. (2021). Learning analytics for investigating the mind map-guided AI chatbot approach in an EFL flipped speaking classroom. Educational Technology & Society, 24(4), 16–35. [Google Scholar] [CrossRef]
  29. Lincoln, Y., & Guba, E. G. (1985). Naturalistic inquiry. Sage. [Google Scholar]
  30. Lo, C. K., & Hew, K. F. (2023). A review of integrating AI-based chatbots into flipped learning: New possibilities and challenges. Frontiers in Education, 8, 1175715. [Google Scholar] [CrossRef]
  31. Majumder, S., & Mondal, A. (2021). Are chatbots really useful for human resource management? International Journal of Speech Technology, 24(4), 969–977. [Google Scholar] [CrossRef]
  32. Manning, J. (2017). In vivo coding. In J. Matthes (Ed.), The international encyclopedia of communication research methods (pp. 1–2). Wiley. [Google Scholar] [CrossRef]
  33. Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108(6), 1017–1054. [Google Scholar] [CrossRef]
  34. Patton, M. Q. (2015). Qualitative research & evaluation methods: Integrating theory and practice (4th ed.). Sage publications. [Google Scholar]
  35. Paul, S., Priyadarshini, H. R., Fernandes, B., Muttalib, K., Western, J., & Dicksit, D. (2019). Blended classroom versus traditional didactic lecture in teaching oral surgery to undergraduate students of dentistry program: A comparative study. Journal of International Oral Health, 11(1), 36–39. [Google Scholar] [CrossRef]
  36. Pillai, R., & Sivathanu, B. (2020). Adoption of AI-based chatbots for hospitality and tourism. International Journal of Contemporary Hospitality Management, 32(10), 3199–3226. [Google Scholar] [CrossRef]
  37. Rapti, C., & Panagiotidis, P. (2024). Teachers’ attitudes towards AI integration in foreign language learning: Supporting differentiated instruction and flipped classroom. European Journal of Education (EJED), 7(2), 88–104. [Google Scholar]
  38. Silitonga, L. M., Wiyaka, W., Suciati, S., & Prastikawati, E. F. (2024, July). The impact of integrating AI chatbots and microlearning into flipped classrooms: Enhancing students’ motivation and higher-order thinking skills. In International Conference on Innovative Technologies and Learning (pp. 184–193). Springer Nature. [Google Scholar] [CrossRef]
  39. Tang, Y., Lee, J., & Zhang, H. (2025). Integrating tailored generative AI into the flipped classroom: A pilot implementation in higher education. Journal of Educational Technology, 32(4), 123–145. [Google Scholar] [CrossRef]
  40. Timpe-Laughlin, V., Sydorenko, T., & Daurio, P. (2022). Using spoken dialogue technology for L2 speaking practice: What do teachers think? Computer Assisted Language Learning, 35(7), 1194–1217. [Google Scholar] [CrossRef]
  41. Tsai, Y. R. (2021). Promotion of learner autonomy within the framework of a flipped EFL instructional model: Perception and perspectives. Computer Assisted Language Learning, 34(7), 979–1011. [Google Scholar] [CrossRef]
  42. Unal, Z., & Unal, A. (2017). Comparison of student performance, student perception, and teacher satisfaction with traditional versus flipped classroom models. International Journal of Instruction, 10(4), 145–164. [Google Scholar] [CrossRef]
  43. Varnavsky, A. N. (2022, May 26–27). Chatbot to increase the effectiveness of the «flipped classrooms» technology. 2022 2nd International Conference on Technology Enhanced Learning in Higher Education (TELE) (pp. 289–293), Lipetsk, Russia. [Google Scholar] [CrossRef]
  44. VERBI Software. (2024). MAXQDA 2024 [Computer software]. VERBI Software. Available online: https://www.maxqda.com (accessed on 11 September 2025).
  45. Wang, Y., & Xue, L. (2024). Using AI-driven chatbots to foster Chinese EFL students’ academic engagement: An intervention study. Computers in Human Behavior, 159, 108353. [Google Scholar] [CrossRef]
  46. Wollny, S., Schneider, J., Di Mitri, D., Weidlich, J., Rittberger, M., & Drachsler, H. (2021). Are we there yet? A systematic literature review on chatbots in education. Frontiers in Artificial Intelligence, 4, 654924. [Google Scholar] [CrossRef]
  47. Yan, D. (2023). Impact of ChatGPT on learners in a L2 writing practicum: An exploratory investigation. Education and Information Technologies, 28, 3943–3967. [Google Scholar] [CrossRef]
  48. Yavuz, M., Balat, Ş., & Kayalı, B. (2025). The effects of artificial intelligence supported flipped classroom applications on learning experience, perception, and artificial intelligence literacy in higher education. Open Praxis, 17(2), 286–304. [Google Scholar] [CrossRef]
  49. Yin, R. K. (2014). Case study research: Design and methods (5th ed.). Sage. [Google Scholar]
  50. Yıldız, E., Doğan, U., Özbay, Ö., & Seferoğlu, S. S. (2022). Flipped classroom in higher education: An investigation of instructor perceptions through the lens of TPACK. Education and Information Technologies, 27(8), 10757–10783. [Google Scholar] [CrossRef]
  51. Zawacki-Richter, O., Marín, V. I., Bond, M., & Gouverneur, F. (2019). Systematic review of research on artificial intelligence applications in higher education—Where are the educators? International Journal of Educational Technology in Higher Education, 16(1), 1–27. [Google Scholar] [CrossRef]
Figure 1. Data analysis process.
Figure 2. Overview of chatbot integration in flipped classroom.
Figure 3. Effects of chatbot integration in flipped classroom.
Figure 4. Challenges and solutions of chatbot use in flipped classroom.
Table 1. Demographic information of participants.
Teachers | Gender | Age | Years of Teaching Experience | Academic Title
T1 | Female | 40 | 16 | Asst. Prof.
T2 | Female | 33 | 7 | Lecturer
T3 | Female | 37 | 12 | Asst. Prof.
T4 | Female | 31 | 6 | Lecturer
T5 | Female | 32 | 8 | Lecturer
T6 | Female | 31 | 9 | Lecturer
Table 2. Interview questions.
Questions:
1. Can you describe how you use AI chatbots (e.g., ERNIE Bot) in your flipped English classroom?
2. How would you describe your overall experience of integrating AI chatbots into your flipped teaching approach?
3. What changes have you noticed in student learning when AI chatbots are used in the flipped classroom model?
4. What are the main challenges you’ve faced in using AI chatbots in your flipped English teaching?
5. How do you respond to these challenges? (strategies, training…anything you find effective)
