Article

Prompting Theory into Practice: Utilizing ChatGPT-4 in a Curriculum Planning Course

by
Liat Biberman-Shalev
Faculty of Education, Levinsky-Wingate Academic College, Tel-Aviv 48130, Israel
Educ. Sci. 2025, 15(2), 196; https://doi.org/10.3390/educsci15020196
Submission received: 24 December 2024 / Revised: 1 February 2025 / Accepted: 5 February 2025 / Published: 6 February 2025

Abstract

Generative AI tools have rapidly become an important part of everyday life, and their growing integration into educational landscapes, including teacher education, cannot be ignored. This qualitative study examines how pre-service teachers (PSTs) leverage ChatGPT-4 to apply constructivist theory in curriculum planning (CP). The findings revealed three approaches through which PSTs used the chatbot to apply theory to practice: (a) simplifying theory, (b) applying theory, and (c) visualizing theory. The findings suggest that the need to refine prompts using curricular language and to engage in creative and critical thinking supported the translation process. In incorporating ChatGPT-4 into their CP, PSTs considered multiple factors, including ideation, inspiration, creativity, reliability, and insufficient personalization—attesting to a balanced perspective on the use of this tool, i.e., recognizing the potential and benefits of utilizing the chatbot while remaining cognizant of its associated risks and limitations. This study points to aspects of CP using generative AI that teacher educators should discuss with PSTs.

1. Introduction

ChatGPT-4, a generative AI (GAI) tool, is equipped with extensive text-to-text and text-to-image models that enable it to generate original material in response to users’ prompts (OpenAI, 2023). As of late, GAI has been rapidly becoming an integral part of everyday life. Thus, although scholars have pointed out challenges in integrating GAI tools into educational institutions—which comprise ethical concerns, such as maintaining students’ privacy and copyright (Crawford et al., 2023), as well as technological concerns, such as computer access and training for proficient use and issues related to reinforcing misinformation (Cooper, 2023)—one cannot ignore its imminent infiltration into educational landscapes, including teacher education (Wang et al., 2023).
Emergent research evidence attests to teachers’ overall positive attitude toward implementing AI in education, but also to their lack of skills in using GAI tools (Dolezal et al., 2025; Polak et al., 2022), compounded by a general skepticism about its potential in this area (Vazhayil et al., 2019). These findings call for exploring ways in which in- and pre-service teachers (PSTs) can be guided to acquire the knowledge and competencies to effectively integrate GAI into their practices.
In line with the prevailing positive orientation toward integrating GAI in teacher education (Trust et al., 2023), the current study focuses on a core academic course of such programs (Lim et al., 2018), curriculum planning (CP). Scholars have given considerable thought to the educational value of GAI. For example, Strzelecki (2023) examined the potential of integrating ChatGPT into higher education to enhance critical thinking and improve writing and other skills, and concluded by calling for further examination of how this chatbot can be adopted in learning and teaching. Noting that only a few studies have actually charted out the integration of GAI tools in teacher preparation, Lee and Zhai (2024) appeal to teacher educators (TEs) to provide student teachers with opportunities to practice the use of GAI tools.
Following these calls, the current study explores ways to incorporate ChatGPT-4 into a CP course, and simultaneously addresses two challenges: the first is the well-documented “Achilles’ heel” of linking theory and practice (Juarez, 2019; Korthagen & Kessels, 1999), and the second is the complexity of understanding constructivism (Cook et al., 2002; Krahenbuhl, 2016). In particular, the PSTs who participated in a college-based course, titled Curriculum Planning, which is typically a mandatory course in the context of teacher education, were instructed to work on a group project of designing a constructivist learning environment by utilizing ChatGPT-4 throughout the planning process. By analyzing PSTs’ prompts, reflections, and interview protocols, this study aims to reveal how they leverage ChatGPT-4 to apply constructivism in their CP. From a wider perspective, this study aims to contribute to the field by exploring PSTs’ perceptions as regards using ChatGPT-4. Accordingly, the following two research questions were addressed:
(1)
How do PSTs leverage ChatGPT-4 to link constructivist theory to practice in CP?
(2)
What are PSTs’ perceptions regarding the incorporation of ChatGPT-4 into CP?

2. Literature Review

2.1. Generative AI and Curriculum Planning

Insofar as CP is a core element of teaching, a CP course is included in teacher education programs worldwide (Flores, 2016; van den Berg & du Plessis, 2023). A fundamental principle of curriculum theory, development, and implementation is consideration of contextual factors (Luke, 2008). Accordingly, a reality in which GAI tools are used by more than 1 billion people per month, including in the educational sphere (Baytak, 2024), necessitates discussions about curriculum.
A curriculum, in Cahapay’s (2020) minimalistic definition, is “a plan that has elements” (p. 1). As generic elements of curriculum planning, Tyler’s classic model (1949) posits (1) goals, (2) content, (3) pedagogical approach, and (4) evaluation. Zhao and Watterston (2021) argue that, in the uncertain and rapidly changing reality of today, a curriculum should be dynamic and adaptable to evolving technologies and contexts. Aktan (2021) demonstrates, in this connection, that curricular frameworks have been consistently influenced by technological advancements, and calls for careful consideration of the turning points in human history that instigated reconceptualization of education. To address the questions of Why and How curriculums should evolve, Aktan examines Bobbitt’s (1918) and Tyler’s (1949) systematic approaches to curriculum reform. Upholding Dewey’s advocacy for the relevance of the curriculum to daily life in the face of exacerbated economic and social disparities today, Aktan underscores the need to revise curriculums, in an endeavor to mitigate educational inequalities and advance social and environmental justice. Aktan’s arguments, combined with Pinar’s (2004) vision of curriculum as a “provocation for students to reflect on and to think critically about themselves and the world they will inherit” (pp. 186–187), substantiate the need to rethink CP, aligning it with a reality in which GAI is an integral part.
The current study follows Lim et al. (2018) in conceiving of CP as an ongoing learning process throughout one’s teaching career, rather than a skill that must be fully mastered before starting to teach. These scholars describe three curriculum models that are usually presented to PSTs in teacher education: (1) offloading (adhering to materials included in a curriculum without change), (2) adapting (using original curricular content and materials and modifying them as necessary), and (3) improvising (creating one’s own curriculum). With reference to GAI tools, offloading would mean that PSTs adopt a chatbot’s responses without any modifications; adapting that PSTs adjust such responses and incorporate them into their CP to better align them with their own teaching style or student needs; while improvising that PSTs create original curriculum contents using a chatbot’s responses as inspiration for developing new instructional materials.
In all the above CP models, GAI tools contribute to a PST’s CP by automatically creating educational materials and contents (Gurl et al., 2024). This input is significant, especially in light of evidence that CP increases workload, which in turn may lead to teachers experiencing burnout and dissatisfaction and leaving the profession (Agyapong et al., 2022). Further, Cevikbas et al. (2023) showed that PSTs tend to find CP complicated. A growing body of evidence demonstrates that GAI tools may assist a teacher in this task by generating activities faster and more efficiently, thus allowing the teacher to focus more on student interactions and less on preparing content and class activities (Jauhiainen & Guerra, 2023).
Hitherto, empirical evidence on the use of ChatGPT in CP has been accrued mainly at the micro-level of lesson planning. The chatbot was found to assist teachers in quickly producing high-quality lesson plans (Gupta et al., 2023; Zhai, 2023). According to Karaman (2024), who investigated primary school mathematics teaching in Turkey, students’ academic achievements attest to the effectiveness of lesson plans prepared using ChatGPT. In South Korea, Lee and Zhai (2024) found that PSTs in training to become science teachers demonstrated a reasonable level of skill in incorporating ChatGPT into their lesson planning.
Notwithstanding the potential significant benefits of GAI tools in CP, critics worry that the growing enthusiasm for implementing them in education may have unintended consequences. One concern is that students might become less mentally engaged (Neumann et al., 2023). Another is related to the accuracy of the output, which could be incorrect and thus entrench misconceptions (Trust et al., 2023). Ethical dilemmas have also been discussed, mainly revolving around plagiarism (Chan, 2023), affirming the imperative for PSTs to adopt a reflective approach when utilizing GAI in CP (Gurl et al., 2024).
Further, van den Berg and du Plessis (2023) concluded that, while ChatGPT can generate materials that assist the teacher, it cannot provide emotional support, human interaction, or the personal touch necessary for effective teaching. In connection with this, Gupta et al. (2023) emphasize that human oversight and intervention are necessary to ensure that the outputs are suitable for the intended audience and align with the learning objectives—an aspect that relates to a teacher’s skill set to link theory and practice in a specific context (Biberman-Shalev et al., 2024). PSTs are often mistakenly assumed to be able to translate theoretical content to practice in CP (Lim et al., 2018). A pertinent question in this regard is whether PSTs’ interactions with a GAI tool such as ChatGPT may help them to address this challenge.

2.2. Research Context—Translating Constructivist Theory to Practice in a Curriculum Planning Course

The current study examines a college-based course titled Issues in Curriculum Planning administered in the academic year of 2023–2024. This two-semester course is mandatory for all PSTs pursuing a B.Ed. degree. During the first semester, PSTs are familiarized with a variety of theories and pedagogical approaches (e.g., behaviorism, cognitivism, and constructivism), and theoretical models (e.g., Tyler and Doll)—all of which are learned in the traditional frontal lecture format. During the second semester, PSTs learn to implement these contents through a group-based CP project. The rationale follows Deng’s (2004, p. 153) principle of “technological application”, i.e., providing PSTs with insights and skills for marshaling theory to come up with practical solutions.
However, translating theory to practice is not reducible to technological application alone; it justifiably constitutes one of the course project’s aims. That said, the macro-context of the entire course aligns with Pinar’s (2004, p. 165) conceptualization of curriculum theory as “an interdisciplinary field in which teacher education is conceived as the professionalization of academic freedom, including intellectual dissent, creativity, and self-reflexive, interdisciplinary education.” Accordingly, the course project is designed to promote PSTs’ intellectual, creative, and moral endeavor in CP.
Shulman (1998, p. 115) conceives of teachers’ role as transformers who “adapt, merge, and synthesize, criticize, and invent in order to move from theoretical understanding and research-based knowledge of the academy to the kind of practical knowledge needed to engage in professional work”. In this approach, CP may be seen as largely a link between theory and practice, wherein the transfer remains a central challenge in teacher preparation (Biberman-Shalev et al., 2024; Korthagen & Kessels, 1999).
Another challenge in linking theory and practice in CP is PSTs’ understanding of constructivism (Cook et al., 2002; Krahenbuhl, 2016), insofar as “constructivism as a theory of learning is different enough from commonsense ideas about how people learn things to be confusing and, thus, ignored even when it is presented in an engaging and substantive way to pre-service teachers” (Kroll, 2004, p. 217). Moreover, as school students, PSTs are usually socialized to adopt the more traditional approaches to teaching and learning.
In an endeavor to address these challenges by utilizing ChatGPT-4, at the beginning of the second semester, a techno-pedagogue conducted an introductory session to acquaint PSTs with the nature, capabilities, and limitations of AI, focusing specifically on ChatGPT-4. In this session, the techno-pedagogue, an expert in GAI tools, explained the concept of GAI, its mechanisms, and its potential applications in educational settings. The session included demonstrations of interactions with ChatGPT-4, showcasing its text generation and text-to-image functionalities. Particular emphasis was placed on the importance of crafting effective prompts, with demonstrations of how refining prompts could elicit more accurate and contextually appropriate responses from the AI. Following this, the PSTs participated in a hands-on activity where they were given 30 min to freely interact with the chatbot. This period allowed them to experiment with various prompts, observe the chatbot’s responses, and familiarize themselves with its capabilities and limitations in a practical setting. After this introductory phase, the PSTs commenced their course project, where they were guided to plan (text-to-text) and design (text-to-image) a constructivist learning environment. This environment was structured around five generic components of CP: rationale, goals, activities, evaluation, and reflection. The PSTs were required to document their interactions with ChatGPT-4 by including prompt-sharing links in their project’s appendix.
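The PSTs interacted with ChatGPT-4 through its chat interface rather than through code. Purely as an illustration of the iterative, conversational prompting emphasized in the introductory session, the following minimal Python sketch shows how a broad prompt might be followed by a refined one that adds context and curricular language; the OpenAI Python client, the model name, and the classroom topic are assumptions made for the example, not part of the course setup.

```python
# Illustrative sketch only: the study's PSTs used the ChatGPT-4 chat interface, not code.
# Assumes the OpenAI Python client (openai>=1.0) and an API key configured in the environment.
from openai import OpenAI

client = OpenAI()

history = [
    {"role": "system",
     "content": "You are assisting pre-service teachers with curriculum planning."}
]

def ask(prompt: str) -> str:
    """Send one prompt in an ongoing conversation and return the chatbot's reply."""
    history.append({"role": "user", "content": prompt})
    response = client.chat.completions.create(model="gpt-4", messages=history)
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

# A broad opening prompt, followed by a refinement that adds context and
# curricular language, mirroring the refinement cycle reported in the findings.
print(ask("Explain constructivism."))
print(ask("We are student-teachers planning a constructivist learning environment "
          "on a hypothetical topic for elementary students. Suggest three activities "
          "in which students are active and interact with each other, and explain "
          "how each activity reflects constructivist principles."))
```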

3. Methods

To tap participating PSTs’ perceptions regarding their experiences in integrating ChatGPT-4 into CP and leveraging it to link theory to practice, a qualitative phenomenological framework was adopted (Flick, 2004). Phenomenological education research focuses on the meanings that learners attribute to the learning process within a specific context, in an effort to understand how this process influences their behavior and beliefs (Prosser, 2000).

3.1. Participants

Participants were 45 sophomore PSTs (43 females and 2 males; mean age: 23.8) taking the mandatory course Issues in Curriculum Planning and training to become elementary school teachers in a variety of school subjects (e.g., English, Mathematics, and Science). At the initial stage of the course project, approximately half of the participants reported that they were unfamiliar with GAI tools. It is worth noting that 95% of the student teachers in the college sampled were female; according to UNESCO (2023), women comprise the majority of primary school teachers worldwide. The study was approved by the Ethics Committee of the college sampled (Approval number 2024022501).

3.2. Data Collection

As part of triangulation to enhance the study’s trustworthiness criterion (Carcary, 2009), data were collected using three data sources: (a) PSTs’ prompts documented by ChatGPT-4 (a total of 665 prompts, with an average of 23 words per prompt); (b) participants’ reflections at course completion, which were submitted by each of the 45 PSTs as part of course requirements (the reflections were written on a single page of a Word file, with an average of 453 words per reflection); and (c) semi-structured interviews conducted with 10 PSTs who had agreed to be interviewed after receiving final course grades. The interview questions were designed to shed more light on PSTs’ experiences in using ChatGPT-4, and included the following: Please describe your experience with using ChatGPT-4 in the course project; In which part of the project did you use ChatGPT-4 the most, why, and how? In your opinion, what are the pros and cons of using the chatbot in CP? How can you envision using the chatbot in your practicum and, in the future, as a teacher in your classes?
Taken together, the above three data sources were deemed appropriate, as they enabled taking account of PSTs’ “voice” (Mitra, 2004) and understanding their perceptions regarding the potential of integrating GAI tools in CP, with a focus on linking theory to practice.
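The corpus descriptives reported above (the number of prompts and the mean prompt length) are straightforward to compute once the prompts have been exported from the shared links. The following minimal sketch, in which the example prompts are invented placeholders rather than study data, illustrates the calculation.

```python
# Minimal sketch: descriptive statistics for a collected prompt corpus.
# The prompts below are invented placeholders, not the study's actual data.
prompts = [
    "Can you explain what constructivism is, exactly",
    "Please rephrase the explanation in simpler terms",
    "Suggest constructivist activities for our learning environment on our topic",
]

total_prompts = len(prompts)
mean_words = sum(len(p.split()) for p in prompts) / total_prompts

print(f"Total prompts: {total_prompts}")
print(f"Average words per prompt: {mean_words:.1f}")
```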

3.3. Data Analysis

Data were analyzed based on a thematic analysis of the PSTs’ prompts, interview protocols, and reflections. Data were continuously triangulated across these three data sources to enrich information obtained. A theme was identified as a pattern of shared meaning across these three data sources (Braun & Clarke, 2019). To ensure trustworthiness, Nowell et al.’s (2017) six phases of inductive thematic analysis were implemented. In the first phase, the researcher and the research assistant read all the PSTs’ prompts, reflections, and interview transcripts. In the second phase, the researcher and the research assistant independently read through all the data and assigned initial emic codes (i.e., categories that reflect the understanding and perspective of the participants, representing their local meanings and interpretations), creating a codebook detailing each code and exemplifying it with an excerpt from each of the three data sources. The third phase involved identifying themes, as well as conceptual connections and hierarchies. The two coders held multiple meetings to discuss findings and to resolve discrepancies in categorizing themes. In the fourth phase, the two coders went over the list of the themes derived in the third phase, achieving full agreement on the themes themselves. However, for approximately 10% of the citations within the themes, there was no consensus. These citations were subsequently excluded from the data set. Therefore, the intercoder agreement was quantified at 90%, as measured by Cohen’s (1960) kappa coefficient, indicating a high level of consistency between the coders. In the fifth phase, the titles for all the themes were finalized, and the sixth phase was devoted to the writing up of the rationale and decisions as part of this study’s results section.
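To make the agreement statistic mentioned above concrete, the following minimal sketch computes Cohen’s (1960) kappa for two coders’ theme assignments using scikit-learn; the coder labels shown are invented placeholders, not the study’s codebook.

```python
# Minimal sketch: Cohen's (1960) kappa for two coders' theme assignments.
# The labels below are invented placeholders, not the study's actual codes.
from sklearn.metrics import cohen_kappa_score

coder_a = ["simplifying", "applying", "visualizing", "applying", "simplifying"]
coder_b = ["simplifying", "applying", "visualizing", "simplifying", "simplifying"]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa: {kappa:.2f}")
```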

4. Findings

4.1. Part 1: Prompting Constructivist Theory into Practice

Overall, PSTs felt that integrating ChatGPT-4 into their CP project had helped them to grasp constructivist theory. Approximately half of the respondents saw the importance of integrating constructivism into their CP in the future. Data analysis revealed that PSTs marshaled three main approaches in leveraging ChatGPT-4 to translate constructivism into practice, which are elaborated in what follows.

4.1.1. Simplifying Theory

Simplification may be viewed as the process of making something less complex for the PSTs and easier to understand or manage. In this regard, the first approach was to ask the chatbot to explain constructivism in simpler and more accessible language (see Figure 1). All nine groups of PSTs began by prompting the chatbot to explain what is meant by “constructivist theory”, for example, “Can you explain what constructivism is, exactly”. After receiving a reply, all nine groups prompted the chatbot to rephrase the explanation in simpler terms. Only one group contextualized their prompt: “We are student-teachers studying a curriculum planning course. Please explain to us constructivism.”
In the interviews and in their post-course reflections, 35 PSTs underscored the necessity for simplifying explanations, for example,
I think that the power of ChatGPT is to explain complicated ideas simply and very clearly, so that everyone can understand (Interview 3).
I found it very difficult to understand constructivism. The articles we had to read were also complex. I felt that ChatGPT explained the theory in clearer words for me (Reflection 12).
One PST personified the chatbot:
I felt like I was talking to a real person who made things clear and used understandable language so I could understand it [constructivism] in the best way possible (Interview 7).
In other words, this approach to using the chatbot may imply that the academic course presents challenges for students due to a lack of prerequisite knowledge or insufficient contextual framing. Additionally, limited variety in presentation formats or linguistic barriers could further hinder PSTs’ understanding of constructivism.
It is worth noting that four PSTs felt that, as far as theoretical understanding was concerned, the chatbot was not helpful, for example,
To be honest, asking the ChatGPT [about constructivism] didn’t really help me; it actually confused me. So I relied less on this tool and more on the course materials (Interview 5).
This may suggest that, in contrast to the more general responses from the chatbot, the course materials can provide the PSTs with a structured and familiar framework tailored to their learning needs, making them more beneficial for understanding constructivism. Another interpretation could be that some PSTs may prioritize human interaction for a more personalized and contextually sensitive understanding of theoretical concepts. Simplification was also reflected in the PSTs’ prompts that asked the chatbot to summarize constructivism as a list of core principles. Six of the nine groups asked the chatbot to outline the main principles of constructivism, for example, see Figure 2.
Eleven PSTs reported that this approach was helpful, as it mitigated information overflow and answered their need for feedback:
I find constructivism too abstract. Having a framework or checklist could serve as a guide to see if my curriculum planning reflects constructivist ideas (Interview 6).

4.1.2. Applying Theory

PSTs frequently resorted to ChatGPT-4 for help with the activities component of CP. Thus, all groups requested examples of activities in the constructivist spirit, and the chatbot responded with a list of pedagogies (see Figure 3).
One PST commented the following in the interview:
We asked the chat for examples of activities that fit the constructivist theory as we wanted to understand how the theory could be applied in our learning environment (Interview 10).
All groups’ prompts included requests for activities that specifically related to their respective learning environment topics, for example, as shown in Figure 4.
Twenty-two PSTs felt that the examples of constructivism-based activities that the chatbot suggested helped them to better understand the theory, for example,
Through the examples [the chatbot provided], I felt that I better understood constructivism and how to design teaching and learning according to constructivist ideas (Interview 1).
Examples of constructivist curriculum planning that is essentially and directly related to my needs and goals clarified for me how to be a constructivist teacher (Reflection 38).
One PST said that, although she was familiar with the pedagogies the chatbot listed, their relation to constructivism that the tool underscored was new to her:
In my practicum I observed my teacher mentor encouraging her students to work together or to research a subject, but I did not know that this was related to constructivism until I read the chatbot’s examples of activities (Reflection 24).
This reflection may imply that PSTs may not be adequately prepared to recognize constructivist methods during their practicum experiences. Additionally, this may highlight the need to provide opportunities to reflect on observed practices.
Seven PSTs described how they had used ChatGPT-4 to obtain feedback on their own ideas for constructivist activities:
Through ChatGPT, we could check if our activities reflected constructivism. It was like having a personal tutor who ensured my teaching methods are aligned with this theory (Reflection 8).

4.1.3. Visualizing Theory

All the PSTs reported that ChatGPT-4 had helped them to understand constructivism by tangibly demonstrating how its concepts can be applied in practical settings (see Figure 5).
PSTs viewed the ChatGPT-4 text-to-image tool as having the ability to design and organize theory-based learning spaces, thus rendering abstract and complicated constructivist ideas more concrete and achievable:
The image helped me to better understand constructivist concepts by demonstrating how to create a setting that fosters active learning and collaboration (Reflection 41).
For most PSTs, chatbot-generated images were a “wow factor” that gave them insights as to how to actually apply constructivism in practice. However, several PSTs remonstrated, “…as a teacher, only in my dreams can I hope to get a budget to create a learning space like the one the chatbot produced” (Reflection 9). Additionally, almost half of the PSTs reported that they had found this kind of interaction with the chatbot very frustrating, as sometimes parts of the image contained inaccuracies (see Figure 5), and sometimes the chatbot failed to graphically realize their ideas. Interestingly, four PSTs felt that visualizing a constructivist learning environment spurred them to critically assess the ones created in their practicum.

4.1.4. Translation Mechanisms: Iteration, Language Precision, and Critical Thinking

Within each of the three approaches elaborated above, the main mechanism for translating theory to practice was iterative interactions between PSTs and ChatGPT-4, for example,
Ultimately, ChatGPT helped me understand the material more clearly because I could keep asking the chatbot to explain it over and over again, until I felt that I understood the theory (Reflection 26).
In order to interact effectively with the chatbot to obtain the responses they needed each time, PSTs had to formulate their prompts with a fair degree of precision. Although some PSTs described this process as challenging and even frustrating, it helped them to understand the theory:
It took a lot of time to refine prompts, but when we came up with a fairly precise one, we got amazing results (Interview 2).
When we were not precise in conveying what we wanted, the responses were not always what we were looking for. It was frustrating to get irrelevant answers over and over again (Reflection 19).
The possibility of an iterative dialog led one group to deal with the challenge of precise formulations by asking the chatbot to rephrase prompts:
As we did not receive the responses we wanted, we asked the chatbot to help us refine our prompts (Reflection 25).
Data suggest that the iterative nature of the interaction and the imperative for precise formulations enhanced PSTs’ critical thinking and activated professional phraseology, specifically curricular language (e.g., outdoor learning, students interacting with each other, and students are active). These cognitive improvements may have helped to increase PSTs’ curricular knowledge, for example,
We asked the chatbot to create a constructivist learning environment for our topic. However, the results weren’t good enough, so we started using more constructivist terms in our prompts and gave the chatbot negative feedback when the image didn’t correspond (Reflection 40).

4.2. Part 2: Incorporating ChatGPT-4 into Curriculum Planning

According to the data, PSTs were generally in favor of incorporating ChatGPT-4 into CP, and all described the chatbot as a kind of “teacher assistant.” While almost all PSTs felt that they could not unquestioningly rely on the chatbot, they said they would continue using it in CP, for example,
Planning lessons take a long time and can be frustrating sometimes. I feel like ChatGPT is like having a teaching assistant. It doesn’t always give me the perfect answer, but it helps me brainstorm ideas and organize my thoughts. (Reflection 3)
Discussed below are the themes and subthemes pertaining to PSTs’ optimistic perspectives on and concerns about GAI-based CP.

4.2.1. Objectives: Enhancing Personalization and Saving Time

As regards GAI-based CP, 21 PSTs were concerned about the “personalization gap”, i.e., the chatbot’s limitation, compared to a human teacher, in nuancing responses according to individual students’ learning styles, emotional dynamics, and needs:
I felt that the [ChatGPT] responses were technical and didn’t capture important emotional and social aspects, as would a real teacher (Interview 9).
The chat cannot create an optimal curriculum because it doesn’t know the students as I do (Reflection 5).
PSTs felt that the chatbot’s tendency to provide generic responses may be especially detrimental when goals need to be formulated for a specific setting, for example,
We asked the ChatGPT to help us formulate the goals for our learning environment. Time and again we got responses describing very general goals (Reflection 16).
The notion of the “personalization gap” identified by the PSTs reveals a significant challenge in integrating GAI-based tools into CP. This gap reflects the chatbot’s inability to adjust responses based on the individual needs, emotional states, and learning preferences of students—elements that human teachers can professionally respond to in real-time. As one PST noted, ChatGPT lacks the capacity to understand students on a personal level, highlighting its limitation in creating optimal, personalized curricula. The recognition of the “personalization gap” in the interviews and reflections may indicate that PSTs are aware of the necessity to address individual student needs within their CP when utilizing GAI tools. This awareness, however, also underscores the importance of ongoing adjustments and refinements in order to optimize the use of GAI in creating personalized learning experiences.
Possibly, in order to obtain more relevant answers, conducive to personalized learning, PSTs need to learn to refine their prompts; what is important, however, is that the mention of the “personalization gap” in the above excerpts indicates that they recognize the need to take account of their students’ needs in CP.
That said, five PSTs did find ChatGPT-4 helpful in designing a personalized curriculum, for example,
I found it very helpful that the chat suggested activities suitable for students with different academic levels and different motivations (Reflection 3).
Thirty-two PSTs thought that ChatGPT-4 could significantly streamline the process of CP by saving time, which—as ten PSTs pointed out—could instead be allocated to addressing individual students’ emotional and social needs, for example,
ChatGPT saves a lot of the teacher’s valuable time, and so enables the teacher to devote more time to answering questions and addressing needs of individual students (Interview 4).
Notably, 11 PSTs viewed the chatbot as a personalized environment for their own learning:
ChatGPT supports independent learning and provides responses to my unique needs. So actually, I think that we experienced learning in a constructivist way by interacting with the chatbot (Reflection 15).
Crucially, however, only five PSTs related to the idea that the chatbot could assist their students as a learning tool, for example,
ChatGPT can be adapted to variety of learning styles and could be useful to a wide range of students with different verbal and visual needs (Reflection 27).

4.2.2. Content: Organization, Articulation, and Reliability

The chatbot was perceived by 19 PSTs as helpful in producing and organizing CP contents, for example,
ChatGPT can organize large amounts of content into summary tables and other different formats, according to my needs. I’m not sure I could have managed to do this on my own (Interview 2).
In this way, these PSTs thought, it enhanced their content knowledge for CP:
Through ChatGPT, I was exposed to a level of knowledge beyond what I had learned in my studies and certainly beyond what I had known before (Reflection 34).
Twenty PSTs found the chatbot instrumental to improving their linguistic competence, by adeptly formulating contents and refining their teaching materials, for example,
The chatbot helped me to write down my thoughts and ideas more accurately, to express myself better, and thus to present a better plan (Reflection 19).
Seven PSTs wished to increase their content knowledge regarding AI, for example,
I feel that I do not know enough about AI to teach my students about it (Reflection 8).
As regards GAI-based CP, 29 PSTs voiced concerns about the reliability of the chatbot’s responses. These PSTs were uncertain about the accuracy of the information supplied by ChatGPT-4, and were thus apprehensive about incorporating it into their CP, for example,
I’m not sure if the information I got from the chat is accurate or not, and it makes it hard for me to trust it (Reflection 23).
Such misgivings suggest that PSTs felt responsible for the quality and accuracy of the contents they teach, and could thus relate to their self-efficacy in relation to content knowledge. Moreover, these PSTs may, themselves, have identified inconsistencies in ChatGPT’s responses—for example, “The chat gave me different answers when the same question was worded differently, which confused me”—or the information it supplied may have conflicted with other sources, for example,
The chat provided me with information that contradicts what I had learned in my courses, and this made me question its reliability (Reflection 41).
One PST voiced concern about relying excessively on chatbot responses:
I am worried that, on many occasions, I fully relied on the chatbot’s answers and did not verify the information (Reflection 9).
These reliability concerns may reflect the tension between the convenience of using GAI tools in CP and the responsibility of PSTs to ensure the accuracy and appropriateness of the content they use in their teaching. Such hesitations may also reflect PSTs’ awareness of the limitations of GAI-based CP, as well as their role in critically assessing and verifying the information provided by GAI tools.
In this connection, approximately half of the PSTs emphasized the importance of teaching their students to think critically and to always question the reliability of chatbot responses, for example,
The learning processes will be less effective if we do not teach our students to critically assess the information and content they receive from the chat (Reflection 37).

4.2.3. Activities: Inspiration, Ideas, and Creativity Catalysis

Data revealed that almost all PSTs saw ChatGPT-4 as an inspirational tool that spurred their creativity, especially when the activities component of CP was concerned, for example,
There is something enjoyable for a teacher in getting inspiration from the chat and then developing the ideas independently (Reflection 36).
Some PSTs who found the chatbot inspirational also described it as “very smart” and “capable of generating massive information.” Half of the PSTs felt that receiving help from the chatbot gave them motivation to plan creative activities:
By generating materials ChatGPT gave me more time to use my imagination and think more creatively about activities (Interview 7).
However, 11 PSTs felt that using the chatbot to create a curriculum compromised the intellectual creative aspect of the process, which they thought to be highly enjoyable, for example,
CP, in my view, reflects the teacher’s character. Every teacher conveys to the students their unique attitude, and if we all prepare the same assignments and lessons, we will lose our creativity (Reflection 12).
I felt that the chat was making me less creative. Instead of improving my mind and coming up with my own ideas, I got an immediate solution, and I didn’t feel that I was using my own creativity (Interview 5).

4.2.4. Assessment: Feedback

Seven PSTs felt that the chatbot had helped them to evaluate their CP by providing feedback on whether it addresses their students’ needs. Their eagerness to receive such feedback may relate to PSTs’ efficacy in creating a CP, for example, “Receiving feedback from the chat made me more confident about my CP” (Reflection 11). However, none of the PSTs raised the possibility of the chatbot assessing their students’ learning progress.

5. Discussion

In light of the growing number of published works on implementing GAI in education, consideration must also be given to exploring how to integrate GAI tools into teacher preparation, all the more so because teacher education plays an important role in shaping an effective and just educational sphere (Cochran-Smith, 2000).
This study has focused on integrating ChatGPT-4 into CP, as a core component of teacher education, in the framework of constructivist theory and with a view to its translation to practice. Findings revealed three approaches through which PSTs leveraged ChatGPT-4 to link constructivist theory to practice: (a) simplifying theory, (b) applying theory, and (c) visualizing theory. These three approaches guided PSTs in their interactions with ChatGPT-4 to gain knowledge of the constructivist framework and learn to implement it in practice.
Through the demand for linguistically precise and terminologically correct prompts, this symbiotic PST–chatbot interaction may have catalyzed PSTs’ theoretical understanding, as well as their practical knowledge—epitomizing praxis, where practice and reflection interweave to deepen insight (Van Manen, 1977). Furthermore, according to the findings, visualizing constructivism may have promoted the process whereby PSTs linked theory and practice (Wursta et al., 2004). Hence, one may conclude, albeit with caution, that ChatGPT-4 may function as a form of scaffolding, facilitating the translation of theory to practice while concomitantly enhancing PSTs’ professional language competence related to CP.
As was also found by other researchers (Hashem et al., 2024; Jauhiainen & Guerra, 2023), some of the PSTs in the current study felt that resorting to ChatGPT-4 in CP could be helpful in reducing workload and thus freeing time to address students’ needs. Yet, the majority mentioned the above-discussed “personalization gap”, realizing that the chatbot’s responses might not be sufficiently nuanced, and this insight further confirmed their appreciation of the teacher’s role in CP. Put differently, PSTs felt that the chatbot suggested implementing constructivism through technical practices, while they put emphasis on personalized strategies. Consistent with Baytak’s (2024) conclusion that, when ChatGPT-4 is used in education, the human aspect must necessarily be lacking, most of the PSTs felt that teachers, as human beings who are familiar with each of their students’ needs, can integrate constructivism into practice in an eclectic fashion. To use Schwab’s (2013, p. 611) words, “The stuff of theory is abstract or idealized representations of real things. But curriculum in action treats real things: real acts, real teachers, real children, things richer than and different from their theoretical representations. Curriculum will deal badly with its real things if it treats them merely as replicas of their theoretic representations.”
With the ongoing development of GAI tools, there is potential for these technologies to identify complexities in students’ behavior and learning patterns that may go unnoticed by teachers. By enhancing GAI literacy among PSTs, they could be better equipped to utilize GAI tools to gain deeper insights into the unique needs of their students. These tools could alert teachers to areas of difficulty or emotional distress, prompting timely interventions and enabling more personalized, responsive teaching. Nevertheless, despite the advancement of technology, it may not be possible to fully bridge the personalization gap. As Felix (2020) argues, “AI will not, should not, and indeed cannot replace the teacher” (p. 33).
Similar to Lee and Zhai’s (2024) findings, the perspective of PSTs in the current study regarding the use of the chatbot in their CP appeared to be quite balanced. They described ChatGPT-4 as a helpful tool, especially to ignite creativity in designing activities. Echoing Neumann et al. (2023), several PSTs were concerned that using the chatbot for CP may compromise mental engagement and dampen the enjoyment of creativity—possibly indicating that creativity constitutes part of PSTs’ professional identity, at least where CP is concerned.
Supporting Trust et al.’s (2023) findings, PSTs in the current study were also concerned about the accuracy of the information supplied by the chatbot, having encountered some discrepancies between its responses and other sources. Thus, notwithstanding all PSTs’ stated intention to use GAI tools for CP, one may plausibly argue that some of them may see GAI tools as a professional threat. This perception of GAI tools as a threat may stem from a fear that they could diminish the teaching profession and, with it, the career that PSTs have chosen. Another interpretation of this threat may be related to the fear that increasing reliance on GAI tools in CP could devalue PSTs’ professional skills, ultimately undermining their professional identity. Thus, the PSTs’ concern is reflected in the emphasis they place on the importance of human teachers in CP. In this regard, the PSTs’ perspective is more instrumental, focusing on the potential risks to their professional roles, rather than considering the possibilities of leveraging the tool to enhance the learning of their students. PSTs’ recognition of and apprehensions regarding the chatbot’s inability to provide personalized solutions may also attest to what philosophy calls phronesis, and what Korthagen and Kessels (1999) term, more mundanely, a realistic approach. This perspective suggests a reverse dynamic, namely, that PSTs may have displayed a tendency to theorize GAI-based CP when reflecting in and on their hands-on interactions with the chatbot.
Additionally, although PSTs may have felt that ChatGPT-4 simplified the theory-to-practice transition through such technological application (Deng, 2004), it afforded PSTs insight into some of the “big questions” about CP: what contents and skills need to be addressed in CP? (e.g., AI as subject-matter, information reliability, language, and critical thinking); how should GAI be integrated in CP? (e.g., creativity, personalization, feedback, and prompts); and where should learning take place? (e.g., visualization). However, no results pertained to the when question, and as concerns the who question, PSTs may have taken full responsibility for their CP, in line with more traditional perceptions of the teacher’s role. Neither did the results bring into relief other aspects of CP, such as content ownership or plagiarism.
This study’s research findings suggest a positive, optimistic perspective regarding PSTs and their ability or capacity to engage in pedagogical design by recognizing and capitalizing on both personal resources, such as their knowledge and beliefs, and external curriculum resources such as ChatGPT responses (Lim et al., 2018). PSTs not only identified the need to integrate GAI tools thoughtfully into their own CP but also perceived the importance of teaching their students to think critically when using these tools. Additionally, the findings indicate a balance in the PSTs’ perception of the role of ChatGPT in CP. On one hand, engaging in GAI-based CP appears to have enhanced their critical thinking skills, as evidenced by their evaluation of the chatbot’s responses. They demonstrated this by not taking the responses at face value, comparing them to the course material taught by the TE, and recognizing that the chatbot’s responses to the same prompt could vary significantly, sometimes resulting in inconsistent or inadequate outputs. On the other hand, despite recognizing these limitations, the PSTs continued to view the chatbot as an important teaching assistant, especially in the context of CP. This suggests that while the chatbot enhanced the PSTs’ critical thinking, it also remained an essential support in their planning process.
Revealing PSTs’ perceptions regarding the integration of the chatbot into their CP may indicate the potential long-term impacts of this integration on the training of student teachers. It seems that the PSTs realized that the quality of prompting is an important aspect of receiving better results from the chatbot. Thus, teacher educators may consider designing courses and activities that promote GAI literacy as both content knowledge and pedagogical knowledge related to student teacher training. In this regard, teacher educators may refer to Annapureddy et al. (2024), who suggest twelve competencies for the responsible integration of GAI tools, such as knowledge of the capacities and limitations of GAI tools, prompt engineering, and the ability to program and fine-tune them. The PSTs’ realizations related to the decrease in teachers’ workload due to GAI tools may also reflect some potential risks, such as over-reliance on GAI, resulting in the reinforcement of misconceptions based on misinformation (Ran et al., 2025). This may promote the perspective that teachers should be trained as subject matter experts, leading to a return to more traditional training that focuses on teacher content knowledge, rather than recognizing that new digital literacy skills and knowledge should be brought to the forefront.
This study is not without limitations, and its findings should be interpreted with these limitations in mind. The first has to do with the nature of the project assigned to PSTs: they were asked to design a constructivist curriculum, and thus incentivized to focus only on one theory. Additionally, the project was intended as a group project, and thus the prompts could have been formulated by all group members together or by just some of them. Moreover, the 10 PSTs who agreed to be interviewed may not be representative of all PSTs taking the course. For example, four interviewees admitted that, before the project, they had not been familiar with GAI. Additionally, this study did not include a quasi-experiment to measure the extent to which ChatGPT-4 helped PSTs to successfully design a constructivist curriculum relative to the results achieved by PSTs who did not use the chatbot in their CP. Moreover, this study explored PSTs’ interactions specifically with ChatGPT-4; exploring a variety of GAI tools may have resulted in different findings. Another limitation is that this study does not include the chatbot’s responses and how the PSTs integrated these responses into their project. This is an important aspect that future research should explore. Finally, as this study adopted a qualitative methodology to answer the research questions, the ability to generalize its findings is limited.
The above discussion of this study’s findings in light of its limitations points to a broad range of possibilities for future research. By adopting a quantitative approach, one could examine whether PSTs’ demographic characteristics, such as age, gender, and specialization subjects, affect the efficacy of their interactions with the chatbot with the objective of linking theory to practice. For example, Gurl et al. (2024) revealed that ChatGPT tended to suggest teacher-centered activities to mathematics PSTs—a finding that raises the question of whether chatbots embed certain teaching methods as a default, and if so, how this impacts PSTs’ attitudes towards CP. Quantitatively oriented research may also examine the impact of integrating GAI tools in teacher education by including courses where GAI tools were absent and defining the PSTs who participated in such courses as a control group. Future research should also track how PSTs implement GAI-based CP in their practicum, and how TEs may use PSTs’ prompts to pinpoint theoretical misconceptions. Finally, the findings call for further exploration of the role of GAI in teacher agency (Molla & Nolan, 2020), which may in turn shed light on the notions embedded in this tool regarding PSTs’ autonomy and decision-making in CP.
Zhao and Watterston (2021) argue that today’s uncertain and rapidly changing reality requires a fundamental rethinking of curriculums. While a curriculum framework needs to be defined at the system level, it should be flexible enough to allow stakeholders the autonomy to make changes. As Cochran-Smith (2000) compellingly argues, teacher education depends to a great extent on how three fundamental questions are framed: about knowledge, learning, and outcomes. Accordingly, in a reality in which AI is an integral part, TEs, PSTs, and other stakeholders should collaboratively reconsider the purposes of teaching. TEs and PSTs should discuss GAI-based CP, so as to prevent PSTs from being dazzled by chatbots’ responses and resorting to the offloading model of CP (Lim et al., 2018), and so as to address PSTs’ concerns regarding GAI-based CP. It stands to reason that working in a complex and labor-intensive profession may tempt PSTs to sometimes dispense with critical thinking (i.e., taking the chatbot’s responses for granted without any evaluation) and take problematic shortcuts. Hence, TEs should encourage PSTs to reflect and critically discuss the role of theories with regard to GAI-based CP.
Deng (2004, p. 155) aptly notes that “the role of theory is not just to assist in the training of PSTs in skills and procedures, but more importantly, to educate them more widely about the complexities, intellectual and moral dimensions of classroom management.” In line with this perspective, one could ask the following with regard to the TEs-PSTs-GAI synergy: could it create a new dynamic conducive to the reevaluation of major CP-related questions and to discussions regarding new curricular paradigms?

6. Conclusions

The findings yield three primary conclusions: (1) ChatGPT has the potential to facilitate the translation of theory to practice through the simplification, application, and visualization of theory; (2) iteration and language precision may enhance PSTs’ professional language; and (3) analyzing and reflecting on ChatGPT responses may enhance PSTs’ critical thinking.

Funding

This research received no external funding. The APC was funded by The Levinsky-Wingate Academic College.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Ethics Committee of The Levinsky-Wingate Academic College (protocol code: 2024022501; date of approval: 1 July 2024).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The original contributions presented in the study are included in the article.

Acknowledgments

I would like to thank Karen Ettinger, Head of the Innovation Unit at the Levinsky-Wingate Academic College, for her technical-pedagogical guidance in the course.

Conflicts of Interest

The author declares no conflicts of interest.

References

1. Agyapong, B., Obuobi-Donkor, G., Burback, L., & Wei, Y. (2022). Stress, burnout, anxiety and depression among teachers: A scoping review. International Journal of Environmental Research and Public Health, 19(17), 10706.
2. Aktan, S. (2021). Waking up to the dawn of a new era: Reconceptualization of curriculum post COVID-19. Prospects, 51, 205–217.
3. Annapureddy, R., Fornaroli, A., & Gatica-Perez, D. (2024). Generative AI literacy: Twelve defining competencies. Digital Government: Research and Practice.
4. Baytak, A. (2024). The content analysis of the lesson plans created by ChatGPT and Google Gemini. Research in Social Sciences and Technology, 9(1), 329–350.
5. Biberman-Shalev, L., Pinku, G., Hemi, A., Nativ, Y., Paz, O., & Enav, Y. (2024). Portion coherence: Enhancing the relevance of introductory courses in teacher education. Frontiers in Education, 9, 1415518.
6. Bobbitt, J. F. (1918). The curriculum. Houghton Mifflin.
7. Braun, V., & Clarke, V. (2019). Thematic analysis. In P. Liamputtong (Ed.), Handbook of research methods in health social sciences (pp. 843–860). Springer.
8. Cahapay, M. B. (2020). Rethinking education in the new normal post COVID-19 era: A curriculum studies perspective. Aquademia, 4(2), ep20018.
9. Carcary, M. (2009). The research audit trial—Enhancing trustworthiness in qualitative inquiry. Electronic Journal of Business Research Methods, 7, 11–24.
10. Cevikbas, M., König, J., & Rothland, M. (2023). Empirical research on teacher competence in mathematics lesson planning: Recent developments. ZDM–Mathematics Education, 56, 101–113.
11. Chan, C. K. Y. (2023). Is AI changing the rules of academic misconduct? An in-depth look at students’ perceptions of ‘AI-giarism’. arXiv, arXiv:2306.03358.
12. Cochran-Smith, M. (2000). The future of teacher education: Framing the questions that matter. Teaching Education, 11(1), 13–24.
13. Cohen, J. (1960). A coefficient of agreement for nominal scales. Educational and Psychological Measurement, 20, 37–46.
14. Cook, L. S., Smagorinsky, P., Fry, P. G., Konopak, B., & Moore, C. (2002). Problems in developing a constructivist approach to teaching: One teacher’s transition from teacher preparation to teaching. The Elementary School Journal, 102(5), 389–413.
15. Cooper, G. (2023). Examining science education in ChatGPT: An exploratory study of generative artificial intelligence. Journal of Science Education and Technology, 32, 444–452.
16. Crawford, J., Cowling, M., & Allen, K. (2023). Leadership is needed for ethical ChatGPT: Character, assessment, and learning using artificial intelligence (AI). Journal of University Teaching and Learning Practice, 20(3), 02.
17. Deng, Z. (2004). The role of theory in teacher preparation: An analysis of the concept of theory application. Asia-Pacific Journal of Teacher Education, 32(2), 143–157.
18. Dolezal, D., Motschnig, R., & Ambros, R. (2025). Pre-service teachers’ digital competence: A call for action. Education Sciences, 15(2), 160.
19. Felix, C. V. (2020). The role of the teacher and AI in education. In International perspectives on the role of technology in humanizing higher education (pp. 33–48). Emerald Publishing Limited.
20. Flick, U. (2004). Design and process in qualitative research. In U. Flick, E. von Kardoff, & I. Steinke (Eds.), A companion to qualitative research (pp. 146–152). Sage.
21. Flores, M. A. (2016). Teacher education curriculum. In J. Loughran, & M. L. Hamilton (Eds.), International handbook of teacher education (pp. 187–230). Springer.
22. Gupta, P., Raturi, S., & Venkateswarlu, P. (2023). ChatGPT for designing course outlines: A boon or bane to modern technology. Preprint.
23. Gurl, T. J., Markinson, M. P., & Artzt, A. F. (2024). Using ChatGPT as a lesson planning assistant with preservice secondary mathematics teachers. Preprint.
24. Hashem, R., Ali, N., El Zein, F., Fidalgo, P., & Khurma, O. A. (2024). AI to the rescue: Exploring the potential of ChatGPT as a teacher ally for workload relief and burnout prevention. Research & Practice in Technology Enhanced Learning, 19, 1–26.
25. Jauhiainen, J. S., & Guerra, A. G. (2023). Generative AI and ChatGPT in school children’s education: Evidence from a school lesson. Sustainability, 15(18), 14025.
26. Juarez, B. (2019). The intersection of theory and practice in teacher preparation courses. Journal of Instructional Research, 8(2), 84–88.
27. Karaman, M. R. (2024). Are lesson plans created by ChatGPT more effective? An experimental study. International Journal of Technology in Education, 7(1), 107–127.
28. Korthagen, F. A., & Kessels, J. P. (1999). Linking theory and practice: Changing the pedagogy of teacher education. Educational Researcher, 28(4), 4–17.
29. Krahenbuhl, K. S. (2016). Student-centered education and constructivism: Challenges, concerns, and clarity for teachers. The Clearing House: A Journal of Educational Strategies, Issues and Ideas, 89(3), 97–105.
30. Kroll, L. R. (2004). Constructing constructivism: How student-teachers construct ideas of development, knowledge, learning, and teaching. Teachers and Teaching, 10(2), 199–221.
31. Lee, G. G., & Zhai, X. (2024). Using ChatGPT for science learning: A study on pre-service teachers’ lesson planning. IEEE Transactions on Learning Technologies, 17, 1683–1700.
32. Lim, W., Son, J. W., & Kim, D. J. (2018). Understanding preservice teacher skills to construct lesson plans. International Journal of Science and Mathematics Education, 16, 519–538.
33. Luke, A. (2008). Introductory essay. In F. M. Connelly (Ed.), Curriculum and instruction (pp. 145–151). Sage Publication Inc.
34. Mitra, D. L. (2004). The significance of students: Can increasing “student voice” in schools lead to gains in youth development? Teachers College Record, 106, 651–688.
35. Molla, T., & Nolan, A. (2020). Teacher agency and professional practice. Teachers and Teaching, 26(1), 67–87.
36. Neumann, M., Rauschenberger, M., & Schön, E. M. (2023, May 16). We need to talk about ChatGPT: The future of AI and higher education. 2023 IEEE/ACM 5th International Workshop on Software Engineering Education for the Next Generation (SEENG) (pp. 29–32), Melbourne, Australia.
37. Nowell, L. S., Norris, J. M., White, D. E., & Moules, N. J. (2017). Thematic analysis: Striving to meet the trustworthiness criteria. International Journal of Qualitative Methods, 16(1), 1609406917733847.
38. OpenAI. (2023). ChatGPT: Optimizing language models for dialogue. Available online: https://openai.com/blog/chatgpt/ (accessed on 25 July 2024).
39. Pinar, W. F. (2004). What is curriculum theory? Routledge.
40. Polak, S., Schiavo, G., & Zancanaro, M. (2022, April 29–May 5). Teachers’ perspective on artificial intelligence education: An initial investigation. CHI Conference on human factors in computing systems extended abstracts (pp. 1–7), New Orleans, LA, USA.
41. Prosser, M. (2000). Using phenomenographic research methodology in the context of research in teaching and learning. In J. A. Bowden, & E. Walsh (Eds.), Phenomenography (pp. 34–46). RMIT University Press.
42. Ran, Y., Cheng, X., He, Y., Zeng, A., & Mou, J. (2025). Misfortune or blessing? The effects of GAI misinformation on human-GAI collaboration. Available online: https://hdl.handle.net/10125/108851 (accessed on 25 July 2024).
  43. Schwab, J. J. (2013). The practical: A language for curriculum. Journal of Curriculum Studies, 45(5), 591–621, (Original work published 1969). [Google Scholar] [CrossRef]
  44. Shulman, L. S. (1998). Theory, practice, and the education of professionals. The Elementary School Journal, 98(5), 511–526. [Google Scholar] [CrossRef]
  45. Strzelecki, A. (2023). To use or not to use ChatGPT in higher education? A study of students’ acceptance and use of technology. Interactive Learning Environments, 32, 5142–5155. [Google Scholar] [CrossRef]
  46. Trust, T., Whalen, J., & Mouza, C. (2023). Editorial: ChatGPT: Challenges, opportunities, and implications for teacher education. Contemporary Issues in Technology and Teacher Education, 23(1), 1–23. [Google Scholar]
  47. Tyler, R. W. (1949). Basic principles of curriculum and instruction. University of Chicago Press. [Google Scholar]
  48. UNESCO. (2023). International task force on teachers for education 2030. UNESCO. [Google Scholar]
  49. van den Berg, G., & du Plessis, E. (2023). ChatGPT and generative AI: Possibilities for its contribution to lesson planning, critical thinking, and openness in teacher education. Education Sciences, 13(10), 998. [Google Scholar] [CrossRef]
  50. Van Manen, M. (1977). Linking ways of knowing with ways of being practical. Curriculum Inquiry, 6(3), 205–228. [Google Scholar] [CrossRef]
  51. Vazhayil, A., Shetty, R., Bhavani, R. R., & Akshay, N. (2019, December 9–11). Focusing on teacher education to introduce AI in schools: Perspectives and illustrative findings [Conference session]. 2019 IEEE Tenth International Conference on Technology for Education (pp. 71–77), Goa, India. [Google Scholar] [CrossRef]
  52. Wang, X., Li, L., Tan, S. C., Yang, L., & Lei, J. (2023). Preparing for AI-enhanced education: Conceptualizing and empirically examining teachers’ AI readiness. Computers in Human Behavior, 146, 107798. [Google Scholar] [CrossRef]
  53. Wursta, M., Brown-DuPaul, J., & Segatti, L. (2004). Teacher education: Linking theory to practice through digital technology. Community College Journal of Research & Practice, 28(10), 787–794. [Google Scholar] [CrossRef]
  54. Zhai, X. (2023). ChatGPT for next generation science learning. XRDS: Crossroads, The ACM Magazine for Students, 29(3), 42–46. [Google Scholar] [CrossRef]
  55. Zhao, Y., & Watterston, J. (2021). The changes we need: Education post COVID-19. Journal of Educational Change, 22, 3–12. [Google Scholar] [CrossRef] [PubMed]
Figure 1. An example of the “Theory in Simple Words” approach.
Figure 2. An example of the “Listing Theoretical Principles” approach.
Figure 3. An example of the “Applying Theory” approach.
Figure 4. An example of the “Applying Theory” approach in a specific context.
Figure 5. An example of the “Visualizing Theory” approach.