Article

Exploring Teachers’ Beliefs About ChatGPT in Arts Education

by Maria Kladaki, Apostolos Kostas and Panagiotis Alexopoulos *
Department of Primary Education, University of the Aegean, 85100 Rhodes, Greece
* Author to whom correspondence should be addressed.
Educ. Sci. 2025, 15(7), 795; https://doi.org/10.3390/educsci15070795
Submission received: 2 May 2025 / Revised: 2 June 2025 / Accepted: 12 June 2025 / Published: 20 June 2025

Abstract

In recent years, there has been growing interest in the pedagogical use of ChatGPT within arts education, including literature, drama, music, and painting. This research investigates the beliefs of primary and secondary school teachers who teach arts regarding the pedagogical use of ChatGPT, exploring potential use, expected benefits or risks, support or rejection from the educational community, and possible barriers or facilitators, based on Ajzen’s Theory of Planned Behavior. A qualitative study was conducted with a sample of 67 teachers familiar with or having used ChatGPT in education. Data were collected through semi-structured interviews and analyzed thematically based on behavioral, normative, and control beliefs. Teachers identified expected benefits such as increased student interest, creativity, and critical thinking, as well as the facilitation of research and support for students with special needs. Concerns included copying, misinformation, and reduced critical thinking and creativity. They expressed ambivalence and skepticism toward ChatGPT’s pedagogical use, being optimistic about educational benefits and community support but concerned about future challenges. Finally, they emphasized the need for training and adequate technological infrastructure. The findings highlight the importance of equipping teachers with the necessary skills and institutional support to ensure the responsible and effective integration of AI in arts education.

1. Introduction

1.1. ChatGPT in Arts Teaching

The convergence of New Technologies (NT) and the Arts forms a new framework for understanding how art is produced and shared. NT and generative artificial intelligence (GenAI) applications, along with the rapid development of computer programming systems, have led to the ever-increasing development of machine learning language models that can understand and produce text in a human-like way. Machine learning language models are systems that analyze a language’s patterns and structures, by interacting with users and drawing on a “training” database, in order to understand and produce text. One such example is ChatGPT (Chat Generative Pre-trained Transformer), an advanced machine learning model that was first made available to the general public in November 2022, quickly gaining popularity and attracting millions of users. The main advantage of ChatGPT is its ability to rapidly generate human-like texts in response to users’ text-based questions.
Although seemingly unconnected—since art is linked to the human creative impulse—GenAI applications such as ChatGPT have a potential relationship with the arts, such as literature and theatre, since they provide new possibilities for the expression of human creativity. For instance, ChatGPT can create impressive descriptions, narratives, or even poems; in the field of theatre, it can also contribute to creating scripts and developing dialogues (Haleem et al., 2022; Mizumoto & Eguchi, 2023). However, this dynamic also raises several concerns, such as issues regarding the authenticity of creation and the safeguarding of human, personal expression (C. Guo et al., 2023). Alongside concerns about authenticity and personal expression, a particularly critical issue that arises when using generative AI tools such as ChatGPT in the arts is that of intellectual and creative property. The use of GenAI in content creation raises fundamental questions about the ownership and rights of creative works. In traditional artistic and educational contexts, authorship is clear and directly attributed to the human creator. However, when artworks, texts, or educational materials are produced or co-produced with the help of artificial intelligence, the boundaries of authorship and ownership become blurred.
Specifically, questions arise as to whether a work created by artificial intelligence can be considered the intellectual property of the person who initiated it or of the developer of the artificial intelligence, or whether it remains completely outside the current legal framework. Indeed, if we take into account that generative AI draws on and synthesizes data from existing databases, issues arise regarding the intellectual property of the original content from which the new content is reconstructed. Because the underlying database is often not visible, the original creators go uncredited, while concerns also arise about the possible standardization of art, as the outputs often resemble one another. In the context of art education, these issues become even more intense. When students or teachers use ChatGPT to create poems, scripts, or course material, questions arise about who owns the copyright, who is responsible for possible plagiarism, and whether the creative result can truly be considered original (Elbanna & Armstrong, 2023; Rahman & Watanobe, 2023; Smits & Borghuis, 2022).
The aforementioned rapid technological progress has confronted contemporary education with new challenges but also with diverse possibilities. The evolution of GenAI and its applications has created new perspectives on the way humans interact with machines, and among these interactions, the field of education could not remain unaffected. This finding is reinforced by the advent of ChatGPT and other GenAI applications that create new pedagogical perspectives and risks (Dimeli & Kostas, 2025; Fokides & Peristeraki, 2025; Kostas & Chanis, 2024). As NTs and now GenAI applications are diffusing into everyone’s lives, they are therefore coming to affect education as well, with teachers being asked to adapt to the new, modern digital reality of the 21st century and the challenges that accompany it (Dimeli & Kostas, 2025).
In this context, this study aims to investigate primary and secondary school teachers’ beliefs about the pedagogical use of ChatGPT in arts subjects such as literature, music, theatre, and art—a field in which there is a significant gap in the literature, given the recent nature of GenAI applications. Specifically, the study intends to investigate, according to Ajzen’s (1991) Theory of Planned Behavior, teachers’ beliefs that influence their intention to utilize ChatGPT in art classes.
As mentioned above, Ajzen’s (1991) Theory of Planned Behavior was chosen to investigate teachers’ beliefs. According to this theory, behavior is determined by behavioral intentions, which are in turn shaped by attitudes toward the behavior, subjective norms, and perceived behavioral control. The theory has been used many times to interpret and evaluate teachers’ beliefs regarding the introduction and use of technological tools (Ajzen, 1991; Kan & Fabrigar, 2017).
As stated in the categorical framework of its creator, beliefs can be divided into behavioral beliefs, normative beliefs, and control beliefs. Behavioral beliefs concern the outcomes a person expects from a behavior; they shape the person’s attitude toward that behavior, which in turn informs the intention that precedes it. Normative beliefs, on the other hand, are those on the basis of which a person believes that they should or should not perform a behavior; in other words, they reflect a person’s perception of social normative pressures, or of others’ views about behaviors that should or should not be performed. On this basis, the individual’s subjective norm is formed, i.e., their perception of the behavior in question as influenced by the judgment of “significant others,” such as parents, managers, and colleagues. Finally, control beliefs relate to the presence of factors that can facilitate or hinder the manifestation of a behavior; they determine perceived behavioral control, that is, the perceived ease or difficulty of performing the behavior, and are related to self-efficacy (Ajzen, 1991). The main questions asked to teachers were therefore formulated on the basis of these categories of beliefs arising from Ajzen’s (1991) Theory, which has been widely used to investigate teachers’ beliefs about NT tools (Figure 1). In the context of educational innovation, and specifically the adoption of GenAI tools such as ChatGPT, the Theory of Planned Behavior provides a robust lens for analyzing teachers’ intentions and actions. It enables the systematic examination of how teachers’ attitudes towards GenAI, the expectations and pressures from their professional and social environment, and their perceived control over its use determine their willingness to integrate such technologies into their teaching practice.

1.2. Studies and Research Contribution

The involvement of children in the arts during their leisure time has shown significant benefits for their overall development and engagement (Koulianou et al., 2025; Mastrothanasis & Kladaki, 2020; Mastrothanasis et al., 2025a, 2025b), making it essential to explore technological tools like ChatGPT that could further enhance their creative experiences. In order to investigate both teachers’ and students’ beliefs about the pedagogical use of ChatGPT and its role in motivating English language learning, Ali et al. (2023) administered online questionnaires (n = 80) in school settings. The results showed that both students and teachers had a neutral attitude regarding the effect that ChatGPT can have on listening and speaking skills. However, teachers agreed that teaching involving ChatGPT can provide learning motivation to students. It is worth noting that this research was conducted just a few months after ChatGPT was made widely available, reflecting the early stage of empirical research in this field.
Further studies focusing on school contexts have emphasized the dual nature of teachers’ beliefs regarding the use of GenAI in the classroom. On the one hand, educators recognize the value of such tools in enhancing student engagement, creativity, and the overall learning process (Dimeli & Kostas, 2025; Fokides & Peristeraki, 2025; Kostas & Chanis, 2024). On the other hand, they frequently express concerns related to the authenticity of student work, potential copying, reduced critical thinking, and issues surrounding ethical use and data privacy (Ali et al., 2023). Despite these concerns, some teachers remain optimistic, highlighting the potential of GenAI to motivate students and support more innovative and differentiated teaching approaches.
The study by Chen et al. (2025) employed a cross-sectional survey design to collect data from teachers in a large urban school district in the United States. A total of 1454 K-12 teachers participated, responding to both closed- and open-ended questions regarding their experiences, perceptions, and concerns about generative artificial intelligence (GenAI) tools, including ChatGPT, in their teaching practice. The sample included teachers from various disciplines, among which were teachers of the arts (language arts, literature, creative writing, music, and visual arts). Analysis included both quantitative and qualitative methods, allowing for the identification of domain-specific perspectives.
According to the findings, teachers in the arts disciplines expressed both potential and reservations concerning the use of generative AI tools like ChatGPT in their instruction. A number of arts teachers reported that generative AI could support the process of writing, idea generation, and creative activities. Respondents noted that tools such as ChatGPT might assist students in brainstorming ideas for creative writing assignments, poems, and stories, as well as help them overcome writer’s block or generate alternative phrasings and perspectives. Some teachers mentioned that GenAI could serve as a useful scaffold for students who struggle with creative expression or have difficulty starting assignments.
Despite recognizing these affordances, teachers in the arts also raised substantial concerns about the use of GenAI in their subjects. The issue of academic honesty was particularly prominent. Many teachers worried that students might use ChatGPT or similar tools to generate entire essays, poems, or creative projects, thereby bypassing genuine creative effort and undermining the educational goals of their courses. They expressed fears that such practices could lead to increased plagiarism and a decline in original student work. Teachers also voiced concerns regarding the impact on students’ creative and critical thinking skills. There was apprehension that frequent use of GenAI could foster overreliance on automated assistance, impeding the development of authentic creative abilities and the willingness to engage deeply with artistic processes. Some respondents noted that, in the arts, personal voice, style, and emotional investment are fundamental, and there was skepticism about whether GenAI could support or respect these essential elements. Furthermore, teachers highlighted the risk of misinformation or inaccuracies in AI-generated content, especially in creative or artistic contexts where nuance, intent, and cultural references are significant. They emphasized the importance of teacher guidance to ensure that students use GenAI tools ethically and responsibly. Several arts teachers called attention to the need for clear guidelines and policies regarding the use of GenAI in creative assignments. They requested professional development to help them understand both the capabilities and limitations of these tools and to equip them with strategies for teaching students to use GenAI as a supplementary resource rather than a substitute for their own creativity.
In addition, Bergdahl and Sjöberg (2025) conducted a cross-sectional study involving 312 K-12 teachers from Sweden. Data were collected through a comprehensive questionnaire that included both closed and open-ended items. The teachers represented various subjects, with 28% indicating that they taught “Arts and aesthetics”. The study aimed to explore teachers’ attitudes, perceptions, and self-efficacy regarding the use of artificial intelligence (AI) tools, specifically chatbots like ChatGPT, in educational practice.
Teachers of “Arts and aesthetics” provided a diverse set of views concerning the integration of AI tools such as ChatGPT into their teaching. Many expressed an overall positive attitude toward the potential of AI in education. They saw value in using chatbots for inspiration and as a means to create engaging learning materials, particularly in creative and artistic domains.
Several arts teachers noted that AI tools could facilitate creative processes by offering new ideas and perspectives for artistic expression. For instance, they mentioned that chatbots might help students generate prompts or overcome creative blocks in activities like creative writing, music composition, or visual arts projects. Some teachers described how AI could be used to support lesson planning, giving them more time to focus on direct interaction with students and the development of artistic skills.
However, teachers also highlighted several concerns and limitations regarding the use of AI in the arts. A prominent concern was the potential impact on originality and authenticity in students’ work. Teachers worried that excessive reliance on AI-generated content could diminish students’ own creative efforts, leading to less original art or writing. The authenticity of artistic output was seen as a crucial aspect of arts education, and teachers emphasized that AI should not replace personal engagement, imagination, and the development of individual style. Another concern raised was the risk of dependency on technological solutions. Some teachers feared that if students turned too frequently to chatbots for ideas or solutions, then it could hamper their ability to solve creative problems independently. There was also a sense of uncertainty about the quality and appropriateness of AI-generated material, especially in terms of artistic nuance, emotional content, and alignment with educational objectives in the arts. Arts teachers expressed a need for professional development and support to help them better understand the capabilities and limitations of AI tools. Many indicated that they lacked confidence in their own AI skills (“AI self-efficacy”) and wanted more guidance on integrating chatbots into arts education in a meaningful way. They stressed the importance of clear guidelines and best practices to ensure that AI was used to enhance, rather than detract from, the artistic and creative goals of their subjects (Bergdahl & Sjöberg, 2025).
Overall, the literature in primary and secondary education underlines both the promise and the complexity of integrating GenAI tools like ChatGPT into arts education. Teachers’ perceptions reflect a careful weighing of benefits, such as increased motivation and creativity, against risks related to academic integrity and the changing nature of learning processes. The ongoing expansion of empirical studies in school environments is expected to further clarify these trends and support the development of effective pedagogical practices for GenAI integration.
Although the primary focus of this review is on research conducted in primary and secondary education, it is important to briefly acknowledge findings from the higher education context, as they highlight issues that may also emerge in school settings. For instance, Mohamed (2023) investigated the beliefs of ten lecturers at Northern Border University in Saudi Arabia regarding the effectiveness of ChatGPT on students’ English language learning. This study highlighted both supportive and critical attitudes: some lecturers emphasized the usefulness of ChatGPT in providing quick and accurate answers, while others raised concerns about its impact on students’ critical thinking, research skills, and the perpetuation of biases and misinformation.
In contrast, Iqbal et al. (2022) conducted semi-structured interviews with 20 higher education teachers in Pakistan, capturing a predominantly negative attitude towards ChatGPT. Participants in this study expressed strong concerns related to copying and plagiarism, although some pedagogical benefits, such as for assessment and programming skills, were also noted. The researchers concluded that there is a need for the training and informing of teaching staff, and further investigation into students’ beliefs.
In the study by Sáez-Velasco et al. (2024), five educators in higher arts education participated in a focus group to discuss the use of generative AI in the arts. The educators recognized that generative AI can speed up creative processes, reduce costs, and serve as a useful tool for inspiration or overcoming creative blocks. However, they stressed that AI-generated content often results in more homogeneous and less emotionally nuanced output than human-made art. Educators believed that foundational artistic knowledge and critical capacity are necessary before using AI, and overreliance on AI may hinder the development of personal artistic skills. They also highlighted the need for clear authorship and regulation to distinguish between AI-generated and human-created works. Overall, educators considered AI to be a valuable supplementary tool but not a replacement for human creativity and expertise in arts education.
While these studies originate from the higher education sector and may not directly correspond to the realities of primary and secondary education, they offer valuable insights into broader trends and challenges surrounding the adoption of GenAI in educational contexts. The study of teachers’ beliefs about the pedagogical use of ChatGPT remains a largely uncharted field. Given that ChatGPT was released only recently, few studies focus on teachers’ beliefs about this tool, and fewer still, if any, focus specifically on the dynamic relationship between ChatGPT and arts education. The present study’s contribution therefore lies in the exploration of a hitherto relatively unexplored research field, and its novelty and originality lie in the particular interconnection of ChatGPT with teachers’ beliefs about humanities courses involving the arts.
The mapping of teachers’ beliefs is expected to shed light on the advantages and challenges arising from a possible involvement of this tool in education, and more specifically in arts courses, and to highlight and record the specific expectations, concerns, and worries of the teachers themselves. Knowing teachers’ beliefs makes it possible to create the conditions for practices that exploit the potential of ChatGPT while reducing potential risks. Furthermore, analyzing teachers’ beliefs in the field of art allows for a better understanding of the way in which ChatGPT can influence the human creative process, while at the same time evaluating its reception by its final recipients: in this educational context, the teachers themselves.

2. Methodology

2.1. Research Aims and Questions

Considering the aforementioned gap in the literature, this study aims to investigate primary and secondary school teachers’ beliefs about the pedagogical use of ChatGPT in art lessons, such as literature, music, drama, and painting. Utilizing Ajzen’s (1991) Theory of Planned Behavior, we investigate teachers’ beliefs that positively or negatively influence their intention to utilize ChatGPT in art lessons. Specifically, the research aims to map teachers’ beliefs about expected positive and negative outcomes from the pedagogical use of ChatGPT in the arts, expected support or rejection from the wider educational community, and expected barriers and facilitators. Accordingly, the research questions are framed as follows:
RQ1. 
What are the positive and negative outcomes that teachers expect from the pedagogical utilization of ChatGPT in the arts that contribute to shaping a positive or negative attitude?
RQ2. 
What are teachers’ beliefs about normative pressures, or the beliefs of others, regarding the pedagogical use of ChatGPT in the arts that shape their decision about whether or not to use it themselves?
RQ3. 
What are teachers’ beliefs about the factors they believe could facilitate or hinder the use of ChatGPT in lessons involving the arts?

2.2. Research Design and Participants

In order to study teachers’ beliefs about the pedagogical use of ChatGPT in the arts, a qualitative analysis of data collected through semi-structured interviews was chosen. Among the qualitative research approaches, phenomenology and its methodological principles were adopted (Cohen et al., 2017). Phenomenological research aims to bring out the deeper meaning—the essence—of a phenomenon or experience as it is lived by the research subjects, focusing on individuals’ subjective experiences. As ChatGPT is a new tool and research on teachers’ beliefs in arts teaching is lacking, a qualitative approach to data collection and analysis seemed the appropriate first step.
Purposive sampling, followed by snowball (chain) sampling, was used to gather the research sample. Primary and secondary school teachers who are familiar with ChatGPT and who teach art subjects in school were selected. An online questionnaire was used to identify candidates for interviews; of the 250 teachers interested in participating, 67 who had answered in the affirmative were selected (Mason, 2017; Mertler, 2018). At the conclusion of each interview, participants were told that they could inform other teachers who knew ChatGPT; in this way, another 10 teachers were enrolled (Goodman, 1961).
The total sample consisted of 67 teachers who teach art classes in primary and secondary schools in Greece and are familiar with ChatGPT. The sample is geographically heterogeneous. Of the 67, 53 (79.1%) were female and 14 (20.9%) were male. In total, 27 (40.3%) had 1 to 5 years of service, 24 (35.8%) had 6 to 15 years, 10 (14.9%) had 16 to 25 years, and 6 (9%) had more than 25 years. Additionally, 41 (61.2%) taught in secondary education, while 26 (38.8%) taught in primary education. Furthermore, 42 (62.7%) held a postgraduate degree, 16 (23.9%) held only an undergraduate degree, and 7 (10.4%) also held a PhD. Only 2 (3%) held a second basic degree.
Regarding their expertise, the participants included 6 English teachers (9%), 1 French teacher (1.5%), 3 music teachers (4.5%), 7 kindergarten teachers (10.4%), 6 art teachers (9%), 1 physical education teacher (1.5%), 3 special education teachers (4.5%), 24 Greek language teachers (35.8%), and 16 primary teachers (23.9%). They all taught arts, such as literature, poetry, dance, music, painting, and theatre, in Greek schools.

2.3. Data Collection and Analysis

Semi-structured interviews were conducted online (via Zoom) or by telephone, according to participant preference, with each session lasting approximately 15–20 min. All interviews were audio-recorded with the participants’ consent. Zoom interviews were recorded using the platform’s integrated feature (capturing both video and audio), while telephone interviews were recorded with a separate audio device. Each interview began with demographic questions regarding gender, years in service, level of education, specialization, and studies. Participants were then asked questions about their beliefs (behavioral beliefs, normative beliefs, and control beliefs), formulated according to Ajzen’s (1991) Theory of Planned Behavior (the full set of interview prompts is provided in Appendix A).
Transcription was performed by the lead researcher using Microsoft Word, followed by a verification of accuracy through comparison with the original audio recordings. To ensure reliability, a second researcher reviewed a random selection of transcripts, and any uncertainties were resolved by consulting the audio files. All transcripts were anonymized and securely stored (Mastrothanasis & Alexopoulos, 2025).
Data analysis was conducted using thematic analysis, primarily adopting a deductive approach based on Ajzen’s (1991) Theory of Planned Behavior. After transcription, responses were initially coded according to the three TPB categories: (a) behavioral beliefs, (b) normative beliefs, and (c) control beliefs. At the same time, the analysis remained open to new patterns and themes that emerged from the data (inductive coding), in order to capture important issues not strictly covered by the theoretical framework. Coding was performed independently by two researchers; discrepancies were discussed and resolved to ensure reliability. Frequency (n) and relative frequency (%) were used to present and compare results, and indicative participant quotes were also included. A detailed description and explanation of the codes used in the analysis is provided in Appendix B.
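The tabulation of frequencies (n) and relative frequencies (%) described above is straightforward to reproduce. The following sketch is illustrative only (the code labels and counts are hypothetical, not the study’s actual coded data) and shows how each code’s share of all coded references could be computed:

```python
from collections import Counter

# Hypothetical coded references: one code label per coded interview segment.
# Labels and counts are invented for illustration, not taken from the study.
coded_references = (
    ["student_engagement"] * 51
    + ["research_support"] * 43
    + ["copying_risk"] * 44
)

counts = Counter(coded_references)
total = sum(counts.values())  # total number of coded references

# Relative frequency (%) is each code's share of all coded references.
table = {code: (n, round(100 * n / total, 2)) for code, n in counts.items()}

# Report codes from most to least frequent.
for code, (n, pct) in sorted(table.items(), key=lambda kv: -kv[1][0]):
    print(f"{code}: n = {n}, n% = {pct}%")
```

A tabulation of this kind makes it easy to verify that the relative frequencies of all codes within a belief domain sum to approximately 100%.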
The coding process followed the six-phase method of Braun and Clarke (2006), starting with data familiarization and progressing through initial coding, theme development, review, definition, and final reporting. The methodological approach was informed by established qualitative research guidelines (Braun & Clarke, 2006; Cohen et al., 2017).

3. Results

3.1. Behavioral Beliefs

The first area of analysis refers to teachers’ behavioral beliefs, according to Ajzen’s (1991) theory. In total, 355 references were detected, which were then divided into 2 thematic areas and 20 sub-codes (Table 1 and Table 2). This domain therefore comprises the positive and negative outcomes that teachers expect from the pedagogical use of ChatGPT in the arts.
As the results showed, teachers expressed nearly equal numbers of positive (177 reports out of 355, n% = 49.85%) and negative expectations about the pedagogical use of ChatGPT in the arts. In fact, the positive outcomes appear to vary (10 codes). Specifically, the main benefit, as expressed by several teachers, is the improvement of students’ participation, engagement, and activation of their interest in art lessons (n = 51, n% = 14.37%, Participant 28: “I definitely think it will improve students’ engagement in art lessons because all the children are familiar with the technologies”).
Furthermore, the contribution of ChatGPT to research was also noted due to the speed of responses (n = 43, n% = 12.11%, Participant 26: “They would have answers for every question they had. For example, they could ask about a painting or an artwork”).
The teachers also highlighted its potential contribution as an assistant (n = 33, n% = 9.30%) in a variety of ways, such as providing materials, ideas, and exercises, saving time, and creating teaching scenarios. As they reported, it could save them valuable time and help them in organizing lessons (Participant 67: “It would help me in organizing my lessons, giving me ideas and also exercises”).
In addition, there were also reports about ChatGPT’s contribution regarding the improvement of both students’ and teachers’ creativity in lessons involving the arts, such as literature, drama, music, and art (n = 15, n% = 4.23%, Participant 8: “It provides inspiration and original, creative ideas that you find hard to think of on your own”).
Furthermore, there were some references (n = 10, n% = 2.82%) from the teachers about the benefit that could be gained in improving students’ vocabulary and writing skills (Participant 15: “I consider the added value of modifying and correcting texts between ChatGPT and students as the most important benefit”). Equal reports were found (n = 10, n% = 2.82%) regarding the general positive contribution of New Technologies in education and getting students familiar with them in a pedagogical context (Participant 63: “It would act as a motivation to engage with technology, and students would acquire digital skills”). Also, reports (n = 8, n% = 2.25%) of ChatGPT’s beneficial role in developing students’ critical thinking and ability were detected (Participant 41: “Still, through the proper use of AI, it can help the student, the cultivation of critical thinking and reflection”).
On the other hand, there were fewer reports (n = 4, n% = 1.13%) about its positive impact on the development of group collaboration among students to implement art projects, which implies the enhancement of dialogue in the classroom (Participant 18: “It can help in teaching complex topics in the process of group projects, where students”). Few references (n = 2, n% = 0.56%) were also made to themes related to its pedagogical use in the arts to reduce anachronistic perceptions and stereotypes (Participant 2: “They listen, see, read the opinion of a robot who may not have prejudices and stereotypes like others around him”).
Finally, one teacher (n = 1, n% = 0.28%) mentioned the help that written responses can provide to students with special educational needs and/or disabilities (Participant 55: “Mostly it can help deaf children because of the AI technology that gives written text so they can read it. Writing can help them”). However, a variety of concerns were also expressed (n = 178, n% = 50.14%), which related to several areas (10 total codes), as presented in Table 2.
Among the concerns, the most frequent was the mention of the potential risk of students using ChatGPT to copy and avoid personal effort when solving exercises and homework assignments (n = 44, n% = 12.39%, Participant 10: “Without a clear orientation of the use of ChatGPT, the results would be disastrous as they would use it purely as a tool for copying and easy solving”). Nearly as frequent, according to the teachers, was the possibility that ChatGPT may contribute to passive thinking and a reduction in students’ critical faculties (n = 36, n% = 10.14%, Participant 7: “Students will become passive, they will stop thinking”). In addition, several concerns (n = 21, n% = 5.92%) were expressed about the misinformation it may provide (Participant 60: “ChatGPT gives wrong information even in its advanced form in answers that it gives when you ask it something”).
On the other hand, there were also several reports (n = 16, n% = 4.51%) that mentioned a potential risk of reducing students’ creativity in art classes (Participant 24: “There are definitely a lot of negatives that affect my attitude to art lessons at school. For example, there is a risk of resorting to ready-made production, lack of authenticity and creativity, and standardization—and these are just some of the things that come up”).
There were also equal reports (n = 16, n% = 4.51%) about the potential negative impact on students’ participation, engagement, and interest in the course (Participant 1: “It will probably superficially enhance students’ engagement and then they will lose interest”). There was also reference to the potential change in the role of teachers with the entry of ChatGPT into education, particularly in the arts sector (n = 15, n% = 4.23%, Participant 51: “Maybe it will degrade the teacher’s work; our role will be reduced and degraded, I think, in the future”). A minority of reports (n = 12, n% = 3.38%) identified concerns about privacy, the possibility of plagiarism, and copyright infringement in the arts sector (Participant 44: “My concern is about the data that is taken from us and how it can be used by ChatGPT”).
A possible disadvantage of ChatGPT was pointed out as the lack of human communication (n = 9, n% = 2.54%), which was considered by some to be an integral and irreplaceable part of the educational process that cannot be provided through ChatGPT (Participant 13: “I am afraid that there will be an attachment to it, with use for purposes not so desirable, i.e., incomplete communication and classroom discussion with students”).
A smaller percentage of reports (n = 8, n% = 2.25%) were associated with the consequences that students’ use of ChatGPT may have on their reading and writing skills, grammar, syntax, and vocabulary diversity (Participant 66: “Sometimes what is listed as answers is simplistic though and not very helpful in their discourse and in helping students to delve into art issues”). Finally, one teacher (n = 1, n% = 0.28%) mentioned as a disadvantage the standardized nature of the answers given by ChatGPT, which cannot be beneficial and rather takes on a negative connotation for students who are part of the special education and training context (Participant 64: “The disadvantages have to do with the fact that a teaching is entirely based on it without taking into account the individual needs of children, especially those with learning difficulties”).
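For readers cross-checking the frequency figures above, each reported n% is simply a code’s share of all behavioral-belief references. The following minimal Python sketch is illustrative only and is not the authors’ analysis code; the total of 355 behavioral-belief references, as well as the function and constant names, are inferences and assumptions (the total is back-calculated from the reported figures, e.g., 178 concern references corresponding to 50.14%).

```python
def relative_frequency(n: int, total: int) -> float:
    """Return a code's share of all references, as a percentage rounded to 2 d.p."""
    return round(100 * n / total, 2)

# Inferred from the reported figures (178 concerns = 50.14%); not stated explicitly.
TOTAL_BEHAVIORAL_REFERENCES = 355

# Reproduce a few of the reported n% values from the counts.
assert relative_frequency(178, TOTAL_BEHAVIORAL_REFERENCES) == 50.14  # all concerns
assert relative_frequency(44, TOTAL_BEHAVIORAL_REFERENCES) == 12.39   # copying risk
assert relative_frequency(15, TOTAL_BEHAVIORAL_REFERENCES) == 4.23    # creativity benefit
```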

3.2. Normative Beliefs

The second axis of the analysis refers to teachers’ normative beliefs, which were collected in 296 reports and distributed into 2 themes and 15 codes (Table 3 and Table 4). According to Ajzen’s (1991) Theory of Planned Behavior, this axis comprises what teachers reported regarding the potential support or rejection they might encounter from others. Based on the results (Table 3), teachers made several reports of potential acceptance by others (n = 112, n% = 37.84%). However, of the total reports (n = 296), reports of acceptance by others (Table 3) were significantly fewer (37.84% vs. 62.16%) than those of rejection by others (Table 4).
Specifically, the largest share of reports (n = 29, n% = 9.80%) focused on personal indifference to whether others agree with the use of ChatGPT in art classes, with teachers emphasizing that their decision on whether to use it does not depend on the consent or approval of others (Participant 29: “Since it is a tool that can be used to make the class more engaging or useful for what I want to teach the children, I am not so influenced by the educational community”).
Furthermore, some highlighted the possibility of being met with enthusiastic acceptance by their own students (n = 26, n% = 8.78%, Participant 4: “Students would accept ChatGPT, I believe, with great pleasure as something new”). Reports were also identified (n = 19, n% = 6.42%) of support from individuals in the educational community who are generally in favor of the integration and pedagogical use of New Technologies in education. For instance, one teacher noted (Participant 11), “I believe I will receive support from the educational community and those who support New Technologies”.
Support also seems to be expected from those who see ChatGPT as an important supportive tool in education (n = 18, n% = 6.08%, Participant 20: “Probably positively would be seen by colleagues and the principal as a facilitator”). Several references focused on the support the teacher might receive from those who view ChatGPT with a willingness to innovate in education (n = 13, n% = 4.39%, Participant 40: “It would have huge acceptance, because most people today are into the clichéd way of teaching that is tried and trusted, but in a school that is faltering. We want innovation”).
Finally, some teachers linked its use and acceptance to institutional support (n = 7, n% = 2.36%) and to the need for support from the state, with the definition of clear rules of use (Participant 41: “We need training”). However, teachers also expressed their beliefs regarding the possible resistance they might encounter in their attempts to pedagogically use ChatGPT in lessons involving the arts. More specifically, some factors (n = 184, n% = 62.16%) that could contribute to rejection by others were articulated (Table 4).
Specifically, the main reason for the rejection that teachers believe they may encounter seems to be a lack of knowledge among colleagues, parents, and the wider educational community (n = 39, n% = 13.18%) (Participant 31: “I think I will generally encounter resistance from those who are not familiar”). Nearly as frequent were references to the generally skeptical attitude of parents towards ChatGPT (n = 34, n% = 11.49%, Participant 11: “Other parents may be suspicious of ChatGPT because of the risks”).
Almost equal reports (n = 33, n% = 11.15%) were also related to rejection by individuals who are, in general, dismissive towards the use of not only GenAI but also New Technologies in general (Participant 15: “In general, there is a reservation in the use of digital media and new technologies in teaching”).
To the above-mentioned factors is added the teacher’s own personal fear of breaking with the ‘important others’—colleagues, parents, principals, students themselves, and the wider educational community in general (n = 25, n% = 8.45%, Participant 17: “It affects the opinion of others, and a lot, because you can’t keep coming to a conflict”). In addition to concerns about parents’ skepticism, a similar code emerged for their colleagues (n = 21, n% = 7.09%, Participant 52: “Most teachers I have dealt with express some reservation, it’s still new, which I would agree with”).
On the other hand, a belief was expressed that there may be rejection from others due to ethical issues that may arise (n = 20, n% = 6.76%, Participant 2: “Clearly others will see it negatively. We already see that our students bring ready-made exercises to every lesson, and we cannot prevent them”). Among the reports, some focused on the resistance that educators may encounter in using ChatGPT in arts classes, regarding the reduction in students’ critical thinking (n = 9, n% = 3.04%) (Participant 33: “There will be resistance from some, mostly out of fear that it replaces critical thinking in humans”) and their creativity.
Fewer reports were found of issues of resistance or indifference from the students themselves (n = 2, n% = 0.68%), who “will take it as a joke”, as noted by Participant 16. Finally, one report was detected from a teacher (n = 1, n% = 0.34%) who mentioned resistance from those who are afraid of any mistakes that ChatGPT might make and discourage its use (Participant 46: “They will see it negatively, because it makes mistakes”).

3.3. Control Beliefs

The third axis of the analysis refers to teachers’ control beliefs (potential facilitators and barriers) regarding the pedagogical use of ChatGPT in art lessons. In their responses, 367 references were detected, divided into two distinct themes and 12 sub-codes, as presented in Table 5 and Table 6.
According to the above results, teachers made many references to potential factors that could facilitate the use of ChatGPT in education, especially in art classes such as literature, drama, music, and painting (n = 161, n% = 43.87%). It should be noted that references to helpful factors (Table 5) were somewhat fewer than references to potential barriers (Table 6, n = 206, n% = 56.13%).
Specifically, several references were made during the interviews to issues related to the availability of training/resources (n = 57, n% = 15.53%, Participant 26: “We have, I think, enough facilities in the school, so I don’t think we have a problem, so we can use it, we have the resources”).
Furthermore, in several reports (n = 53, n% = 14.44%), the helpful role of institutional support was pointed out, meaning the presence of a clear framework of rules for the use and proper pedagogical exploitation of ChatGPT by students and also by teachers (Participant 52: “It is important, and it will help to put a law framework in AI. With clear instructions and guidance, it would be easier to use it”).
In addition, the personal confidence of each teacher that they will be able to use ChatGPT in arts lessons was also identified and recorded as a helpful factor (n = 35, n% = 9.54%, Participant 24: “I also feel quite a lot of personal confidence that I can use it myself”). Finally, it seemed that previous positive experiences of teachers (n = 16, n% = 4.36%), both with ChatGPT and other GenAI tools, could help (Participant 25: “I don’t have any experience of using it in the classroom, but I am positively influenced by what I have heard and read. I have even read about risks, and old movies I had seen in the past about AI”).
On the other hand, teachers also mentioned some barriers that they felt might hinder their eventual willingness or desire to use ChatGPT in arts classes (n = 206, n% = 56.13%). The references to barriers that were detected were broken down into eight sub-codes, as shown below in Table 6.
In particular, the lack of knowledge and experience in using ChatGPT in the arts was identified as a key barrier (n = 83, n% = 22.62%, Participant 7: “I don’t know how it can be used for school arts-related lessons and the potential risks of using it”). A second reported barrier (n = 53, n% = 14.44%) is the limited technological access in Greek schools (Participant 10: “Well, let’s be honest. How to use it? Due to the lack of basic and necessary technological tools in Greek schools…”).
Another barrier mentioned by teachers focused on the negative attitude of their own students (n = 19, n% = 5.18%), who either will not treat ChatGPT as a useful educational tool or will not approach it with due attention and seriousness (Participant 23: “Keep in mind as a barrier that students will not engage”). Furthermore, they mentioned negative influence or discouragement from members of the educational community (n = 17, n% = 4.63%). For example, Participant 34: “One barrier I think I will encounter if I go to use ChatGPT in my classroom in art classes is resistance from older colleagues who may not reject it”.
The teachers’ responses also revealed references to barriers that could be caused by personal resistance and distrust of ChatGPT and GenAI in general (n = 15, n% = 4.09%, Participant 4: “One barrier to using it might be me. Well, I know about ChatGPT, but I wouldn’t want it to be used in literature, like in any other art. I’m not convinced it will be useful”).
Fewer reports were also identified about barriers that may exist due to ChatGPT’s own lack of credibility, errors, weaknesses, and failures, which may ultimately discourage its pedagogical use in arts classes (n = 9, n% = 2.45%, Participant 16: “One barrier is also that ChatGPT doesn’t have enough knowledge about the arts and many topics related to the arts”). In addition, teachers reported the difficulty of integrating it into the existing educational system and curriculum as a barrier (n = 8, n% = 2.18%, Participant 13: “A challenge and a barrier is to organize my lesson that way and include ChatGPT in the context of a live art lesson. That, I think, would be a difficulty”). Finally, there were two reports from teachers about concerns regarding privacy and copyright issues that might arise in the pedagogical use of ChatGPT in lessons involving the arts (n = 2, n% = 0.54%) (Participant 41: “As a barrier we could think, I think, about different copyright issues, especially in the field of art and in lessons related to art. There would be an issue with copyright and authors in areas such as painting, music, visual arts, poetry”).

4. Discussion

Moving on to a critical discussion of the research results, it becomes clear that the Theory of Planned Behavior (TPB) provides a robust framework for interpreting teachers’ intentions and actions regarding the integration of generative AI in arts education. According to Ajzen (1991), the TPB posits that intention, the most immediate antecedent of behavior, is shaped by three core components: behavioral beliefs, normative beliefs, and control beliefs. In this study, each of these constructs offers valuable insight into the complexities underlying teachers’ decision-making processes.

4.1. Expected Benefits and Challenges

Starting with behavioral beliefs, the most important benefit that teachers expect from the use of ChatGPT in the arts is increased interest, stronger student participation, and, consequently, greater involvement in the educational process. This finding is in line with the beliefs of the teachers who participated in Ali et al.’s (2023) study, who likewise expected increased engagement and motivation from the use of ChatGPT, while Elbanna and Armstrong (2023) and Kasneci et al. (2023) also agree that the pedagogical utilization of ChatGPT in literature lessons can have multiple benefits for students. Finally, an increase in interest when ChatGPT was integrated into the educational process was also noted by K. Guo et al. (2023).
A second main benefit that teachers expect from ChatGPT concerns its contribution to research, given the rapid and often apt responses they find it provides. This ability, they believe, will stimulate students’ inquisitiveness and curiosity. Indeed, its potential contribution to research is also identified in the literature, with ChatGPT suggesting unexplored topics, approaches, and research questions and offering new possibilities, particularly in higher education (Farrokhnia et al., 2023; Kasneci et al., 2023). Furthermore, the literature review by Sallam (2023) also highlighted the tool’s contribution to research through authoring and improving scientific texts. In fact, some teachers who participated in this study also mentioned its ability to provide information directly. Despite occasional erroneous information, ChatGPT has a substantial knowledge base that can be utilized for research purposes, as confirmed by the research of Choi et al. (2023), in which ChatGPT passed law school examinations. Finally, in a study of higher education teachers’ beliefs about ChatGPT, Mohamed (2023) recorded their belief that it is highly beneficial for research purposes, providing accurate answers quickly and directly, which was confirmed by the results of this study.
A third positive factor highlighted by teachers was its use as an assistive tool, providing ideas, exercises, and teaching scenarios for arts lessons and saving them valuable time. Indeed, the above statement is repeated in the literature in several studies and is supported by several researchers. For instance, Kasneci et al. (2023) report that ChatGPT can provide exercises and problems for students, create lesson scenarios, as well as notes and supporting materials, a finding that is also supported by Elbanna and Armstrong (2023). Lou (2023) found that teachers who teach English use ChatGPT as an assistant to prepare their teaching and suggest effective teaching strategies, a possibility also highlighted by Rahman and Watanobe (2023).
Another benefit, according to the teachers, is the tool’s contribution to increasing students’ creativity in lessons involving the arts. For example, it was reported that, based on ChatGPT’s textual descriptions, students can be inspired and creative, with the tool suggesting or even drafting poems and plays, a belief also reported by K. Guo et al. (2023).
Teachers also highlighted the potential benefit for improving students’ reading and writing skills and vocabulary range, both through correcting errors in ChatGPT’s output and through the positive influence of the extensive responses it provides, which often display adequate vocabulary and exemplary structure. However, in Ali et al.’s (2023) study, a “neutral attitude” of teachers and students was found toward the effect that ChatGPT can have on listening and speaking skills, while Kasneci et al. (2023), Su et al. (2023), and K. Guo et al. (2023) emphasize that, through syntactic and grammatical corrections, as well as by selecting the right questions to engage in dialogue with the tool, students’ writing and reading skills, as well as their argumentative literacy, can be fostered. Especially in the arts, Kangasharju et al. (2022) report that the ability of such GenAI tools to write poems can be particularly beneficial in this field.
The need for interventions and corrections in ChatGPT-generated text is expected, according to the teachers, to increase the critical thinking of students, who will be called upon to distinguish the important from the unimportant and the right from the wrong. This belief is in line with Bitzenbauer (2023), who utilized ChatGPT in physics teaching and observed an enhancement of students’ critical thinking, as well as with Kasneci et al. (2023).
On the other hand, as revealed in the interviews, another expected benefit is the improvement of students’ teamwork and the enhancement of classroom dialogue. This finding of improved teamwork opportunities with ChatGPT is also verified in the literature (Baidoo-Anu & Owusu Ansah, 2023; Haleem et al., 2022; Kasneci et al., 2023; Rudolph et al., 2023), while research (Roy & Naidoo, 2021; Tlili et al., 2023) has highlighted the strong interactive abilities of chatbots, as humans are often readily persuaded by them and may even prefer conversing with them over real humans.
Finally, participating teachers reported a reduction in stereotypes and support for students with special educational needs and/or disabilities. Regarding the reduction in stereotypes and prejudice from its use in art classes, there is a contradiction with concerns in the literature. In particular, Kasneci et al. (2023) mention the reproduction of stereotypes as a disadvantage of ChatGPT: the information from which it compiles its answers comes from a limited database, and if the texts in that database are biased against a group of people, the answers it produces will reflect the same bias. ChatGPT’s bias against groups can have consequences in education in areas such as student assessment (Sok & Heng, 2023). Particularly in arts courses, and despite its positive contribution, there is a risk of perpetuating stereotypes through art (Ali et al., 2023; Shoufan, 2023). Regarding the support of students with special educational needs and/or disabilities, the literature points to personalized responses as ChatGPT’s main advantage (Baidoo-Anu & Owusu Ansah, 2023; Elbanna & Armstrong, 2023; Farrokhnia et al., 2023; Kasneci et al., 2023; Rahman & Watanobe, 2023; Sallam, 2023). Recent research has further highlighted the potential of AI-based tools to support inclusive education and to address the diverse needs of students, including those with specific learning difficulties and disabilities, such as autism spectrum disorders (Drossinou Korea & Alexopoulos, 2023a, 2023b, 2024a, 2024b).
However, the main consequence of using ChatGPT in the arts, according to the teachers, is the risk of students using it to copy and avoid effort, which is also reflected in surveys by Iqbal et al. (2022) and Waltzer et al. (2023). According to the teachers, copying would have the direct consequence of deactivating students’ critical thinking skills and reducing their creativity. Sullivan et al. (2023) likewise raise the issue of ChatGPT’s lack of artistic originality and creativity. The absence of creativity, the passivity of thinking and its confinement to the ‘ready answer’ solution, and the subsequent shrinkage of writing skills are issues raised by teachers and are in line with concerns expressed in the literature (Kasneci et al., 2023; Pavlik, 2023; Rahman & Watanobe, 2023; Zhai, 2022).
Furthermore, the teachers pointed out the potential risk of misinformation, as ChatGPT draws on specific and limited training data from which it is asked to produce plausible answers that look as if they were written by a human, regardless of whether they correspond to reality; this weakness is also verified in the literature (Choi et al., 2023; Elbanna & Armstrong, 2023; Mohamed, 2023; Rahman & Watanobe, 2023; Tlili et al., 2023). As reported in a similar study by Mohamed (2023), concerns were raised about its consequences for critical thinking, the perpetuation of stereotypes, and misinformation. These findings create a new context of challenges in education. It should be noted, of course, that purposive sampling meant we selected teachers who were already aware of ChatGPT and able to express an opinion about it.
It is also worth pointing out teachers’ fear about the possibility of being replaced or having their role reduced due to GenAI applications such as ChatGPT, while stressing that the human teacher–student relationship is irreplaceable, especially in art lessons, where a human-centered approach is even more needed. This replacement phobia is also confirmed in the literature by Elbanna and Armstrong (2023).
Finally, teachers also mentioned issues such as plagiarism, protection of copyright in art and personal data, and the provision of standardized responses versus the individualization needed to appropriately support students with special educational needs. Indeed, such issues are confirmed by several researchers (Adiguzel et al., 2023; Kasneci et al., 2023; Kohnke et al., 2023; Rahman & Watanobe, 2023; Sallam, 2023) but were also expressed by teachers in similar research by Iqbal et al. (2022).
A deeper analysis of these findings through the lens of the Theory of Planned Behavior (TPB) reveals that teachers’ behavioral beliefs function as both enablers and inhibitors of GenAI adoption. Teachers do not merely articulate positive or negative attitudes; rather, they actively negotiate their professional and personal identities in response to technological innovation. Where anticipated benefits, such as enhanced student engagement and inspiration, are perceived as substantial, a readiness to experiment with GenAI emerges. However, even strong positive beliefs may be neutralized in practice by persistent concerns over creativity, authenticity, and the potential erosion of pedagogical values. This suggests that educators are not passive recipients of innovation but rather active agents engaged in an internal negotiation process regarding their professional role. In this way, the TPB does not merely categorize attitudes but helps explain the dynamic tension and ambivalence underlying teachers’ intentions toward behavioral change.

4.2. Expected Support and Facilitators

As emerged from the teachers’ responses, and based on the possible positive and negative outcomes described above, teachers expect to face acceptance or rejection from the “significant others” of the educational community (students, parents, colleagues, and supervisors), as well as the facilitators and obstacles they think they will encounter. In terms of barriers, particular reference needs to be made to the lack of knowledge about ChatGPT highlighted by the teachers. It should be noted that, among the teachers who volunteered to participate, only the few who were aware of ChatGPT were eventually selected. Even so, this group expressed a lack of awareness of how ChatGPT could be used in art lessons, or a lack of confidence in their ability to use it properly, and highlighted the need for formal training.
In some cases, they even referred to the ignorance and rejection of others as a discouraging factor and stressed that their attitude towards ChatGPT is shaped by personal experimentation and by what they have heard, read, or seen about GenAI. Clearly, as the teachers pointed out, training is needed: it is striking that even these participants, who through purposive sampling were knowledgeable about and familiar with ChatGPT, expressed uncertainty and requested formal institutional guidance accompanied by technical support.
The analysis of normative beliefs demonstrates the fluidity and complexity of the social pressures experienced by teachers in relation to GenAI adoption. The data show that teachers can simultaneously perceive enthusiastic acceptance from students and marked hesitation or even resistance from parents and colleagues. Within the TPB framework, this translates into a “hybrid” subjective norm, in which the final behavioral intention is the outcome of a continual negotiation among competing social expectations. This insight extends the interpretive power of the TPB; rather than focusing on a dominant influence, it becomes crucial to examine how teachers weigh and prioritize the various social voices that shape their professional landscape. The process is not static but an ongoing balancing act between personal convictions and the desire for external approval or legitimacy.
Teachers’ control beliefs are not a static acknowledgment of available or missing resources; they represent a dynamic catalyst in the transformation of intention into actual behavior. When teachers perceive that they have sufficient support, training, and infrastructure, their sense of agency translates into a genuine willingness to experiment with and adopt GenAI. In contrast, when uncertainty or institutional barriers prevail, even strong positive attitudes or supportive social norms are insufficient to foster behavioral intention. Interpreted through the TPB, perceived behavioral control does not operate in isolation but systematically interacts with both attitudinal and normative components, either amplifying or constraining their effect. In this context, control emerges as the pivotal moderator that determines whether favorable dispositions or norms are ultimately converted into action.
Taken together, interpreting the findings through the TPB lens highlights that teachers’ intention to adopt GenAI is not simply the result of an additive process of positive and negative influences. Instead, it is a dynamic negotiation field, where each belief component interacts with the others, strong individual convictions may be outweighed by social pressures, while perceived control often acts as the ultimate regulator of decision-making. In this way, the TPB proves valuable not only as a categorization framework but as a tool for understanding the complex psychosocial mechanisms underlying behavioral change in education.

4.3. Implications for Practice and Policy

This research encourages the design of professional development programs focusing on the proper use of ChatGPT in arts teaching. Institutional support through targeted training programs is expected to soften teachers’ resistance and reservations in order to promote the integration of ChatGPT into education. Furthermore, emphasis needs to be placed on the adequacy of technological equipment in schools.

4.4. Limitations of This Study and Suggestions for Future Research

First of all, the limitations of this research stem from the methodology. Qualitative research lends itself to exploration at an initial level, when much remains unknown and research in the field is at an early stage, but it is not suitable for generalized conclusions. Qualitative research was preferred here because teachers’ beliefs about ChatGPT, especially regarding the arts, are relatively unexplored in the literature; the inability to draw general conclusions is therefore noted as a limitation. Accordingly, the sample size is sufficient for a qualitative approach but cannot safely support universal generalizations or quantification. Another sample-related limitation is that the selected teachers were, through purposive sampling, aware of ChatGPT, but not all of them had used it pedagogically in the classroom. Finally, the scarcity of prior research on teachers’ beliefs about the pedagogical use of ChatGPT in the arts, on which this research could build, was itself a limitation.
The research could be extended to larger samples in order to conduct quantitative studies and draw conclusions with generalizable validity. It would be interesting to repeat it using a questionnaire rather than semi-structured interviews, or to broaden the sample beyond primary and secondary education to higher education, especially among higher education teachers who teach arts courses (for instance, investigating the beliefs of teachers in the country’s theatre or literature departments). In this way, comparisons could also be drawn regarding differences in beliefs within higher education. The codes derived per belief category, based on the Theory of Planned Behavior, during this first qualitative approach can be used as variables in subsequent quantitative studies in order to draw general conclusions.
Furthermore, this study could be extended to investigate the students’, parents’, or principals’ beliefs about ChatGPT and its relation to the arts sector or to identify the research area in a specific educational context or course. Finally, as ChatGPT is continually being updated, there is a need for ongoing review and updating of the research results.

5. Conclusions

The purpose of this study was to investigate teachers’ beliefs regarding the pedagogical use of ChatGPT in arts-related lessons and how these beliefs shape their decision to use ChatGPT for pedagogical purposes. As revealed by the semi-structured interviews, the main positive outcomes that teachers expect focus on enhancing student participation and interest, speeding up the research process, using ChatGPT as a teacher’s assistant, and enhancing teacher and student creativity.
On the other hand, the main negative effects expected by teachers, which impact their attitude towards ChatGPT’s use in the arts, are students using it for copying, reduced critical thinking, misinformation, loss of creativity, reduced interest, substitution of the teacher’s role, and violation of copyright and personal data. Teachers stated that they were not influenced by others’ opinions but expect support from enthusiastic students, advocates of New Technologies and GenAI, those who see ChatGPT as an auxiliary tool, innovators, and the Ministry of Education. Conversely, they expect resistance from those unfamiliar with ChatGPT, skeptical parents, opponents of GenAI, themselves due to fear of conflict with the educational community, and skeptical colleagues.
Facilitating factors include the availability of training and resources, institutional support, personal confidence, and prior positive experiences. Barriers include lack of knowledge and experience with ChatGPT, limited technological access, student resistance, discouragement from the educational community, distrust of GenAI, lack of reliability, difficulty of integration into the curriculum, and concerns about data breaches. According to Ajzen’s (1991) Theory of Planned Behavior, these beliefs influence teachers’ decisions regarding the use of ChatGPT in arts classes.

Author Contributions

The authors collaborated on all parts of the research, including study design, data collection, data analysis, and writing of the results. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The research was conducted within the framework of a master’s thesis approved by the Department of Primary Education of the University of the Aegean.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data supporting the findings of this study are available from the corresponding author upon reasonable request. Due to the qualitative nature of the data and the format of participant consent, the full dataset is not publicly shared to respect participants’ privacy and maintain the integrity of the research process.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Semi-Structured Interview Guide

Behavioral beliefs:
  • How do you think the use of ChatGPT would affect teaching in the courses you teach?
  • What do you think are the potential educational benefits of using ChatGPT in teaching?
  • What do you think are the potential drawbacks or ethical issues of using ChatGPT in teaching?
  • What impact would the use of ChatGPT have on your students’ engagement?
  • Can you describe some of your experiences or observations regarding the potential effects of ChatGPT in arts-related courses?
Normative beliefs:
  • How do you think your colleagues perceive the educational use of ChatGPT in the arts? Positively or negatively? Please justify your opinion.
  • How do you think your students and their parents would react to the use of artificial intelligence, such as ChatGPT, in your classroom?
  • Do you think you would encounter support or resistance from the wider educational community (colleagues, school management, parents, etc.) if you used ChatGPT in the classroom in arts-related lessons? Why?
  • How does the opinion of the wider educational community (colleagues, school administration, parents, etc.) on AI in education influence your beliefs about using ChatGPT in your teaching practice?
Control beliefs:
  • What challenges or obstacles do you expect to encounter when using ChatGPT in arts-related courses?
  • How confident are you in your ability to effectively use AI tools such as ChatGPT in your teaching?
  • What resources or what kind of support would you need to feel capable of effectively utilizing ChatGPT in your classroom?
  • How do you assess the possibility of utilizing ChatGPT in relation to the current technological infrastructure at your school?
  • Can you describe any personal experiences that influence your perception of your ability to use AI in the classroom?
    Demographics:
    Gender:
    Years of service:
    Level of education:
    Specialization:
    Employment status: Permanent, substitute, part-time.
    Education: Bachelor’s degree, second bachelor’s degree, master’s degree, doctorate.

Appendix B. Interview Coding Description

Behavioral Beliefs
1. Positive outcomes
  • Increase participation—interest: All reports that mentioned an expected increase in student interest and, consequently, their engagement in the course through the use of ChatGPT.
  • Faster research process: All reports that mentioned facilitating research and increasing students’ interest in research and their curiosity, due to the speed of responses.
  • Teacher’s assistant: All reports that mentioned the use of ChatGPT as a teaching aid in various areas of art classes (e.g., providing ideas, exercises, teaching scenario texts, saving time, etc.).
  • Foster creativity: All reports that mentioned enhancing students’ creativity in the arts (e.g., providing textual descriptions for creating visual art or writing poetry, etc.).
  • Improving writing skills: All reports that mentioned the enhancement of students’ writing and reading skills through their interaction with ChatGPT (e.g., correcting textual errors, etc.).
  • Familiarity with New Technologies: All reports that mentioned the benefits of familiarizing students with ICT and, in particular, NT (e.g., digital literacy, etc.).
  • Improving critical thinking: All reports that mentioned strengthening students’ critical thinking skills (e.g., by helping them distinguish between correct and incorrect information, etc.).
  • Teamwork—dialogue: All reports that mentioned ChatGPT’s contribution to teamwork and dialogue skills among students.
  • Reducing stereotypes: All reports that mentioned a reduction in stereotypes and prejudices through the use of ChatGPT.
  • Helping students with learning difficulties: All reports that mentioned ChatGPT’s contribution to the field of special education.
2. Negative outcomes
  • Copying: All reports that referred to issues of direct copying of answers.
  • Reducing critical thinking: All reports that mentioned a decline in students’ critical thinking skills due to ChatGPT’s quick, immediate responses.
  • Misinformation: All reports that mentioned misinformation due to incorrect information provided by ChatGPT.
  • Lack of creativity: All reports that mentioned the loss of creativity among students in art classes.
  • Decrease in participation—interest: All reports that mentioned a decline in students’ participation and interest in art classes.
  • Replacing teacher: All reports that mentioned the risk or fear of replacing or downgrading the role of the teacher.
  • Copyright—data: All reports that referred to ethical issues such as copyright, plagiarism, and personal data.
  • Lack of human communication: All reports that mentioned the danger of poor human communication.
  • Reduction in writing skills: All reports that mentioned a decline in students’ writing and reading skills.
  • Lack of personalization: All reports that mentioned the impact on students with special educational needs and/or disabilities.
Normative Beliefs
1. Acceptance from others
  • Personal indifference: All reports that referred to personal independence from the acceptance or rejection of others.
  • Students’ enthusiasm: All reports that mentioned the acceptance of students due to enthusiasm.
  • People supporting GenAI: All reports that mentioned acceptance by those who support GenAI and NT.
  • Those who find it helpful: All reports that mentioned acceptance by those who consider it a useful tool.
  • Those who find it innovative: All reports that mentioned acceptance by those who consider it an important innovation in education.
  • Institutional support: All reports that mentioned the use of ChatGPT with the acceptance and cooperation of the State.
2. Rejection from others
  • Rejection due to lack of knowledge: All reports that mentioned rejection by those who are unfamiliar with ChatGPT.
  • Skeptical parents: All reports that mentioned parents’ reservations about ChatGPT.
  • People against technology/GenAI: All reports that mentioned rejection by those who are generally opposed to NT and GenAI.
  • Personal fear of conflict: All reports that mentioned personal rejection of ChatGPT due to fear of possible conflict with or rejection by the educational community (e.g., colleagues, parents, administration, etc.).
  • Co-workers skeptical about ChatGPT: All reports that mentioned rejection by fellow teachers.
  • Ethical concerns of parents/co-workers: All reports that mentioned rejection by others due to concerns about ethical issues (e.g., data protection, rights, etc.).
  • People concerned about reduction in critical thinking: All reports that mentioned rejection by those who are concerned that students’ critical thinking skills will be diminished.
  • Rejection from students: All reports that mentioned rejection, lack of interest, or indifference from students.
  • Those worried about mistakes: All reports that mentioned rejection or reservations on the part of those concerned about the inaccuracy of ChatGPT’s responses (misinformation).
Control Beliefs
1. Facilitators
  • Training availability/resources: All reports that considered the existing training and technical support to be helpful.
  • Institutional support: All reports that found institutional support from the ministry to be helpful.
  • Personal confidence: All reports that mentioned personal confidence and certainty as a contributing factor.
  • Positive previous experiences: All reports that mentioned teachers’ previous positive experiences as a contributing factor.
2. Barriers
  • Lack of knowledge/experience: All reports that considered personal lack of knowledge and experience with ChatGPT to be an obstacle.
  • Limited technological access: All reports that considered the school’s technical equipment to be inadequate.
  • Student resistance: All reports that considered students’ indifference or resistance to ChatGPT in the arts to be an obstacle.
  • Influence from significant others: All reports that mentioned the non-utilization of ChatGPT due to influence from the wider educational community.
  • Distrust of GenAI capabilities: All reports that mentioned non-use of ChatGPT due to personal distrust of its capabilities in art classes.
  • Lack of reliability: All reports that mentioned the inaccuracy of ChatGPT’s responses as an obstacle to its pedagogical use.
  • Integration into existing system: All reports that mentioned difficulties in integrating ChatGPT into the existing curriculum of arts-related courses.
  • Concerns about data: All reports that mentioned obstacles due to ethical issues (e.g., personal data, rights, etc.).

References

  1. Adiguzel, T., Kaya, M. H., & Cansu, F. K. (2023). Revolutionizing education with AI: Exploring the transformative potential of ChatGPT. Contemporary Educational Technology, 15(3), ep429.
  2. Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50(2), 179–211.
  3. Ajzen, I. (2019). Theory of planned behavior diagram. Icek Ajzen. Available online: https://people.umass.edu/aizen/tpb.diag.html (accessed on 28 April 2025).
  4. Ali, J. K., Shamsan, M. A., Hezam, T. A., & Mohammed, A. A. (2023). Impact of ChatGPT on learning motivation: Teachers and students’ voices. Journal of English Studies in Arabia Felix, 2(1), 41–49.
  5. Baidoo-Anu, D., & Owusu Ansah, L. (2023). Education in the era of Generative Artificial Intelligence (AI): Understanding the potential benefits of ChatGPT in promoting teaching and learning. SSRN Electronic Journal.
  6. Bergdahl, N., & Sjöberg, J. (2025). Attitudes, perceptions and AI self-efficacy in K-12 education. Computers and Education: Artificial Intelligence, 8, 100358.
  7. Bitzenbauer, P. (2023). ChatGPT in physics education: A pilot study on easy-to-implement activities. Contemporary Educational Technology, 15(3), ep430.
  8. Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101.
  9. Chen, R., Lee, V. R., & Lee, M. G. (2025). A cross-sectional look at teacher reactions, worries, and professional development needs related to generative AI in an urban school district. Education and Information Technologies, 1–38.
  10. Choi, J. H., Hickman, K. E., Monahan, A., & Schwarcz, D. B. (2023). ChatGPT goes to law school. SSRN Electronic Journal, 71, 387.
  11. Cohen, L., Manion, L., & Morrison, K. (2017). Research methods in education (8th ed.). Routledge.
  12. Dimeli, M., & Kostas, A. (2025). The role of ChatGPT in education: Applications, challenges: Insights from a systematic review. Journal of Information Technology Education: Research, 24, 1–30.
  13. Drossinou Korea, M., & Alexopoulos, P. (2023a). Informal pedagogical assessment in inclusive secondary education for a student with autism. International Journal of Social Science and Human Research, 6(5), 2507–2515.
  14. Drossinou Korea, M., & Alexopoulos, P. (2023b). Observation methodology, informal pedagogical assessment, and school integration in a student with autism spectrum disorder. International Journal of Social Science and Human Research, 6(2), 775–784.
  15. Drossinou Korea, M., & Alexopoulos, P. (2024a). Factors arising from the utilization of artificial intelligence and large language models in special education and training. European Journal of Special Education Research, 10(2), 1–16.
  16. Drossinou Korea, M., & Alexopoulos, P. (2024b). Higher education students’ views on the use of artificial intelligence for teaching students with specific learning difficulties. European Journal of Open Education and E-learning Studies, 9(1), 73–90.
  17. Elbanna, S., & Armstrong, L. (2023). Exploring the integration of ChatGPT in education: Adapting for the future. Management & Sustainability: An Arab Review, 3(1), 16–29.
  18. Farrokhnia, M., Banihashem, S. K., Noroozi, O., & Wals, A. (2023). A SWOT analysis of ChatGPT: Implications for educational practice and research. Innovations in Education and Teaching International, 61(3), 460–474.
  19. Fokides, E., & Peristeraki, E. (2025). Comparing ChatGPT’s correction and feedback comments with that of educators in the context of primary students’ short essays written in English and Greek. Education and Information Technologies, 30(2), 2577–2621.
  20. Goodman, L. A. (1961). Snowball sampling. Annals of Mathematical Statistics, 32(1), 148–170.
  21. Guo, C., Lu, Y., Dou, Y., & Wang, F.-Y. (2023). Can ChatGPT boost artistic creation: The need of imaginative intelligence for parallel art. IEEE/CAA Journal of Automatica Sinica, 10(4), 835–838.
  22. Guo, K., Zhong, Y., Li, D., & Chu, S. K. (2023). Effects of chatbot-assisted in-class debates on students’ argumentation skills and task motivation. Computers & Education, 203, 104862.
  23. Haleem, A., Javaid, M., & Singh, R. P. (2022). An era of ChatGPT as a significant futuristic support tool: A study on features, abilities, and challenges. BenchCouncil Transactions on Benchmarks, Standards and Evaluations, 2(4), 100089.
  24. Iqbal, N., Ahmed, H., & Azhar, K. A. (2022). Exploring teachers’ attitudes towards using ChatGPT. Global Journal for Management and Administrative Sciences, 3(4), 97–111.
  25. Kan, M. P. H., & Fabrigar, L. R. (2017). Theory of planned behavior. In V. Zeigler-Hill, & T. Shackelford (Eds.), Encyclopedia of personality and individual differences. Springer.
  26. Kangasharju, A., Ilomäki, L., Lakkala, M., & Toom, A. (2022). Lower secondary students’ poetry writing with the AI-based poetry machine. Computers and Education: Artificial Intelligence, 3(1), 100048.
  27. Kasneci, E., Sessler, K., Küchemann, S., Bannert, M., Dementieva, D., Fischer, F., Gasser, U., Groh, G., Günnemann, S., Hüllermeier, E., Krusche, S., Kutyniok, G., Michaeli, T., Nerdel, C., Pfeffer, J., Poquet, O., Sailer, M., Schmidt, A., Seidel, T., … Kasneci, G. (2023). ChatGPT for good? On opportunities and challenges of large language models for education. Learning and Individual Differences, 103, 102274.
  28. Kohnke, L., Moorhouse, B. L., & Zou, D. (2023). ChatGPT for language teaching and learning. RELC Journal, 54(2), 537–550.
  29. Kostas, A., & Chanis, C. (2024). ChatGPT and AI in K-12 education: Views and practices of Greek teachers. In E. Varonis, & A. Pyrini (Eds.), Proceedings of the international conference on ICT in education (ICICTE2024) (pp. 198–208). ICICTE. Available online: http://icicte.org/assets/15-kostas2.pdf (accessed on 24 April 2025).
  30. Koulianou, M., Dumitru, C., Mastrothanasis, K., Stan, M. M., & Samartzi, S. (2025). Are teachers metacognitively aware while teaching? A cross-cultural analysis of Greek and Romanian teachers. Reflective Practice, 1–15.
  31. Lou, Y. (2023). Exploring the application of ChatGPT to English teaching in a Malaysia primary school. Journal of Advanced Research in Education, 2(4), 47–54.
  32. Mason, J. (2017). Qualitative researching (3rd ed.). Sage Publications.
  33. Mastrothanasis, K., & Alexopoulos, P. (2025). Educational research and the questionnaire as a data collection tool. Educational Paths, 1(2), 106–130. (In Greek)
  34. Mastrothanasis, K., & Kladaki, M. (2020). The involvement of children in the arts during their leisure time. Asian Journal of Language, Literature and Culture Studies, 3(2), 10–19.
  35. Mastrothanasis, K., Kladaki, M., & Andreopoulou, A. (2025a). The role of psychodrama in the prevention of cyberbullying: School psychologist’s perspective. In T. Touloupis, A. (Loizos) Sofos, & A. Vasiou (Eds.), Students’ online risk behaviors: Psychoeducational predictors, outcomes, and prevention (pp. 367–391). IGI Global.
  36. Mastrothanasis, K., Pikoulis, E., Kladaki, M., Pikouli, A., Karamagioli, E., & Papantoniou, D. (2025b). Digital drama-based interventions in emergency remote teaching: Enhancing bilingual literacy and psychosocial support during polycrisis. Psychology International, 7(2), 53.
  37. Mertler, C. A. (2018). Introduction to educational research (2nd ed.). Sage Publications.
  38. Mizumoto, A., & Eguchi, M. (2023). Exploring the potential of using an AI language model for automated essay scoring. Research Methods in Applied Linguistics, 2(2), 100050.
  39. Mohamed, A. M. (2023). Exploring the potential of an AI-based chatbot (ChatGPT) in enhancing English as a foreign language (EFL) teaching: Perceptions of EFL faculty members. Education and Information Technologies, 29(3), 3195–3217.
  40. Pavlik, J. V. (2023). Collaborating with ChatGPT: Considering the implications of generative artificial intelligence for journalism and media education. Journalism & Mass Communication Educator, 78(1), 84–93.
  41. Rahman, M. M., & Watanobe, Y. (2023). ChatGPT for education and research: Opportunities, threats, and strategies. Applied Sciences, 13(9), 5783.
  42. Roy, R., & Naidoo, V. (2021). Enhancing chatbot effectiveness: The role of anthropomorphic conversational styles and time orientation. Journal of Business Research, 126, 23–34.
  43. Rudolph, J., Tan, S., & Tan, S. (2023). ChatGPT: Bullshit spewer or the end of traditional assessments in higher education? Journal of Applied Learning & Teaching, 6(1), 342–363.
  44. Sallam, M. (2023). ChatGPT utility in healthcare education, research, and practice: Systematic review on the promising perspectives and valid concerns. Healthcare, 11(6), 887.
  45. Sáez-Velasco, S., Alaguero-Rodríguez, M., Delgado-Benito, V., & Rodríguez-Cano, S. (2024). Analysing the impact of generative AI in arts education: A cross-disciplinary perspective of educators and students in higher education. Informatics, 11(2), 37.
  46. Shoufan, A. (2023). Exploring students’ perceptions of ChatGPT: Thematic analysis and follow-up survey. IEEE Access, 11, 38805–38818.
  47. Smits, J., & Borghuis, T. (2022). Generative AI and intellectual property rights. In B. Custers, & E. Fosch-Villaronga (Eds.), Law and artificial intelligence (Vol. 35, pp. 391–408). T.M.C. Asser Press.
  48. Sok, S., & Heng, K. (2023). ChatGPT for education and research: A review of benefits and risks. Cambodian Journal of Educational Research, 3(1), 110–121.
  49. Su, Y., Lin, Y., & Lai, C. (2023). Collaborating with ChatGPT in argumentative writing classrooms. Assessing Writing, 57, 100752.
  50. Sullivan, M., Kelly, A., & McLaughlan, P. (2023). ChatGPT in higher education: Considerations for academic integrity and student learning. Journal of Applied Learning & Teaching, 6(1), 1–10.
  51. Tlili, A., Shehata, B., Adarkwah, M. A., Bozkurt, A., Hickey, D. T., Huang, R., & Agyemang, B. (2023). What if the devil is my guardian angel: ChatGPT as a case study of using Chatbots in education. Smart Learning Environments, 10(1), 15.
  52. Waltzer, T., Cox, R. L., & Heyman, G. D. (2023). Testing the ability of teachers and students to differentiate between essays generated by ChatGPT and high school students. Human Behavior and Emerging Technologies, 1, 1–9.
  53. Zhai, X. (2022). ChatGPT user experience: Implications for education. SSRN Electronic Journal, 1–18.
Figure 1. Theory of Planned Behavior diagram (Ajzen, 2019).
Table 1. Positive outcomes.
Code | n | n%
Increase participation—interest | 51 | 14.37
Faster research process | 43 | 12.11
Teacher’s assistant | 33 | 9.30
Foster creativity | 15 | 4.23
Improving writing skills | 10 | 2.82
Familiarity with New Technologies | 10 | 2.82
Improving critical thinking | 8 | 2.25
Teamwork—dialogue | 4 | 1.13
Reducing stereotypes | 2 | 0.56
Helping students with learning difficulties | 1 | 0.28
Total positive references | 177 | 49.85
Total references | 355 | 100
Table 2. Negative outcomes.
Code | n | n%
Copying | 44 | 12.39
Reducing critical thinking | 36 | 10.14
Misinformation | 21 | 5.92
Lack of creativity | 16 | 4.51
Decrease in participation—interest | 16 | 4.51
Replacing teacher | 15 | 4.23
Copyright—data | 12 | 3.38
Lack of human communication | 9 | 2.54
Reduction in writing skills | 8 | 2.25
Lack of personalization | 1 | 0.28
Total negative references | 178 | 50.14
Total references | 355 | 100
Table 3. Acceptance from others.
Code | n | n%
Personal indifference | 29 | 9.80
Students’ enthusiasm | 26 | 8.78
People supporting GenAI | 19 | 6.42
Those who find it helpful | 18 | 6.08
Those who find it innovative | 13 | 4.39
Institutional support | 7 | 2.36
Total references to acceptance | 112 | 37.84
Total references | 296 | 100
Table 4. Rejection from others.
Code | n | n%
Rejection due to lack of knowledge | 39 | 13.18
Skeptical parents | 34 | 11.49
People against technology/GenAI | 33 | 11.15
Personal fear of conflict | 25 | 8.45
Co-workers skeptical about ChatGPT | 21 | 7.09
Ethical concerns of parents/co-workers | 20 | 6.76
People concerned about reduction in critical thinking | 9 | 3.04
Rejection from students | 2 | 0.68
Those worried about mistakes | 1 | 0.34
Total references to rejection | 184 | 62.16
Total references | 296 | 100
Table 5. Facilitators.
Code | n | n%
Training availability/resources | 57 | 15.53
Institutional support | 53 | 14.44
Personal confidence | 35 | 9.54
Positive previous experiences | 16 | 4.36
Total references for facilitators | 161 | 43.87
Total references | 367 | 100
Table 6. Barriers.
Code | n | n%
Lack of knowledge/experience | 83 | 22.62
Limited technological access | 53 | 14.44
Student resistance | 19 | 5.18
Influence from significant others | 17 | 4.63
Distrust of GenAI capabilities | 15 | 4.09
Lack of reliability | 9 | 2.45
Integration into existing system | 8 | 2.18
Concerns about data | 2 | 0.54
Total references for barriers | 206 | 56.13
Total references | 367 | 100
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
