Article

Teacher Education Students’ Practices, Benefits, and Challenges in the Use of Generative AI Tools in Higher Education

by Stavros Athanassopoulos 1, Aggeliki Tzavara 2, Spyridon Aravantinos 2, Konstantinos Lavidas 2, Vassilis Komis 2 and Stamatios Papadakis 3,*

1 Department of Philosophy, University of Patras, 26504 Rio, Greece
2 Department of Educational Sciences and Early Childhood Education, University of Patras, 26504 Rio, Greece
3 Department of Preschool Education, Faculty of Education, University of Crete, 74100 Rethymno, Greece
* Author to whom correspondence should be addressed.
Educ. Sci. 2026, 16(2), 228; https://doi.org/10.3390/educsci16020228
Submission received: 13 December 2025 / Revised: 25 January 2026 / Accepted: 28 January 2026 / Published: 2 February 2026
(This article belongs to the Special Issue Unleashing the Potential of E-learning in Higher Education)

Abstract

Despite the growing adoption of generative artificial intelligence (GenAI) tools in higher education, limited research has examined how future educators perceive and use these technologies in their academic practices. This study investigates the practices, perceived benefits, and challenges associated with the use of GenAI tools—such as ChatGPT—among undergraduate students enrolled in programs that confer teaching qualifications. Using a mixed-methods design, data were collected from 314 students from the Early Childhood Education, Philosophy, and Philology departments. The findings indicate that the majority of students use GenAI tools primarily for academic purposes, most commonly for information searching, data analysis, study advice, and exam preparation. Students reported several perceived benefits, including rapid access to information, time efficiency, improved comprehension of complex concepts, enhanced study organization, and support with assignments and research-related tasks such as summarizing or translating academic texts. At the same time, participants expressed notable concerns, particularly regarding over-reliance on AI, reduced personal effort, risks to academic integrity, diminished critical thinking, and weakened research skills. Additional challenges included misinformation, reduced creativity, improper use of AI-generated content, skill underdevelopment, and potential technological dependence. The study concludes that teacher education programs should systematically integrate AI literacy and responsible-use training to prepare future educators to address the pedagogical and ethical implications of GenAI in educational settings.

1. Introduction

Artificial Intelligence (AI) has become one of the most revolutionary and important technologies of the modern age. According to the European Commission (European Parliament & Council, 2024), an “AI system” means “a machine-based system that is designed to operate with varying levels of autonomy and that may show adaptiveness after deployment, and that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments.” This definition highlights AI systems’ ability to process environmental data and make decisions or perform actions on their own to accomplish specific goals. Generative AI (GenAI) has emerged as a transformative subfield of AI whose models are inherently generative, aiming to produce novel outputs: new content—text, images, code, audio—created from patterns learned from vast datasets (Lavidas et al., 2025).
GenAI has already begun to transform the field of education (Chan & Hu, 2023; Roll & Wylie, 2016), offering new possibilities for enhancing teaching and learning, personalizing educational experiences, and improving administrative efficiency (Lavidas et al., 2025). Its implementation in universities and other higher education institutions can significantly enhance both student experiences and faculty effectiveness (Hwang et al., 2020). However, successful integration requires adequate technological infrastructure, appropriate teacher training, and a broader adaptation of educational practices to AI’s capabilities and characteristics. With the right support and strategic planning, AI can become a valuable tool for enhancing the learning process, fostering critical thinking, assessing student progress, and offering personalized support (Lavidas et al., 2025).
Under these circumstances, many countries in Europe and internationally have begun to adopt policies for integrating AI into education. In the United Kingdom, for instance, the national higher education technology body Jisc publishes reports on student perceptions and institutional practices, thereby shaping policies for GenAI (A. E. Sousa & Cardoso, 2025), while Wilson (2025) noted the wide development of guidelines and policies by the Russell Group, an association of 24 public universities in the UK focused on academic excellence. In the USA, governance of AI in higher education is driven by individual institutions; although no single federal policy exists, An et al. (2025), in their study of guidelines at the top 50 universities, found that most have explicitly published GenAI policies for their staff and students. In Australia, the Australian Academic Integrity Network (AAIN) Generative AI Working Group provided guidance on the appropriate use of GenAI in higher education for students, teaching and academic support staff, and education providers, aligned with the Higher Education Standards Framework (Threshold Standards) 2021 (HESF) and the requirements of the Tertiary Education Quality and Standards Agency (TEQSA) (Munoz et al., 2023). Similarly, the EU has taken a leading role in shaping policies for the responsible use of AI in education. Institutional policies and guidelines for AI are now officially in place at many Swedish universities; they generally encourage teachers to explore its use, demand transparency when students use it, and emphasize responsibility and critical use in line with national data protection law (Humble, 2025). A recent survey showed that two-thirds of higher education institutions have, or are developing, guidance on AI use, indicating that such guidance is emerging in every region of the world, even in countries where laws on AI in education have not yet been established (Jin et al., 2025).
Given the rapid integration of AI in education and the crucial role of teacher education students in shaping future teaching methods, it is essential to systematically explore their experiences and perceptions of the benefits and challenges associated with GenAI tools in order to develop evidence-based educational strategies. While these policies establish institutional frameworks, their effectiveness depends fundamentally on how students—particularly future educators—engage with these tools, yet these students’ voices remain underrepresented in policy development. Previous research has mainly focused on understanding how students engage with and view such technologies in higher education (Stracke et al., 2025; Irfan et al., 2025). However, despite this expanding body of work, most studies have concentrated on the general university student population (e.g., J. Kim et al., 2025; Yang, 2024) rather than on future educators who will soon incorporate AI tools into their classrooms. This group warrants particular attention, as their opinions and skills will directly influence how AI is integrated into pedagogical practice.
The present study seeks to address this gap by examining how prospective educators perceive, use, and critically reflect on GenAI tools, thereby contributing to a more comprehensive understanding of AI literacy and pedagogical preparedness in teacher education. In particular, students in education-related fields—such as Early Childhood Education, Philosophy, and Philology—will play a key role in shaping future pedagogical practices. Investigating their perceptions of tools like ChatGPT-4.0 sheds light on their level of digital readiness and the extent of their critical engagement with AI in educational contexts (Liu et al., 2024; Sublime & Renna, 2024).
Based on this context, the present study aims to explore teacher education students’ practices, perceived benefits, and challenges regarding the use of GenAI tools—specifically, large language models (LLMs) such as ChatGPT—in educational settings. The findings of such research can inform the development of targeted digital literacy programs tailored to students’ needs and perceptions. At the same time, these insights can support the formulation of pedagogical approaches that integrate AI in ethically responsible and educationally meaningful ways.

2. Literature Review

2.1. Students’ Practices and Their Engagement with GenAI

The integration of AI—especially GenAI tools such as ChatGPT—into higher education is reshaping how students approach learning and academic work (Chan & Hu, 2023). AI is increasingly being used to support diverse educational tasks, including idea generation, content creation, and academic writing, serving as a valuable research and learning assistant (Nikolopoulou, 2024). Students often turn to tools like ChatGPT as personalized tutors that explain complex concepts and promote digital and information literacy. In regional contexts, such as Latin America, AI has been deployed in higher education through predictive modeling, analytics, and assistive technologies to enhance student outcomes and institutional services (Salas-Pilco & Yang, 2022). Likewise, in health education, AI tools are integrated to develop students’ competencies in telehealth, health informatics, and data ethics (M. J. Sousa et al., 2021). Ansari et al. (2024) outline four key domains of AI use in education—profiling and prediction, intelligent tutoring, assessment and evaluation, and adaptive learning—which help contextualize the multifaceted ways students engage with AI. Collectively, these studies illustrate not only the variety of AI applications in higher education but also how disciplinary and contextual factors shape the scope and purpose of student engagement.
Building on these observations, research further indicates that student engagement with AI varies systematically across disciplines and educational contexts. Applied fields demonstrate higher knowledge, usage intentions, and engagement—particularly for cognitively demanding tasks—while economics and business act as bridges between STEM and social sciences, and adoption patterns differ by career stage and cross-disciplinary pedagogical or cultural contexts (S. Y. Kim et al., 2025; Qu et al., 2024; Hutson et al., 2022). Documented evidence shows that students use AI regularly for academic learning and coursework, with 8–31% using it daily, 6–20% weekly, 14–40% monthly, and 20–29% never (Nechyporenko et al., 2025), while in other higher-education settings AI supports teaching, learning, governance, and administrative decision-making (Chatterjee & Bhattacharjee, 2020). Students primarily adopt AI instrumentally to improve productivity and efficiency, though they also recognize its transformative potential to reshape tasks and decision-making processes (Pajany, 2021). Moreover, institutional policies and ethical guidelines—including accountability requirements, human oversight, and flexible instructor practices—directly influence how students engage with AI, aligning top-down frameworks with course-level implementation and shaping both learning practices and academic conduct (Dabis & Csáki, 2024). Together, these findings highlight the interplay of disciplinary, instrumental, and institutional factors in shaping student AI adoption and practices.

2.2. Educational Benefits of AI Use in Higher Education

Numerous studies have documented the educational benefits of AI, particularly generative tools like ChatGPT. These technologies support personalized learning pathways, enhance student motivation, and contribute to improved language proficiency and engagement (Ansari et al., 2024; Chan & Hu, 2023; Gökçearslan et al., 2024). AI systems adapt content to individual learners’ needs, increasing retention, and academic success (Singh & Hiran, 2022). In academic writing and research, AI offers direct support through personalized feedback, summarization, and code generation, functioning as both a tutor and collaborative peer (Silva et al., 2024; J. Kim et al., 2025; Lavidas et al., 2025; Michel-Villarreal et al., 2023). These capabilities promote autonomy, self-regulation, and affective benefits such as increased confidence. The contribution of AI to critical thinking and collaboration is equally significant. Research shows that students using GenAI demonstrate improved skills in evaluating information and participating in complex discussions (Ruiz-Rojas et al., 2024; Rusdin et al., 2024). ChatGPT, in particular, supports collaboration and inclusivity in remote and diverse learning environments (Nikolopoulou, 2024). Motivational and behavioral impacts are also well-documented. Studies indicate that AI tools can boost academic aspirations and facilitate applied learning through adaptive feedback (Gao et al., 2024; Jo, 2024; Niloy et al., 2024). Students’ willingness to innovate and their behavioral intentions are central predictors of effective AI use, even when challenges such as technophobia or privacy concerns are present. Overall, the literature portrays GenAI as a powerful asset in higher education, offering both cognitive and socio-emotional benefits when integrated strategically.

2.3. Challenges and Limitations in AI Adoption

Despite these advantages, students face several challenges when engaging with AI in academic contexts. Issues of accessibility and affordability remain prominent, particularly when paired with limited awareness and uncertainty about ethical use (Ali et al., 2024; Chan & Hu, 2023; Lavidas et al., 2025). Academic integrity is a key concern, with AI tools prompting fears of plagiarism, reduced originality, and ambiguity in assessing authentic learning (Rasul et al., 2023; Bhullar et al., 2024; Cotton et al., 2024). Moreover, students often question the accuracy and reliability of AI-generated content, particularly in specialized domains that require subject-matter expertise (Michel-Villarreal et al., 2023; Silva et al., 2024). A further obstacle is the lack of AI literacy and prompt engineering skills among students, which limits the educational value of large language models like ChatGPT (Knoth et al., 2024; Walter, 2024). This skill gap is exacerbated by the absence of AI training within most higher education curricula. Additionally, students report difficulties integrating AI into academic writing due to both task-related limitations and system-specific constraints, highlighting the need for student-informed instructional design (J. Kim et al., 2025). Finally, unequal access to GenAI tools creates disparities in learning opportunities, reinforcing the urgency of inclusive, equitable integration strategies (Kurtz et al., 2024).
Based on the previous works, the adoption of GenAI in higher education offers both significant potential and notable challenges (Ansari et al., 2024; Lavidas et al., 2025). Understanding how students perceive and experience these tools is therefore crucial for informing ethical, effective, and pedagogically sound integration strategies (Kasneci et al., 2023). Importantly, the participants of this study—undergraduate students in Departments of Education and Pedagogical Studies—hold a unique position as future educators who will shape the learning experiences of the next generation. Their engagement with GenAI tools, such as ChatGPT, goes beyond personal academic use; it positions them as pioneers in integrating these technologies into educational practice (Cotton et al., 2024). Teachers have long been recognized as central agents in fostering critical thinking, creativity, and ethical reasoning in students (Lavidas et al., 2025), and how these future educators adopt, model, and guide AI use will directly influence classroom culture and learning outcomes (Kohnke et al., 2025). Understanding their perspectives, experiences, and perceptions toward AI is therefore essential not only for effective adoption but also for informing pedagogical strategies that balance technological innovation with human-centered teaching principles. Hence, higher education should empower students to engage critically with the opportunities and challenges of AI, enabling future educators to lead its responsible, inclusive, and purposeful integration in classrooms (European Commission, 2021; UNESCO, 2021).

3. Research Aim

Based on the framework developed previously, this research aims to investigate teacher education students’ engagement with GenAI tools by addressing three key questions:
(i)
What practices do teacher education students adopt when employing GenAI tools for academic purposes?
(ii)
What benefits do teacher education students perceive from integrating GenAI tools for academic purposes?
(iii)
What challenges do teacher education students encounter in integrating GenAI tools for academic purposes?

4. Research Method

The present study employed a mixed-methods approach (Bryman, 2016) using a structured questionnaire that included both closed-ended and open-ended questions. The research was conducted in May and June 2025, at the end of the 2024–2025 academic year, following all ethical guidelines. The questionnaire was administered to undergraduate students enrolled in the Departments of Early Childhood Education, Philosophy, and Philology at the University of Patras. All three departments provide a curriculum with integrated pedagogical training that awards the necessary Pedagogical and Teaching Competence. This allows graduates to qualify for teaching positions in public and private secondary education directly upon graduation; consequently, students from all programs are considered future educators within the Greek higher education context. Before data collection, official permission was obtained from the respective university departments, and the study was approved by the Institutional Review Board of the Department of Educational Science and Early Childhood Education of the University of Patras (77925/23-10-2023). Participants were informed in advance about the purpose of the survey and their right to withdraw at any time, and all provided informed consent by signing a consent form. All data remained anonymous and confidential, ensuring that no information could lead to the identification of participants.
Data collection was conducted anonymously through a self-administered questionnaire distributed via Google Forms, to encourage openness and authenticity in participants’ responses. A convenience sampling method was used (Bryman, 2016). Specifically, we asked instructors who teach courses involving the integration of computers in the educational process to share the questionnaire with their students. Table 1 presents the demographic characteristics of the 314 teacher education students from the University of Patras who participated in the study. Although the sample was predominantly female (89.8%), this study did not examine gender or year of study as factors influencing AI practices or perceptions. Future research could explore potential demographic effects to provide a more nuanced understanding of GenAI adoption in teacher education.

5. Research Instrument and Strategy of Data Analysis

The questionnaire was designed to combine quantitative and qualitative data, allowing for a comprehensive understanding of students’ experiences and perceptions regarding the use of GenAI tools in academic contexts. It consisted of three sections, each focusing on a different aspect of this topic. The first section collected demographic information from the participants, including their year of study, gender, age, and department of study, providing a general profile of the respondents. The second section included two closed-ended questions. The first explored whether participants use Artificial Intelligence (AI) tools (responses: “No”, “Not yet, but I plan to,” and “Yes”). The second investigated students’ specific practices with AI tools (“How do you use ChatGPT or similar tools for your studies?”). Participants indicated how frequently they engaged in nine distinct AI practices (see Table 2 for the corresponding items) in their academic activities. The nine AI practices were selected based on their recurrent identification in prior empirical studies on students’ academic use of GenAI tools (e.g., Chan & Hu, 2023; Lavidas et al., 2025) and were reviewed for clarity and relevance to the target population before data collection. For these items, a five-point Likert-type scale ranging from Never to Very Often (Never–Very Rarely–Rarely–Often–Very Often) was used. Finally, the third section consisted of two open-ended questions intended to capture students’ perceptions of the potential benefits and challenges associated with AI use in higher education. The questions were as follows: 1. Do you believe that the use of ChatGPT (or similar AI tools) will benefit students and the university? If yes, what are they? 2. Do you believe that the use of ChatGPT (or similar AI tools) poses challenges for students and the university? If yes, what are they?
We presented the students’ responses to closed-ended questions using descriptive statistics. Regarding the open-ended questions, we analyzed the data using qualitative content analysis (Bryman, 2016) to identify recurring patterns in students’ perceptions of GenAI tools. All responses were read multiple times to become familiar with the data, followed by open coding, where meaningful words or statements were highlighted and given preliminary codes. These codes were then refined and grouped into broader themes representing key benefits and challenges of AI use in academic settings. Additionally, to ensure reliability, coding was performed iteratively, and a second reviewer independently coded a subset of responses. Any discrepancies were discussed until a consensus was reached.
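To make the analysis workflow above concrete, the following minimal Python sketch computes category percentages for one Likert-type item and an inter-rater agreement coefficient (Cohen’s kappa) for the double-coded subset of open-ended responses. It is an illustrative sketch only: the CSV file and column names (“responses.csv”, “info_search”, “coder_1”, “coder_2”) are hypothetical, and the original analysis resolved coding discrepancies through discussion rather than necessarily reporting kappa.

```python
# Illustrative sketch of the analysis pipeline; file and column names are hypothetical.
import pandas as pd
from sklearn.metrics import cohen_kappa_score

LIKERT = ["Never", "Very Rarely", "Rarely", "Often", "Very Often"]

df = pd.read_csv("responses.csv")  # hypothetical export of the Google Forms responses

# Descriptive statistics for one Likert-type practice item (closed-ended section)
shares = (df["info_search"]
          .value_counts(normalize=True)
          .reindex(LIKERT, fill_value=0.0) * 100)
print(shares.round(1))  # percentage of respondents per frequency category

# Agreement between the two coders on the double-coded subset of open-ended answers
subset = df.dropna(subset=["coder_1", "coder_2"])
kappa = cohen_kappa_score(subset["coder_1"], subset["coder_2"])
print(f"Cohen's kappa on the double-coded subset: {kappa:.2f}")
```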
Finally, during the preparation of this study, the authors used Generative AI tools to assist with language editing and clarity of expression. Specifically, Grammarly was used to improve the language and enhance the overall readability of the text.

6. Results

Based on the data, the majority of participants (n = 231, 73.6%) reported having used GenAI tools, such as ChatGPT, for their studies (Figure 1). A smaller portion (8.9%) indicated that they had not yet used such tools but intended to do so in the future, suggesting a growing interest and potential for adoption. Only 17.5% of respondents stated that they had never used GenAI tools. These results reveal a strong integration of AI technologies in students’ academic practices, reflecting both familiarity with and acceptance of emerging digital tools in higher education contexts.
Based on the first research question of our study, which explored the practices of those who use GenAI tools such as ChatGPT (n = 231), Table 2 reveals substantial variation in the frequency with which students employ GenAI tools across different academic practices. The most pronounced pattern concerns information searching, which shows the highest level of engagement: 80.1% of participants reported using AI often or very often for this purpose, while only 6.5% reported never or very rarely doing so. In contrast, text translation represents the least frequent practice, with only 19.1% reporting often or very often use and a majority (55.0%) reporting never or very rarely using it. This wide range highlights a clear differentiation in how students prioritize AI-supported academic tasks.
A second notable pattern emerges in activities related to academic production. Writing assignments display a polarized distribution, with 34.2% of students using AI often or very often, compared to 40.3% who report never or very rarely using such tools for this purpose. A similar tendency toward moderate or limited use is observed for text paraphrasing, literature processing, and assistance with references, where responses cluster around the middle-frequency categories. These distributions suggest that, while AI is present in writing-related tasks, its adoption is neither uniform nor dominant across the sample.
Finally, relatively high levels of AI use are reported for analytically oriented practices. Data analysis stands out, with 50.6% of participants indicating frequent use (often or very often), despite the sample consisting primarily of students from Humanities and Social Sciences programs. Comparable, though slightly lower, levels of frequent use are observed for study advice (59.3%) and exam preparation (45.5%). Together, these patterns indicate that students engage extensively with GenAI tools for information-intensive and analytical support tasks, while usage declines for practices more directly associated with linguistic transformation and authorship.
Overall, these findings suggest that students primarily adopt GenAI tools for information retrieval and analytical support, while maintaining a more cautious approach toward activities involving academic authorship and original writing.
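For transparency about how the aggregated figures above are obtained, the short Python sketch below recomputes the “often or very often” and “never or very rarely” shares directly from the percentages reported in Table 2. It is an illustrative check only, not part of the authors’ analysis; the dictionary simply re-keys a subset of Table 2.

```python
# Worked check of the aggregated frequencies reported in the text, using Table 2 percentages.
table2 = {
    # practice: (Never, Very Rarely, Rarely, Often, Very Often), in %
    "Information searching": (2.2, 4.3, 13.4, 46.8, 33.3),
    "Text translation":      (38.1, 16.9, 26.0, 18.2, 0.9),
    "Data analysis":         (13.9, 15.2, 20.3, 45.0, 5.6),
    "Study advice":          (15.6, 5.6, 19.5, 39.4, 19.9),
    "Exam preparation":      (20.8, 13.0, 20.8, 31.2, 14.3),
    "Writing assignments":   (20.8, 19.5, 25.5, 22.9, 11.3),
}

for practice, (never, very_rarely, rarely, often, very_often) in table2.items():
    frequent = often + very_often      # e.g., 46.8 + 33.3 = 80.1 for information searching
    infrequent = never + very_rarely   # e.g., 38.1 + 16.9 = 55.0 for text translation
    print(f"{practice}: frequent {frequent:.1f}%, infrequent {infrequent:.1f}%")
```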
Table 3 presents the main categories, with the corresponding codes in parentheses, that emerged from the analysis of all respondents’ answers to the first open-ended question: “Do you believe that the use of ChatGPT (or similar AI tools) will have benefits for students and the university? If yes, what are they?” Among the benefits reported by the participants regarding the use of GenAI tools in their studies, the most frequently mentioned advantages are related to information and answer finding (160 responses, 50.96%) and time saving/speed (90 responses, 28.66%). Examples of students’ comments related to information and answer finding include: “It provided me with quick answers when I didn’t know where to look,” “I could easily find reliable information without spending hours searching online,” and “It gave me clear explanations and examples for concepts I was unsure about.” Regarding time saving and speed, students reported statements such as: “It allowed me to finish my assignments much faster than usual,” “I saved a lot of time preparing notes and summaries thanks to the tool,” and “It made research quicker by concisely summarizing information.” These findings underscore the practical and functional value of AI tools, highlighting their potential to support students in accessing information efficiently and completing academic tasks more quickly. Academic support aspects, such as understanding and simplifying complex concepts (20.70%) and assistance with assignments and research (19.11%), are also notable, indicating that students perceive these practices as enhancing their learning process. Organization of study/notes (12.74%) and summarizing/analyzing/translating texts (11.15%) reflect the importance of managing and processing knowledge effectively. Creative and critical thinking skills, including idea generation and inspiration (9.55%) and critical thinking abilities (7.96%), appear less prominent, suggesting that these outcomes are either less immediately associated with the use of the tools or harder for students to measure. Finally, lower percentages for performance/effectiveness improvement (4.78%) and familiarization with technology and new tools (3.18%) indicate that technological familiarity is not the primary perceived benefit. For instance, some students stated: “Using ChatGPT helped me become more comfortable with AI and other digital tools I might need in my future studies or career,” and “It allowed me to explore new technologies and understand how they can support learning and research.”
Table 4 presents the main challenge categories, with the corresponding codes in parentheses, associated with the use of ChatGPT and similar AI tools, as reported by the participating students in response to the open-ended question: “Do you believe that the use of ChatGPT (or similar AI tools) poses challenges for students and the university? If yes, what are they?” The most frequently mentioned issues relate to over-reliance and reduced personal effort (57.32%) and plagiarism or academic integrity concerns (47.77%), highlighting that while these tools can facilitate learning, they also pose risks to independent work and ethical practice. As one student noted, “I tend to depend too much on it and put in less effort myself,” while another admitted, “It makes it easy to copy answers without truly understanding the content.” A significant number of students also mentioned concerns about the erosion of critical thinking and deep learning (44.59%) and diminished research skills, including less cross-checking and investigation (35.03%), suggesting that reliance on these tools may undermine higher-order cognitive processes. For instance, one participant reflected, “It stops me from thinking deeply because I get ready-made answers,” and another shared, “I no longer double-check information from different sources.” Additional concerns include inaccuracies, misinformation, and reliability issues (33.44%), as expressed by students such as “Sometimes it gives wrong or outdated information, and it’s hard to notice,” as well as incomplete development of both cognitive and non-cognitive skills (22.29%), with comments like “It prevents me from developing my own problem-solving and communication skills.” Finally, lower but notable challenges include technology dependence or addictive use (19.11%), with some students admitting, “I find myself using it too often, even when I don’t need to.” Overall, the data suggest a tension between the practical benefits of AI-based academic tools and their potential to negatively impact independent learning, critical thinking, and academic integrity.
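The percentages in Tables 3 and 4 are the category frequencies divided by the full sample (n = 314); because a single open-ended answer could be assigned several codes, the percentages are not expected to sum to 100%. A minimal Python sketch of this computation, using the first three rows of Table 3 as example values, is shown below (illustrative only).

```python
# Minimal sketch: converting coded category frequencies to percentages of the full sample.
N = 314  # number of respondents to the open-ended questions

benefit_freqs = {  # example values taken from the first three rows of Table 3
    "Information/Answer Finding": 160,
    "Time Saving/Speed": 90,
    "Understanding and Simplification of Complex Concepts": 65,
}

for category, freq in benefit_freqs.items():
    print(f"{category}: {freq}/{N} = {freq / N * 100:.2f}%")
# e.g., 160/314 = 50.96%, matching the first row of Table 3
```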

7. Discussion

The present study provides a comprehensive overview of undergraduate students’ use of GenAI tools, such as ChatGPT, in higher education, with a specific focus on students from the Departments of Preschool Education, Philosophy, and Philology. Most participants were enrolled in education-related programs, representing the future educators who will be responsible for integrating AI tools into their teaching practices. A significant portion of students (73.6%) reported using GenAI tools in their studies, demonstrating the growing acceptance and familiarity of these technologies in academic settings and highlighting their potential relevance for future pedagogical applications.
The findings of the present study both align with and extend prior research regarding students’ engagement with generative AI tools in higher education. Consistent with earlier studies (Nikolopoulou, 2024; Salas-Pilco & Yang, 2022), participants reported extensive use of GenAI for information searching, idea generation, and academic task support, confirming its emerging role as a personalized learning assistant. A particularly noteworthy finding is the exceptionally high frequency of AI use for study advice and exam preparation (nearly 60%), suggesting that students increasingly perceive GenAI not merely as a tool but as a form of academic mentor or advisor that guides their study strategies and learning decisions. This shift toward a “mentor-like” function underscores the depth of trust students place in AI for shaping their academic judgment—a trend that deserves further attention in the literature and that echoes previous evidence that generative models increasingly function as intelligent tutoring systems (Gökçearslan et al., 2024; Ansari et al., 2024). The strong emphasis placed by students on efficiency and quick access to information is also congruent with earlier findings that AI fosters cognitive offloading and improves learning productivity (Gökçearslan et al., 2024; Singh & Hiran, 2022).
Moreover, the reported benefits—such as enhanced comprehension of complex concepts, increased confidence, and improved organization of study materials—mirror outcomes documented in research demonstrating GenAI’s contribution to language development, self-regulation, and academic engagement (Silva et al., 2024; S. Y. Kim et al., 2025; Ruiz-Rojas et al., 2024). In this respect, the present study reinforces the broader consensus that GenAI functions not only as a tool for information processing but also as an affective and motivational learning aid (Gao et al., 2024; Jo, 2024; Niloy et al., 2024).
However, several divergences also emerge when comparing the current findings to the existing literature. While many studies emphasize the collaborative and inclusive potential of GenAI (Nikolopoulou, 2024), students in this study made comparatively few references to collaborative benefits, focusing instead on individual academic efficiency. This may reflect differences in pedagogical context or limited formal guidance on collaborative AI use. Additionally, although prior research highlights GenAI’s capacity to support creativity and higher-order thinking (Rusdin et al., 2024), participants here reported concerns about reduced creativity, diminished critical thinking, and cognitive passivity, indicating a discrepancy between theoretical expectations of AI-enhanced learning and students’ lived experiences.
The challenges identified in this study also deepen existing concerns surrounding academic integrity, the reliability of AI-generated content, and inadequate AI literacy. Similarly to findings by Rasul et al. (2023) and Cotton et al. (2024), students expressed fears related to plagiarism and the erosion of authentic learning. Yet the high frequency of these concerns in the present dataset suggests that such issues may be more prominent among teacher education students, who are trained to value ethical and reflective practice. The strong emphasis on misinformation echoes earlier critiques of GenAI’s factual inconsistencies (Michel-Villarreal et al., 2023; Silva et al., 2024), supporting calls for pedagogical frameworks that promote verification and critical evaluation of AI outputs.
Finally, this study’s emphasis on unequal access and skill gaps aligns with previous work highlighting inequities in digital readiness and the absence of structured AI training in higher education (Knoth et al., 2024; Walter, 2024; Kurtz et al., 2024). However, the present findings extend the literature by showing that these limitations are perceived not merely as technical obstacles but as threats to the development of core academic and professional competencies—particularly for future educators responsible for guiding AI integration in schools.
Based on the findings of this study, several recommendations can be formulated to guide the effective and responsible integration of generative AI in teacher education. First, undergraduate programs should embed systematic AI literacy training that encompasses not only technical competence but also ethical, critical, and reflective use of AI tools. Prior research highlights significant gaps in students’ AI literacy and prompt engineering skills (Knoth et al., 2024; Walter, 2024), underscoring the need for structured instruction that promotes critical evaluation of AI-generated content and safeguards academic integrity, which remains a central concern in the literature (Cotton et al., 2024; Rasul et al., 2023). Second, teacher education curricula should adopt pedagogical models that demonstrate how generative AI can support learning without replacing core cognitive processes. Studies indicate that GenAI can enhance motivation, comprehension, and analytical skills when used strategically (Gökçearslan et al., 2024; Ruiz-Rojas et al., 2024), yet the present findings—aligned with concerns raised by Bhullar et al. (2024) and Michel-Villarreal et al. (2023)—show that over-reliance may impede creativity, deep learning, and independent research. Therefore, programs should incorporate collaborative problem-solving tasks, inquiry-based learning, and guided simulations to ensure that AI supplements, rather than substitutes, students’ intellectual engagement. Finally, institutions should design equitable assessment frameworks that address disparities in access to AI tools and the challenges of evaluating AI-assisted work. This aligns with broader discussions on fairness, transparency, and reliability in AI-mediated educational contexts (Kurtz et al., 2024). Clear guidelines can help maintain academic standards while offering students structured opportunities to use AI constructively.

7.1. Implications for Pre-Service Teacher Preparation

The findings of this study have important implications for pre-service teacher education. Students’ concerns about over-reliance on GenAI highlight the need to equip future educators with strategies for promoting balanced AI use in their own classrooms. Teacher training programs should therefore include not only technical and ethical guidance but also reflective practices that enable prospective teachers to model critical engagement with AI tools, fostering both student autonomy and responsible technology use.
Moreover, the “mentor-like” perception of GenAI—where students view AI as a personalized guide supporting study strategies and decision-making—has implications for teacher professional identity and the student–teacher relationship. Pre-service teachers must understand that while AI can serve as a supplementary mentor, it cannot replace the relational, pedagogical, and ethical dimensions of teaching. Integrating discussions and simulations around AI-assisted learning can help future educators critically evaluate when and how to incorporate AI in ways that enhance, rather than supplant, their instructional role.
By addressing these aspects, teacher education programs can better prepare students to navigate the pedagogical and ethical challenges of AI integration, ensuring that future teachers maintain agency, cultivate critical thinking, and model responsible AI use for their own students.

7.2. Limitations and Future Research

A key limitation of this study is its cross-sectional design, which offers only a snapshot of students’ current perceptions and does not capture how these perceptions may change over time. Future research would benefit from longitudinal approaches that examine how sustained exposure to AI tools shapes students’ learning behaviors, ethical awareness, and pedagogical intentions. Such designs are consistent with broader recommendations for forward-looking and context-sensitive research in the field (Chiu, 2024). In addition to longitudinal work, qualitative methods—such as interviews, focus groups, or think-aloud protocols—could provide deeper insight into the reasoning, critical reflections, and pedagogical considerations underlying students’ engagement with GenAI tools. Another limitation arises from the fact that the sample consisted solely of students from Humanities and Social Sciences programs, meaning the findings may not reflect the experiences of students in Science, Technology, Engineering, and Mathematics (STEM) fields, who often use GenAI tools in more technical or discipline-specific ways. Recognizing this contextual constraint helps prevent overgeneralization and situates the results within the specific academic background of the participants. A more nuanced understanding of these experiences would help clarify how AI influences not only individual study practices but also developing professional identities, particularly those of future educators. Future studies should therefore adopt a broader perspective that investigates both the evolution of students’ perceptions and the long-term educational implications of AI integration in teacher education contexts. Examining how early exposure and guided practice with AI tools affect learning outcomes, instructional decision-making, and classroom practices will be essential for informing evidence-based curriculum design. Through such research, higher education institutions can better prepare prospective teachers to harness AI effectively, responsibly, and in ways that enrich learners’ educational experiences.

Author Contributions

Conceptualization, S.A. (Stavros Athanassopoulos), A.T., S.A. (Spyridon Aravantinos), K.L., V.K. and S.P.; Methodology, A.T.; Software, A.T.; Validation, S.A. (Stavros Athanassopoulos), A.T. and S.A. (Spyridon Aravantinos); Formal analysis, S.A. (Spyridon Aravantinos) and K.L.; Investigation, S.A. (Spyridon Aravantinos) and K.L.; Resources, S.A. (Stavros Athanassopoulos), S.A. (Spyridon Aravantinos), K.L. and S.P.; Data curation, S.A. (Stavros Athanassopoulos), A.T., S.A. (Spyridon Aravantinos) and V.K.; Writing—original draft, S.A. (Stavros Athanassopoulos), A.T., S.A. (Spyridon Aravantinos) and K.L.; Writing—review & editing, K.L. and S.P.; Visualization, S.A. (Spyridon Aravantinos); Supervision, K.L. and V.K.; Project administration, K.L.; Funding acquisition, S.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Institutional Review Board of the Department of Educational Science and Early Childhood Education of the University of Patras (protocol code 77925 and date of approval 23 October 2023).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

The authors wish to thank all the undergraduate students who took part in this study.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ali, M. S., Suchiang, T., Saikia, T. P., & Gulzar, D. D. (2024). Perceived benefits and concerns of ai integration in higher education: Insights from India. Educational Administration: Theory and Practice, 30(5), 656–668. [Google Scholar] [CrossRef]
  2. An, Y., Yu, J. H., & James, S. (2025). Investigating the higher education institutions’ guidelines and policies regarding the use of generative AI in teaching, learning, research, and administration. International Journal of Educational Technology in Higher Education, 22, 10. [Google Scholar] [CrossRef]
  3. Ansari, A. N., Ahmad, S., & Bhutta, S. M. (2024). Mapping the global evidence around the use of ChatGPT in higher education: A systematic scoping review. Education and Information Technologies, 29(9), 11281–11321. [Google Scholar] [CrossRef]
  4. Bhullar, P. S., Joshi, M., & Chugh, R. (2024). ChatGPT in higher education—A synthesis of the literature and a future research agenda. Education and Information Technologies, 29(16), 21501–21522. [Google Scholar] [CrossRef]
  5. Bryman, A. (2016). Social research methods. Oxford University Press. [Google Scholar]
  6. Chan, C. K. Y., & Hu, W. (2023). Students’ voices on generative AI: Perceptions, benefits, and challenges in higher education. International Journal of Educational Technology in Higher Education, 20(1), 43. [Google Scholar] [CrossRef]
  7. Chatterjee, S., & Bhattacharjee, K. K. (2020). Adoption of artificial intelligence in higher education: A quantitative analysis using structural equation modelling. Education and Information Technologies, 25(5), 3443–3463. [Google Scholar] [CrossRef]
  8. Chiu, T. K. F. (2024). Future research recommendations for transforming higher education with generative AI. Computers and Education: Artificial Intelligence, 6, 100197. [Google Scholar] [CrossRef]
  9. Cotton, D. R. E., Cotton, P. A., & Shipway, J. R. (2024). Chatting and cheating: Ensuring academic integrity in the era of ChatGPT. Innovations in Education and Teaching International, 61(2), 228–239. [Google Scholar] [CrossRef]
  10. Dabis, A., & Csáki, C. (2024). AI and ethics: Investigating the first policy responses of higher education institutions to the challenge of generative AI. Humanities and Social Sciences Communications, 11(1), 1006. [Google Scholar] [CrossRef]
  11. European Commission. (2021). Ethics by design and ethics of use approaches for artificial intelligence (1.0), European commission—DG research and innovation. Available online: https://ec.europa.eu/info/funding-tenders/opportunities/docs/2021-2027/horizon/guidance/ethics-by-design-and-ethics-of-use-approaches-for-artificial-intelligence_he_en.pdf (accessed on 10 September 2025).
  12. European Parliament & Council. (2024). Regulation (EU) 2024/1689—Artificial intelligence act. Official Journal of the European Union, 3(1). Available online: https://artificialintelligenceact.eu/article/3/ (accessed on 8 September 2025).
  13. Gao, Z., Cheah, J.-H., Lim, X.-J., & Luo, X. (2024). Enhancing academic performance of business students using generative AI: An interactive-constructive-active-passive (ICAP) self-determination perspective. International Journal of Management Education, 22(2), 100958. [Google Scholar] [CrossRef]
  14. Gökçearslan, S., Tosun, C., & Erdemir, Z. G. (2024). Benefits, challenges, and methods of artificial intelligence (AI) chatbots in education: A systematic literature review. International Journal of Technology in Education, 7(1), 19–39. [Google Scholar] [CrossRef]
  15. Humble, N. (2025). Higher education AI policies—A document analysis of university guidelines. European Journal of Education, 60(3), e70214. [Google Scholar] [CrossRef]
  16. Hutson, J., Jeevanjee, T., Graaf, V., Lively, J., Weber, J., Weir, G., Arnone, K., Carnes, G., Vosevich, K., Plate, D., Leary, M., & Edele, S. (2022). Artificial intelligence and the disruption of higher education: Strategies for integration across disciplines. Creative Education, 13(12), 3953–3980. [Google Scholar] [CrossRef]
  17. Hwang, G. J., Xie, H., Wah, B. W., & Gašević, D. (2020). Vision, challenges, roles, and research issues of artificial intelligence in education. Computers and Education: Artificial Intelligence, 1, 1–5. [Google Scholar] [CrossRef]
  18. Irfan, M., Murray, L., Aldulayani, F., Ali, S., Youcefi, N., & Haroon, S. (2025). From Europe to Ireland: Artificial intelligence pivotal role in transforming higher education policies and guidelines. University of Limerick. [Google Scholar] [CrossRef]
  19. Jin, Y., Yan, L., Echeverria, V., Gašević, D., & Martinez-Maldonado, R. (2025). Generative AI in higher education: A global perspective of institutional adoption policies and guidelines. Computers and Education: Artificial Intelligence, 8, 100348. [Google Scholar] [CrossRef]
  20. Jo, H. (2024). From concerns to benefits: A comprehensive study of ChatGPT usage in education. International Journal of Educational Technology in Higher Education, 21(1), 35. [Google Scholar] [CrossRef]
  21. Kasneci, E., Seßler, K., Küchemann, S., Bannert, M., Dementieva, D., Fischer, F., Gasser, U., Groh, G., Günnemann, S., Hüllermeier, E., Krusche, S., Kutyniok, G., Michaeli, T., Nerdel, C., Pfeffer, J., Poquet, O., Sailer, M., Schmidt, A., Seidel, T., … Kasneci, G. (2023). ChatGPT for good? On opportunities and challenges of large language models for education. Learning and Individual Differences, 103, 102274. [Google Scholar] [CrossRef]
  22. Kim, J., Yu, S., Detrick, R., & Li, N. (2025). Exploring students’ perspectives on Generative AI-assisted academic writing. Education and Information Technologies, 30(1), 1265–1300. [Google Scholar] [CrossRef]
  23. Kim, S. Y., Lee, W. K., Jee, S. J., & Sohn, S. Y. (2025). Discovering AI adoption patterns from big academic graph data. Scientometrics, 130(2), 809–831. [Google Scholar] [CrossRef]
  24. Knoth, N., Tolzin, A., Janson, A., & Leimeister, J. M. (2024). AI literacy and its implications for prompt engineering strategies. Computers and Education: Artificial Intelligence, 6, 100225. [Google Scholar] [CrossRef]
  25. Kohnke, L., Zou, D., Ou, A. W., & Gu, M. M. (2025). Preparing future educators for AI-enhanced classrooms: Insights into AI literacy and integration. Computers and Education: Artificial Intelligence, 8, 100398. [Google Scholar] [CrossRef]
  26. Kurtz, G., Amzalag, M., Shaked, N., Zaguri, Y., Kohen-Vacs, D., Gal, E., Zailer, G., & Barak-Medina, E. (2024). Strategies for integrating generative AI into higher education: Navigating challenges and leveraging opportunities. Education Sciences, 14(5), 503. [Google Scholar] [CrossRef]
  27. Lavidas, K., Koskina, E., Pitsili, A., Komis, V., & Arvanitis, E. (2025). Humanities and social sciences students’ views on the use of AI tools for academic purposes: Practices, benefits, challenges, and suggestions. Advances in Mobile Learning Educational Research, 6(1), 1699–1709. [Google Scholar] [CrossRef]
  28. Liu, Y., Park, J., & McMinn, S. (2024). Using generative artificial intelligence/ChatGPT for academic communication: Students’ perspectives. International Journal of Applied Linguistics, 34, 1437–1461. [Google Scholar] [CrossRef]
  29. Michel-Villarreal, R., Vilalta-Perdomo, E., Salinas-Navarro, D. E., Thierry-Aguilera, R., & Gerardou, F. S. (2023). Challenges and opportunities of generative AI for higher education as explained by ChatGPT. Education Sciences, 13(9), 856. [Google Scholar] [CrossRef]
  30. Munoz, A., Wilson, A., Pereira Nunes, B., Del Medico, C., Slade, C., Bennett, D., Tyler, D., Seymour, E., Hepplewhite, G., Randell-Moon, H., Janine, A., McPherson, J., Game, J., Rhall, J., Myers, K., Absolum, K., Edmond, K., Nicholls, K., Adam, L., … Duke, Z. (2023). AAIN generative artificial intelligence (AI) guidelines (Version 1). Deakin University. Available online: https://hdl.handle.net/10779/DRO/DU:22259767.v1 (accessed on 8 September 2025).
  31. Nechyporenko, V., Hordiienko, N., Pozdniakova, O., Pozdniakova-Kyrbiatieva, E., & Siliavina, Y. (2025). How often do university students use artificial intelligence in their studies? WSEAS Transactions on Information Science and Applications, 22, 203–214. [Google Scholar] [CrossRef]
  32. Nikolopoulou, K. (2024). Generative artificial intelligence in higher education: Exploring ways of harnessing pedagogical practices with the assistance of ChatGPT. International Journal of Changes in Education, 1(2), 103–111. [Google Scholar] [CrossRef]
  33. Niloy, A. C., Bari, M. A., Sultana, J., Chowdhury, R., Raisa, F. M., Islam, A., Mahmud, S., Jahan, I., Sarkar, M., Akter, S., Nishat, N., Afroz, M., Sen, A., Islam, T., Tareq, M. H., & Hossen, M. A. (2024). Why do students use ChatGPT? Answering through a triangulation approach. Computers and Education: Artificial Intelligence, 6, 100208. [Google Scholar] [CrossRef]
  34. Pajany, P. (2021). AI transformative influence extending the TRAM to management student’s AI’s machine learning adoption [Doctoral dissertation, Franklin University]. [Google Scholar]
  35. Qu, Y., Tan, M. X. Y., & Wang, J. (2024). Disciplinary differences in undergraduate students’ engagement with generative artificial intelligence. Smart Learning Environments, 11(1), 51. [Google Scholar] [CrossRef]
  36. Rasul, T., Nair, S., Kalendra, D., Robin, M., Santini, F. d. O., Ladeira, W. J., Sun, M., Day, I., Rather, R. A., & Heathcote, L. (2023). The role of ChatGPT in higher education: Benefits, challenges, and future research directions. Journal of Applied Learning and Teaching, 6(1), 41–56. [Google Scholar] [CrossRef]
  37. Roll, I., & Wylie, R. (2016). Evolution and revolution in artificial intelligence in education. International Journal of Artificial Intelligence in Education, 26, 582–599. [Google Scholar] [CrossRef]
  38. Ruiz-Rojas, L. I., Salvador-Ullauri, L., & Acosta-Vargas, P. (2024). Collaborative working and critical thinking: Adoption of generative artificial intelligence tools in higher education. Sustainability, 16(13), 5367. [Google Scholar] [CrossRef]
  39. Rusdin, D., Mukminatien, N., Suryati, N., & Laksmi, E. D. (2024). Critical thinking in the AI era: An exploration of EFL students’ perceptions, benefits, and limitations. Cogent Education, 11(1), 2290342. [Google Scholar] [CrossRef]
  40. Salas-Pilco, S. Z., & Yang, Y. (2022). Artificial intelligence applications in Latin American higher education: A systematic review. International Journal of Educational Technology in Higher Education, 19(1), 21. [Google Scholar] [CrossRef]
  41. Silva, C. A. G. D., Ramos, F. N., de Moraes, R. V., & Santos, E. L. D. (2024). ChatGPT: Challenges and benefits in software programming for higher education. Sustainability, 16(3), 1245. [Google Scholar] [CrossRef]
  42. Singh, S. V., & Hiran, K. K. (2022). The impact of AI on teaching and learning in higher education technology. Journal of Higher Education Theory and Practice, 12(13). Available online: https://articlearchives.co/index.php/JHETP/article/view/3553 (accessed on 8 September 2025). [CrossRef]
  43. Sousa, A. E., & Cardoso, P. (2025). Use of generative AI by higher education students. Electronics, 14(7), 1258. [Google Scholar] [CrossRef]
  44. Sousa, M. J., Dal Mas, F., Pesqueira, A., Lemos, C., Verde, J. M., & Cobianchi, L. (2021). The potential of AI in health higher education to increase the students’ learning outcomes. TEM Journal, 10(2), 488–497. [Google Scholar] [CrossRef]
  45. Stracke, C. M., Griffiths, D., Pappa, D., Bećirović, S., Polz, E., Perla, L., Di Grassi, A., Massaro, S., Skenduli, M. P., Burgos, D., Punzo, V., Amram, D., Ziouvelou, X., Katsamori, D., Gabriel, S., Nahar, N., Schleiss, J., & Hollins, P. (2025). Analysis of Artificial Intelligence Policies for Higher Education in Europe. International Journal of Interactive Multimedia and Artificial Intelligence, 9(2), 124–137. [Google Scholar] [CrossRef]
  46. Sublime, J., & Renna, I. (2024). Is ChatGPT massively used by students nowadays? A survey on the use of large language models such as ChatGPT in educational settings. arXiv, arXiv:2412.17486. [Google Scholar] [CrossRef]
  47. UNESCO. (2021). Recommendation on the ethics of artificial intelligence. Available online: https://www.unesco.org/en/artificial-intelligence/recommendation-ethics (accessed on 8 September 2025).
  48. Walter, Y. (2024). Embracing the future of artificial intelligence in the classroom: The relevance of AI literacy, prompt engineering, and critical thinking in modern education. International Journal of Educational Technology in Higher Education, 21(1), 15. [Google Scholar] [CrossRef]
  49. Wilson, T. D. (2025). The development of policies on generative artificial intelligence in UK universities. IFLA Journal, 51(3), 722–734. [Google Scholar] [CrossRef]
  50. Yang, H. (2024, October 24–25). Towards responsible use: Student perspectives on ChatGPT in higher education. 23rd European Conference on E-Learning, Porto, Portugal. [Google Scholar] [CrossRef]
Figure 1. Use of Generative AI tools (e.g., ChatGPT) for Academic Studies.
Table 1. Demographic Characteristics of the Participants (n = 314).

Characteristic                                   Category               Frequency   Percent
Years of study                                   1                      75          23.9
                                                 2                      184         58.6
                                                 3                      34          10.8
                                                 4                      13          4.1
                                                 5                      3           1.0
                                                 >5                     5           1.6
Gender                                           Male                   29          9.3
                                                 Female                 282         89.8
                                                 Other                  3           0.9
Age                                              <=18                   56          17.8
                                                 19                     90          28.7
                                                 20                     111         35.4
                                                 21                     28          8.9
                                                 22                     11          3.5
                                                 >22                    18          5.7
Departments of Education and Pedagogical Studies Preschool Education    190         60.5
                                                 Philosophy             82          26.1
                                                 Philology              42          13.4
Table 2. Students’ Academic Practices (n = 231).

Task                                                           Never    Very Rarely   Rarely   Often    Very Often
Text translation                                               38.1%    16.9%         26.0%    18.2%    0.9%
Text paraphrasing                                              26.4%    18.6%         22.1%    29.4%    3.5%
Literature processing (e.g., summarizing scientific articles)  24.7%    16.5%         26.0%    26.4%    6.5%
Data analysis                                                  13.9%    15.2%         20.3%    45.0%    5.6%
Information searching                                          2.2%     4.3%          13.4%    46.8%    33.3%
Study advice                                                   15.6%    5.6%          19.5%    39.4%    19.9%
Exam preparation                                               20.8%    13.0%         20.8%    31.2%    14.3%
Help with references                                           27.3%    19.0%         23.4%    22.1%    8.2%
Writing assignments                                            20.8%    19.5%         25.5%    22.9%    11.3%
Table 3. Benefits Identified by Students in the Use of GenAI tools (n = 314).

Benefit category (codes in parentheses): frequency of mentions (percentage of respondents)
1. Information/Answer Finding (easiness, immediacy, big volume, info variety, new knowledge, sources): 160 (50.96%)
2. Time Saving/Speed (save time, fast completion, quick answers): 90 (28.66%)
3. Understanding and Simplification of Complex Concepts (content understanding, simplification of difficult concepts, explanation with accuracy, translanguaging): 65 (20.70%)
4. Help with Assignments and Research (task completion, writing of essays, research references, brainstorming): 60 (19.11%)
5. Study/Note Organization (focus aid, planning of tasks, note keeping, organizing content and curriculum): 40 (12.74%)
6. Summarizing/Analyzing/Translating Texts (abstract, summary, translation, content analysis): 35 (11.15%)
7. Idea Generation and Inspiration (inspiration, guidelines, content generation): 30 (9.55%)
8. Critical Thinking and Skills (vocabulary enrichment, cognitive development, thinking ability): 25 (7.96%)
9. Problem Solving/Clarification of Doubts (answer difficult questions/exercises/quizzes): 20 (6.37%)
10. Performance/Effectiveness Improvement (promote learning, task completion, better performance): 15 (4.78%)
11. Familiarization with Technology and New Tools (development of technological skills, digital literacy): 10 (3.18%)
Table 4. Challenges Identified by Students in the Use of GenAI tools (n = 314).

Challenge category (codes in parentheses): frequency of mentions (percentage of respondents)
1. Over-reliance/Reduced Personal Effort (convenience, laziness, easy solutions, lack of effort, stop thinking, dependence, loss of critical thinking, ensconcing): 180 (57.32%)
2. Plagiarism/Copying/Academic Integrity Issues (cheating, human replacement, lack of understanding, same answers): 150 (47.77%)
3. Erosion of Critical Thinking/Deep Learning (passive learning, uncritical knowledge, reduced skills, cognitive hibernation): 140 (44.59%)
4. Diminished Research Skills (less research/cross-checking) (not crosschecking sources, reduced research, no filtering of information, lack of reliability/diversity, monotony): 110 (35.03%)
5. Inaccuracy, Misinformation, Reliability Concerns (wrong results, invalid sources, unreliability, untrustworthiness, missing information): 105 (33.44%)
6. Reduced Creativity and Writing Skills (loss of imagination and creativity, loss of human workforce and worth, inability to produce written word): 95 (30.25%)
7. Improper Use/Misuse—Overreliance on a Single Source (fraudulent use, dishonesty, unfiltered acceptance of results, incorrect application): 90 (28.66%)
8. Incomplete Development of Skills (cognitive and non-cognitive) (lack of research skills, reduced academic performance, lack of spherical knowledge/thinking/understanding, problem-solving deficiency, lack of logical reasoning, weakened experiences): 70 (22.29%)
9. Technology Dependence/Addiction (addiction to easiness, inability to function without AI): 60 (19.11%)
10. Inequalities and Assessment/Grading Issues (effortless success, difficulty to prove one’s credentials, reduced trust in the university and students’ degrees): 35 (11.15%)
