Article

Effects of Customized Generative AI on Student Engagement and Emotions in Visual Communication Design Education: Implications for Sustainable Integration

School of the Arts, Kyungpook National University, Daegu 37224, Republic of Korea
*
Author to whom correspondence should be addressed.
Sustainability 2025, 17(22), 9963; https://doi.org/10.3390/su17229963
Submission received: 11 October 2025 / Revised: 5 November 2025 / Accepted: 5 November 2025 / Published: 7 November 2025

Abstract

Generative Artificial Intelligence (GAI) is advancing rapidly and is increasingly integrated into visual communication design education. How to effectively and sustainably leverage GAI to support visual communication design teaching has thus become a critical issue faced by educators. While prior studies have focused on GAI’s impact on student learning outcomes and creativity, limited research has explored its effects on emotions and student engagement. This study aims to investigate the impact of customized GAI integration on visual communication design students’ learning engagement and to qualitatively explore the emotions that occur throughout the learning process. Using a quasi-experimental design, 96 students were randomly assigned to either a control group using traditional instruction or an experimental group using a customized GAI. Student engagement was measured using pre- and post-assessment scales, and semi-structured interviews were conducted to analyze students’ emotional changes. The results show that customized GAI integration effectively enhanced students’ cognitive, emotional, and behavioral engagement. Moreover, students experienced diverse and dynamic emotions during the learning process, which influenced their engagement. This study provides empirical support for the application of GAI in visual communication design education, highlighting the importance of balancing technology integration with emotional regulation, thereby informing the responsible and sustainable integration of GAI in design education.

1. Introduction

Technological innovation is regarded as a key driving force for achieving sustainable development in education [1]. In this study, “sustainable integration” refers to pedagogical implementation models that maintain educational effectiveness over extended time periods, support continued student engagement without fostering detrimental dependencies, and can be realistically maintained within institutional resource constraints. As an emerging technology, Generative Artificial Intelligence (GAI) has not only reshaped the tools and processes of visual communication design but also offers new possibilities for building a more sustainable and equitable model for design education [2,3]. GAI applications such as ChatGPT, Midjourney, and Stability AI have been integrated into various tasks in visual communication design education. For example, they are used for conceptualization and creative expression in graphic design instruction [4], for creative exploration in 3D animation design [5], and for needs analysis and creative visualization in advertising design, thereby supporting the development of students’ creativity and problem-solving abilities [6].
While these applications highlight the potential for integrating GAI into visual communication design education, whether technological tools can maintain long-term teaching effectiveness largely depends on student engagement [7,8,9]. Student engagement is conceptualized as a multidimensional construct that includes cognitive, behavioral, and emotional components [10,11]. It serves as a prerequisite for effective teaching and learning [12] and significantly influences students’ learning outcomes and professional competency development [13,14]. However, the impact of GAI on student engagement is multifaceted, encompassing both positive and negative effects. The literature indicates that GAI creates more intelligent and personalized learning environments, which helps enhance students’ learning motivation and confidence [15,16,17] and encourages students to participate more actively in the learning process [18]. However, GAI may also trigger “learning disengagement” problems. Some students may become overly reliant on Artificial Intelligence to complete their tasks, which may compromise their autonomy and learning sustainability [19,20].
Given that student emotions are closely related to their engagement in learning [21,22,23], it is crucial to examine students’ emotional responses to using GAI. Positive emotions, such as excitement, pleasure, and enjoyment, enhance learning motivation and increase student engagement [24]. Conversely, negative emotions such as anxiety and frustration may reduce cognitive efficiency and impede student engagement [10,25]. Existing research has demonstrated that the integration of GAI may elicit a range of positive and negative emotions in instructional contexts [26]. For instance, the interactive and innovative learning experiences provided by GAI may generate pleasure, confidence, and learning interest among students, while technological challenges and uncertainty may induce anxiety, doubt, and stress [27,28]. Given these widespread effects, it is reasonable to speculate that the integration of GAI into visual communication design education may elicit positive emotional experiences that promote student engagement while also potentially triggering negative emotions that lead to student disengagement. Moreover, as a discipline that heavily relies on creative thinking, visual expression, and design practice, visual communication design has a distinctive learning process. Students may exhibit unique emotional responses and engagement patterns when using GAI. With the widespread integration of GAI into visual communication design education, cultivating design students’ ability to use AI tools critically and to engage in sustainable learning has become a key educational mission. Therefore, it is necessary to understand whether and how GAI affects student engagement to develop sustainable models of design education.
Despite growing research on AI integration in teaching environments, most studies have centered on general educational contexts. Empirical research specifically examining visual communication design education remains notably limited [29]. Currently, design scholars have primarily focused on the pedagogical integration of general text-to-image AI, such as explorations of human-AI collaborative models based on Midjourney and DALL·E [30,31] and pedagogical application methods [32,33]. Furthermore, a small body of research has examined the impact of general-purpose text-to-image AI tools applied in the design ideation phase on outcome variables such as student learning outcomes, creativity, and self-efficacy [34,35,36]. Despite these efforts, little is known about how GAI influences process variables in visual communication design education, such as student engagement and emotional experiences. This largely limits our understanding of how GAI can be implemented sustainably and effectively in visual communication design education. Moreover, existing studies rarely examine customized GAI, which addresses common shortcomings of generic tools, such as severe creative homogenization, insufficient standardization, and uncontrollable variations, thereby offering long-term practical advantages.
Therefore, unlike previous research that has focused on general GAI, this study employed customized GAI as the main instructional intervention to examine its effects on student engagement and emotions. We conducted a six-week teaching experiment at a university in northern China, employing a mixed-methods approach that combined quantitative and qualitative analyses to address three core research questions: RQ1: Compared with traditional teaching strategies, does customized GAI effectively enhance student engagement across cognitive, emotional, and behavioral dimensions? RQ2: What kinds of emotions do students experience when using customized GAI in design learning? RQ3: What is the relationship between student emotions and engagement in instructional contexts integrating customized GAI? This study enriches the discourse on educational technology and provides valuable insights for educators and researchers seeking to achieve sustainable integration of Artificial Intelligence in visual communication design education.

2. Literature Review

2.1. Theoretical Framework

Control-Value Theory (CVT) emphasizes that emotions arising during the learning process primarily stem from learners’ perceptions of control and value [21]. Specifically, control refers to students’ perceived ability to complete learning tasks, namely their belief that they can effectively manage the learning process. Value denotes students’ appraisal of the significance they attribute to the learning tasks. These two appraisal processes interact to generate emotions that subsequently influence students’ learning motivation and engagement. Furthermore, CVT posits that the social environment in which students are situated exerts a significant influence on their academic emotions, which, in turn, affect student learning behaviors and performance through direct and indirect pathways. This interactive relationship demonstrates that academic emotions play a pivotal role in stimulating students’ interest and enhancing their engagement [21,37].
This study uses a customized GAI to create an innovative learning environment encompassing design analysis, personalized generation, creative extension, and solution optimization and evaluation. This environment offers personalized and controllable learning experiences that enhance students’ sense of control and subjective value regarding learning tasks, thereby fostering the development of positive emotions [38]. When students experience high levels of control and value in a GAI-integrated learning environment, they are more likely to develop positive academic emotions, such as curiosity, satisfaction, and confidence, which in turn enhance their engagement. Conversely, when students face challenges such as technological dependence, uncertainty in generated outcomes, or comprehension biases in interacting with the system, their perceived control and value of learning tasks may be diminished, triggering negative emotions and reducing participation in learning [39].
Specifically, the customized GAI system employed in this study operationalizes CVT’s control dimension through three specific mechanisms: (1) Lora training enables students to directly influence model outputs through dataset curation, enhancing procedural control over the creative process; (2) parameter adjustment capabilities provide moment-to-moment control during generation, allowing students to refine outputs in real-time; and (3) iterative feedback loops allow students to evaluate and regenerate content, supporting metacognitive control. Similarly, the value dimension is addressed through the system’s capacity to produce outputs aligned with students’ personal aesthetic preferences and design objectives, enhancing task value through perceived relevance and utility. Overall, CVT provides a robust framework for understanding emotional changes induced by a customized GAI and for promoting positive emotional development and sustained engagement.

2.2. Artificial Intelligence and Student Engagement

Engagement is defined as the extent to which students participate in instructional activities, and it plays a critical role in effective learning. In the context of AI-enhanced learning, engagement is typically assessed by observing students’ involvement in relevant tasks [12,34,40]. Liu et al. conceptualized academic engagement as a multifaceted construct comprising cognitive, emotional, and behavioral dimensions, all of which can be activated through the design of instructional activities [41]. In the digital technology era, student engagement has undergone significant changes, extending beyond knowledge acquisition to participation, collaboration, and practical application in meaningful contexts [42]. GAI provides personalized learning environments that enable students to explore complex theories and solutions in a more meaningful way [43]. GAI offers many advantages in education, such as enhancing learner engagement, promoting collaboration, and improving accessibility, but it may also raise various problems and concerns, particularly regarding copyright, data security, and privacy [44].
Liu et al. were among the first to confirm the efficacy of GAI in improving student engagement in design thinking teaching environments [41]. Their study included two groups, with the experimental group receiving instruction supported by a GAI-enhanced collaborative whiteboard (GAICW). The results showed that the GAICW significantly improved student engagement. Furthermore, scholars have extensively studied the impact of GAI on student engagement in other higher education fields. For example, Cao et al. analyzed the impact of AI tools on the engagement levels of 472 Chinese students [45]. Their findings revealed significant improvements in classroom participation following GAI integration in the learning process. Owoseni et al. investigated GAI applications for enhancing student engagement, demonstrating that GAI can foster more inclusive and participatory learning environments through the provision of personalized learning materials, adaptive curricular pathways and interactive educational tools [46]. While these studies examined general AI applications rather than specifically generative AI, their findings regarding personalized learning environments and engagement enhancement provide a reasonable foundation for anticipating similar effects in GAI-enhanced contexts.

2.3. GAI in Visual Communication Design Education

Visual communication design addresses human needs at the informational, aesthetic, and cultural levels through creative thinking and practice [47]. However, higher education faces significant challenges in this field, particularly within large class teaching environments, where resource constraints limit educators’ ability to provide personalized materials and instruction. Traditional classroom practices typically employ standardized pedagogical approaches and uniform assessment systems, which inadequately address students’ diverse capabilities, learning trajectories, and interests, thereby constraining the cultivation of their creative potential. With the widespread adoption of GAI across the design industry, it has emerged as a transformative tool in visual communication design education [32], catalyzing significant shifts in pedagogies and learning experiences [2].
In recent years, scholarly discourse has increasingly focused on integrating GAI tools, such as ChatGPT, Midjourney, and DALL·E, into visual communication design pedagogy to foster creativity and sustainability in design [30]. For instance, Vartiainen and Tedre explored pathways for applying Midjourney to specific teaching processes through design workshops [4]. Lively, Hutson, and Melick integrated DALL·E into interface design curricula, enabling students to generate ideas and visual elements more effectively [48]. Li proposed a collaborative teaching model that incorporates Stable Diffusion into visual communication design education, establishing new pedagogical paradigms for the field [49]. Furthermore, Chen et al. examined the application of ChatGPT, DALL·E, and Midjourney as visual brainstorming tools, exploring both the implementation of GAI in creative design courses and the dynamics of human-AI collaborative frameworks [31].
Furthermore, multiple studies have demonstrated the potential impact of integrating GAI into visual communication design education. Sáez-Velasco examined both educators’ and students’ perceptions of Midjourney as a supportive tool for illustration generation, revealing that GAI enhances the quantity of conceptual ideas and the diversity of innovative outputs [20]. In a complementary study, Kicklighter et al. employed image-based GAI tools to support students’ creative endeavors in 3D animation design coursework. Their findings indicate that GAI usage provides students with expanded conceptual frameworks and inspirational resources, thereby enhancing their creative capacities [5]. Similarly, Chen et al. investigated the pedagogical effects of integrating ChatGPT into visual communication design instruction and demonstrated that GAI implementation improved both student learning outcomes and self-efficacy [34].
While these studies demonstrate GAI’s potential impact on outcome variables, they share a common limitation: the focus on generic, publicly available GAI tools (ChatGPT, Midjourney, DALL·E) used as creative aids. This approach overlooks a critical pedagogical concern—that generic GAI tools, while accessible, often produce homogenized outputs and lack the specificity required for achieving educational objectives in design education. Furthermore, the absence of process-oriented research examining how students engage with and emotionally respond to GAI during learning leaves a significant gap in our understanding of sustainable integration models. These gaps motivate the current study’s focus on customized GAI systems and process variables including engagement and emotions. To address these research gaps, this study developed a customized GAI system that integrated ChatGPT-4 with Lora-trained Stable Diffusion and examined its effects on student engagement and emotional experiences through pedagogical experiments.

3. Methodology

This study employed a sequential explanatory mixed methods design to evaluate the impact of customized GAI on student engagement in visual communication design education and to explore the relationship between emotions and engagement [50] (see Figure 1). Following this approach, quantitative data were initially collected and analyzed, followed by the collection of qualitative data through semi-structured interviews to further explain these findings.

3.1. Participants

This study recruited 96 third-year undergraduate students majoring in visual communication design from a Chinese university. The participants’ ages ranged from 19 to 21 years. During participant recruitment, a brief questionnaire was used to confirm that none of the students had previously attended GAI-integrated design courses or used customized GAI platforms. About 90.6% of the participants reported occasional use of general AI tools (e.g., ChatGPT or image generators) for non-design purposes such as casual conversation or simple image creation, while the remaining 9.4% had no prior experience with such tools. Accordingly, all participants were classified as novices in AI-assisted design practice. Participants were randomly assigned to either the control (n = 48) or experimental group (n = 48). The experimental group comprised 23 males and 25 females (mean age = 20.23, SD = 0.93), whereas the control group consisted of 22 males and 26 females (mean age = 20.31, SD = 0.91).

3.2. Instruments

3.2.1. Student Engagement Scale

The student engagement survey was adapted from a previous study by Reeve and Tseng and included 17 items measuring cognitive, emotional, and behavioral engagement in the classroom [12]. Quantitative data were collected using a 5-point Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree). The representative items for each dimension included: “During the design creation process, I reflect on my design thinking and creative approaches” (cognitive engagement), “I feel excited and delighted during the design creation process” (emotional engagement), and “I actively seek advice from instructors on design-related matters” (behavioral engagement). The three dimensions demonstrated high internal consistency reliability, with Cronbach’s α coefficients of 0.89, 0.90, and 0.87, respectively [51].
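To make the reliability reporting concrete, the sketch below shows how Cronbach’s α can be computed from an item-level response matrix; the function implements the standard formula, while the 48 × 6 response matrix of random scores is purely hypothetical and is not the study’s data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of Likert scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the subscale
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical example: 48 respondents answering the 6 cognitive-engagement items (1-5 Likert)
rng = np.random.default_rng(0)
cognitive_items = rng.integers(1, 6, size=(48, 6))
print(f"Cronbach's alpha (cognitive engagement): {cronbach_alpha(cognitive_items):.2f}")
```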

3.2.2. Semi-Structured Interview

To explore participants’ perceptions of how emotions influenced their engagement and to supplement the quantitative findings, semi-structured interviews were conducted with the experimental group. The interview guide was developed by the research team and subsequently reviewed by three experts, whose feedback was incorporated into the final version. After the review, one question was removed, resulting in a final set of five open-ended questions, for example: “Can you describe your emotional experience throughout the learning process? What emotional changes did you notice?” Randomly selected students from the experimental group were interviewed individually, with probing and clarifying questions used to facilitate bidirectional interaction with the participants. Participants were encouraged to ask questions whenever clarification was needed. Each interview lasted approximately 35 min and was conducted face-to-face, with all sessions audio-recorded using digital recording equipment.

3.2.3. Customized GAI Tools

This course aimed to establish an innovative practical platform for visual communication design education through the application of customized GAI. The customized GAI employed in this study integrated text analysis and image generation modules. Through systematic parameter constraints—including style weight adjustments (0.5–1.5 range, set at 0.8 for instructional purposes), semantic token prioritization, and negative prompt filtering—and deliberate workflow design, the customized GAI established comprehensive pedagogical collaboration. The text analysis module, built upon GPT-4, primarily performed tasks including design concept interpretation, cultural connotation comprehension, prompt generation, and design solution evaluation, thereby providing textual analysis and knowledge-based instructional support for design education. The image generation module, based on Stable Diffusion 3.5 enhanced with Lora models, not only enabled students to achieve precise creative expression and efficient output but also generated optimized design iterations based on feedback. This approach addresses the core challenges commonly observed in current GAI-integrated design practices, including severe creative homogenization, insufficient standardization, and uncontrollable variations.
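As a rough illustration of how the two modules can be chained in practice, the sketch below pairs a GPT-4 prompt-refinement call with a Stable Diffusion pipeline carrying a student-trained Lora adapter (using the OpenAI and Hugging Face diffusers libraries). The model identifiers, adapter path, prompts, and the 0.8 style weight passed as the Lora scale are illustrative assumptions rather than the authors’ actual configuration.

```python
# Illustrative sketch only; model names, paths, and prompts are placeholders.
import torch
from openai import OpenAI
from diffusers import StableDiffusionPipeline

client = OpenAI()

# Text analysis module: turn a design brief into an image-generation prompt.
brief = "Bauhaus-inspired poster exploring 20th-century European design history"
chat = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {"role": "system", "content": "You write concise Stable Diffusion prompts for poster design."},
        {"role": "user", "content": f"Design brief: {brief}. Return a single prompt."},
    ],
)
prompt = chat.choices[0].message.content

# Image generation module: Stable Diffusion with a student-trained Lora adapter.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")
pipe.load_lora_weights("./lora/student_style")  # hypothetical adapter path
image = pipe(
    prompt,
    negative_prompt="blurry, low quality, watermark",  # negative prompt filtering
    cross_attention_kwargs={"scale": 0.8},             # style weight within the 0.5-1.5 range
    num_inference_steps=30,
).images[0]
image.save("poster_draft.png")
```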
Prior to the experiment, the students were required to conduct Lora training based on the selected datasets under the instructor’s guidance. The core principle of the Lora method lies in training only additional low-rank matrices while keeping the main parameters of the pretrained large model frozen, thereby enabling rapid adaptation to specific style-semantic image generation requirements under few-shot conditions [52]. This training workflow encompassed key steps, including image material collection and preprocessing, model parameter configuration, and fine-tuning training (see Figure 2), providing students with the technical foundation to construct generative models aligned with their personalized visual language. In this instructional example, the Lora model was fine-tuned based on the Stable Diffusion v1.5 base model and executed on a local system equipped with an NVIDIA GeForce RTX 4060 Ti (16 GB). Twenty images were selected as training data and uniformly resized to 512 × 512 pixels. The training of a single Lora model took approximately 40 min, with the following hyperparameters: repeat = 10, epoch = 10, batch size = 1, and learning rate = 0.0001. This configuration balances image quality and training efficiency, making the procedure feasible and accessible on hardware commonly available in educational institutions. Regarding copyright compliance, all training images were either sourced from publicly available datasets with appropriate licenses or created with authorization, ensuring adherence to copyright regulations.
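The reported hyperparameters also determine the length of a training run: 20 images × 10 repeats × 10 epochs at batch size 1 yields 2000 optimization steps. The minimal sketch below records those settings in a configuration object and derives the step count; the field names follow common kohya-style Lora trainers and are assumptions, not the authors’ exact training script.

```python
from dataclasses import dataclass

@dataclass
class LoraTrainingConfig:
    # Settings reported in the text; field names mirror common kohya-style trainers.
    base_model: str = "stable-diffusion-v1-5"
    num_images: int = 20          # training images, resized to 512 x 512
    resolution: int = 512
    repeat: int = 10              # repeats per image per epoch
    epochs: int = 10
    batch_size: int = 1
    learning_rate: float = 1e-4

    @property
    def total_steps(self) -> int:
        return self.num_images * self.repeat * self.epochs // self.batch_size

cfg = LoraTrainingConfig()
print(f"Total optimization steps: {cfg.total_steps}")  # 20 * 10 * 10 / 1 = 2000
```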

3.3. Course Design

This study implemented a six-week design workshop for the 96 students, themed “Visual Design and Culture.” Students were required to complete poster designs on selected topics related to the 20th-century European design history. The curriculum was structured around the Double Diamond model to guide the entire design pedagogy [53]. The course was conducted from June to July 2025 and comprised four phases: discover, define, develop, and deliver (see Figure 3). To minimize potential confounding factors and establish robust causal relationships, the teaching team maintained uniform control over class duration, instructional content, and assignment standards across the experimental and control groups. Each group received six hours of in-class instructional support per week, with the same instructor teaching both groups to ensure consistency in controlled variables. The instructor had previously taught a semester-long AI-integrated visual design course and was an active researcher in AI-assisted design. This experience ensured a high level of familiarity with GAI-based teaching tools, thereby supporting the accurate and consistent implementation of the instructional intervention.
During the course, teachers guided students through the progressive development of poster design tasks, introducing GAI at critical junctures to enhance design analysis and creative expression capabilities (see Figure 4). In the “Discovery phase” (Week 1), students conducted cultural research and analysis, focusing on 20th-century European visual design history, identifying key design movements, representative figures, and their cultural contexts. The control group primarily engaged in material analysis through consultations of print and digital literature. The experimental group used GPT-4 to facilitate learning and material analysis.
Upon entering the define phase (Weeks 2–3), the control group followed traditional design processes, with students independently analyzing first-phase content and developing design positioning statements and requirement specifications. The experimental group engaged in interactive discussions about design culture, style, and visual expression through GPT-4, expanding design thinking through human-AI dialogue and establishing design directions. Subsequently, the research team introduced a customized Lora training workflow. This process employed Stable Diffusion as the image generation framework and performed fine-tuning aligned with the stylistic directions determined by students during the define phase. Under the instructor’s guidance, students collected relevant images and visual elements, integrated semantic labels to construct training datasets, and completed the deployment of targeted Lora fine-tuned models.
During the develop phase (Weeks 4–5), the control group engaged in hand-drawn sketch conceptualization based on the preliminary research findings and established themes. The students in the experimental group synthesized design background information, combining their understanding, experience, inspiration, and logical reasoning to formulate insights. These insights were subsequently articulated in a detailed textual form and iteratively refined through dialogue with GPT-4 to optimize the prompts, ultimately assisting in the creation of precise prompts for Stable Diffusion and generating preliminary design sketches.
During the deliver and presentation phase (Week 6), both groups were required to complete their final poster designs and compose the corresponding design rationales. The control group refined their work through traditional feedback and revision processes, while the experimental group utilized GPT-4 to assist in developing creative statements and incorporated feedback from the generated images to further optimize the design content, thereby enhancing the clarity and innovation of the design expression.
The overall instructional design aims to address current challenges in integrating GAI into design education, namely reducing the complexity of prompt engineering to enhance students’ mastery of tools such as Stable Diffusion, and clarifying students’ role as “lead designers,” with GAI serving only as an auxiliary tool to prevent over-reliance. To support this, several instructional strategies were implemented. The course adopted a tiered prompt-teaching approach, progressing from simple keyword generation to multi-constraint prompts, and provided a library of ready-to-use templates to ease the initial learning burden. Students were also required to submit sketches or concept boards before AI generation, specifying the design theme, function, target audience, and reference colors and materials, to ensure that AI functioned as an aid rather than a replacement. During the AI generation process, there were no explicit limits on the number of iterations; however, students were required to document each prompt and provide written explanations detailing the rationale and modifications for every iteration. Through GPT-4’s conversational guidance, Lora fine-tuning for visual generation, and multi-stage iterative feedback, this study explores deep integration models of customized GAI in visual communication design education and assesses their impact on supporting students’ understanding of design culture, developing creative expression, and enhancing learning engagement.

3.4. Experimental Procedure

Before the experiment began, the researchers provided participants with a brief introduction to the study’s purpose and procedures and obtained informed consent. Participants were randomly assigned to either the experimental or control group. Questionnaires were administered to establish baseline measures of academic engagement (see Figure 5). Students in the experimental group received one week of training on customized GAI tools before the formal coursework commenced. The training curriculum encompassed operational methods for ChatGPT and Stable Diffusion, prompt engineering techniques, and technical components, including Lora training, parameter adjustment, image generation, and optimization. Upon completion of each training module, students were required to complete practice assignments to ensure proficiency. Recognizing individual differences in students’ learning curves with GAI tools, the experimental design employed a progressive strategy. During the initial phase, emphasis was placed on guiding students to use GPT-4 to organize creative logic and design thinking processes. The develop phase introduced Stable Diffusion image generation tools to support visual exploration and solution presentation. Instructors provided targeted feedback based on the AI-generated content and adjusted their teaching strategies accordingly.
The control group employed traditional design pedagogy throughout the six-week period. During the final week of the experiment, both groups presented their design outcomes and completed the post-intervention academic engagement questionnaires. Following the experiment, the researchers randomly selected 25 students from the experimental group for semi-structured interviews to explore their affective experiences and factors influencing engagement. After the interviews, the researchers recorded and verified the data for consistency and relevance. Subsequently, the data were entered into Excel files according to the study’s overall objectives.

3.5. Data Analysis

Collins, Onwuegbuzie, and Sutton noted that mixed methods can effectively assess the implementation fidelity of interventions, enrich quantitative findings with qualitative insights, and enhance the meaningfulness of intervention effects [54]. Therefore, this study employed a sequential explanatory mixed-methods design. In the quantitative analysis phase, the data were first cleaned and organized, and the normal distribution was confirmed using the Kolmogorov–Smirnov test. Second, descriptive statistics were used to summarize the participants’ pre- and post-test results. Third, paired-samples t-tests were conducted to analyze significant changes in mean engagement scores between pre- and post-test measurements within the experimental group. Finally, analysis of covariance (ANCOVA) was employed to control for covariates and examine the impact of customized GAI integration on student engagement.
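For readers who wish to reproduce this analysis sequence, the sketch below runs the normality check, paired-samples t-test, and ANCOVA with SciPy and statsmodels; the CSV file and column names are hypothetical placeholders, not the study’s dataset.

```python
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Hypothetical data layout: one row per student with group membership and
# pre-/post-test engagement scores (e.g., pre_cognitive, post_cognitive).
df = pd.read_csv("engagement.csv")
exp = df[df["group"] == "experimental"]

# 1. Kolmogorov-Smirnov test against a standard normal (scores standardized first).
z = (exp["post_cognitive"] - exp["post_cognitive"].mean()) / exp["post_cognitive"].std(ddof=1)
print(stats.kstest(z, "norm"))

# 2. Paired-samples t-test on pre- vs. post-test scores within the experimental group.
print(stats.ttest_rel(exp["post_cognitive"], exp["pre_cognitive"]))

# 3. ANCOVA: post-test score by group, with the pre-test score as the covariate.
model = smf.ols("post_cognitive ~ C(group) + pre_cognitive", data=df).fit()
print(anova_lm(model, typ=2))
```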
In the qualitative analysis phase, Braun and Clarke’s systematic data analysis approach was followed to explore the relationship between participants’ emotional changes and their engagement. Coding was conducted using MAXQDA 2023. Initially, the researchers systematically generated codes based on data characteristics, which were subsequently consolidated into potential themes. To minimize coding bias, an external doctoral student in education who was unaffiliated with the research team conducted a blind review [55]. The reviewer was unaware of the research hypotheses, the assignment of participants to the experimental or control groups, and the researchers’ preliminary coding, but was informed of the data sources, analysis objectives, and coding procedures. Data relevant to each theme were then organized, and the relevance of these themes to the coded segments was examined before appropriate names were assigned. Finally, similar themes were categorized into thematic groups for analysis. To ensure coding consistency, two researchers conducted independent coding, achieving a Cohen’s Kappa coefficient of 0.821, indicating high inter-rater reliability.
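Inter-rater agreement of the kind reported here (Cohen’s Kappa = 0.821) can be verified directly from the two coders’ label assignments, as in the small sketch below; the example labels are hypothetical.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical code assignments by the two independent coders for the same segments.
coder_1 = ["excitement", "stress", "curiosity", "doubt", "satisfaction", "stress"]
coder_2 = ["excitement", "stress", "curiosity", "worry", "satisfaction", "stress"]
print(f"Cohen's Kappa: {cohen_kappa_score(coder_1, coder_2):.3f}")
```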

4. Results

4.1. Student Engagement Results

ANCOVA was conducted to examine the impact of integrating customized GAI into visual communication design education on student engagement. Prior to this analysis, the Kolmogorov–Smirnov test was employed to assess the normality of the collected data (see Table 1). Under pre-test conditions, all dimensions for both the experimental and control groups demonstrated non-significant p-values in the Kolmogorov–Smirnov test (p > 0.05), indicating a normal data distribution. Similarly, under post-test conditions, both groups exhibited non-significant p-values in the normality test (p > 0.05), confirming that the collected data followed a normal distribution. Furthermore, Levene’s test confirmed homogeneity of variance across groups (p > 0.05), and the homogeneity of regression slopes assumption was met (p > 0.05), indicating that ANCOVA was appropriate.
Subsequently, descriptive statistics were used to summarize the students’ pre- and post-test results (see Table 2). In the experimental group, the pre-test mean scores for behavioral engagement (BE, 6 items), emotional engagement (EE, 5 items), and cognitive engagement (CE, 6 items) were 3.911 (SD = 0.528), 3.792 (SD = 0.486), and 3.757 (SD = 0.541), respectively, while the post-test means were 4.198 (SD = 0.528), 4.188 (SD = 0.486), and 4.229 (SD = 0.541). In the control group, the pre-test means were 3.865 (SD = 0.491), 3.776 (SD = 0.457), and 3.791 (SD = 0.469), and the post-test means were 3.948 (SD = 0.491), 3.734 (SD = 0.457), and 3.753 (SD = 0.469), respectively. The change scores and their 95% confidence intervals revealed distinct patterns between groups. The experimental group exhibited statistically significant improvements across all three dimensions (Δ = 0.286–0.472, all 95% CIs excluded zero), with cognitive engagement showing the largest gain (Δ = 0.472, 95% CI [0.386, 0.558]). In contrast, the control group demonstrated minimal change in post-test engagement mean scores (Δ = −0.042 to 0.083, all 95% CIs included zero), remaining essentially at the baseline levels.
Subsequently, paired-samples t-tests were conducted to examine whether the differences between students’ pre- and post-test scores were statistically significant. As shown in Table 3, behavioral engagement mean scores increased from 3.911 in the pre-test to 4.198 in the post-test, a statistically significant mean difference of 0.286 (t = 4.737, p < 0.001). Emotional engagement scores rose from 3.792 in the pre-test to 4.188 in the post-test, a statistically significant mean difference of 0.396 (t = 6.147, p < 0.001). Similarly, cognitive engagement increased significantly from 3.757 in the pre-test to 4.229 in the post-test, a mean difference of 0.472 (t = 10.752, p < 0.001).
Table 4 further presents the paired-samples t-test results for the academic engagement pre- and post-test scores in the control group. Behavioral and cognitive engagement showed slight increases, while emotional engagement demonstrated a slight decrease; however, none of these changes were statistically significant.
These results support the effectiveness of customized GAI integration in enhancing students’ behavioral, emotional, and cognitive engagement, as evidenced by the low p-values obtained. The absence of significant changes in the control group further strengthens the conclusion that the improvements in the experimental group were more likely attributable to the customized GAI intervention itself rather than external confounding factors.
Additionally, a one-way ANCOVA was conducted to examine the unique contribution of customized GAI-integrated visual design education to student engagement. The results presented in Table 5 reveal significant differences between the groups in behavioral engagement, with the experimental group outperforming the control group (F [1, 93] = 16.13, p < 0.001, η2p = 0.148). The results in Table 6 indicate significant differences between the groups in emotional engagement, with the experimental group demonstrating superior performance compared to the control group (F [1, 93] = 14.74, p < 0.001, η2p = 0.116). The results in Table 7 show significant differences in cognitive engagement between groups, with the experimental group outperforming the control group (F [1, 93] = 21.04, p < 0.001, η2p = 0.185).

4.2. Emotional Analysis Results

4.2.1. Types of Customized GAI-Induced Emotional Experiences

To gain deeper insights into students’ emotional experiences during the learning process, the researchers conducted semi-structured interviews with students in the experimental group. Through thematic analysis, the following results were obtained (see Table 8): students experienced a range of positive and negative emotions during their interactions with GAI. Positive emotional responses yielded 235 codes (67.7% of the total), while negative emotional responses accounted for 112 codes (32.3% of the total), resulting in a positive-to-negative emotion ratio of approximately 2.1:1. Overall, emotional orientation demonstrated a positive tendency. Regarding positive emotional experiences, the expressions exhibited rich hierarchical characteristics. These can be categorized into nine core dimensions: excitement (38), satisfaction (32), curiosity (29), pleasure (28), enjoyment (26), confidence (25), surprise (23), pride (19), and anticipation (15). Among these, “excitement” was the most prominent, receiving 38 codes (16.2% of total positive emotions). “Satisfaction” followed closely, with 32 codes (13.6%). Negative emotional experiences were categorized into eight core dimensions: stress (26), doubt (22), worry (18), confusion (15), anxiety (12), frustration (10), fear (5), and disappointment (4). Among these, stress (23.2%), doubt (19.6%), and worry (16.1%) were the most dominant.

4.2.2. Sources of Positive Emotions

To provide supplementary evidence for the codes, we conducted an in-depth analysis of the sources of positive emotions and identified four primary themes that encompassed 173 codes. These themes revealed the core factors underlying the positive emotions elicited by customized GAI (see Figure 6).
Enhancing Design Creation Efficiency (57 codes). This theme reflects the impact of customized GAI on improving students’ understanding of design background knowledge and their efficiency in visual creative expression. It comprises four categories: improving design analysis efficiency (19 codes), rapid content generation (17 codes), reducing entry barriers (14 codes), and optimizing cognitive resource allocation (7 codes). For example, Student S07 noted that the customized GAI helped him extract key information from complex background materials and transform abstract cultural concepts in his mind into concrete visual elements. Another student S04 emphasized that, compared with traditional sketching and 3D modeling software, Stable Diffusion is much easier to operate and significantly lowers the entry barrier, enabling him to manage the entire design process more effectively while gaining greater inspiration and creativity. Student S12 explained, “When creating a poster draft now, I no longer rely on the tedious process of hand drawing and repeated revisions. With GPT-4 and a trained Lora model, I can generate several personalized design options within minutes”. Additionally, Students S12 and S23 further commented, “GPT helps us handle complex and repetitive tasks, allowing us to focus more on creative thinking and design innovation”.
Stimulating Design Innovation Thinking (42 codes). This theme demonstrates the value of customized GAI in expanding students’ creative thinking. It encompasses two categories: cross-cultural creative fusion (22 codes) and breaking conventional thinking (20 codes). For instance, Student S17 noted that by training a Lora model integrating elements of traditional Chinese culture and modern design styles, he generated unprecedented combinations of cultural motifs, which sparked design inspiration. Student S19 further demonstrated the integration of visual storytelling with unique narrative expression: “I first developed a story using GPT-4, then imported the generated prompt into my newly trained art-style model, which produced unexpected visual effects. This combination of visual and narrative storytelling significantly enhanced my creative capacity in design”.
Professional Benefits of Customized GAI (40 codes). This theme reflects how the professional advantages of customized GAI stimulated students’ learning confidence and sense of pride. It comprises four categories: precise control (15 codes), personalized learning support (11 codes), reproducible creation (9 codes), and enhanced technical literacy (5 codes). During the development phase, students achieved precise control over the quality of the generated content by adjusting the Lora training parameters. This technical mastery significantly enhanced the learners’ sense of achievement and professional confidence (S15). Customized GAI can conduct deep learning based on individual design preferences, and as the training process progresses, the alignment between the generated content and personal style continuously improves, enabling students to anticipate and reproduce specific content without anxiety about the accuracy of the generated outputs (S6, S11, S23). These findings stand in sharp contrast to prior observations that the unpredictability of GAI outputs often triggers learner anxiety and uncertainty. The customized system appears to fundamentally alter the user experience from one of uncertain reception to confident anticipation, suggesting that customization may address one of the primary emotional barriers to GAI adoption in creative education contexts.
Enhancing Career Development Confidence (34 codes). This theme encompasses students’ positive expectations for future career development when using a customized GAI. It comprises two categories: enhanced employment competitiveness (21 codes) and a stronger professional identity (13 codes). Students S06 and S11 stated, “By learning and mastering customized AI tools, I feel more competitive in the job market. Acquiring this cutting-edge skill has strengthened my confidence in pursuing a design career and made me more optimistic about the future”.

4.2.3. Sources of Negative Emotions

Based on the analysis of the causes of negative emotions, three themes were identified, encompassing 112 codes. These themes reveal the primary problems and challenges students encountered when using customized GAI (see Figure 7).
Customized GAI Dependence and Concerns (41 codes). This theme reflects students’ concerns about the potential negative consequences of GAI use and their underlying apprehensions, which triggered negative emotions such as doubt and anxiety. The concerns were concentrated in four areas: impairment of independent thinking (15 codes), fears of skill degradation (12 codes), technological dependence (8 codes), and privacy and security issues (6 codes). For example, students S05 and S23 stated, “I’m a bit worried that CGAI is so convenient that I might become overly dependent on its output, which could eventually weaken my independent thinking and creativity”. Student S17 added, “I feel a bit anxious when AI accesses our personal data”. Student S14 also noted, “I’m concerned that my original work might be leaked. This sense of insecurity makes me somewhat hesitant about the actual value of GAI-based learning activities”.
Challenges in Mastering Customized GAI (37 codes). This theme reflects the challenges students faced in mastering customized GAI, demonstrating the adaptive difficulties and pressures encountered during the transition from traditional design methods to intelligent tools. It comprises four categories: difficulties in proper utilization (14 codes), insufficient AI literacy (11 codes), quality assessment challenges (7 codes), and new technology learning pressures (5 codes). For example, Student S24 remarked, “In class, I often wasn’t sure which base model to choose. Sometimes I chose the wrong one, and the results were poor. It made me feel like I didn’t have full control over the technology.” S04 further explained, “It’s often hard to tell whether the AI-generated content is actually good or not.” S01 observed, “I feel like there’s a communication gap between me and the AI—it doesn’t always understand what I really want.” Similarly, Students S19, S22, and S25 noted, “Although GAI seems easy to use at first, truly mastering it and applying it effectively in design requires considerable time and effort”.
Limitations of Customized GAI (34 codes). This theme reflects the technical bottlenecks and constraints that students encountered during GAI training and usage. It encompasses three categories: unstable Lora training performance (15 codes), creative limitations (11 codes), and technical deployment barriers (8 codes). For example, student S22 noted, “During the Lora training process, I had to keep trying different settings, and the results were often unpredictable, which made me feel anxious”. Student S07 commented, “I think GAI still has creative limitations. It can’t generate subtle or metaphorical elements the way human designers can”. Students S02 and S21 added, “The Lora model isn’t very compatible across different system versions. Sometimes switching platforms means reconfiguring everything, which is inconvenient”. Additionally, Student S12 remarked, “Some advanced GAI systems require very high-performance hardware, which makes them hard for ordinary students like us to use”.

4.2.4. Enhancing Student Engagement

Based on an in-depth analysis of 220 codes, positive emotions enhanced student engagement across three core dimensions: cognitive enhancement (75 codes), emotional activation (92 codes), and behavioral reinforcement (53 codes) (see Figure 8).
Cognitive Enhancement (75 codes). This theme reflects how customized GAI effectively stimulated students’ deep-thinking activities, promoting the establishment of systematic design cognition and improving learning capabilities. It comprises four categories: inspiring personalized thinking (25 codes), developing structured thinking (20 codes), improving student attention (18 codes), and enhancing learning motivation (12 codes). For example, Student S08 stated, “During the Lora training process, I began to think more deeply about the essence of my design style and gained a clearer understanding of the core of my design philosophy”. Student S03 noted, “With the assistance of GAI, I can organize my design ideas more systematically, and the entire creative process has become more structured”. He further added, “Since GAI can automatically handle repetitive tasks, I’m able to focus more of my energy on creative development”.
Emotional Activation (92 codes). This theme reflects how customized GAI significantly stimulated students’ emotional participation by creating personalized success experiences and fostering a sense of technological mastery, thereby strengthening emotional connections during the learning process. It encompasses four categories: increased learning confidence (27 codes), stimulated learning motivation (25 codes), enhanced sense of learning achievement (23 codes), and increased class enjoyment (17 codes). The novelty of the customized GAI technology sparked students’ intense interest in design learning, generating a strong desire to gain a deeper understanding of the field (S21). Students enhanced their learning confidence and sense of achievement by using the GAI to preview background information on design cultural development after class (S13). It is important to note that the term “emotional activation” described here primarily refers to the discrete emotions experienced by students during interactions with the GAI system, such as excitement, confidence, and satisfaction, which are transient emotional states. According to CVT, these emotional experiences can further influence the indicators of emotional engagement measured in our quantitative assessments (e.g., “I feel excited and pleased during the design process”). The qualitative data thus provide insight into the mechanisms by which specific emotional states emerge and contribute to overall emotional engagement patterns.
Behavioral Reinforcement (53 codes). This theme reflects students’ professional identification with mastering customized GAI, which prompted behavioral changes. It comprises three categories: active seeking of guidance (22 codes), active engagement in design communication (18 codes), and continuous learning (13 codes). Students reported spending considerable time daily training and optimizing GAI models, proactively contacting instructors for assistance even when encountering problems outside of class, and particularly enjoyed sharing training results and parameter-setting experiences with classmates (S08).

5. Discussion

This study examined the impact of customized GAI integration in visual communication design education on student engagement, and the relationship between emotional experiences and engagement. The quantitative analysis results demonstrate that the application of customized GAI significantly enhanced students’ cognitive, emotional, and behavioral engagement. This finding is consistent with evidence that GAI can foster collaborative educational environments that promote student engagement. Although no research has directly examined the impact of GAI on student engagement in art and design education, existing evidence indicates that it can increase students’ enthusiasm and participation. Cao et al. found that accurate, timely, and personalized feedback from AI technology can encourage students’ participation in learning environments [45]. Similarly, Owoseni et al. reported that AI can provide supportive and inclusive learning environments, facilitating student engagement [46]. Furthermore, Liu et al. confirmed the role of GAI technology in establishing deep collaborative learning environments that comprehensively enhance student engagement [41]. These findings support the results of the current study, indicating that incorporating customized GAI into visual communication design education can effectively boost student engagement.
The qualitative findings further reveal that students experienced both positive and negative emotions triggered by GAI integration in visual communication design courses. Specifically, excitement, satisfaction, curiosity, pleasure, and enjoyment were the most frequently perceived positive emotions. Conversely, stress, doubt, worry, confusion, and frustration were the predominant negative emotions. These emotional responses exhibited dynamic fluctuations throughout the learning process, with positive emotions demonstrating a clear overall dominance. These findings are consistent with previous research highlighting the prevalence of both positive and negative emotional experiences among students in AI-integrated higher education contexts [26,56]. This may stem from the novelty of GAI in visual communication design education fields. Students familiar with AI perceived GAI as a source of excitement and pride because of its innovative nature. Simultaneously, students’ emotional sensitivity and vulnerability when encountering novel technologies and pedagogical approaches may have triggered negative emotional responses [57]. Additionally, the complexity of design students’ emotional experiences and potential sociocultural influences may have shaped these results.
Further analysis revealed the sources of these emotions. Positive emotions may have originated from the customized GAI’s enhancement of students’ perceived control and learning value. Through the collaborative functioning of text analysis and image generation modules, the customized GAI realized full-process instructional assistance spanning concept comprehension, design analysis, visual presentation, design evaluation, and optimization. This holistic support enhances students’ control over their learning processes, fostering positive attitudes toward GAI. This phenomenon can be explained by the CVT: students primarily valued the integration of AI into their design courses, affirmed the instructional quality, and experienced a sense of control over their learning, which in turn fostered positive emotions [21]. Furthermore, customized GAI-generated content broke conventional thinking barriers, encouraging students to explore cross-cultural and cross-stylistic design approaches that broke through creative constraints and traditional thinking patterns. This expansion of creative capabilities enabled students to recognize the value of customized GAI in broadening creative thinking, thereby stimulating curiosity and a sense of wonder. This finding aligns with Creely and Blannin’s assertion that the GAI helps students break free from cognitive rigidity by providing diverse and unconventional ideas, thereby stimulating intrinsic motivation [58]. Finally, students enhanced their career confidence through customized GAI use, recognizing its professional value, and developing greater expectations and self-assurance regarding their career prospects. This phenomenon may be related to current Chinese employment trends, where the demand for AIGC designers has surged. Students perceived that mastering GAI skills would provide competitive advantages, reinforcing their learning motivation and enhancing their positive emotions and career confidence.
Notably, our study found that customized GAI significantly reduced students’ anxiety specifically regarding the accuracy and controllability of final outputs compared to concerns documented with generic GAI tools. This finding diverges from existing research [27,28,59], which commonly identified student anxiety regarding output unpredictability. However, it is crucial to recognize that while customized GAI reduced outcome-related anxiety, it introduced different sources of stress related to the training process itself (26 codes for stress, 15 codes for unstable Lora training performance). This suggests a shift in the locus of anxiety rather than its complete elimination: students experienced less apprehension about whether the system would produce usable results, but more concern about whether they could successfully configure the system to do so. This shift may stem from customized GAI’s “precise control” and “reproducible creation” capabilities, which clarified students’ creative agency and enhanced their engagement and identification with the creative process. In actual instruction, students were able to train models according to their preferences and achieved personalized expression through parameter constraints and workflow design, thereby significantly reducing uncertainty and anxiety during the creative process (S06, S13, S22). Compared to previous concerns about result accuracy, students’ cognitive focus shifted toward the active control of the creative process. This shift essentially restructures the collaborative relationship between designers and AI: designers are no longer passive recipients of AI-generated content; they have become co-creators capable of guiding AI.
However, customized GAI integration may also generate negative emotions by diminishing students’ sense of control and learning value. Students developed concerns about over-reliance on GAI, fearing it could impair their creativity and professional skills. Vasconcelos et al. support this concern, noting that individuals tend to accept AI-generated content even when it is incorrect [60], and Passi and Vorvoreanu examined the challenges users face when evaluating their dependency on AI [61]. These concerns are consistent with Kaitharath et al.’s findings that, although GAI helps enhance learning experiences, overreliance on it may inhibit students’ creativity [62]. Another potential factor relates to the challenges of mastering customized GAI. Although the researchers provided students with one week of technical training before the experiment, students indicated that fully leveraging AI capabilities requires considerable technical knowledge. They had difficulty training suitable LoRA models, particularly in terms of sample preprocessing and parameter tuning (S19 and S22). These findings reflect gaps in technical literacy among design students: some students possessed adequate technical skills and held positive attitudes toward AI use, whereas students with lower technical literacy experienced learning pressure. Furthermore, although this study employed a customized GAI, it inevitably encountered technical limitations, as did previous design education research based on general-purpose GAI. Issues such as unstable LoRA training performance and constraints in creating metaphorical visual elements created stress in both the learning and application processes.
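To make the nature of this tuning burden concrete, the sketch below illustrates the kind of LoRA configuration and sample preparation a student typically has to manage. It is a minimal illustration assuming a Hugging Face PEFT-style setup, not the training pipeline used in this study; the module names, hyperparameter values, file names, and captions are hypothetical.

```python
# Illustrative sketch only (not the authors' pipeline): the kind of LoRA
# hyperparameters students reportedly struggled to tune, expressed with the
# Hugging Face `peft` library. Values and target module names are assumptions.
from peft import LoraConfig

lora_config = LoraConfig(
    r=8,                # rank of the low-rank update; higher = more capacity, slower training
    lora_alpha=16,      # scaling factor applied to the learned update
    lora_dropout=0.05,  # dropout on LoRA layers to reduce overfitting on small sample sets
    target_modules=["to_q", "to_k", "to_v", "to_out.0"],  # attention projections in a diffusion UNet (assumed)
)

# Sample preprocessing is the other pain point students mentioned (S19, S22):
# a typical style LoRA pairs a few dozen curated images with short captions.
dataset = [
    {"image": "poster_01.png", "caption": "minimalist poster, warm palette"},
    {"image": "poster_02.png", "caption": "minimalist poster, geometric layout"},
]
```

Even in this reduced form, choices such as the rank, scaling factor, and caption wording interact in non-obvious ways, which is consistent with the learning pressure reported by students with lower technical literacy.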
Finally, the study found that students who experienced more positive emotions tended to demonstrate higher levels of cognitive deepening, emotional activation, and behavioral reinforcement. This result is consistent with Wang et al.’s study, which emphasizes the facilitative role of pleasurable experiences in student engagement [63]. Shen et al. likewise indicated that positive emotions are important predictors of engagement [64]. This finding further supports CVT’s core assumption that positive emotions enhance students’ intrinsic and extrinsic motivation, thereby promoting self-regulated learning. Within this framework, self-regulatory capacity builds on cognitive flexibility and integrates metacognitive, meta-motivational, and meta-affective strategies to support task adaptation and individual goal attainment [65]. When students experience positive emotions such as excitement, satisfaction, and confidence during customized GAI interactions, they are more likely to mobilize cognitive resources and enhance their autonomy in design learning, thereby fostering sustained engagement.

6. Conclusions

This study provides empirical evidence regarding the application of customized GAI in visual communication design education, revealing its impact on student engagement and emotions. On the one hand, customized GAI enhanced students’ sense of learning control and value, stimulated positive emotions, and thereby improved student engagement. On the other hand, it also triggered negative emotions such as anxiety, frustration, and doubt, owing to technological dependency, learning pressure, and technical limitations, which inhibited engagement. These findings call for a balance between technological integration and attention to students’ emotional experiences in order to promote effective and responsible GAI integration in design education. The study also has implications for CVT and technology-enhanced learning models: it extends CVT from traditional education to GAI-driven visual communication design education and expands technology-enhanced learning models by incorporating the emotional dimension. Furthermore, the professional advantages of customized GAI alleviated students’ long-standing concerns about the accuracy of AI-generated content, supporting a cognitive shift from outcome-oriented to process-controlled approaches and facilitating a transformation of the learner’s role. This insight enriches our understanding of AI-designer collaborative relationships and points toward sustainable collaboration and future research in this field.
Our research also provides valuable insights for teaching practice. First, we recommend enhancing GAI technical training to improve students’ AI literacy. The study found that students’ incomplete understanding of GAI may hinder full utilization of its capabilities. Introducing GAI knowledge early in design courses can help students apply it effectively in subsequent coursework. Future curricula should cover GAI mechanisms, development, applications, and usage examples, spanning both fundamental principles and advanced techniques such as prompt engineering, model fine-tuning, and the collaborative integration of different GAI applications. Second, we recommend strengthening core design literacy instruction to enhance student autonomy. Teachers should reinforce training in fundamental design theory and aesthetics, guide students to identify and evaluate the quality of GAI-generated content, enhance aesthetic judgment, and reduce technological dependency. Instructional design should focus on constructing a “human-AI collaborative creation” learning model that guides students toward critical analysis and creative transformation of AI-generated content through reflective tasks, thereby strengthening design leadership and visual narrative capabilities. Finally, creating a positive classroom environment and emphasizing emotional regulation are crucial. We recommend that teachers establish a learning cycle of technological exploration, experience sharing, and continuous innovation to leverage GAI’s benefits. For example, teachers can implement phased achievement presentations and technical sharing sessions, encouraging students to gain recognition and confidence through exchange, thereby continuously strengthening their learning motivation and engagement.
This study has several limitations. First, the sample size was relatively small, and the study was limited to specific AI tools (GPT-4 and Stable Diffusion) within a visual communication design teaching context, which may constrain the generalizability of the findings. Second, the six-week intervention period was short, and interaction effects between group and time were not examined, potentially limiting a comprehensive assessment of GAI’s impact. Finally, this study did not explore potential intergenerational or cultural biases in the image models. Future research should consider larger and more diverse samples across different design education contexts and adopt longer-term longitudinal designs to assess sustained effects. It is also recommended to incorporate objective evaluation methods, such as analyses of work quality and of group-by-time interaction effects, to better understand the relationship between engagement changes and intervention outcomes. Future studies should also prioritize cognitive learning outcomes, focusing on how customized GAI influences students’ understanding, problem-solving, and creativity, and should address potential cultural or ethical biases in image generation models. Through continued research, we hope to provide more scientific guidance for the sustainable integration of GAI in design education and to promote digital transformation and innovative development in creative education.

Author Contributions

Conceptualization, H.L.; methodology, H.L.; software, H.L.; validation, H.L. and L.S.; formal analysis, H.L.; investigation, H.L.; writing—original draft, H.L.; writing—review and editing, H.L. and S.K.; visualization, L.S.; funding acquisition, L.S.; supervision, S.K. All authors commented on previous versions of the manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

This research was approved by the School of the Arts, Kyungpook National University (ID: AS20250611; date: 11 June 2025).

Informed Consent Statement

Informed consent was obtained from all subjects involved in this study.

Data Availability Statement

Data are contained within the article.

Acknowledgments

The authors would like to thank all those who supported this work, as well as the reviewers for their comments and efforts to improve the paper.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
GAI	Generative Artificial Intelligence
CVT	Control-Value Theory

References

  1. UNESCO. Global Education Monitoring Report 2023: Technology in Education and Sustainable Development; UNESCO: Paris, France, 2023; Available online: https://unesdoc.unesco.org/ark:/48223/pf0000386165 (accessed on 10 October 2025).
  2. Matthews, B.; Shannon, B.; Roxburgh, M. Destroy all humans: The dematerialization of the designer in an age of automation and its impact on graphic design—A literature review. Int. J. Art Des. Educ. 2023, 42, 367–383. [Google Scholar] [CrossRef]
  3. Yang, Z.; Shin, J. The impact of Gen AI on art and design program education. Des. J. 2025, 28, 310–326. [Google Scholar] [CrossRef]
  4. Vartiainen, H.; Tedre, M. Using artificial intelligence in craft education: Crafting with text-to-image generative models. Digit. Creat. 2023, 34, 1–21. [Google Scholar] [CrossRef]
  5. Kicklighter, C.; Seo, J.H.; Andreassen, M.; Bujnoch, E. Empowering creativity with generative AI in digital art education. In ACM SIGGRAPH 2024 Educator’s Forum; ACM: New York, NY, USA, 2024; pp. 1–2. [Google Scholar] [CrossRef]
  6. Cui, W.; Liu, M.J.; Yuan, R. Exploring the integration of generative AI in advertising agencies: A co-creative process model for human–AI collaboration. J. Advert. Res. 2025, 65, 167–189. [Google Scholar] [CrossRef]
  7. Bond, M.; Buntins, K.; Bedenlier, S.; Zawacki-Richter, O.; Kerres, M. Mapping research in student engagement and educational technology in higher education: A systematic evidence map. Int. J. Educ. Technol. High. Educ. 2020, 17, 2. [Google Scholar] [CrossRef]
  8. Khurma, O.A.; Albahti, F.; Ali, N.; Bustanji, A. AI ChatGPT and student engagement: Unraveling dimensions through PRISMA analysis for enhanced learning experiences. Contemp. Educ. Technol. 2024, 16, ep503. [Google Scholar] [CrossRef]
  9. Qian, Y. Pedagogical applications of generative AI in higher education: A systematic review of the field. TechTrends 2025, 69, 1105–1120. [Google Scholar] [CrossRef]
  10. Fredricks, J.A.; Blumenfeld, P.C.; Paris, A.H. School engagement: Potential of the concept, state of the evidence. Rev. Educ. Res. 2004, 74, 59–109. [Google Scholar] [CrossRef]
  11. Sinatra, G.M.; Heddy, B.C.; Lombardi, D. The challenges of defining and measuring student engagement in science. Educ. Psychol. 2015, 50, 1–13. [Google Scholar] [CrossRef]
  12. Reeve, J.; Tseng, C.-M. Agency as a Fourth Aspect of Students’ Engagement during Learning Activities. Contemp. Educ. Psychol. 2011, 36, 257–267. [Google Scholar] [CrossRef]
  13. Lin, H.C.K.; Feng, C.H.; Liu, Y.L.E. Generative AI-Enhanced UX Storyboarding: Transforming Novice Designers’ Engagement and Reflective Thinking for IoMT Product Design. Int. J. Technol. Des. Educ. 2025, 1–26. [Google Scholar] [CrossRef]
  14. El Fathi, T.; Saad, A.; Larhzil, H.; Al Ibrahmi, E.M. Integrating generative AI into STEM education: Enhancing conceptual understanding, addressing misconceptions, and assessing student acceptance. Discip. Interdiscip. Sci. Educ. Res. 2025, 7, 6. [Google Scholar] [CrossRef]
  15. Abanyam, F.E.; Edeh, N.I.; Abanyam, V.A.; Obimgbo, J.I.; Nwokike, F.O.; Ugwunwoti, P.E.; Idris, B.A. Artificial intelligence: Interactive effect of Google Classroom and learning analytics on academic engagement of business education students in universities in Nigeria. Lib. Philos. Pract. 2023, 7703. Available online: https://digitalcommons.unl.edu/libphilprac/7703 (accessed on 10 October 2025).
  16. Saritepeci, M.; Yildiz Durak, H. Effectiveness of artificial intelligence integration in design-based learning on design thinking mindset, creative and reflective thinking skills: An experimental study. Educ. Inf. Technol. 2024, 29, 25175–25209. [Google Scholar] [CrossRef]
  17. Wang, Y.; Xue, L. Using AI-driven chatbots to foster Chinese EFL students’ academic engagement: An intervention study. Comput. Hum. Behav. 2024, 159, 108353. [Google Scholar] [CrossRef]
  18. Brisco, R.; Hay, L.; Dhami, S. Exploring the role of text-to-image AI in concept generation. Proc. Des. Soc. 2023, 3, 1835–1844. [Google Scholar] [CrossRef]
  19. Miao, F.; Holmes, W. Guidance for Generative AI in Education and Research; UNESCO: Paris, France, 2023; Available online: https://unesdoc.unesco.org/ark:/48223/pf0000386693 (accessed on 1 July 2025).
  20. Sáez-Velasco, S.; Alaguero-Rodríguez, M.; Delgado-Benito, V.; Rodríguez-Cano, S. Analyzing the impact of generative AI in arts education: A cross-disciplinary perspective of educators and students in higher education. Informatics 2024, 11, 37. [Google Scholar] [CrossRef]
  21. Pekrun, R. The control-value theory of achievement emotions: Assumptions, corollaries, and implications for educational research and practice. Educ. Psychol. Rev. 2006, 18, 315–341. [Google Scholar] [CrossRef]
  22. Kahu, E.; Stephens, C.; Leach, L.; Zepke, N. Linking academic emotions and student engagement: Mature-aged distance students’ transition to university. J. Furth. High. Educ. 2014, 39, 481–497. [Google Scholar] [CrossRef]
  23. Martucci, A.; Graziani, D.; Bei, E.; Bischetti, L.; Bambini, V.; Gursesli, M.C. Emotional engagement in a humor-understanding reading task: An AI study perspective. Curr. Psychol. 2025, 44, 7818–7831. [Google Scholar] [CrossRef]
  24. Alias, M.; Lashari, T.A.; Akasah, Z.A.; Kesot, M.J. Self-efficacy, attitude, student engagement: Emphasising the role of affective learning attributes among engineering students. Int. J. Eng. Educ. 2018, 34, 226–235. [Google Scholar]
  25. Yuan, L.; Liu, X. The effect of artificial intelligence tools on EFL learners’ engagement, enjoyment, and motivation. Comput. Hum. Behav. 2025, 162, 108474. [Google Scholar] [CrossRef]
  26. Yang, L.; Zhao, S. AI-Induced Emotions in L2 Education: Exploring EFL Students’ Perceived Emotions and Regulation Strategies. Comput. Hum. Behav. 2024, 159, 108337. [Google Scholar] [CrossRef]
  27. Gao, W.; Mei, Y.; Duh, H.; Zhou, Z. Envisioning the incorporation of generative artificial intelligence into future product design education: Insights from practitioners, educators, and students. Des. J. 2025, 28, 346–366. [Google Scholar] [CrossRef]
  28. Grájeda, A.; Córdova, P.; Córdova, J.P.; Laguna-Tapia, A.; Burgos, J.; Rodríguez, L.; Sanjinés, A. Embracing artificial intelligence in the arts classroom: Understanding student perceptions and emotional reactions to AI tools. Cogent Educ. 2024, 11, 1. [Google Scholar] [CrossRef]
  29. Su, H.; Mokmin, N.A.M. Unveiling the canvas: Sustainable integration of AI in visual art education. Sustainability 2024, 16, 7849. [Google Scholar] [CrossRef]
  30. Fathoni, A.F. Leveraging generative AI solutions in art and design education: Bridging sustainable creativity and fostering academic integrity for innovative society. E3S Web Conf. 2023, 426, 01102. [Google Scholar] [CrossRef]
  31. Chen, J.; Ni, C.; Lin, P.; Lin, R. Designing the future: A case study on human-AI co-innovation. Creat. Educ. 2024, 15, 474–494. [Google Scholar] [CrossRef]
  32. Fleischmann, K. The commodification of creativity: Integrating generative artificial intelligence in higher education design curriculum. Innov. Educ. Teach. Int. 2024, 1–15. [Google Scholar] [CrossRef]
  33. Huang, Z.; Fu, X.; Zhao, J. Research on AIGC-integrated design education for sustainable teaching: An empirical analysis based on the TAM and TPACK models. Sustainability 2025, 17, 5497. [Google Scholar] [CrossRef]
  34. Chen, J.; Mokmin, N.A.M.; Su, H. Integrating Generative Artificial Intelligence into Design and Art Course: Effects on Student Achievement, Motivation, and Self-Efficacy. Innov. Educ. Teach. Int. 2025, 1–16. [Google Scholar] [CrossRef]
  35. Huang, K.; Liu, Y.; Dong, M.; Lu, C. Integrating AIGC into product design ideation teaching: An empirical study on self-efficacy and learning outcomes. Learn. Instr. 2024, 92, 101929. [Google Scholar] [CrossRef]
  36. Lin, H.; Jiang, X.; Deng, X.; Bian, Z.; Fang, C.; Zhu, Y. Comparing AIGC and traditional idea generation methods: Evaluating their impact on creativity in the product design ideation phase. Think. Skills Creat. 2024, 54, 101649. [Google Scholar] [CrossRef]
  37. Meyer, D.K.; Turner, J.C. Discovering emotion in classroom motivation research. Educ. Psychol. 2002, 37, 107–114. [Google Scholar] [CrossRef]
  38. Schutz, P.A.; Lanehart, S.L. Introduction: Emotions in Education. Educ. Psychol. 2002, 37, 67–68. [Google Scholar] [CrossRef]
  39. Pekrun, R.; Frenzel, A.C.; Götz, T.; Perry, R.P. Control-Value Theory of Academic Emotions: How Classroom and Individual Factors Shape Students Affect. In Proceedings of the Annual Meeting of the American Educational Research Association; AERA: Washington, DC, USA, 2006; Available online: https://kops.uni-konstanz.de/handle/123456789/36408 (accessed on 10 October 2025).
  40. Li, Y.; Chiu, T. The mediating effects of needs satisfaction on the relationship between teacher support and student engagement with generative artificial intelligence (GenAI) chatbots from a self-determination theory (SDT) perspective. Educ. Inf. Technol. 2025, 30, 20051–20070. [Google Scholar] [CrossRef]
  41. Liu, Y.L.E.; Lee, T.P.; Huang, Y.M. Enhancing student engagement and higher-order thinking in human-centred design projects: The impact of generative AI-enhanced collaborative whiteboards. Interact. Learn. Environ. 2025, 1–18. [Google Scholar] [CrossRef]
  42. Ezeoguine, E.P.; Eteng-Uket, S. Artificial intelligence tools and higher education student’s engagement. Edukasiana J. Inov. Pendidik. 2024, 3, 300–312. [Google Scholar] [CrossRef]
  43. Chaudhry, I.S.; Sarwary, S.A.M.; El Refae, G.A.; Chabchoub, H. Time to revisit existing student’s performance evaluation approach in higher education sector in a new era of ChatGPT—A case study. Cogent Educ. 2023, 10, 1. [Google Scholar] [CrossRef]
  44. Cotton, D.R.E.; Cotton, P.A.; Shipway, J.R. Chatting and cheating: Ensuring academic integrity in the era of ChatGPT. Innov. Educ. Teach. Int. 2023, 61, 228–239. [Google Scholar] [CrossRef]
  45. Cao, S.; Phongsatha, S. An empirical study of the AI-driven platform in blended learning for Business English performance and student engagement. Lang. Test. Asia 2025, 15, 39. [Google Scholar] [CrossRef]
  46. Owoseni, A.; Kolade, O.; Egbetokun, A. Enhancing personalised learning and student engagement using generative AI. In Generative AI in Higher Education; Palgrave Macmillan: Cham, Switzerland, 2024; pp. 67–85. [Google Scholar] [CrossRef]
  47. Zhao, H.; Li, S.; Xu, H.; Ye, L.; Chen, M. The influence of educational psychology on modern art design entrepreneurship education in colleges. Front. Psychol. 2022, 13, 843484. [Google Scholar] [CrossRef]
  48. Lively, J.; Hutson, J.; Melick, E. Integrating AI-generative tools in web design education: Enhancing student aesthetic and creative copy capabilities using image and text-based AI generators. DS J. Artif. Intell. Robot. 2023, 1, 23–36. [Google Scholar] [CrossRef]
  49. Li, J.; Liu, S.; Zheng, J.; He, F. Enhancing visual communication design education: Integrating AI in collaborative teaching strategies. J. Comput. Methods Sci. Eng. 2024, 24, 2469–2483. [Google Scholar] [CrossRef]
  50. Creswell, J.W.; Plano Clark, V.L.; Gutmann, M.; Hanson, W. Advanced mixed methods research designs. In Handbook on Mixed Methods in the Behavioral and Social Sciences; Tashakkori, A., Teddlie, C., Eds.; Sage Publications: Thousand Oaks, CA, USA, 2003; pp. 209–240. [Google Scholar]
  51. Nunnally, J.C. An overview of psychological measurement. In Clinical Diagnosis of Mental Disorders: A Handbook; Wolman, B.B., Ed.; Springer: Boston, MA, USA, 1978; pp. 97–146. [Google Scholar] [CrossRef]
  52. Mao, Y.; Ge, Y.; Fan, Y.; Xu, W.; Mi, Y.; Hu, Z.; Gao, Y. A survey on LoRA of large language models. Front. Comput. Sci. 2025, 19, 197605. [Google Scholar] [CrossRef]
  53. Shi, Y.; Hu, Y.; Bai, Y.; Zhou, Z.; Liang, Z.; Jiang, Y.; Du, X. Vibration of creativity: Exploring the relationship between appraisal shift and creative process in design teams. Think. Skills Creat. 2024, 54, 101663. [Google Scholar] [CrossRef]
  54. Collins, K.M.T.; Onwuegbuzie, A.J.; Sutton, I.L. A model incorporating the rationale and purpose for conducting mixed-methods research in special education and beyond. Learn. Disabil. 2006, 4, 67–100. [Google Scholar]
  55. Braun, V.; Clarke, V. Conceptual and design thinking for thematic analysis. Qual. Psychol. 2022, 9, 3–26. [Google Scholar] [CrossRef]
  56. Hwang, Y.; Wu, Y. The influence of generative artificial intelligence on creative cognition of design students: A chain mediation model of self-efficacy and anxiety. Front. Psychol. 2025, 15, 1455015. [Google Scholar] [CrossRef] [PubMed]
  57. Lacave, C.; Velázquez-Iturbide, J.Á.; Paredes-Velasco, M.; Molina, A.I. Analyzing the influence of a visualization system on students’ emotions: An empirical case study. Comput. Educ. 2020, 149, 103817. [Google Scholar] [CrossRef]
  58. Creely, E.; Blannin, J. Creative Partnerships with Generative AI: Possibilities for Education and Beyond. Think. Skills Creat. 2025, 56, 101727. [Google Scholar] [CrossRef]
  59. Chan, C.K.Y.; Hu, W. Students’ voices on generative AI: Perceptions, benefits, and challenges in higher education. Int. J. Educ. Technol. High. Educ. 2023, 20, 43. [Google Scholar] [CrossRef]
  60. Vasconcelos, H.; Krishna, R.; Bernstein, M.; Gerstenberg, T. AI Overreliance Is a Problem: Are Explanations a Solution? Stanford HAI. 2023. Available online: https://hai.stanford.edu/news/ai-overreliance-problem-are-explanations-solution (accessed on 21 July 2025).
  61. Passi, S.; Vorvoreanu, M. Overreliance on AI: Literature Review; Microsoft Research: Redmond, WA, USA, 2022; Available online: https://www.microsoft.com/en-us/research/uploads/prod/2022/06/Aether-Overreliance-on-AI-Review-Final-6.21.22.pdf (accessed on 13 July 2025).
  62. Kaitharath, M.F.; Shifna, K.N.; Anuradha, P.S. Generative AI and Its Impact on Creative Thinking Abilities in Higher Education Institutions. In Impacts of Generative AI on Creativity in Higher Education; Fields, Z., Ed.; IGI Global: Hershey, PA, USA, 2024; pp. 143–180. [Google Scholar] [CrossRef]
  63. Wang, X.; Wan Jaafar, W.M.; Sulong, R.M. Building better learners: Exploring positive emotions and life satisfaction as keys to academic engagement. Front. Educ. 2025, 10, 1535996. [Google Scholar] [CrossRef]
  64. Shen, S.; Tang, T.; Pu, L.; Mao, Y.; Wang, Z.; Wang, S. Teacher emotional support facilitates academic engagement through positive academic emotions and mastery-approach goals among college students. SAGE Open 2024, 14, 21582440241245369. [Google Scholar] [CrossRef]
  65. Wolters, C.A. Regulation of Motivation: Evaluating an Underemphasized Aspect of Self-Regulated Learning. Educ. Psychol. 2003, 38, 189–205. [Google Scholar] [CrossRef]
Figure 1. Experimental design.
Figure 2. LoRA training process.
Figure 3. Course design.
Figure 4. Visual communication design pedagogy integrating customized GAI.
Figure 5. The experimental procedure.
Figure 6. Thematic map of the sources of positive emotions.
Figure 7. Thematic map of the sources of negative emotions.
Figure 8. Thematic map of enhancing student engagement.
Table 1. The results of normality tests (K–S).
Group          Engagement    Pre-Test                         Post-Test
                             Statistic    df    Sig.          Statistic    df    Sig.
Experimental   BE 1          0.091        47    0.200         0.078        47    0.200
               EE 2          0.095        47    0.200         0.086        47    0.200
               CE 3          0.118        47    0.147         0.115        47    0.148
Control        BE 1          0.089        47    0.200         0.112        47    0.197
               EE 2          0.106        47    0.200         0.121        47    0.142
               CE 3          0.095        47    0.200         0.108        47    0.200
1 Behavioral engagement. 2 Emotional engagement. 3 Cognitive engagement.
Table 2. Descriptive results for student engagement.
Group          Engagement    N     Pre-Test           Post-Test          Change Scores
                                   Mean      SD       Mean      SD       Δ Mean     95% CI
Experimental   BE 1          48    3.911     0.528    4.198     0.561    0.286      [0.167, 0.405]
               EE 2                3.792     0.486    4.188     0.411    0.396      [0.270, 0.522]
               CE 3                3.757     0.541    4.229     0.533    0.472      [0.386, 0.558]
Control        BE 1          48    3.865     0.491    3.948     0.418    0.083      [−0.036, 0.202]
               EE 2                3.776     0.457    3.734     0.559    −0.042     [−0.144, 0.060]
               CE 3                3.733     0.469    3.753     0.543    0.020      [−0.050, 0.090]
1 Behavioral engagement. 2 Emotional engagement. 3 Cognitive engagement.
Table 3. Paired-samples t-test results for student engagement (experimental group).
        Mean (Pre)    Mean (Post)    Mean Difference    Std. Deviation    t-Value    p
BE 1    3.911         4.198          0.286              0.419             4.737      <0.001
EE 2    3.792         4.188          0.396              0.446             6.147      <0.001
CE 3    3.757         4.229          0.472              0.304             10.752     <0.001
1 Behavioral engagement. 2 Emotional engagement. 3 Cognitive engagement.
Table 4. Paired-samples t-test results for student engagement (control group).
        Mean (Pre)    Mean (Post)    Mean Difference    Std. Deviation    t-Value    p 4
BE 1    3.865         3.948          0.083              0.422             −1.101     >0.200
EE 2    3.776         3.734          −0.042             0.362             0.443      >0.200
CE 3    3.733         3.753          0.020              0.246             −0.317     >0.200
1 Behavioral engagement. 2 Emotional engagement. 3 Cognitive engagement. 4 All p-values greater than 0.200 are reported as >0.200 for clarity.
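For readers who wish to run comparable analyses, the snippet below sketches how the paired-samples t-tests in Tables 3 and 4 could be computed from raw pre- and post-test scores. It is a minimal illustration with placeholder data and hypothetical variable names, not the authors’ analysis code.

```python
# Illustrative sketch only: a paired-samples t-test on pre- vs. post-intervention
# engagement scores, mirroring the structure of Tables 3 and 4.
# `pre` and `post` are hypothetical per-student scale means, not the study's data.
import numpy as np
from scipy import stats

pre = np.array([3.8, 4.0, 3.6, 3.9, 4.1])   # placeholder pre-test scores
post = np.array([4.2, 4.3, 4.0, 4.1, 4.4])  # placeholder post-test scores

t_stat, p_value = stats.ttest_rel(post, pre)  # paired (dependent) samples
diff = post - pre
print(f"mean difference = {diff.mean():.3f}, SD = {diff.std(ddof=1):.3f}")
print(f"t = {t_stat:.3f}, p = {p_value:.3g}")
```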
Table 5. The differences between the two groups’ behavioral engagement.
Source                   Type III Sum of Squares    df    Mean Square    F        p         Partial Eta Squared
Corrected model          2.856                      2     1.428          12.25    <0.001    0.208
Pre-BE (covariates)      0.975                      1     0.975          8.36     <0.01     0.082
Group                    1.881                      1     1.881          16.13    <0.001    0.148
Table 6. The differences between the two groups’ emotional engagement.
Source                   Type III Sum of Squares    df    Mean Square    F        p         Partial Eta Squared
Corrected model          4.329                      2     2.165          17.28    <0.001    0.208
Pre-EE (covariates)      2.482                      1     2.482          19.81    <0.001    0.082
Group                    1.847                      1     1.847          14.74    <0.001    0.116
Table 7. The differences between the two groups’ cognitive engagement.
Source                   Type III Sum of Squares    df    Mean Square    F        p         Partial Eta Squared
Corrected model          4.892                      2     2.446          22.79    <0.001    0.329
Pre-CE (covariates)      2.634                      1     2.634          24.55    <0.001    0.209
Group                    2.258                      1     2.258          21.04    <0.001    0.185
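The between-group comparisons in Tables 5–7 follow a one-way ANCOVA structure with the pre-test score as covariate. The sketch below shows one way such a model could be fitted; it assumes statsmodels and a hypothetical data frame with invented column names and values, and is not the study’s actual analysis script.

```python
# Illustrative sketch only: one-way ANCOVA of post-test engagement with the
# pre-test score as covariate, mirroring the structure of Tables 5-7.
# The DataFrame contents and column names are hypothetical, not the study's data.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.DataFrame({
    "group":   ["experimental"] * 4 + ["control"] * 4,
    "pre_be":  [3.8, 4.0, 3.6, 3.9, 3.9, 3.7, 4.0, 3.8],
    "post_be": [4.2, 4.3, 4.0, 4.1, 3.9, 3.8, 4.1, 3.9],
})

# Sum-to-zero contrasts so that Type III sums of squares are interpretable.
model = smf.ols("post_be ~ pre_be + C(group, Sum)", data=df).fit()
print(anova_lm(model, typ=3))  # Type III SS table: covariate and group effects
```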
Table 8. Distribution of students’ emotions.
Emotional Experiences     Categories        Number    Percentage (%)
Positive (235 codes)      Excitement        38        16.2
                          Satisfaction      32        13.6
                          Curiosity         29        12.3
                          Pleasure          28        11.9
                          Enjoyment         26        11.1
                          Confidence        25        10.6
                          Surprise          23        9.8
                          Pride             19        8.1
                          Anticipation      15        6.4
Negative (112 codes)      Stress            26        23.2
                          Doubt             22        19.6
                          Worry             18        16.1
                          Confusion         15        13.4
                          Anxiety           12        10.7
                          Frustration       10        8.9
                          Fear              5         4.5
                          Disappointment    4         3.6