Article

Exploring the Role of AI and Teacher Competencies on Instructional Planning and Student Performance in an Outcome-Based Education System

by
Wafa Naif Alwakid
1,
Nisar Ahmed Dahri
2,3,*,
Mamoona Humayun
4 and
Ghadah Naif Alwakid
5
1
Department of Business Administration, College of Business, Jouf University, Sakaka 72388, Saudi Arabia
2
Faculty of Educational Sciences and Technology (FEST), Universiti Teknologi Malaysia, Johor Bahru 81310, Malaysia
3
Faculty of Language Studies, Sohar University, Sohar 311, Oman
4
Department of Computing, School of Arts Humanities and Social Sciences, University of Roehampton, London SW15 5PJ, UK
5
Department of Computer Science, College of Computer and Information Sciences, Jouf University, Sakaka 72388, Saudi Arabia
*
Author to whom correspondence should be addressed.
Systems 2025, 13(7), 517; https://doi.org/10.3390/systems13070517
Submission received: 4 March 2025 / Revised: 9 June 2025 / Accepted: 18 June 2025 / Published: 27 June 2025

Abstract

The rapid integration of artificial intelligence (AI) in education has transformed traditional teaching methodologies, particularly within Outcome-Based Education (OBE) in higher education. Drawing on the Technological Pedagogical Content Knowledge (TPACK) model and the OBE system, the present study investigates how teachers perceive AI applications, specifically ChatGPT, in enhancing instructional design and student performance. The research develops a new AI-based instructional planning model incorporating AI ChatGPT capabilities, teacher competencies, and their direct and indirect effects on student outcomes. The study employs a quantitative research design, using Structural Equation Modeling (SEM) to validate the proposed model. Data were collected from 320 university teachers in Pakistan through a structured survey distributed via WhatsApp and email. Findings from the direct path analysis indicate that AI ChatGPT capabilities significantly enhance instructional planning (β = 0.33, p < 0.001) and directly impact student performance (β = 0.20, p < 0.001). Teacher competencies also play an important role in instructional planning (β = 0.37, p < 0.001) and student performance (β = 0.16, p = 0.020). The indirect path analysis reveals that instructional planning mediates the relationship between AI ChatGPT capabilities and student performance (β = 0.160, p < 0.001), as well as between teacher competencies and student performance (β = 0.180, p < 0.001). The model explains 41% of the variance in instructional planning and 56% of the variance in student performance. These findings provide theoretical contributions by extending AI adoption models in education and offer practical implications for integrating AI tools in teaching. The study emphasizes the need for professional development programs to enhance educators’ AI proficiency and offers policy recommendations for AI-driven curriculum development.

1. Introduction

The Outcome-Based Education (OBE) system has gained prominence in higher education as a student-centered approach to achieving specific learning outcomes. In contrast to traditional, teacher-focused education systems, OBE ensures that curriculum planning, instructional strategies, and assessment are aligned with clearly defined competencies and skills students must acquire by the completion of their courses [1]. Its most important features include the classification of learning outcomes, learner-centered instructional strategies, continuous assessment, and the convergence of instructional practices toward these outcomes [2,3]. The system aims to guarantee that students both acquire theoretical concepts and develop the practical and analytical skills needed for success in their chosen professions. Achieving these objectives, however, depends greatly on the quality of instructional planning [4]. Instructional planning lies at the center of effective teaching and learning under the OBE model. It comprises a series of interdependent tasks, including lesson planning, assessment design, the preparation of course materials, and instructional delivery strategies [5,6]. Lesson planning makes every class session pertinent to targeted learning outcomes, promoting a well-organized, goal-oriented pattern of teaching [7,8]. Similarly, assessment design, including formative and summative evaluations, enables continuous measurement of students’ progress [9,10,11]. Course materials, including textbooks, digital resources, and interactive tools, are prepared and used to support learning objectives and improve student engagement [12,13].
Effective instructional planning also includes the integration of innovative instructional methodologies, addressing diverse student needs, and making teaching practices flexible and responsive [14,15].
Although instructional planning is the backbone of the OBE system, conventional instructional planning faces many challenges within it. Teachers often struggle to connect their lesson plans, assessments, and evaluations to the desired learning outcomes [14,16]. The primary challenge is teachers’ inability to utilize the OBE framework effectively [17]. Most teachers lack adequate knowledge and skills in instructional planning [18], including outcome mapping, assessment strategies, and the integration of technology into pedagogy [19,20]. This gap leads to a disparity in attaining the desired educational objectives: teachers often fail to develop complete and effective lesson plans, link assessments to learning outcomes, or employ evaluation strategies that facilitate student growth [21]. Time constraints, poor access to resources, and insufficient opportunities for professional development exacerbate these challenges, resulting in ineffective implementation of the OBE framework in higher education in Pakistan [22,23]. Pirzada and Gull [24] noted that OBE improves teaching efficiency and student performance through competency-based learning. In Pakistan, where the quality and employability outcomes of higher education are under strain, OBE provides a systematic way of improving academic quality. They found that OBE strengthens goal-centered teaching and performance-based evaluation, aligning with international standards. It also enhances instructional planning, enabling institutions to produce employable graduates. The incorporation of OBE thus has the potential to improve student learning and faculty performance, aligning Pakistan’s education system with modern needs [24].
In modern education, integrating Technological Pedagogical Content Knowledge (TPACK) within the Outcome-Based Education (OBE) framework has become essential for enhancing instructional planning and student performance [5]. TPACK provides a structured approach that helps educators blend technology, pedagogy, and content effectively to create meaningful learning experiences [25]. Ren et al. [26] argued that AI-supported teaching benefits from this integration because AI tools, such as ChatGPT, require both technological competence (addressed by TPACK) and clear alignment with instructional goals (as prescribed by OBE). This integration is particularly suitable for exploring AI tools in teaching because it ensures that technology is not used in isolation but is strategically applied to enhance student learning outcomes. Teachers equipped with TPACK can effectively incorporate AI tools into planning, delivery, and assessment, while the OBE framework ensures that all these instructional components are aligned with specific learning goals. This dual approach enables a more structured and outcome-focused use of AI technologies in higher education settings. Prior studies have highlighted the value of such integrated frameworks in technology-enhanced teaching and curriculum design, especially within competency-based education models [24,25,26,27]. Asim et al. [27] highlighted that the OBE model emphasizes well-defined learning outcomes, requiring teachers to align instructional strategies and assessments with these objectives. However, many educators struggle with instructional planning due to limited technological proficiency and pedagogical alignment [28,29]. This gap can be addressed through artificial intelligence (AI) tools, which assist teachers in designing lesson plans, assessments, and instructional strategies tailored to student needs [30].
In recent years, artificial intelligence (AI) has shown immense potential to overcome existing challenges across all levels of education [31,32]. AI-based tools are increasingly integrated into teaching and learning processes, transforming how instruction is planned, delivered, and assessed [33,34]. In the educational sphere, AI has enabled personalized learning experiences, automated grading systems, real-time feedback, and enhanced student engagement [33,35]. Technologies like intelligent tutoring systems, learning management platforms, and virtual teaching assistants have proven their ability to elevate the quality of higher education [36]. ChatGPT, a generative AI language model, has been widely adopted in various educational settings [37,38]. Its advanced capabilities make it a powerful resource for educators, supporting tasks such as lesson planning, curriculum design, assessment creation, and feedback generation.
In an Outcome-Based Education (OBE) environment, ChatGPT helps teachers map learning outcomes to specific instructional activities to ensure alignment with the preferred competencies [39]. ChatGPT can generate lesson plans aligned with OBE principles, design outcome-based assessments, and provide examples of course materials that meet OBE standards [39,40]. Furthermore, ChatGPT facilitates teachers’ professional development by providing extensive information and resources, enabling teachers to adopt more competent instructional practices [41]. One of the most profound impacts ChatGPT can offer lies in filling gaps in instructional planning and teacher competency. By automating mundane teaching activities and making real-time suggestions, ChatGPT frees the instructor to focus on constructing valuable learning experiences. For instance, it can help teachers construct assessment rubrics matched to learning objectives, create individualized learning paths for varied learners, and identify areas where teaching practices can be improved [2,39]. Additionally, ChatGPT boosts students’ performance by providing personalized feedback and adaptive learning content according to their needs. These capabilities make it a valuable facilitator for improving instructional planning and students’ performance, as shown in Figure 1.
Together, AI tools such as ChatGPT and teacher competencies may revolutionize instructional planning and student performance. Artificial intelligence can handle much of the work that goes into instructional planning, but it cannot replace the teacher, who contributes contextual awareness, critical thinking, and the ability to build close relationships with students, qualities AI lacks [39,42]. By drawing on AI capabilities, instructors are freed up for activities such as developing creative instructional strategies, mentoring students, and fostering collaborative learning environments. This synergy of AI capabilities and teacher competencies can enhance the effectiveness of instructional planning and help ensure students attain the intended learning outcomes in the OBE model. Despite the increasing use of AI in education, recent studies have largely concentrated on the technical aspects of AI or its stand-alone applications in particular settings. Lo [41], for instance, investigated the symbiosis of AI tools such as ChatGPT with teaching potential and their synergistic effects on instructional planning and learner performance.
There is limited empirical evidence demonstrating how AI-based principles of the OBE model guide instructional planning. This gap highlights the need for rigorous research to explore the transformative potential of AI in education. To bridge it, the present study investigates the combined influence of ChatGPT capabilities and teacher competencies on instructional planning and student achievement within the OBE paradigm. Unlike prior research, this study focuses on the symbiotic relationship between teachers and AI tools, emphasizing instructional planning as the key to achieving educational goals. Through such symbiosis, the research aims to offer practical guidance on how AI technologies can be adapted to overcome the limitations of conventional instructional planning and enhance educational outcomes.
This study aims to answer the central research question: How do AI ChatGPT capabilities and teacher competencies interact to shape instructional planning and student achievement within the OBE model? To address this overarching question, the following objectives were formulated:
  • To explore teachers’ perceptions of AI ChatGPT capabilities in supporting instructional planning within the OBE paradigm.
  • To examine how teachers perceive the impact of their competencies on lesson planning and their ability to integrate AI tools.
  • To assess the perceived direct and indirect effects of ChatGPT capabilities and teacher competencies on student achievement.
  • To investigate the mediating effect of instructional planning in the relationship between AI capacities and teacher competencies, and their impact on student achievement.
This study contributes to the educational literature on AI with practical, policy-relevant suggestions for policymakers, educators, and researchers. It highlights the potential of AI-supported instructional planning to achieve OBE objectives and foster innovation in higher education.

2. Theoretical Background and Hypothesis Development

The theoretical framework of this research is essentially based on the integration of the Technological Pedagogical Content Knowledge (TPACK) model [43] and the Outcome-Based Education (OBE) paradigm [44]. This integrated approach provides a solid foundation for understanding how technology (specifically AI tools like ChatGPT) and teacher competencies synergistically influence instructional planning and student academic performance.
The TPACK framework is an extension of Pedagogical Content Knowledge (PCK), proposed by Shulman [45], which focuses on the intersection of content knowledge and teaching methods. Pierson [46] added Technological Knowledge (TK) to emphasize the digital tool component for enriching pedagogy and learning. Mishra and Koehler [25] then further developed this model, formally defining TPACK as the integration of content knowledge (CK), pedagogical knowledge (PK), and technological knowledge (TK), and their intersectional domains: Technological Pedagogical Knowledge (TPK), Technological Content Knowledge (TCK), and Pedagogical Content Knowledge (PCK). The model stresses that effective technology integration in education requires teachers to build competencies across these interrelated domains. Many studies have investigated the use of TPACK in various educational contexts. Ning et al. [42] reviewed how TPACK helps teachers use technology to support student-centered learning, whereas ref. [47] examined teachers’ acquisition of TPACK skills through professional development courses. In artificial intelligence education, researchers have particularly emphasized how TPACK can facilitate teachers’ and students’ learning of AI concepts. Research [40,42,47] demonstrates the use of TPACK in AI-based learning environments, showing that teachers who successfully incorporate AI tools achieve greater instructional effectiveness and student motivation.
However, TPACK alone does not address the outcome-oriented nature of instruction. For this reason, the study also incorporates the OBE framework. While TPACK highlights the integration of teaching components [25], OBE ensures alignment with desired learning outcomes. OBE emphasizes curriculum planning and teaching strategies that correlate directly with anticipated competencies [47].
Moreover, OBE is a framework of instructional design that focuses on well-defined learning outcomes so that curriculum planning, teaching methods, and assessment schemes correlate with anticipated competencies [27]. In contrast to conventional content-driven methods, OBE shifts the focus to student accomplishments and competency-based instruction, requiring teachers to build instructional strategies that directly impact measurable outcomes [48,49]. The implementation of OBE in higher education has been extensively researched, especially in engineering, medical, and teacher education [17,50]. Morcke et al. [51] investigated the application of OBE in medical education, emphasizing its efficiency in competency-based training. Likewise, Tan et al. [52] discussed OBE’s contribution to developing students’ critical thinking and problem-solving skills. In technology-rich learning environments, OBE offers a systematic approach to assessing the success of AI-based learning tools like ChatGPT. A study by Strielkowski et al. [35] indicates that AI-based platforms can assist teachers in mapping instructional content to OBE goals, ensuring that students master the skills and knowledge required for academic success.
These two models combined form an integrated foundation for analyzing the impact of AI tools like ChatGPT and teacher competencies on instructional planning and student performance, as shown in Figure 2. The TPACK-OBE combination offers a lens for evaluating AI’s contribution to instructional planning. AI applications like ChatGPT enhance teachers’ technological knowledge (TK) with real-time assistance, lesson planning, and customized feedback [53].
Recent advancements in generative AI, especially the introduction of ChatGPT, have opened new directions in instructional planning and personalized learning [39]. ChatGPT, a language-based AI model, supports teachers by generating customized lesson plans, formative assessments, and feedback according to learning outcomes. A study by Jung and Suh [54] found that ChatGPT refined instructional content and generated student-centered activities, enhancing creativity and reducing planning time. Moreover, research by Naznin et al. [55] and Wang et al. [56] demonstrated that higher education adopted ChatGPT to support curriculum alignment with competency-based education models, thereby improving planning efficiency and instructional coherence. Case studies also highlight ChatGPT’s role in augmenting teacher agency and reflective practice. Likewise, Kasneci et al. [57] investigated the application of ChatGPT in lesson design, revealing that while novice teachers relied more heavily on AI-generated content, experienced teachers used it to enhance pedagogical strategies and align them with outcome-based goals. Furthermore, Hussain et al. [58] reported integrating ChatGPT in education settings, where AI supported formative assessment tasks and enriched learner feedback loops, fostering student engagement and academic performance. Despite these benefits, challenges persist. Concerns over content accuracy, ethical use, and over-reliance on AI tools demand a balanced, competency-informed implementation strategy. Hence, embedding AI capabilities within the TPACK-OBE framework ensures that ChatGPT use is technologically feasible, pedagogically sound, and aligned with learning outcomes. Integrating recent findings into this framework reinforces its relevance and practical utility in guiding instructional planning in AI-enhanced educational environments. 
From an OBE perspective, AI promotes competency-based learning by mapping material to learning outcomes, enhancing student performance [42]. Findings show that incorporating AI within TPACK and OBE enhances teacher competencies, quality, student engagement, and performance [59,60].
The conceptual model argues that teacher competency and AI ChatGPT capabilities are key determinants of instructional planning and student performance. ChatGPT helps educators with curriculum development, lesson planning, item writing, and individual student questions [61]. These capabilities enhance instructional planning and contribute indirectly to better student achievement. This is where teacher competency in knowledge, skills, and attitudes becomes crucial in determining the quality of instructional planning and, ultimately, student achievement. Ref. [62] revealed that when teachers are not sufficiently prepared to implement OBE, the result is often poorly designed lessons, inept assessments, and disengaged students. Teachers’ competencies shape the development of instructional strategies, ensuring coherence with OBE principles and a focus on measurable learning outcomes. Combined, the two produce a synergistic effect that improves both instructional quality and student performance. Instructional planning is the core of the OBE system and acts as a mediating variable, interconnecting AI tools, teacher competence, and student performance. Effective instructional planning ensures that learning objectives, pedagogical strategies, and assessment methods are well-integrated, leading to greater cognitive engagement and improved academic performance [63]. This research bridges the gap by exploring the interaction between these constructs and their impact using a hypothesis-driven design. The study therefore examines the role of ChatGPT’s AI capabilities and teachers’ competencies in enhancing instructional planning and students’ academic performance, both synergistically and independently. The theoretical framework grounds the hypothesized relationships and provides a holistic view of how technology, pedagogy, and learning outcomes interact within OBE.
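The mediating role of instructional planning can be made concrete with the standard product-of-coefficients decomposition used in SEM (indirect effect = a × b). The sketch below is purely illustrative, not the authors’ analysis code: it uses the standardized path values reported in the Abstract, and since the instructional planning → student performance path (b) is not stated there, it is inferred here under that assumption from each reported indirect effect.

```python
# Product-of-coefficients mediation decomposition (illustrative sketch).
# Path values are the standardized betas reported in the Abstract; the
# instructional planning -> student performance path (b) is NOT reported
# there and is inferred here only for illustration as indirect / a.

def indirect_effect(a: float, b: float) -> float:
    """Indirect effect of X on Y through mediator M: a * b."""
    return a * b

# Reported direct paths into instructional planning (IP)
a_ai = 0.33   # AI ChatGPT capabilities -> IP
a_tc = 0.37   # teacher competencies    -> IP

# Reported indirect effects on student performance (SP) via IP
ind_ai = 0.160  # AI -> IP -> SP
ind_tc = 0.180  # TC -> IP -> SP

# Implied mediator path (IP -> SP), recovered from each pair;
# both should agree if a single b underlies the model
b_from_ai = ind_ai / a_ai
b_from_tc = ind_tc / a_tc

print(round(b_from_ai, 3), round(b_from_tc, 3))  # both ~0.49
```

The two implied values of b agree to within about 0.002, which is consistent with a single mediating path underlying both reported indirect effects.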

2.1. AI ChatGPT Capabilities → Instructional Planning

Artificial intelligence, particularly the generative AI platform ChatGPT, has transformed educational instructional planning and implementation processes [64,65]. Instructional planning, a cornerstone of the Outcome-Based Education (OBE) framework, entails designing lesson plans, developing assessments, preparing course materials, and aligning teaching strategies with clearly defined learning outcomes. Traditional instructional planning often presents limitations, including time constraints, a lack of personalization, and inconsistent alignment between content, pedagogy, and assessment strategies [24,66].
These challenges can hinder educators’ ability to design compelling and engaging learning experiences. ChatGPT addresses these limitations through its advanced natural language processing (NLP) and content generation capabilities. It can automatically generate lesson plans aligned with specific learning outcomes, design formative and summative assessments, create customized instructional materials, and offer pedagogically appropriate explanations across various subjects [67,68]. By automating routine instructional tasks, ChatGPT allows educators to allocate more time to refining instructional strategies and adapting them to student needs, thereby enhancing overall instructional quality and student engagement [33]. ChatGPT’s flexibility also allows it to meet varied pedagogical requirements across subject areas: providing templates, suggesting creative pedagogies, or, where required, explaining complex concepts concisely [69]. Case studies, such as Zou et al. [70], demonstrate ChatGPT’s effectiveness in co-creating STEM lesson plans with novice teachers, enabling improved clarity, structure, and outcome alignment. Similarly, Naznin et al. [55] reported enhanced teacher productivity and student engagement when ChatGPT was integrated into the curriculum design for higher education courses. Empirical research likewise indicates that AI’s emerging potential in learning environments supports effective instructional planning and alignment with learners’ intended outcomes [61].
Moreover, this impact can be understood through the Technological Pedagogical Content Knowledge (TPACK) model, which holds that effective teaching results from the combination of content knowledge, pedagogical expertise, and technological proficiency. ChatGPT enhances teachers’ TPACK by enabling them to integrate technology meaningfully with their pedagogical strategies and content expertise. Using ChatGPT, educators can develop lesson plans that not only meet curriculum standards (Content Knowledge) but also apply effective teaching methods (Pedagogical Knowledge) and leverage AI-generated suggestions, resources, or assessments (Technological Knowledge) to adapt instruction to learners’ needs. This synthesis empowers educators to plan more effectively, ensuring instructional activities are engaging, goal-oriented, and aligned with OBE principles, and it gives teachers opportunities to overcome common limitations of traditional planning, such as insufficient preparation time and a lack of customized educational resources. Thus, AI, specifically ChatGPT, is likely to affect instructional planning positively. This discussion highlights the capacity of artificial intelligence to make pedagogical methods more effective and efficient, thereby ensuring better outcomes in the Outcome-Based Education (OBE) system. The following hypothesis is tested empirically to confirm the contribution of AI technologies to instructional planning:
H1. 
The use of AI ChatGPT significantly and positively influences instructional planning.

2.2. AI ChatGPT Capabilities → Students’ Performance

AI ChatGPT capabilities have emerged as a transformative force in improving student performance within educational settings. Integrating ChatGPT and other forms of artificial intelligence into learning environments significantly enhances the student learning experience through real-time, personalized feedback that improves retention and encourages participation [61]. Student performance, usually defined by academic grades, depth of understanding, and problem-solving ability, relies greatly on the quality of learning materials and support [71].
From the perspective of Outcome-Based Education (OBE), student performance is the ultimate metric for evaluating the efficacy of instructional design and delivery. Within this framework, ChatGPT is positioned to play a critical role in helping students achieve defined learning outcomes by offering individualized support, reducing cognitive load, and enhancing engagement [72]. In parallel, the Technological Pedagogical Content Knowledge (TPACK) model provides the theoretical underpinning for understanding how teachers’ competence in AI technologies, specifically their Technological Knowledge (TK), affects their ability to facilitate high-quality learning experiences [73,74].
In conventional learning environments, a shortage of qualified teachers, homogeneous teaching techniques, and insufficient personalized support contribute to underachievement among students [23]. ChatGPT addresses such issues by offering detailed explanations, answering questions, and giving instant feedback. It provides learners with a vast repository of knowledge, supports personalized learning processes, and encourages interactive engagement through AI-based mentoring. Notably, the TPACK model emphasizes that teachers’ ability to use AI tools effectively, combined with sound pedagogical and content knowledge, enhances the delivery of instruction and positively impacts learner outcomes. For example, a teacher proficient in using ChatGPT can model more complex thinking, facilitate deeper inquiry, and guide self-directed learning, skills directly linked to student academic performance [58,75].
Empirical research has demonstrated that AI-based learning tools like ChatGPT improve learners’ motivation, deep learning, and academic achievement by addressing individual learning needs [76,77]. Additionally, ChatGPT helps develop critical thinking skills by providing context-specific explanations, supporting problem-solving processes, and expanding self-assessment opportunities. By making learning more accessible, interactive, and personalized to individual needs, ChatGPT plays a vital role in improving student performance. It is therefore hypothesized that the capabilities of AI, specifically ChatGPT, positively affect academic achievement, making AI an essential tool in modern education.
H2. 
AI ChatGPT capabilities significantly and positively impact student performance.

2.3. Teacher Competency → Instructional Planning

Teacher competency plays a significant role in the effective planning of instruction. It includes pedagogical knowledge, subject-matter competence, technological competency, and the ability to align curriculum goals with teaching pedagogy [78,79]. Highly skilled teachers can use innovative instructional methods, synchronize instruction with the OBE paradigm, and apply appropriate assessment methods to monitor progress [80]. According to the TPACK framework, this integration depends on a teacher’s ability to synthesize Technological Knowledge (TK), Pedagogical Knowledge (PK), and Content Knowledge (CK) in a way that supports dynamic instructional planning aligned with curriculum goals and learner needs [81].
Instructional planning involves a systematic approach to designing learning experiences that meet targeted outcomes, including developing lesson plans, aligning content with learning objectives, integrating assessments, and incorporating appropriate teaching strategies [82]. As per the OBE model, teachers are responsible for designing instruction that ensures students achieve predefined competencies, making instructional planning a performance-critical activity [83].
Nevertheless, the existing literature indicates that many teachers face challenges in planning instruction owing to inadequate training, limited awareness of assessment design, and difficulties incorporating technology into pedagogy [84]. According to a study [85], teachers with sound pedagogical capabilities are more effective in constructing systematic instructional plans; consequently, student engagement is higher and learning outcomes are enhanced. This underscores the importance of teacher competency in instructional planning, since it directly relates to curriculum planning, assessment planning, and the alignment of pedagogical practice with instructional intent. Teachers with strong pedagogical foundations and TPACK-based technological competence can better plan coherent, responsive, and student-centered instruction [86,87].
Teachers with high levels of competency are more adept at aligning their instructional strategies with clearly defined learning objectives, ensuring that the teaching activities are purposefully directed toward achieving the desired educational outcomes. Such teachers are also more capable of integrating the technology into their pedagogical practices in ways that support a variety of learning modalities, thereby catering to diverse student needs [88]. Additionally, they are skilled in designing meaningful and measurable assessments, enabling accurate evaluation of student learning progress. Competent educators foster inclusive learning environments responsive to learners’ varied backgrounds, abilities, and preferences, thereby promoting equitable access to quality education.
This perspective is strongly supported by the Technological Pedagogical Content Knowledge (TPACK) framework, which emphasizes the critical relationship between a teacher’s knowledge of content, pedagogy, and technology. According to the TPACK model, effective instructional planning rests not on content expertise or pedagogical know-how alone but on a teacher’s ability to integrate all three domains in a cohesive and contextually appropriate manner. A teacher who understands how to use artificial intelligence tools like ChatGPT effectively can incorporate these technologies into planning and delivery processes to create more interactive, personalized, and data-driven learning experiences. Such integration enhances instructional quality and aligns with the core principles of Outcome-Based Education (OBE), which prioritizes measurable learning achievements and learner-centered pedagogies.
H3. 
Teacher competency has a significant positive impact on instructional planning.

2.4. Teacher Competency → Students’ Performance

Teacher competence is a significant factor in determining students’ performance, as it affects the quality of instruction, student engagement approaches, and responsiveness to varying learning needs [79]. Studies indicate that teachers with strong subject-matter and pedagogical competence create more interactive, engaging, and student-focused learning environments, thus improving academic performance [89]. Effective teaching involves assessing students’ learning needs, identifying their learning gaps, providing timely and constructive feedback, and adopting new instructional methods to enhance understanding and improve learning outcomes [90,91].
Research consistently demonstrates that competent teachers who clearly understand curriculum objectives, employ diverse instructional strategies, integrate suitable digital tools, and effectively assess learning outcomes tend to create learning environments that foster higher levels of student engagement, motivation, and achievement [79,89]. These teachers are not only content experts but also skilled facilitators of learning, capable of diagnosing individual student needs and addressing learning gaps through tailored instruction. They provide timely and constructive feedback, essential for guiding student improvement and encouraging reflective learning practices. Moreover, such educators utilize adaptive pedagogical approaches that respond dynamically to classroom contexts and individual learner differences, thereby enhancing the inclusivity and effectiveness of instruction. Furthermore, competent teachers promote the development of higher-order thinking skills, such as critical thinking, problem-solving, and self-directed learning competencies, crucial for success in contemporary educational and professional landscapes. The Technological Pedagogical Content Knowledge (TPACK) framework stresses the importance of this multidimensional competence by highlighting how effective teaching depends on the strategic integration of content knowledge, pedagogy, and technology. When proficient in TPACK, teachers can seamlessly integrate tools like ChatGPT to support differentiated instruction and personalized learning pathways aligned with Outcome-Based Education (OBE) principles.
On the contrary, teachers with low competencies struggle to deliver lessons effectively, rely on inferior instructional strategies, and provide inadequate assessment feedback, which harms students’ learning [92]. Empirical studies have illustrated that students taught by teachers with high TPACK proficiency tend to perform better in standardized assessments and exhibit deeper understanding and skill acquisition compared to peers instructed by less competent educators [92,93]. It is, therefore, argued that teachers’ competency reflects positively on students’ performance; continuous professional development and training of teachers is thus necessary.
H4. 
Teacher competency has a significant impact on enhancing students’ performance.

2.5. Instructional Planning → Students’ Performance

Instructional planning is a fundamental aspect of effective teaching, significantly influencing student learning outcomes. It entails the deliberate organization of learning activities, teaching methods, instructional materials, and assessment strategies, coherently aligned with clearly defined learning objectives and principles central to the Outcome-Based Education (OBE) approach [58]. From the perspective of the Technological Pedagogical Content Knowledge (TPACK) framework, instructional planning requires the seamless integration of content expertise, pedagogical strategies, and technological tools to design instruction that is both goal-oriented and responsive to students’ diverse learning needs. Student performance relies significantly on effective instructional planning because it ensures that learning goals, teaching methods, and assessment strategies are aligned to facilitate meaningful educational experiences [63]. Planning instruction involves developing lesson materials, preparing formative and summative assessments, and selecting effective teaching strategies that foster student engagement and understanding [94].
Effective instructional planning includes several key components: designing lessons with specific objectives that directly map to intended learning outcomes [95]; employing pedagogical techniques suited to students’ cognitive abilities and learning preferences [96]; integrating technology into personalized and interactive learning experiences [97]; and incorporating both formative and summative assessments to monitor progress and provide feedback [98]. Teachers who apply their TPACK knowledge during the planning process are better equipped to craft instructional experiences that are engaging, inclusive, and intellectually stimulating. Such planning promotes higher levels of student motivation and critical thinking and enhances academic achievement [78,79]. Technology-enhanced lesson plans can include simulations, multimedia resources, and instant feedback mechanisms, all contributing to deeper understanding and improved knowledge retention.
The evidence has shown that well-organized instructional planning can enhance student achievement by providing clear learning pathways, encouraging deeper cognitive engagement, and offering consistent progress monitoring [63,99]. Conversely, poor instructional planning often results in disorganized lessons, ineffective assessments, and disengaged students, leading to lower motivation and poorer academic results [100]. It has also been shown that when teachers included differentiated instruction and technology-enhanced tools in their plans, students demonstrated improved problem-solving and retention [101]. Since instructional planning directly impacts the quality of teaching and learning, effective planning contributes positively to student performance. This calls for teachers to build their planning skills and use AI-based tools to enhance lesson organization and assessment strategies for improved learning outcomes.
H5. 
Effective instructional planning has a significant positive effect on student performance.

3. Methodology

3.1. Research Design

This study adopted a quantitative, cross-sectional research design to explore the interactions between AI ChatGPT 3.5 capabilities, teacher competencies, instructional planning, and student performance within an Outcome-Based Education (OBE) context. The quantitative approach aims to empirically test hypothesized relationships among latent variables and measure their direct and indirect effects [102]. A cross-sectional design was considered appropriate because the research sought to assess faculty perceptions and behaviors at a single point in time rather than over an extended period, which aligns with the efficiency of cross-sectional designs in capturing current phenomena [103]. The overall research design is shown in Figure 3.

3.2. Sampling Strategy

A non-probability convenience sampling technique was employed to recruit faculty members from three public-sector universities in Sindh, Pakistan. This sampling method was driven by practical considerations such as administrative limitations, restricted institutional access, and the need to reach respondents efficiently through widely used communication channels such as WhatsApp, institutional email, and learning management systems [103,104]. Although convenience sampling lacks the randomness of probability methods, it is commonly used in educational technology studies where access to a broad population is limited but the goal is to gain insights into a specific user group: educators actively involved in instructional design and technology integration.
From 330 distributed invitations, 320 fully completed and valid responses were obtained, yielding a response rate of 97.0%, which is notably high for online survey-based studies. This sample size is particularly justified for Partial Least Squares Structural Equation Modeling (PLS-SEM), which is effective for small to medium sample sizes and robust to non-normal data distributions [105]. According to statistical guidelines, a minimum of 10 responses per indicator or a minimum of 200 responses is acceptable for stable PLS-SEM estimation [106]. With 21 indicators used in this study, the final sample of 320 far exceeds these requirements, ensuring sufficient statistical power for hypothesis testing and mediation analysis.

3.3. Instrument Design and Validation

A structured questionnaire was designed to capture the four core constructs of the theoretical model: (1) AI ChatGPT Capabilities, with five items (ACC1–ACC5); (2) Teacher Competency, with five items (TCO1–TCO5); (3) Instructional Planning, with six items (INP1–INP6); and (4) Students’ Performance, with five items (STP1–STP5). The instrument consisted of 21 items, each measured on a five-point Likert scale ranging from 1 (Strongly Disagree) to 5 (Strongly Agree). Items were adapted from [107,108] and other relevant studies [25,109] and were selected based on their relevance to AI tool use, pedagogical planning, and perceived teaching efficacy. For transparency, the questionnaire items are provided in Appendix A.
The survey was reviewed by three domain experts specializing in instructional design, educational technologies, and AI integration in teaching to ensure content validity. Their feedback led to minor wording adjustments to enhance clarity and alignment with the OBE instructional framework. A pilot study was then conducted with 55 faculty members to further verify instrument reliability and internal consistency. Results demonstrated high reliability across all constructs, with Cronbach’s alpha values exceeding 0.80 (AI ChatGPT Capabilities = 0.83; Teacher Competency = 0.89; Instructional Planning = 0.87; Student Performance = 0.86), well above the widely accepted threshold of 0.70 [110].

3.4. Ethical Considerations

This research followed established ethical protocols. Participation was voluntary, and informed consent was acquired digitally before data collection. Participants were assured that their responses would be kept anonymous and confidential and that the data would be used solely for research purposes. These measures ensured compliance with ethical standards in social science research [111].

3.5. Data Analysis Procedure

Partial Least Squares Structural Equation Modeling (PLS-SEM) was applied using SmartPLS 4.0 software to test the hypothesized model and assess the relationships among constructs. The decision to employ PLS-SEM over other structural modeling techniques, such as covariance-based SEM (CB-SEM) or AMOS, was guided by the exploratory nature of the study and the model’s predictive orientation [105,112]. Unlike CB-SEM, which emphasizes model fit and theory confirmation, PLS-SEM is variance-based and better suited for predictive analysis and theory development [112,113].
Furthermore, the SEM analysis proceeded in two stages. Firstly, the measurement model was evaluated to ensure reliability and validity. This involved assessing factor loadings (threshold > 0.70), Cronbach’s alpha and composite reliability (threshold > 0.70), and AVE (threshold > 0.50). Discriminant validity was confirmed using HTMT ratios (threshold < 0.85) and Fornell-Larcker criteria [114].
Secondly, the structural model was assessed to test the hypothesized relationships among the constructs. This involved evaluating the R2 values to determine the explanatory power of independent variables on dependent constructs [115], f2 effect sizes to estimate the strength of individual predictors, and standardized path coefficients (β) to test directional hypotheses [116]. A bootstrapping procedure with 5000 resamples was employed to determine the estimated paths’ statistical significance (p-values), a method that ensures robust inference even in non-normal data distributions [117].
This theory-driven PLS-SEM approach strengthens construct validity and supports robust inferences about the relationships between teacher competencies, AI capabilities, instructional planning, and student performance within the OBE framework.

4. Findings

4.1. Survey Results

The survey was completed by 320 university lecturers from QUEST (192, 60%), SBBU (96, 30%), and SALU (32, 10%) to study the impact of Outcome-Based Education (OBE) and AI tools such as ChatGPT on teaching planning and student performance, as highlighted in Table 1. The sample comprised 218 (68%) males and 102 (32%) females, of whom 160 (50%) were lecturers, 96 (30%) assistant professors, 48 (15%) associate professors, and 16 (5%) professors. Levels of education were diverse, with 128 (40%) holding a Master’s degree, 112 (35%) an MPhil/MS, and 80 (25%) a PhD. Teaching experience also varied: 144 (45%) had 1–5 years, 112 (35%) had 6–10 years, and 64 (20%) had more than 11 years of experience.
The survey findings reflect high acceptance of OBE and AI applications in higher education. A total of 282 (88%) respondents agreed that OBE improves student learning and academic achievement, and 272 (85%) agreed on its success in connecting course objectives with performance outcomes. AI tools such as ChatGPT were also well received, with 240 (75%) reporting better teaching planning and 256 (80%) expressing support for using AI in teaching. Also, 250 (78%) believed that AI-based feedback systems improved student engagement. These findings indicate a positive trend toward AI-facilitated OBE systems, affirming the significance of technology-facilitated teaching methods in higher education and suggesting that AI and OBE can significantly enhance faculty instructional practices and student learning outcomes, helping universities keep up with contemporary educational standards.

4.2. Measurement Model Analysis

Firstly, the convergent validity, reliability, and potential multicollinearity of the measurement model were evaluated. The key parameters verified were factor loadings, Cronbach’s Alpha (α), Composite Reliability (CR), and Average Variance Extracted (AVE); the findings are shown in Table 2. Convergent validity is confirmed when items measuring the same construct correlate strongly, and it is evaluated using factor loadings, CR, and AVE. Hair et al. [112] suggest that factor loadings should be above 0.70, CR above 0.70, and AVE above 0.50. As seen in Table 2, the factor loadings exceed the 0.70 benchmark for all items except ACC05 (0.68) and INP01 (0.69), which are slightly lower but still acceptable [118]. The VIF for all items is below 5.0, confirming the absence of multicollinearity issues, as suggested by Ref. [119]. The CR values for all constructs exceed the suggested 0.70 benchmark, ranging from 0.88 to 0.91, confirming the internal consistency of the constructs, as recommended by Ref. [120]. The AVE values exceed 0.50, confirming convergent validity. The mathematical formula for AVE is:
AVE = (Σ λi²) / n
where λi represents the factor loading of each indicator and n is the total number of items.
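As a minimal sketch of this computation, the formula can be applied to a set of standardized loadings; the values below are illustrative only, not the study’s estimates.

```python
# Illustrative AVE computation from standardized factor loadings.
# AVE = (sum of squared standardized loadings) / number of items.

def ave(loadings):
    """Average Variance Extracted for one construct."""
    return sum(l ** 2 for l in loadings) / len(loadings)

# Hypothetical loadings for a five-item construct (e.g., ACC1-ACC5)
acc_loadings = [0.78, 0.81, 0.74, 0.79, 0.68]
print(round(ave(acc_loadings), 2))  # prints 0.58; above the 0.50 threshold
```

An AVE above 0.50 means the construct explains, on average, more than half of its indicators’ variance.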
Cronbach’s Alpha (α) and Composite Reliability (CR) were used to assess reliability. The results indicate that Cronbach’s Alpha values for all constructs range from 0.83 to 0.88, exceeding the 0.70 threshold [110]. This confirms strong internal reliability.
The formula for Cronbach’s Alpha is:
α = (N · c̄) / (v̄ + (N − 1) · c̄)
where N is the number of items, c ¯ is the average covariance between items, and v ¯ is the average variance.
Similarly, Composite Reliability (CR) was computed using:
CR = (Σ λi)² / ((Σ λi)² + Σ θi)
where λi are the item loadings and θi are the item error variances.
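The two reliability formulas above can be sketched as follows; the covariance matrix and loadings are hypothetical examples, not the study’s data, and for standardized items the error variance is taken as θi = 1 − λi².

```python
# Illustrative Cronbach's alpha and composite reliability (CR)
# computations, following the formulas in the text.

def cronbach_alpha(cov):
    """alpha = (N * c_bar) / (v_bar + (N - 1) * c_bar)."""
    n = len(cov)
    v_bar = sum(cov[i][i] for i in range(n)) / n            # mean item variance
    off = [cov[i][j] for i in range(n) for j in range(n) if i != j]
    c_bar = sum(off) / len(off)                             # mean inter-item covariance
    return (n * c_bar) / (v_bar + (n - 1) * c_bar)

def composite_reliability(loadings):
    """CR = (sum loadings)^2 / ((sum loadings)^2 + sum error variances),
    with error variance theta_i = 1 - loading_i^2 for standardized items."""
    s = sum(loadings)
    theta = sum(1 - l ** 2 for l in loadings)
    return s ** 2 / (s ** 2 + theta)

# Hypothetical three-item covariance matrix and loadings
cov = [[1.00, 0.60, 0.55],
       [0.60, 1.00, 0.62],
       [0.55, 0.62, 1.00]]
print(round(cronbach_alpha(cov), 2))                 # prints 0.81
print(round(composite_reliability([0.78, 0.81, 0.74]), 2))  # prints 0.82
```

Both values exceeding 0.70 would indicate acceptable internal consistency, as in the thresholds cited above.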
The measurement model results establish the convergent validity and reliability of the constructs: CR values are greater than 0.70 and AVE values are greater than 0.50, fulfilling standard criteria [120]. Hence, the constructs have good measurement properties.

4.3. Discriminant Validity Analysis

Discriminant validity ensures that each construct in the model is empirically distinct from the others. It is established using the Fornell-Larcker Criterion and the Heterotrait-Monotrait Ratio (HTMT) [120]. These tests verify that constructs relate more strongly to their own indicators than to other constructs.
The Fornell-Larcker Criterion compares the square root of a construct’s Average Variance Extracted (AVE) with its correlations with the other constructs. Discriminant validity is established when the square root of AVE exceeds those correlations.
In simple words, the Fornell-Larcker criterion can be articulated as:
√AVEi > rij
where √AVEi is the square root of the AVE of construct i, and rij is the correlation between constructs i and j.
The diagonal entries in Table 3 indicate the square root of AVE. These values exceed the corresponding inter-construct correlations, indicating the presence of discriminant validity. For example, the square root of AVE for AI ChatGPT Capabilities (ACC) is 0.77, alongside its correlations with Instructional Planning (INP) at 0.59, Student Performance (STP) at 0.71, and Teacher Competency (TCO) at 0.85. Discriminant validity is also tested through HTMT, which compares the average correlations between items of different constructs (heterotrait) with the average correlations among items within the same construct (monotrait) [117]. An HTMT value under 0.90 (standard threshold) or 0.85 (strict threshold) indicates good discriminant validity. HTMT is computed as:
HTMTij = (mean |rcd|, c ∈ Ci, d ∈ Cj) / √((mean rcc′ within Ci) · (mean rdd′ within Cj))
where Ci and Cj denote the sets of items belonging to constructs i and j, rcd is the correlation between items c and d, and the denominator terms are the average within-construct (monotrait) item correlations.
From Table 3, all HTMT values are below 0.90, confirming adequate discriminant validity. For example, the HTMT value between AI ChatGPT Capabilities (ACC) and Instructional Planning (INP) is 0.69, below the 0.85 threshold, confirming that they are distinct constructs.
The Fornell-Larcker Criterion and HTMT results confirm that all the constructs exhibit adequate discriminant validity. The square root of AVE for each construct is greater than its correlations with the other constructs, and all the HTMT values remain below 0.85, ensuring that the constructs are empirically distinct.
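Both discriminant validity checks can be sketched in code; the correlation values below are toy examples (not the study’s data), with a two-items-per-construct matrix purely for illustration.

```python
# Illustrative Fornell-Larcker check and HTMT ratio computation.
import math
from itertools import combinations
from statistics import fmean

def fornell_larcker_ok(sqrt_ave, construct_corr):
    """True if each construct's sqrt(AVE) exceeds all of its
    correlations with the other constructs."""
    k = len(sqrt_ave)
    return all(sqrt_ave[i] > abs(construct_corr[i][j])
               for i in range(k) for j in range(k) if i != j)

def htmt(item_corr, items_i, items_j):
    """Mean absolute heterotrait correlation divided by the geometric
    mean of the two constructs' mean monotrait correlations."""
    hetero = [abs(item_corr[c][d]) for c in items_i for d in items_j]
    mono_i = [item_corr[c][d] for c, d in combinations(items_i, 2)]
    mono_j = [item_corr[c][d] for c, d in combinations(items_j, 2)]
    return fmean(hetero) / math.sqrt(fmean(mono_i) * fmean(mono_j))

# Toy item correlation matrix: items 0-1 form construct A, items 2-3 form B
item_corr = [[1.00, 0.70, 0.40, 0.35],
             [0.70, 1.00, 0.38, 0.42],
             [0.40, 0.38, 1.00, 0.66],
             [0.35, 0.42, 0.66, 1.00]]
print(round(htmt(item_corr, [0, 1], [2, 3]), 2))  # prints 0.57 (< 0.85)
print(fornell_larcker_ok([0.77, 0.74], [[1.00, 0.59],
                                        [0.59, 1.00]]))  # prints True
```

A value below the 0.85 strict threshold, as here, would support treating the two constructs as empirically distinct.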
  • Effect Size (f2) and Coefficient of Determination (R2) Analysis
The effect size (f2) measures the relative impact of an independent variable on a dependent variable within the structural model [116]. It quantifies how much an exogenous construct contributes to explaining the variance of an endogenous construct. According to Cohen [116], f2 is interpreted as:
  • Small effect: 0.02 ≤ f2 < 0.15
  • Moderate effect: 0.15 ≤ f2 < 0.35
  • Large effect: f2 ≥ 0.35
The mathematical formula for calculating f2 is:
f2 = (R2_included − R2_excluded)/(1 − R2_included)
From the results, the following effect sizes were observed:
  • ACC → INP (f2 = 0.09): Small effect
  • ACC → STP (f2 = 0.04): Small effect
  • INP → STP (f2 = 0.33): Moderate effect
  • TCO → INP (f2 = 0.11): Small effect
  • TCO → STP (f2 = 0.03): Small effect
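The f² computation and classification above can be sketched as follows; the R² values are illustrative inputs chosen to reproduce the reported INP → STP effect size (approximately 0.33), not figures taken from the study’s model runs.

```python
# Illustrative Cohen's f^2 computation and classification.

def f_squared(r2_included, r2_excluded):
    """f^2 = (R^2_included - R^2_excluded) / (1 - R^2_included)."""
    return (r2_included - r2_excluded) / (1 - r2_included)

def classify_f2(f2):
    """Classify an effect size using Cohen's thresholds."""
    if f2 >= 0.35:
        return "large"
    if f2 >= 0.15:
        return "moderate"
    if f2 >= 0.02:
        return "small"
    return "negligible"

# Hypothetical R^2 for STP with and without INP as a predictor
f2 = f_squared(0.56, 0.4148)
print(round(f2, 2), classify_f2(f2))  # prints 0.33 moderate
```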
The R2 value (coefficient of determination) represents the proportion of variance explained by the predictor variables for an endogenous construct. It is interpreted as:
  • R2 ≥ 0.75 → Substantial
  • 0.50 ≤ R2 < 0.75 → Moderate
  • 0.25 ≤ R2 < 0.50 → Weak
Mathematically, R2 is calculated as:
R2 = Σ(Ŷi − Ȳ)2/Σ(Yi − Ȳ)2
R² values: INP (R² = 0.41), meaning the model explains 41% of the variance in Instructional Planning; STP (R² = 0.56), meaning the model explains 56% of the variance in Student Performance, indicating moderate explanatory power.
The f2 and R2 values indicate that Instructional Planning (INP) is the strongest predictor of Student Performance (STP), while AI ChatGPT Capabilities (ACC) and Teacher Competency (TCO) contribute meaningfully to both Instructional Planning and Student Performance. The moderate R2 values confirm that the model explains a substantial proportion of variance, supporting its predictive validity.
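The R² formula above can be illustrated directly; the observed and predicted scores below are toy values, not the study’s data, and this explained-over-total form matches the residual-based form only when predictions come from a least-squares fit with an intercept.

```python
# Illustrative R^2 computation following the formula in the text:
# R^2 = sum((y_hat_i - y_bar)^2) / sum((y_i - y_bar)^2).

def r_squared(y, y_hat):
    """Proportion of variance in y explained by predictions y_hat."""
    y_bar = sum(y) / len(y)
    ss_explained = sum((yh - y_bar) ** 2 for yh in y_hat)
    ss_total = sum((yi - y_bar) ** 2 for yi in y)
    return ss_explained / ss_total

y = [1, 2, 3, 4, 5]                 # observed scores
y_hat = [1.2, 1.9, 3.0, 4.1, 4.8]   # model-predicted scores
print(round(r_squared(y, y_hat), 2))  # prints 0.89
```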

4.4. Structural Model Analysis

Structural model analysis investigates relationships among the constructs by inspecting path coefficients, statistical significance, and indirect effects. Structural Equation Modeling (SEM) was conducted using the Partial Least Squares (PLS) method. Direct-effect results are shown in Table 4 and Figure 4, and indirect-effect results are also reported in Table 4. Direct effects among AI ChatGPT Capabilities (ACC), Teacher Competency (TCO), Instructional Planning (INP), and Student Performance (STP) were investigated by examining path coefficients (β), t-values, and p-values for hypothesis testing. A t-value greater than 1.96 and a p-value less than 0.05 indicate statistical significance.
Mathematically, structural equations may be represented as follows:
INP = β1 * ACC + β2 * TCO + e1
STP = β3 * ACC + β4 * TCO + β5 * INP + e2
The indirect effects assess the mediating role of instructional planning (INP) in the relationship between AI ChatGPT Capabilities (ACC) and Student Performance (STP), as well as Teacher Competency (TCO) and Student Performance (STP). Bootstrapping with 5000 resamples was used to determine the significance of indirect effects.
Mathematically, the indirect effect equations are:
Indirect effect (ACC → STP) = β(ACC → INP) × β(INP → STP)
Indirect effect (TCO → STP) = β(TCO → INP) × β(INP → STP)
The results demonstrate that AI ChatGPT Capabilities (ACC) and Teacher Competency (TCO) positively influence Instructional Planning (INP), which in turn significantly impacts Student Performance (STP). The indirect effects confirm the mediating role of INP in these relationships, suggesting that effective instructional planning enhances student outcomes.
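The bootstrap procedure for an indirect effect can be sketched conceptually as below. This is not the study’s analysis: the actual estimation used SmartPLS with 5000 resamples, whereas here a simple OLS slope stands in for a PLS path coefficient, the data are synthetic, and 1000 resamples are used for brevity.

```python
# Illustrative bootstrap of an indirect (mediated) effect:
# indirect = b(ACC -> INP) * b(INP -> STP), resampled with replacement.
import random

random.seed(42)

def slope(x, y):
    """Simple OLS slope, used as a stand-in for a path coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

def bootstrap_indirect(acc, inp, stp, resamples=1000):
    """Bootstrap distribution of the product of the two path slopes."""
    n = len(acc)
    effects = []
    for _ in range(resamples):
        idx = [random.randrange(n) for _ in range(n)]
        a = slope([acc[i] for i in idx], [inp[i] for i in idx])
        b = slope([inp[i] for i in idx], [stp[i] for i in idx])
        effects.append(a * b)
    return effects

# Synthetic data with a positive ACC -> INP -> STP chain
acc = [random.gauss(0, 1) for _ in range(200)]
inp = [0.5 * a + random.gauss(0, 1) for a in acc]
stp = [0.6 * i + random.gauss(0, 1) for i in inp]

effects = bootstrap_indirect(acc, inp, stp)
mean_effect = sum(effects) / len(effects)
print(mean_effect > 0)  # prints True: a positive mediated effect
```

The spread of the bootstrap distribution is what yields the t-values and p-values reported for the indirect paths.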

5. Discussion

This study explores how AI ChatGPT capabilities and teacher competencies influence instructional planning and student performance within an outcome-based education system. The direct and indirect effects of the variables were estimated through Structural Equation Modeling, providing useful insights for higher education institutions that have already integrated AI tools into instruction. These results are consistent with previous literature on AI-based instruction; in addition, they offer new insight by identifying instructional planning as a mediating factor.
The research determined that AI ChatGPT capabilities significantly impact instructional planning, as evidenced by a significant positive effect (β = 0.33, p < 0.001). This finding supports previous studies highlighting AI’s potential to enhance lesson planning, automate content creation, and personalize instructional materials based on student needs [121,122]. AI-powered tools like ChatGPT streamline curriculum development, allowing educators to allocate more time to interactive and student-centered teaching approaches [123]. Furthermore, Mariani et al. [124] revealed that AI-driven platforms enhance efficiency by reducing teachers’ workload and improving lesson adaptability. From the TPACK (Technological Pedagogical Content Knowledge) perspective, ChatGPT is a technological tool that enhances the integration of teachers’ pedagogical and content knowledge. By assisting in instructional planning, AI supports educators in effectively combining subject matter expertise with pedagogical strategies while leveraging technology to optimize content delivery and engagement [50,59,125]. Van et al. [39] indicate that AI applications in education facilitate differentiated instruction, ensuring inclusivity for students with diverse learning needs. The practical implications include minimizing teachers’ workload and ensuring curricula meet OBE demands [66]. AI tools facilitate more customized and efficient work, easing lesson planning and resource allocation [123]. The results, however, also highlight the importance of training programs that enhance instructors’ AI literacy: teachers must be able to use AI tools effectively for OBE implementation in universities, since low proficiency may hinder their efficiency [53].
Teacher competency is another influential variable in instructional planning (β = 0.37, p < 0.001). This finding reinforces previous research emphasizing the critical role of pedagogical and technological skills in lesson planning [27,40,42]. Ning et al. [42] highlighted that teachers with strong technological competence can effectively integrate AI tools into their instructional design, automating lesson planning, enhancing content personalization, and streamlining assessment processes. This aligns with studies highlighting that teachers proficient in digital tools are better equipped to leverage AI-driven platforms for adaptive learning and differentiated instruction [35,47,59]. Furthermore, teachers’ technological competency plays a vital role in implementing Outcome-Based Education (OBE), as AI tools assist in designing competency-driven curricula, generating data-driven insights, and facilitating formative assessments [27,29]. Almuhanna et al. [33] also indicate that AI-powered educational systems support teachers in aligning instructional strategies with learning objectives, ensuring that students achieve targeted competencies more effectively. Integrating AI into instructional planning requires both technological proficiency and pedagogical adaptability. The TPACK framework highlights that effective teaching with AI tools necessitates a balance between technological knowledge (TK), pedagogical knowledge (PK), and content knowledge (CK), ensuring that technology enhances rather than replaces instructional expertise [33,40,47]. This supports prior findings that educators who receive AI training demonstrate increased confidence in using AI-driven platforms for instructional design, leading to improved teaching efficiency and student outcomes [126,127].
Given the increasing reliance on AI in education, future research should explore professional development programs that enhance teachers’ AI literacy, ensuring that they can maximize AI’s potential in instructional planning while maintaining pedagogical integrity. Universities must invest in training to enable teachers with capabilities to incorporate AI tools into instruction.
The results further reveal significant influences of AI ChatGPT capabilities (β = 0.20, p < 0.001) and teacher competencies (β = 0.16, p = 0.020) on student achievement. These results support prior findings showing that AI-aided instructional design positively contributes to student interest and academic attainment by adapting lessons and giving learner-centric feedback [128,129]. Fabiyi and Damilola [130] and Rani et al. [131] stated that AI-based tools, including ChatGPT, facilitate instructional planning by matching content with Outcome-Based Education (OBE) standards to guarantee that learning outcomes are achieved efficiently. Furthermore, teachers’ skills remain essential in critically aligning AI-based instructional planning with students’ needs, ensuring that AI-generated content is pedagogically valid and developmentally appropriate [32]. Additionally, Karataş et al. [40] noted that the TPACK model underscores that technological, pedagogical, and content knowledge are interconnected and crucial to realizing the full potential of AI in teaching. Teachers who successfully implement AI in their teaching strategies can close the gap between technology and instruction, so that instructional planning is not only practical but also leads to deeper student involvement. Seo et al. [132] likewise found that AI-augmented education leads to more meaningful learning experiences and improved academic outcomes when guided by skilled teachers. These findings indicate that although AI is valuable in planning instruction, the teacher’s proficiency in using AI-based strategies remains a crucial predictor of student achievement.
Additionally, instructional planning mediates the relationships between AI ChatGPT capabilities, teacher competence, and student achievement [66]. The indirect effects reflect the importance of systematic lesson planning in linking AI tools and teacher competence to learning outcomes. Effective instructional planning, supported by AI ChatGPT and teacher competencies, significantly improves student learning. This supports evidence that well-designed lesson content improves students’ engagement and performance [133,134]. At the institutional level, this calls for investment in AI-integrated instructional design: AI-enabled instructional models should be developed so that lesson planning becomes more effective and student engagement increases. Additionally, faculty training programs must include AI-enabled instructional design practices to maximize learning efficiency.
This study provides empirical evidence that AI ChatGPT capabilities and teacher competencies significantly enhance instructional planning and student performance within higher education’s Outcome-Based Education (OBE) system. AI-driven instructional support optimizes lesson planning, content development, and personalized learning, reinforcing the critical role of teachers’ technological and pedagogical expertise in effective AI integration. The TPACK framework offers a valuable perspective on AI adoption in education, emphasizing that AI tools like ChatGPT are most effective when teachers possess the pedagogical and technological knowledge needed to adapt and implement AI-generated resources. A strong TPACK foundation enables faculty to align AI-driven instructional planning with student-centered learning strategies and academic requirements. This study also highlights the need for AI literacy and faculty training programs in higher education. Training should focus on both technical proficiency and critical evaluation of AI-generated content to ensure its accuracy and alignment with OBE objectives. By strengthening faculty competencies in AI integration, institutions can leverage technology to enhance, rather than replace, effective teaching, ultimately fostering adaptive, student-centered, and outcome-driven learning environments. Future research should consider the long-term effects and cross-disciplinary applications of AI tools in education.

5.1. Practical Implications

This study provides valuable insights regarding the contribution of ChatGPT to addressing the challenges of Outcome-Based Education (OBE) in higher education. The results suggest that AI technologies can help teachers plan lessons effectively around an outcome-based curriculum, provide timely personalized feedback, and support lesson plans and their implementation, ultimately improving instructional effectiveness. With these tools, instructors can create teaching content, student learning activities, and a range of formative and summative assessment tools and resources. This facilitates conducive learning environments and makes learning more interactive and engaging in OBE systems. However, the effective implementation of artificial intelligence in Outcome-Based Education requires that teachers possess a strong foundation in technological pedagogical content knowledge (TPACK). The study emphasizes that while artificial intelligence can support teaching approaches, its efficacy is significantly enhanced when teachers integrate technology with teaching approaches and content knowledge.
This study emphasizes the importance of integrating AI technologies within a strong TPACK (Technological Pedagogical Content Knowledge) framework for educators. Teachers who effectively blend pedagogical strategies with content expertise and AI tools are better positioned to deliver personalized, inclusive, and data-informed instruction. This enriches the teaching process and fosters adaptive learning environments that accommodate diverse student needs.
For policymakers, the findings provide a clear roadmap for improving the quality and effectiveness of instruction in higher education. The evidence suggests that institutional policies should promote the professional development of teachers in AI literacy and TPACK integration. For example, universities can implement training programs and certification schemes focusing on the pedagogically sound use of AI tools like ChatGPT. Furthermore, curriculum policies should encourage the integration of AI-supported instructional planning into OBE frameworks to ensure alignment between teaching strategies and learning outcomes. Policymakers can also leverage these findings to: (a) develop institutional guidelines and ethical frameworks for AI use in instructional planning; (b) allocate funding and resources for AI training infrastructure, platforms, and support services; and (c) incentivize innovation and research in AI-enabled pedagogies that enhance learning quality and performance monitoring.
This study opens new avenues for researchers to investigate how AI adoption, when guided by the TPACK model, influences instructional quality and student achievement over time. Future studies can explore discipline-specific applications, long-term impacts on academic success, and the scalability of AI-enhanced OBE systems.
This study provides a policy-relevant, educator-focused, and empirically grounded contribution to the discourse on AI in education. It highlights the need for systemic support that enables AI integration to elevate instructional quality and improve student learning outcomes in OBE-based higher education systems.

5.2. Theoretical Implications

This study contributes to the literature by integrating the TPACK framework with AI-driven instructional planning in Outcome-Based Education (OBE). The research empirically demonstrates how AI capabilities and teacher competencies enhance pedagogical effectiveness, ensuring that instructional planning aligns with OBE learning outcomes. By verifying the mediating role of instructional planning, the study advances knowledge on AI-supported pedagogy, emphasizing the necessity of technological and pedagogical expertise for effective AI adoption in education. Additionally, this study offers methodological rigor through Structural Equation Modeling (SEM), reinforcing the validity of AI adoption constructs in instructional planning. It also highlights the importance of exploring other theoretical perspectives to better understand AI’s role in digital education. These insights provide a foundation for future research investigating AI-assisted teaching strategies and their theoretical bases in the evolving digital learning landscape.

6. Conclusions

This study examined the capability of AI tools such as ChatGPT to supplement instructional planning and enhance student outcomes under the Outcome-Based Education framework. It focused on how AI capabilities and teacher competencies shape instructional planning, and on how instructional planning, in turn, transmits their effects to student performance. As education increasingly employs AI technology, the research adds empirical evidence on how such technology can support teaching and learning. Considering teacher competencies, the research investigated how AI ChatGPT functions enhance instructional planning and student performance. A conceptual model was formulated based on the Technological Pedagogical Content Knowledge (TPACK) model, emphasizing the interaction between the technological, pedagogical, and content knowledge necessary to implement AI effectively in teaching. The TPACK model highlights that teachers’ ability to apply AI tools like ChatGPT depends on their ability to combine these three knowledge domains. The Outcome-Based Education (OBE) model is also applied as the foundation to determine how integrating AI and instructional planning leads to measurable student learning outcomes. In contrast to existing research on AI in general education settings, the current research concentrated on the OBE model, highlighting a new dimension of AI adoption in instructional design.
A quantitative methodology was adopted, with Structural Equation Modeling (SEM) used to validate the hypothesized model. The data were gathered from the faculty of three universities in Sindh, Pakistan, using convenience sampling. A total of 330 surveys were distributed via WhatsApp groups and email, and following data cleaning, 320 cases were retained for the final analysis. The measurement model confirmed the constructs’ reliability and validity, while the structural model established significant direct and indirect relationships between AI ChatGPT capabilities, teacher competencies, instructional planning, and student achievement. The results indicated that AI ChatGPT significantly improved instructional planning, which in turn enhanced student performance. Teacher competencies were also identified as essential in maximizing instructional planning and the proper utilization of AI tools. Instructional planning also mediated the relationships connecting AI capabilities and teacher competencies to student learning outcomes. These findings are consistent with existing research on AI in education, further affirming the need to incorporate AI-powered tools to enhance pedagogical approaches. The policy implications of this research indicate that universities should offer faculty training in AI tools to strengthen instructional planning and improve student learning outcomes. Policymakers should also consider AI adoption as an integral element of future educational plans. This research adds to the existing literature on AI in education through an empirical analysis of its role in instructional planning and student achievement. Future research can investigate the long-term effects of AI-based teaching strategies across educational institutions.
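The mediated (indirect) paths reported above follow the standard product-of-coefficients logic: the indirect effect of a predictor on student performance equals the path from the predictor to instructional planning (path a) multiplied by the path from instructional planning to performance (path b). The following minimal sketch illustrates that computation on simulated standardized data; the sample size matches the study, but the data, seed, and the true path values baked into the simulation are illustrative assumptions, not the study’s dataset or its exact SEM estimator.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 320  # matches the study's sample size (illustrative data only)

# Simulated standardized scores; true paths (0.33, 0.48, 0.20) are assumed
# for illustration, loosely echoing the magnitudes reported in the study.
ai_cap = rng.normal(size=n)                                   # AI ChatGPT capabilities
planning = 0.33 * ai_cap + rng.normal(size=n)                 # instructional planning
performance = 0.20 * ai_cap + 0.48 * planning + rng.normal(size=n)  # student performance

def ols_beta(X, y):
    """Ordinary least-squares coefficients on mean-centered data (no intercept)."""
    X = X - X.mean(axis=0)
    y = y - y.mean()
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Path a: predictor -> mediator
a = ols_beta(ai_cap[:, None], planning)[0]
# Path b: mediator -> outcome, controlling for the predictor
b = ols_beta(np.column_stack([planning, ai_cap]), performance)[0]

indirect = a * b  # the mediated (indirect) effect
print(f"a = {a:.2f}, b = {b:.2f}, indirect effect = {indirect:.2f}")
```

In a full SEM analysis the same quantity is estimated simultaneously with the measurement model, and its significance is typically assessed by bootstrapping rather than read off two separate regressions; this sketch only makes the a × b decomposition concrete.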

Limitations and Future Research

Despite the valuable contributions of this study, several limitations need to be acknowledged. First, the sample was confined to 320 respondents across three universities in Sindh, Pakistan, which may limit the generalizability of the findings to other educational settings. Future research should use a larger and more diversified sample from various regions and institutions to improve external validity. Second, the study used a cross-sectional design, collecting data at a single point in time. This restricts the capacity to draw causal inferences about the links between AI ChatGPT capabilities, instructional planning, teacher competency, and student performance. Future research should use a longitudinal design to analyze the long-term impact of AI-driven instructional planning on students’ learning outcomes.
Third, this study captured respondents’ perceptions through a survey questionnaire and did not directly measure the effects of AI-integrated instruction. Future studies should use experimental or quasi-experimental designs to measure the direct impact of AI-supported instruction on student learning. When such studies are conducted, they should explicitly define the lesson deployment, the assessment measures, and the particular metrics used to gauge learning accomplishment.
Theoretically, the research was based on the Technological Pedagogical Content Knowledge (TPACK) and Outcome-Based Education (OBE) frameworks. However, other theories and constructs that inform the adoption of AI in education have yet to be tested. Future research should examine additional theoretical perspectives to gain a more complete picture of AI ChatGPT adoption, and should also identify the most significant determinants of AI adoption in OBE.
Methodologically, this study applied Structural Equation Modeling (SEM), which tests relationships among latent constructs well but cannot capture the in-depth contextual understanding that qualitative approaches generate. Future research should adopt a mixed-methods strategy, augmenting quantitative analysis with qualitative interviews or case studies, to assess the practical pitfalls and advantages of AI adoption in education. Such work would further strengthen knowledge of AI-powered instructional planning and its contribution to student achievement.

Author Contributions

Conceptualization, W.N.A. and G.N.A.; methodology, N.A.D., M.H. and G.N.A.; validation, G.N.A.; formal analysis, N.A.D.; investigation, W.N.A.; data curation, G.N.A., M.H. and N.A.D.; visualization, M.H.; supervision, W.N.A.; funding acquisition, W.N.A.; resources, W.N.A. All authors have read and agreed to the published version of the manuscript.

Funding

This research is funded by the Deanship of Graduate Studies and Scientific Research at Jouf University through the Fast-Track Research Funding Program.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A. Data Collection Questionnaire

Items were rated on a five-point Likert scale: SD = Strongly Disagree, D = Disagree, N = Neutral, A = Agree, SA = Strongly Agree.

AI ChatGPT Capabilities
1. ChatGPT helps generate high-quality instructional content. (ACC1)
2. ChatGPT provides relevant and accurate responses to teaching needs. (ACC2)
3. ChatGPT enhances lesson planning and curriculum design. (ACC3)
4. ChatGPT assists in providing personalized student feedback. (ACC4)
5. ChatGPT improves efficiency in classroom management. (ACC5)

Teacher Competency
6. I am confident in integrating AI tools like ChatGPT in teaching. (TCO1)
7. I can effectively evaluate AI-generated content for instructional use. (TCO2)
8. I have sufficient knowledge to incorporate ChatGPT into my teaching. (TCO3)
9. I can guide students in using AI tools responsibly. (TCO4)
10. I adapt my teaching strategies based on AI-generated insights. (TCO5)

Instructional Planning
11. AI tools help me create structured and effective lesson plans. (INP1)
12. I use ChatGPT to enhance my instructional strategies. (INP2)
13. AI tools support differentiated instruction for diverse learners. (INP3)
14. AI-driven insights improve my assessment strategies. (INP4)
15. ChatGPT helps me align my teaching with curriculum goals. (INP5)
16. AI tools enhance my ability to track student progress. (INP6)

Student Performance
17. AI-driven tools improve student engagement in learning. (STP1)
18. Students show a better understanding with AI-assisted learning. (STP2)
19. ChatGPT helps students develop critical thinking skills. (STP3)
20. AI-based instructional support enhances student motivation. (STP4)
21. AI integration positively impacts students’ academic performance. (STP5)

References

  1. Syeed, M.M.M.; Shihavuddin, A.S.M.; Uddin, M.F.; Hasan, M.; Khan, R.H. Outcome Based Education (OBE): Defining the Process and Practice for Engineering Education. IEEE Access 2022, 10, 119170–119192. [Google Scholar] [CrossRef]
  2. Li, M.; Rohayati, M.I. The Relationship between Learning Outcomes and Graduate Competences: The Chain-Mediating Roles of Project-Based Learning and Assessment Strategies. Sustainability 2024, 16, 6080. [Google Scholar] [CrossRef]
  3. Harden, R.M. AMEE Guide No. 14: Outcome-Based Education: Part 1-An Introduction to Outcome-Based Education. Med. Teach. 1999, 21, 7–14. [Google Scholar] [CrossRef]
  4. Alamri, H.; Lowell, V.; Watson, W.; Watson, S.L. Using Personalized Learning as an Instructional Approach to Motivate Learners in Online Higher Education: Learner Self-Determination and Intrinsic Motivation. J. Res. Technol. Educ. 2020, 52, 322–352. [Google Scholar] [CrossRef]
  5. Harris, J.B.; Hofer, M.J. Technological Pedagogical Content Knowledge (TPACK) in Action: A Descriptive Study of Secondary Teachers’ Curriculum-Based, Technology-Related Instructional Planning. J. Res. Technol. Educ. 2011, 43, 211–229. [Google Scholar] [CrossRef]
  6. Lim, C.P.; Chai, C.S. Rethinking Classroom-Oriented Instructional Development Models to Mediate Instructional Planning in Technology-Enhanced Learning Environments. Teach. Teach. Educ. 2008, 24, 2002–2013. [Google Scholar] [CrossRef]
  7. Smith, S. An Analysis of Teachers’ Viewpoints from Public, Private, and Charter Schools on Effective Lesson Planning and Instruction. Ph.D. Thesis, Lindenwood University, St. Charles, MO, USA, 2024. [Google Scholar]
  8. Farhang, A.P.Q.; Hashemi, A.; Ghorianfar, A. Lesson Plan and Its Importance in Teaching Process. Int. J. Curr. Sci. Res. Rev. 2023, 6, 5901–5913. [Google Scholar] [CrossRef]
  9. Bhat, B.A.; Bhat, G.J. Formative and Summative Evaluation Techniques for Improvement of Learning Process. Eur. J. Bus. Soc. Sci. 2019, 7, 776–785. [Google Scholar]
  10. Rodrigues, F.; Oliveira, P. A System for Formative Assessment and Monitoring of Students’ Progress. Comput. Educ. 2014, 76, 30–41. [Google Scholar] [CrossRef]
  11. Looney, J.W. Integrating Formative and Summative Assessment: Progress Toward a Seamless System? OECD: Paris, France, 2011. [Google Scholar]
  12. Chisunum, J.I.; Nwadiokwu, C. Enhancing Student Engagement through Practical Production and Utilization of Instructional Materials in an Educational Technology Class: A Multifaceted Approach. NIU J. Educ. Res. 2024, 10, 81–89. [Google Scholar]
  13. Jayaraman, J.; Aane, J. The Impact of Digital Textbooks on Student Engagement in Higher Education: Highlighting the Significance of Interactive Learning Strategies Facilitated by Digital Media. In Implementing Interactive Learning Strategies in Higher Education; IGI Global: Hershey, PA, USA, 2024; pp. 301–328. [Google Scholar]
  14. Orlich, D.C.; Harder, R.J.; Callahan, R.C.; Trevisan, M.S.T.; Brown, A.H. Teaching Strategies: A Guide to Effective Instruction; Cengage Learning: Wadsworth, DC, USA, 2010. [Google Scholar]
  15. De Vera, J.L.; Manalo, M.; Ermeno, R.; Delos Reyes, C.; Elores, Y.D. Teachers’ Instructional Planning and Design for Learners in Difficult Circumstances. J. Pendidik. Progresif 2022, 12, 17–32. [Google Scholar] [CrossRef]
  16. Gupta, P.; Kulkarni, T.; Barot, V.; Toksha, B. Applications of ICT: Pathway to Outcome-Based Education in Engineering and Technology Curriculum. In Technology and Tools in Engineering Education; CRC Press: Boca Raton, FL, USA, 2021; pp. 109–142. [Google Scholar]
  17. Shaheen, S. Theoretical Perspectives and Current Challenges of OBE Framework. Int. J. Eng. Educ. 2019, 1, 122–129. [Google Scholar] [CrossRef]
  18. Spear-Swerling, L.; Zibulsky, J. Making Time for Literacy: Teacher Knowledge and Time Allocation in Instructional Planning. Read. Writ. 2014, 27, 1353–1378. [Google Scholar] [CrossRef]
  19. Angeli, C.; Valanides, N. Technology Mapping: An Approach for Developing Technological Pedagogical Content Knowledge. J. Educ. Comput. Res. 2013, 48, 199–221. [Google Scholar] [CrossRef]
  20. Chai, C.S.; Hwee Ling Koh, J.; Teo, Y.H. Enhancing and Modeling Teachers’ Design Beliefs and Efficacy of Technological Pedagogical Content Knowledge for 21st Century Quality Learning. J. Educ. Comput. Res. 2019, 57, 360–384. [Google Scholar] [CrossRef]
  21. Dahri, N.A.; Vighio, M.S.; Alismaiel, O.A.; Al-Rahmi, W.M. Assessing the Impact of Mobile-Based Training on Teachers’ Achievement and Usage Attitude. Int. J. Interact. Mob. Technol. 2022, 16, 107–129. [Google Scholar] [CrossRef]
  22. Dahri, N.A.; Yahaya, N.; Al-Rahmi, W.M.; Noman, H.A.; Alblehai, F.; Kamin, Y.B.; Soomro, R.B.; Shutaleva, A.; Al-Adwan, A.S. Investigating the Motivating Factors That Influence the Adoption of Blended Learning for Teachers’ Professional Development. Heliyon 2024, 10, e34900. [Google Scholar] [CrossRef]
  23. Dahri, N.A.; Yahaya, N.; Al-Rahmi, W.M.; Almogren, A.S.; Vighio, M.S. Investigating Factors Affecting Teachers’ Training through Mobile Learning: Task Technology Fit Perspective. Educ. Inf. Technol. 2024, 29, 14553–14589. [Google Scholar] [CrossRef]
  24. Pirzada, G.; Gull, F. Impact of Outcome-Based Education on Teaching Performance at Higher Education Level in Pakistan. J. Res. Humanit. Soc. Sci. 2019, 2, 95–110. [Google Scholar]
  25. Mishra, P.; Koehler, M.J. Technological Pedagogical Content Knowledge: A Framework for Teacher Knowledge. Teach. Coll. Rec. 2006, 108, 1017–1054. [Google Scholar] [CrossRef]
  26. Ren, X.; Wu, M.L. Examining Teaching Competencies and Challenges While Integrating Artificial Intelligence in Higher Education. TechTrends 2025, 69, 519–538. [Google Scholar] [CrossRef]
  27. Asim, H.M.; Vaz, A.; Ahmed, A.; Sadiq, S. A Review on Outcome Based Education and Factors That Impact Student Learning Outcomes in Tertiary Education System. Int. Educ. Stud. 2021, 14, 1. [Google Scholar] [CrossRef]
  28. Mouza, C. Promoting Urban Teachers’ Understanding of Technology, Content, and Pedagogy in the Context of Case Development. J. Res. Technol. Educ. 2011, 44, 1–29. [Google Scholar] [CrossRef]
  29. Utari, V.T.; Maryani, I.; Hasanah, E.; Suyatno, S.; Mardati, A.; Bastian, N.; Karimi, A.; Reotutor, M.A.C. Exploring the Intersection of TPACK and Professional Competence: A Study on Differentiated Instruction Development within Indonesia’s Merdeka Curriculum. Indones. J. Learn. Adv. Educ. 2025, 7, 136–153. [Google Scholar] [CrossRef]
  30. Dahri, N.A.; Yahaya, N.; Al-Rahmi, W.M.; Vighio, M.S.; Alblehai, F.; Soomro, R.B.; Shutaleva, A. Investigating AI-Based Academic Support Acceptance and Its Impact on Students’ Performance in Malaysian and Pakistani Higher Education Institutions. Educ. Inf. Technol. 2024, 29, 18695–18744. [Google Scholar] [CrossRef]
  31. Holmes, W.; Miao, F. Guidance for Generative AI in Education and Research; UNESCO Publishing: Paris, France, 2023; ISBN 9231006126. [Google Scholar]
  32. Dahri, N.A.; Yahaya, N.; Al-Rahmi, W.M. Exploring the Influence of ChatGPT on Student Academic Success and Career Readiness. Educ. Inf. Technol. 2024, 30, 8877–8921. [Google Scholar] [CrossRef]
  33. Almuhanna, M.A. Teachers’ Perspectives of Integrating AI-Powered Technologies in K-12 Education for Creating Customized Learning Materials and Resources. Educ. Inf. Technol. 2024, 30, 10343–10371. [Google Scholar] [CrossRef]
  34. Vetrivel, S.C.; Vidhyapriya, P.; Arun, V.P. The Role of AI in Transforming Assessment Practices in Education. In AI Applications and Strategies in Teacher Education; IGI Global: Hershey, PA, USA, 2025; pp. 43–70. [Google Scholar]
  35. Strielkowski, W.; Grebennikova, V.; Lisovskiy, A.; Rakhimova, G.; Vasileva, T. AI-driven Adaptive Learning for Sustainable Educational Transformation. Sustain. Dev. 2024, 33, 1921–1947. [Google Scholar] [CrossRef]
  36. Fitria, T.N. Artificial Intelligence (AI) Technology in OpenAI ChatGPT Application: A Review of ChatGPT in Writing English Essay. ELT Forum J. Engl. Lang. Teach. 2023, 12, 44–58. [Google Scholar] [CrossRef]
  37. Khlaif, Z.N.; Mousa, A.; Hattab, M.K.; Itmazi, J.; Hassan, A.A.; Sanmugam, M.; Ayyoub, A. The Potential and Concerns of Using AI in Scientific Research: ChatGPT Performance Evaluation. JMIR Med. Educ. 2023, 9, e47049. [Google Scholar] [CrossRef]
  38. Kooli, C.; Yusuf, N. Transforming Educational Assessment: Insights into the Use of ChatGPT and Large Language Models in Grading. Int. J. Hum.-Comput. Interact. 2024, 41, 3388–3399. [Google Scholar] [CrossRef]
  39. van den Berg, G.; du Plessis, E. ChatGPT and Generative AI: Possibilities for Its Contribution to Lesson Planning, Critical Thinking and Openness in Teacher Education. Educ. Sci. 2023, 13, 998. [Google Scholar] [CrossRef]
  40. Karataş, F.; Ataç, B.A. When TPACK Meets Artificial Intelligence: Analyzing TPACK and AI-TPACK Components through Structural Equation Modelling. Educ. Inf. Technol. 2024, 30, 8979–9004. [Google Scholar] [CrossRef]
  41. Lo, C.K. What Is the Impact of ChatGPT on Education? A Rapid Review of the Literature. Educ. Sci. 2023, 13, 410. [Google Scholar] [CrossRef]
  42. Ning, Y.; Zhang, C.; Xu, B.; Zhou, Y.; Wijaya, T.T. Teachers’ AI-TPACK: Exploring the Relationship between Knowledge Elements. Sustainability 2024, 16, 978. [Google Scholar] [CrossRef]
  43. Harris, J.; Mishra, P.; Koehler, M. Teachers’ Technological Pedagogical Content Knowledge and Learning Activity Types: Curriculum-Based Technology Integration Reframed. J. Res. Technol. Educ. 2009, 41, 393–416. [Google Scholar] [CrossRef]
  44. Biggs, J.; Tang, C. Constructive Alignment: An Outcomes-Based Approach to Teaching Anatomy. In Teaching Anatomy: A Practical Guide; Springer: Berlin/Heidelberg, Germany, 2014; pp. 31–38. [Google Scholar]
  45. Shulman, L.S. Those Who Understand: Knowledge Growth in Teaching. Educ. Res. 1986, 15, 4–14. [Google Scholar] [CrossRef]
  46. Pierson, M.E. Technology Integration Practice as a Function of Pedagogical Expertise. J. Res. Comput. Educ. 2001, 33, 413–430. [Google Scholar] [CrossRef]
  47. Hava, K.; Babayiğit, Ö. Exploring the Relationship between Teachers’ Competencies in AI-TPACK and Digital Proficiency. Educ. Inf. Technol. 2024, 30, 3491–3508. [Google Scholar] [CrossRef]
  48. Henri, M.; Johnson, M.D.; Nepal, B. A Review of Competency-based Learning: Tools, Assessments, and Recommendations. J. Eng. Educ. 2017, 106, 607–638. [Google Scholar] [CrossRef]
  49. Kennedy, M.; Birch, P. Reflecting on Outcome-Based Education for Human Services Programs in Higher Education: A Policing Degree Case Study. J. Criminol. Res. Policy Pract. 2020, 6, 111–122. [Google Scholar] [CrossRef]
  50. Hussain, W.; Spady, W.G.; Khan, S.Z.; Khawaja, B.A.; Naqash, T.; Conner, L. Impact Evaluations of Engineering Programs Using Abet Student Outcomes. IEEE Access 2021, 9, 46166–46190. [Google Scholar] [CrossRef]
  51. Morcke, A.M.; Dornan, T.; Eika, B. Outcome (Competency) Based Education: An Exploration of Its Origins, Theoretical Basis, and Empirical Evidence. Adv. Health Sci. Educ. 2013, 18, 851–863. [Google Scholar] [CrossRef]
  52. Tan, K.; Chong, M.C.; Subramaniam, P.; Wong, L.P. The Effectiveness of Outcome Based Education on the Competencies of Nursing Students: A Systematic Review. Nurse Educ. Today 2018, 64, 180–189. [Google Scholar] [CrossRef] [PubMed]
  53. Chen, X.; Xie, H.; Zou, D.; Hwang, G.-J. Application and Theory Gaps during the Rise of Artificial Intelligence in Education. Comput. Educ. Artif. Intell. 2020, 1, 100002. [Google Scholar] [CrossRef]
  54. Jung, D.; Suh, S. Enhancing Soft Skills through Generative AI in Sustainable Fashion Textile Design Education. Sustainability 2024, 16, 6973. [Google Scholar] [CrossRef]
  55. Naznin, K.; Al Mahmud, A.; Nguyen, M.T.; Chua, C. ChatGPT Integration in Higher Education for Personalized Learning, Academic Writing, and Coding Tasks: A Systematic Review. Computers 2025, 14, 53. [Google Scholar] [CrossRef]
  56. Wang, H.; Dang, A.; Wu, Z.; Mac, S. Generative AI in Higher Education: Seeing ChatGPT Through Universities’ Policies, Resources, and Guidelines. Comput. Educ. Artif. Intell. 2024, 7, 100326. [Google Scholar] [CrossRef]
  57. Kasneci, E.; Seßler, K.; Küchemann, S.; Bannert, M.; Dementieva, D.; Fischer, F.; Gasser, U.; Groh, G.; Günnemann, S.; Hüllermeier, E. ChatGPT for Good? On Opportunities and Challenges of Large Language Models for Education. Learn. Individ. Differ. 2023, 103, 102274. [Google Scholar] [CrossRef]
  58. Hussain, F.; Anwar, M.A. Towards Informed Policy Decisions: Assessing Student Perceptions and Intentions to Use ChatGPT for Academic Performance in Higher Education. J. Asian Public Policy 2024, 1–28. [Google Scholar] [CrossRef]
  59. Celik, I. Towards Intelligent-TPACK: An Empirical Study on Teachers’ Professional Knowledge to Ethically Integrate Artificial Intelligence (AI)-Based Tools into Education. Comput. Hum. Behav. 2023, 138, 107468. [Google Scholar] [CrossRef]
  60. Almaiah, M.A.; Alfaisal, R.; Salloum, S.A.; Al-Otaibi, S.; Shishakly, R.; Lutfi, A.; Alrawad, M.; Mulhem, A.A.; Awad, A.B.; Al-Maroof, R.S. Integrating Teachers’ TPACK Levels and Students’ Learning Motivation, Technology Innovativeness, and Optimism in an IoT Acceptance Model. Electronics 2022, 11, 3197. [Google Scholar] [CrossRef]
  61. Karaman, M.R. Are Lesson Plans Created by ChatGPT More Effective? An Experimental Study. Int. J. Technol. Educ. 2024, 7, 107–127. [Google Scholar] [CrossRef]
  62. Tungpalan, K.A.; Antalan, M.F. Teachers’ Perception and Experience on Outcomes-Based Education Implementation in Isabela State University. Int. J. Eval. Res. Educ. 2021, 10, 1213–1220. [Google Scholar] [CrossRef]
  63. Donker, A.S.; De Boer, H.; Kostons, D.; Van Ewijk, C.C.D.; van der Werf, M.P.C. Effectiveness of Learning Strategy Instruction on Academic Performance: A Meta-Analysis. Educ. Res. Rev. 2014, 11, 1–26. [Google Scholar] [CrossRef]
  64. Egara, F.O.; Mosimege, M. Exploring the Integration of Artificial Intelligence-Based ChatGPT into Mathematics Instruction: Perceptions, Challenges, and Implications for Educators. Educ. Sci. 2024, 14, 742. [Google Scholar] [CrossRef]
  65. Al-Mamary, Y.H.; Alfalah, A.A.; Shamsuddin, A.; Abubakar, A.A. Artificial Intelligence Powering Education: ChatGPT’s Impact on Students’ Academic Performance through the Lens of Technology-to-Performance Chain Theory. J. Appl. Res. High. Educ. 2024; ahead-of-print. [Google Scholar]
  66. Zamir, M.Z.; Abid, M.I.; Fazal, M.R.; Qazi, M.A.A.R.; Kamran, M. Switching to Outcome-Based Education (OBE) System, a Paradigm Shift in Engineering Education. IEEE Trans. Educ. 2022, 65, 695–702. [Google Scholar] [CrossRef]
  67. Gopal, Y. Exploring Academic Perspectives: Sentiments and Discourse on ChatGPT Adoption in Higher Education. Master’s Thesis, Universität Koblenz, Koblenz, Germany, 2024. [Google Scholar]
  68. Miah, A.S.M.; Tusher, M.M.R.; Hossain, M.M.; Hossain, M.M.; Rahim, M.A.; Hamid, M.E.; Islam, M.S.; Shin, J. ChatGPT in Research and Education: Exploring Benefits and Threats. arXiv 2024, arXiv:2411.02816. [Google Scholar]
  69. Lee, G.-G.; Zhai, X. Using ChatGPT for Science Learning: A Study on Pre-Service Teachers’ Lesson Planning. IEEE Trans. Learn. Technol. 2024, 17, 1643–1660. [Google Scholar] [CrossRef]
  70. Zou, D.; Xie, H.; Kohnke, L. Navigating the Future: Establishing a Framework for Educators’ Pedagogic Artificial Intelligence Competence. Eur. J. Educ. 2025, 60, e70117. [Google Scholar] [CrossRef]
  71. Chaudhry, I.S.; Sarwary, S.A.M.; El Refae, G.A.; Chabchoub, H. Time to Revisit Existing Student’s Performance Evaluation Approach in Higher Education Sector in a New Era of ChatGPT—A Case Study. Cogent Educ. 2023, 10, 2210461. [Google Scholar] [CrossRef]
  72. Lo, C.K.; Hew, K.F.; Jong, M.S. The Influence of ChatGPT on Student Engagement: A Systematic Review and Future Research Agenda. Comput. Educ. 2024, 219, 105100. [Google Scholar] [CrossRef]
  73. Ye, L.; Ismail, H.H.; Aziz, A.A. Innovative Strategies for TPACK Development in Pre-Service English Teacher Education in the 21st Century: A Systematic Review. Forum Linguist. Stud. 2024, 6, 274–294. [Google Scholar] [CrossRef]
  74. Rodafinos, A.; Barkoukis, V.; Tzafilkou, K.; Ourda, D.; Economides, A.A.; Perifanou, M. Exploring the Impact of Digital Competence and Technology Acceptance on Academic Performance in Physical Education and Sports Science Students. J. Inf. Technol. Educ. Res. 2024, 23, 19. [Google Scholar] [CrossRef]
  75. Wang, J.; Fan, W. The Effect of ChatGPT on Students’ Learning Performance, Learning Perception, and Higher-Order Thinking: Insights from a Meta-Analysis. Humanit. Soc. Sci. Commun. 2025, 12, 621. [Google Scholar] [CrossRef]
  76. ElSayary, A. An Investigation of Teachers’ Perceptions of Using ChatGPT as a Supporting Tool for Teaching and Learning in the Digital Era. J. Comput. Assist. Learn. 2024, 40, 931–945. [Google Scholar] [CrossRef]
  77. Caratiquit, K.D.; Caratiquit, L.J.C. ChatGPT as an Academic Support Tool on the Academic Performance among Students: The Mediating Role of Learning Motivation. J. Soc. Humanit. Educ. 2023, 4, 21–33. [Google Scholar] [CrossRef]
  78. Pantić, N.; Wubbels, T. Teacher Competencies as a Basis for Teacher Education–Views of Serbian Teachers and Teacher Educators. Teach. Teach. Educ. 2010, 26, 694–703. [Google Scholar] [CrossRef]
  79. Podungge, R.; Rahayu, M.; Setiawan, M.; Sudiro, A. Teacher Competence and Student Academic Achievement. In Proceedings of the 23rd Asian Forum of Business Education (AFBE 2019), Bali, Indonesia, 12–13 December 2019; Atlantis Press: Paris, France, 2020; pp. 69–74. [Google Scholar]
  80. Rapanta, C.; Botturi, L.; Goodyear, P.; Guàrdia, L.; Koole, M. Online University Teaching during and after the Covid-19 Crisis: Refocusing Teacher Presence and Learning Activity. Postdigital Sci. Educ. 2020, 2, 923–945. [Google Scholar] [CrossRef]
  81. Molotsi, A.; van Wyk, M. Exploring Teachers’use of Technological Pedagogical Knowledge in Teaching Subjects in Rural Areas. J. Inf. Technol. Educ. Res. 2024, 23, 30. [Google Scholar]
  82. Macayan, J. V Implementing Outcome-Based Education (OBE) Framework: Implications for Assessment of Students’ Performance. Educ. Meas. Eval. Rev. 2017, 8, 1–10. [Google Scholar]
  83. Fuchs, L.S.; Fuchs, D.; Stecker, P.M. Effects of Curriculum-Based Measurement on Teachers’ Instructional Planning. J. Learn. Disabil. 1989, 22, 51–59. [Google Scholar] [CrossRef] [PubMed]
  84. Bennett, S.; Dawson, P.; Bearman, M.; Molloy, E.; Boud, D. How Technology Shapes Assessment Design: Findings from a Study of University Teachers. Br. J. Educ. Technol. 2017, 48, 672–682. [Google Scholar] [CrossRef]
  85. Hora, M.T.; Holden, J. Exploring the Role of Instructional Technology in Course Planning and Classroom Teaching: Implications for Pedagogical Reform. J. Comput. High. Educ. 2013, 25, 68–92. [Google Scholar] [CrossRef]
  86. Tseng, J.-J.; Chai, C.S.; Tan, L.; Park, M. A Critical Review of Research on Technological Pedagogical and Content Knowledge (TPACK) in Language Teaching. Comput. Assist. Lang. Learn. 2022, 35, 948–971. [Google Scholar] [CrossRef]
  87. Kabakci Yurdakul, I.; Çoklar, A.N. Modeling Preservice Teachers’ TPACK Competencies Based on ICT Usage. J. Comput. Assist. Learn. 2014, 30, 363–376. [Google Scholar] [CrossRef]
  88. Agustini, K.; Santyasa, I.W.; Ratminingsih, N.M. Analysis of Competence on “TPACK”: 21st Century Teacher Professional Development. J. Phys. Conf. Ser. 2019, 1387, 12035. [Google Scholar] [CrossRef]
  89. Nbina, J.B. Teachers’ Competence and Students’ Academic Performance in Senior Secondary Schools Chemistry: Is There Any Relationship? Glob. J. Educ. Res. 2012, 11, 15–18. [Google Scholar]
  90. Levy-Feldman, I. The Role of Assessment in Improving Education and Promoting Educational Equity. Educ. Sci. 2025, 15, 224. [Google Scholar] [CrossRef]
  91. Tomaszewski, W.; Xiang, N.; Huang, Y.; Western, M.; McCourt, B.; McCarthy, I. The Impact of Effective Teaching Practices on Academic Achievement When Mediated by Student Engagement: Evidence from Australian High Schools. Educ. Sci. 2022, 12, 358. [Google Scholar] [CrossRef]
  92. Ekmekci, A.; Serrano, D.M. The Impact of Teacher Quality on Student Motivation, Achievement, and Persistence in Science and Mathematics. Educ. Sci. 2022, 12, 649. [Google Scholar] [CrossRef]
  93. Woodcock, S.; Gibbs, K.; Hitches, E.; Regan, C. Investigating Teachers’ Beliefs in Inclusive Education and Their Levels of Teacher Self-Efficacy: Are Teachers Constrained in Their Capacity to Implement Inclusive Teaching Practices? Educ. Sci. 2023, 13, 280. [Google Scholar] [CrossRef]
  94. Dean, C.B.; Hubbell, E.R. Classroom Instruction That Works: Research-Based Strategies for Increasing Student Achievement; ASCD: Arlington, VA, USA, 2012; ISBN 1416613625. [Google Scholar]
  95. Reeves, A.R. Where Great Teaching Begins: Planning for Student Thinking and Learning; ASCD: Arlington, VA, USA, 2011; ISBN 1416614265. [Google Scholar]
  96. Dunlosky, J.; Rawson, K.A.; Marsh, E.J.; Nathan, M.J.; Willingham, D.T. Improving Students’ Learning with Effective Learning Techniques: Promising Directions from Cognitive and Educational Psychology. Psychol. Sci. Public Interest 2013, 14, 4–58. [Google Scholar] [CrossRef]
  97. Reigeluth, C.M.; Aslan, S.; Chen, Z.; Dutta, P.; Huh, Y.; Lee, D.; Lin, C.-Y.; Lu, Y.-H.; Min, M.; Tan, V. Personalized Integrated Educational System: Technology Functions for the Learner-Centered Paradigm of Education. J. Educ. Comput. Res. 2015, 53, 459–496. [Google Scholar] [CrossRef]
  98. Wei, W. Using Summative and Formative Assessments to Evaluate EFL Teachers’ Teaching Performance. Assess. Eval. High. Educ. 2015, 40, 611–623. [Google Scholar] [CrossRef]
  99. Noushad, P.P. Aligning Learning Outcomes with Learning Process. In Designing and Implementing the Outcome-Based Education Framework; Springer: Berlin/Heidelberg, Germany, 2024; pp. 139–202. ISBN 9819604400. [Google Scholar]
  100. Macklem, G.L. Boredom in the Classroom: Addressing Student Motivation, Self-Regulation, and Engagement in Learning; Springer: Berlin/Heidelberg, Germany, 2015; Volume 1, ISBN 3319131206. [Google Scholar]
  101. Ritter, O.N. Integration of Educational Technology for the Purposes of Differentiated Instruction in Secondary STEM Education. Ph.D. Thesis, University of Tennessee, Knoxville, TN, USA, 2018. [Google Scholar]
  102. Hair, J.F., Jr.; Hult, G.T.M.; Ringle, C.M.; Sarstedt, M. A Primer on Partial Least Squares Structural Equation Modeling (PLS-SEM); Sage Publications: Thousand Oaks, CA, USA, 2021; ISBN 1544396333. [Google Scholar]
  103. Etikan, I.; Musa, S.A.; Alkassim, R.S. Comparison of Convenience Sampling and Purposive Sampling. Am. J. Theor. Appl. Stat. 2016, 5, 1. [Google Scholar] [CrossRef]
  104. Creswell, J.W.; Creswell, J.D. Research Design: Qualitative, Quantitative, and Mixed Methods Approaches; Sage Publications: Thousand Oaks, CA, USA, 2017; ISBN 1506386717. [Google Scholar]
  105. Taherdoost, H. What Is the Best Response Scale for Survey and Questionnaire Design; Review of Different Lengths of Rating Scale/Attitude Scale/Likert Scale. Int. J. Acad. Res. Manag. (IJARM) 2019, 8, 1–10. [Google Scholar]
  106. Hair, J.F.; Sarstedt, M.; Hopkins, L.; Kuppelwieser, V.G. Partial Least Squares Structural Equation Modeling (PLS-SEM): An Emerging Tool in Business Research. Eur. Bus. Rev. 2014, 26, 106–121. [Google Scholar] [CrossRef]
  107. Kline, R.B. Principles and Practice of Structural Equation Modeling; Guilford Publications: New York, NY, USA, 2015; ISBN 1462523358. [Google Scholar]
  108. Liu, Z.; Vobolevich, A.; Oparin, A. The Influence of AI ChatGPT on Improving Teachers’ Creative Thinking. Int. J. Learn. Teach. Educ. Res. 2023, 22, 124–139. [Google Scholar] [CrossRef]
  109. Dahri, N.A.; Yahaya, N.; Al-Rahmi, W.M.; Aldraiweesh, A.; Alturki, U.; Almutairy, S.; Shutaleva, A.; Soomro, R.B. Extended TAM Based Acceptance of AI-Powered ChatGPT for Supporting Metacognitive Self-Regulated Learning in Education: A Mixed-Methods Study. Heliyon 2024, 10, e29317. [Google Scholar] [CrossRef]
  110. Stronge, J.H.; Ward, T.J.; Grant, L.W. What Makes Good Teachers Good? A Cross-Case Analysis of the Connection between Teacher Effectiveness and Student Achievement. J. Teach. Educ. 2011, 62, 339–355. [Google Scholar] [CrossRef]
  111. Nunnally, J.C.; Bernstein, I.H. Psychometric Theory, 3rd ed.; McGraw-Hill: New York, NY, USA, 1994. [Google Scholar]
  112. Bryman, A. Social Research Methods; Oxford University Press: Oxford, UK, 2016; ISBN 0199689458. [Google Scholar]
  113. Hair, J.F.; Risher, J.J.; Sarstedt, M.; Ringle, C.M. When to Use and How to Report the Results of PLS-SEM. Eur. Bus. Rev. 2019, 31, 2–24. [Google Scholar] [CrossRef]
  114. Sarstedt, M.; Ringle, C.M.; Hair, J.F. Partial Least Squares Structural Equation Modeling. In Handbook of Market Research; Springer: Berlin/Heidelberg, Germany, 2021; pp. 587–632. [Google Scholar]
  115. Fornell, C.; Larcker, D.F. Structural Equation Models with Unobservable Variables and Measurement Error: Algebra and Statistics. J. Mark. Res. 1981, 18, 382–388. [Google Scholar] [CrossRef]
  116. Henseler, J.; Ringle, C.M.; Sarstedt, M. A New Criterion for Assessing Discriminant Validity in Variance-Based Structural Equation Modeling. J. Acad. Mark. Sci. 2015, 43, 115–135. [Google Scholar] [CrossRef]
  117. Chin, W.W. The Partial Least Squares Approach to Structural Equation Modeling. Mod. Methods Bus. Res. 1998, 295, 295–336. [Google Scholar]
  118. Hair, J.F., Jr.; Hult, G.T.M.; Ringle, C.M.; Sarstedt, M.; Danks, N.P.; Ray, S. Partial Least Squares Structural Equation Modeling (PLS-SEM) Using R: A Workbook; Springer Nature: Berlin/Heidelberg, Germany, 2021; ISBN 3030805190. [Google Scholar]
  119. Fornell, C.; Larcker, D.F. Evaluating Structural Equation Models with Unobservable Variables and Measurement Error. J. Mark. Res. 1981, 18, 39–50. [Google Scholar] [CrossRef]
  120. Cohen, J. Statistical Power Analysis for the Behavioral Sciences; Academic Press: Cambridge, MA, USA, 1988; ISBN 1483276481. [Google Scholar]
  121. Duong, C.D.; Nguyen, T.H.; Ngo, T.V.N.; Dao, V.T.; Do, N.D.; Pham, T.V. Exploring Higher Education Students’ Continuance Usage Intention of ChatGPT: Amalgamation of the Information System Success Model and the Stimulus-Organism-Response Paradigm. Int. J. Inf. Learn. Technol. 2024, 41, 556–584. [Google Scholar]
  122. Parker, L.; Carter, C.W.; Karakas, A.; Loper, A.J.; Sokkar, A. Artificial Intelligence in Undergraduate Assignments: An Exploration of the Effectiveness and Ethics of ChatGPT in Academic Work. In ChatGPT and Global Higher Education: Using Artificial Intelligence in Teaching and Learning; STAR Scholars Press: Baltimore, MD, USA, 2024. [Google Scholar]
  123. Hooda, M.; Rana, C.; Dahiya, O.; Rizwan, A.; Hossain, M.S. Artificial Intelligence for Assessment and Feedback to Enhance Student Success in Higher Education. Math. Probl. Eng. 2022, 2022, 5215722. [Google Scholar] [CrossRef]
  124. Mariani, M.M.; Machado, I.; Magrelli, V.; Dwivedi, Y.K. Artificial Intelligence in Innovation Research: A Systematic Review, Conceptual Framework, and Future Research Directions. Technovation 2023, 122, 102623. [Google Scholar] [CrossRef]
  125. Tenedero, E.Q.; Pacadaljen, L.M. Learning Experiences in the Emerging Outcomes-Based Education (OBE) Curriculum of Higher Education Institutions (HEI’S) on the Scope of Hammond’s Evaluation Cube. Psychol. Educ. 2021, 9, 2. [Google Scholar]
  126. Chiu, T.K.F.; Moorhouse, B.L.; Chai, C.S.; Ismailov, M. Teacher Support and Student Motivation to Learn with Artificial Intelligence (AI) Based Chatbot. Interact. Learn. Environ. 2024, 32, 3240–3256. [Google Scholar] [CrossRef]
  127. Kim, J.; Lee, H.; Cho, Y.H. Learning Design to Support Student-AI Collaboration: Perspectives of Leading Teachers for AI in Education. Educ. Inf. Technol. 2022, 27, 6069–6104. [Google Scholar] [CrossRef]
  128. Baig, M.I.; Yadegaridehkordi, E. ChatGPT in the Higher Education: A Systematic Literature Review and Research Challenges. Int. J. Educ. Res. 2024, 127, 102411. [Google Scholar] [CrossRef]
  129. Holmes, W.; Tuomi, I. State of the Art and Practice in AI in Education. Eur. J. Educ. 2022, 57, 542–570. [Google Scholar] [CrossRef]
  130. Fabiyi, S.D. What Can ChatGPT Not Do in Education? Evaluating Its Effectiveness in Assessing Educational Learning Outcomes. Innov. Educ. Teach. Int. 2025, 62, 484–498. [Google Scholar] [CrossRef]
  131. Rani, S.; Kaur, G.; Dutta, S. Educational AI Tools: A New Revolution in Outcome-Based Education. In Explainable AI for Education: Recent Trends and Challenges; Springer: Berlin/Heidelberg, Germany, 2024; pp. 43–60. [Google Scholar]
  132. Seo, K.; Yoo, M.; Dodson, S.; Jin, S.-H. Augmented Teachers: K–12 Teachers’ Needs for Artificial Intelligence’s Complementary Role in Personalized Learning. J. Res. Technol. Educ. 2024, 1–18. [Google Scholar] [CrossRef]
  133. Al-kfairy, M. Factors Impacting the Adoption and Acceptance of ChatGPT in Educational Settings: A Narrative Review of Empirical Studies. Appl. Syst. Innov. 2024, 7, 110. [Google Scholar] [CrossRef]
  134. Katsamakas, E.; Pavlov, O.V.; Saklad, R. Artificial Intelligence and the Transformation of Higher Education Institutions: A Systems Approach. Sustainability 2024, 16, 6118. [Google Scholar] [CrossRef]
Figure 1. OBE and Artificial Intelligence in higher education.
Figure 2. Hypothesis development.
Figure 3. Proposed methodology.
Figure 4. Relationship effects.
Table 1. Survey results.

| Category | Sub-Category | Frequency (n = 320) | Percentage (%) |
| --- | --- | --- | --- |
| University | QUEST | 192 | 60% |
| | SBBU | 96 | 30% |
| | SALU | 32 | 10% |
| Gender | Male | 218 | 68% |
| | Female | 102 | 32% |
| Teacher Designation | Lecturer | 160 | 50% |
| | Assistant Professor | 96 | 30% |
| | Associate Professor | 48 | 15% |
| | Professor | 16 | 5% |
| Educational Qualification | Master’s Degree | 128 | 40% |
| | MPhil/MS | 112 | 35% |
| | PhD | 80 | 25% |
| Teaching Experience | 1–5 years | 144 | 45% |
| | 6–10 years | 112 | 35% |
| | 11–15 years | 48 | 15% |
| | Above 15 years | 16 | 5% |
| Survey Questions | Do you believe OBE enhances student learning and academic success? | Yes: 280 | 88% |
| | Have AI tools like ChatGPT improved your instructional planning? | Yes: 240 | 75% |
| | Do you feel confident in using AI-driven tools for teaching? | Yes: 256 | 80% |
| | Does OBE help in aligning course objectives with student performance? | Yes: 272 | 85% |
| | Do you think AI-powered feedback mechanisms enhance student engagement? | Yes: 248 | 78% |
Table 2. Convergent validity.

| Construct | Items | Factor Loading | VIF | Cronbach’s Alpha | CR | AVE |
| --- | --- | --- | --- | --- | --- | --- |
| AI ChatGPT Capabilities | ACC01 | 0.79 | 1.760 | 0.83 | 0.88 | 0.60 |
| | ACC02 | 0.82 | 2.020 | | | |
| | ACC03 | 0.82 | 2.090 | | | |
| | ACC04 | 0.76 | 1.640 | | | |
| | ACC05 | 0.68 | 1.380 | | | |
| Instructional Planning | INP01 | 0.69 | 1.480 | 0.88 | 0.91 | 0.64 |
| | INP02 | 0.84 | 2.530 | | | |
| | INP03 | 0.84 | 2.600 | | | |
| | INP04 | 0.83 | 2.290 | | | |
| | INP05 | 0.81 | 2.350 | | | |
| | INP06 | 0.76 | 1.950 | | | |
| Students Performance | STP01 | 0.79 | 1.780 | 0.87 | 0.90 | 0.65 |
| | STP02 | 0.78 | 1.910 | | | |
| | STP03 | 0.86 | 2.480 | | | |
| | STP04 | 0.82 | 2.120 | | | |
| | STP05 | 0.77 | 1.790 | | | |
| Teacher Competency | TCO01 | 0.74 | 1.630 | 0.86 | 0.90 | 0.64 |
| | TCO02 | 0.85 | 2.250 | | | |
| | TCO03 | 0.82 | 1.990 | | | |
| | TCO04 | 0.78 | 1.740 | | | |
| | TCO05 | 0.79 | 1.940 | | | |
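As an illustrative check of Table 2 (ours, not the authors’ computation), the reported CR and AVE can be recomputed from the standardized loadings; the snippet below is a minimal sketch assuming the standard congeneric-model formulas used in PLS-SEM reporting.

```python
# Recompute AVE and composite reliability (CR) from standardized loadings:
#   AVE = mean(loading^2)
#   CR  = (sum of loadings)^2 / ((sum of loadings)^2 + sum of (1 - loading^2))

def ave_cr(loadings):
    squared = [l * l for l in loadings]
    ave = sum(squared) / len(squared)
    explained = sum(loadings) ** 2
    error = sum(1 - s for s in squared)
    cr = explained / (explained + error)
    return ave, cr

# Loadings for AI ChatGPT Capabilities (ACC01-ACC05) from Table 2
ave, cr = ave_cr([0.79, 0.82, 0.82, 0.76, 0.68])
print(round(ave, 2), round(cr, 2))  # reproduces the reported AVE = 0.60, CR = 0.88
```

Running the same check on the other three constructs reproduces their Table 2 values to two decimals.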
Table 3. Discriminant validity.

HTMT

| | ACC | INP | STP | TCO |
| --- | --- | --- | --- | --- |
| ACC | | | | |
| INP | 0.69 | | | |
| STP | 0.71 | 0.80 | | |
| TCO | 0.85 | 0.69 | 0.69 | |

Fornell–Larcker Criterion (diagonal = √AVE)

| | ACC | INP | STP | TCO |
| --- | --- | --- | --- | --- |
| ACC | 0.77 | | | |
| INP | 0.59 | 0.80 | | |
| STP | 0.60 | 0.71 | 0.81 | |
| TCO | 0.71 | 0.60 | 0.60 | 0.80 |
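The Fornell–Larcker criterion holds when each construct’s √AVE exceeds its correlations with every other construct. As a mechanical check of Table 3 (an illustrative sketch of ours, using the AVE values from Table 2 and the off-diagonal correlations reported above):

```python
import math

# AVE per construct (Table 2) and inter-construct correlations (Table 3).
ave = {"ACC": 0.60, "INP": 0.64, "STP": 0.65, "TCO": 0.64}
correlations = {
    ("INP", "ACC"): 0.59,
    ("STP", "ACC"): 0.60, ("STP", "INP"): 0.71,
    ("TCO", "ACC"): 0.71, ("TCO", "INP"): 0.60, ("TCO", "STP"): 0.60,
}

def fornell_larcker_ok(ave, correlations):
    # Every sqrt(AVE) must exceed every correlation involving that construct.
    for (a, b), r in correlations.items():
        if r >= math.sqrt(ave[a]) or r >= math.sqrt(ave[b]):
            return False
    return True

print(fornell_larcker_ok(ave, correlations))  # True: discriminant validity holds
```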
Table 4. Direct and indirect effects of each relationship.

Direct Effects

| Path | Original Sample (β) | T-Statistic | p-Value | Decision |
| --- | --- | --- | --- | --- |
| ACC → INP | 0.33 | 4.81 | 0.000 | Accepted |
| ACC → STP | 0.20 | 3.01 | 0.000 | Accepted |
| INP → STP | 0.50 | 8.51 | 0.000 | Accepted |
| TCO → INP | 0.37 | 5.29 | 0.000 | Accepted |
| TCO → STP | 0.16 | 2.33 | 0.020 | Accepted |

Indirect Effects

| Path | Original Sample (β) | T-Statistic | p-Value | Decision |
| --- | --- | --- | --- | --- |
| ACC → INP → STP | 0.160 | 4.400 | 0.000 | Accepted |
| TCO → INP → STP | 0.180 | 4.250 | 0.000 | Accepted |
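In PLS-SEM, a specific indirect effect is the product of the direct path coefficients along the route. Multiplying the direct effects in Table 4 reproduces the reported mediation estimates to within bootstrap rounding (this arithmetic check is ours, not the authors’):

```python
# Indirect effect = product of the direct path coefficients along the route.
acc_inp, tco_inp, inp_stp = 0.33, 0.37, 0.50  # direct effects from Table 4

indirect_acc = acc_inp * inp_stp  # ACC -> INP -> STP
indirect_tco = tco_inp * inp_stp  # TCO -> INP -> STP

# Both land within 0.005 of the reported bootstrap estimates (0.160, 0.180)
print(round(indirect_acc, 3), round(indirect_tco, 3))
```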

Share and Cite

MDPI and ACS Style

Alwakid, W.N.; Dahri, N.A.; Humayun, M.; Alwakid, G.N. Exploring the Role of AI and Teacher Competencies on Instructional Planning and Student Performance in an Outcome-Based Education System. Systems 2025, 13, 517. https://doi.org/10.3390/systems13070517
