Article

Faculty Perceptions and Adoption of AI in Higher Education: Insights from Two Lebanese Universities

1 Department of English and Translation, Faculty of Humanities, Notre Dame University-Louaize, Zouk Mikael P.O. Box 72, Lebanon
2 Neurodevelopmental Disorders Research Group (NDRG), Department of Biology, Faculty of Arts and Sciences, Holy Spirit University of Kaslik, Jounieh P.O. Box 446, Lebanon
3 School of Engineering, Holy Spirit University of Kaslik, Jounieh P.O. Box 446, Lebanon
* Authors to whom correspondence should be addressed.
These authors contributed equally to this work.
Educ. Sci. 2026, 16(1), 55; https://doi.org/10.3390/educsci16010055
Submission received: 30 October 2025 / Revised: 19 December 2025 / Accepted: 24 December 2025 / Published: 31 December 2025

Abstract

Artificial intelligence (AI) is increasingly transforming higher education, evolving from simple personalization tools into a wide range of applications that support teaching, learning, and assessment. This study examines how university instructors in Lebanon perceive and adopt AI in their academic practices, drawing on evidence from two private institutions: Notre Dame University–Louaize (NDU) and the Holy Spirit University of Kaslik (USEK). The study also proposes practical directions for effective institutional implementation. Using a cross-sectional design and convenience sampling, data were collected from 133 faculty members. Although 73.7% of participants reported moderate to high familiarity with AI, their actual classroom use of such tools remained limited. Adoption was primarily centered on chatbots (69.2%) and translation tools (54.9%), while more advanced technologies, such as adaptive learning systems and AI-based tutoring platforms, were seldom utilized (under 7%). Additionally, participants identified efficiency (69.2%), increased student engagement (44.4%), and personalized learning opportunities (42.9%) as the main benefits of AI integration. In contrast, they reported insufficient training (46.6%), restricted access to resources (45.9%), and concerns about the accuracy of AI-generated outputs (29.3%) as major barriers. Moreover, statistical analysis indicated a strong positive relationship between familiarity with AI and frequency of adoption, with no significant differences across gender, age, or academic qualifications. Overall, the results suggest that faculty members in Lebanese higher education currently view AI primarily as a helpful tool for improving efficiency rather than as a transformative pedagogical innovation. To advance integration, higher education institutions should prioritize targeted professional development, ensure equitable access to AI tools, and establish transparent ethical and governance frameworks.

1. Introduction

Artificial intelligence (AI) has rapidly evolved from a specialized technological innovation into a transformative force within higher education. Its impact now reaches beyond curriculum planning and evaluation, influencing how universities envision and structure the teaching and learning process (Linderoth et al., 2025). Early debates emphasized personalization and differentiated instruction, highlighting AI’s capacity to tailor content and pacing to individual learners (Febrianti et al., 2025; Damyanov, 2024). While these contributions remain valuable, they no longer capture the breadth of AI’s roles or the complexity of its implications for contemporary higher education.
Faculty perceptions and practices play a pivotal role in shaping the trajectory of AI integration. In higher education, instructors act as both gatekeepers and catalysts of technological change; their adoption behaviors can accelerate or constrain institutional innovation (Billy & Anush, 2023). Understanding how faculty engage with AI is therefore critical to bridging the gap between strategic aspirations and everyday teaching practice, particularly in contexts where institutional readiness is uneven. Faculty attitudes and practices often determine whether AI remains a peripheral support tool or becomes meaningfully embedded in pedagogy (Mah & Groß, 2024).
Although AI is expanding quickly across the global higher education landscape, research and documented practices remain scarce in many underrepresented regions. In the Middle East, adoption of AI tools has been shaped by economic pressures, uneven infrastructure, and policy uncertainty (Al-Zahrani & Alasmari, 2025). Lebanon offers a particularly revealing case: universities continue to pursue innovation amid political and financial instability, with faculty adapting under constrained conditions (Akar, 2022). Examining how instructors perceive, adopt, and evaluate AI in this context provides insights that extend beyond Lebanon, with implications for other resource-constrained educational systems.
Despite growing interest in faculty use of AI, much of the existing literature remains descriptive, focusing on awareness or general attitudes rather than examining how familiarity, institutional conditions, and perceived barriers jointly shape adoption in practice. This study addresses that gap by empirically examining the association between faculty familiarity with AI and the frequency of instructional use, while also assessing the role of perceived benefits, barriers, and demographic characteristics. By doing so, the study contributes to a more nuanced understanding of the widely observed readiness–practice gap in higher education, where positive attitudes toward AI do not consistently translate into sustained pedagogical integration.
Against this backdrop, the present study examines faculty engagement with AI in two private Lebanese universities, Notre Dame University–Louaize (NDU) and the Holy Spirit University of Kaslik (USEK). It focuses on faculty familiarity with AI, patterns of classroom adoption, perceived benefits and barriers, and potential demographic differences in usage. The study also explores how familiarity relates to adoption frequency and considers faculty attitudes toward future AI integration. The study is guided by the following research questions:
(1) How familiar are faculty instructors at the two participating Lebanese private universities with AI tools used for teaching and assessment?
(2) Which AI tools and instructional domains are most commonly used by these instructors?
(3) What benefits and barriers do faculty instructors perceive regarding the pedagogical use of AI for teaching and assessment in higher education?
(4) What is the association between self-reported AI familiarity and the frequency of AI use in teaching and assessment among faculty instructors?
(5) Do AI adoption patterns differ according to faculty instructors’ gender, age group, or academic qualification?
(6) What are faculty instructors’ attitudes toward the future adoption of AI in Lebanese higher education?
By generating empirical evidence on familiarity, usage patterns, perceived pedagogical benefits and barriers, and demographic variation within a resource-constrained institutional context, this study offers a context-sensitive and theoretically informed contribution to the global literature on AI adoption in higher education. The findings provide actionable insights for institutions seeking to support targeted professional development, equitable access to AI tools, and the development of ethical governance frameworks, which are essential for moving from sporadic experimentation toward sustainable and pedagogically meaningful AI integration.

2. Literature Review

AI use in higher education has expanded beyond its early focus on personalization to include a wide range of practical applications in teaching and assessment. Instead of replacing instructors, most current uses of AI support everyday academic tasks, such as preparing course materials, providing feedback, and supporting multilingual learners, while large-scale changes to teaching methods and curriculum design are still limited and inconsistent (Nur Fitria, 2021).

2.1. AI in Higher Education: Global Perspectives

AI is increasingly used to support practical teaching and learning activities. Common applications include chatbots that respond to student questions, adaptive platforms that personalize learning experiences, automated feedback tools that provide real-time responses to student work, and systems that assist instructors with lesson planning and course management (Naseer, 2023; Kovalchuk et al., 2025). Evidence suggests that AI adoption often begins with low-stakes, high-utility uses, such as lesson planning, translation, and draft generation, before gradually expanding into more advanced instructional and administrative functions (Ramos et al., 2024; Darıcan, 2025). International guidance emphasizes that sustainable integration depends on strengthening ethics, governance, and capacity building, particularly in relation to assessment and data use (Kovalchuk et al., 2025; Barde et al., 2024; Bhavana et al., 2025). Strengthening these areas is widely recognized as essential to ensuring responsible and equitable AI use in educational settings.

2.2. Faculty Perceptions, Familiarity, and Adoption Patterns

Empirical studies consistently show that instructor familiarity and self-efficacy predict both the likelihood and frequency of AI use and more favorable attitudes toward its classroom value (Salhab, 2025; Granström & Oppi, 2025; Yang, 2025). Early adoption tends to focus on efficiency gains, content adaptation, drafting of materials, and workflow support. This is evident from the widespread use of AI in administrative tasks, student services, and personalized learning systems (Dubey & Crevar, 2025; Debo & Saaida, 2024). These applications are generally perceived positively due to their ability to streamline processes, provide personalized feedback, and support academic tasks (Tsiani et al., 2025; Manigandan & Kanimozhi, 2025). However, the integration of AI into higher-stakes instructional tasks, such as core teaching and research, faces slower uptake (Kazimova et al., 2025). This hesitation is largely due to concerns about validity, transparency, and integrity (R. Rajput, 2025). Large-scale surveys portray a blend of optimism and caution among faculty, with enthusiasm tempered by questions about accuracy, equity, and pedagogical alignment.

2.3. Barriers and Enablers

The integration of AI in education faces several persistent barriers, which can be grouped into four main areas: insufficient professional development opportunities, limited or inequitable access to tools, unclear institutional policies, and data-privacy concerns (Ahmed et al., 2025; Gabay & Funa, 2025). Additional challenges include resource constraints and uncertainty around ethical use and accountability frameworks (Guan et al., 2022). Conversely, reported benefits and enabling factors often center on time savings, workflow efficiency, perceived personalization, and improved student engagement (Sholeh, 2025). When faculty have access to adequate training, resources, and governance structures, they are more likely to adopt AI tools for both administrative and pedagogical purposes.

2.4. Pedagogical Implications

Beyond tool uptake, emerging research indicates that AI is incrementally reshaping key aspects of pedagogy, particularly in how instructors plan, deliver, and adapt instruction. Instructors increasingly employ AI to differentiate learning experiences, rapidly adapt resources for diverse cohorts, and provide automated formative feedback that can reduce preparation time while supporting multilingual or mixed-ability classrooms (D. Rajput, 2025; Babu et al., 2025; Khulekani et al., 2025). However, systematic integration into summative assessment and course redesign remains limited, suggesting that, at present, AI often functions as a supportive layer that augments existing practice rather than as a driver of deep pedagogical change (Owan et al., 2023). Parallel trends on the student side, notably the marked growth in everyday generative-AI use, are increasing pressure on institutions and instructors to update assessment design, integrity policies, and classroom guidance. These developments highlight the growing pedagogical relevance of AI while underscoring the gap between early-stage tool adoption and broader instructional transformation.

2.5. Regional Perspectives and the Lebanese Context

In the Middle East and North Africa (MENA) region, AI adoption in higher education has been shaped by economic constraints, infrastructure variability, and evolving policy frameworks. Regional studies report growing awareness of AI tools among faculty and students, alongside uneven patterns of pedagogical adoption (Al-Zahrani & Alasmari, 2025). Evidence suggests that AI use in higher education across the region often concentrates on low-risk, general-purpose applications, such as content generation, translation, and administrative support, while integration into core teaching, assessment, and curriculum design remains limited (Shakib Kotamjani et al., 2025; Em et al., 2025).
Research also indicates substantial variation in faculty familiarity and usage across institutions and disciplines, with professional development opportunities, access to tools, and institutional guidance emerging as key enablers of adoption. At the same time, instructors in several MENA contexts report concerns related to data privacy, academic integrity, and the absence of clear governance frameworks, which continue to constrain broader pedagogical integration (Farooqi, 2026; Fang & Tse, 2023).
Lebanon exemplifies these regional dynamics in a particularly acute form. Universities operate under prolonged political and financial instability, limiting access to licensed technologies, structured training, and centralized governance mechanisms. As a result, AI adoption in Lebanese higher education is often fragmented and instructor-driven, relying heavily on individual familiarity rather than coordinated institutional strategy. These conditions underscore the importance of context-sensitive research that centers faculty perceptions and practices and links adoption behaviors to institutional supports and constraints.

2.6. Theoretical Perspectives on Technology Adoption in Education

Research on educational technology adoption has long emphasized the role of both individual and structural factors in shaping instructors’ use of new tools. One influential framework is Ertmer’s distinction between first-order and second-order barriers (Ertmer, 1999; Ertmer & Ottenbreit-Leftwich, 2010). First-order barriers refer to external constraints such as limited access to resources, insufficient training, time constraints, and inadequate institutional support. Second-order barriers relate to internal factors, including instructors’ beliefs, attitudes, confidence, and perceptions of pedagogical value. This framework has been widely applied in studies of digital innovation in education to explain why access to technology alone does not guarantee meaningful adoption. Even when first-order barriers are addressed, second-order barriers may continue to limit pedagogical integration. In higher education contexts, particularly those characterized by resource constraints or policy uncertainty, the interaction between structural conditions and faculty perceptions plays a central role in shaping adoption trajectories. Framing AI adoption through this lens allows for a more nuanced interpretation of how familiarity, perceived benefits, and institutional supports jointly influence instructors’ engagement with AI tools.

3. Materials and Methods

3.1. Study Design, Setting, and Period

We employed a cross-sectional survey to capture faculty perceptions of AI in higher education. The study was designed to examine instructors’ attitudes, perceived benefits and concerns, and self-reported practices related to AI adoption, rather than to measure objective system-level usage. Data were collected from January to October 2024 at two private Lebanese universities, Notre Dame University–Louaize (NDU) and the Holy Spirit University of Kaslik (USEK). Both institutions encompass diverse faculties and teaching levels, offering a robust context for examining AI adoption in a resource-constrained yet academically resilient environment.
For context, NDU and USEK are private universities within a higher education system that includes both private and public institutions. Like many private universities in Lebanon, they offer multi-faculty academic programs and operate under governance and resourcing conditions that differ from those of the public sector. Public universities in Lebanon typically serve larger student populations and face different funding and administrative constraints, which may shape patterns of technology adoption in distinct ways. Focusing on these two private institutions therefore provides insight into faculty perceptions of AI adoption within this segment of the Lebanese higher education system, while recognizing that adoption dynamics may differ in public or differently resourced institutional contexts.

3.2. Sampling Strategy and Participants

A convenience sampling approach was used to efficiently reach a broad range of faculty within the study timeframe. Invitations containing a survey link were distributed via institutional mailing lists and professional academic networks. Participation was voluntary; informed consent was obtained before accessing the questionnaire. Only complete responses were retained, and incomplete or inconsistent entries were excluded. Of the initial responses, 22 were removed for missing data, yielding a final sample of 133 participants.

3.3. Eligibility Criteria

  • Inclusion: University instructors currently engaged in undergraduate or graduate teaching in Lebanon; representation from diverse academic disciplines (humanities, sciences, business, education, and applied fields); provision of informed consent.
  • Exclusion: Individuals not employed in higher education teaching roles; respondents who did not consent; incomplete or invalid survey submissions.

3.4. Instrument Development and Validation

The survey instrument was developed following an extensive review of the literature on AI in higher education (Harris, 2024; Salhab, 2025) and comprised four sections:
(1) Socio-demographic data (age, gender, academic qualification, years of teaching, discipline, and course levels taught).
(2) Familiarity with and usage of AI tools, including chatbots, translation systems, automated editing, image/text generation, and personalized learning platforms. Familiarity was operationalized as instructors’ self-rated knowledge of and comfort with using AI tools. Adoption was operationalized as self-reported frequency of AI use across instructional domains, including lesson preparation, differentiation, and assessment.
(3) Perceived effectiveness of AI, measured through instructors’ ratings of AI usefulness in supporting lesson preparation, differentiation, and assessment tasks.
(4) Perceived benefits and barriers to AI adoption, operationalized through items capturing instructors’ perceptions of advantages (e.g., time savings, efficiency, personalization) and challenges (e.g., lack of training, limited access, unclear institutional policies).
The instrument underwent pilot testing with 20 instructors. Feedback led to adjustments for clarity, terminology, and item sequencing. Reliability testing produced a Cronbach’s alpha of 0.78, indicating acceptable internal consistency. Content validity was confirmed through review by three subject-matter experts in educational technology and pedagogy.
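For reference, the internal-consistency estimate reported above corresponds to the standard Cronbach’s alpha coefficient for a k-item scale; the formula is reproduced here only for clarity and is not derived from the study’s item-level data.

```latex
% Standard Cronbach's alpha for a k-item scale:
%   k              = number of items
%   \sigma^2_{Y_i} = variance of item i
%   \sigma^2_{X}   = variance of the total score
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)
```

By convention, values of approximately 0.70 or higher are read as acceptable internal consistency, consistent with the 0.78 reported for this instrument.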

3.5. Ethical Considerations

Institutional Review Board (IRB) approval was obtained from NDU. Participants were informed of study objectives, their right to withdraw at any stage, and assurances of anonymity and confidentiality. All data were stored securely in password-protected files accessible only to the research team.

3.6. Data Analysis

Analyses were conducted using IBM SPSS Statistics v20.0 (IBM Corp., Armonk, NY, USA). Descriptive statistics (frequencies, percentages) summarized demographic and response distributions. Familiarity with AI and frequency of AI use were treated as ordinal variables, as both were measured using ordered response categories reflecting increasing levels of knowledge and engagement (e.g., from lower to higher familiarity, and from never to frequent use). Chi-square tests were employed to assess associations between categorical demographic variables and frequency of AI use. To examine the relationship between familiarity and adoption, Goodman and Kruskal’s Gamma and Kendall’s τ-b were used, as these non-parametric statistics are appropriate for ordinal data and allow assessment of both the strength and direction of association while accounting for tied ranks. Effect sizes were interpreted following conventional guidelines, with higher absolute values indicating stronger associations (e.g., values above 0.50 reflecting strong relationships). Statistical significance was set at p < 0.05.
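As an illustrative sketch only, the ordinal association statistics described above can be reproduced outside SPSS from a familiarity-by-use contingency table. The counts below are synthetic placeholders rather than the study’s data, and the computation mirrors, rather than replicates, the reported analysis.

```python
# Illustrative sketch: chi-square, Goodman and Kruskal's Gamma, and Kendall's tau-b
# for two ordinal variables summarized in a contingency table.
# The counts are synthetic placeholders, not the study's data.
import numpy as np
from scipy.stats import chi2_contingency, kendalltau

# Rows: familiarity (not familiar ... extremely familiar)
# Columns: frequency of AI use (never ... always)
table = np.array([
    [5, 1, 1, 0, 0],
    [8, 6, 4, 1, 0],
    [6, 10, 14, 8, 2],
    [2, 5, 12, 15, 6],
    [0, 0, 1, 1, 2],
])

# Chi-square test of association between the two categorical variables
chi2, p, dof, _ = chi2_contingency(table)

def concordant_discordant(t):
    """Count concordant (C) and discordant (D) pairs in an r x c table."""
    C = D = 0
    r, c = t.shape
    for i in range(r):
        for j in range(c):
            C += t[i, j] * t[i + 1:, j + 1:].sum()  # pairs ordered the same way on both variables
            D += t[i, j] * t[i + 1:, :j].sum()      # pairs ordered in opposite directions
    return C, D

C, D = concordant_discordant(table)
gamma = (C - D) / (C + D)  # Goodman and Kruskal's Gamma ignores tied pairs

# Kendall's tau-b adjusts for ties; expand the table back into paired ordinal scores
rows, cols = np.indices(table.shape)
x = np.repeat(rows.ravel(), table.ravel())
y = np.repeat(cols.ravel(), table.ravel())
tau_b, tau_p = kendalltau(x, y)  # scipy's default variant is tau-b

print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.4f}")
print(f"Gamma = {gamma:.2f}, tau-b = {tau_b:.2f} (p = {tau_p:.4f})")
```

Both Gamma and tau-b range from −1 to 1, with absolute values above roughly 0.50 conventionally read as strong ordinal associations, matching the interpretation guideline stated above.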

4. Results

4.1. Participant Characteristics

A total of 133 instructors participated from NDU and USEK (Table 1). Gender distribution was nearly balanced (51.9% female, 48.1% male). Most were mid-career: 42.9% were aged 35–44 and 32.3% aged 45–54, while only 9.8% were younger than 35. Academically, the majority (64.6%) held a doctorate and 30.1% a master’s degree. More than one-third (36.1%) also reported a diploma in pedagogical techniques. Teaching experience was high, with over half (53.4%) having more than 16 years. Disciplines were diverse, led by Languages (26.3%), Business Administration (12.8%), and Education (11.3%) (Figure 1). Most taught undergraduate courses (88%), while nearly half (46.6%) also taught graduate classes.

4.2. Familiarity and Use of AI Tools

Nearly three-quarters (73.7%) of instructors reported moderate to high familiarity with AI tools, while 5.3% reported no familiarity at all. Despite this, only about half (51.1%) reported using AI for differentiated instruction at least occasionally; the remainder used it rarely or never.
Adoption clustered around a few general-purpose tools. Chatbots and virtual assistants (69.2%) and translation tools (54.9%) dominated, while automated writing/editing (30.8%), image generation (24.1%), and speech-to-text utilities (22.6%) were secondary. Advanced platforms such as personalized learning (6.8%) and AI tutoring systems (3.8%) were scarcely used (Table 2).

4.3. Instructional Applications and Effectiveness

AI was used most frequently for lesson preparation, with 64% using it at least monthly (Figure 2). For addressing diverse learning styles, 53.4% reported monthly or more frequent use. By contrast, only 36% used AI regularly for assessment purposes. Perceptions of effectiveness were positive: three-quarters of instructors reported improvements in lesson quality, and 70% found AI at least somewhat effective for differentiation. Assessment use was more limited, with 36.8% rating AI ‘not applicable’ due to lack of adoption.

4.4. Benefits and Barriers

The most frequently endorsed benefit was time-saving for educators (69.2%), followed by improved engagement (44.4%) and personalization (42.9%). Targeted practice and support for diverse learning styles (each 35.3%) were also valued. Advanced benefits such as adaptive assessment (15.8%) and real-time analytics (16.5%) were less frequently recognized.
Barriers were led by insufficient training (46.6%) and limited access to tools (45.9%). Reliability concerns (29.3%), cost (27.8%), and time to implement (26.3%) were also noted (Figure 3).

4.5. Attitudes Toward Future Adoption

Faculty expressed strong optimism: 75.9% were optimistic or very optimistic about AI’s future impact, while only 6.1% were pessimistic. A large majority (85%) would recommend wider adoption to colleagues (Figure 4).

4.6. Familiarity and Frequency of Use

Familiarity was strongly associated with adoption frequency (Table 3 and Table 4). Among those ‘not familiar,’ 86% never or rarely used AI. By contrast, 75% of the ‘extremely familiar’ group reported frequent use. Statistical testing confirmed a highly significant association (χ2 = 63.91, df = 16, p < 0.001; Gamma = 0.71; Kendall’s τ-b = 0.53).

4.7. Demographic Comparisons

No statistically significant differences in adoption were found by gender (χ2 = 8.65, p = 0.071), age (χ2 = 7.60, p = 0.816), or qualification (χ2 = 18.30, p = 0.568). This suggests that barriers are primarily structural (training, access, cost) rather than linked to individual demographic factors.

5. Discussion

The present study offers a focused portrait of instructors’ engagement with artificial intelligence across two Lebanese universities. Three findings are prominent. First, familiarity with AI tools is strongly associated with adoption: instructors who report higher familiarity are substantially more likely to use AI frequently for instruction and differentiation. Second, use remains concentrated in general-purpose tools, particularly chatbots and translation, whereas pedagogy-specific systems such as intelligent tutoring and adaptive or personalized platforms see minimal uptake. Third, despite limited use in assessment, attitudes are broadly optimistic and faculty indicate willingness to recommend wider adoption, while citing structural barriers, chiefly training and access, as the primary constraints. These patterns are consistent with recent syntheses that describe a readiness–practice gap in higher education, where positive sentiment and perceived efficiency gains precede deep course-embedded uses, and early adoption clusters around assistants before specialized analytics or adaptive systems are mainstreamed (Bond et al., 2024; Crompton & Burke, 2023; Shata & Hartley, 2025).
When situated within the broader MENA literature, the present findings show strong convergence with regional adoption patterns, particularly the concentration of AI use in low-barrier, general-purpose applications and the central role of training, access, and institutional support in shaping adoption (Al-Zahrani & Alasmari, 2025). At the same time, the pronounced structural constraints reported by Lebanese faculty suggest a more fragile institutional environment than that documented in some neighboring contexts, reinforcing the importance of context-sensitive interpretations of AI integration.
These findings also converge with and extend established international evidence on AI adoption in higher education. International reviews consistently show that early-stage AI integration is dominated by general-purpose tools that support efficiency and content preparation, while pedagogy-specific systems and assessment-oriented applications remain marginal (Bond et al., 2024; Crompton & Burke, 2023). The Lebanese case aligns with this global trajectory, reinforcing the interpretation of AI as an assistive rather than transformative pedagogical technology in its early adoption phase. However, our findings also suggest a sharper manifestation of the readiness–practice gap than that typically reported in better-resourced higher education systems. In contexts where institutional training, access to licensed tools, and governance frameworks are limited, moderate to high faculty familiarity alone appears insufficient to drive sustained pedagogical integration. This contrast highlights the importance of structural capacity as a moderating factor in international AI adoption patterns.
Beyond the substantive findings, this study underscores the value of perception-based evidence in examining emerging educational technologies. Faculty perceptions play a central role in shaping whether and how AI tools are adopted, pedagogically legitimized, and sustained within higher education institutions. During early or transitional phases of technology integration, when adoption is uneven and system-level usage data may be fragmented or unavailable, instructors’ beliefs, familiarity, and perceived usefulness often precede observable behavioral change. As such, perception-focused research provides critical insight into institutional readiness, perceived barriers, and the contextual conditions that influence adoption trajectories. This perspective is particularly salient in resource-constrained higher education systems, where structural limitations may shape practice long before widespread or measurable system-level implementation occurs.
International guidance and sector surveys similarly emphasize that professional learning, institutional policy, and curated tool portfolios are preconditions for sustained uptake, particularly for higher-stakes tasks such as assessment (Busuttil, 2025; Al Rashdi & Elgeddawy, 2025; Farouqa et al., 2025).
The task-level distribution observed here (strongest traction in lesson preparation and differentiation, with more hesitant use for assessment) can be interpreted through established technology adoption frameworks. From the perspective of Ertmer’s first- and second-order barriers, low-barrier, high-utility tasks such as lesson planning and resource adaptation are more readily adopted because they require limited institutional change and align closely with instructors’ existing pedagogical practices. This pattern is consistent with prior analyses of early-stage AI and LLM adoption, which show that low-risk, efficiency-oriented workflows tend to dominate initial use (Liwanag et al., 2025).
Importantly, these patterns also illustrate how institutional structures actively shape faculty adoption behaviors rather than merely constraining them. The availability of licensed AI tools, the presence or absence of formal guidance, and the degree of institutional endorsement influence whether instructors perceive AI use as legitimate, low-risk, and worth sustained investment of time and effort. In contexts where governance frameworks, assessment policies, and professional support are limited, instructors may rationally restrict AI use to peripheral or preparatory tasks that do not require institutional approval or carry pedagogical or ethical risk. Conversely, clearer policies, access pathways, and support structures can normalize experimentation, reduce perceived risk, and encourage a shift from individual, ad hoc use toward more consistent and pedagogically embedded adoption.
In contrast, assessment-related applications implicate higher-stakes pedagogical, ethical, and governance concerns, intensifying both first-order barriers (e.g., policy clarity, institutional support) and second-order barriers (e.g., beliefs about validity, bias, and academic integrity). As a result, assessment-oriented uses are often perceived as “not applicable” in the absence of clear institutional guardrails, despite generally positive attitudes toward AI more broadly.
At the same time, the present study builds on earlier work in two important ways. First, while previous research has shown that faculty who feel confident using AI and believe it is useful are more likely to adopt it, our results go a step further by measuring how strongly familiarity is linked to actual use. We found that instructors who report higher levels of familiarity with AI also report more frequent use (Ramos Salazar & Peeples, 2025). This association suggests that increasing faculty familiarity, such as through targeted professional development, may be an important leverage point for supporting broader AI adoption in practice.
Second, unlike some earlier studies that highlight demographic differences in technology adoption (Qiu et al., 2024), our findings show no significant differences in AI use by gender, age, or academic qualification. Interpreted through Ertmer’s framework, this pattern suggests that structural and institutional conditions (first-order barriers), such as access to tools, training opportunities, and governance frameworks, may exert a stronger influence on AI adoption than individual demographic characteristics. While second-order barriers related to beliefs and familiarity remain important, the absence of demographic effects underscores the central role of institutional context in enabling or constraining meaningful AI integration. This interpretation aligns with recent sector analyses emphasizing the role of institutional levers, such as clear policies, access to vetted tools, and dedicated time and support for course redesign, as the most powerful drivers of sustainable AI integration in higher education (OECD, 2023; Robert, 2024). Strengthening these structural supports may therefore accelerate adoption more effectively than targeting demographic subgroups.
The recommendations outlined in the following section are explicitly grounded in the barriers identified in this study. Insufficient training informs the emphasis on professional development anchored in concrete teaching workflows, while limited access to licensed AI tools motivates the recommendation for curated, institutionally approved tool portfolios. Similarly, the low uptake of AI in assessment, despite generally positive attitudes toward its potential, supports the proposal for small-scale, policy-supported assessment pilots designed to reduce perceived risk and clarify acceptable use. By directly linking observed barriers to corresponding recommendations, the study translates diagnostic insights into institutionally actionable strategies rather than broad or abstract calls for innovation.
Taken together, these findings have implications that extend beyond interpretation to inform policy, practice, and institutional strategy. At the policy level, the concentration of AI use in low-stakes tasks and the persistent hesitation around assessment underscore the need for clear governance frameworks addressing validity, transparency, and academic integrity. At the level of teaching practice, the strong association between familiarity and adoption highlights the importance of professional learning anchored in concrete instructional workflows. Strategically, the findings suggest that institutions should approach AI integration as a staged capacity-building process, prioritizing access, training, and tool curation before expecting deeper pedagogical transformation. The following section translates these implications into actionable recommendations aligned with the empirical findings.
Conceptually, this study contributes to the AI-in-education literature by reframing early AI adoption not simply as a function of faculty attitudes or familiarity, but as an interaction between task-level pedagogical risk and institutional capacity, offering an interpretive lens that is applicable beyond the Lebanese context.

5.1. Practical Implications

Our findings indicate that insufficient training and limited access to AI tools were the most commonly cited barriers to adoption, while faculty familiarity emerged as a strong predictor of adoption frequency. Instructors also identified time savings, improved engagement, and support for personalized learning as key perceived benefits. Based on these results, we propose the following practical implications.
Because lack of training was the most frequently reported barrier and familiarity strongly predicted adoption, we recommend a tiered professional learning sequence anchored in concrete teaching workflows, such as lesson preparation, differentiation, and formative feedback. Short-cycle evaluation (4–8 weeks) is particularly appropriate in the Lebanese context, as it allows institutions to build familiarity and demonstrate value without requiring extensive upfront investment.
Because limited access to AI tools was the second most common barrier, institutions should prioritize the publication of a curated, privacy-assured list of approved AI tools, accompanied by clear usage guidance and, where feasible, single sign-on or LMS integration. This approach can reduce uncertainty and promote equitable access while remaining feasible in resource-constrained settings.
Given the limited use of AI in assessment, despite generally positive perceptions where it is employed, we recommend small-scale assessment redesign pilots in selected courses. These pilots should incorporate validity, transparency, and bias guardrails and allow institutions to explore responsible assessment use incrementally.
Finally, the absence of statistically significant demographic differences suggests that adoption is shaped more by institutional conditions than by individual characteristics. Lightweight institutional supports, such as concise AI teaching policies and short evidence briefs summarizing pilot outcomes, can help sustain adoption and translate early experimentation into longer-term practice. Short-term, feasible actions in the Lebanese context include targeted professional development, curated tool access, and small-scale assessment pilots. Broader initiatives, such as micro-grant schemes or expanded policy frameworks, should be viewed as longer-term aspirations requiring additional institutional resources.

5.2. Actionable Checklist

Short-term, feasible actions:
  • Launch a 2-module professional development sprint focused on lesson planning and formative feedback, with clear competency targets and uptake indicators, reflecting the strong association observed between faculty familiarity and AI adoption.
  • Publish a curated, privacy-assured list of approved AI tools with quick-start guides and, where feasible, LMS integration to address commonly reported access barriers.
  • Pilot AI-supported assessment redesign in 2–3 courses per faculty using validity, bias, and transparency checklists and shared exemplars, in response to the limited but promising use of AI in assessment identified in this study.
Longer-term aspirations:
  • Provide micro-grants or shared licenses to ensure equitable, discipline-balanced access to AI tools, particularly in resource-constrained institutional settings.
  • Establish a small standing review group to evaluate emerging tools and issue brief impact summaries to inform policy refinement and support sustainable scaling.

5.3. Limitations

As with any cross-sectional, self-report study, certain limitations should be acknowledged. The design does not allow causal inference, and the focus on two private Lebanese universities limits statistical generalizability to the broader national system. Convenience sampling may also introduce some self-selection effects. In addition, the small number of respondents in the ‘extremely familiar’ group (n = 4) warrants careful interpretation of the familiarity–use gradient. Finally, the study did not include objective usage logs or student learning outcomes; future research could incorporate qualitative methods and system-level data to further examine the pedagogical impact of AI integration.

5.4. Future Directions

This study provides a foundation for understanding faculty engagement with AI in higher education, but it also points to several important avenues for future research. First, longitudinal studies could clarify how faculty familiarity, attitudes, and usage patterns evolve over time and whether professional learning leads to sustained adoption. Incorporating objective usage data would allow more precise measurement and validation of self-reported behaviors. Second, expanding the sample to public universities and other educational sectors would improve generalizability and support cross-institutional comparisons. Comparative research across national and international contexts could further illuminate how policy environments shape AI adoption trajectories. Finally, targeted studies are needed to explore AI integration in assessment and its effects on student learning outcomes. Experimental or quasi-experimental designs examining institutional guardrails, tool vetting, and professional development could help bridge the readiness–practice gap and inform evidence-based policy.

6. Conclusions

This study provides empirical insights into how faculty in Lebanese higher education institutions engage with artificial intelligence in their instructional practices. The findings show that while familiarity with AI tools is generally moderate to high, actual use remains concentrated on low-barrier, general-purpose applications such as chatbots and translation platforms. Based on self-reported survey data, pedagogy-specific tools, including intelligent tutoring and adaptive learning systems, have yet to gain meaningful traction, as participants indicated substantially lower usage of these tools than of general-purpose applications. Familiarity emerged as a strong correlate of adoption frequency, highlighting the strategic role of professional learning as a potential mechanism for supporting effective AI integration. The absence of statistically significant demographic differences in this sample suggests that institutional structures may play a more prominent role than individual characteristics in shaping AI adoption, although larger and more diverse samples would be needed to detect subtler demographic effects. Access to vetted tools, professional development opportunities, and clear governance frameworks appear to be key levers for scaling responsible AI use in higher education. These results emphasize the importance of institutional readiness and policy clarity in bridging the readiness–practice gap. Looking ahead, future research should use longitudinal and experimental designs to examine how professional learning, governance mechanisms, and institutional support shape long-term adoption and pedagogical outcomes. Broadening the scope to include public universities and international comparative contexts will enhance external validity and build a stronger evidence base for policy and capacity-building initiatives in higher education globally.

Author Contributions

W.H., N.N., P.D., M.E.H., and T.B. conceptualized and designed this project. M.R. performed data analysis and interpretation of results. N.N. and M.R. wrote this paper. W.H., N.N., P.D., M.E.H., T.B. and M.R. reviewed and edited this paper. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Institutional Review Board of the Holy Spirit University of Kaslik, Lebanon (HCR/EC 2024-053, 09 October 2024).

Informed Consent Statement

All participants involved in this study provided informed consent prior to participation.

Data Availability Statement

The original contributions presented in this study are included in this article and its Appendix A and Appendix B. Additional information is available from the corresponding author upon reasonable request.

Acknowledgments

We express our gratitude to all individuals who participated in this research.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

Results

Table A1. Frequency of Use and Perceived Effectiveness of AI Tools in Teaching and Assessment.
Variables | Frequency (n) | Percentage (%)
Use of AI to prepare creative, engaging lessons
Every week | 28 | 21.1
2–3 times a month | 36 | 27.1
Once a month | 21 | 15.8
Rarely (less than once a month) | 32 | 24
Never | 16 | 12
Perceived improvement in lesson quality due to AI
Significantly improved | 28 | 21.1
Moderately improved | 38 | 28.6
Slightly improved | 35 | 26.3
No noticeable improvement | 13 | 9.8
Decreased quality | 0 | 0
Not applicable | 19 | 14.2
Use of AI to address diverse learning styles
Every week | 20 | 15
2–3 times a month | 30 | 22.6
Once a month | 21 | 15.8
Rarely (less than once a month) | 41 | 30.8
Never | 21 | 15.8
Effectiveness of AI for diverse learning styles
Very effective | 19 | 14.3
Effective | 30 | 22.6
Somewhat effective | 44 | 33.1
Not effective | 3 | 2.3
Not applicable | 37 | 27.7
Use of AI to assess student learning outcomes
Every week | 9 | 6.8
2–3 times a month | 13 | 9.8
Once a month | 26 | 19.5
Rarely (less than once a month) | 41 | 30.8
Never | 44 | 33.1
Effectiveness of AI in assessing learning outcomes
Very effective | 10 | 7.5
Effective | 27 | 20.3
Somewhat effective | 37 | 27.8
Not effective | 10 | 7.6
Not applicable | 49 | 36.8
AI helps create personalized learning experiences
To a great extent | 26 | 19.5
To some extent | 57 | 42.9
Neutral | 18 | 13.5
To a small extent | 14 | 10.5
Not at all | 1 | 0.8
Not applicable | 17 | 12.8
AI identifies areas for improvement (based on feedback/assessment)
Very effective | 17 | 12.8
Effective | 53 | 39.7
Neutral | 30 | 22.6
Ineffective | 3 | 2.3
Very ineffective | 2 | 1.5
Not applicable | 28 | 21.1
Table A2. Reported Benefits and Obstacles of Using AI for Differentiated Instruction.
Variables | Frequency (n) | Percentage (%)
AI-enabled benefits for differentiated instruction
Personalized learning experiences | 57 | 42.9
Improved student engagement | 59 | 44.4
Efficient data analysis for student needs | 28 | 21.1
Time-saving for educators | 92 | 69.2
Enhanced student outcomes | 39 | 29.3
Providing targeted practice exercises | 47 | 35.3
Automating grading to allow more individual support | 18 | 13.5
Generating real-time progress insights | 22 | 16.5
Tailoring feedback based on student performance | 21 | 15.8
Supporting diverse learning styles | 47 | 35.3
Enabling adaptive assessments | 21 | 15.8
Obstacles to AI-driven differentiation
Lack of adequate training or knowledge | 62 | 46.6
Limited access to AI tools | 61 | 45.9
Technical issues or software limitations | 28 | 21.1
Insufficient time to implement AI tools effectively | 35 | 26.3
High cost of AI tools | 37 | 27.8
Concerns about data privacy and security | 29 | 21.8
Lack of support from the institution | 24 | 18
Resistance from colleagues teaching the same course | 15 | 11.3
Resistance from students | 7 | 5.3
Difficulty integrating AI with existing teaching methods | 29 | 21.8
Unreliable or inaccurate AI-generated results | 39 | 29.3
Table A3. Instructor Attitudes Toward Future AI Adoption in Education.
Variables | Frequency (n) | Percentage (%)
Optimism about AI’s future impact on education
Very optimistic | 41 | 30.8
Optimistic | 60 | 45.1
Neutral | 24 | 18
Pessimistic | 6 | 4.5
Very pessimistic | 2 | 1.6
Willingness to recommend greater AI use for differentiation
Yes | 113 | 85
No | 20 | 15
Table A4. Demographic Differences in AI Adoption.
Demographic Variable | Categories | χ2 | df | p-Value
Gender | Female, Male | 8.649 | 4 | 0.071
Age group | 25–34, 35–44, 45–54, 55 and above | 7.596 | 12 | 0.816
Educational qualification | Associate degree (minor, diploma, etc.), Bachelor’s degree, Master’s degree, Architect, Engineer, Doctor of Medicine (MD), Doctorate/PhD, Other | 18.301 | 20 | 0.568

Appendix B

Faculty Perceptions and Adoption of AI in Lebanese Higher Education: Evidence from Two Lebanese Universities
Introduction and Consent Form
This survey aims to gather valuable insights into how AI technologies are shaping and enhancing educational methods across various learning environments.
Your responses are crucial in helping us understand the current landscape of AI integration in education, as well as identifying potential areas for improvement and innovation.
There are no known risks associated with participating in this study.
The questionnaire consists of 24 questions and will take 7 to 10 min to complete.
Please note that your participation is voluntary, with the option to withdraw at any time, and all anonymous data will be securely handled and used solely for research purposes.
You will be fully informed about the data collected, how it will be used, and who will have access to it.
The collected information will be handled confidentially.
  • I freely agree to participate in this survey.
    Yes
    No
I. Personal information
1. What is your gender?
Female
Male
Prefer not to say
2. What is your age group?
25–34
35–44
45–54
55 and above
3. What is your highest level of education?
Associate degree (minor, diploma, etc.)
Bachelor’s degree
Master’s degree
Architect
Engineer
Doctor of Medicine (MD)
Doctorate / PhD
Other: ______
4. Do you have a diploma related to education techniques and strategies?
Yes
No
5. How many years have you been an instructor in higher education?
1–5 years
6–10 years
11–15 years
16–20 years
21 years and above
6. Which subject(s) do you teach? (Select all that apply.)
Accounting and Finance
Agriculture
Art and Design
Biology
Business Administration
Chemistry
Computer Science
Drama and Theatre Studies
Economics
Education
Engineering
Environmental Science
Geography
Health Sciences
History
Journalism and Mass Communication
Languages (e.g., English, French, Spanish, Arabic)
Law
Literature
Linguistics
Marketing
Mathematics
Medicine
Music
Nursing
Philosophy
Physics
Political Science
Psychology
Religious Studies
Sports and Physical Education
Sociology
Theology
Other: ______
7. Which course levels are you responsible for? (Select all that apply.)
Freshman
University Foundation Courses
Undergraduate
Graduate
Postgraduate
Other: ______
II. Usage of AI-powered educational tools
8. To what extent are you familiar with AI-powered educational tools?
Not familiar at all
Slightly familiar
Moderately familiar
Very familiar
Extremely familiar
9. Have you used AI tools, such as adaptive learning platforms, AI grading systems, or personalized feedback mechanisms, to differentiate instruction?
Never
Rarely
Sometimes
Often
Always
10. Have you ever used any of the following AI tools in your teaching or assessment practices? (Select all that apply.)
Chatbots & Virtual Assistants (e.g., OpenAI’s ChatGPT, Google Assistant)
Text Analysis Tools (e.g., IBM Watson Natural Language Understanding, GPT-4)
Translation Tools (e.g., DeepL, Google Translate)
Speech-to-Text & Text-to-Speech (e.g., Google Cloud Speech-to-Text, Microsoft Azure Cognitive Services)
Image Generation (e.g., DALL·E, MidJourney)
Video Generation (e.g., Synthesia)
Music and Sound Generation (e.g., AIVA, Amper Music)
Text Generation (e.g., Jasper, Sudowrite)
Automated Writing & Editing (e.g., Grammarly, Copy.ai)
Video Editing with AI (e.g., Adobe Premiere Pro with Sensei AI, Descript)
Medical Imaging (e.g., Arterys, Meta AI Vision, Aidoc)
Virtual Health Assistance (e.g., Babylon Health, Woebot)
Personalized Learning Platforms (e.g., Squirrel AI, Carnegie Learning)
AI Tutoring Systems (e.g., Quartern, Century Tech)
Other: ______
III. The Impact of AI on Differentiated Instruction
Differentiation involves varying teaching and learning methods, including content, approaches, and assessments, while keeping outcomes as the common dependent variable. This ensures that all students reach the same goals but through personalized pathways suited to their needs.
11. How often do you use AI tools to prepare creative and engaging lessons during the semester?
Every week
2–3 times a month
Once a month
Rarely (less than once a month)
Never
12. To what extent has using AI tools improved the quality of your lessons this semester?
Significantly improved
Moderately improved
Slightly improved
No noticeable improvement
Decreased quality
Not applicable
13. How often do you use AI tools to support you in addressing diverse learning styles in your classroom?
Every week
2–3 times a month
Once a month
Rarely (less than once a month)
Never
14. How effective have AI tools been in helping you address diverse learning styles in your classroom this semester?
Very effective
Effective
Somewhat effective
Not effective
Not applicable
15. How often do you use AI tools to assess student learning outcomes during a semester?
Every week
2–3 times a month
Once a month
Rarely (less than once a month)
Never
16. How effective have AI tools been in helping you assess student learning outcomes during the semester?
Very effective
Effective
Somewhat effective
Not effective
Not applicable
17. Based on your own experience, to what extent do AI tools help in creating personalized learning experiences for students?
To a great extent
To some extent
Neutral
To a small extent
Not at all
Not applicable
18. Based on students’ feedback and assessment, how effective do you find AI in identifying areas for improvement in student performance?
Very effective
Effective
Neutral
Ineffective
Very ineffective
Not applicable
IV. Pros and Cons of AI in Differentiated Education
19. How has AI helped in differentiated instruction? (Select all that apply.)
Personalized learning experiences
Improved student engagement
Efficient data analysis for student needs
Time-saving for educators
Enhanced student outcomes
Creating student-specific resources
Generating teaching materials from individual support
Suggesting diverse learning styles
Tracking feedback based on student performance
Supporting adaptive assessments
None of the above
Other: ______
20. What type of obstacles have you encountered when using AI tools for differentiation? (Select all that apply.)
Lack of adequate training or knowledge
Limited access to AI tools
Technical issues or software limitations
Insufficient time to implement AI tools effectively
High cost of AI tools
Concerns about data privacy and security
Lack of support from the institution
Resistance from colleagues teaching the same course
Resistance from students
Difficulty integrating AI with existing teaching methods
Unreliable or inconsistent AI-generated results
None of the above
Other: ______
V. Preparing for an AI-Driven Future
21. How important do you think it is to balance AI integration with teaching critical thinking and emotional intelligence, such as empathy and interpersonal communication?
Extremely important
Very important
Moderately important
Slightly important
Not important
22. How prepared do you feel your institution/department is for integrating AI into differentiated educational practices?
Very prepared
Prepared
Neutral
Unprepared
Very unprepared
VI. Future Thoughts
23. How optimistic are you about the impact of AI on the future of education?
Very optimistic
Optimistic
Neutral
Pessimistic
Very pessimistic
24. Would you recommend the increased use of AI tools for differentiation to other educators?
Yes
No

References

  1. Ahmed, J., Soomro, A., & Naqvi, S. (2025). Barriers to AI adoption in education: Insights from teacher’s perspectives. International Journal of Innovations in Science and Technology, 7, 411–421. [Google Scholar] [CrossRef]
  2. Akar, B. (2022). Surviving the crises: Lebanon’s higher education in the balance. LCPS. Available online: https://www.lcps-lebanon.org/en/articles/details/4751/surviving-the-crises-lebanon%E2%80%99s-higher-education-in-the-balance (accessed on 18 December 2025).
  3. Al Rashdi, H., & Elgeddawy, M. (2025). Empowering Sustainable Growth: Unveiling the Role of AI in Transforming Professional Development in Higher Education. In B. Awwad (Ed.), AI and IoT: Driving business success and sustainability in the digital age (Volume 2, pp. 325–333). Springer Nature. [Google Scholar] [CrossRef]
  4. Al-Zahrani, A. M., & Alasmari, T. M. (2025). A comprehensive analysis of AI adoption, implementation strategies, and challenges in higher education across the Middle East and North Africa (MENA) region. Education and Information Technologies, 30(8), 11339–11389. [Google Scholar] [CrossRef]
  5. Babu, C. V., Yuvansankar, M., & Tharuneshwaran, K. (2025). Personalized learning and student engagement: Leveraging AI for enhanced learning experiences in distance education (pp. 73–102). IGI Global Scientific Publishing. [Google Scholar] [CrossRef]
  6. Barde, A., Thakur, R., Patel, S., Sinah, N., & Barde, S. (2024, September 27–28). AI-based smart education system to enhanced the learning of students. [Conference session]. 2024 International Conference on Advances in Computing Research on Science Engineering and Technology (ACROSET), Indore, India. [Google Scholar] [CrossRef]
  7. Bhavana, S., Akula, A., Rao, V. N., & Swetha, C. (2025). Advanced AI approaches in education: Education book chapter. IGI Global Scientific Publishing. Available online: https://www.igi-global.com/chapter/advanced-ai-approaches-in-education/371575 (accessed on 18 December 2025).
  8. Billy, I., & Anush, H. (2023). A study of the perception of students and instructors on the usage of artificial intelligence in education. International Journal of Higher Education Management, 9(2). [Google Scholar] [CrossRef]
  9. Bond, M., Khosravi, H., De Laat, M., Bergdahl, N., Negrea, V., Oxley, E., Pham, P., Chong, S. W., & Siemens, G. (2024). A meta systematic review of artificial intelligence in higher education: A call for increased ethics, collaboration, and rigour. International Journal of Educational Technology in Higher Education, 21(1), 4. [Google Scholar] [CrossRef]
  10. Busuttil, L. (2025). Leveraging generative AI in pre-service teacher training: Insights into assessment as learning (pp. 46–60). Springer. [Google Scholar] [CrossRef]
  11. Crompton, H., & Burke, D. (2023). Artificial intelligence in higher education: The state of the field. International Journal of Educational Technology in Higher Education, 20(1), 22. [Google Scholar] [CrossRef]
  12. Damyanov, K. (2024). Differentiation of educational content through artificial intelligence systems in inclusive education. International Journal of Education (IJE), 12, 13–19. [Google Scholar] [CrossRef]
  13. Darıcan, Ş. (2025). Artificial intelligence in education and its importance (pp. 91–116). IGI Global Scientific Publishing. [Google Scholar] [CrossRef]
  14. Debo, D., & Saaida, M. (2024). AI in higher education: Opportunities and challenges. Higher Education, 1–4. Available online: https://www.researchgate.net/publication/387931327_AI_in_Higher_Education_Opportunities_and_Challenges (accessed on 18 December 2025).
  15. Dubey, P., & Crevar, A. (2025). Integrating artificial intelligence in higher education: Enhancing pedagogy, instruction, and administration (pp. 181–208). IGI Global Scientific Publishing. [Google Scholar] [CrossRef]
  16. Em, S., Mok, S., Sam, R., Pen, D., & Serey, M. (2025). Implications and Considerations of AI-Generative Tools for Higher Education Practices: Opportunities, Challenges, and Future Directions. In Examining AI disruption in educational settings: Challenges and opportunities (pp. 297–332). IGI Global Scientific Publishing. [Google Scholar] [CrossRef]
  17. Ertmer, P. A. (1999). Addressing first- and second-order barriers to change: Strategies for technology integration. Educational Technology Research and Development, 47(4), 47–61. [Google Scholar] [CrossRef]
  18. Ertmer, P. A., & Ottenbreit-Leftwich, A. T. (2010). Teacher technology change. Journal of Research on Technology in Education, 42(3), 255–284. [Google Scholar] [CrossRef]
  19. Fang, C., & Tse, A. W. C. (2023, October 28–30). Quasi-Experiment: Postgraduate Students’ Class Engagement in Various Online Learning Contexts When Taking Privacy Issues to Incorporate with Artificial Intelligence Applications. Proceedings of the 14th International Conference on Education Technology and Computers, ICETC ’22 (pp. 356–361), Barcelona, Spain. [Google Scholar] [CrossRef]
  20. Farooqi, S. A. (2026). Ethical Framework for AI in Education: Navigating Challenges and Building Solutions. In AI in education, governance, and leadership: Adoption, impact, and ethics (pp. 153–188). IGI Global Scientific Publishing. [Google Scholar] [CrossRef]
  21. Farouqa, G., Hysaj, A., Hatamleh, Z., Owais, A., Ibrahim, O., Hiasat, L., & Khan, S. (2025). Undergraduate students’ journey with AI in the United Arab Emirates (pp. 155–171). Springer. [Google Scholar] [CrossRef]
  22. Febrianti, C., Tari, K., Ariska, Y., & Lestari, R. (2025). The use of artificial intelligence in personalizing learning experiences at schools. PPSDP International Journal of Education, 4, 1–16. [Google Scholar] [CrossRef]
  23. Gabay, R. A., & Funa, A. (2025). Policy guidelines and recommendations on AI use in teaching and learning: A meta-synthesis study. Social Sciences & Humanities Open, 11, 101221. [Google Scholar] [CrossRef]
  24. Granström, M., & Oppi, P. (2025). Assessing teachers’ readiness and perceived usefulness of AI in education: An Estonian perspective. Frontiers in Education, 10. [Google Scholar] [CrossRef]
  25. Guan, H., Dong, L., & Zhao, A. (2022). Ethical risk factors and mechanisms in artificial intelligence decision making. Behavioral Sciences, 12(9), 343. [Google Scholar] [CrossRef] [PubMed]
  26. Harris, P. T. S. (2024). Faculty perspectives toward artificial intelligence in higher education [Doctoral research paper, Middle Georgia State University]. [Google Scholar]
  27. Kazimova, D., Tazhigulova, G., Shraimanova, G., Zatyneyko, A., & Sharzadin, A. (2025). Transforming University Education with AI: A Systematic Review of Technologies, Applications, and Implications. International Journal of Engineering Pedagogy, 15(1), 4–24. [Google Scholar] [CrossRef]
  28. Khulekani, M., Abdultaofeek, A., Emmanuel, A., Olutoyin, O., & Surendra, T. (2025). The transformative influence of generative AI on teaching and learning. Available online: https://dspace.summituniversity.edu.ng/items/3d740759-792c-4ea7-b9c4-8afeece3fcc7/full (accessed on 18 December 2025).
  29. Kovalchuk, V., Reva, S., Volch, I., Shcherbyna, S., Mykhailyshyn, H., & Lychova, T. (2025). Artificial intelligence as an effective tool for personalized learning in modern education. Environment. Technology. Resources. Proceedings of the International Scientific and Practical Conference, 3, 187–194. [Google Scholar] [CrossRef]
  30. Linderoth, C., Mani, M., Schönborn, K., Hultén, M., & Stenliden, L. (2025). Defining ‘the Force’ of artificial intelligence in education: Exploring the future of teaching through informed speculation. Learning, Media and Technology, 1–15. [Google Scholar] [CrossRef]
  31. Liwanag, G. L., Ebardo, R., & Cheng, D. (2025). Low-code and no-code development in the era of artificial intelligence: A systematic review. Data and Metadata, 4, 1218. [Google Scholar] [CrossRef]
  32. Mah, D.-K., & Groß, N. (2024). Artificial intelligence in higher education: Exploring faculty use, self-efficacy, distinct profiles, and professional development needs. International Journal of Educational Technology in Higher Education, 21(1), 58. [Google Scholar] [CrossRef]
  33. Manigandan, L., & Kanimozhi, V. (2025). Adoption of Artificial Intelligence in Education Using UTAUT3 Theory: Experimental Study. In Driving quality education through AI and data science (pp. 193–216). IGI Global Scientific Publishing. [Google Scholar] [CrossRef]
  34. Naseer, S. (2023). Perspective chapter: Advantages and disadvantages of online learning courses (pp. 1–11). IntechOpen. [Google Scholar] [CrossRef]
  35. Nur Fitria, T. (2021). Artificial intelligence (AI) in education: Using AI tools for teaching and learning process. Prosiding Seminar Nasional & Call for Paper STIE AAS, Surakarta, Jawa Tengah. [Google Scholar]
  36. OECD. (2023). Emerging governance of generative AI in education. In OECD digital education outlook 2023. OECD. Available online: https://www.oecd.org/en/publications/oecd-digital-education-outlook-2023_c74f03de-en/full-report/emerging-governance-of-generative-ai-in-education_3cbd6269.html (accessed on 18 December 2025).
  37. Owan, V., Abang, K., Idika, D., & Bassey, B. (2023). Exploring the potential of artificial intelligence tools in educational measurement and assessment. Eurasia Journal of Mathematics, Science and Technology Education, 19, em2307. [Google Scholar] [CrossRef]
  38. Qiu, Y., Huang, H., Gai, J., & De Leo, G. (2024). The effects of the COVID-19 pandemic on age-based disparities in digital health technology use: Secondary analysis of the 2017–2022 health information national trends survey. Journal of Medical Internet Research, 26, e65541. [Google Scholar] [CrossRef]
  39. Rajput, D. (2025). Use of artificial intelligence to solve problems in the classroom (pp. 137–168). IGI Global Scientific Publishing. [Google Scholar] [CrossRef]
  40. Rajput, R. (2025). Overcoming Barriers to AI Implementation in the Classroom: A Roadmap for Educational Transformation. In Navigating barriers to AI implementation in the classroom (pp. 401–436). IGI Global Scientific Publishing. [Google Scholar] [CrossRef]
  41. Ramos, R., De Angel, R., Ruetas, A., Enrile, J., Calimbo, A., & Vargas, P. (2024, September 28–30). Impact assessment of ChatGPT and AI technologies integration in student learning: An analysis for academic policy formulation. [Conference session]. 2024 6th International Workshop on Artificial Intelligence and Education (WAIE), Tokyo, Japan. [Google Scholar] [CrossRef]
  42. Ramos Salazar, L., & Peeples, S. (2025). ChatGPT adoption in higher education: A study of faculty generation cohort, self-efficacy, and innovativeness. In Technology, Knowledge and Learning: Learning Mathematics, Science and the Arts in the Context of Digital Technologies. [Google Scholar] [CrossRef]
  43. Robert, J. (2024). 2024 EDUCAUSE AI landscape study. EDUCAUSE Library. Available online: https://library.educause.edu/resources/2024/2/2024-educause-ai-landscape-study (accessed on 18 December 2025).
  44. Salhab, R. (2025). Adoption of Artificial Intelligence Applications in Higher Education: An Investigation of Faculty Perceptions. In M. Sanmugam, Z. N. Khlaif, W. A. J. Wan Yahaya, & Z. Abdullah (Eds.), A practical guide to artificial intelligence in higher education: Innovation and applications (pp. 39–47). Springer Nature. [Google Scholar] [CrossRef]
  45. Shakib Kotamjani, S., Shirinova, S., Muratova, K., & Sharma, M. (2025, December 11–12). Exploring Students’ Perspectives on Generative AI for Academic Purposes in Uzbekistan’s Higher Education. Proceedings of the 8th International Conference on Future Networks & Distributed Systems, ICFNDS ’24 (pp. 986–994), New York, NY, USA. [Google Scholar] [CrossRef]
  46. Shata, A., & Hartley, K. (2025). Artificial Intelligence and communication technologies in academia: Faculty perceptions and the adoption of generative AI. International Journal of Educational Technology in Higher Education, 22(1), 14. [Google Scholar] [CrossRef]
  47. Sholeh, M. (2025). Educational transformation through artificial intelligence: Implementation of AI Tools in the teaching and learning process (pp. 25–40). IGI Global Scientific Publishing. [Google Scholar] [CrossRef]
  48. Tsiani, M., Lefkos, I., & Fachantidis, N. (2025). Perceptions of Generative AI in Education: Insights from Undergraduate and Master’s-Level Future Teachers. Journal of Pedagogical Research, 9, 89–108. [Google Scholar] [CrossRef]
  49. Yang, H. (2025). Multidimensional Analysis of Art Education Teachers’ Attitudes and Self-Efficacy toward Artificial Intelligence: Exploring Relationships and Strategies for Enhancement. Interactive Learning Environments, 33(4), 2824–2847. [Google Scholar] [CrossRef]
Figure 1. Distribution of Disciplines Taught.
Figure 2. Frequency of AI Use across Instructional Domains.
Figure 3. Perceived Benefits and Barriers of AI Adoption.
Figure 4. Faculty Optimism and Willingness to Recommend AI Adoption.
Table 1. Socio-demographic characteristics of the participants.
Variables | Frequency (n) | Percentage (%)
Gender
Male | 64 | 48.1
Female | 69 | 51.9
Age groups
25–34 | 13 | 9.8
35–44 | 57 | 42.9
45–54 | 43 | 32.3
55 and above | 20 | 15.0
Highest level of education
Associate degree (minor, diploma, etc.) | 0 | 0.0
Bachelor’s degree | 2 | 1.5
Master’s degree | 40 | 30.1
Architect | 1 | 0.8
Engineer | 2 | 1.5
Doctor of Medicine (MD) | 0 | 0.0
Doctorate/PhD | 86 | 64.6
Other | 2 | 1.5
Diploma in education techniques/strategies
Yes | 48 | 36.1
No | 85 | 63.9
Years as higher-education instructor
1–5 years | 14 | 10.5
6–10 years | 25 | 18.8
11–15 years | 23 | 17.3
16–20 years | 32 | 24.1
21 years and above | 39 | 29.3
Table 2. Instructor Familiarity and Use of AI Tools.
Variables | Frequency (n) | Percentage (%)
Familiarity with AI-powered tools
Not familiar at all | 7 | 5.3
Slightly familiar | 28 | 21.1
Moderately familiar | 61 | 45.9
Very familiar | 33 | 24.8
Extremely familiar | 4 | 3.0
Used AI tools for differentiated instruction
Never | 29 | 21.8
Rarely | 36 | 27.1
Sometimes | 47 | 35.3
Often | 15 | 11.3
Always | 6 | 4.5
Specific AI tools used
Chatbots & Virtual Assistants (e.g., OpenAI’s ChatGPT [GPT-4/5], Google Assistant) | 92 | 69.2
Text Analysis Tools (e.g., IBM Watson Natural Language Understanding, GPT-4) | 20 | 15.0
Translation Tools (e.g., DeepL, Google Translate) | 73 | 54.9
Speech-to-Text & Text-to-Speech (e.g., Google Cloud Speech-to-Text, Microsoft Azure Cognitive Services) | 30 | 22.6
Image Generation (e.g., DALL·E, MidJourney) | 32 | 24.1
Video Generation (e.g., Synthesia) | 20 | 15.0
Music and Sound Generation (e.g., AIVA, Amper Music) | 8 | 6.0
Text Generation (e.g., Jasper, Sudowrite) | 21 | 15.8
Automated Writing & Editing (e.g., Grammarly, Copy.ai) | 41 | 30.8
Video Editing with AI (e.g., Adobe Premiere Pro with Sensei AI, Descript) | 6 | 4.5
Medical Imaging AI (e.g., Zebra Medical Vision, Aidoc) | 0 | 0.0
Virtual Health Assistants (e.g., Babylon Health, Woebot) | 1 | 0.8
Personalized Learning Platforms (e.g., Squirrel AI, Carnegie Learning) | 9 | 6.8
AI Tutoring Systems (e.g., Querium, Century Tech) | 5 | 3.8
Table 3. Frequency of AI-tool use by instructors’ familiarity level.
Familiarity level | Never | Rarely | Sometimes | Often | Always | Total
Not familiar at all (n = 7) | 4 (57.1%) | 2 (28.6%) | 1 (14.3%) | 0 (0.0%) | 0 (0.0%) | 7
Slightly familiar (n = 28) | 15 (53.6%) | 8 (28.6%) | 4 (14.3%) | 1 (3.6%) | 0 (0.0%) | 28
Moderately familiar (n = 61) | 10 (16.4%) | 21 (34.4%) | 25 (41.0%) | 5 (8.2%) | 0 (0.0%) | 61
Very familiar (n = 33) | 0 (0.0%) | 5 (15.2%) | 16 (48.5%) | 7 (21.2%) | 5 (15.2%) | 33
Extremely familiar (n = 4) | 0 (0.0%) | 0 (0.0%) | 1 (25.0%) | 2 (50.0%) | 1 (25.0%) | 4
Total (N = 133) | 29 | 36 | 47 | 15 | 6 | 133
Table 4. Association statistics between familiarity and frequency of AI tool use.
Test | Value | df | p
Pearson χ² | 63.91 | 16 | <0.001
Linear-by-linear χ² | 46.79 | 1 | <0.001
Gamma | 0.713 | — | <0.001
Kendall’s τ-b | 0.528 | — | <0.001
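For readers who wish to check these figures, the short Python sketch below is a hypothetical reconstruction (not the authors’ analysis script, which is not reported in code form) that recomputes the Table 4 statistics from the raw counts in Table 3 using NumPy and SciPy. It assumes the conventional integer scoring (0–4) of the ordinal familiarity and usage categories; under that assumption the output should match the printed values up to rounding.

```python
# Hypothetical reproduction sketch: association statistics for the
# familiarity-by-usage contingency table reported in Table 3.
import numpy as np
from scipy.stats import chi2_contingency, kendalltau, pearsonr

# Rows: Not at all / Slightly / Moderately / Very / Extremely familiar
# Columns: Never / Rarely / Sometimes / Often / Always (counts from Table 3)
table = np.array([
    [ 4,  2,  1, 0, 0],
    [15,  8,  4, 1, 0],
    [10, 21, 25, 5, 0],
    [ 0,  5, 16, 7, 5],
    [ 0,  0,  1, 2, 1],
])

# Pearson chi-square test of independence (df = (5 - 1) * (5 - 1) = 16)
chi2, p, dof, _ = chi2_contingency(table)
print(f"Pearson chi2 = {chi2:.2f}, df = {dof}, p = {p:.3g}")

# Expand the table into one (familiarity, usage) score pair per respondent
rows, cols = np.indices(table.shape)
familiarity = np.repeat(rows.ravel(), table.ravel())
usage = np.repeat(cols.ravel(), table.ravel())

# Linear-by-linear (Mantel-Haenszel) association: M^2 = (N - 1) * r^2, df = 1
n = table.sum()
r, _ = pearsonr(familiarity, usage)
print(f"Linear-by-linear chi2 = {(n - 1) * r**2:.2f}, df = 1")

# Kendall's tau-b (tie-corrected rank correlation between the ordinal scores)
tau_b, p_tau = kendalltau(familiarity, usage)
print(f"Kendall's tau-b = {tau_b:.3f}, p = {p_tau:.3g}")

# Goodman-Kruskal gamma from concordant and discordant pairs
def goodman_kruskal_gamma(t: np.ndarray) -> float:
    concordant = discordant = 0
    n_rows, n_cols = t.shape
    for i in range(n_rows):
        for j in range(n_cols):
            concordant += t[i, j] * t[i + 1:, j + 1:].sum()  # cells below and to the right
            discordant += t[i, j] * t[i + 1:, :j].sum()      # cells below and to the left
    return (concordant - discordant) / (concordant + discordant)

print(f"Gamma = {goodman_kruskal_gamma(table):.3f}")
```

The sketch omits the asymptotic standard error needed for gamma’s significance test; statistical packages such as SPSS report that value alongside the coefficient.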
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
