Article

Dynamic Capabilities of University Administration and Their Impact on Student Awareness of Artificial Intelligence Tools

by Fathi M. Abunaser 1,*, Mohamed Mostafa Mohamed Hamd 1, Asma Mubarak Nasser Bani-Oraba 2, Omer Hamed 1, Maen Qasem Mohamad Alshiyab 3 and Zubaida Shebani 4,*

1 Educational Foundation and Administration Department, College of Education, Sultan Qaboos University, P.O. Box 50, Muscat P.C. 123, Oman
2 Omani Studies Center, Sultan Qaboos University, P.O. Box 50, Muscat P.C. 123, Oman
3 College of Education, Amman Arab University, Amman P.O. Box 2234, Jordan
4 Psychology Department, College of Education, Sultan Qaboos University, P.O. Box 50, Muscat P.C. 123, Oman
* Authors to whom correspondence should be addressed.
Sustainability 2025, 17(15), 7092; https://doi.org/10.3390/su17157092
Submission received: 23 June 2025 / Revised: 29 July 2025 / Accepted: 31 July 2025 / Published: 5 August 2025
(This article belongs to the Special Issue Artificial Intelligence in Education and Sustainable Development)

Abstract

This study investigates the relationship between the dynamic capabilities of university administration and students’ awareness of artificial intelligence (AI) tools within a higher education context. Drawing on data from 139 students at the College of Education, Sultan Qaboos University, the research employed two validated instruments, one measuring the dynamic capabilities of university administration and another assessing students’ awareness and perception of AI tool use. Understanding this relationship is critical, as universities increasingly face pressure to guide responsible and effective AI use among students. Findings reveal significant correlations between the university administration’s dynamic capabilities, particularly technological agility, and students’ engagement with AI tools. Notably, technological dynamic capabilities within the administration significantly predicted two specific dimensions of student awareness: effectiveness of using AI tools and perceived faculty members’ efficiency in AI. These results highlight the critical role of institutional leadership in promoting equitable and sustainable integration of AI in education. The study contributes to the broader discourse on AI for sustainable development by illustrating how institutional strategies can enhance innovation, inclusion, and student readiness in support of SDG 4.

1. Introduction

In today’s higher education systems, the influence of technology and its associated innovations is evident across nearly every aspect of teaching and learning. This reflects a growing recognition of the importance of integrating digital tools into academic environments to promote inclusive and sustainable education. With the continuous advancement of technological solutions, there has been significant growth in the availability, accessibility, and use of electronic resources, internet-based platforms, and academic databases. These tools are actively applied to support, enhance, and streamline the teaching and learning process through educational and advisory programs offered by universities, contributing directly to the goals of equity, access, and innovation in higher education.
Among the most significant and rapidly evolving of these innovations is artificial intelligence (AI), which utilizes deep learning models to generate content that closely resembles human responses to complex stimuli. This capability has enabled AI to become increasingly embedded in diverse domains of life [1], including both school and higher education settings [2]. As AI technologies continue to progress, they are expected to drive further transformation within higher education systems, offering new opportunities to personalize learning and reduce disparities in access to knowledge. At the same time, they present unavoidable challenges, such as the reconfiguration of industries involving communication technologies, software development, data management, business operations, interactive systems, cybersecurity, and social media platforms [3,4,5].
In response to these developments, universities are increasingly adopting AI across various operational areas—administrative, instructional, and educational—to improve the effectiveness and efficiency of student learning [6,7,8]. AI supports the personalization of learning experiences, enhances student engagement, and strengthens learning-related decision-making. Moreover, it enables the automation of routine tasks, optimizes institutional resources, and advances scientific research by analyzing large datasets to guide researchers toward original and impactful topics. Critically, AI also enables more inclusive educational practices by adapting instructional content to diverse learning abilities and cultural contexts [9,10,11,12]. Through such applications, AI plays a pivotal role in advancing the objectives of the 2030 Agenda for Sustainable Development, particularly SDG 4, which calls for inclusive, equitable, and quality education for all.
Effectively integrating AI into higher education requires a reconsideration of the role university administrations play in enabling equitable access to, and responsible engagement with, AI. While technological dynamic capabilities are essential, they may not be sufficient in the digital age, particularly when it comes to anticipating and supporting students’ awareness, understanding, and competent use of AI tools. In the 21st century, information is pervasive, and addressing “data illiteracy” alone no longer equips students with the skills necessary to read, analyze, and manage complex information environments. There is now an increasing need to enhance “AI literacy,” which enables students to engage critically and intelligently with AI systems to identify, evaluate, and apply information [13,14,15]. As such, university administrations must not only possess technological agility but also actively deploy it to foster student development, reduce disparities in digital access, and build a foundation for sustainable, inclusive learning.
Several studies have highlighted the importance of this shift. For example, Al-Masry and Tarawneh recommend that university administrations embrace AI applications to advance institutional transformation across education, research, community engagement, and resource management [16]. Similarly, Alatel et al. highlight the importance of raising awareness among faculty and students regarding AI’s significance in educational contexts, noting its potential to help institutions achieve their goals with greater efficiency and effectiveness [17]. Al-Shammari further stresses the importance of establishing robust infrastructure, administrative frameworks, and ethical guidelines to enable the effective and regulated use of AI [18]. Otoom likewise calls for stronger efforts to promote AI integration as part of institutional modernization [19].
More recently, Delello et al. found that although educators are adapting to the use of AI tools like ChatGPT-4, they report a lack of institutional policies and call for AI-specific professional development and ethical training [20]. In a parallel finding, Nelson et al. [21] report that university students in Ecuador largely regard AI-generated work as academic dishonesty and advocate for educational policies that promote constructive and ethical uses of AI. Together, these studies suggest that the responsible integration of AI in higher education requires more than access to tools—it demands comprehensive strategies that include student awareness, clear policy frameworks, ethical guidance, and institutional support, especially in the context of sustainable and inclusive educational development.
To conceptualize how universities can meet these demands, this study draws on Dynamic Capabilities Theory (DCT), originally developed by Teece et al. [22]. DCT refers to an organization’s ability to integrate, build, and reconfigure internal and external resources to adapt to change. It emphasizes strategic agility over routine operations and offers a useful lens for examining how institutions can mobilize technological and organizational capacities to support AI literacy and equitable access. For instance, universities may reconfigure IT infrastructure to ensure secure and equitable access to AI tools (reconfiguring), adapt faculty development programs to include generative AI use cases (learning), or integrate AI ethics modules into existing curricula (integrating). These examples reflect core dynamic capabilities that can influence how effectively institutions foster AI literacy.
Despite these global discussions, a considerable knowledge gap persists regarding how various stakeholders (students, faculty, and administrators) understand and engage with AI, particularly in relation to their respective roles in shaping future-ready learning environments [23,24,25]. Unlike previous studies, which have tended to focus either on student attitudes or institutional readiness in isolation, this study aims to explore the direct relationship between university administration’s technological dynamic capabilities and students’ awareness of AI use. It also introduces a novel application of Dynamic Capabilities Theory within the context of higher education in the Arab region, where empirical evidence on AI-related institutional engagement remains limited. This gap is particularly significant as universities in the region are under growing pressure to modernize educational delivery and ensure students are prepared to navigate evolving digital landscapes. Understanding the link between institutional capabilities and student AI literacy can inform effective policy, curriculum development, and support services. By situating the research within a specific institutional context, the study addresses the need for evidence-based strategies to support responsible and effective AI integration in education.
While institutional interest in AI grows, significant disparities persist in students’ awareness and proficiency in AI use. This raises critical questions about the role of administrative capacities in equipping students to participate meaningfully in the AI-driven knowledge economy. Recent international studies show that students’ perceptions of generative AI tools such as ChatGPT are shaped by institutional and cultural contexts. In a large U.S.-based study, Baek et al. found that policies, demographics, and perceived risks influence student use of ChatGPT, especially for writing and coding [26]. Their findings highlight how generative AI may reinforce existing inequalities in higher education. In Finland, a study by Rüdian et al. showed that students often rejected AI-generated feedback when aware of its machine origin, citing a lack of emotional and social value [27]. In India, Mondal et al. found that medical students use AI tools to simplify content and streamline assignments but have concerns about hallucinations, privacy, and overreliance [28].
Complementing these studies, emerging research on AI literacy explores the knowledge, skills, and ethical understanding students need to engage with AI effectively. For example, Wu et al. developed an evaluation framework grounded in the UNESCO and KSAVE models, revealing demographic differences in AI competencies among students [29]. In a scoping review, Laupichler et al. identified recurring themes in pedagogical approaches and curricular gaps, underscoring the field’s early stage of development [30]. Together, these studies highlight the need for institutionally supported strategies that promote responsible, equitable, and pedagogically sound use of generative AI.
Despite this growing body of international research, most studies continue to focus on individual user experiences or educational practices. Far fewer have examined the organizational capabilities that enable or constrain AI adoption within universities. In particular, the relationship between university-level technological readiness and students’ engagement with AI tools remains underexplored. This study builds on that gap by examining how institutional dynamic capabilities influence students’ awareness and use of AI. The focus on a public university in Oman offers timely insight into how higher education institutions in the Gulf region are responding to AI integration amid ongoing digitization and evolving pedagogical demands.
To explore these issues, the study focuses on the College of Education at Sultan Qaboos University and addresses the following research questions:
  • What is the level of the university administration’s technological dynamic capabilities and the level of students’ awareness of using AI tools at the College of Education, Sultan Qaboos University?
  • What is the nature of the correlation between the university administration’s technological dynamic capabilities and students’ awareness of AI usage?
  • Which specific dimensions of technological dynamic capabilities can predict students’ awareness in using AI tools at Sultan Qaboos University?

2. Theoretical Framework

Amid the ongoing transformations driven by the Fourth Industrial Revolution, AI has emerged as a powerful force reshaping global sectors, including higher education. Beyond enhancing productivity and institutional agility, AI offers transformative potential to foster equitable access to quality education and prepare students for a rapidly evolving knowledge economy. It has been shown to improve productivity, skills, and capabilities among learners and workers, contributing to international economic development [31,32]. In the educational sphere, AI is no longer a peripheral innovation but a strategic asset for universities seeking to respond to change, adapt to societal demands, and advance inclusive and sustainable development [5,19]. However, the educational value of AI depends not only on access but also on how well institutions implement it through thoughtful, adaptive strategies, making Dynamic Capabilities Theory (DCT) a compelling framework for understanding institutional readiness.
Originally introduced by Teece et al., Dynamic Capabilities Theory posits that an institution’s competitive advantage lies in its ability to integrate, build, and reconfigure internal and external competencies in response to change [3,20]. These capabilities are not routine operational functions but strategic processes that enable continuous innovation, learning, and adaptation [33,34]. In the context of higher education, DCT offers a valuable lens for examining how university administrations can mobilize and reconfigure their resources, especially technological ones, to responsibly and inclusively embed AI into teaching, learning, and institutional operations. The theory is typically structured around three components:
  • Sensing Capabilities: The ability to recognize emerging educational and technological opportunities, such as identifying AI tools that support personalized learning, inclusive teaching practices, and innovative research.
  • Seizing Capabilities: The capacity to allocate and mobilize institutional resources toward effective AI adoption, including integrating AI in curriculum design, learning analytics, student feedback systems, and adaptive assessments.
  • Reconfiguring Capabilities: The ability to restructure institutional processes and systems to maximize AI’s long-term educational benefits, such as using predictive analytics to support retention, revising teaching roles, and fostering sustainable academic innovation.
These dynamic capabilities enable institutions to move beyond operational capabilities that generate short-term value toward capabilities that bring about long-term strategic change in their resource base [22,35,36]. An extension of DCT is the concept of technological dynamic capabilities, which specifically refers to an institution’s ability to adapt and align technological infrastructures, processes, and expertise with evolving educational needs [37]. Technological capabilities are defined as “the organization’s technological resources, capacities, and potentials that it seeks to invest and exploit through its workforce’s expertise, skills, and capabilities to effectively and efficiently improve the quality of its products and operations” [38]. These capabilities are further categorized into the following [39]:
  • Technological Competence: Infrastructure, systems, and digital proficiency.
  • Organizational Competence: Institutional agility in adapting to technological change.
  • Innovative Competence: The ability to integrate novel technologies and services.
  • Marketing Competence: Responsiveness to evolving user needs and expectations.
  • Knowledge Management Competence: Practices that foster value through information sharing and knowledge creation.
In the context of university administration, technological dynamic capabilities reflect the ability to meet students’ evolving digital literacy needs, respond to AI-related challenges, and lead sustainable innovation. These are expressed through two main dimensions: technological administrative communication (ensuring transparency, support, and clarity about technology use) and technological administrative transformation (driving strategic shifts toward intelligent systems). These capabilities are crucial to enabling administrators to use AI not only for institutional efficiency but also for empowering students through digital literacy and innovation. AI applications such as chatbots, intelligent tutoring systems, predictive analytics, and digital writing assistants help improve student outcomes [40]. However, these benefits hinge on students’ awareness of and critical engagement with such tools [41,42,43,44], and that awareness is shaped by factors such as institutional training, faculty support, and curricular integration [45]. Without intentional efforts to raise awareness and develop competencies, the use of AI may exacerbate existing educational inequities rather than resolve them.
As AI becomes increasingly central to academic and administrative operations [12,46], universities must ensure that their digital transformation is inclusive, ethical, and sustainability oriented. Institutions lacking a foundational understanding of AI risk being left behind [47]. Accordingly, this study adopts Dynamic Capabilities Theory to explore how the technological dynamic capabilities of university administration at Sultan Qaboos University influence students’ awareness of AI tools. Unlike technology adoption models such as the Technology Acceptance Model (TAM) or the Unified Theory of Acceptance and Use of Technology (UTAUT), which focus primarily on individual user acceptance, DCT offers a broader organizational perspective. It emphasizes how institutions can integrate, build, and reconfigure internal competencies in response to technological change, making it more suitable for analyzing institutional roles in shaping students’ exposure to and understanding of AI tools. Understanding this relationship is essential for ensuring that technological innovation supports the broader goals of education for sustainable development, digital equity, and future-ready learning, aims that align with the vision of SDG 4 for inclusive and equitable quality education for all.
The relationships between the identified capabilities, competencies, and dimensions are illustrated in Figure 1, which presents a conceptual model summarizing the theoretical framework of the study. This framework directly informs the study’s design, guiding the construction of the measurement instrument, the grouping of scale items, and the interpretation of statistical findings.

3. Materials and Methods

A descriptive correlational research design was employed to examine the extent to which students’ awareness of using artificial intelligence (AI) tools (the dependent variable) can be predicted by the technological dynamic capabilities of university administration (the independent variable). The correlational method allows relationships between variables to be investigated without manipulating them, thereby offering insights into naturally occurring associations within the university context.

3.1. Study Sample

The study sample comprised 290 undergraduate students (148 first year and 142 second year) enrolled at Sultan Qaboos University, Oman. A purposive sampling technique was employed to intentionally select students who are at the early stages of their academic journey and are engaged in a university environment that actively promotes the integration of AI in the educational process. This population was specifically targeted because of its direct exposure to institutional initiatives related to AI adoption in teaching and learning. Such a context is particularly relevant to the study’s objectives, as it may significantly influence students’ awareness of and interaction with AI tools. As highlighted by Zawacki-Richter et al., learning environments play a critical role in shaping students’ awareness and understanding of AI applications [12].

3.2. Study Participants

3.2.1. Pilot Study Participants

The research instruments were piloted with a group of 70 undergraduate students (35 from each academic year), drawn from the same study population but excluded from the main study sample. The purpose of this pilot phase was to examine the psychometric properties of the instruments, ensuring their reliability and validity before application in the main study.

3.2.2. Main Study Participants

The finalized instruments were administered to a total of 139 students (71 first year and 68 second year) enrolled in the College of Education at Sultan Qaboos University. Participants were recruited through a targeted invitation shared by the researchers, and inclusion was based on voluntary participation. These participants formed the primary sample for data collection and analysis. The research instrument was initially distributed to 290 students. However, only 139 questionnaires were retained for statistical analysis after excluding incomplete submissions and responses containing inconsistent or illogical data. A sample flow diagram illustrates the stages of sample selection (Figure 2). A power analysis was conducted using G*Power 3.1 software to assess the adequacy of the sample size. The analysis assumed a medium effect size (0.3), a significance level of α = 0.05, and a statistical power of 0.80. The results confirmed that the actual sample size (139) was sufficient to yield statistically reliable findings.
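The G*Power calculation reported above can be cross-checked with a short standard-library Python sketch using the Fisher z approximation for correlation tests. The function name is ours, and mapping the stated “medium effect size (0.3)” to a population correlation of r = 0.3 is an assumption for illustration, not a detail taken from the study.

```python
import math
from statistics import NormalDist

def n_for_correlation(r, alpha=0.05, power=0.80):
    """Approximate sample size needed to detect a population correlation
    of magnitude r in a two-tailed test, via Fisher's z transformation.
    (Illustrative helper; not the study's actual G*Power computation.)"""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-tailed critical value
    z_beta = NormalDist().inv_cdf(power)           # quantile for desired power
    effect = math.atanh(r)                         # Fisher z of the target r
    return math.ceil(((z_alpha + z_beta) / effect) ** 2 + 3)

# Medium effect (r = 0.3), alpha = 0.05, power = 0.80, as in the study.
print(n_for_correlation(0.3))  # 85 -> the retained sample of 139 exceeds this
```

Under these assumptions, roughly 85 participants suffice, consistent with the study’s conclusion that the retained sample of 139 was adequate.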

3.3. Study Instruments

Two instruments were used in the study: the University Administration’s Technological Dynamic Capabilities Scale and the Synthetic Index of Use of Artificial Intelligence Tools.

3.3.1. University Administration’s Technological Dynamic Capabilities in AI Scale

A modified version of a scale measuring organizational dynamic capabilities developed by Kump et al. was utilized in this study [48]. The scale has been widely adopted in prior research examining institutional responsiveness to technological change. The original instrument comprised 14 items distributed across three dimensions: Organizational Sensing, Organizational Communication, and Organizational Transformation. In line with the theoretical framework of this study, particularly the concept of technological dynamic capabilities, the original instrument’s 14 items were reorganized into two dimensions: Technological Administrative Communication (9 items) and Technological Administrative Transformation (5 items). This restructuring was guided by both conceptual and empirical considerations. A preliminary exploratory study indicated that students had limited awareness of items related to the original Sensing dimension. Including it might have compromised the accuracy and validity of the responses. Moreover, within the higher education context in Oman, the Sensing items overlapped conceptually with those classified under Communication, making a separate category less meaningful. The revised instrument was reviewed by a panel of experts in artificial intelligence and higher education, who confirmed the content validity and relevance of the two new dimensions. Reliability and validity indicators were established based on their evaluations.
For the purposes of the current study, the instrument was translated into Arabic and reviewed by three experts in educational foundations and administration, all of whom hold doctoral degrees. To ensure linguistic accuracy, the Arabic version was then back-translated into English. Based on expert feedback, the translated scale underwent a series of revisions. Additional content validation was carried out by three professors specializing in educational measurement, evaluation, and technology, who approved the final version for use in this research (see Supplementary Material).
  • Scale Administration and Scoring Method
The scale was administered electronically. Students were instructed to provide their demographic information and to respond to all items on the scale without omitting any responses. No specific time limit was imposed for completing the scale. Responses were scored using a four-point Likert scale: Strongly Agree, Agree, Disagree, and Strongly Disagree, with assigned values of 1, 2, 3, and 4, respectively.
  • Psychometric Properties of the Scale
1. Internal Consistency
The internal consistency of the scale was assessed by calculating the correlation coefficients between each item and its corresponding dimension, as well as between each dimension and the overall scale score. This analysis was conducted using responses from a pilot sample of 70 students not included in the main study sample. Table 1 presents the correlation coefficients for each item in relation to its designated dimension within the scale.
As shown in Table 1, the correlation coefficients between each item and the total score of the Technological Administrative Communication dimension ranged from 0.458 to 0.784, all of which are statistically significant at the 0.01 level. The correlation coefficients between each item and the total score of the Technological Administrative Transformation dimension ranged from 0.546 to 0.767, also statistically significant at the 0.01 level. In addition, Table 2 presents the correlation coefficients between each dimension and the total scale score. These high and statistically significant values provide strong evidence of the instrument’s internal consistency and construct validity.
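The item–total analysis behind Table 1 can be reproduced with a few lines of NumPy. The responses below are random stand-ins, not the study’s pilot data, and the corrected (item-excluded) total is one common convention; the study may have correlated each item with the full dimension total instead.

```python
import numpy as np

rng = np.random.default_rng(7)
# Stand-in responses: 70 pilot participants x 9 items of one dimension,
# scored 1-4 on the Likert scale (random data for illustration only).
responses = rng.integers(1, 5, size=(70, 9)).astype(float)

dimension_total = responses.sum(axis=1)
for item in range(responses.shape[1]):
    # Corrected item-total correlation: exclude the item from the total
    # so the coefficient is not inflated by the item correlating with itself.
    rest = dimension_total - responses[:, item]
    r = np.corrcoef(responses[:, item], rest)[0, 1]
    print(f"item {item + 1}: r = {r:.3f}")
```

With real pilot responses in place of the random array, the printed coefficients would correspond to the per-item values summarized in Table 1.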
2. Factor Validity
Factor validity was assessed using Exploratory Factor Analysis (EFA) to identify the factorial structure of the scale. The analysis was conducted on a sample of 100 participants and followed a two-step procedure:
- Sampling adequacy was confirmed using the Kaiser–Meyer–Olkin (KMO) measure, which yielded a value of 0.879. This exceeds the recommended threshold of 0.70, indicating that the data were suitable for factor analysis.
- EFA was then performed on the 14 items of the scale using Principal Component Analysis (PCA) with Varimax rotation. The rotated factor matrix, after suppressing factor loadings below 0.30 (in line with Guilford’s criterion), is presented in Table 3.
The analysis revealed two principal factors corresponding to the pre-established dimensions of the scale: Technological Administrative Communication and Technological Administrative Transformation. These results support the scale’s construct validity by confirming a coherent factorial structure. Items 1 through 9 loaded on the first factor, which reflects students’ perceptions of the university’s ability to communicate with them by providing various technological services in classrooms, offering a variety of technology-related activities, delivering technological information, and making such information publicly available. This factor also represents the university’s facilitation of administrative communication. It was named the Technological Administrative Communication factor, with an eigenvalue of 5.82 and an explained variance of 41.66%. Items 9 through 14 loaded on the second factor, which represents students’ perceptions of the university’s ability to adapt to emerging technological changes, manage unforeseen technological issues, and effectively access new technological knowledge. These capabilities enable students to interact with these technologies more confidently and effectively. This factor was named Technological Administrative Transformation, with an eigenvalue of 1.10 and an explained variance of 7.89%. Together, the two factors accounted for 49.49% of the total variance, providing strong evidence of the scale’s construct validity.
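As an illustration of the sampling-adequacy statistic used in this step, the KMO measure can be computed from scratch with NumPy from the correlation and partial-correlation matrices. The simulated single-factor data below are stand-ins for the actual survey responses, and the study itself presumably used a statistical package rather than this code.

```python
import numpy as np

def kmo(data):
    """Kaiser-Meyer-Olkin measure of sampling adequacy: the ratio of
    squared correlations to squared correlations plus squared partial
    correlations, over off-diagonal elements only."""
    r = np.corrcoef(data, rowvar=False)
    inv = np.linalg.inv(r)
    d = np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
    partial = -inv / d                       # partial correlation matrix
    off = ~np.eye(r.shape[0], dtype=bool)    # mask for off-diagonal cells
    r2 = (r[off] ** 2).sum()
    p2 = (partial[off] ** 2).sum()
    return r2 / (r2 + p2)

# Correlated stand-in data: one latent factor plus noise, 100 "respondents"
# by 14 "items", mirroring the scale's dimensions (illustration only).
rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 1))
items = latent + 0.8 * rng.normal(size=(100, 14))
print(f"KMO = {kmo(items):.3f}")   # values above 0.70 support factoring
```

Because the simulated items share a common factor, their partial correlations are small and the KMO comes out high, the same pattern the reported value of 0.879 reflects.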
3. Construct Validity
To further verify the construct validity of the scale assessing the technological dynamic capabilities of university administration in the context of artificial intelligence, Confirmatory Factor Analysis (CFA) was conducted. Model parameters were estimated using the Diagonally Weighted Least Squares (DWLS) method. The measurement model comprises 14 items distributed across two dimensions. Table 4 presents the goodness-of-fit indices for the CFA model.
Table 4 demonstrates that the goodness-of-fit indices fall within acceptable ranges, indicating that the measurement model provides an adequate fit to the observed data. Furthermore, Table 5 presents the standardized factor loadings and the statistical significance of each item included in the Confirmatory Factor Analysis, thereby supporting the structural validity of the scale. Figure 3 illustrates that all item loadings exceeded 0.40 and were statistically significant at the 0.01 level, thereby providing strong evidence for the construct validity of the University Administration’s Technological Dynamic Capabilities in Artificial Intelligence Scale.
4. Reliability of the Scale
To assess the reliability of the scale, Cronbach’s alpha coefficients were computed for each dimension as well as for the overall scale. This analysis was conducted on a sample of 100 participants drawn from the study population, excluding those in the main study sample. As shown in Table 6, the reliability coefficients for both the individual dimensions and the total scale were statistically significant, indicating that the instrument demonstrates an acceptable level of reliability.
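Cronbach’s alpha, the reliability coefficient reported in Table 6, can be computed directly from its definition. The sketch below uses simulated Likert-style data rather than the study’s responses, and the helper function is ours.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance
    of the summed scale). `items` is an (n_respondents, k_items) array."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Stand-in Likert responses on a 1-4 scale, driven by one latent trait
# (random data for illustration; real input would be the survey matrix).
rng = np.random.default_rng(1)
latent = rng.normal(size=(100, 1))
scores = np.clip(np.round(2.5 + latent + 0.7 * rng.normal(size=(100, 9))), 1, 4)
print(f"alpha = {cronbach_alpha(scores):.3f}")
```

Applied to the actual response matrix for each dimension, this yields the per-dimension and total-scale coefficients of the kind summarized in Table 6.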
Based on the results presented, the University Administration’s Technological Dynamic Capabilities in AI Scale, as applied within the College of Education at Sultan Qaboos University, demonstrates sound psychometric properties, thereby confirming its suitability for use in the current study.

3.3.2. Synthetic Index of Use of Artificial Intelligence Tools

The scale aims to assess students’ perceived impact of using AI tools in the teaching and learning process. Developed by Grájeda et al., the instrument comprises 30 items distributed across five dimensions [49]. The first dimension measures, through nine items, the extent to which these tools are utilized in the educational environment. The second dimension focuses specifically on the use of ChatGPT, with seven items assessing its educational applications and contribution to learning quality. The third dimension measures student efficiency using six items that capture how AI supports the development of students’ personal capabilities. The fourth dimension assesses faculty efficiency through four items evaluating the preparedness and capacity of academic staff to integrate AI technologies effectively. Finally, the fifth dimension explores advanced student skills, using four items to assess the extent to which AI integration fosters higher-order cognitive skills.
The authors of the scale established the validity of the questionnaire through Confirmatory Factor Analysis (CFA) conducted on a large sample of 4127 students from a private university in Latin America, representing the Colleges of Engineering, Business Administration, and Arts. The analysis confirmed the five-factor structure of the scale, with all model fit indices falling within acceptable ranges. These results provide robust evidence for the scale’s validity and reliability in assessing the perceived impact of AI tool usage in the learning process.
  • Scale Administration and Scoring Method
The scale was administered electronically and collectively to first- and second-year students at the College of Education, Sultan Qaboos University. Participants were asked to provide their demographic information and respond to all scale items. Responses were recorded on a four-point Likert scale (Strongly Agree, Agree, Disagree, and Strongly Disagree), scored 4, 3, 2, and 1, respectively, so that higher scores indicate stronger agreement. Table 7 presents the dimensions of the Synthetic Index of Use of Artificial Intelligence Tools along with their corresponding item numbers.
  • Psychometric Properties of the Index
  • Internal Consistency
Internal consistency was assessed as an indicator of the validity of the Synthetic Index of Use of Artificial Intelligence Tools among first- and second-year students. Correlation coefficients were calculated between each item and its corresponding dimension, as well as between each dimension and the total score of the scale. The scale was administered to a sample of 70 students from the study population, excluding those in the main sample. Table 8 presents the correlation coefficients between individual items and their respective dimensions, all of which were high and statistically significant at the 0.01 level, indicating strong internal consistency across all five dimensions of the index. Table 9 presents the correlation coefficients between each dimension and the total score of the index. These coefficients ranged from 0.638 to 0.876, all of which are statistically significant. This provides further evidence of the strong internal consistency and construct validity of the Synthetic Index of Use of Artificial Intelligence Tools.
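The item–dimension correlations described above are ordinary Pearson coefficients between each item and the total score of its dimension; a minimal Python sketch (the response matrix below is hypothetical, shown only to indicate the computation):

```python
import numpy as np

def item_dimension_correlations(items):
    """Pearson r between each item and the total score of its dimension."""
    items = np.asarray(items, dtype=float)
    total = items.sum(axis=1)
    return np.array([np.corrcoef(items[:, j], total)[0, 1]
                     for j in range(items.shape[1])])

# Hypothetical responses for one dimension (4 respondents x 2 items)
dim_items = np.array([[1, 1], [2, 2], [3, 3], [4, 4]], dtype=float)
r = item_dimension_correlations(dim_items)
```

The dimension–total correlations reported in Table 9 follow the same pattern, with dimension sums correlated against the overall index score.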
2. Reliability of the Scale
To assess the reliability of the scale, Cronbach’s alpha coefficients were calculated based on responses from a sample of 70 male and female students drawn from the study population, excluding those in the main sample. Table 10 displays the alpha values for each dimension and for the total scale score, with coefficients reaching up to 0.91. These values are statistically significant, indicating a high level of internal consistency. Additionally, Table 10 presents McDonald’s omega coefficients, all of which are similarly high and significant at the 0.01 level. These findings confirm that the Synthetic Index of Use of Artificial Intelligence Tools among College of Education students demonstrates strong psychometric properties, establishing the scale as both reliable and valid for use in the current study.

3.4. Statistical Analyses

To analyze the data collected in this study, IBM SPSS Statistics v. 23 was used, along with AMOS version 23 and JASP 0.18.3 software, employing a range of statistical techniques appropriate for the study’s objectives. First, descriptive statistics—including means, standard deviations, and t-values—were used to examine the distribution of responses and statistical significance. To evaluate the internal consistency of the measurement instruments and to examine the relationships between variables, Pearson correlation coefficients were computed. In order to assess the predictive relationships between the independent and dependent variables, multiple linear regression analysis was conducted. Furthermore, Exploratory Factor Analysis (EFA) was employed to uncover the underlying factor structure of the scales used. To confirm this structure, Confirmatory Factor Analysis (CFA) was carried out using AMOS software. Additionally, to further assess the reliability of the measurement instruments, JASP software was used to calculate McDonald’s omega coefficients, providing a robust estimate of internal consistency reliability.

4. Results

4.1. Results of Research Question 1: What Is the Level of the University Administration’s Technological Dynamic Capabilities and the Level of Students’ Awareness of Using AI Tools at the College of Education, Sultan Qaboos University?

To address this question, means and standard deviations were calculated to assess the levels of technological dynamic capabilities of the university administration, as well as students’ awareness in using AI tools. The response scale was categorized as follows: 1.00–1.74 (Strongly Disagree), 1.75–2.49 (Disagree), 2.50–3.24 (Agree), and 3.25–4.00 (Strongly Agree). As shown in Table 11, students perceive the technological dynamic capabilities of university administration to be at a moderate level, with an overall mean score of 3.00. Both sub-dimensions—Technological Administrative Communication and Technological Administrative Transformation—also reflected moderate mean scores. The relatively low standard deviations in comparison to the means suggest limited variability in student responses, thereby enhancing the precision of the findings in assessing perceived levels of dynamic capabilities.
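The score categorization above reduces to a simple lookup; a minimal illustrative sketch, using the category labels and cut-offs exactly as reported:

```python
def interpret_mean(mean_score: float) -> str:
    """Map a mean on the 4-point scale onto the study's response categories."""
    if mean_score < 1.75:
        return "Strongly Disagree"   # 1.00-1.74
    if mean_score < 2.50:
        return "Disagree"            # 1.75-2.49
    if mean_score < 3.25:
        return "Agree"               # 2.50-3.24
    return "Strongly Agree"          # 3.25-4.00
```

Under this mapping, the reported overall means of 3.00 and 3.04 both fall in the Agree band, consistent with the "moderate level" interpretation.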
Table 12 shows that students demonstrated a moderate level of awareness in using artificial intelligence tools, with an overall mean score of 3.04. All dimensions of student awareness similarly fell within the moderate range. Additionally, the relatively low standard deviations in relation to the means suggest limited variation in student responses, thereby enhancing the accuracy of the findings in assessing students’ levels of awareness.

4.2. Results of Research Question 2: What Is the Nature of the Correlation Between the University Administration’s Technological Dynamic Capabilities and Students’ Awareness of AI Usage?

To address this question, Pearson’s product–moment correlation coefficient was calculated using SPSS. Table 13 presents the correlation matrix between the University Administration’s Technological Dynamic Capabilities in AI Scale and the Synthetic Index of Use of AI Tools. The results indicate a statistically significant positive correlation at the 0.01 level between students’ scores on the overall scale of the university administration’s technological dynamic capabilities and their scores on the Synthetic Index, suggesting a meaningful association between these two variables among students.

4.3. Results of Research Question 3: To What Extent Do the University Administration’s Technological Dynamic Capabilities Predict Students’ Awareness of AI Usage?

To explore this relationship, simple linear regression analysis was conducted to assess the extent to which the technological dynamic capabilities of university administration predict students’ awareness of using AI tools. The analysis was carried out in the following steps:
  • Testing statistical assumptions before conducting regression analyses: Normality was checked using skewness and kurtosis values; multicollinearity was assessed using Variance Inflation Factor (VIF), with all values below 5; and homoscedasticity and independence of residuals were evaluated using scatterplots and the Durbin–Watson statistic, which fell within the acceptable range (1.5–2.5).
  • Determining the Contribution of Administration’s Technological Dynamic Capabilities to Predicting Students’ Overall Awareness of AI Tools
Table 14 shows that the technological dynamic capabilities of university administration significantly contribute to predicting students’ awareness of using AI tools, with a contribution rate of 12.4%. The predictive equation can be formulated as follows:
Students’ awareness score = 52.949 + 0.905 × Technological dynamic capabilities score
As shown in Table 14, all coefficients are accompanied by 95% confidence intervals. The regression analysis showed a significant positive effect of the administration’s technological dynamic capabilities on students’ awareness of using AI tools (unstandardized B = 0.905, p < 0.001, 95% CI [0.735, 1.075]), matching the slope in the predictive equation above.
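As a worked illustration, the prediction equation can be applied to a hypothetical capability score (the coefficients are those reported in Table 14; the input value is invented for illustration):

```python
def predicted_awareness(capability_score: float) -> float:
    """Apply the reported regression equation: intercept 52.949, slope 0.905."""
    return 52.949 + 0.905 * capability_score

# Hypothetical total capability score of 40
example = predicted_awareness(40.0)
```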
3. Determining the Contribution of Administration’s Technological Dynamic Capabilities to Predicting Specific Dimensions of Students’ Awareness
As shown in Table 15, results indicate that the technological dynamic capabilities of university administration significantly contribute to predicting two specific dimensions of students’ awareness: Effectiveness of AI Tool Use and Faculty Proficiency in AI. The other three dimensions were not statistically significant. The predictive equations for the significant dimensions are as follows:
Effectiveness of AI tool use = 12.67 + 0.394 × Technological dynamic capabilities score
Faculty Proficiency in AI = 15.44 + 0.223 × Technological dynamic capabilities score
The regression analyses revealed varying levels of association. For instance, technological dynamic capabilities significantly predicted the effectiveness of AI tool use (unstandardized B = 0.394, p < 0.001, 95% CI [0.274, 0.514]). For other dimensions, such as advanced student AI skills, the confidence interval included zero, indicating a nonsignificant predictive effect.
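The two significant dimension-level equations can likewise be applied directly (coefficients as reported in Table 15; the input score below is hypothetical):

```python
def predicted_dimension_scores(capability_score: float) -> dict:
    """Apply the two significant dimension equations from Table 15."""
    return {
        "effectiveness_of_ai_tool_use": 12.67 + 0.394 * capability_score,
        "faculty_proficiency_in_ai": 15.44 + 0.223 * capability_score,
    }

# Hypothetical total capability score of 50
scores = predicted_dimension_scores(50.0)
```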

5. Discussion

5.1. Discussion of Research Question 1

This research question examined the current level of technological dynamic capabilities within the university administration and students’ awareness of using generative AI tools in the educational context. The findings indicate that both the technological dynamic capabilities of the university administration and students’ awareness of using AI tools are at a moderate level. This suggests that Sultan Qaboos University has taken steps to respond to technological change, yet there appears to be substantial room for improvement, particularly in integrating AI technologies into educational and administrative practices.
The two sub-dimensions of dynamic capabilities (Technological Administrative Communication and Technological Administrative Transformation) were also found to be moderate, reflecting the university’s developing ability to adapt to and implement AI technologies in its processes. While this indicates some institutional readiness, it points to the need for more effective decision-making and strategic integration of AI to enhance educational quality and institutional competitiveness.
These dynamic capabilities are foundational for universities seeking to adapt to rapid digital shifts. As noted by Alshadoodee et al. [50], activating such capabilities can be important to transforming traditional education into a modern digital environment, especially at the undergraduate level where students begin developing essential digital skills. Enhancing dynamic capabilities may support flexible learning environments, innovation and sustainable improvement by investing in digital infrastructure and integrating AI into both teaching and assessment.
The moderate level of students’ awareness in using generative AI tools also points to a generally positive trend, reflecting their growing engagement with emerging technologies. While some of this awareness may stem from the university’s technological environment, it is also shaped by external societal influences. This dual influence highlights a complex ecosystem of learning, where formal education intersects with informal digital literacy.
As ChatGPT and similar AI tools become increasingly prevalent in higher education [51], students are gaining exposure to powerful AI applications that support interactive, personalized, and self-directed learning. Studies highlight the rapid spread of ChatGPT in educational contexts, reflecting a broader transformation in how students learn and engage with content [52,53].
However, as Sposato cautions, the integration of AI into academic environments is not without its complexities [54]. Although AI technologies hold substantial promise for improving learning outcomes and expanding educational access, their integration into academic settings requires thoughtful consideration. Sposato emphasizes the need for a balanced approach that recognizes both the opportunities and limitations of AI while prioritizing the preservation of human-centered educational values.
In light of this, it is essential to recognize that students may not always use AI tools in ways that align with intended learning outcomes. To address this, greater institutional support is needed to guide the ethical and pedagogical use of AI. This includes targeted professional development for faculty members, who play a pivotal role in modeling responsible AI use. By equipping faculty members with the necessary skills and frameworks, universities can better support the effective educational integration of AI while upholding academic integrity.

5.2. Discussion of Research Question 2

This research question investigated the nature of the correlation between the university administration’s technological dynamic capabilities and students’ awareness of AI usage. The findings revealed a statistically significant positive but modest correlation between students’ scores on the overall University Administration’s Technological Dynamic Capabilities in AI Scale and their scores on the Synthetic Index of Use of Artificial Intelligence Tools, at a significance level of 0.01. This result indicates a statistically reliable but moderate association: as students perceive higher technological dynamic capabilities within their university administration, their awareness and use of AI tools tend to increase correspondingly.
These findings are consistent with prior research. Studies by Bates et al. and Chai et al. similarly demonstrated that when educational institutions actively build and leverage dynamic technological capabilities, they create environments that promote students’ awareness of AI applications [55,56]. This, in turn, enhances students’ motivation to engage with AI tools and develop related competencies. Kelly et al. also emphasized that the integration of AI within educational institutions encourages student acceptance of AI technologies, especially in the design and delivery of learning environments [57]. This acceptance plays a key role in strengthening students’ motivation and readiness to use these tools effectively in their academic work. Similarly, Deng and Yu found that AI tools, particularly chatbot technologies, can enhance classroom interaction and overall learning experiences [58]. These technologies do more than streamline educational processes; they enrich student engagement, boost motivation, and improve learning outcomes, all of which are crucial for success in higher education.
Our findings also align with Owens and Lilly, who noted that students’ intentions toward adopting educational technologies often reflect the quality and variety of learning experiences offered by their institutions throughout different academic stages [59]. In other words, when universities invest in and demonstrate technological dynamic capabilities, they are more likely to influence students’ positive attitudes toward using AI. This result underscores the strategic role of university administrations in shaping students’ digital engagement. The direct correlation found in this study affirms that when institutions demonstrate strong technological dynamic capabilities, they are better positioned to support students’ awareness and informed use of AI tools in educational contexts.

5.3. Discussion of Research Question 3

This research question examined the predictive relationship between university administration’s technological dynamic capabilities and students’ awareness of AI tool usage, using simple linear regression analysis. The findings show that the technological dynamic capabilities of university administration significantly contribute to predicting students’ overall awareness of AI tools, accounting for 12.4% of the variance. Additionally, these capabilities significantly predicted two specific dimensions of AI awareness: the effectiveness of AI tool use and faculty proficiency in AI, with contribution rates of 21.8% and 1.5%, respectively.
These results highlight the crucial role of institutional adaptability in cultivating digital awareness among students. This finding can be attributed to the importance of university administrations possessing dynamic capabilities that enable them to respond proactively to technological changes in the educational landscape. Such responsiveness fosters an environment in which students are more aware of, and prepared to engage with, emerging digital tools. This finding is consistent with Nadia’s work, which emphasized the necessity for institutions to adapt quickly to change and to approach it with creativity and innovation [60]. The university administration’s ability to implement best practices for using AI in classrooms represents a form of institutional adaptation that not only modernizes pedagogy but also equips students with essential competencies for the evolving labor market. This perspective aligns with the findings of Hernández-Linares, who stressed the importance of institutional sensing and responsiveness in developing organizational capabilities, particularly in the context of staff preparedness [61].
However, while the predictive contribution to students’ perception of faculty proficiency in AI was statistically significant, it was relatively limited at 1.5%, suggesting that faculty members may currently lack sufficient readiness or training to integrate AI tools effectively into their teaching. This highlights a key developmental need: enhancing the university’s dynamic capability in faculty professional development, especially around the pedagogical integration of AI applications. Strengthening this area could increase faculty confidence and competence in using AI, thereby reinforcing student awareness and engagement.
Moreover, the broader implications of AI awareness extend into the motivational domain. When students recognize the relevance and utility of AI technologies in their academic and professional lives, their acceptance of these tools increases, positively influencing their motivation to learn. Rozek et al. noted that students’ acceptance of educational technologies like AI can act as a catalyst for deeper learning, particularly when students understand the practical value of these tools [62]. Similarly, Upadhyay et al. proposed an AI acceptance model that identifies performance expectations, openness to experience, social influence, enjoyment, and productivity as key drivers of student motivation to adopt AI tools [63]. These findings, therefore, highlight the importance of strengthening the dynamic technological capabilities of university administrations, not only for advancing institutional innovation but also for enhancing students’ digital readiness and active engagement with AI tools.
Although the regression model yielded a statistically significant result (R2 = 0.124), the proportion of variance explained is relatively small. This suggests that while dynamic capabilities, specifically seizing and reconfiguring, contribute to students’ awareness of AI tools, other unmeasured factors may also play a substantial role. Moreover, it is important to note that several predictors did not reach statistical significance. These non-significant results highlight the complexity of the phenomenon and suggest that awareness of AI tools among students may be influenced by variables beyond the scope of this study, such as individual digital literacy, curriculum exposure or institutional culture. Finally, the exclusion of the “Sensing” dimension may have narrowed the operationalization of dynamic capabilities. As such, interpretations should be confined to the dimensions assessed, and future research is encouraged to explore sensing in contexts where its indicators are more observable and meaningful to participants.

5.4. Conceptual Contribution

This study contributes to the literature on Dynamic Capabilities Theory by applying it to the context of AI literacy in higher education, particularly from the perspective of students rather than institutional actors. The findings confirm the relevance of the “Seizing” and “Reconfiguring” dimensions in shaping students’ awareness of AI tools, while the exclusion of “Sensing” highlights the need to recontextualize this dimension when applied outside organizational leadership. This suggests a potential student-centered adaptation of DCT in educational contexts, where environmental scanning may be less salient than internal capability development and resource restructuring.

6. Limitations and Directions for Future Research

While this study offers valuable insights into how university administration’s technological dynamic capabilities relate to students’ awareness of AI, several limitations should be acknowledged. First, the sample was limited to students from the College of Education at Sultan Qaboos University, which may restrict the generalizability of the findings to other disciplines or institutional contexts. Moreover, a purposive sampling approach was employed, which, while appropriate for exploratory research, may limit the representativeness of the sample. The cross-sectional design of the study further restricts the ability to draw causal inferences or observe changes over time. Additionally, reliance on self-reported data introduces potential biases, including social desirability effects and subjective variability in how students interpret and report their awareness of AI.
Future research could benefit from mixed methods approaches that combine quantitative data with interviews or focus groups to provide deeper insight into students’ experiences and interpretations of AI integration. Triangulating data sources, such as institutional policy reviews, classroom observations and student performance data, could also strengthen the validity of findings and offer a more holistic understanding of institutional readiness. In addition, experimental designs may help establish causal relationships between specific dynamic capability interventions (e.g., AI-focused faculty training or curriculum redesign) and measurable changes in student AI literacy or engagement.
Importantly, the findings suggest new directions for inclusive, equity-focused research. Future studies should examine how technological dynamic capabilities influence AI awareness among students with special needs, a group often underserved in digital transformation discourse. Investigating how university administrations can adapt their technological strategies to foster accessibility and inclusion would support the broader goals of educational equity and SDG 4.
Further, the study highlights gaps in faculty preparedness for AI integration. As such, future research might design and evaluate AI-supported professional development programs to build adaptive leadership and teaching capacity within university administrations. Research could also explore how AI tools themselves, such as predictive analytics or intelligent decision systems, can enhance dynamic capabilities and support strategic planning, resource optimization, and more personalized support for diverse learners. These research efforts, combined with practical initiatives such as developing AI-integrated curricula, fostering adaptive learning environments, and organizing awareness-raising workshops, can collectively contribute to a more inclusive and technologically progressive educational ecosystem.

7. Practical Recommendations for University Administrators

Based on the study’s findings, we offer the following actionable recommendations:
  • Targeted Faculty Training: Provide ongoing, discipline-specific workshops on AI tools, focusing on pedagogical integration, ethical use, and alignment with learning outcomes.
  • Collaborative Policy Development: Engage faculty and stakeholders in creating clear, context-relevant guidelines for AI use in teaching and assessment to ensure buy-in and clarity.
  • Curriculum Adaptation: Promote redesigning curricula to incorporate AI literacy, encourage critical thinking, and include assignments that leverage AI responsibly.
  • Communities of Practice: Facilitate regular forums where faculty members can exchange experiences, share best practices, and collaboratively solve challenges related to AI integration.
  • Monitoring and Evaluation: Establish systems to regularly assess AI’s impact on student learning, academic integrity, and faculty readiness, using data to guide continuous improvement.
  • Equity and Inclusion Focus: Ensure AI initiatives address diverse student needs, including accessibility and support for underrepresented groups.

8. Conclusions

This study examined the relationship between the technological dynamic capabilities of university administration and students’ awareness of AI usage at the College of Education, Sultan Qaboos University. Findings revealed that both the university’s technological capabilities and student AI awareness were perceived as being at moderate levels, indicating progress but also clear opportunities for growth. A significant positive correlation between the two confirms the strategic role of administrative agility in shaping students’ engagement with AI tools. Regression analysis further showed that these capabilities predict students’ overall AI awareness, particularly in terms of perceived effectiveness and faculty proficiency, though the latter showed limited influence, emphasizing the need for targeted faculty development. These findings affirm that institutional readiness for AI is not solely a technological matter, but also a strategic and organizational one that can shape how effectively students engage with emerging tools. Therefore, universities seeking to foster meaningful AI integration should invest in AI-informed curricula, flexible digital learning environments, and institution-wide awareness initiatives. Based on the study’s findings, a conceptual framework is proposed that positions AI literacy as an emergent capability shaped by two dynamic processes: opportunity seizing (e.g., institutional support and access) and reconfiguring (e.g., integration of AI in learning practices). This framework reflects a modified application of DCT that emphasizes internal enablers within educational environments rather than external market sensing. Looking forward, educational leaders must consider inclusion, adaptability, and long-term sustainability as core pillars of their AI strategies. 
Future research should explore how dynamic capabilities can be mobilized to support students with diverse needs and how AI itself can be leveraged to strengthen institutional responsiveness. Only through such multidimensional efforts can higher education fulfill its role in advancing equitable, future-ready, and technologically empowered learning.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/su17157092/s1.

Author Contributions

Conceptualization, F.M.A.; methodology, F.M.A. and O.H.; validation, Z.S. and M.Q.M.A.; formal analysis, M.M.M.H. and Z.S.; data curation, M.M.M.H. and A.M.N.B.-O.; writing—original draft preparation, F.M.A., O.H., and M.Q.M.A.; writing—review and editing, Z.S.; supervision, F.M.A.; project administration, F.M.A. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the College of Education Research Ethics Committee (protocol code REAAF/EDU/DEFA/2024/14; date of approval: 2 December 2024).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors on request.

Conflicts of Interest

The authors declare no conflicts of interest.

  27. Rüdian, S.; Podelo, J.; Kužílek, J.; Pinkwart, N. Feedback on Feedback: Student’s Perceptions for Feedback from Teachers and Few-Shot LLMs. In Proceedings of the 15th International Learning Analytics and Knowledge Conference (LAK ’25), Dublin, Ireland, 3–7 March 2025; Association for Computing Machinery: New York, NY, USA, 2025; pp. 82–92. [Google Scholar] [CrossRef]
  28. Mondal, H.; Karri, J.K.K.; Ramasubramanian, S.; Mondal, S.; Juhi, A.; Gupta, P. A qualitative survey on perception of medical students on the use of large language models for educational purposes. Adv. Physiol. Educ. 2025, 49, 27–36. [Google Scholar] [CrossRef] [PubMed]
  29. Wu, D.; Sun, X.; Liang, S.; Qiu, C.; Wei, Z. Construction of AI literacy evaluation system for college students and an empirical study at Wuhan University. Front. Digit. Educ. 2025, 2, 6. [Google Scholar] [CrossRef]
  30. Laupichler, M.C.; Aster, A.; Schirch, J.; Raupach, T. Artificial Intelligence Literacy in Higher and Adult Education: A Scoping Literature Review. Comput. Educ. Artif. Intell. 2022, 3, 100101. [Google Scholar] [CrossRef]
  31. Lassébie, J.; Quintini, G. What Skills and Abilities Can Automation Technologies Replicate and What Does It Mean for Workers? New Evidence (OECD Social, Employment and Migration Working Papers No. 282); OECD Publishing: Paris, France, 2022. [Google Scholar] [CrossRef]
  32. Lane, M.; Williams, M.; Broecke, S. The Impact of AI on the Workplace: Main Findings from the OECD AI Surveys of Employers and Workers (OECD Social, Employment and Migration Working Papers No. 288); OECD Publishing: Paris, France, 2023. [Google Scholar] [CrossRef]
  33. Monteiro, A.P.; Soares, A.M.; Rua, O.L. Linking intangible resources and export performance: The role of entrepreneurial orientation and dynamic capabilities. Balt. J. Manag. 2017, 12, 329–347. [Google Scholar] [CrossRef]
  34. Alqirem, A.; Alkshali, S. Dynamic capabilities and its impact on entrepreneurial orientation: The moderating role of creative environment in information technology companies in Jordan. Glob. J. Econ. Bus. 2022, 12, 21–48. [Google Scholar] [CrossRef]
  35. Ambrosini, V.; Bowman, C. What are dynamic capabilities and are they a useful construct in strategic management? Int. J. Manag. Rev. 2009, 11, 29–49. [Google Scholar] [CrossRef]
  36. Eisenhardt, K.M.; Martin, J.A. Dynamic Capabilities: What Are They? Strateg. Manag. J. 2000, 21, 1105–1121. [Google Scholar] [CrossRef]
  37. Bodendorf, F.; Franke, J. The technological transformation process for dynamic capabilities in business operations. IEEE Trans. Eng. Manag. 2024, 71, 3671–3687. [Google Scholar] [CrossRef]
  38. Al-Obeidi, R.A. The role of technological capabilities in enhancing the dimensions of organizational ambidexterity: An exploratory study of the views of employees in the Directorate of Education in Nineveh. Arab. J. Adm. 2020, 40, 145–161. Available online: https://digitalcommons.aaru.edu.jo/aja/vol40/iss3/8 (accessed on 11 February 2025).
  39. Gheitarani, F.; Guevara, R.; Nawaser, K.; Jahanshahi, A.A. Identifying dimensions of dynamic technological capability: A systematic review of the last two decades of research. Int. J. Innov. Technol. Manag. 2022, 19, 2230002. [Google Scholar] [CrossRef]
  40. Mohamed, A.M.; Shaaban, T.S.; Bakry, S.H.; Guillén-Gámez, F.D.; Strzelecki, A. Empowering the Faculty of Education Students: Applying AI’s Potential for Motivating and Enhancing Learning. Innov. High Educ. 2025, 50, 587–609. [Google Scholar] [CrossRef]
  41. Bennett, S.; Maton, K.; Kervin, L. The “Digital Natives” Debate: A Critical Review of the Evidence. Br. J. Educ. Technol. 2008, 39, 775–786. [Google Scholar] [CrossRef]
  42. Nair, C. Technology in the classroom and its impact on college graduates. Creat. Educ. 2023, 14, 2358–2367. [Google Scholar] [CrossRef]
  43. AlNaimi, S.; Mahmoud, H.; Ghoneim, F. The Degree to Which the Use of Modern Educational Technologies Affects the Quality of Education and Its Development in Al-Ahliyya Amman University from the Viewpoint of Faculty Members. Al-Balqa J. Res. Stud. 2020, 23, 65–76. [Google Scholar] [CrossRef]
  44. George, B.; Wooden, O. Managing the strategic transformation of higher education through Artificial Intelligence. Adm. Sci. 2023, 13, 196. [Google Scholar] [CrossRef]
  45. Wang, J. The Digital Literacy Education of College Students Under the Digital China Strategy. In Proceedings of the International Conference on Education, Culture and Industry Development and Tourism (ICECIDT 2022); Volodin, A., Roumbal, I., Eds.; ASSEHR 677. Atlantis Press: Paris, France, 2023; pp. 553–560. [Google Scholar] [CrossRef]
  46. Johnson, A. Ways AI Is Changing the Education Industry. eLearning Industry. 23 June 2019. Available online: https://elearningindustry.com/ai-is-changing-the-education-industry-5-ways (accessed on 21 May 2025).
  47. Nwile, C.B.; Edo, B.L. Artificial intelligence and robotic tools for effective educational management and administration in the state-owned universities in Rivers State, Nigeria. Fac. Nat. Appl. Sci. J. Math. Sci. Educ. 2023, 4, 28–36. Available online: https://fnasjournals.com/index.php/FNAS-JMSE/article/view/143 (accessed on 12 June 2025).
  48. Kump, B.; Engelmann, A.; Keßler, A.; Schweiger, C. Toward a dynamic capabilities scale: Measuring organizational sensing, seizing, and transforming capacities. Ind. Corp. Change 2019, 28, 1149–1172. [Google Scholar] [CrossRef]
  49. Grájeda, A.; Burgos, J.; Córdova, P.; Sanjinés, A. Assessing student-perceived impact of using artificial intelligence tools: Construction of a synthetic index of application in higher education. Cogent Educ. 2023, 11, 1–24. [Google Scholar] [CrossRef]
  50. Alshadoodee, H.; Mansoor, M.; Kuba, H.; Gheni, H. The role of artificial intelligence in enhancing administrative decision support systems by depending on knowledge management. Bull. Electr. Eng. Inform. 2022, 11, 3577–3589. [Google Scholar] [CrossRef]
  51. Okonkwo, C.W.; Ade-Ibijola, A. Chatbots applications in education: A systematic review. Comput. Educ. Artif. Intell. 2021, 2, 100033. [Google Scholar] [CrossRef]
  52. Huang, W.; Hew, K.F.; Fryer, L.K. Chatbots for language learning—Are they really useful? A systematic review of chatbot-supported language learning. J. Comput. Assist. Learn. 2022, 38, 237–257. [Google Scholar] [CrossRef]
  53. Firat, M. What ChatGPT means for universities: Perceptions of scholars and students. J. Appl. Learn. Teach. 2023, 6, 57–63. [Google Scholar] [CrossRef]
  54. Sposato, M. A call for caution and evidence–based research on the impact of artificial intelligence in education. Qual. Educ. All 2025, 2, 158–170. [Google Scholar] [CrossRef]
  55. Bates, T.; Cobo, C.; Mariño, O.; Wheeler, S. Can artificial intelligence transform higher education? Int. J. Educ. Technol. High. Educ. 2020, 17, 42. [Google Scholar] [CrossRef]
  56. Chai, C.S.; Wang, X.; Xu, C. An extended theory of planned behavior for the modelling of Chinese secondary school students’ intention to learn artificial intelligence. Mathematics 2020, 8, 2089. [Google Scholar] [CrossRef]
  57. Kelly, S.; Kaye, S.A.; Oviedo-Trespalacios, O. What factors contribute to acceptance of artificial intelligence? A systematic review. Telemat. Inform. 2023, 77, 101925. [Google Scholar] [CrossRef]
  58. Deng, X.; Yu, Z. A Meta-Analysis and Systematic Review of the Effect of Chatbot Technology Use in Sustainable Education. Sustainability 2023, 15, 2940. [Google Scholar] [CrossRef]
  59. Owens, J.; Lilly, F. The influence of academic discipline, race, and gender on web-use skills among graduate-level students. J. Comput. High. Educ. 2017, 29, 286–308. [Google Scholar] [CrossRef]
  60. Nadia, F.N.D.; Sukoco, B.M.; Susanto, E.; Sridadi, A.R. Discomfort and organizational change as a part of becoming a world-class university. Int. J. Educ. Manag. 2020, 34, 1265–1287. [Google Scholar] [CrossRef]
  61. Hernández-Linares, R.; Kellermanns, F.W.; López-Fernández, M.C. Dynamic capabilities and SME performance: The moderating effect of market orientation. J. Small Bus. Manag. 2021, 59, 162–195. [Google Scholar] [CrossRef]
  62. Rozek, C.S.; Hyde, J.S.; Svoboda, R.C.; Hulleman, C.S.; Harackiewicz, J.M. Gender Differences in the Effects of a Utility-Value Intervention to Help Parents Motivate Adolescents in Mathematics and Science. J. Educ. Psychol. 2015, 107, 195–206. [Google Scholar] [CrossRef]
  63. Upadhyay, N.; Upadhyay, S.; Dwivedi, Y.K. Theorizing artificial intelligence acceptance and digital entrepreneurship model. Int. J. Entrep. Behav. Res. 2022, 28, 1138–1166. [Google Scholar] [CrossRef]
Figure 1. A conceptual model summarizing the theoretical framework of the study.
Figure 2. Sample flow diagram of participant selection.
Figure 3. Standardized values of item loadings for the University Administration’s Technological Dynamic Capabilities in Artificial Intelligence Scale according to Confirmatory Factor Analysis (CFA).
Table 1. Correlation coefficients between the score for each item and the score for its corresponding dimension in the University Administration’s Technological Dynamic Capabilities in Artificial Intelligence Scale (N = 70).
| Dimension | Item No. | Correlation Coefficient |
|---|---|---|
| Technological Administrative Communication | 1 | 0.495 ** |
| | 2 | 0.784 ** |
| | 3 | 0.650 ** |
| | 4 | 0.481 ** |
| | 5 | 0.477 ** |
| | 6 | 0.623 ** |
| | 7 | 0.646 ** |
| | 8 | 0.458 * |
| Technological Administrative Transformation | 9 | 0.546 ** |
| | 10 | 0.551 ** |
| | 11 | 0.750 ** |
| | 12 | 0.711 ** |
| | 13 | 0.767 ** |
| | 14 | 0.765 ** |

** Significant at the 0.01 level. * Significant at the 0.05 level.
Table 2. Correlation coefficients between the score of each dimension of the University Administration’s Technological Dynamic Capabilities in AI Scale and the total scale score (N = 70).
| Dimension | Correlation Coefficient |
|---|---|
| Technological Administrative Communication | 0.945 ** |
| Technological Administrative Transformation | 0.914 ** |

** Significant at the 0.01 level.
Table 3. Rotated Factor Matrix for the University Administration’s Technological Dynamic Capabilities in AI Scale (N = 70).
| No. | Item | Factor 1 | Factor 2 |
|---|---|---|---|
| 1 | The university administration applies the best technological educational practices for students in classrooms to meet labor market demands. | 0.741 | |
| 2 | The administration develops its student activities related to technology to align with labor market requirements. | 0.698 | |
| 3 | The administration systematically searches for technological information needed by students to keep up with the current labor market. | 0.682 | |
| 4 | I am aware of new technological information announced by the university administration. | 0.576 | |
| 5 | The administration utilizes available information about the technological environment to enhance quality and develop its educational services. | 0.539 | 0.413 |
| 6 | The administration can acquire knowledge related to innovation mechanisms in technology. | 0.521 | 0.352 |
| 7 | The administration implements the planned technological changes through clearly defined responsibilities for staff. | 0.492 | 0.488 |
| 8 | The administration can access new technological information required by students in the labor market. | 0.468 | 0.381 |
| 9 | The university administration is capable of adapting to technological change plans in the current context. | | 0.851 |
| 10 | The administration can develop contingency plans when unexpected technological problems arise. | | 0.712 |
| 11 | The administration continuously monitors its competitors in educational technology activities. | 0.382 | 0.632 |
| 12 | The administration continuously tracks the implementation of planned technological changes. | 0.440 | 0.615 |
| 13 | The university demonstrates strength in executing previously planned technological changes. | 0.378 | 0.587 |
| 14 | The administration can quickly access surrounding external technological knowledge. | 0.364 | 0.526 |

Eigenvalues: Factor 1 = 5.82, Factor 2 = 1.10. Percentage of variance explained: Factor 1 = 41.66%, Factor 2 = 7.89%.
Table 4. Goodness-of-fit indices for the Confirmatory Factor Analysis of the University Administration’s Technological Dynamic Capabilities in Artificial Intelligence Scale.
| Goodness-of-Fit Index | Calculated Value | Acceptable Values |
|---|---|---|
| Chi-square | 140.021 | |
| Degrees of freedom | 76 | |
| Chi-square/degrees of freedom | 1.842 | < 3 |
| Tucker–Lewis Index (TLI) | 0.901 | ≥ 0.95 |
| Comparative Fit Index (CFI) | 0.903 | ≥ 0.95 |
| Root Mean Square Error of Approximation (RMSEA) | 0.078 | < 0.08 |
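The acceptance criteria in Table 4 can be checked mechanically. The sketch below is illustrative only (the `evaluate_fit` helper is not part of the study); it applies the listed cutoffs to the reported fit statistics:

```python
# Hypothetical helper: compare reported CFA fit statistics against the
# acceptance thresholds listed in Table 4.

def evaluate_fit(chi_square, df, tli, cfi, rmsea):
    """Return a dict mapping each criterion to True if it is met."""
    return {
        "chi_square/df < 3": chi_square / df < 3,
        "TLI >= 0.95": tli >= 0.95,
        "CFI >= 0.95": cfi >= 0.95,
        "RMSEA < 0.08": rmsea < 0.08,
    }

# Values reported in Table 4:
result = evaluate_fit(140.021, 76, tli=0.901, cfi=0.903, rmsea=0.078)
```

With the reported values, the chi-square/df ratio (1.842) and RMSEA (0.078) satisfy their cutoffs, while TLI and CFI fall just below the 0.95 threshold.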
Table 5. CFA results for the Technological Dynamic Capabilities of University Administration in Artificial Intelligence Scale.
| Item | Standardized Loading | Standard Error | z-Value |
|---|---|---|---|
| 1 | 0.577 | 0.182 | 5.468 |
| 2 | 0.633 | 0.197 | 5.850 |
| 3 | 0.635 | 0.195 | 5.865 |
| 4 | 0.567 | 0.196 | 5.400 |
| 5 | 0.696 | 0.059 | 6.242 |
| 6 | 0.610 | 0.069 | 5.700 |
| 7 | 0.476 | 0.054 | 4.708 |
| 8 | 0.585 | — | — |
| 9 | 0.631 | — | — |
| 10 | 0.677 | 0.170 | 6.574 |
| 11 | 0.749 | 0.170 | 7.090 |
| 12 | 0.564 | 0.178 | 5.674 |
| 13 | 0.665 | 0.167 | 6.483 |
| 14 | 0.664 | 0.168 | 6.478 |

All z-values shown in the table are statistically significant at the 0.01 level.
Table 6. Reliability coefficients for the dimensions and the overall scale (N = 70).
| No. | Dimension | Cronbach's Alpha | McDonald's Omega |
|---|---|---|---|
| 1 | Technological Administrative Communication | 0.78 | 0.79 |
| 2 | Technological Administrative Transformation | 0.79 | 0.79 |
| | Total Scale | 0.891 | 0.85 |
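For readers reproducing the reliability analysis, Cronbach's alpha can be computed directly from raw item scores using the standard variance-ratio formula, alpha = k/(k − 1) × (1 − Σ item variances / variance of total scores). The snippet below is an illustrative, stdlib-only sketch (not the authors' code); McDonald's omega would additionally require the CFA factor loadings.

```python
# Illustrative sketch: Cronbach's alpha from item-score vectors.

def variance(xs):
    """Unbiased sample variance."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: list of k equal-length lists, one list per scale item."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # total score per respondent
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / variance(totals))
```

For example, three perfectly correlated items yield alpha = 1.0, and uncorrelated items drive alpha toward 0.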
Table 7. Dimensions of the Synthetic Index of Use of Artificial Intelligence Tools and their corresponding item numbers.
| No. | Dimension | Item Numbers |
|---|---|---|
| 1 | Effectiveness of using AI tools | 1, 2, 3, 4, 5, 6, 7, 8, 9 |
| 2 | Effectiveness of using ChatGPT | 10, 11, 12, 13, 14, 15, 16 |
| 3 | Student efficiency in using AI tools | 17, 18, 19, 20, 21, 22 |
| 4 | Faculty member efficiency in AI | 23, 24, 25, 26 |
| 5 | Advanced student skills in AI | 27, 28, 29, 30 |
Table 8. Correlation coefficients between item scores and their respective dimension scores on the Synthetic Index of Use of AI Tools (N = 70).
| Dimension | Item No. | Correlation Coefficient |
|---|---|---|
| Effectiveness of Using Artificial Intelligence Tools | 1 | 0.732 ** |
| | 2 | 0.519 ** |
| | 3 | 0.694 ** |
| | 4 | 0.808 ** |
| | 5 | 0.892 ** |
| | 6 | 0.764 ** |
| | 7 | 0.857 ** |
| | 8 | 0.748 ** |
| | 9 | 0.767 ** |
| Effectiveness of Using ChatGPT | 10 | 0.844 ** |
| | 11 | 0.779 ** |
| | 12 | 0.753 ** |
| | 13 | 0.841 ** |
| | 14 | 0.866 ** |
| | 15 | 0.815 ** |
| | 16 | 0.790 ** |
| Student Proficiency in Using Artificial Intelligence Tools | 17 | 0.693 ** |
| | 18 | 0.830 ** |
| | 19 | 0.775 ** |
| | 20 | 0.725 ** |
| | 21 | 0.787 ** |
| | 22 | 0.840 ** |
| Faculty Member Proficiency in Artificial Intelligence | 23 | 0.826 ** |
| | 24 | 0.876 ** |
| | 25 | 0.638 ** |
| | 26 | 0.812 ** |
| Students' Advanced AI Skills | 27 | 0.823 ** |
| | 28 | 0.934 ** |
| | 29 | 0.930 ** |
| | 30 | 0.903 ** |

** Correlation significant at the 0.01 level (2-tailed).
Table 9. Correlation coefficients between each dimension of the Synthetic Index of Use of AI Tools and the total score (N = 70).
| Dimension | Correlation Coefficient |
|---|---|
| Effectiveness of Using AI Tools | 0.826 ** |
| Effectiveness of Using ChatGPT | 0.638 ** |
| Student Competence in Using AI Tools | 0.876 ** |
| Faculty Member Competence in AI | 0.812 ** |
| Students' Advanced AI Skills | 0.853 ** |
| Total Scale | 0.813 ** |

** Statistically significant at the 0.01 level.
Table 10. Values of Cronbach's alpha and McDonald's omega for the dimensions of the Synthetic Index of Use of AI Tools among students and the total score (N = 70).

| No. | Dimension | Cronbach's Alpha | McDonald's Omega |
|---|---|---|---|
| 1 | Effectiveness of Using AI Tools | 0.90 | 0.91 |
| 2 | Effectiveness of Using ChatGPT | 0.91 | 0.90 |
| 3 | Student Competence in Using AI Tools | 0.87 | 0.86 |
| 4 | Faculty Member Competence in AI | 0.80 | 0.82 |
| 5 | Students' Advanced AI Skills | 0.91 | 0.92 |
| | Total Scale | 0.91 | 0.92 |
Table 11. Means and standard deviations of the University Administration’s Technological Dynamic Capabilities in AI Scale.
| Dimension | Mean | Standard Deviation | Level of Agreement | Rank |
|---|---|---|---|---|
| Technological Administrative Communication | 3.04 | 0.47 | Agree | 1 |
| Technological Administrative Transformation | 2.97 | 0.46 | Agree | 2 |
| Overall Technological Dynamic Capabilities | 3.00 | 0.39 | Agree | |
Table 12. Means and standard deviations of Synthetic Index of Use of Artificial Intelligence Tools.
| Dimension | Mean | Standard Deviation | Level of Agreement | Rank |
|---|---|---|---|---|
| Effectiveness of Using AI Tools | 3.24 | 0.510 | Agree | 1 |
| Effectiveness of Using ChatGPT | 2.96 | 0.616 | Agree | 4 |
| Students' Proficiency in Using AI Tools | 3.09 | 0.568 | Agree | 2 |
| Faculty Member's Proficiency in AI | 2.97 | 0.641 | Agree | 3 |
| Students' Advanced AI Skills | 2.67 | 0.838 | Agree | 5 |
| Overall Student Awareness of AI Tool Usage | 3.04 | 0.473 | Agree | |
Table 13. Correlation matrix between students’ scores on the University Administration’s Technological Dynamic Capabilities in AI Scale and its dimensions (independent variable) and their scores on the Synthetic Index of Use of AI Tools (dependent variable) (N = 139).
| Dimension | Overall Score of AI Tools Usage Index |
|---|---|
| Technological Administrative Communication | 0.312 ** |
| Technological Administrative Transformation | 0.351 ** |
| Total Score of Dynamic Capabilities Scale | 0.352 ** |

** Statistically significant at the 0.01 level.
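The coefficients in Table 13 are bivariate Pearson correlations. As a reference, a minimal stdlib implementation (illustrative only, not the authors' code) is:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den
```

A perfectly linear positive relationship gives r = 1, and a perfectly inverse one gives r = −1.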
Table 14. Results of simple linear regression analysis to determine the contribution of technological dynamic capabilities of university administration in predicting students’ awareness of using AI tools.
Table 14. Results of simple linear regression analysis to determine the contribution of technological dynamic capabilities of university administration in predicting students’ awareness of using AI tools.

| Independent Variable | Dependent Variable | Constant | Regression Coefficient | 95% CI [LL, UL] | R | R² | t-Value | F-Value |
|---|---|---|---|---|---|---|---|---|
| Technological dynamic capabilities | Students’ awareness of using AI tools | 52.949 | 0.905 | [0.735, 1.075] | 0.352 | 0.124 | 10.24 | 19.320 |

Significant at the 0.01 level.
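In simple linear regression the coefficient, constant, and R² follow directly from the least-squares formulas (slope = Sxy/Sxx, intercept = ȳ − slope·x̄, R² = 1 − SSres/SStot). The sketch below is an illustrative stdlib-only implementation (the `simple_ols` helper is hypothetical, not the authors' analysis code):

```python
# Illustrative ordinary least squares for one predictor.

def simple_ols(x, y):
    """Return (constant, regression coefficient, R squared)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    b = sxy / sxx                     # regression coefficient (slope)
    a = my - b * mx                   # constant (intercept)
    ss_tot = sum((v - my) ** 2 for v in y)
    ss_res = sum((v - (a + b * u)) ** 2 for u, v in zip(x, y))
    return a, b, 1 - ss_res / ss_tot  # R² = 1 - SSres/SStot
```

On exactly linear data (e.g., y = 1 + 2x) it recovers the slope and intercept with R² = 1.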
Table 15. Results of simple linear regression analysis to determine the contribution of technological dynamic capabilities of university administration in predicting the dimensions of students’ awareness of using AI tools.
| Independent Variable | Dependent Variable | Constant | Regression Coefficient | 95% CI [LL, UL] | R | R² | t-Value | F-Value |
|---|---|---|---|---|---|---|---|---|
| Technological dynamic capabilities | Effectiveness of AI tool use | 12.669 | 0.394 | [0.274, 0.514] | 0.473 | 0.218 | 6.287 | 39.529 |
| Technological dynamic capabilities | Effectiveness of ChatGPT use | 15.444 | 0.126 | [−0.004, 0.256] | 0.161 | 0.019 | 1.908 | 3.641 |
| Technological dynamic capabilities | Student proficiency in using AI tools | 14.717 | 0.090 | [−0.034, 0.214] | 0.146 | 0.014 | 1.724 | 2.972 |
| Technological dynamic capabilities | Faculty proficiency in AI | 2.486 | 0.223 | [0.155, 0.291] | 0.480 | 0.015 | 6.401 | 40.971 |
| Technological dynamic capabilities | Advanced student AI skills | 7.634 | 0.072 | [−0.031, 0.175] | 0.118 | 0.297 | 1.390 | 1.932 |