Article

Predicting the Intention to Use Learning Analytics for Academic Advising in Higher Education

by Mahadi Bahari 1,2,3,*, Ibrahim Arpaci 4,*, Nurulhuda Firdaus Mohd Azmi 3 and Liyana Shuib 5

1 Department of Information Systems, Faculty of Management, Universiti Teknologi Malaysia, Johor 81310, Malaysia
2 College of Business Administration, University of Business Technology, Jeddah 23435, Saudi Arabia
3 UTM Big Data Centre, Universiti Teknologi Malaysia, Skudai 81310, Malaysia
4 Department of Software Engineering, Faculty of Engineering and Natural Sciences, Bandirma Onyedi Eylul University, Balikesir 10200, Türkiye
5 Department of Information Systems, Faculty of Computer Science and Information Technology, University of Malaya, Kuala Lumpur 50603, Malaysia
* Authors to whom correspondence should be addressed.
Sustainability 2023, 15(21), 15190; https://doi.org/10.3390/su152115190
Submission received: 4 September 2023 / Revised: 17 October 2023 / Accepted: 18 October 2023 / Published: 24 October 2023

Abstract:
Learning analytics (LA) is a rapidly growing educational technology with the potential to enhance teaching methods and boost student learning and achievement. Despite this potential, the adoption of LA remains limited within the education ecosystem, and users who do employ LA often struggle to engage with it effectively. Accordingly, this study developed and assessed a model of users’ intention to utilize LA dashboards. The model incorporates constructs from the “Unified Theory of Acceptance and Use of Technology”, supplemented with personal innovativeness, information quality, and system quality. The study used an exploratory research methodology with purposive sampling; participants with prior experience of LA technologies were selected to take part. Data were collected via various social networking platforms from 209 academic staff and university students (59.33% male) at four top Malaysian universities. “Partial Least Squares Structural Equation Modeling” was employed to explore the interrelationships among the constructs within the model. The results revealed that information quality, social influence, performance expectancy, and system quality all positively impacted the intention to use LA. Additionally, personal innovativeness exhibited both direct and indirect positive impacts on the intention to use LA, the latter mediated by performance expectancy. These findings offer valuable insights to educational institutions, policymakers, and service providers seeking to enhance LA adoption and usage, and they can positively impact the field of educational technology, paving the way for improved educational practices and outcomes through the thoughtful integration of LA tools.
The incorporation of sustainability principles in the development and deployment of LA tools can significantly heighten their effectiveness, drive user adoption, and ultimately nurture sustainable educational practices and outcomes.

1. Introduction

Recent technological developments have presented a chance to collect and monitor students’ learning patterns in digital environments, storing them as extensive sets of data [1]. Therefore, through learning analytics (LA), higher education in the twenty-first century persists in promoting discoveries in the field [2]. LA is commonly defined as measuring, storing, analyzing, and reporting data on students’ progress as well as the environments in which they learn, aiming to better comprehend and improve both the learning environment and learning itself [3]. Research has revealed several advantages of LA, such as the provision of customized course offerings, superior learning outcomes, curriculum design, teaching execution, and greater post-educational employment options [4].
Utilizing data gathered from learners’ interactions with a “Learning Management System” (LMS) to make predictions about the factors that contribute to increased student retention is one example of learning analytics. According to the findings of [5], students who are actively engaged by participating in more discussions, sending more messages, and completing more assessments tend to achieve greater overall grades in the course. The results highlight the potential role that learning analytics can play in guiding students’ academic success and growth [6].
Today’s instructors need a learning system that offers thorough and instructive LA for their online courses [7]. By accessing the LA of students’ lesson completion status and quiz results, educators may better understand students’ ability to follow and grasp the course contents, the themes they found problematic, their social relationships, and their knowledge gains [8]. LA has shown that it can detect learners’ unique requirements and learning challenges in addition to predicting student success. This knowledge might be used to build flexible educational structures with tailored directions for specific students [9]. Thus, when properly applied, LA could result in greater accountability at all educational levels.
The early promise of LA to enhance learning and its settings has not been fully realized [10]. There is currently little evidence demonstrating how LA services affect outcomes of student learning, processes of instruction and learning, as well as institutional decision-making [11]. Although numerous tools and methods for LA have been presented, there is little empirical evidence of the factors impacting this new technology’s potential adoption [8]. To handle the difficulties of LA adoption, previous researchers have developed numerous instruments and frameworks to guide the implementation of LA technologies in Australia, Europe, and North America [3]. Although those studies have enabled academics to pinpoint crucial factors that generally impact the LA services’ acceptance, there are still several complicated silos, competing leadership agendas, and institution-specific problems [12]. There have been few efforts to develop theoretical models and evaluate the variables determining the intention to utilize LA technologies [13].
The integration of learning analytics (LA) in the educational system is presently limited, and users often face challenges in effectively utilizing LA technologies [14]. This highlights a critical research problem: understanding and improving users’ willingness to use LA dashboards, ultimately enabling the smooth integration of LA tools in educational environments [15]. Learning analytics has substantial potential to revolutionize teaching methods and enhance student learning and success [16]. However, its underutilization and ineffective engagement hinder the realization of these advantages [17]. Addressing this research problem is crucial for optimizing the incorporation of LA tools, thereby enhancing educational methods and results. Hence, this study aimed to create and validate a model of users’ intention to use LA dashboards by integrating elements from the “Unified Theory of Acceptance and Use of Technology” (UTAUT), supplemented with the “system quality” (SQ), “personal innovativeness” (PI), and “information quality” (IQ) constructs. The study hypothesized that these additional variables, along with the UTAUT constructs, would positively influence the intention to use LA among academic staff and university students in four leading Malaysian universities.
This study makes a significant contribution by elucidating crucial factors that influence users’ intention to utilize LA, including information quality, social influence, performance expectancy, and system quality. Furthermore, it emphasizes the direct and indirect positive effects of personal innovativeness on the intention to use LA, the latter mediated by performance expectancy. These findings offer valuable insights for educational institutions, policymakers, and service providers seeking to enhance LA adoption and usage. The study’s contribution extends beyond the current research, positively impacting the educational technology field. By advocating for the integration of sustainability principles in the development and implementation of LA tools, it suggests a path to enhance effectiveness, drive user adoption, and ultimately foster sustainable educational practices and outcomes.
The paper is organized as follows: Section 2 provides a review of related works, while Section 3 presents the conceptual framework and hypotheses. In Section 4, we delve into the research methodology, and Section 5 covers the results of the data analysis. Section 6 discusses the key findings of the study, including theoretical and managerial implications. Finally, in Section 7, we conclude the study, address limitations, and consider future avenues of research.

2. Literature Review

2.1. Learning Analytics Dashboards

Over the past few years, technology has evolved into a crucial tool that helps teachers and students create more effective learning environments. The proliferation of online learning environments has substantially increased, expanding the amount of data created about the educational process [18]. Mitchell and Costello coined the term “Analytics of Learning” in 2000, presenting it as an emerging concept in their analysis of potential prospects in the global market for the development and implementation of educational services via the network [19].
Stakeholders in higher education are increasingly employing “Learning Analytics” (LA) dashboards for various purposes, such as tools for learners to assess their progress [20], tools for administrators to support and control instructors, students, and staff [21], and tools for faculty to assess students’ performance and provide feedback on their teaching exercises [22].
LA, a field focused on gathering, analyzing, and sharing data about learners and their learning environments, evolved to realize the promise of this data analysis. LA is intended to address retention concerns and preserve the effectiveness of academic achievement [23]. Based on the theoretical assumptions of [5], LA has been shown to be a highly effective tool for raising awareness of the importance of bringing information technologies and education closer together in the context of advancing higher education and, most significantly, for supporting professional and personal development through learning.
Numerous frameworks have been suggested to facilitate the adoption of LA and address the challenges it encounters. There are, however, few empirical studies on the variables impacting the potential adoption of LA technologies [8]. Klein et al. conducted qualitative research to comprehend organizational challenges, rewards, and opportunities relating to faculty and professional advising staff usage of LA techniques [24]. The results indicate that both organizational commitment and the organizational context, encompassing structures, processes, leadership, and policies, influence individuals’ choices to utilize and place trust in LA technologies. Another study [3] interviewed institutional leaders in European higher education organizations and identified context, challenges, strategy, and people as factors of LA adoption. The results suggested routinely assessing LA adoption to guarantee the desired changes and the alignment of strategies.
Dawson et al. utilized “complexity leadership theory” (CLT) to identify the interaction between key dimensions of LA adoption. Interviews were conducted in Australian universities, and the results suggested that research on LA adoption models needs to extend beyond small-scale course stages to a more integrated and comprehensive organizational level [5]. Based on the UTAUT, Herodotou et al. analyzed the involvement patterns of university teachers using LA dashboards through in-depth interviews [25]. The results revealed that “social influence” (SI), “performance expectancy” (PE), and “effort expectancy” (EE) were among the elements promoting engagement with “predictive learning analytics” (PLA), whereas “facilitating conditions” (FC) and a lack of knowledge of predictive data were factors that prevented PLA involvement. Based on the “technology acceptance model” (TAM), Ali et al. provided a model of the variables impacting instructors’ opinions on the use of LA tools [8]. According to the model, usage beliefs regarding an LA tool are related to the intention to employ the technology. The study identified the analytics categories (interactive visualization) that are the main interpreters of “perceived ease of use” and “perceived usefulness”.
Malaysia is known as one of the largest educational providers in Asia. The Malaysian government has consistently created efficient policies to improve the educational system. The newest but most crucial measure to enhance the educational system is thought to be the implementation of LA [23]. Malaysia is just beginning to investigate LA’s possibilities to aid student retention. The “Ministry of Higher Education” (MOHE) intends to emphasize LA to incorporate the learning and teaching transformation in higher education institutions, shifting the emphasis from retention to better satisfy the present changes in the industry known as the “fourth industrial revolution” (IR 4.0) [26]. To follow the IR 4.0 revolution, the MOHE advised a focus on four key areas: reforming learning classrooms, integrating 21st-century teaching methods, utilizing a flexible curriculum to address new developments and fields of knowledge, and utilizing the most recent teaching and learning technologies [27]. Studies confirmed academics’ high interest in utilizing LA for learning and teaching and its positive role in learners’ performance in Malaysian higher learning institutions [23,26]. Zaki et al. proposed an LA conceptual model in serious games for education in Malaysia [28]. Ismail et al. identified the LA as a powerful technique for exploring the data generated from the LMS and highlighted the likelihood of handling the LA technique with the LMS engagement in educational institutions [29]. However, LA implementation in Malaysia is fraught with difficulties. As a vital tool for the management and operation of educational institutions in Malaysia, LA is still not frequently utilized [27]. There is a lack of theoretical models that examine the determinants of LA tools’ usage in higher education in Malaysia. 
Therefore, the objective of this research was to construct a conceptual framework addressing the question: “What factors influence users’ intention to utilize LA dashboards within Malaysian higher education institutions?”

2.2. Applications of LADs in Educational Settings

“Learning Analytics Dashboards” (LADs) have evolved in sync with the advancements in educational technology and the growing abundance of educational data. While the roots of learning analytics can be traced back to the early 2000s, progress in data analytics, visualization techniques, and digital learning platforms has significantly impacted the creation and usage of LADs in higher education [30].
Learning analytics utilize methodologies from data science to analyze data and present the resulting analysis using diverse textual and visual approaches [31]. In the domain of LA, dashboards have gained significant attention as tools that can provide users with relevant insights, encourage self-reflection, and potentially guide interventions to optimize learning and improve the quality of the student experience [32]. As defined by Schwendimann et al., LADs are described as “a unified display that consolidates various indicators about the learner(s), learning process(es), and/or learning context(s) into one or multiple visual representations” [33] (p. 8).
These dashboards aggregate and synthesize diverse educational data, offering insights into student performance, engagement, behavior, and learning patterns [34]. LADs are designed to assist educators, administrators, and students in making informed decisions, optimizing teaching and learning strategies, and improving overall educational outcomes [35].
LADs provide a real-time view of individual and group academic performance [36]. Consequently, educators can monitor student progress, identify areas for improvement, and tailor instructional strategies accordingly. LADs analyze student data to craft personalized learning pathways based on individual weaknesses, strengths, and learning styles [37]. This facilitates a customized learning experience, enhancing student engagement and comprehension [38].
LADs can detect early signs of academic struggles or disengagement, enabling timely intervention and support for at-risk students [39]. This proactive approach can boost student retention rates [39]. Educators can utilize LADs to evaluate the effectiveness of courses and curricula [32]. Insights from the dashboards can guide adjustments to course content, assessments, and instructional strategies for better learning outcomes.
LADs can employ predictive models to forecast student success, aiding institutions in identifying students who may require additional support [40]. They evaluate student engagement and motivation levels using data on participation, interactions, and feedback [41]. This information empowers educators to implement strategies that enhance engagement and motivation [42]. Institutions can optimize resource allocation, including faculty time and support services, based on LAD insights [43]. This data-driven approach facilitates efficient planning and allocation of educational resources [33].
In summary, LADs are pivotal in contemporary higher education as they harness the power of data and analytics to improve teaching, enrich learning experiences, and promote student achievement [44]. They offer a multifaceted view of educational data, enabling stakeholders to make data-informed decisions and nurture a more effective and efficient educational ecosystem [42]. Integrating learning analytics into academic advising not only enhances the advising process but also contributes to student success and retention [39]. By harnessing the power of data analytics, academic advisors can provide timely, personalized guidance, ultimately supporting self-regulated learning [38].

2.3. Research Gap

In a thorough examination of existing literature, researchers [45] identified critical factors predicting the adoption of “Learning Analytics” (LA) within higher education institutions (HEIs). These factors encompassed user support, effective communication with users, comprehensive end-user training, and the establishment of standards for LA tools. Another systematic review focused on enhancing students’ learning performance through improved engagement in learning, encompassing cognitive, behavioral, and emotional dimensions [46].
A separate study [47] explored trust’s influence on the integration of LA within higher education. The results showed that educators displayed substantial trust in the competence of HEIs and the effectiveness of LA. However, their trust in technology vendors concerning privacy and ethical aspects was relatively diminished. Another investigation [48] aimed to grasp students’ privacy anxieties regarding the effective deployment of LA tools in HEIs. This study revealed aspects that heightened students’ concerns regarding the gathering, utilization, and disclosure of the data for learning analytics.
Furthermore, a study [49] investigated differences in teachers’ utilization of LA and identified that early adopters, classified as “innovators”, displayed significantly higher engagement during remote education weeks compared to their counterparts. This underscored the critical role of personal innovativeness in LA tool adoption. Additionally, recent research [50] examined LA adoption in higher education and emphasized “perceived usefulness” and “ease of use” as significant determinants.
Another study [51] explored the effect of a comprehensive LA approach on learning performance in online collaborative learning. The outcomes indicated a significant enhancement in group performance, collaborative knowledge building, metacognitive learning engagement, social interaction, and coregulated behaviors compared to traditional online collaborative learning. Moreover, research [52] evaluated the effectiveness and utility of an LA dashboard in HEIs, demonstrating its efficacy in aiding students in informed decision-making regarding their learning approach.
Integrating AI-based performance prediction models with LA methodologies, a study [53] endeavored to enhance student learning outcomes within a collaborative learning environment. The integrated approach exhibited notable improvements in collaborative learning performance, increased student engagement, and heightened student satisfaction with the learning process. Additionally, research [54] investigated how students utilize “Learning Analytics Dashboards” (LADs) in higher education. The findings highlighted that engagement with LADs supported students’ metacognitive and time management strategies during learning, particularly during out-of-class learning sessions, with greater benefits observed for higher-performing students.
In previous research, the “Technology Acceptance Model” (TAM) and the “Unified Theory of Acceptance and Use of Technology” (UTAUT) were predominantly employed to elucidate “Learning Analytics” (LA) adoption within “Higher Education Institutions” (HEIs). Table 1 succinctly presents the key findings from studies on LA adoption within HEIs. These findings consistently demonstrate robust correlations among factors such as “Perceived Usefulness” (PU), “Perceived Ease of Use” (PEOU), “Behavioral Intention” (BI), “Performance Expectancy” (PE), “Effort Expectancy” (EE), “Social Influence” (SI), and others. These factors are shown to have positive or negative associations with one another or with specific outcomes, including training satisfaction and attitudes toward use.
While the literature review offers valuable insights into the essential factors that influence the adoption of “Learning Analytics” (LA) and its influence on student learning, a gap persists in achieving a comprehensive understanding of the interconnections among these factors and their collective impact on student learning. This study took a deeper dive into exploring how information quality, system quality, and personal innovativeness converge to influence LA adoption. To fill this research gap and gain a more holistic perspective, further exploration of the viewpoints and experiences of both educators and students concerning LA adoption is warranted, providing a more encompassing view of this subject.

3. Theoretical Background

3.1. Conceptual Framework

One of the most frequently cited theories in the literature on technology acceptance and information systems (IS) is the UTAUT [59]. The UTAUT has been repeatedly replicated with success and applied in research on a wide range of technologies and even circumstances outside of adoption [60,61]. The complete range of innovation adoption, from initial adoption to post-adoption use, has been studied using UTAUT [61]. UTAUT was confirmed to surpass earlier adoption theories [59]. Moreover, UTAUT stood out as the most employed model in a recent literature review investigating technology acceptance models from 2010 to 2020 [62]. UTAUT is a comprehensive and successful theory for examining the adoption and use of different technologies in the education sector [63]. Hence, UTAUT was adopted as the foundation of the current research. UTAUT consists of the “performance expectancy” (PE), “effort expectancy” (EE), “social influence” (SI), and “facilitating conditions” (FC) constructs. In this study, UTAUT has been employed to assess the intention to use LA (IULA). In addition to the UTAUT’s basic constructs, the “personal innovativeness” (PI), “information quality” (IQ), and “system quality” (SQ) constructs were included in the model, as they have been recommended as significant for technology acceptance in educational environments [64,65,66]. Figure 1 shows the conceptual framework developed in this study.
The study employed UTAUT as a theoretical framework to understand users’ intention to utilize LA dashboards. UTAUT is a well-established model that integrates several key constructs such as “performance expectancy”, “effort expectancy”, “social influence”, and “facilitating conditions”, providing a comprehensive understanding of user acceptance and use of technology. The decision to use UTAUT in this study is justified by its widespread recognition and adoption in technology acceptance research [67].
Additionally, the study integrated two variables from the DeLone and McLean IS Success Model, “information quality” (IQ) and “system quality” (SQ), as well as the “personal innovativeness” (PI) construct. “Information quality” (IQ) assesses the quality of the information provided by the system, while “system quality” (SQ) evaluates the overall system attributes and capabilities [68]. “Personal innovativeness” (PI) measures an individual’s openness to trying out novel technologies and innovations [69].
SQ and IQ are widely acknowledged as pivotal factors influencing the acceptance and utilization of information systems, including learning analytics platforms [70]. SQ encompasses the overall performance and reliability of the system, while IQ encompasses the accuracy, relevance, and completeness of the information provided by the system [71]. These two factors are crucial in determining the effectiveness of learning analytics from the users’ perspective. PI is also considered a relevant variable due to its connection with technology adoption. It signifies an individual’s willingness to embrace and utilize new technologies, making it highly relevant in the context of learning analytics adoption [56]. Prior research, including studies utilizing the “Diffusion of Innovations theory,” underscores the significance of PI as a determinant of technology acceptance and adoption [72].
The integration of these variables from the D&M “IS Success Model” is justified by their relevance to the context of LA adoption and usage. “Information quality” (IQ) and “system quality” (SQ) are critical factors influencing users’ perception and acceptance of LA dashboards, aligning with the study’s objective to investigate factors impacting LA adoption. “Personal innovativeness” (PI) complements UTAUT constructs by assessing the individual’s openness to adopting recent technologies, which is crucial in the context of rapidly evolving educational technologies like LA.

3.2. Hypothesis Development

The term “performance expectancy” (PE) refers to “the degree to which an individual believes that utilizing the system will assist them in achieving improvements in job performance” [59]. The degree to which using the system is simple and uncomplicated is referred to as “effort expectancy” (EE) [59]. Handoko confirmed the significant influence of PE and EE on students’ perceptions of technology adoption [73]. The positive impact of PE on teachers’ adoption of MOOCs has been supported in [74]. Using the UTAUT, the objective of [25] was to identify the factors predicting university teachers’ engagement with “predictive learning analytics” (PLA). The study’s findings indicated that key facilitators for engagement with PLA included PE and EE. Therefore, in this study, we expect the following:
H1: 
PE positively influences IULA.
H2: 
EE positively influences IULA.
The term “social influence” (SI) refers to “the extent to which an individual believes that significant others believe they should use the new system” [59]. Handoko found a positive effect of SI on students’ behavioral intentions to use technology [73]. Abd Rahman et al. confirmed the positively significant effect of SI on the intent to utilize flipped learning among learners [75]. Herodotou et al. discovered that social influence significantly and positively impacts university teachers’ engagement with “predictive learning analytics” (PLA) [25]. In this study, we expect users’ desire to learn and adopt LA to be influenced by the opinions and actions of their colleagues and peers: when users observe their peers using LA, they are more likely to use it themselves.
H3: 
SI positively influences IULA.
The term “facilitating conditions” (FC) refers to “the level of confidence an individual has in the existence of an organizational and technical infrastructure to support system usage” [59]. According to UTAUT, the FC factor significantly impacts the ultimate acceptance and use of an innovation [25]. The influence of FC on students’ and educators’ behavior toward technology adoption has been confirmed in [63,74]. Lecturers and students are expected to be more willing to use LA for educational purposes if sufficient technical and organizational infrastructure is provided to support them.
H4: 
FC positively influences IULA.
In the field of general diffusion of innovations research, it has long been widely accepted that highly innovative individuals are active information seekers about new concepts (Lu et al., 2005). They can manage a great deal of ambiguity and develop more positive attitudes towards acceptance. “Personal innovativeness” (PI) denotes a proclivity for experimenting with cutting-edge information technology, showcasing a positive correlation with the utilization and adoption of emerging technologies [67,76,77]. According to Blut et al., PI has a strong impact on technology usage [60]. Previous studies have indicated that PI positively impacts the intentions to use m-learning applications [56,76,78], cryptocurrencies [79], and cloud computing systems [80]. Moreover, recent findings confirmed the effects of PI on PE and EE related to e-learning and e-book adoption by students [81,82]. Therefore, in this study, we hypothesize the following:
H5: 
PI positively influences IULA.
H6: 
PI positively influences PE.
H7: 
PI positively influences EE.
“Information quality” (IQ) evaluates the quality of the content and information offered on an online platform. The provided information inevitably influences user satisfaction and intentions to use [66]. In this study, IQ assesses the quality of the information generated with LA tools and its effectiveness for users. “System quality” (SQ) refers to the extent of a system’s availability, speed of feedback, user-friendliness, and screen features (interface)—all indicators of its usability—influencing the user’s behavioral intentions to use the innovation [68]. E-learning studies have shown that IQ and SQ have a substantial positive effect on users’ intentions to use [64,65]. Both information quality and system quality have a positive and significant effect on both usage and user satisfaction [83]. Thus, in this study, we hypothesize the following:
H8: 
IQ positively influences IULA.
H9: 
SQ positively influences IULA.
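For a compact overview, the nine hypothesized paths (H1-H9) can be written as a simple edge list. The sketch below is purely illustrative; the notation and data structure are our own and are not part of the study's analysis:

```python
# Illustrative only: the nine hypothesized structural paths (H1-H9)
# expressed as (source construct, target construct) pairs.
# Abbreviations follow the paper: PE, EE, SI, FC, PI, IQ, SQ, and
# IULA (intention to use learning analytics).
HYPOTHESES = {
    "H1": ("PE", "IULA"),  # performance expectancy -> intention
    "H2": ("EE", "IULA"),  # effort expectancy -> intention
    "H3": ("SI", "IULA"),  # social influence -> intention
    "H4": ("FC", "IULA"),  # facilitating conditions -> intention
    "H5": ("PI", "IULA"),  # personal innovativeness -> intention (direct)
    "H6": ("PI", "PE"),    # personal innovativeness -> performance expectancy
    "H7": ("PI", "EE"),    # personal innovativeness -> effort expectancy
    "H8": ("IQ", "IULA"),  # information quality -> intention
    "H9": ("SQ", "IULA"),  # system quality -> intention
}

# Seven of the nine paths point directly at the IULA construct.
paths_to_iula = [h for h, (_, target) in HYPOTHESES.items() if target == "IULA"]
print(len(paths_to_iula))  # -> 7
```

Representing the model as an edge list also makes explicit that PI plays a dual role: it is the only construct with both a direct path to IULA (H5) and indirect paths via PE and EE (H6, H7).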

4. Research Method

4.1. Procedure and Participants

This study was conducted at four top Malaysian universities using purposive sampling. The universities were specifically chosen due to their status as comprehensive institutions with large populations of students in blended and online courses; they were also among the top 200 universities in the world according to international rankings. The survey targeted academic staff (lecturers or tutors) as well as students. The choice of these two groups as respondents stems from the study’s focus on “learning analytics” (LA) and its potential to improve teaching methods and student learning outcomes. Academic staff and students are primary users of LA tools, making their perspectives and intentions crucial to understanding the effective implementation of LA dashboards. The inclusion of both groups allows for a comprehensive exploration of the factors influencing LA usage and adoption, providing diverse perspectives from the teaching and learning angles.
Respondents were encouraged to participate if they had experience using LA technologies for educational purposes. The questionnaire was administered online via email and social media platforms such as Facebook. Over 3 months, a total of 209 complete questionnaires were collected and used for the data analysis. After excluding 120 participants without prior LA experience from the main analysis, the overall response rate was 63.5% (209 of 329).
The sample size for this study was established following the rule of thumb proposed in [84], which recommends a minimum sample size of 10 times the largest number of structural paths directed at any single construct in the structural model. As a maximum of seven structural paths pointed at LA adoption in this study, a minimum of 70 respondents was needed. The sample of 209 therefore exceeded the recommended minimum and is highly adequate for validating the developed model. As Table 2 shows, the majority of the respondents were male (59.33%), and most were students (85.65%). Furthermore, most respondents (81.34%) believed they had sufficient skills in using technology.
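The 10-times rule described above can be expressed as a one-line check. This is an illustrative sketch only; the function name is hypothetical:

```python
def min_sample_size_10x(max_paths_to_construct: int) -> int:
    """The 10-times rule [84]: minimum N is 10 x the largest number of
    structural paths pointing at any single construct in the model."""
    return 10 * max_paths_to_construct

# In this model, IULA receives at most seven structural paths,
# so the minimum N is 70; the collected N = 209 comfortably exceeds it.
required = min_sample_size_10x(7)
assert 209 >= required
```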

4.2. Research Instruments

In this study, the research instrument was segmented into two main sections. The initial segment gathered personal information from the respondents. The second section comprised questions rated on a 5-point Likert-type scale, ranging from “strongly disagree = 1” to “strongly agree = 5”, for assessing the influence of EE, PE, SI, FC, PI, IQ, and SQ constructs on IULA. Questions related to EE, PE, SI, FC, and intention to use LA were adopted from [59]. For assessing PI, four questions were adopted from Agarwal and Prasad (1998). Four IQ items were adopted from [64,85,86], while three SQ items were adopted from [66,87] (see Appendix A).
Content validity was ensured by inviting four experts to confirm the simplicity and relevance of the scale items. Furthermore, a pilot study involving 70 respondents was carried out to evaluate the validity and reliability of the survey instrument. After incorporating suggested rewordings and modifications, the instrument was confirmed for use in large-scale data collection for the main study.

4.3. Research Design and Data Analysis

The cross-sectional study utilized an exploratory research design. “Partial Least Squares Structural Equation Modeling” (PLS-SEM) was used to assess and estimate relationships based on causal hypotheses and statistical data [88,89]. As a powerful statistical approach for assessing complex relationships, PLS-SEM is extensively employed in numerous social science fields, such as “Management Information Systems” [90], hospitality [91], and strategic management [92]. The method reconciles the apparent opposition between explanation and prediction, which forms the foundation for deriving managerial implications, a crucial aspect often emphasized in academic research [93]. PLS-SEM’s statistical power is particularly valuable for exploratory research that delves into less-developed or still-emerging theories [88]. Therefore, PLS-SEM was selected for the data analysis in this study, run in SmartPLS 4. In the following subsections, the measurement model is assessed first, and the acceptance or rejection of the hypotheses is then evaluated in the structural model.

5. Results

5.1. Common Method Bias

Since self-reported data were utilized in the data collection process, there is a potential risk of “common-method bias”. To address this concern, “Harman’s one-factor test”, following the approach proposed in [94], was conducted on the factors included in the theoretical model. The first factor accounted for the largest share of covariance at 22.7%, falling below the recommended threshold of 50%, indicating that common-method bias is not a significant risk in the present study. Moreover, the “Variance Inflation Factor” (VIF) was computed to evaluate the extent of multicollinearity among the predictors employed in the regression model. The VIF values for the majority of predictors are close to 1, signifying low multicollinearity. Nevertheless, the VIFs for the independent variables PI, PE, and EE exceed the recommended threshold, which is plausibly a consequence of the relatively small sample size.
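The two diagnostics above can be sketched in NumPy. This is an illustrative re-implementation (the study itself used SmartPLS and standard statistical software); the function names are hypothetical, and the first-principal-component share serves as a proxy for the single-factor variance in Harman’s test:

```python
import numpy as np

def harman_first_factor_share(X):
    """Share of total variance captured by the first unrotated factor
    (first principal component), used as Harman's one-factor check."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)   # standardize the items
    eigvals = np.linalg.eigvalsh(np.cov(Z, rowvar=False))[::-1]
    return eigvals[0] / eigvals.sum()                  # should stay below 0.50

def vif(X):
    """VIF per predictor: regress it on the others, VIF = 1 / (1 - R^2)."""
    vifs = []
    for j in range(X.shape[1]):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(len(y)), others])  # intercept + other predictors
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1.0 - resid.var() / y.var()
        vifs.append(1.0 / (1.0 - r2))
    return np.array(vifs)
```

With uncorrelated predictors, every VIF sits near 1; values well above the conventional cut-offs (5 or 10) flag multicollinearity.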

5.2. Evaluation of the Measurement Model

The measurement model was evaluated using the following criteria: (1) indicator reliability, evaluated through outer loading values; (2) construct reliability, evaluated through “Composite Reliability” (CR) and “Cronbach’s alpha” (CA); and (3) convergent and discriminant validity, examined using the “Average Variance Extracted” (AVE), the Fornell–Larcker criterion, cross-loadings, and the “Heterotrait–Monotrait” ratio (HTMT) [84].
Outer loadings assess the strength of the association between each construct and its indicators. As Table 3 shows, the factor loadings ranged between 0.684 and 0.924, affirming strong construct validity [87]. For construct reliability, the CR and CA values must exceed 0.7. In this study, construct reliability was confirmed, with CR values ranging from 0.849 to 0.943 and CA coefficients ranging from 0.768 to 0.920, all within the acceptable range. The AVE was examined to establish convergent validity; as all constructs demonstrated values above 0.50, convergent validity was affirmed [95].
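The three reliability and validity statistics can be computed directly from the standardized outer loadings and item scores. The following is a minimal sketch with hypothetical function names, using the conventional formulas rather than the exact SmartPLS routines:

```python
import numpy as np

def composite_reliability(loadings):
    """CR = (sum L)^2 / ((sum L)^2 + sum(1 - L^2)) for standardized loadings L;
    values above 0.7 indicate acceptable construct reliability."""
    lam = np.asarray(loadings, dtype=float)
    s2 = lam.sum() ** 2
    return s2 / (s2 + (1.0 - lam**2).sum())

def average_variance_extracted(loadings):
    """AVE = mean of the squared standardized loadings; should exceed 0.50."""
    lam = np.asarray(loadings, dtype=float)
    return (lam**2).mean()

def cronbach_alpha(item_scores):
    """CA = k/(k-1) * (1 - sum of item variances / variance of summed score)."""
    X = np.asarray(item_scores, dtype=float)   # rows = respondents, cols = items
    k = X.shape[1]
    return k / (k - 1) * (1.0 - X.var(axis=0, ddof=1).sum()
                          / X.sum(axis=1).var(ddof=1))
```

For example, a construct whose three items all load at 0.8 yields AVE = 0.64 (above 0.50) and CR of roughly 0.84 (above 0.7).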
Discriminant validity ensures that no two constructs are highly correlated. In the model, any pair of constructs should have correlations that are lower than √AVE values. As depicted in Table 4, the √AVE values (diagonal in bold) for each factor are higher than the maximum correlation with any other factor in both the row and column [95]. Additionally, as demonstrated in Table 5, all elements of a construct exhibit higher factor loadings than the related cross-loadings in both rows and columns.
The “Heterotrait–Monotrait Ratio” (HTMT) was employed, as proposed in [96], to further validate discriminant validity; the conservative criterion requires HTMT values below the threshold of 0.85. As presented in Table 6, all HTMT values are less than 0.85, affirming the discriminant validity of the model.
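The two discriminant-validity checks can be sketched as follows. This is an illustrative NumPy re-implementation under the standard definitions, not the SmartPLS code, and the function names are hypothetical:

```python
import numpy as np

def htmt(Xi, Xj):
    """Heterotrait-Monotrait ratio for two constructs, given item-score
    matrices (rows = respondents, columns = items of that construct):
    mean between-construct item correlation divided by the geometric mean
    of the mean within-construct item correlations."""
    ki, kj = Xi.shape[1], Xj.shape[1]
    R = np.corrcoef(np.column_stack([Xi, Xj]), rowvar=False)
    hetero = R[:ki, ki:].mean()                              # between constructs
    mono_i = R[:ki, :ki][np.triu_indices(ki, k=1)].mean()    # within construct i
    mono_j = R[ki:, ki:][np.triu_indices(kj, k=1)].mean()    # within construct j
    return hetero / np.sqrt(mono_i * mono_j)                 # should stay < 0.85

def fornell_larcker_ok(sqrt_ave, correlations_with_others):
    """Fornell-Larcker check for one construct: sqrt(AVE) must exceed its
    highest correlation with any other construct in the model."""
    return sqrt_ave > max(abs(c) for c in correlations_with_others)
```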

5.3. Estimation of the Structural Model

In this step, the significance of the p-values and t-values for each hypothesized path was evaluated. Bootstrapping was conducted for this purpose. The model’s ability to predict the endogenous constructs was evaluated using R2 to determine its explanatory power. The results are detailed in Table 7 and Figure 2.
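A bare-bones version of this bootstrapping step, for a single path in the simplest one-predictor case (where the standardized slope equals the Pearson correlation), can be sketched as follows. This is an illustrative NumPy sketch, not the SmartPLS algorithm, and the function name is hypothetical:

```python
import numpy as np

def bootstrap_path(x, y, n_boot=2000, seed=42):
    """Bootstrap a standardized path coefficient: resample respondents
    with replacement, re-estimate the coefficient each time, and derive
    a standard error and t-value from the bootstrap distribution."""
    rng = np.random.default_rng(seed)
    n = len(x)
    est = np.corrcoef(x, y)[0, 1]          # point estimate of the path
    boots = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, n)        # resample respondents with replacement
        boots[b] = np.corrcoef(x[idx], y[idx])[0, 1]
    se = boots.std(ddof=1)                 # bootstrap standard error
    return est, se, est / se               # beta, SE, t-value
```

A |t| above roughly 1.96 then corresponds to a two-tailed p-value below 0.05.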
PE (β = 0.158, p-value = 0.032), SI (β = 0.157, p-value = 0.019), PI (β = 0.160, p-value = 0.007), IQ (β = 0.232, p-value = 0.000), and SQ (β = 0.139, p-value = 0.041) were found to have a positively significant effect on IULA, providing support for H1, H3, H5, H8, and H9, respectively. However, EE (β = −0.032, p-value = 0.615) and FC (β = 0.118, p-value = 0.069) were not found to have a significant impact on IULA, not providing support for H2 and H4. PI showed a positive influence on EE (β = 0.395, p-value = 0.000) and PE (β = 0.428, p-value = 0.000), supporting H7 and H6.
In summary, the developed model explained 45.6% of the variance in IULA, 15.2% of the variance in EE, and 17.9% of the variance in PE. Model fit was confirmed by evaluating the “standardized root mean square residual” (SRMR), whose value of 0.062 falls below the 0.08 threshold, indicating a good fit for the developed model [97].

6. Discussion

6.1. Key Findings

The findings indicated that users’ perceptions of performance expectancy, social influence, personal innovativeness, system quality, and information quality were positively linked to their intention to use “learning analytics” (LA). This affirms that when users perceive that utilizing LA will enhance their performance and are influenced by social factors, they are more inclined to adopt these tools. Previous research on LA use in higher education underscored the critical role of perceived usefulness as a primary determinant [50]. Additionally, [98] found that “perceived usefulness” significantly impacts the success of LA dashboards. Furthermore, both “perceived ease of use” and “perceived usefulness” showed positive correlations with satisfaction regarding LA dashboards [55].
The study highlighted the pivotal role of system quality and information quality in positively influencing users’ intention to utilize learning analytics (LA) dashboards. This observation is supported by [83], which illustrated that quality aspects such as technical systems, educational systems, instructors, support service, and course content quality exert a direct positive effect on students’ satisfaction, perceived usefulness, and the use of e-learning systems in higher education. Likewise, a different study found evidence supporting a positive association between the service quality of e-learning analytics and the system’s ease of use [99].
Unexpectedly, facilitating conditions and effort expectancy did not demonstrate a notable impact on users’ intention to use LA dashboards. This result may be attributed to the distinctive characteristics and technological familiarity of the sampled academic staff and university students in Malaysia. It is plausible that their prior experiences and comfort with technology have diminished the perceived effort typically associated with adopting new technological tools. Additionally, the provision of sufficient training and support likely equipped participants with the knowledge and skills needed to navigate LA dashboards effectively. These factors collectively might have mitigated the anticipated impact of facilitating conditions and effort expectancy on their intention to use LA dashboards.
The study also revealed a strong positive impact of personal innovativeness on both effort expectancy and performance expectancy, highlighting the role of individuals’ willingness to embrace innovation in shaping their engagement with LA. Likewise, a study [49] delved into variations in teachers’ utilization of LA, revealing that those identified as early adopters or “innovators” demonstrated notably heightened engagement during remote education weeks in comparison to their peers. This underscores the pivotal role of personal innovativeness in the adoption of LA tools. Overall, the findings provide crucial insights for educational institutions and stakeholders, emphasizing the significance of improving information quality and system quality and of fostering a culture of innovation to drive the effective adoption and utilization of LA tools in educational settings.

6.2. Managerial Implications

This study contributes valuable insights for academic instructors, LA tool developers, and administrators to understand crucial determinants when implementing LA tools in their institutions. Understanding the key factors that impact users’ behavioral intention to use LA tools is essential for the successful adoption and utilization of these tools in educational settings.
The significant positive effects of PE, SI, PI, IQ, and SQ on IULA emphasize the importance of these factors in driving users’ intention to use LA tools. LA tool developers should focus on enhancing these aspects to increase users’ willingness to adopt and use their tools. Specifically, the study highlights the importance of automation in analytical tool design to reduce human involvement and streamline decision-making processes, enhancing the effectiveness of LA tools.
Moreover, the findings underscore the significance of providing reliable, secure, user-friendly, and responsive LA tools. Developers should prioritize creating systems that deliver accurate and timely information in an easy-to-understand format, ensuring a positive user experience and encouraging tool adoption.
The influence of PI on PE, EE, and IULA suggests that individuals with a greater inclination to experiment with new technologies are more likely to adopt LA tools. Developers should consider designing LA tools that cater to the characteristics of innovative and adventurous users, making the tools engaging and inviting for experimentation.
However, the study also revealed that FC and EE did not have a significant impact on IULA. Understanding the reasons behind these findings and addressing any barriers to their influence on user intention could lead to improvements in LA tool design and implementation. Additionally, considering the specific context of developing nations and adapting LA tools to align with users’ familiarity and flexibility preferences is crucial for successful adoption.
Overall, this research provides actionable insights that can guide the development, marketing, and promotion of LA tools, ultimately contributing to the enhancement of educational experiences and outcomes in HEIs.

6.3. Theoretical Implications

This research formulated a comprehensive model based on the UTAUT to assess users’ intention to use “Learning Analytics” (LA) dashboards in HEIs. By extending UTAUT with the additional constructs of “Personal Innovativeness” (PI), “Information Quality” (IQ), and “System Quality” (SQ), the study encompassed both established technology acceptance factors and domain-specific variables relevant to the educational context. This integration allowed the study to uncover how these constructs collectively influence users’ intention to use LA dashboards, contributing to a more nuanced understanding of the key factors predicting LA adoption and usage within the educational ecosystem.
The empirical assessment of this model represents a notable contribution to the field, as there has been a lack of extensive empirical studies evaluating theoretical models concerning LA dashboard usage. The outcomes of this research are expected to shed light on users’ behavior towards LA, offering insights into the key factors that drive their intentions to use these tools.
The implications of this model are far-reaching. Educational institutions, instructors, and LA tool developers can leverage the insights gained from this study to strategize and enhance the adoption and usage of LA tools. By understanding the determinants outlined in the model, decision-makers can make informed choices regarding the selection and implementation of LA tools, aligning them with the specific needs and contexts of their institutions.
Moreover, this research lays the foundation for future studies aimed at developing a comprehensive theory of LA tool use. The identified significant variables and their interactions can be further explored and refined in subsequent research, advancing our understanding of the complex dynamics predicting the adoption and effective utilization of LA in educational settings.
The contributions discussed in the study have notable implications for sustainability within educational technology. A central strategy highlighted is the integration of automation in the design of LA tools. By automating analytical procedures and minimizing human intervention, this approach can potentially optimize resource utilization and operational efficiency, thereby reducing the environmental impact. Additionally, prioritizing user-friendliness, reliability, and responsiveness in LA tool design aligns with sustainable principles, ensuring long-term viability and scalability. Adapting LA tools to specific contexts, particularly in developing nations, is vital for their continued relevance and utilization, embodying a commitment to educational sustainability. Furthermore, a strong focus on data accuracy and quality within LA tools supports well-informed and sustainable decision-making in educational institutions. In summary, integrating sustainability principles in developing and implementing LA tools can significantly enhance their effectiveness, drive user adoption, and ultimately foster sustainable educational practices and outcomes.

7. Concluding Remarks

Compared to developing countries, developed economies are leaders in adopting and utilizing recent technologies to improve teaching and learning procedures. Learning analytics (LA) is a new trend, the utilization of which can bring advantages to educational institutions [56]. However, not much is known about the usefulness of LA tools or their effects on individual cognitive capacities, despite their predicted benefits for enterprises [16,100]. Due to obstacles and a lack of expertise, many higher education institutions might not be prepared to begin adopting LA [101]. Efforts that aim to validate theoretical models of the determinants influencing intention to use LA tools are limited. Therefore, this study aimed to propose and empirically validate a model for the intentions to use LA dashboards in Malaysian HEIs. This study drew upon the UTAUT as a theoretical base and three additional factors (PI, IQ, and SQ). Data were obtained from academic staff and students at selected universities in Malaysia.
The results revealed that the “Intention to Use Learning Analytics” (IULA) is shaped by the “Performance Expectancy” (PE), “Social Influence” (SI), “Information Quality” (IQ), and “System Quality” (SQ) factors. Personal innovativeness (PI) showed a direct positive influence on the IULA and an indirect influence through PE. However, “Effort Expectancy” (EE) and “Facilitating Conditions” (FC) did not show any significant impact on the IULA. The study provides robust implications for academia, as educational institutions in developing nations can enhance the implementation of LA tools by motivating users to acquire knowledge through flexible modules.

Limitations and Future Research Recommendations

While this study improves the existing understanding of the phenomenon, it also has some limitations. Firstly, the target respondents were drawn from selected top universities in Malaysia; future studies can replicate this research with data from a broader range of public and private universities, potentially in different countries, and provide an in-depth comparison of the findings, thereby enhancing the generalizability and applicability of the results. Secondly, it would be valuable to apply the developed model in other nations and under different circumstances.
A notable limitation of this study is the presence of multicollinearity, evident from the VIFs of the independent variables PI, PE, and EE surpassing the recommended threshold. To address multicollinearity resulting from a limited sample size, future research should prioritize expanding the sample to enhance the robustness and representativeness of the dataset. Additionally, interviews with professionals and exploration of the moderating and mediating roles of variables can further enrich the understanding of LA tool adoption. Further, the study considered both academic staff and student perspectives to explore the factors influencing LA usage and adoption comprehensively; however, not treating these perspectives as separate units of analysis is a limitation, as analyzing them separately may provide a more in-depth understanding of each viewpoint. Lastly, this study performed a statistical analysis using PLS-SEM; future studies can integrate statistical and multi-criteria decision-making approaches to obtain richer insights.

Author Contributions

Conceptualization, M.B.; methodology, M.B. and I.A.; software, N.F.M.A. and L.S.; validation, I.A. and M.B.; formal analysis, N.F.M.A. and L.S.; investigation, I.A.; resources, N.F.M.A. and L.S.; data curation, N.F.M.A. and L.S.; writing—original draft, I.A., M.B., N.F.M.A. and L.S.; writing—review and editing, I.A. and M.B.; visualization, N.F.M.A. and L.S.; supervision, M.B.; project administration, M.B. and I.A.; funding acquisition, I.A. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Institutional Review Board.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data will be available upon request.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Scale Items

Construct | Items | Source
PE“To carry out my tasks, the LA dashboard is useful.”[59]
“I believe that using the LA dashboard enables me to perform responsibilities faster.”
“My productivity would increase using the LA dashboard.”
“My performance would improve using the LA dashboard.”
EE“I suppose learning to perform the LA dashboard can be easy for me.”[59]
“Becoming skillful at using the LA dashboard would be easy for me.”
“I would discover the LA dashboard easy to use.”
“My interaction with the LA dashboard would be understandable and clear.”
SI“People who impact my feelings think that I should utilize the LA dashboard.”[59]
“My community members and organization support the use of the LA dashboard.”
“People who are important to me believe that I need to utilize the LA dashboard.”
FC“I have the required resources to utilize the LA dashboard.”[59]
“I have the required knowledge to utilize the LA dashboard.”
“For assistance with the LA dashboard, a specific individual is accessible.”
“Professional assistance is provided by the university to users of the LA dashboard through clear instructions and guides available on the website.”
PI“I am not doubtful to test new technologies.”[102]
“I am normally the first to test out new technologies compared to my friends.”
“I will seek methods to try if I know about new technology.”
“I enjoy experiencing new technologies.”
IQ“Information, which is relevant to my necessities, is acquired through the LA dashboard.”[64,85,86]
“The information produced through the LA dashboard is enough for my needs.”
“The yield information from the LA dashboard is clear.”
“The LA dashboard offers the information in a suitable structure.”
SQ“The LA dashboard provides ease of navigation.” [66,87]
“LA dashboards are well matched to the learning environment.”
“LA dashboard provides a flexible platform.”
IULA“In the following months, I intend to utilize the LA dashboard.”[59]
“In the following months, I expect I will use the LA dashboard.”
“In the following months, I plan to utilize the LA dashboard.”

References

  1. Banihashem, S.K.; Noroozi, O.; van Ginkel, S.; Macfadyen, L.P.; Biemans, H.J. A systematic review of the role of learning analytics in enhancing feedback practices in higher education. Educ. Res. Rev. 2022, 37, 100489. [Google Scholar] [CrossRef]
  2. Nunn, S.; Avella, J.T.; Kanai, T.; Kebritchi, M. Learning Analytics Methods, Benefits, and Challenges in Higher Education: A Systematic Literature Review. Online Learn. 2016, 20, 13–29. [Google Scholar] [CrossRef]
  3. Tsai, Y.-S.; Kovanović, V.; Gašević, D. Connecting the dots: An exploratory study on learning analytics adoption factors, experience, and priorities. Internet High. Educ. 2021, 50, 100794. [Google Scholar] [CrossRef]
  4. Fan, S.; Chen, L.; Nair, M.; Garg, S.; Yeom, S.; Kregor, G.; Yang, Y.; Wang, Y. Revealing Impact Factors on Student Engagement: Learning Analytics Adoption in Online and Blended Courses in Higher Education. Educ. Sci. 2021, 11, 608. [Google Scholar] [CrossRef]
  5. Dawson, S.; Poquet, O.; Colvin, C.; Rogers, T.; Pardo, A.; Gasevic, D. Rethinking Learning Analytics Adoption through Complexity Leadership Theory. In Proceedings of the 8th International Conference on Learning Analytics and Knowledge, Sydney, Australia, 7–9 March 2018; ACM: New York, NY, USA, 2018; pp. 236–244. [Google Scholar]
  6. Kohnke, L.; Foung, D.; Chen, J. Using Learner Analytics to Explore the Potential Contribution of Multimodal Formative Assessment to Academic Success in Higher Education. SAGE Open 2022, 12. [Google Scholar] [CrossRef]
  7. Herodotou, C.; Maguire, C.; Hlosta, M.; Mulholland, P. Predictive Learning Analytics and University Teachers: Usage and perceptions three years post implementation. In Proceedings of the LAK23: 13th International Learning Analytics and Knowledge Conference, Arlington, TX, USA, 13–17 March 2023; ACM: New York, NY, USA, 2023; pp. 68–78. [Google Scholar]
  8. Ali, L.; Asadi, M.; Gašević, D.; Jovanović, J.; Hatala, M. Factors influencing beliefs for adoption of a learning analytics tool: An empirical study. Comput. Educ. 2013, 62, 130–148. [Google Scholar] [CrossRef]
  9. Lim, L.-A.; Dawson, S.; Gašević, D.; Joksimović, S.; Fudge, A.; Pardo, A.; Gentili, S. Students’ sense-making of personalised feedback based on learning analytics. Australas. J. Educ. Technol. 2020, 36, 15–33. [Google Scholar] [CrossRef]
  10. Caspari-Sadeghi, S. Learning assessment in the age of big data: Learning analytics in higher education. Cogent Educ. 2023, 10, 2162697. [Google Scholar] [CrossRef]
  11. Viberg, O.; Hatakka, M.; Bälter, O.; Mavroudi, A. The current landscape of learning analytics in higher education. Comput. Hum. Behav. 2018, 89, 98–110. [Google Scholar] [CrossRef]
  12. Zilvinskis, J.; Willis, J.; Borden, V.M.H. An Overview of Learning Analytics. New Dir. High. Educ. 2017, 2017, 9–17. [Google Scholar] [CrossRef]
  13. Alzahrani, A.S.; Tsai, Y.-S.; Iqbal, S.; Marcos, P.M.M.; Scheffel, M.; Drachsler, H.; Kloos, C.D.; Aljohani, N.; Gasevic, D. Untangling connections between challenges in the adoption of learning analytics in higher education. Educ. Inf. Technol. 2022, 28, 4563–4595. [Google Scholar] [CrossRef] [PubMed]
  14. Sghir, N.; Adadi, A.; Lahmer, M. Recent advances in Predictive Learning Analytics: A decade systematic review (2012–2022). Educ. Inf. Technol. 2022, 28, 8299–8333. [Google Scholar] [CrossRef] [PubMed]
  15. Gasevic, D.; Tsai, Y.-S.; Dawson, S.; Pardo, A. How do we start? An approach to learning analytics adoption in higher education. Int. J. Inf. Learn. Technol. 2019, 36, 342–353. [Google Scholar] [CrossRef]
  16. Williamson, K.; Kizilcec, R. A Review of Learning Analytics Dashboard Research in Higher Education: Implications for Justice, Equity, Diversity, and Inclusion. In Proceedings of the LAK22: 12th International Learning Analytics and Knowledge Conference, Online, 21–25 March 2022; ACM: New York, NY, USA, 2022; pp. 260–270. [Google Scholar]
  17. Banihashem, K.; Macfadyen, L.P. Pedagogical Design: Bridging Learning Theory and Learning Analytics. Can. J. Learn. Technol. 2021, 47. [Google Scholar] [CrossRef]
  18. Gaftandzhieva, S.; Docheva, M.; Doneva, R. A comprehensive approach to learning analytics in Bulgarian school education. Educ. Inf. Technol. 2021, 26, 145–163. [Google Scholar] [CrossRef]
  19. Chatti, M.A.; Dyckhoff, A.L.; Schroeder, U.; Thüs, H. A reference model for learning analytics. Int. J. Technol. Enhanc. Learn. 2012, 4, 318–331. [Google Scholar] [CrossRef]
  20. Bodily, R.; Verbert, K. Review of Research on Student-Facing Learning Analytics Dashboards and Educational Recommender Systems. IEEE Trans. Learn. Technol. 2017, 10, 405–418. [Google Scholar] [CrossRef]
  21. Guerra, J.; Ortiz-Rojas, M.; Zúñiga-Prieto, M.A.; Scheihing, E.; Jiménez, A.; Broos, T.; De Laet, T.; Verbert, K. Adaptation and evaluation of a learning analytics dashboard to improve academic support at three Latin American universities. Br. J. Educ. Technol. 2020, 51, 973–1001. [Google Scholar] [CrossRef]
  22. Brown, M. Seeing students at scale: How faculty in large lecture courses act upon learning analytics dashboard data. Teach. High. Educ. 2020, 25, 384–400. [Google Scholar] [CrossRef]
  23. Kumar, S.R.; Hamid, S. Analysis of Learning Analytics in Higher Educational Institutions: A Review. In Proceedings of the Advances in Visual Informatics: 5th International Visual Informatics Conference, IVIC 2017, Bangi, Malaysia, 28–30 November 2017; pp. 185–196. [Google Scholar]
  24. Klein, C.; Lester, J.; Rangwala, H.; Johri, A. Learning Analytics Tools in Higher Education: Adoption at the Intersection of Institutional Commitment and Individual Action. Rev. High. Educ. 2019, 42, 565–593. [Google Scholar] [CrossRef]
  25. Herodotou, C.; Maguire, C.; McDowell, N.; Hlosta, M.; Boroowa, A. The engagement of university teachers with predictive learning analytics. Comput. Educ. 2021, 173, 104285. [Google Scholar] [CrossRef]
  26. West, D.; Tasir, Z.; Luzeckyj, A.; Na, K.S.; Toohey, D.; Abdullah, Z.; Searle, B.; Jumaat, N.F.; Price, R. Learning analytics experience among academics in Australia and Malaysia: A comparison. Australas. J. Educ. Technol. 2018, 34, 122–139. [Google Scholar] [CrossRef]
  27. Mokhtar, S.; Alshboul, J.A.Q.; Shahin, G.O.A. Towards Data-driven Education with Learning Analytics for Educator 4.0. J. Phys. Conf. Ser. 2019, 1339, 012079. [Google Scholar] [CrossRef]
  28. Zaki, N.A.A.; Zain, N.Z.M.; Noor, N.A.Z.M.; Hashim, H. Developing a Conceptual Model of Learning Analytics in Serious Games for STEM Education. J. Pendidik. IPA Indones. 2020, 9, 330–339. [Google Scholar] [CrossRef]
  29. Ismail, S.N.; Hamid, S.; Ahmad, M.; Alaboudi, A.; Jhanjhi, N. Exploring Students Engagement Towards the Learning Management System (LMS) Using Learning Analytics. Comput. Syst. Sci. Eng. 2021, 37, 73–87. [Google Scholar] [CrossRef]
  30. Mejia, C.; Florian, B.; Vatrapu, R.; Bull, S.; Gomez, S.; Fabregat, R. A Novel Web-Based Approach for Visualization and Inspection of Reading Difficulties on University Students. IEEE Trans. Learn. Technol. 2016, 10, 53–67. [Google Scholar] [CrossRef]
  31. Jivet, I.; Scheffel, M.; Drachsler, H.; Specht, M. Awareness Is Not Enough: Pitfalls of Learning Analytics Dashboards in the Educational Practice. In Proceedings of the Data Driven Approaches in Digital Education: 12th European Conference on Technology Enhanced Learning, EC-TEL 2017, Tallinn, Estonia, 12–15 September 2017; pp. 82–96. [Google Scholar]
  32. Dunlosky, J.; Rawson, K.A.; Marsh, E.J.; Nathan, M.J.; Willingham, D.T. Improving Students’ Learning With Effective Learning Techniques. Psychol. Sci. Public Interes. 2013, 14, 4–58. [Google Scholar] [CrossRef]
  33. Schwendimann, B.A.; Rodriguez-Triana, M.J.; Vozniuk, A.; Prieto, L.P.; Boroujeni, M.S.; Holzer, A.; Gillet, D.; Dillenbourg, P. Perceiving Learning at a Glance: A Systematic Literature Review of Learning Dashboard Research. IEEE Trans. Learn. Technol. 2016, 10, 30–41. [Google Scholar] [CrossRef]
  34. Arnold, K.E.; Pistilli, M.D. Course Signals at Purdue: Using Learning Analytics to Increase Student Success. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, Vancouver, BC, Canada, 29 April–2 May 2012; ACM: New York, NY, USA, 2012; pp. 267–270. [Google Scholar]
  35. Charleer, S.; Moere, A.V.; Klerkx, J.; Verbert, K.; De Laet, T. Learning Analytics Dashboards to Support Adviser-Student Dialogue. IEEE Trans. Learn. Technol. 2018, 11, 389–399. [Google Scholar] [CrossRef]
  36. Beheshitha, S.S.; Hatala, M.; Gašević, D.; Joksimović, S. The Role of Achievement Goal Orientations When Studying Effect of Learning Analytics Visualizations. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge—LAK’16, Edinburgh, UK, 25–29 April 2016; ACM Press: New York, NY, USA, 2016; pp. 54–63. [Google Scholar]
  37. Shum, S.B.; Ferguson, R.; Martinez-Maldonado, R. Human-Centred Learning Analytics. J. Learn. Anal. 2019, 6. [Google Scholar] [CrossRef]
  38. Matcha, W.; Uzir, N.A.; Gasevic, D.; Pardo, A. A Systematic Review of Empirical Studies on Learning Analytics Dashboards: A Self-Regulated Learning Perspective. IEEE Trans. Learn. Technol. 2020, 13, 226–245. [Google Scholar] [CrossRef]
  39. de Freitas, S.; Gibson, D.; Du Plessis, C.; Halloran, P.; Williams, E.; Ambrose, M.; Dunwell, I.; Arnab, S. Foundations of dynamic learning analytics: Using university student data to increase retention. Br. J. Educ. Technol. 2014, 46, 1175–1188. [Google Scholar] [CrossRef]
  40. Verbert, K.; Duval, E.; Klerkx, J.; Govaerts, S.; Santos, J.L. Learning Analytics Dashboard Applications. Am. Behav. Sci. 2013, 57, 1500–1509. [Google Scholar] [CrossRef]
  41. Brouwer, N.; Bredeweg, B.; Latour, S.; Berg, A.; van der Huizen, G. Learning Analytics Pilot with Coach2—Searching for Effective Mirroring. In Proceedings of the Adaptive and Adaptable Learning: 11th European Conference on Technology Enhanced Learning, EC-TEL 2016, Lyon, France, 13–16 September 2016; pp. 363–369. [Google Scholar]
  42. Few, S. Information Dashboard Design: The Effective Visual Communication of Data; O’Reilly Media: Sebastopol, CA, USA, 2006. [Google Scholar]
  43. Verbert, K.; Govaerts, S.; Duval, E.; Santos, J.L.; Van Assche, F.; Parra, G.; Klerkx, J. Learning dashboards: An overview and future research opportunities. Pers. Ubiquitous Comput. 2013, 18, 1499–1514. [Google Scholar] [CrossRef]
  44. Gašević, D.; Kovanović, V.; Joksimović, S. Piecing the learning analytics puzzle: A consolidated model of a field of research and practice. Learn. Res. Pract. 2017, 3, 63–78. [Google Scholar] [CrossRef]
  45. Márquez, L.; Henríquez, V.; Chevreux, H.; Scheihing, E.; Guerra, J. Adoption of learning analytics in higher education institutions: A systematic literature review. Br. J. Educ. Technol. 2023. [Google Scholar] [CrossRef]
  46. Johar, N.A.; Na Kew, S.; Tasir, Z.; Koh, E. Learning Analytics on Student Engagement to Enhance Students’ Learning Performance: A Systematic Review. Sustainability 2023, 15, 7849. [Google Scholar] [CrossRef]
  47. Alzahrani, A.S.; Tsai, Y.-S.; Aljohani, N.; Whitelock-Wainwright, E.; Gasevic, D. Do teaching staff trust stakeholders and tools in learning analytics? A mixed methods study. Educ. Technol. Res. Dev. 2023, 71, 1471–1501. [Google Scholar] [CrossRef]
  48. Korir, M.; Slade, S.; Holmes, W.; Héliot, Y.; Rienties, B. Investigating the dimensions of students’ privacy concern in the collection, use and sharing of data for learning analytics. Comput. Hum. Behav. Rep. 2023, 9, 100262. [Google Scholar] [CrossRef]
  49. Lomos, C.; Luyten, J.W.; Kesting, F.; da Cunha, F.L. Explaining variation in teachers’ use of ICT: A learning analytics approach. Interact. Learn. Environ. 2023, 1–18. [Google Scholar] [CrossRef]
  50. Knobbout, J.; van der Stappen, E.; Versendaal, J.; van de Wetering, R. Supporting Learning Analytics Adoption: Evaluating the Learning Analytics Capability Model in a Real-World Setting. Appl. Sci. 2023, 13, 3236. [Google Scholar] [CrossRef]
  51. Zheng, L.; Kinshuk; Fan, Y.; Long, M. The impacts of the comprehensive learning analytics approach on learning performance in online collaborative learning. Educ. Inf. Technol. 2023. [Google Scholar] [CrossRef]
  52. Kaur, A.; Chahal, K.K. A learning analytics dashboard for data-driven recommendations on influences of non-cognitive factors in introductory programming. Educ. Inf. Technol. 2023. [Google Scholar] [CrossRef]
  53. Ouyang, F.; Wu, M.; Zheng, L.; Zhang, L.; Jiao, P. Integration of artificial intelligence performance prediction and learning analytics to improve student learning in online engineering course. Int. J. Educ. Technol. High. Educ. 2023, 20, 4. [Google Scholar] [CrossRef] [PubMed]
  54. Chen, L.; Geng, X.; Lu, M.; Shimada, A.; Yamada, M. How Students Use Learning Analytics Dashboards in Higher Education: A Learning Performance Perspective. SAGE Open 2023, 13. [Google Scholar] [CrossRef]
  55. Rienties, B.; Herodotou, C.; Olney, T.; Schencks, M.; Boroowa, A. Making Sense of Learning Analytics Dashboards: A Technology Acceptance Perspective of 95 Teachers. Int. Rev. Res. Open Distrib. Learn. 2018, 19. [Google Scholar] [CrossRef]
  56. Shorfuzzaman, M.; Hossain, M.S.; Nazir, A.; Muhammad, G.; Alamri, A. Harnessing the power of big data analytics in the cloud to support learning analytics in mobile learning environment. Comput. Hum. Behav. 2019, 92, 578–588. [Google Scholar] [CrossRef]
  57. Mavroudi, A.; Papadakis, S.; Ioannou, I. Teachers’ Views Regarding Learning Analytics Usage Based on the Technology Acceptance Model. TechTrends 2021, 65, 278–287. [Google Scholar] [CrossRef]
  58. Nistor, N.; Baltes, B.; Dascălu, M.; Mihăilă, D.; Smeaton, G.; Trăuşan-Matu, Ş. Participation in virtual academic communities of practice under the influence of technology acceptance and community factors. A learning analytics application. Comput. Hum. Behav. 2014, 34, 339–344. [Google Scholar] [CrossRef]
  59. Venkatesh, V.; Morris, M.G.; Davis, G.B.; Davis, F.D. User Acceptance of Information Technology: Toward a Unified View. MIS Q. 2003, 27, 425–478. [Google Scholar] [CrossRef]
  60. Blut, M.; Chong, A.Y.L.; Tsigna, Z.; Venkatesh, V. Meta-Analysis of the Unified Theory of Acceptance and Use of Technology (UTAUT): Challenging Its Validity and Charting a Research Agenda in the Red Ocean. J. Assoc. Inf. Syst. 2022, 23, 13–95. [Google Scholar] [CrossRef]
  61. Venkatesh, V. Adoption and use of AI tools: A research agenda grounded in UTAUT. Ann. Oper. Res. 2022, 308, 641–652. [Google Scholar] [CrossRef]
  62. Alghazi, S.S.; Wong, S.Y.; Kamsin, A.; Yadegaridehkordi, E.; Shuib, L. Towards Sustainable Mobile Learning: A Brief Review of the Factors Influencing Acceptance of the Use of Mobile Phones as Learning Tools. Sustainability 2020, 12, 10527. [Google Scholar] [CrossRef]
  63. Abbad, M.M.M. Using the UTAUT model to understand students’ usage of e-learning systems in developing countries. Educ. Inf. Technol. 2021, 26, 7205–7224. [Google Scholar] [CrossRef]
  64. Mohammadi, H. Investigating users’ perspectives on e-learning: An integration of TAM and IS success model. Comput. Hum. Behav. 2015, 45, 359–374. [Google Scholar] [CrossRef]
  65. Thongsri, N.; Shen, L.; Bao, Y. Investigating factors affecting learner’s perception toward online learning: Evidence from ClassStart application in Thailand. Behav. Inf. Technol. 2019, 38, 1243–1258. [Google Scholar] [CrossRef]
  66. Vanitha, P.S.; Alathur, S. Factors influencing E-learning adoption in India: Learners’ perspective. Educ. Inf. Technol. 2021, 26, 5199–5236. [Google Scholar] [CrossRef]
  67. Arpaci, I. A comparative study of the effects of cultural differences on the adoption of mobile learning. Br. J. Educ. Technol. 2015, 46, 699–712. [Google Scholar] [CrossRef]
  68. DeLone, W.H.; McLean, E.R. The DeLone and McLean Model of Information Systems Success: A Ten-Year Update. J. Manag. Inf. Syst. 2003, 19, 9–30. [Google Scholar] [CrossRef]
  69. Arpaci, I. An Investigation of the Relationship between University Students’ Innovativeness Profile and Their Academic Success in the Project Development Course. J. Entrep. Innov. Manag. 2018, 7, 79–95. [Google Scholar]
  70. Khuzairi, N.M.S.; Cob, Z.C.; Hilaluddin, T. Measuring Educator Satisfaction of Learning Analytics for Online Learning Systems in Malaysia; Springer International Publishing: Cham, Switzerland, 2022; pp. 382–391. [Google Scholar]
  71. DeLone, W.H.; McLean, E.R. Information Systems Success: The Quest for the Dependent Variable. Inf. Syst. Res. 1992, 3, 60–95. [Google Scholar] [CrossRef]
  72. Arpaci, I. Organizational Adoption of Mobile Communication Technologies; Middle East Technical University: Ankara, Turkey, 2013. [Google Scholar]
  73. Handoko, B.L. UTAUT 2 Model for Entrepreneurship Students on Adopting Technology. In Proceedings of the 2020 International Conference on Information Management and Technology (ICIMTech), Bandung, Indonesia, 13–14 August 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 191–196. [Google Scholar]
  74. Tseng, T.H.; Lin, S.; Wang, Y.-S.; Liu, H.-X. Investigating teachers’ adoption of MOOCs: The perspective of UTAUT2. Interact. Learn. Environ. 2022, 30, 635–650. [Google Scholar] [CrossRef]
  75. Abd Rahman, S.F.; Md Yunus, M.; Hashim, H. Applying UTAUT in Predicting ESL Lecturers Intention to Use Flipped Learning. Sustainability 2021, 13, 8571. [Google Scholar] [CrossRef]
  76. Al-Emran, M.; Arpaci, I.; Salloum, S.A. An empirical examination of continuous intention to use m-learning: An integrated model. Educ. Inf. Technol. 2020, 25, 2899–2918. [Google Scholar] [CrossRef]
  77. Arpaci, I. Antecedents and consequences of cloud computing adoption in education to achieve knowledge management. Comput. Hum. Behav. 2017, 70, 382–390. [Google Scholar] [CrossRef]
  78. Chen, J. Adoption of M-learning apps: A sequential mediation analysis and the moderating role of personal innovativeness in information technology. Comput. Hum. Behav. Rep. 2022, 8, 100237. [Google Scholar] [CrossRef]
  79. Arpaci, I. Predictors of financial sustainability for cryptocurrencies: An empirical study using a hybrid SEM-ANN approach. Technol. Forecast. Soc. Chang. 2023, 196, 122858. [Google Scholar] [CrossRef]
  80. Arpaci, I.; Masrek, M.N.; Al-Sharafi, M.A.; Al-Emran, M. Evaluating the actual use of cloud computing in higher education through information management factors: A cross-cultural comparison. Educ. Inf. Technol. 2023, 28, 12089–12109. [Google Scholar] [CrossRef]
  81. Twum, K.K.; Ofori, D.; Keney, G.; Korang-Yeboah, B. Using the UTAUT, personal innovativeness and perceived financial cost to examine student’s intention to use E-learning. J. Sci. Technol. Policy Manag. 2022, 13, 713–737. [Google Scholar] [CrossRef]
  82. Lawson-Body, A.; Willoughby, L.; Lawson-Body, L.; Tamandja, E.M. Students’ acceptance of E-books: An application of UTAUT. J. Comput. Inf. Syst. 2020, 60, 256–267. [Google Scholar] [CrossRef]
  83. Al-Adwan, A.S.; Albelbisi, N.A.; Hujran, O.; Al-Rahmi, W.M.; Alkhalifah, A. Developing a Holistic Success Model for Sustainable E-Learning: A Structural Equation Modeling Approach. Sustainability 2021, 13, 9453. [Google Scholar] [CrossRef]
  84. Hair, J.F., Jr.; Hult, G.T.M.; Ringle, C.; Sarstedt, M. A Primer on Partial Least Squares Structural Equation Modeling (PLS-SEM), 2nd ed.; Sage Publications: Thousand Oaks, CA, USA, 2016; ISBN 1483377466. [Google Scholar]
  85. Alsabawy, A.Y.; Cater-Steel, A.; Soar, J. Determinants of perceived usefulness of e-learning systems. Comput. Hum. Behav. 2016, 64, 843–858. [Google Scholar] [CrossRef]
  86. Roca, J.C.; Chiu, C.-M.; Martínez, F.J. Understanding e-learning continuance intention: An extension of the Technology Acceptance Model. Int. J. Hum.-Comput. Stud. 2006, 64, 683–696. [Google Scholar] [CrossRef]
  87. Cidral, W.A.; Oliveira, T.; Di Felice, M.; Aparicio, M. E-learning success determinants: Brazilian empirical study. Comput. Educ. 2018, 122, 273–290. [Google Scholar] [CrossRef]
  88. Hair, J.F.; Risher, J.J.; Sarstedt, M.; Ringle, C.M. When to use and how to report the results of PLS-SEM. Eur. Bus. Rev. 2019, 31, 2–24. [Google Scholar] [CrossRef]
  89. Martins, C.; Oliveira, T.; Popovič, A. Understanding the Internet banking adoption: A unified theory of acceptance and use of technology and perceived risk application. Int. J. Inf. Manag. 2014, 34, 1–13. [Google Scholar] [CrossRef]
  90. Ringle, C.M.; Sarstedt, M.; Straub, D.W. Editor’s Comments: A Critical Look at the Use of PLS-SEM in “MIS Quarterly”. MIS Q. 2012, 36, iii. [Google Scholar] [CrossRef]
  91. Ali, F.; Rasoolimanesh, S.M.; Sarstedt, M.; Ringle, C.M.; Ryu, K. An assessment of the use of partial least squares structural equation modeling (PLS-SEM) in hospitality research. Int. J. Contemp. Hosp. Manag. 2018, 30, 514–538. [Google Scholar] [CrossRef]
  92. Hair, J.F.; Sarstedt, M.; Pieper, T.M.; Ringle, C.M. The Use of Partial Least Squares Structural Equation Modeling in Strategic Management Research: A Review of Past Practices and Recommendations for Future Applications. Long Range Plan. 2012, 45, 320–340. [Google Scholar] [CrossRef]
  93. Hair, J.F.; Sarstedt, M.; Ringle, C.M. Rethinking some of the rethinking of partial least squares. Eur. J. Mark. 2019, 53, 566–584. [Google Scholar] [CrossRef]
  94. Podsakoff, P.M.; Organ, D.W. Self-Reports in Organizational Research: Problems and Prospects. J. Manag. 1986, 12, 531–544. [Google Scholar] [CrossRef]
  95. Fornell, C.; Larcker, D.F. Evaluating structural equation models with unobservable variables and measurement error. J. Mark. Res. 1981, 18, 39–50. [Google Scholar] [CrossRef]
  96. Henseler, J.; Ringle, C.M.; Sarstedt, M. A new criterion for assessing discriminant validity in variance-based structural equation modeling. J. Acad. Mark. Sci. 2015, 43, 115–135. [Google Scholar] [CrossRef]
  97. Hu, L.T.; Bentler, P.M. Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Struct. Equ. Model. Multidiscip. J. 1999, 6, 1–55. [Google Scholar] [CrossRef]
  98. Park, Y.; Jo, I.-H. Factors that affect the success of learning analytics dashboards. Educ. Technol. Res. Dev. 2019, 67, 1547–1571. [Google Scholar] [CrossRef]
  99. Sandhu, K.; Alharbi, H. PLS Model Performance for Factors Influencing Student Acceptance of E-Learning Analytics Recommender. Int. J. Virtual Pers. Learn. Environ. 2020, 10, 1–14. [Google Scholar] [CrossRef]
  100. Nadj, M.; Maedche, A.; Schieder, C. The effect of interactive analytical dashboard features on situation awareness and task performance. Decis. Support Syst. 2020, 135, 113322. [Google Scholar] [CrossRef]
  101. Freitas, E.; Fonseca, F.; Garcia, V.; Ferreira, R.; Gasevic, D. Towards a Maturity Model for Learning Analytics Adoption: An Overview of Its Levels and Areas. In Proceedings of the 2020 IEEE 20th International Conference on Advanced Learning Technologies (ICALT), Tartu, Estonia, 6–9 July 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 122–126. [Google Scholar]
  102. Agarwal, R.; Prasad, J. A Conceptual and Operational Definition of Information Technology. Inf. Syst. Res. 1998, 9, 204–215. [Google Scholar] [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions, and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions, or products referred to in the content.
Figure 1. Conceptual framework.
Figure 2. Hypotheses testing results.
Table 1. Studies on LA Adoption in HEIs.

| Source | Model | Key Findings |
|---|---|---|
| [55] | TAM | PU and PEOU were positively associated with training satisfaction. |
| [56] | TAM | PU, PI, and PEOU were positively associated with BI. |
| [7] | UTAUT | Use of PLA was positively associated with PE, less experience, self-efficacy, attitudes, and low anxiety, while associated with lack of FC and low EE. |
| [57] | TAM | Teachers’ perceived usefulness was positively associated with attitudes toward use. |
| [58] | UTAUT | PE, EE, and SI were positively associated with BI. |
| [8] | TAM | PU and PEOU were positively associated with BI. |
Table 2. Respondents’ personal information.

| Information | Category | Frequency (Percentage) |
|---|---|---|
| Gender | Female | 85 (40.67%) |
| | Male | 124 (59.33%) |
| Current position | Lecturer | 30 (14.35%) |
| | Student | 179 (85.65%) |
| Age | Below 20 | 35 (16.75%) |
| | 21–30 years old | 89 (42.58%) |
| | Above 30 | 85 (40.67%) |
| Technology skills | Not bad | 9 (4.31%) |
| | Sufficient | 170 (81.34%) |
| | Very good | 30 (14.35%) |
Table 3. Validity and Reliability.

| Construct | Item | Factor Loading | CA | rho_A | CR | AVE |
|---|---|---|---|---|---|---|
| EE | EE1 | 0.684 | 0.768 | 0.801 | 0.849 | 0.585 |
| | EE2 | 0.825 | | | | |
| | EE3 | 0.802 | | | | |
| | EE4 | 0.741 | | | | |
| FC | FC1 | 0.793 | 0.863 | 0.870 | 0.907 | 0.710 |
| | FC2 | 0.882 | | | | |
| | FC3 | 0.882 | | | | |
| | FC4 | 0.808 | | | | |
| IQ | IQ1 | 0.890 | 0.909 | 0.909 | 0.936 | 0.785 |
| | IQ2 | 0.898 | | | | |
| | IQ3 | 0.883 | | | | |
| | IQ4 | 0.873 | | | | |
| PE | PE1 | 0.849 | 0.877 | 0.880 | 0.916 | 0.731 |
| | PE2 | 0.839 | | | | |
| | PE3 | 0.895 | | | | |
| | PE4 | 0.837 | | | | |
| PI | PI1 | 0.897 | 0.857 | 0.861 | 0.904 | 0.703 |
| | PI2 | 0.776 | | | | |
| | PI3 | 0.772 | | | | |
| | PI4 | 0.899 | | | | |
| SI | SI1 | 0.876 | 0.880 | 0.883 | 0.917 | 0.735 |
| | SI2 | 0.875 | | | | |
| | SI3 | 0.821 | | | | |
| | SI4 | 0.855 | | | | |
| SQ | SQ1 | 0.867 | 0.781 | 0.785 | 0.873 | 0.697 |
| | SQ2 | 0.868 | | | | |
| | SQ3 | 0.765 | | | | |
| IULA | IULA1 | 0.909 | 0.920 | 0.922 | 0.943 | 0.806 |
| | IULA2 | 0.897 | | | | |
| | IULA3 | 0.924 | | | | |
| | IULA4 | 0.860 | | | | |
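As a cross-check on Table 3, composite reliability (CR) and average variance extracted (AVE) can be recomputed directly from the reported standardized loadings. A minimal Python sketch using the EE construct’s loadings (the function names are ours, chosen for illustration):

```python
def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    squared_sum = sum(loadings) ** 2
    error = sum(1 - l ** 2 for l in loadings)
    return squared_sum / (squared_sum + error)

def average_variance_extracted(loadings):
    """AVE = mean of the squared standardized loadings."""
    return sum(l ** 2 for l in loadings) / len(loadings)

ee = [0.684, 0.825, 0.802, 0.741]  # EE1-EE4 loadings from Table 3
print(round(composite_reliability(ee), 3))       # 0.849, matching the CR column
print(round(average_variance_extracted(ee), 3))  # 0.585, matching the AVE column
```

The same two formulas reproduce the CR and AVE columns for every construct in Table 3.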
Table 4. Fornell–Larcker Criterion.

| | EE | FC | IQ | IULA | PE | PI | SI | SQ |
|---|---|---|---|---|---|---|---|---|
| EE | 0.765 | | | | | | | |
| FC | 0.369 | 0.842 | | | | | | |
| IQ | 0.548 | 0.452 | 0.886 | | | | | |
| IULA | 0.377 | 0.481 | 0.534 | 0.898 | | | | |
| PE | 0.459 | 0.515 | 0.530 | 0.524 | 0.855 | | | |
| PI | 0.389 | 0.267 | 0.375 | 0.417 | 0.423 | 0.838 | | |
| SI | 0.374 | 0.506 | 0.447 | 0.480 | 0.460 | 0.265 | 0.857 | |
| SQ | 0.349 | 0.483 | 0.390 | 0.456 | 0.426 | 0.323 | 0.399 | 0.835 |
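Under the Fornell–Larcker criterion [95], the diagonal of Table 4 is the square root of each construct’s AVE, and discriminant validity holds when it exceeds the construct’s correlations with all others. A short sketch confirming that the diagonal entries reproduce sqrt(AVE) from Table 3, within rounding of the reported three-decimal values:

```python
import math

# AVE values from Table 3 and diagonal entries from Table 4.
ave = {"EE": 0.585, "FC": 0.710, "IQ": 0.785, "IULA": 0.806,
       "PE": 0.731, "PI": 0.703, "SI": 0.735, "SQ": 0.697}
diagonal = {"EE": 0.765, "FC": 0.842, "IQ": 0.886, "IULA": 0.898,
            "PE": 0.855, "PI": 0.838, "SI": 0.857, "SQ": 0.835}

for construct, value in ave.items():
    # Each diagonal entry equals sqrt(AVE) up to rounding of the inputs.
    assert abs(math.sqrt(value) - diagonal[construct]) < 0.002
```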
Table 5. Cross-Loadings.

| Item | EE | FC | IQ | IULA | PE | PI | SI | SQ |
|---|---|---|---|---|---|---|---|---|
| EE1 | 0.684 | 0.311 | 0.415 | 0.180 | 0.226 | 0.226 | 0.224 | 0.237 |
| EE2 | 0.825 | 0.349 | 0.483 | 0.388 | 0.435 | 0.332 | 0.336 | 0.285 |
| EE3 | 0.802 | 0.301 | 0.448 | 0.305 | 0.393 | 0.357 | 0.286 | 0.307 |
| EE4 | 0.741 | 0.147 | 0.312 | 0.224 | 0.297 | 0.243 | 0.284 | 0.225 |
| FC1 | 0.336 | 0.793 | 0.439 | 0.407 | 0.487 | 0.246 | 0.513 | 0.406 |
| FC2 | 0.296 | 0.882 | 0.399 | 0.418 | 0.424 | 0.250 | 0.369 | 0.375 |
| FC3 | 0.338 | 0.882 | 0.420 | 0.441 | 0.458 | 0.236 | 0.406 | 0.461 |
| FC4 | 0.267 | 0.808 | 0.244 | 0.344 | 0.354 | 0.155 | 0.421 | 0.379 |
| IQ1 | 0.443 | 0.396 | 0.890 | 0.455 | 0.455 | 0.334 | 0.383 | 0.306 |
| IQ2 | 0.493 | 0.401 | 0.898 | 0.459 | 0.478 | 0.307 | 0.394 | 0.336 |
| IQ3 | 0.532 | 0.430 | 0.883 | 0.495 | 0.514 | 0.397 | 0.429 | 0.414 |
| IQ4 | 0.471 | 0.374 | 0.873 | 0.482 | 0.431 | 0.289 | 0.374 | 0.320 |
| IULA1 | 0.354 | 0.448 | 0.483 | 0.909 | 0.498 | 0.410 | 0.447 | 0.494 |
| IULA2 | 0.312 | 0.377 | 0.440 | 0.897 | 0.411 | 0.392 | 0.417 | 0.369 |
| IULA3 | 0.307 | 0.466 | 0.492 | 0.924 | 0.478 | 0.345 | 0.451 | 0.404 |
| IULA4 | 0.379 | 0.431 | 0.502 | 0.860 | 0.490 | 0.351 | 0.405 | 0.361 |
| PE1 | 0.404 | 0.457 | 0.523 | 0.460 | 0.849 | 0.389 | 0.426 | 0.368 |
| PE2 | 0.338 | 0.371 | 0.362 | 0.449 | 0.839 | 0.331 | 0.376 | 0.321 |
| PE3 | 0.440 | 0.447 | 0.502 | 0.470 | 0.895 | 0.387 | 0.403 | 0.365 |
| PE4 | 0.386 | 0.488 | 0.416 | 0.411 | 0.837 | 0.336 | 0.365 | 0.407 |
| PI1 | 0.344 | 0.209 | 0.305 | 0.400 | 0.339 | 0.899 | 0.226 | 0.244 |
| PI2 | 0.337 | 0.215 | 0.311 | 0.400 | 0.341 | 0.897 | 0.227 | 0.249 |
| PI3 | 0.267 | 0.209 | 0.315 | 0.320 | 0.380 | 0.776 | 0.195 | 0.269 |
| PI4 | 0.355 | 0.263 | 0.329 | 0.270 | 0.362 | 0.772 | 0.239 | 0.325 |
| SI1 | 0.303 | 0.465 | 0.383 | 0.442 | 0.394 | 0.280 | 0.876 | 0.370 |
| SI2 | 0.305 | 0.414 | 0.330 | 0.400 | 0.358 | 0.187 | 0.875 | 0.326 |
| SI3 | 0.315 | 0.364 | 0.420 | 0.375 | 0.343 | 0.199 | 0.821 | 0.279 |
| SI4 | 0.361 | 0.482 | 0.400 | 0.423 | 0.476 | 0.234 | 0.855 | 0.385 |
| SQ1 | 0.289 | 0.429 | 0.320 | 0.391 | 0.377 | 0.219 | 0.396 | 0.867 |
| SQ2 | 0.271 | 0.333 | 0.335 | 0.391 | 0.346 | 0.303 | 0.370 | 0.868 |
| SQ3 | 0.315 | 0.452 | 0.321 | 0.359 | 0.345 | 0.287 | 0.225 | 0.765 |
Table 6. HTMT.

| | EE | FC | IQ | IULA | PE | PI | SI | SQ |
|---|---|---|---|---|---|---|---|---|
| EE | | | | | | | | |
| FC | 0.440 | | | | | | | |
| IQ | 0.645 | 0.503 | | | | | | |
| IULA | 0.425 | 0.535 | 0.583 | | | | | |
| PE | 0.535 | 0.589 | 0.590 | 0.581 | | | | |
| PI | 0.465 | 0.308 | 0.426 | 0.468 | 0.489 | | | |
| SI | 0.448 | 0.580 | 0.500 | 0.531 | 0.521 | 0.303 | | |
| SQ | 0.446 | 0.590 | 0.462 | 0.536 | 0.517 | 0.399 | 0.476 | |
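The heterotrait–monotrait ratio (HTMT) in Table 6 is, per Henseler et al. [96], the mean correlation between items of different constructs divided by the geometric mean of the average within-construct (monotrait) item correlations. A minimal sketch of the computation for two constructs; the item correlation matrix below is illustrative, not the study’s raw data:

```python
import math
from itertools import combinations

def htmt(corr, items_a, items_b):
    """HTMT ratio for two constructs given an item-level correlation matrix."""
    hetero = [corr[i][j] for i in items_a for j in items_b]
    mono_a = [corr[i][j] for i, j in combinations(items_a, 2)]
    mono_b = [corr[i][j] for i, j in combinations(items_b, 2)]
    mean = lambda xs: sum(xs) / len(xs)
    return mean(hetero) / math.sqrt(mean(mono_a) * mean(mono_b))

# Toy matrix: items 0-1 measure construct A, items 2-3 measure construct B.
corr = [[1.0, 0.6, 0.3, 0.3],
        [0.6, 1.0, 0.3, 0.3],
        [0.3, 0.3, 1.0, 0.5],
        [0.3, 0.3, 0.5, 1.0]]
print(round(htmt(corr, [0, 1], [2, 3]), 3))  # 0.548
```

Since every value in Table 6 is below the conservative 0.85 threshold, discriminant validity is supported on this criterion as well.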
Table 7. The outcome of the hypotheses testing.

| Path | Mean | STDEV | t-Value | p-Value | VIF | Result |
|---|---|---|---|---|---|---|
| PE → IULA | 0.156 | 0.073 | 2.151 | 0.032 | 1.0437 | Supported |
| EE → IULA | −0.037 | 0.073 | 0.502 | 0.615 | 1.0005 | Rejected |
| SI → IULA | 0.160 | 0.068 | 2.349 | 0.019 | 1.0589 | Supported |
| FC → IULA | 0.118 | 0.065 | 1.819 | 0.069 | 1.0300 | Rejected |
| PI → IULA | 0.159 | 0.059 | 2.698 | 0.007 | 1.0789 | Supported |
| PI → PE | 0.423 | 0.063 | 6.679 | 0.000 | 5.0947 | Supported |
| PI → EE | 0.389 | 0.063 | 6.223 | 0.000 | 5.0140 | Supported |
| IQ → IULA | 0.233 | 0.065 | 3.590 | 0.000 | 1.2365 | Supported |
| SQ → IULA | 0.139 | 0.068 | 2.042 | 0.041 | 1.0513 | Supported |
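In PLS-SEM bootstrapping, the t-value is the path estimate divided by its bootstrap standard error, and the reported p-values are two-tailed. Note that the “Mean” column is the bootstrap mean rather than the original estimate, so Mean/STDEV only approximates the reported t (e.g., SI → IULA: 0.160/0.068 ≈ 2.35 vs. the reported 2.349). A minimal sketch of the p-value step, using a normal approximation that is reasonable for the large resample counts typical in PLS-SEM:

```python
import math

def two_tailed_p(t_value):
    # Two-tailed tail probability under the standard normal; the exact
    # value depends on the reference distribution's degrees of freedom.
    return math.erfc(abs(t_value) / math.sqrt(2))

print(round(two_tailed_p(2.151), 3))  # 0.031, close to the reported 0.032 for PE -> IULA
```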
Bahari, M.; Arpaci, I.; Azmi, N.F.M.; Shuib, L. Predicting the Intention to Use Learning Analytics for Academic Advising in Higher Education. Sustainability 2023, 15, 15190. https://doi.org/10.3390/su152115190