Article

Development and TAM-Based Validation of a User Experience Scale for Actual System Use in Online Courses

1. International College of Digital Innovation, Chiang Mai University, Chiang Mai 50200, Thailand
2. School of Information and Engineering, Sichuan Tourism University, Chengdu 610100, China
3. University of Stirling, Chengdu University, Chengdu 610106, China
* Authors to whom correspondence should be addressed.
Educ. Sci. 2025, 15(7), 855; https://doi.org/10.3390/educsci15070855
Submission received: 28 May 2025 / Revised: 21 June 2025 / Accepted: 30 June 2025 / Published: 3 July 2025

Abstract

This study aims to develop and validate a user experience scale to construct an Actual System Use model for online courses based on the Technology Acceptance Model, allowing for a comprehensive assessment of the multidimensional factors affecting Learning Outcomes and Actual System Use in the context of online courses. The scale includes six core dimensions: Interactive Experience, Content Quality, Learning Outcomes, Teaching Quality, Technical Support, and Learning Motivation. Through a literature review, pre-survey, exploratory factor analysis, and confirmatory factor analysis, the reliability and validity of the developed scale were verified. A second-order complex Structural Equation Model was used to measure users’ Actual System Use with respect to online courses. The results demonstrate that the Interactive Experience and Learning Motivation dimensions play crucial roles in enhancing learners’ engagement and learning satisfaction, while Perceived Usefulness and Perceived Ease of Use significantly influence system usage behaviors. This study provides a systematic theoretical basis and empirical data for the design of online courses, offering valuable insights for optimizing course design and enhancing user experiences.

1. Introduction

1.1. Problem Statement

With the rapid development of internet technology, online courses have become a mainstream and indispensable mode of instruction in primary, higher, and vocational education in recent years. Notably, 99% of higher education institutions have implemented learning management systems, with 85% of these systems being in active use (Al-Fraihat et al., 2020). Particularly in 2020, due to their flexibility, efficiency, and large-scale coverage, online courses became a key solution to meet the demands of remote learning. However, issues such as “building for the sake of building, disconnected from use” (Zhou et al., 2022), “supply–demand mismatch” (Cho et al., 2024), and “low performance outcomes” (Bailey et al., 2022) have become increasingly prominent, prompting researchers to consider how to scientifically assess the quality of online courses. The contradiction between the growth of online course resources and learners’ diverse needs remains evident: while the quantity of online course content on the internet has increased dramatically, the quality of this content often fails to meet the multi-level needs of different learners. Many online educational resources do not deliver a high-quality user experience (UE), thus failing to meet learners’ diverse cognitive, interactive, and emotional needs and leading to difficulties in user retention.

1.2. Research Objectives

UE refers to learners’ perceptions and experiences of the online course learning process and outcomes, serving as a key indicator of course quality (Dwivedi et al., 2020; Hlosta et al., 2022). Existing research on online course UE has been limited to discussions regarding its definition, influencing factors, and related aspects, mainly addressing questions such as “What is the online course user experience?” (Y. M. Cheng, 2020), “What dimensions does it include?” (Al-Mekhlafi et al., 2022), and “What factors influence the online course user experience?” (M. Yang et al., 2017). Thus, there is a lack of systematic theoretical analyses and empirical studies relating to the question of “How should the UE of online courses be measured?”, particularly regarding Online Course User Experience Scales and how UE impacts users’ continued usage behavior in the context of online courses.
This study focuses on three main objectives: first, to explore the dimensions that constitute the UE of online courses; second, to scientifically construct a user experience scale; and third, to validate the impacts of UE dimensions on Actual System Use.

1.3. User Experience

User experience (UE) refers to the overall perception and emotional response of users when interacting with a product or service, involving multiple dimensions such as usability, emotional experience, aesthetic value, interactivity, and contextual relevance (Norman, 2004; Hassenzahl, 2018). In recent years, researchers have proposed various theoretical frameworks—such as Hassenzahl’s model of functionality and pleasure and Norman’s three-level experience model—which provide important perspectives for understanding the complexity of UE. With the proliferation of online teaching, e-commerce, and mobile applications, UE research has gradually expanded from focusing on functional design to exploring approaches for personalization and cross-cultural differences, addressing the challenges of diverse user needs and contextual dependencies (Soares et al., 2021; Mei et al., 2025). Van Wart et al. (2020) conducted an exploratory factor analysis-based study and identified seven key success factors for online learning from the students’ perspective—including Basic Online Modality, Instructional Support, Teaching Presence, Cognitive Presence, Online Social Comfort, Online Interactive Modality, and Social Presence—followed by determination of their hierarchical significance. S. Wang et al. (2021) carried out research based on UE, exploring four dimensions: usefulness, ease of use, functionality, and aesthetics. Using the Delphi method and analytic hierarchy process, they conducted a comprehensive evaluation of 16 secondary indicators and concluded that the primary indicator “ease of operation” and the secondary indicator “convenience” are the most important factors affecting UE in the context of online courses. Tao et al. (2022) identified the key features of Massive Open Online Courses (MOOCs), emphasizing that factors such as perceived entertainment, perceived quality, and perceived usability are crucial considerations for online course designers. Table 1 briefly introduces the components used for measurement of different aspects of UE.
User experience research has gradually expanded into the field of online teaching, providing important theoretical support for optimizing course designs and improving user satisfaction (Norman, 2004; Hassenzahl, 2018; Hassenzahl et al., 2021). In this context, the Technology Acceptance Model (TAM) has become the theoretical foundation for studying UE in online courses. Proposed by Davis (1989), the TAM emphasizes the influence of Perceived Usefulness (PU) and Perceived Ease of Use (PEOU) on users’ willingness to use and their behaviors. This model has been widely applied in educational technology research to analyze how learners form a positive learning experience through their perceptions regarding the functionality and ease of use of platforms (Venkatesh & Davis, 2000). Many scholars have explored the relationship between UE and Actual System Use based on the TAM (as detailed in Table 2). Studies have found that PU and PEOU have direct impacts on UE, which in turn influences the actual usage of the system; for example, Bailey et al. investigated the application of the TAM in online teaching platforms and found that PU has a significant positive effect on students’ continued use of the system, while PEOU plays a moderating role in student experiences. Similar studies have also been performed in other fields; for example, H. Yang (2024) analyzed the TAM of UE with respect to healthcare platforms, and found that PU and PEOU are key factors in predicting users’ intention to continue using the system, with high-quality system design significantly enhancing user acceptance and the frequency of ASU.

1.4. Dimensions of User Experience

User experience scales are important tools for measuring the perceptions and emotions of users when interacting with products or services and are widely used in fields such as human–computer interaction, online teaching, and e-commerce. Scale designs typically include core dimensions such as usability, emotional experience, Learning Outcomes, and interactivity, and utilize methods such as the Likert scale for quantitative evaluation (Lai et al., 2022). For example, classic scales such as the System Usability Scale (SUS) (Vlachogianni & Tselios, 2022) and the User Experience Questionnaire (Laugwitz et al., 2008) provide standardized frameworks for evaluating UE. At the same time, new scales incorporating contextual adaptability and personalization dimensions have gradually been developed to address complex user needs (Su et al., 2011). Future research should focus on diversified scale designs and validation methods to improve the accuracy and applicability of UE assessments.
Interactive Experience (IE): The IE of online courses is an important dimension for measuring learners’ engagement and the quality of interaction in a digital learning environment. Learning freedom, the learning community, and learning collaboration reflect the flexibility of learners in terms of time and space, their depth of interaction with other learners or instructors, and their level of involvement in collaborative tasks, respectively.
Learning freedom is reflected in learners’ ability to flexibly arrange their study time and location, as well as to choose learning content based on personal goals or interests (Hung et al., 2010). This autonomy enhances the learner’s sense of control over the course, thereby improving their learning satisfaction and outcomes (Jung et al., 2019). Moreover, online courses further meet learners’ personalized needs through diverse module designs and resource offerings (Mamun et al., 2020).
The learning community serves as the basis for social interactions and emotional connections among learners (Yassine et al., 2022). It provides learners with a platform to share knowledge and promotes knowledge construction through discussion and collaborative activities (M. Wang et al., 2024). For example, through community exchanges and discussions, learners can extend their knowledge both within and beyond the course, as well as receiving emotional support and academic assistance from their peers (Y. Tang & Hew, 2022). These interactions help learners to overcome feelings of isolation and enhance their sense of belonging in the learning process.
Learning collaboration refers to learners working together through collaborative tasks or project completion to collectively achieve learning goals (Seifert & Bar-Tal, 2023). Group tasks and collaborative projects in online courses provide learners with opportunities to apply their knowledge practically and enhance their team collaboration skills (Vartiainen et al., 2022). Collaborative learning has been shown to have significant effects, promoting knowledge sharing and improving learning efficiency (Erkens & Bodemer, 2019). Additionally, through the platform’s diverse interactive features, learners can engage in chat, discussion forums, and after-class tutoring, thereby deepening their understanding and application of knowledge (Mansour, 2024). Timely feedback from instructors and the provision of after-class resources further facilitate the internalization of knowledge by learners (Da-Hong et al., 2020). Moreover, the platform’s multi-device support and the construction of learning communities enhance learners’ engagement and collaborative experiences (Nong et al., 2023). Learners can receive positive feedback through interactions, and this feedback mechanism plays a crucial role in maintaining motivation and solving problems (W.-S. Wang et al., 2024).
Content Quality (CQ) is one of the key dimensions of UE in online courses, involving the design of course content, the richness of resources, and the methods of presentation. High-quality content can significantly enhance learners’ satisfaction and Learning Outcomes (Du, 2023). Existing research on content and resources, knowledge presentation, and course presentation provides a theoretical foundation for further analysis of CQ on online teaching platforms.
Rich and high-quality course resources are key factors in enhancing the learning experience. Online course platforms support autonomous learning through the provision of diverse resources such as videos, course materials, case studies, and experiments (Wu et al., 2024). This diversity of resources can meet the personalized needs of learners, increasing engagement and learning efficiency (B. Liu & Yuan, 2024). Regular updating of these resources helps to reflect the latest developments in the subject area, maintaining the timeliness and attractiveness of the course (Shen, 2018).
The presentation of key concepts plays a crucial role in learners’ cognition and understanding. Presenting key concepts through multimedia formats such as charts, animations, and videos can enhance learners’ understanding of abstract concepts (Mayer, 2017). At the same time, through intuitive visual presentations, online courses not only improve the visualization of knowledge but also enhance engagement with and the memorability of the learning process (P. Tang et al., 2022).
The design of course presentation directly influences learners’ perceived quality and willingness to engage in learning. Well-designed course materials with vibrant color schemes help to attract learners’ attention and enhance their learning experience (Plass et al., 2014). Furthermore, appropriately balancing the difficulty of the course’s content and highlighting key points helps to effectively prevent learners from losing interest due to the content being too difficult or too simple (Coman et al., 2020). The instructor’s performance in the course (e.g., clear speech, moderate pacing, and appropriate attire) also significantly impacts the overall perceived quality of the course (Morris et al., 2019). In recent years, some studies have explored the roles of interactive features in enhancing CQ in online courses. For example, live chat and real-time Q&A features facilitate communication between instructors and students, enhancing learners’ immersion and interaction (Quadir & Yang, 2024). While this technology-enhanced presentation approach brings new vitality to traditional teaching, it also imposes higher demands on course design.
Learning Outcomes (LO) reflect the comprehensive improvement of learners in areas such as knowledge, skills, and interest. Learning Outcomes are influenced by multiple factors, including course design, learning tool support, and the individual engagement of learners (M. Wang et al., 2025).
Systematic course design helps learners to form a clear and structured understanding of knowledge (Theelen & van Breukelen, 2022). Modular content presentation and practical features such as case analysis and simulation experiments can significantly enhance the application of knowledge (Mei et al., 2023). Moreover, effective communication between learners and instructors further optimizes the knowledge absorption process (Castaneda et al., 2018). Online learning skills and a positive learning attitude are crucial for positive LO, with learners who efficiently use platform tools performing better in terms of knowledge acquisition and problem-solving (Zhu et al., 2020). Reflective learning also enhances efficiency and self-efficacy (Kuo et al., 2023). Learning interest—as the core intrinsic factor driving learning behavior—is stimulated through diverse content formats such as videos and animations (Mayer, 2017). The flexibility and efficiency of the course further promote learners’ motivation (Wu et al., 2024). Achieving positive LO depends on multiple factors, including CQ, learning attitude, platform support, and instructor feedback (Da-Hong et al., 2020). As such, these factors should be considered jointly to determine important directions for optimizing online courses.
Teaching Quality (TQ) in online courses is a key factor affecting learning experiences and outcomes. In recent years, research in this field has focused on three main areas—course structure and organization, teaching methods, and teaching assessment—exploring how to enhance the effectiveness of courses through optimization of their design and implementation (Haagen-Schützenhöfer & Hopf, 2020). Clear course objectives and a well-organized chapter structure provide a foundation for high-quality teaching (Oliveira et al., 2021). Modular course design helps learners to quickly adapt to the online learning environment, reduces cognitive load, and improves learning satisfaction and task completion efficiency (Theelen & van Breukelen, 2022). The logical structure and coherence of the course also help learners to form systematic knowledge (Jimoyiannis, 2010). The diversity of teaching methods is particularly important in online courses. The introduction of case-based teaching allows theoretical knowledge to be integrated with practical application, enhancing understanding and mastery (Wu et al., 2024); meanwhile, multimedia technologies (such as videos, animations, and interactive content) significantly increase engagement and motivation (Mayer, 2017). Experienced instructors can dynamically adjust the course content based on learners’ progress, thus enhancing the course’s effectiveness and the learners’ confidence (Morris et al., 2019). A comprehensive teaching assessment system ensures TQ through systematically evaluating the progress and outcomes of learners, enabling targeted course improvements (Da-Hong et al., 2020). Furthermore, assessments should include multidimensional feedback on the learning process, and regular course team evaluations can help to continually optimize the course’s content and teaching methods to meet learners’ evolving needs (Castaneda et al., 2018).
Technical Support (TS) is a crucial element in enhancing the learning experience and LO in online courses, encompassing three major areas: assessment methods, functional and technical environment, and after-class support. Assessment methods play a key role in improving LO. Dynamic assessment mechanisms, through recording and providing feedback on the learning process, help learners to clarify their learning progress and adjust their strategies accordingly (Yan et al., 2024). Diversified assessment formats (such as assignments, exams, quizzes, and group projects) not only help to determine LO, but also accurately reflect learners’ progress (Wei et al., 2021). Feedback-based assessments further consolidate knowledge and optimize learning methods (Howell, 2021). An effective assessment method should balance formative and summative evaluations, promoting continuous learning and skill development. A fully functional online platform ensures technical support: easy access, clear layout, and multi-device support enhance the learning experience and reduce technical barriers (Y. Liu et al., 2022). Reminder services (such as progress notifications and task reminders) help learners to manage their learning progress and improve their time management skills (Oreopoulos et al., 2022). The stability and functionality of the technical environment (such as real-time Q&A, data tracking, and knowledge visualization) further enhance the mastery of learning content (Gao & Li, 2024). After-class support ensures the continuity of learning, with assignments and exams reinforcing the application of knowledge, and timely feedback from instructors and online Q&A addressing knowledge gaps and improving learning strategies (Wei et al., 2021). Tools and resources (such as toolkits and answers) provide support for self-assessment and learning reinforcement, further promoting the internalization of knowledge (Sobaih et al., 2021).
Learning Motivation (LM) is a key driving factor for the success of online learning, determining the engagement, persistence, and LO of learners (C. Wang et al., 2022). Key factors influencing LM include emotional learning experiences, motivation stimulation, a sense of belonging and support, and course incentive mechanisms. Pleasant learning experiences and a sense of achievement can enhance learners’ interest and intrinsic motivation for the course, while incomplete tasks may trigger frustration and reduce their willingness to participate (Feng et al., 2024). During course design, task difficulty must be balanced to provide positive emotional experiences through goal achievement (Coman et al., 2020). Motivational tasks and modular design in online courses significantly stimulate LM, with clear task goals and attractive course content helping learners to maintain interest and motivation (Daniels et al., 2021). External rewards, such as points and badges, can also effectively motivate learners to actively participate (Xiao & Hew, 2024). A sense of belonging and support is essential for sustaining LM. Interactive design and the development of learning communities can alleviate feelings of isolation in online learning contexts, while group tasks foster a collective sense of achievement. Platform reminders and encouragement further stimulate the intrinsic motivation of learners (Zhang et al., 2024). Additionally, effective course incentive mechanisms motivate learners’ participation through fun and fairness (John et al., 2023).
This study identifies six primary indicators, each containing secondary indicators, with the dimensions and definitions outlined in Table A1. Combining the characteristics of online teaching and TAM theory, an Online Course User Experience Scale was developed to comprehensively assess the key factors influencing UE. The scale includes six core dimensions: Interactive Experience (IE), Content Quality (CQ), Learning Outcomes (LO), Teaching Quality (TQ), Technical Support (TS), and Learning Motivation (LM). With the six core dimensions as external variables; Perceived Usefulness (PU), Perceived Ease of Use (PEOU), and UE as internal variables; and Actual System Use (ASU) as the dependent variable, a second-order complex model of user system usage was constructed to measure user ASU behaviors in online courses. This study was carried out in several stages: first, through a literature review and theoretical analysis, the dimensions and theoretical foundation of the scale were clarified; second, the User Experience Scale was developed, and its item validity and applicability were tested through a pre-survey; third, the reliability, validity, and structural validity of the scale were verified through exploratory factor analysis (EFA) and confirmatory factor analysis (CFA); and, finally, a second-order complex Structural Equation Model (SEM) was used to measure user ASU in the context of online courses.

1.5. Dimensions of Actual System Use

Perceived Usefulness (PU) reflects a user’s subjective assessment of the extent to which a technology or system enhances their performance in real-world tasks (Scherer et al., 2019). In the context of online teaching, PU is manifested primarily in improvements in academic achievement, increases in task-completion efficiency, greater ease of knowledge acquisition, and overall enhancement of the learning experience. First, online instruction—owing to its flexible delivery modes and abundant resource support—is widely regarded as an effective means of boosting learner performance. Through offering personalized learning pathways and diverse pedagogical approaches, learners can review and consolidate the provided content more precisely, thereby achieving superior outcomes in examinations and assessments (Iglesias-Pradas et al., 2021). This effect is particularly pronounced in data-driven adaptive learning environments, where PU has been shown to correlate significantly with gains in learner performance (C. Jia et al., 2022). Second, the convenience of online teaching enables learners to complete learning tasks with minimal friction. Features such as multi-platform access, flexible scheduling, and on-demand resource retrieval substantially reduce the temporal and cognitive costs of study (H.-H. Yang & Su, 2017). Moreover, modular course designs and automated task reminders assist learners in planning and managing their workload more effectively, rendering the learning process smoother and less burdensome (Larmuseau et al., 2019). A third important dimension of PU is its impact on learning efficiency. By providing high-quality multimedia materials—such as instructional videos, animations, and case studies—online platforms can significantly shorten the learning time while enhancing comprehension (Anderson & Dron, 2011). Learners who exercise control over the pace of instruction are able to master concepts more rapidly, thereby elevating overall learning efficiency (Qu, 2021). For example, virtual laboratories and real-time case discussions help students to apply theoretical knowledge to practical scenarios, deepening and broadening their understanding (Alamri, 2022). Finally, PU plays a critical role in shaping learners’ intentions to continue using online teaching resources. Through increasing user satisfaction, PU exerts an indirect effect on LO and serves as a key driver for sustained engagement with online instructional formats (Al-Rahmi et al., 2021).
Perceived Ease of Use (PEOU) measures a user’s subjective perception regarding how effortless it is to operate a system or technology. In the context of online teaching, PEOU is reflected in the convenience of system operations, the efficiency of resource access, the intuitiveness of the user interface, and the human-centered design of features. First, the simplicity of system operations is a critical determinant of the learner’s UE and acceptance of technology. An easy-to-learn platform can substantially reduce the user’s cognitive load and increase their engagement by minimizing frustration during initial use (Silva, 2015). Interfaces with clear information architecture and streamlined navigation enable learners to familiarize themselves with functions quickly, thereby lowering the barrier to effective use (Jiang et al., 2022). Second, the efficiency of resource access facilitates the smooth functioning of online teaching systems. Platforms that respond rapidly to user requests not only boost learning efficiency, but also foster greater satisfaction and trust in the system (Yu & Xu, 2022). In particular, intelligent searches and automated download capabilities significantly enhance the resource retrieval experience by reducing wait times and simplifying workflows (Y. Jia & Zhang, 2021). Moreover, an intuitive interaction design greatly facilitates user operations, allowing learners to complete tasks without unnecessary complexity. Clean, uncluttered user interfaces help users to understand and engage with system features effortlessly (Yu & Xu, 2022). Humanized functionalities—such as progress tracking, automated reminders, and multi-device synchronization—further enrich the learning experience by enabling personalized study paths and effective task management (Alamri, 2022).
In summary, an online teaching system that combines operational simplicity, efficient resource access, an intuitive interface, and human-centric features can deliver a more seamless learning experience. Through the reduction of cognitive barriers, high PEOU enables learners to focus their cognitive resources on the content itself, thereby enhancing overall LO (Venkatesh & Bala, 2008).
UE directly influences the attitudes, engagement, and satisfaction of learners. Online courses—by virtue of their flexibility, diverse resources, and personalized learning support—are widely regarded as an effective mode of instruction. Research has indicated that, compared with traditional classroom settings, online teaching better accommodates learners’ schedules and needs, thereby significantly enhancing learning efficiency (Anderson & Dron, 2011). This is especially true for adult and working learners, for whom online instruction offers both flexibility and efficacy (Abedini et al., 2021). Learners’ positive attitudes toward online teaching are considered a critical component of UE. Studies have shown that the more favorable the attitudes of learners, the higher their engagement and LO (Ferrer et al., 2022). Features such as flipped-classroom formats and case-based instruction, together with interactive tools (e.g., discussion forums and real-time Q&A), effectively stimulate interest and motivation (J. Cheng et al., 2024).
Moreover, the rich content and personalized pathway design characteristic of online courses can meet the diverse needs of different learners. By selecting modules that align with their interests and goals, learners can pursue individualized objectives (Anderson & Dron, 2011). Instant access to a broad array of resources further reinforces a learner’s perception that their educational needs are being met (Y. Jia & Zhang, 2021). In terms of experiential quality, online teaching allows for the delivery of intuitive multimedia presentations—such as videos, animations, and interactive simulations—and fosters an enjoyable learning atmosphere (Anderson & Dron, 2011). This sense of enjoyment not only heightens learners’ interest, but also deepens their immersion and achievement (Al-Rahmi et al., 2021). Gamification elements and virtual-reality applications, for example, create engaging, game-like environments that markedly boost enjoyment and sustained usage intentions (Bai et al., 2020). Such enhancements in enjoyment play a pivotal role in promoting learners’ continued commitment and further cement the status of online teaching as an efficient learning tool.
Actual System Use (ASU) is a key behavioral variable in the TAM, directly reflecting a user’s acceptance of and reliance on a system. In an online teaching context, ASU is evidenced by the level of engagement of learners, their intention to continue using the platform, recommendation behaviors, and the decision to integrate online instruction into their daily learning plans.
Engagement lies at the heart of ASU and is jointly shaped by learner attitudes, platform functionality, and Content Quality (Chen et al., 2022). Online teaching platforms boost engagement by offering abundant learning materials, diversified interactive tools, and flexible scheduling. This engagement is reflected not only in the completion of course tasks, but also in learners’ autonomous exploration and proactive interactions (Bailey et al., 2022). Continuance intention is another critical indicator of ASU. Learners typically persist with an online teaching platform to satisfy their long-term learning needs, especially when it provides a wide variety of course options, personalized learning pathways, and a seamless learning experience (Venkatesh & Bala, 2008). Such intentions signify high recognition of the platform’s functionality and usefulness. Recommendation behaviors further illustrate a user’s trust and satisfaction with online teaching. When learners are willing to recommend a platform to others, it indicates their confidence in the LO, CQ, and TQ associated with the platform (Lee & Jung, 2021). Driven by positive user experiences and reliable support, recommendations serve as a vital mechanism for platform diffusion. Moreover, the choice of an online teaching platform as a preferred mode of learning is often motivated by its flexibility and convenience. This is particularly true for working professionals or interdisciplinary learners, for whom online teaching can efficiently meet their diverse learning requirements (Anderson & Dron, 2011). The incorporation of online learning into daily study routines marks a deep integration of the system into the user’s behavioral patterns and demonstrates sustained usage. Based on a high valuation of the benefits of online learning and clear awareness of their own educational needs, learners regard the platform as an essential tool for achieving their career goals and ongoing development (Aleixo et al., 2021).

2. Materials and Methods

2.1. Research Design

This study drew on Scale Development: Theory and Application by DeVellis and Thorpe (2021), followed the scientific scale development procedures outlined by Jebb et al. (2021, pp. 1995–2019), and employed a 7-point Likert scale (South et al., 2022) to construct the proposed instrument. Development of the scale was carried out in the following stages: Initial item generation, establishment of expert-based content validity, scale pretest and item analysis, EFA, CFA, reliability and validity assessment, and structural equation modeling.
The Credamo platform, which has a large user base, was used for survey data collection in this study. To ensure data quality and sample accuracy, precise distribution conditions were set on the platform: participation was restricted to university students (Generation Z, digital natives who value personalized, fast, and interactive learning experiences), and thresholds for “participant credit score” and “historical acceptance rate” were applied, with both required to exceed 70. Additionally, the platform allows a rejection rate of up to 35%, effectively preventing random or incomplete responses and ensuring the reliability and validity of the obtained data. Compensation for participants was determined based on the number of survey questions, with higher distribution requirements corresponding to higher service fees.

2.2. User Experience Scale Development

During scale construction, abstract dimensions were operationalized into observable indicators by systematically reviewing and adapting established domestic and international instruments for online course learning experiences. As a result, 6 first-order dimensions and 19 second-order dimensions were identified (Table A2). Based on their conceptual definitions, the final instrument comprised 64 items across 6 dimensions: IE (11 items), CQ (9 items), TQ (9 items), TS (10 items), LO (11 items), and LM (14 items). All items used a 7-point Likert scale (1 = strongly disagree; 7 = strongly agree). Demographic questions cover gender, age, education level, and cumulative online learning time.

2.3. User Actual System Use Scale Development

In this study, users’ ASU of online courses was measured across ten dimensions, with the specific items for each dimension presented in Table A3.

2.4. User Actual System Use Model Construction

This study is grounded in the TAM, but introduces an innovative modification by replacing its traditional “Attitude Toward Use” (ATU) and “Behavioral Intention” (BI) constructs with “User Experience” (UE). Recent studies have demonstrated that UE not only significantly influences initial technology acceptance, but also plays a critical role in enhancing long-term satisfaction and continuance intentions (Franque et al., 2020). While the classic TAM emphasizes PEOU and PU, the broader integrative construct of UE more comprehensively captures learners’ overall responses—encompassing affective reactions, cognitive appraisals, and behavioral feedback—when engaging with online instructional resources, thereby offering superior predictive power for sustained system usage (Zuo et al., 2024).
Embedding UE as a central variable within the TAM framework, this study aims to more precisely capture the multidimensional experiences that users undergo during their actual course interactions, thus enhancing the model’s practical applicability and explanatory strength. The resulting model of ASU for online courses (Figure 1) extends beyond the purely cognitive focus of the traditional TAM to include dimensions such as Content Quality, platform interaction support, learning ambiance, and personalized services—factors which have been shown to be pivotal in driving continued engagement with online teaching platforms. Accordingly, we propose UE as the core mediating variable within the improved TAM, better reflecting the genuine needs and sustained participation of learners in online learning environments.
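To make the structure of the modified model concrete, the sketch below expresses the measurement and structural relations in lavaan-style syntax, as accepted by the Python package semopy, for example. This is an illustration only: the authors specified their model in AMOS, and the item names shown (IE1, PU1, and so on) are hypothetical placeholders rather than the actual scale items.

```python
# Illustrative specification of the modified TAM described above, in lavaan-style
# syntax as accepted by the Python package semopy. Item names (IE1, PU1, ...) are
# hypothetical placeholders; the measurement part shows only two indicators per
# construct for brevity. The four regression lines at the end mirror hypotheses
# H1-1 to H10, with UE replacing the Attitude Toward Use and Behavioral Intention
# constructs of the classic TAM.
MODEL_DESC = """
IE =~ IE1 + IE2
CQ =~ CQ1 + CQ2
LO =~ LO1 + LO2
TQ =~ TQ1 + TQ2
TS =~ TS1 + TS2
LM =~ LM1 + LM2
PU =~ PU1 + PU2
PEOU =~ PEOU1 + PEOU2
UE =~ UE1 + UE2
ASU =~ ASU1 + ASU2
PU ~ IE + CQ + LO + TQ + TS + LM + PEOU
PEOU ~ IE + CQ + LO + TQ + TS + LM
UE ~ PU + PEOU
ASU ~ UE
"""
```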
Based on the actual system usage model constructed in this study, 14 hypotheses are proposed.
H1-1. 
Interactive Experience has a direct positive effect on Perceived Usefulness.
H1-2. 
Interactive Experience has a direct positive impact on Perceived Ease of Use.
H2-1. 
Content Quality has a direct positive impact on Perceived Usefulness.
H2-2. 
Content Quality has a direct positive impact on Perceived Ease of Use.
H3-1. 
Learning Outcomes have a direct positive impact on Perceived Usefulness.
H3-2. 
Learning Outcomes have a direct positive impact on Perceived Ease of Use.
H4-1. 
Teaching Quality has a direct positive impact on Perceived Usefulness.
H4-2. 
Teaching Quality has a direct positive impact on Perceived Ease of Use.
H5-1. 
Technical Support has a direct positive impact on Perceived Usefulness.
H5-2. 
Technical Support has a direct positive impact on Perceived Ease of Use.
H6-1. 
Learning Motivation has a direct positive effect on Perceived Usefulness.
H6-2. 
Learning Motivation has a direct positive effect on Perceived Ease of Use.
H7. 
Perceived Ease of Use has a direct positive impact on Perceived Usefulness.
H8. 
Perceived Ease of Use has a direct positive impact on User Experience.
H9. 
Perceived Usefulness has a direct positive impact on User Experience.
H10. 
User Experience has a direct positive impact on Actual System Use.

3. Results

3.1. Verification and Analysis of Online Course User Experience Scale

3.1.1. Expert Review for the Content Validity of the User Experience Scale

To ensure content validity (Park et al., 2007), five experts in educational technology (one full professor, two associate professors, and two lecturers) reviewed the scale. Their evaluation focused on: (1) clarity of item wording—identifying ambiguity, redundancy, or incomprehensibility; and (2) the degree to which each item represented its target dimension. Based on their feedback, the initial items were refined: expressions were rewritten in the first person, redundant items were deleted, and overlapping items were adjusted. The expert review confirmed the overall structure, yielding a 64-item pre-test scale.

3.1.2. Scale Pre-Test and Item Analysis

The pre-survey aimed to evaluate the scale’s reliability and validity. The questionnaire was designed and administered via the Credamo online platform. In the pre-survey stage, each subject received a reimbursement of RMB 5. Due to multiple restrictions (occupation restrictions, subject reputation restrictions, etc.), the platform charges a service fee of RMB 98 at this stage. A total of 171 copies were issued, 14 invalid responses were excluded, and 157 valid questionnaires were retained (effective rate = 91.8%), comprising 55 male and 102 female respondents. The data analysis proceeded as follows.
For data analysis and processing, SPSS 23.0 was used to perform item analysis via the Critical Ratio (CR) method, removing items with poor discrimination. First, total scores were computed and sorted; the lowest 25% (≤316 points) and highest 25% (≥356 points) of cases formed the low- and high-score groups, respectively. Independent-samples t-tests were then conducted to compare the mean item scores between these groups. Levene’s test for equality of variances and the mean comparisons showed that items IE11, CQ3, LO1, LO2, LO7, LO8, LO9, TQ6, TQ7, and LM1 were non-significant (p > 0.05), while all other items differed significantly, indicating good discriminative ability. Simultaneously, item homogeneity was examined via bivariate Pearson correlations with the total scale score. Although most items correlated with the overall score at the 0.01 level (two-tailed), items CQ3, CQ8, LO7, TQ6, TQ9, TS2, LM2, and LM4 had coefficients below 0.30, indicating insufficient homogeneity with the total scale.
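The item-discrimination and item-total procedures described above were run in SPSS 23.0. For readers who prefer a scriptable workflow, a minimal Python sketch is given below; the file name and item column prefixes are hypothetical placeholders, and a Welch t-test is used in place of the SPSS Levene-based choice between pooled and separate variances.

```python
# Minimal sketch of the Critical Ratio (CR) item analysis described above.
# Assumes a CSV of the 157 valid pre-test responses with one column per item.
import pandas as pd
from scipy import stats

df = pd.read_csv("pretest_responses.csv")                     # hypothetical file name
items = [c for c in df.columns if c[:2] in ("IE", "CQ", "LO", "TQ", "TS", "LM")]
df["total"] = df[items].sum(axis=1)

# Bottom and top 25% of total scores form the low- and high-score groups.
low_cut, high_cut = df["total"].quantile([0.25, 0.75])
low, high = df[df["total"] <= low_cut], df[df["total"] >= high_cut]

for item in items:
    t, p = stats.ttest_ind(high[item], low[item], equal_var=False)  # Welch t-test
    r = df[item].corr(df["total"])                                  # item-total correlation
    flag = "candidate for removal" if (p > 0.05 or r < 0.30) else ""
    print(f"{item}: t = {t:.2f}, p = {p:.3f}, item-total r = {r:.2f} {flag}")
```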
Reliability analysis of the pre-test scale showed that Cronbach’s α for each dimension exceeded 0.70, indicating acceptable reliability. Item analysis, however, revealed that deleting certain items would further improve the α value. Accordingly, items IE11, CQ3, CQ6, CQ8, LO1, LO7, LO8, LO9, LO11, TQ1, TQ6, TQ9, TS1, TS2, TS6, LM1, LM2, LM4, and LM5 were removed, yielding a final overall Cronbach’s α of 0.908 (Table 3).
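The reliability screening can likewise be reproduced with a short Cronbach’s alpha routine. The sketch below assumes the `df` and `items` objects from the previous sketch and reports alpha with each item removed in turn, mirroring the "alpha if item deleted" check described above.

```python
# Cronbach's alpha and "alpha if item deleted", assuming df[items] from the sketch above.
def cronbach_alpha(data):
    k = data.shape[1]
    item_var_sum = data.var(axis=0, ddof=1).sum()     # sum of item variances
    total_var = data.sum(axis=1).var(ddof=1)          # variance of the total score
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

print("overall alpha:", round(cronbach_alpha(df[items]), 3))
for item in items:
    alpha_without = cronbach_alpha(df[items].drop(columns=item))
    print(f"alpha if {item} deleted: {alpha_without:.3f}")
```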

3.1.3. Exploratory Factor Analysis

To examine the internal dimensional structure of the scale, EFA was employed. Prior to factor extraction, the suitability of data was assessed: a Kaiser–Meyer–Olkin (KMO) measure above 0.60 indicates adequacy for factor analysis, and Bartlett’s test of sphericity evaluates the inter-variable correlations; a significant p-value denotes that the correlation matrix is not an identity matrix, and that factor analysis is appropriate.
Using SPSS 23.0, the KMO value was 0.784 (>0.60) and Bartlett’s test was significant at p < 0.001, indicating the presence of common factors among variables and confirming that the data were highly suitable for factor analysis, thereby providing a solid foundation for subsequent analyses, as detailed in Table 4.
This study used expert consultation and a literature review to establish the scale’s theoretical dimensions and, having confirmed its content validity, conducted exploratory factor analyses at the dimension level. For each theoretical dimension, Principal Component Analysis (PCA) with varimax rotation was applied. After iterative refinement to remove 19 items (IE11, CQ3, CQ6, CQ8, LO1, LO7, LO8, LO9, LO11, TQ1, TQ6, TQ9, TS1, TS2, TS6, LM1, LM2, LM4, LM5), PCA extracted 13 factors which collectively explained 76.273% of the total variance. The first factor had the largest eigenvalue (9.254), accounting for 20.563% of the variance, with each subsequent factor contributing progressively less, indicating a robust factor structure that effectively reduced dimensionality and yielded clear variable groupings. Factors were named in accordance with the theoretical dimensions, and the final allocation of items to each factor is shown in Table A4.
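A scriptable counterpart to this EFA step (KMO, Bartlett’s test of sphericity, and principal-component extraction with varimax rotation) is sketched below using the Python factor_analyzer package. The reported analysis was performed in SPSS 23.0, and `df[items]` again refers to the hypothetical pre-test data frame from the earlier sketches.

```python
# Sampling adequacy, sphericity, and varimax-rotated principal-component extraction.
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_kmo, calculate_bartlett_sphericity

kmo_per_item, kmo_total = calculate_kmo(df[items])
chi_square, p_value = calculate_bartlett_sphericity(df[items])
print(f"KMO = {kmo_total:.3f}, Bartlett chi-square = {chi_square:.1f}, p = {p_value:.4f}")

fa = FactorAnalyzer(n_factors=13, rotation="varimax", method="principal")
fa.fit(df[items])
variance, proportional, cumulative = fa.get_factor_variance()
print(f"cumulative variance explained: {cumulative[-1]:.3f}")   # ~0.763 reported in the paper
loadings = fa.loadings_   # item-by-factor loading matrix used to name the factors
```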

3.1.4. Confirmatory Factor Analysis

After the pre-test, item analysis, and EFA, 45 items were retained to form the “Formal Online Course User Experience Scale.” The instrument was administered via Credamo, yielding 490 responses, of which 457 were valid (effective rate = 93.3%; 140 males and 317 females). The demographic characteristics of respondents to the formal survey are detailed in Table 5.
Reliability analysis of the 457 formal survey responses using SPSS 23.0 indicated that Cronbach’s α for each dimension exceeded 0.70, with an overall α = 0.931, demonstrating excellent internal consistency (as detailed in Table 6).
KMO and Bartlett’s tests were then conducted on the 457 formal survey responses using SPSS 23.0. The KMO measure was 0.784 (>0.60) and Bartlett’s test was significant at p < 0.001 (Table 7), indicating that the data were suitable for factor analysis.
Using AMOS 28.0, CFA was performed to assess the structural validity of the Online Course Learning Experience Scale. In the SEM (Figure 2), 45 items served as observed variables, 13 factors as first-order latent variables, and 5 factors as second-order latent constructs. A second-order model was specified in AMOS, and the data were imported for analysis. Maximum likelihood estimation was used for model identification and parameter estimation. Fit indices (Table 8) indicated that the model met the recommended thresholds for absolute fit (CMIN/DF, RMSEA), incremental fit (IFI, TLI, CFI), and parsimony fit (PNFI, PCFI), demonstrating satisfactory overall model fit.
The data in Table 9, Table A5, and Figure 2 indicate that all six constructs—IE, CQ, LO, TQ, TS, and LM—exhibited satisfactory reliability and validity. Specifically, the Average Variance Extracted (AVE) for each construct exceeded 0.50, indicating adequate convergent validity, and the Composite Reliability (C.R.) values were all above 0.70, confirming strong internal consistency. Although a small number of factor loadings fell below 0.70, the majority exceeded this threshold, further supporting the scale’s convergent validity. Moreover, all path loadings were significant, indicating that the latent variables strongly explain their respective measurement indicators. These results validate the model’s soundness and provide empirical support for further examination of the relationships among the dimensions.
As shown in Table 10, the results of the correlation analysis indicated that the correlation coefficients between all dimensions reached a significant level (p < 0.001), demonstrating significant positive relationships among the dimensions. Among these, the correlation between TQ and LO was the highest (0.593 ***), suggesting that TQ has a notably strong influence on LO. Additionally, LM showed high correlations with other dimensions, such as LO and TQ, highlighting the critical role of LM in the overall UE. Furthermore, the square root of the AVE for each dimension was greater than the correlation coefficients between the dimensions, indicating good discriminant validity among the dimensions. These results support the reliability and validity of the scale and reinforce its theoretical foundation.
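Convergent and discriminant validity here rest on two quantities computed from standardized CFA loadings: AVE and Composite Reliability, with the square root of AVE compared against inter-construct correlations (the Fornell–Larcker criterion). A minimal sketch follows; the loadings shown are hypothetical placeholders, not values from Table 9 or Table A5.

```python
# AVE, Composite Reliability, and sqrt(AVE) from standardized loadings of one construct.
import numpy as np

def ave_and_cr(loadings):
    lam = np.asarray(loadings, dtype=float)
    ave = np.mean(lam ** 2)                                         # Average Variance Extracted
    cr = lam.sum() ** 2 / (lam.sum() ** 2 + np.sum(1 - lam ** 2))   # Composite Reliability
    return ave, cr

ie_loadings = [0.78, 0.74, 0.81, 0.69]            # hypothetical standardized loadings for IE
ave, cr = ave_and_cr(ie_loadings)
print(f"AVE = {ave:.3f} (>0.50?), CR = {cr:.3f} (>0.70?), sqrt(AVE) = {ave ** 0.5:.3f}")
# Discriminant validity: sqrt(AVE) of each construct should exceed its correlations
# with every other construct (Fornell-Larcker criterion).
```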

3.2. Online Course Actual System Use Structural Equation Model Validation and Analysis

3.2.1. Expert Review for the Content Validity of the Actual System Use Scale

Five experts in the field of educational technology evaluated the scientific rigor and appropriateness of the scale items. Following their review, the experts endorsed the overall structure, and the number of items measuring ASU was finalized at 65.

3.2.2. Project Analysis

As the ASU scale was adapted from established measures, no factor analysis was performed. The Formal Online Course User Experience Scale combined with the ASU dimensions was administered via Credamo, yielding 457 valid responses. Item discrimination was assessed via the CR method: cases were sorted by total score, with the bottom 25% (≤274) forming the low group and the top 25% (≥332) the high group. Independent-samples t-tests revealed that items PU3, PEOU2, UE4, and ASU2 did not differ significantly (p > 0.05), whereas all other items showed significant high–low group differences, indicating good discrimination. Concurrently, bivariate Pearson correlations with the total scale score confirmed that, when items with coefficients below 0.30 were removed, the remaining items all correlated significantly with the overall score (p < 0.01).

3.2.3. Descriptive Statistics and Normality Test of Each Dimension Measurement Item of Formal Survey Data

Table A6 presents descriptive statistics and normality test results for each factor. Mean scores ranged from 4 to 5 on the 1–7 positive scale, indicating that participants’ performance across all dimensions was at a moderate level. Normality for each item was assessed according to skewness and kurtosis. Following Kline (1998), absolute skewness ≤ 3 and kurtosis ≤ 8 indicate approximate normality. All items met these criteria, confirming near-normal distributions.
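The skewness and kurtosis screening against Kline’s (1998) cut-offs can be expressed in a few lines. The sketch below assumes a hypothetical DataFrame `df_formal` holding the 457 formal responses; note that pandas reports excess kurtosis.

```python
# Normality screening per Kline (1998): |skewness| <= 3 and |kurtosis| <= 8.
for col in df_formal.columns:
    skew = df_formal[col].skew()
    kurt = df_formal[col].kurt()        # pandas returns excess kurtosis (normal = 0)
    ok = abs(skew) <= 3 and abs(kurt) <= 8
    print(f"{col}: skewness = {skew:.2f}, kurtosis = {kurt:.2f}, approx. normal: {ok}")
```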

3.2.4. Correlation Analysis

Exploratory Pearson correlation analysis was conducted to examine the relationships between multiple variables. As shown in Table 10, most dimensions exhibited significant positive correlations, with all coefficients being positive at the 0.01 significance level, indicating strong positive associations. For example, IE correlated positively with CQ, LO, and TQ (r > 0). In contrast, PU showed weaker correlations with other dimensions, with some (e.g., IE and TQ) even being negatively correlated (r < 0), indicating more complex relationships. Notably, PEOU and UE had a strong positive correlation (r = 0.616), suggesting a significant link between these two dimensions. Additionally, ASU showed strong positive correlations with PU, PEOU, and UE, indicating their considerable impact on user ASU.

3.2.5. Reliability and Validity Analysis

Reliability analysis of the 457 formal survey responses was conducted using SPSS 23.0. Cronbach’s α values for all dimensions exceeded 0.70, with the overall Cronbach’s α (=0.926) indicating good reliability (Table 11).
KMO and Bartlett’s test of sphericity were performed on the 457 formal survey responses using SPSS 23.0. The KMO value was 0.932 (>0.6) and Bartlett’s test was significant at p < 0.001 (Table 12), indicating that the data were suitable for factor analysis.

3.2.6. Structural Equation Modeling Analysis

The model fit was assessed, and the results showed that the model met the required thresholds for absolute fit indices (CMIN/DF = 1.564, RMSEA = 0.035), incremental fit indices (IFI = 0.934, TLI = 0.93, CFI = 0.934), and parsimony fit indices (PNFI = 0.796, PCFI = 0.888), indicating that the model demonstrates a good overall fit (Table 13).
Path relationship tests revealed several significant effects. Specifically, PEOU ← IE (p = 0.042) and PU ← TQ (p = 0.049) were significant, indicating that IE positively influences PEOU and TQ positively affects PU. Other significant paths, such as PEOU ← CQ, PU ← LM, and UE ← PU, highlighted the importance of these factors in the model. The path ASU ← UE (p < 0.01) revealed a strong and highly significant impact of UE on ASU. Overall, these paths demonstrated positive interactions and the roles of various factors in enhancing user experience, as detailed in Table 14 and Figure 3.
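The reported estimation was carried out in AMOS 28.0 with maximum likelihood. An equivalent open-source workflow can be sketched with the Python package semopy, reusing the illustrative MODEL_DESC string from Section 2.4 and a hypothetical data file; the numbers it would produce are not those reported above.

```python
# Illustrative SEM estimation with semopy (the reported analysis used AMOS 28.0).
import pandas as pd
import semopy

data = pd.read_csv("formal_survey.csv")       # hypothetical file of the 457 valid responses
model = semopy.Model(MODEL_DESC)              # MODEL_DESC: the sketch from Section 2.4
model.fit(data)                               # maximum-likelihood estimation by default
print(model.inspect())                        # path coefficients, standard errors, p-values
print(semopy.calc_stats(model).T)             # chi2/df, CFI, TLI, RMSEA, and related indices
```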

4. Discussion

This study constructed and validated the Online Course User Experience Scale to explore the multidimensional factors influencing online learning. Our findings are generally consistent with the existing literature, but also offer new perspectives.
Significant positive correlations were observed between IE and LM, TQ, and CQ. This aligns with Bailey et al. (2022), who found that the interactive features of online learning platforms effectively enhance learner engagement and satisfaction. Furthermore, the significance of subdimensions such as learning autonomy and collaborative learning supports previous research (Weinberger & Fischer, 2006) highlighting the importance of active participation in social interaction and collaborative learning for LO. However, while these factors showed significant relationships, some paths—especially between certain interaction dimensions and Teaching Quality—exhibited weaker effects, suggesting that future research could further explore the integration of interactive features and teaching methods.
PU and PEOU were found to significantly influence the system usage behaviors of learners, consistent with Davis’s (1989) TAM, where these variables are crucial for acceptance of technology. However, the impact of PU on LM is more complex, with weaker positive associations in some paths, aligning with the results of Venkatesh and Bala (2008). Whereas the TAM has traditionally focused on cognitive factors, it was expanded in this study by introducing user experience as a construct that mediates the effects of PU and PEOU on Actual System Use, offering new perspectives for extending the TAM.
Motivation plays a central role in promoting engagement with and the continued use of online platforms by learners; this is consistent with H. Yang (2024), who found that motivation significantly impacts online learning engagement and outcomes. Our study further reveals that LM is driven not only by content appeal, but also by learning support and community belonging. This finding reinforces the idea that task design and platform interaction mechanisms strongly influence motivation, supporting the role of gamification in enhancing motivation for learning (Bai et al., 2020).
The innovation of this study lies in its use of the TAM to model online course UE, while introducing a multidimensional framework for UE that emphasizes the combined effects of factors such as IE, CQ, and TS. For example, Quadir and Yang (2024) found that interactive features in online teaching platforms enhance learners’ immersion and engagement, and our study further confirmed their impacts on LM and continued usage behaviors. Additionally, the EFA and structural equation modeling results validated the internal relationships between these dimensions, deepening our understanding of online course learning experiences.
Despite providing valuable insights into online course UE, this study had some limitations. First, the sample primarily consisted of university students from a specific region, which may limit the generalizability of the results due to regional and cultural differences. As such, future research could consider cross-cultural or cross-regional comparisons. Second, while a comprehensive user experience scale was developed, further exploration of personalized learning experiences and long-term tracking of user behaviors are needed. Finally, with the continuous development of technology, future studies could incorporate AI techniques and big data analytics to explore how intelligent recommendation systems can further optimize learning experiences and improve academic performance.

5. Conclusions

This study developed and validated the Online Course User Experience Scale, allowing for a comprehensive analysis of key factors in the context of online learning. The findings demonstrated that IE, CQ, LM, TQ, and PU are the main factors affecting learners’ online course experiences and continued usage behaviors. Notably, LM and IE were found to play significant roles in enhancing learner engagement and satisfaction.
Compared to previous studies, our research extended the TAM by introducing a more comprehensive UE framework and validating the relationships between these dimensions. These results offer theoretical insights for the design of online teaching platforms, particularly in terms of enhancing learner interactions, improving the quality of content, and providing personalized support.
Future research could explore how emerging technologies, such as AI and big data analytics, can help to further optimize personalized online course experiences, while also considering differences in UE across cultural contexts, which may inform broader theoretical and practical guidance for the development of global online teaching platforms.

Author Contributions

Conceptualization, M.W. and S.S.R.; methodology, M.W.; software, M.W.; validation, M.W., S.S.R. and A.Y.D.; formal analysis, M.W.; investigation, M.W.; resources, M.W.; data curation, M.W.; writing—original draft preparation, M.W.; writing—review and editing, X.Y.; visualization, S.S.R.; supervision, X.Y.; project administration, S.S.R.; funding acquisition, X.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Sichuan Tourism University Applied Brand Course, grant number ZL2024004; Sichuan Education Informatization and Big Data Center, grant number DSJZXKT241; Sichuan Food Development Research Center, grant number CC22G13; University Computer Course Teaching Steering Committee of the Ministry of Education, grant number AIGE-202409; and National Association for Basic Computer Education in Higher Education, grant number 2023-AFCEC-062. The APC was funded by Sichuan Tourism University Applied Brand Course.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Institutional Review Board (or Ethics Committee) of Chiang Mai University Research Ethics Committee (CMUREC) (CMUREC 68/015 and 17 February 2025).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Original data can be obtained, upon request, from the corresponding author.

Acknowledgments

We acknowledge the valuable efforts of the student assistants during data collection.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

TAM: Technology Acceptance Model
UE: User experience
ASU: Actual System Use
CQ: Content Quality
TQ: Teaching Quality
TS: Technical Support
LO: Learning Outcomes
LM: Learning Motivation
PU: Perceived Usefulness
PEOU: Perceived Ease of Use
SEM: Structural Equation Model
IE: Interactive Experience
CFA: Confirmatory factor analysis
EFA: Exploratory factor analysis
C.R.: Composite Reliability
CR: Critical Ratio
KMO: Kaiser–Meyer–Olkin
AVE: Average Variance Extracted

Appendix A

Table A1. Construct dimensions and definitions of Online Course User Experience Scale.
First-Level Indicator | Second-Level Indicator | Definition | References
Interactive Experience (IE)Degrees of Learning Freedom (DOLF)Refers to the extent to which an online course enables learners to freely choose the time, place, mode, and content or learning pathways of their participation, thus reflecting its capacity to meet individualized needs.(Hung et al., 2010; Jung et al., 2019; Mamun et al., 2020)
Learning Community (LC)Interaction refers to communication, discussion, and social exchange among instructors, learning facilitators, and peers, emphasizing connection-building to enrich the learning experience.(Yassine et al., 2022; M. Wang et al., 2024; Y. Tang & Hew, 2022)
Learning to Collaborate (LTC)Refers to leveraging on- and off-platform tools (e.g., social media) to facilitate instructor–learner cooperation and emotional support, fostering resource sharing and deep engagement in the learning process.(Seifert & Bar-Tal, 2023; Vartiainen et al., 2022; Erkens & Bodemer, 2019; Mansour, 2024; Da-Hong et al., 2020; Nong et al., 2023; W.-S. Wang et al., 2024)
Content Quality (CQ)Content and Resources (CAR)Refers to the accuracy and diversity of an online course’s learning materials and resources, ensuring they meet learners’ needs and facilitate knowledge acquisition.(Wu et al., 2024; B. Liu & Yuan, 2024; Shen, 2018)
Knowledge Presentation (KP)Content coverage and depth denotes the scope and rigor of course topics, the practicality of case examples, and the extent to which presentation formats support real-world applications.(Mayer, 2017; P. Tang et al., 2022)
Course Presentation (CP)Refers to the visual appeal, clarity, and appropriateness of design of course videos and materials, as well as the instructor’s on-screen delivery (e.g., verbal expression and presence).(Plass et al., 2014; Coman et al., 2020; Morris et al., 2019; Quadir & Yang, 2024)
Learning Outcomes (LO)Knowledge, Ability and Interest (KAAI)Refers to the course’s ability to help students to acquire new skills or knowledge and stimulate their interest in the subject.(Theelen & van Breukelen, 2022; Mei et al., 2023; Castaneda et al., 2018; Zhu et al., 2020; Kuo et al., 2023; Mayer, 2017; Da-Hong et al., 2020)
Online Learning Skills (OLS)Refers to learners’ ability to solve problems, collaborate, and effectively use the platform’s tools in an online environment.
Online Learning Attitude (OLA)Refers to learners’ confidence in completing online learning tasks and their recognition of the effectiveness of online learning.
Teaching Quality (TQ)Course Structure and Organization (CSAO)Refers to the logical structure, clear objectives, and the course’s ability to help learners achieve their learning goals.(Haagen-Schützenhöfer & Hopf, 2020; Oliveira et al., 2021; Theelen & van Breukelen, 2022; Jimoyiannis, 2010; Wu et al., 2024; Mayer, 2017; Morris et al., 2019; Da-Hong et al., 2020; Castaneda et al., 2018)
Teaching Means (TM)Refers to the clarity, organization, and richness of the explanation, as well as the effectiveness of multimedia technology used.
Teaching Evaluation (TE)Refers to the evaluation of teaching organization, student learning progress, and performance, along with providing effective feedback support.
Technical Support (TS)Assessment Format (AF)Refers to the platform’s provision of multiple, diverse, and effective evaluation methods that comprehensively reflect learning outcomes and promote knowledge retention.(Yan et al., 2024; Wei et al., 2021; Howell, 2021; Y. Liu et al., 2022; Gao & Li, 2024; Wei et al., 2021)
Functional and Technical Environment (FATE)Refers to the interface design, completeness of features, responsiveness of technical support, and platform stability.
After-school Support (AS)Refers to effective services such as tutoring, assignment feedback, and Q&A, which help students to consolidate the learned content.
Learning Motivation (LM)Learning Emotional Experience (LEE)Refers to students’ emotional responses during the learning process, including positive emotions (e.g., accomplishment) and negative emotions (e.g., frustration).(C. Wang et al., 2022; Feng et al., 2024; Coman et al., 2020; Daniels et al., 2021; Xiao & Hew, 2024; Zhang et al., 2024; John et al., 2023)
Learning Motivation Elicitation (LME)Refers to the course design’s ability to stimulate learners’ intrinsic interest and extrinsic motivation (e.g., points, reward systems).
Sense of Belonging and Support (SOBAS)Refers to the emotional connection and support that learners perceive from the platform, instructors, or peers during the learning process.
Course Incentive Mechanism (CIM)Refers to the platform’s use of mechanisms such as points and rankings to effectively and engagingly motivate learner participation.
Table A2. User experience measurement items for online courses.
Index N. | 1-Level Indicator | Serial N. | Item | References
1IEIE-1Online courses allow me to flexibly schedule learning time and location.(Hung et al., 2010; Jung et al., 2019; Mamun et al., 2020; Yassine et al., 2022; M. Wang et al., 2024; Y. Tang & Hew, 2022; Seifert & Bar-Tal, 2023; Vartiainen et al., 2022; Erkens & Bodemer, 2019; Mansour, 2024; Da-Hong et al., 2020; Nong et al., 2023; W.-S. Wang et al., 2024)
2IE-2Online courses allow me to choose course modules based on goals or interests.
3IE-3I can actively promote understanding and knowledge acquisition through sharing and discussion.
4IE-4Online course learning connects me closely with other learners.
5IE-5In online courses, I receive emotional support and academic assistance.
6IE-6I frequently engage with peers, extending beyond the course.
7IE-7Online courses enable me to collaborate with other learners to complete tasks.
8IE-8The online course platform supports diverse interactive features.
9IE-9I can join relevant communities and receive after-class tutoring and Q&A support.
10IE-10I can ask questions at any time and receive timely feedback from instructors.
11IE-11The online course platform supports extracurricular learning resources and discussions.
12CQCQ-1The online course platform provides diverse course-related resources.(Wu et al., 2024; B. Liu & Yuan, 2024; Shen, 2018; Mayer, 2017; P. Tang et al., 2022; Plass et al., 2014; Coman et al., 2020; Morris et al., 2019; Quadir & Yang, 2024)
13CQ-2Online course resources are regularly updated.
14CQ-3Online courses cover a broad range of topics.
15CQ-4Online courses include case studies and experiments.
16CQ-5Online course content is presented in formats such as charts, animations, and videos.
17CQ-6The online course materials are well-designed with vibrant colors.
18CQ-7The online course highlights key points and maintains an appropriate level of difficulty.
19CQ-8The instructor’s voice is clear, the pace is moderate, and the attire is appropriate.
20CQ-9The online course videos feature live comments and real-time Q&A.
21LOLO-1I can develop a systematic understanding of the key concepts in the online course.(Theelen & van Breukelen, 2022; Mei et al., 2023; Castaneda et al., 2018; Zhu et al., 2020; Kuo et al., 2023; Mayer, 2017; Da-Hong et al., 2020)
22LO-2The knowledge gained from the online course is very useful to me.
23LO-3I can apply the knowledge from the online course to my studies and work.
24LO-4I can report course issues to the instructor or teaching assistant.
25LO-5I can efficiently solve learning issues using the platform’s tools.
26LO-6I improve my communication skills through discussions with learners.
27LO-7I can reflect on the problem-solving process to improve learning efficiency.
28LO-8I can confidently complete online learning tasks.
29LO-9The format of the online course content motivates my learning.
30LO-10The online course convinces me of the efficiency of online learning.
31LO-11Online learning has sparked my interest in exploring more related content.
32TQTQ-1The online course has clear objectives and well-structured chapters.(Haagen-Schützenhöfer & Hopf, 2020; Oliveira et al., 2021; Theelen & van Breukelen, 2022; Jimoyiannis, 2010; Wu et al., 2024; Mayer, 2017; Morris et al., 2019; Da-Hong et al., 2020; Castaneda et al., 2018)
33TQ-2The online course content is logical, with clear modules and appropriately arranged tasks.
34TQ-3The online course process is simple and clear, allowing me to get started quickly.
35TQ-4The online course explanations are accompanied by case studies.
36TQ-5The online course employs diverse teaching strategies and multimedia technology.
37TQ-6The online course instructor is experienced and highly skilled.
38TQ-7The online course instructor adjusts the course based on learners’ progress.
39TQ-8The online course has a comprehensive assessment system for students’ learning.
40TQ-9The online course team regularly evaluates the course’s suitability.
41TSTS-1The online course platform tracks and provides feedback on my learning progress.(Yan et al., 2024; Wei et al., 2021; Howell, 2021; Y. Liu et al., 2022; Gao & Li, 2024; Wei et al., 2021)
42TS-2The online course uses diverse assessment methods to showcase my learning outcomes.
43TS-3The online course assessments accurately reflect my learning progress.
44TS-4The online course assessments help me reinforce my knowledge.
45TS-5The online course platform is easily accessible with a clear layout.
46TS-6The online course platform supports multiple devices and resolves issues promptly.
47TS-7The online course platform’s reminder service helps me track learning progress and important tasks.
48TS-8The assignments and exams in online courses help consolidate my knowledge.
49TS-9The feedback from online course instructors and the platform’s Q&A support help improve my learning.
50TS-10The online course provides toolkits and answers.
51LMLM-1Completing online courses gives me a sense of accomplishment.(C. Wang et al., 2022; Feng et al., 2024; Coman et al., 2020; Daniels et al., 2021; Xiao & Hew, 2024; Zhang et al., 2024; John et al., 2023)
52LM-2Learning the content of online courses brings me pleasure.
53LM-3When facing difficulties, the online course platform provides supportive feedback.
54LM-4Inability to complete online course tasks makes me feel frustrated.
55LM-5The task design in online courses motivates me to achieve the goals.
56LM-6The task design in online courses motivates me to achieve the goals.
57LM-7Completing online course learning rewards me with learning points, badges, or other incentives.
58LM-8I feel part of the learning community when participating in online courses.
59LM-9The platform’s learning reminders, encouraging messages, and interactive design make me feel acknowledged.
60LM-10During the learning process, I feel a sense of accomplishment due to the collective effort of the group.
61LM-11The course reward system is fair and reasonable, motivating me to actively engage in learning.
62LM-12The certificates or skill badges earned upon task completion motivate me to study actively.
63LM-13Group tasks or competition activities on the online course platform motivate me to engage actively in learning.
64LM-14The engaging reward system motivates me to study actively.
Table A3. Measurement items for ASU of online course users.
Index Number | Indicator | Serial Number | Item | References
1Perceived Usefulness (PU)PU1Using online courses for learning has improved my academic performance.(Wong et al., 2024; Al-Adwan et al., 2024; W. Liu et al., 2020; AL-Nuaimi et al., 2022; Tawafak et al., 2020)
2PU2Online course learning makes it easier for me to complete learning tasks.
3PU3Using online courses has enhanced my learning efficiency.
4PU4Online course learning makes it easier for me to master knowledge.
5PU5Overall, online courses are very helpful to my learning.
6Perceived Ease of Use (PEOU)PEOU1I find the online course system easy to operate and user-friendly.(Wong et al., 2024; Al-Adwan et al., 2024; W. Liu et al., 2020; AL-Nuaimi et al., 2022; Tawafak et al., 2020)
7PEOU2I can quickly access the online course resources I need.
8PEOU3My interaction with the online course system is intuitive and straightforward.
9PEOU4I find the online course system’s features highly user-friendly.
10PEOU5Overall, I find learning to use the online course system effortless.
11User Experience (UE)UE1I believe online courses are an effective learning mode.(Al-Fraihat et al., 2020; B. Wu & Wang, 2022; Martin et al., 2020; Zuo et al., 2022; Lazim & Ismail, 2021; Al-Adwan, 2020; Rautela et al., 2024)
12UE2I maintain a positive attitude towards learning through online courses.
13UE3Online courses can meet all my learning needs.
14UE4Learning through online courses brings me joy.
15UE5Overall, online courses make the learning process more engaging.
16Actual System Use (ASU)ASU1I actively engage in learning through online courses.(Wong et al., 2024; Al-Adwan et al., 2024; W. Liu et al., 2020; AL-Nuaimi et al., 2022; Tawafak et al., 2020)
17ASU2I will continue using online courses to meet my learning needs.
18ASU3I am willing to recommend online courses to others for learning support.
19ASU4I am willing to prioritize online courses as my preferred learning method.
20ASU5Overall, I have incorporated online courses into my daily learning routine.
Table A4. Grouping of factors in each dimension in EFA results.
2-Level Indicator | Item | Factor (1–13)
DOLFIE1 0.84
IE2 0.659
LCIE30.825
IE40.809
IE50.82
IE60.853
IE70.69
LTCIE8 0.789
IE9 0.712
IE10 0.673
CARCQ1 0.517
CQ2 0.639
KPCQ4 0.875
CQ5 0.848
CQ7 0.751
CQ9 0.802
KAAILO2 0.881
LO3 0.599
LO10 0.73
OLSLO4 0.811
LO5 0.79
LO6 0.768
TETQ2 0.695
TQ3 0.832
TQ4 0.776
TQ5 0.651
CSAOTQ7 0.86
TQ8 0.595
TS3 0.725
TS4 0.751
TSTS5 0.783
TS7 0.831
TS8 0.796
TS9 0.764
TS10 0.824
LMELM3 0.805
LM6 0.624
LM7 0.559
SOBASLM8 0.724
LM9 0.692
LM10 0.742
CIMLM11 0.737
LM12 0.888
LM13 0.805
LM14 0.713
Extraction method: principal component analysis; rotation method: varimax with Kaiser normalization; rotation converged after eight iterations.
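As an illustration only (the authors’ original toolchain is not reproduced here), the following is a minimal sketch of the extraction and rotation settings noted above, assuming the Python factor_analyzer package and a DataFrame `items` holding the 45 retained item responses; both names are assumptions introduced for this example.

```python
# Illustrative sketch of the EFA settings noted above: principal component extraction,
# varimax rotation with Kaiser normalization, 13 factors (as in Table A4).
# `items` is an assumed DataFrame whose columns are the retained items (IE1 ... LM14).
import pandas as pd
from factor_analyzer import FactorAnalyzer

def run_efa(items: pd.DataFrame, n_factors: int = 13, cutoff: float = 0.5) -> pd.DataFrame:
    fa = FactorAnalyzer(n_factors=n_factors, rotation="varimax", method="principal")
    fa.fit(items)
    loadings = pd.DataFrame(
        fa.loadings_,
        index=items.columns,
        columns=[f"F{i + 1}" for i in range(n_factors)],
    )
    # Suppress small loadings so each item's dominant factor is easy to read,
    # mirroring the sparse presentation of Table A4.
    return loadings.where(loadings.abs() >= cutoff, "")
```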
Table A5. Convergent validity and combined reliability results for each dimension.
Path | Estimate | S.E. | p | AVE | C.R.
DOLF ← IE0.664 0.5210.7642
LC ← IE0.8020.161***
LTC ← IE0.6920.131***
CAR ← CQ0.668 0.57010.7238
KP ← CQ0.8330.186***
KAAI ← LO0.719 0.62570.7684
OLS ← LO0.8570.138***
TE ← TQ0.7940.103***0.61630.7626
CSAO ← TQ0.776
LME ← LM0.824 0.61090.8245
SOBAS ← LM0.7310.091***
CIM ← LM0.7870.094***
IE1 ← DOLF0.746 0.62350.7676
IE2 ← DOLF0.8310.101***
IE4 ← LC0.7930.046***0.64880.8471
IE5 ← LC0.7970.045***
IE6 ← LC0.8260.047***
IE8 ← LTC0.76 0.7790.8173
IE9 ← LTC0.7820.07***
IE10 ← LTC0.7790.07***
LM7 ← LME0.776 0.6050.8212
LM6 ← LME0.7980.059***
LM3 ← LME0.7590.062***
TS4 ← CSAO0.85 0.6450.645
TS3 ← CSAO0.7760.048***
TQ8 ← CSAO0.7730.049***
TQ7 ← CSAO0.8110.048***
TQ5 ← TE0.799 0.7680.8628
TQ4 ← TE0.790.052***
TQ3 ← TE0.770.055***
TQ2 ← TE0.7680.055***
LO4 ← OLS0.789 0.780.8268
LO5 ← OLS0.7820.064***
LO6 ← OLS0.780.064***
LO2 ← KAAI0.771 0.61750.8288
LO3 ← KAAI0.8070.068***
LO10 ← KAAI0.7790.067***
CQ4 ← KP0.872 0.63970.6397
CQ5 ← KP0.780.047***
CQ7 ← KP0.7350.048***
CQ9 ← KP0.8060.046***
CQ1 ← CAR0.789 0.67850.8082
CQ2 ← CAR0.8570.091***
TS8 ← TS0.781 0.61670.8892
TS7 ← TS0.7680.057***
TS5 ← TS0.7830.058***
TS9 ← TS0.8520.058***
TS10 ← TS0.7380.057***
LM8 ← SOBAS0.836 0.6310.8365
LM9 ← SOBAS0.8050.054***
LM10 ← SOBAS0.7390.053***
LM11 ← CIM0.808 0.8210.8784
LM12 ← CIM0.760.052***
LM13 ← CIM0.8190.054***
LM14 ← CIM0.8210.054***
IE3 ← LC0.856 0.8330.8326
IE7 ← LC0.8330.046***
*** p < 0.001.
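The AVE and C.R. columns above follow the standard convergent-validity formulas; a short sketch of the calculation, using the DOLF loadings from Table A5 as a worked check, is given below.

```python
# Worked check of the convergent-validity formulas behind Table A5:
# AVE = mean of squared standardized loadings;
# C.R. = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances).
from typing import Sequence

def ave(loadings: Sequence[float]) -> float:
    return sum(l ** 2 for l in loadings) / len(loadings)

def composite_reliability(loadings: Sequence[float]) -> float:
    total = sum(loadings)
    error = sum(1 - l ** 2 for l in loadings)
    return total ** 2 / (total ** 2 + error)

dolf = [0.746, 0.831]                          # IE1 and IE2 loadings on DOLF (Table A5)
print(round(ave(dolf), 4))                     # 0.6235, matching the reported AVE
print(round(composite_reliability(dolf), 4))   # 0.7676, matching the reported C.R.
```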
Table A6. Normality test results and descriptive statistics for each dimension.
Dimension | Item | Mean | Std. Deviation | Skewness | Kurtosis | Mean (dimension) | Std. Deviation (dimension)
IEIE15.261.568−1.2390.9155.06591.06652
IE25.211.53−1.1550.955
IE34.931.481−0.8970.531
IE45.021.516−0.9140.428
IE55.041.475−1.0460.809
IE64.941.568−0.7480.044
IE75.121.532−1.0680.687
IE85.031.506−1.0380.684
IE95.061.526−0.9670.467
IE105.061.531−1.0050.518
CQCQ14.91.657−0.854−0.1784.85631.23041
CQ24.881.635−0.751−0.199
CQ44.841.581−0.786−0.097
CQ54.921.639−0.835−0.153
CQ74.821.595−0.762−0.21
CQ94.771.6−0.709−0.282
LOLO24.891.565−0.8320.0414.94491.17445
LO34.861.6−0.706−0.222
LO44.971.622−0.784−0.142
LO55.031.507−0.9640.542
LO64.811.568−0.721−0.1
LO105.111.54−0.970.491
TQTQ24.911.558−0.8850.0824.90961.20805
TQ34.91.598−0.756−0.125
TQ45.191.507−1.1150.886
TQ551.64−0.821−0.101
TQ74.741.618−0.609−0.388
TQ84.711.615−0.622−0.42
TSTS34.721.583−0.615−0.3664.88931.1418
TS44.741.603−0.604−0.431
TS55.041.597−0.8750.041
TS74.981.568−0.8950.099
TS84.971.588−0.81−0.015
TS94.861.622−0.662−0.328
TS104.911.568−0.834−0.001
LMLM34.741.597−0.725−0.2464.91051.12738
LM65.111.497−0.9370.633
LM74.881.62−0.694−0.34
LM84.711.624−0.559−0.512
LM94.661.619−0.51−0.533
LM105.171.571−1.0210.503
LM114.881.6−0.775−0.164
LM125.211.552−1.0790.693
LM134.821.62−0.703−0.29
LM144.911.597−0.81−0.107
PUPU15.171.553−1.0560.6545.20731.29305
PU25.21.542−1.180.898
PU45.151.51−1.1230.887
PU55.321.582−1.2831.032
PEOUPEOU15.221.553−1.130.8255.10671.2584
PEOU35.151.505−1.1861.039
PEOU45.081.492−1.130.813
PEOU54.981.514−0.9830.588
UEUE14.771.797−0.738−0.4984.68931.42214
UE24.741.751−0.78−0.451
UE34.61.69−0.683−0.449
UE54.651.688−0.77−0.349
ASUASU14.891.667−0.9450.0224.87471.38583
ASU34.821.633−0.9030.042
ASU44.781.595−0.9460.111
ASU551.671−1.0230.158
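The item- and dimension-level statistics in Table A6 are standard descriptive and normality checks; a brief sketch of how they could be computed with pandas (an assumption for illustration, not the authors’ stated toolchain) follows.

```python
# Sketch of the descriptive/normality statistics reported in Table A6.
# `items` is an assumed DataFrame of raw item responses; `dimensions` maps each
# dimension to its retained items, e.g. "CQ": ["CQ1", "CQ2", "CQ4", "CQ5", "CQ7", "CQ9"].
import pandas as pd

def item_descriptives(items: pd.DataFrame) -> pd.DataFrame:
    return pd.DataFrame({
        "Mean": items.mean(),
        "Std. Deviation": items.std(),
        "Skewness": items.skew(),   # |skewness| well below 2 suggests acceptable normality
        "Kurtosis": items.kurt(),   # excess kurtosis; |kurtosis| below 7 is usually acceptable
    }).round(3)

def dimension_descriptives(items: pd.DataFrame, dimensions: dict[str, list[str]]) -> pd.DataFrame:
    scores = pd.DataFrame({dim: items[cols].mean(axis=1) for dim, cols in dimensions.items()})
    return pd.DataFrame({"Mean": scores.mean(), "Std. Deviation": scores.std()}).round(4)
```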

References

  1. Abedini, A., Abedin, B., & Zowghi, D. (2021). Adult learning in online communities of practice: A systematic review. British Journal of Educational Technology, 52(4), 1663–1694. [Google Scholar] [CrossRef]
  2. Al-Adwan, A. S. (2020). Investigating the drivers and barriers to MOOCs adoption: The perspective of TAM. Education and Information Technologies, 25(6), 5771–5795. [Google Scholar] [CrossRef]
  3. Al-Adwan, A. S., Meet, R. K., Anand, S., Shukla, G. P., Alsharif, R., & Dabbaghia, M. (2024). Understanding continuous use intention of technology among higher education teachers in emerging economy: Evidence from integrated TAM, TPACK, and UTAUT model. Studies in Higher Education, 50(3), 505–524. [Google Scholar] [CrossRef]
  4. Alamri, M. M. (2022). Investigating students’ adoption of MOOCs during COVID-19 pandemic: Students’ academic self-efficacy, learning engagement, and learning persistence. Sustainability, 14(2), 714. [Google Scholar] [CrossRef]
  5. Aleixo, A. M., Leal, S., & Azeiteiro, U. M. (2021). Higher education students’ perceptions of sustainable development in Portugal. Journal of Cleaner Production, 327, 129429. [Google Scholar] [CrossRef]
  6. Al-Fraihat, D., Joy, M., Masa’deh, R., & Sinclair, J. (2020). Evaluating E-learning systems success: An empirical study. Computers in Human Behavior, 102, 67–86. [Google Scholar] [CrossRef]
  7. Al-Mekhlafi, A.-B. A., Othman, I., Kineber, A. F., Mousa, A. A., & Zamil, A. M. A. (2022). Modeling the Impact of Massive Open Online Courses (MOOC) Implementation Factors on Continuance Intention of Students: PLS-SEM Approach. Sustainability, 14(9), 9. [Google Scholar] [CrossRef]
  8. AL-Nuaimi, M. N., Al Sawafi, O. S., Malik, S. I., Al-Emran, M., & Selim, Y. F. (2022). Evaluating the actual use of learning management systems during the covid-19 pandemic: An integrated theoretical model. Interactive Learning Environments, 31(10), 6905–6930. [Google Scholar] [CrossRef]
  9. Al-Rahmi, W. M., Yahaya, N., Alamri, M. M., Alyoussef, I. Y., Al-Rahmi, A. M., & Kamin, Y. B. (2021). Integrating innovation diffusion theory with technology acceptance model: Supporting students’ attitude towards using a massive open online courses (MOOCs) systems. Interactive Learning Environments, 29(8), 1380–1392. [Google Scholar] [CrossRef]
  10. Anderson, T., & Dron, J. (2011). Three generations of distance education pedagogy. International Review of Research in Open and Distributed Learning, 12(3), 80–97. [Google Scholar] [CrossRef]
  11. Bai, S., Hew, K. F., & Huang, B. (2020). Does gamification improve student learning outcome? Evidence from a meta-analysis and synthesis of qualitative data in educational contexts. Educational Research Review, 30, 100322. [Google Scholar] [CrossRef]
  12. Bailey, D. R., Almusharraf, N., & Almusharraf, A. (2022). Video conferencing in the e-learning context: Explaining learning outcome with the technology acceptance model. Education and Information Technologies, 27(6), 7679–7698. [Google Scholar] [CrossRef] [PubMed]
  13. Castaneda, D. I., Manrique, L. F., & Cuellar, S. (2018). Is organizational learning being absorbed by knowledge management? A systematic review. Journal of Knowledge Management, 22(2), 299–325. [Google Scholar] [CrossRef]
  14. Chen, G., Chen, P., Huang, W., & Zhai, J. (2022). Continuance intention mechanism of middle school student users on online learning platform based on qualitative comparative analysis method. Mathematical Problems in Engineering, 2022(1), 3215337. [Google Scholar] [CrossRef]
  15. Cheng, J., Liu, Z., Zhou, F., Jiang, N., & Ou, Y. (2024). The application of informatization teaching and flipped classroom in the teaching of traditional culture in college languages. Journal of Electrical Systems, 20(2), 1877–1884. [Google Scholar]
  16. Cheng, Y. M. (2020). Students’ satisfaction and continuance intention of the cloud-based e-learning system: Roles of interactivity and course quality factors. Education+ Training, 62(9), 1037–1059. [Google Scholar] [CrossRef]
  17. Cho, M.-H., Oh, E. G., Chang, Y., & Hwang, S. (2024). Effects of personal and instructor goals on MOOC continuance intention. Distance Education, 46(2), 134–147. [Google Scholar] [CrossRef]
  18. Coman, C., Țîru, L. G., Meseșan-Schmitz, L., Stanciu, C., & Bularca, M. C. (2020). Online teaching and learning in higher education during the coronavirus pandemic: Students’ perspective. Sustainability, 12(24), 10367. [Google Scholar] [CrossRef]
  19. Da-Hong, L., Hong-Yan, L., Wei, L., Guo, J.-J., & En-Zhong, L. (2020). Application of flipped classroom based on the Rain Classroom in the teaching of computer-aided landscape design. Computer Applications in Engineering Education, 28(2), 357–366. [Google Scholar] [CrossRef]
  20. Daniels, L. M., Goegan, L. D., & Parker, P. C. (2021). The impact of COVID-19 triggered changes to instruction and assessment on university students’ self-reported motivation, engagement and perceptions. Social Psychology of Education, 24(1), 299–318. [Google Scholar] [CrossRef]
  21. Davis, F. D. (1989). Technology acceptance model: TAM. Al-Suqri, MN, Al-Aufi, AS: Information Seeking Behavior and Technology Adoption, 205, 5. [Google Scholar]
  22. DeVellis, R. F., & Thorpe, C. T. (2021). Scale development: Theory and applications. SAGE Publications. [Google Scholar]
  23. Du, B. (2023). Research on the factors influencing the learner satisfaction of MOOCs. Education and Information Technologies, 28(2), 1935–1955. [Google Scholar] [CrossRef]
  24. Dwivedi, Y. K., Hughes, D. L., Coombs, C., Constantiou, I., Duan, Y., Edwards, J. S., Gupta, B., Lal, B., Misra, S., Prashant, P., Raman, R., Rana, N. P., Sharma, S. K., & Upadhyay, N. (2020). Impact of COVID-19 pandemic on information management research and practice: Transforming education, work and life. International Journal of Information Management, 55, 102211. [Google Scholar] [CrossRef]
  25. Erkens, M., & Bodemer, D. (2019). Improving collaborative learning: Guiding knowledge exchange through the provision of information about learning partners and learning contents. Computers & Education, 128, 452–472. [Google Scholar] [CrossRef]
  26. Feng, L., Shen, X., Xie, Z., & Yan, X. (2024). How gamification-based course drives online learners’ engagement: Focusing on intrinsic motivation and effect mechanism. Education and Information Technologies, 30, 10943–10977. [Google Scholar] [CrossRef]
  27. Ferrer, J., Ringer, A., Saville, K., A Parris, M., & Kashi, K. (2022). Students’ motivation and engagement in higher education: The importance of attitude to online learning. Higher Education, 83(2), 317–338. [Google Scholar] [CrossRef]
  28. Franque, F. B., Oliveira, T., Tam, C., & Santini, F. de O. (2020). A meta-analysis of the quantitative studies in continuance intention to use an information system. Internet Research, 31(1), 123–158. [Google Scholar] [CrossRef]
  29. Gao, H., & Li, F. (2024). The application of virtual reality technology in the teaching of clarinet music art under the mobile wireless network learning environment. Entertainment Computing, 49, 100619. [Google Scholar] [CrossRef]
  30. Haagen-Schützenhöfer, C., & Hopf, M. (2020). Design-based research as a model for systematic curriculum development: The example of a curriculum for introductory optics. Physical Review Physics Education Research, 16(2), 020152. [Google Scholar] [CrossRef]
  31. Hassenzahl, M. (2018). The Thing and I: Understanding the relationship between user and product. In M. Blythe, & A. Monk (Eds.), Funology 2: From usability to enjoyment (pp. 301–313). Springer International Publishing. [Google Scholar] [CrossRef]
  32. Hassenzahl, M., Burmester, M., & Koller, F. (2021). User experience is all there is: Twenty years of designing positive experiences and meaningful technology. I-Com, 20(3), 197–213. [Google Scholar] [CrossRef]
  33. Hlosta, M., Herodotou, C., Papathoma, T., Gillespie, A., & Bergamin, P. (2022). Predictive learning analytics in online education: A deeper understanding through explaining algorithmic errors. Computers and Education: Artificial Intelligence, 3, 100108. [Google Scholar] [CrossRef]
  34. Howell, R. A. (2021). Engaging students in education for sustainable development: The benefits of active learning, reflective practices and flipped classroom pedagogies. Journal of Cleaner Production, 325, 129318. [Google Scholar] [CrossRef]
  35. Huang, F., Teo, T., & Guo, J. (2021). Understanding English teachers’ non-volitional use of online teaching: A Chinese study. System, 101, 102574. [Google Scholar] [CrossRef]
  36. Hung, M.-L., Chou, C., Chen, C.-H., & Own, Z.-Y. (2010). Learner readiness for online learning: Scale development and student perceptions. Computers & Education, 55(3), 1080–1090. [Google Scholar] [CrossRef]
  37. Iglesias-Pradas, S., Hernández-García, Á., Chaparro-Peláez, J., & Prieto, J. L. (2021). Emergency remote teaching and students’ academic performance in higher education during the COVID-19 pandemic: A case study. Computers in Human Behavior, 119, 106713. [Google Scholar] [CrossRef]
  38. Jebb, A. T., Ng, V., & Tay, L. (2021). A review of key Likert scale development advances: 1995–2019. Frontiers in Psychology, 12, 637547. [Google Scholar] [CrossRef]
  39. Jia, C., Hew, K. F., Bai, S., & Huang, W. (2022). Adaptation of a conventional flipped course to an online flipped format during the Covid-19 pandemic: Student learning performance and engagement. Journal of Research on Technology in Education, 54(2), 281–301. [Google Scholar] [CrossRef]
  40. Jia, Y., & Zhang, L. (2021). Research and application of online SPOC teaching mode in analog circuit course. International Journal of Educational Technology in Higher Education, 18(1), 10. [Google Scholar] [CrossRef]
  41. Jiang, T., Luo, G., Wang, Z., & Yu, W. (2022). Research into influencing factors in user experiences of university mobile libraries based on mobile learning mode. Library Hi Tech, 42(2), 564–579. [Google Scholar] [CrossRef]
  42. Jimoyiannis, A. (2010). Designing and implementing an integrated technological pedagogical science knowledge framework for science teachers professional development. Computers & Education, 55(3), 1259–1269. [Google Scholar] [CrossRef]
  43. John, D., Hussin, N., Zaini, M. K., Ametefe, D. S., Aliu, A. A., & Caliskan, A. (2023). Gamification equilibrium: The fulcrum for balanced intrinsic motivation and extrinsic rewards in learning systems: Immersive gamification in Muhamad Khairulnizam ZainiLearning system. International Journal of Serious Games, 10(3), 3. [Google Scholar] [CrossRef]
  44. Jung, E., Kim, D., Yoon, M., Park, S., & Oakley, B. (2019). The influence of instructional design on learner control, sense of achievement, and perceived effectiveness in a supersize MOOC course. Computers & Education, 128, 377–388. [Google Scholar] [CrossRef]
  45. Kline, R. B. (1998). Software review: Software programs for structural equation modeling: Amos, EQS, and LISREL. Journal of Psychoeducational Assessment, 16(4), 343–364. [Google Scholar] [CrossRef]
  46. Kuo, Y.-C., Lin, H.-C. K., Lin, Y.-H., Wang, T.-H., & Chuang, B.-Y. (2023). The influence of distance education and peer self-regulated learning mechanism on learning effectiveness, motivation, self-efficacy, reflective ability, and cognitive load. Sustainability, 15(5), 4501. [Google Scholar] [CrossRef]
  47. Lai, J. W. M., De Nobile, J., Bower, M., & Breyer, Y. (2022). Comprehensive evaluation of the use of technology in education—Validation with a cohort of global open online learners. Education and Information Technologies, 27(7), 9877–9911. [Google Scholar] [CrossRef]
  48. Larmuseau, C., Desmet, P., & Depaepe, F. (2019). Perceptions of instructional quality: Impact on acceptance and use of an online learning environment. Interactive Learning Environments, 27(7), 953–964. [Google Scholar] [CrossRef]
  49. Laugwitz, B., Held, T., & Schrepp, M. (2008). Construction and evaluation of a user experience questionnaire. In A. Holzinger (Ed.), HCI and usability for education and work (pp. 63–76). Springer. [Google Scholar] [CrossRef]
  50. Lazim, C. S. L., & Ismail, N. D. B. (2021). Application of technology acceptance model (tam) towards online learning during covid-19 pandemic: Accounting students perspective. International Journal of Business, Economics and Law, 24(1), 13–20. [Google Scholar]
  51. Lee, J., & Jung, I. (2021). Instructional changes instigated by university faculty during the COVID-19 pandemic: The effect of individual, course and institutional factors. International Journal of Educational Technology in Higher Education, 18(1), 52. [Google Scholar] [CrossRef]
  52. Liu, B., & Yuan, D. (2024). Research on personalized teaching strategies based on learner profiles in a blended learning environment. International Journal of Information and Communication Technology Education (IJICTE), 20(1), 1–25. [Google Scholar] [CrossRef]
  53. Liu, W., Wang, Y., & Wang, Z. (2020). An empirical study of continuous use behavior in virtual learning community. PLoS ONE, 15(7), e0235814. [Google Scholar] [CrossRef]
  54. Liu, Y., Su, H., Nie, Q., & Song, Y. (2022). The distance learning framework for design-related didactic based on cognitive immersive experience. In P. Zaphiris, & A. Ioannou (Eds.), Learning and collaboration technologies. Novel technological environments (pp. 81–96). Springer International Publishing. [Google Scholar] [CrossRef]
  55. Mamun, M. A. A., Lawrie, G., & Wright, T. (2020). Instructional design of scaffolded online learning modules for self-directed and inquiry-based learning environments. Computers & Education, 144, 103695. [Google Scholar] [CrossRef]
  56. Mansour, N. (2024). Students’ and facilitators’ experiences with synchronous and asynchronous online dialogic discussions and e-facilitation in understanding the Nature of Science. Education and Information Technologies, 29(12), 15965–15997. [Google Scholar] [CrossRef]
  57. Martin, F., Sun, T., & Westine, C. D. (2020). A systematic review of research on online teaching and learning from 2009 to 2018. Computers & Education, 159, 104009. [Google Scholar] [CrossRef]
  58. Mayer, R. E. (2017). Using multimedia for e-learning. Journal of Computer Assisted Learning, 33(5), 403–423. [Google Scholar] [CrossRef]
  59. Mei, W., Ramasamy, S., Dawod, A., & Xi, Y. (2023, November 23–25). Exploring the user experience model of online teaching platforms in the post-pandemic Era. 1st International Conference on Artificial Intelligence, Communication, IoT, Data Engineering and Security, IACIDS 2023, Lavasa, Pune, India. [Google Scholar] [CrossRef]
  60. Mei, W., Ramasamy, S. S., & Dawod, A. Y. (2025). Post-pandemic era user experience model and technology acceptance for online teaching platforms in China. International Journal of Innovative Research and Scientific Studies, 8(3), 6934. [Google Scholar] [CrossRef]
  61. Menabo, L., Skrzypiec, G., Sansavini, A., Brighi, A., & Guarini, A. (2022). Distance Education among Italian Teachers: Differences and Experiences. Education and Information Technologies, 27(7), 9263–9292. [Google Scholar] [CrossRef]
  62. Morris, N. P., Swinnerton, B., & Coop, T. (2019). Lecture recordings to support learning: A contested space between students and teachers. Computers & Education, 140, 103604. [Google Scholar] [CrossRef]
  63. Nong, L., Liu, G., Tang, C., & Chen, Y. (2023). The design and implementation of campus informatization in Chinese universities: A Conceptual framework. Sustainability, 15(6), 4732. [Google Scholar] [CrossRef]
  64. Norman, W. L. (2004). Hassenzahl. Nielsen Norman Group. Available online: https://www.nngroup.com/articles/definition-user-experience/ (accessed on 3 October 2023).
  65. Oliveira, G., Grenha Teixeira, J., Torres, A., & Morais, C. (2021). An exploratory study on the emergency remote education experience of higher education students and teachers during the COVID-19 pandemic. British Journal of Educational Technology, 52(4), 1357–1376. [Google Scholar] [CrossRef]
  66. Oreopoulos, P., Patterson, R. W., Petronijevic, U., & Pope, N. G. (2022). Low-touch attempts to improve time management among traditional and online college students. Journal of Human Resources, 57(1), 1–43. [Google Scholar] [CrossRef]
  67. Park, N., Lee, K. M., & Cheong, P. H. (2007). University instructors’ acceptance of electronic courseware: An application of the technology acceptance model. Journal of Computer-Mediated Communication, 13(1), 163–186. [Google Scholar] [CrossRef]
  68. Plass, J. L., Heidig, S., Hayward, E. O., Homer, B. D., & Um, E. (2014). Emotional design in multimedia learning: Effects of shape and color on affect and learning. Learning and Instruction, 29, 128–140. [Google Scholar] [CrossRef]
  69. Qu, J. (2021). Research on mobile learning in a teaching information service system based on a big data driven environment. Education and Information Technologies, 26(5), 6183–6201. [Google Scholar] [CrossRef]
  70. Quadir, B., & Yang, J. C. (2024). Interactive learning with WeChat’s Rain Classroom in a blended setting: The influence of three types of interactions on learning performance. Educational Technology & Society, 27(3), 147–164. [Google Scholar]
  71. Rautela, S., Sharma, S., & Virani, S. (2024). Learner-learner interactions in online classes during COVID-19 pandemic: The mediating role of social media in the higher education context. Interactive Learning Environments, 32(2), 639–654. [Google Scholar] [CrossRef]
  72. Scherer, R., Siddiq, F., & Tondeur, J. (2019). The technology acceptance model (TAM): A meta-analytic structural equation modeling approach to explaining teachers’ adoption of digital technology in education. Computers & Education, 128, 13–35. [Google Scholar] [CrossRef]
  73. Seifert, T., & Bar-Tal, S. (2023). Student-teachers’ sense of belonging in collaborative online learning. Education and Information Technologies, 28(7), 7797–7826. [Google Scholar] [CrossRef]
  74. Shen, C. (2018). A transdisciplinary review of deep learning research and its relevance for water resources scientists. Water Resources Research, 54(11), 8558–8593. [Google Scholar] [CrossRef]
  75. Silva, P. (2015). Davis’ Technology Acceptance Model (TAM) (1989). In Information seeking behavior and technology adoption: Theories and trends (pp. 205–219). IGI Global. [Google Scholar] [CrossRef]
  76. Soares, M. M., Rosenzweig, E., & Marcus, A. (2021). Design, user experience, and usability: Design for diversity, well-being, and social development: 10th international conference, DUXU 2021, held as part of the 23rd HCI international conference, HCII 2021, Virtual Event, July 24–29, 2021, Proceedings, Part II. Springer Nature. [Google Scholar]
  77. Sobaih, A. E. E., Salem, A. E., Hasanein, A. M., & Elnasr, A. E. A. (2021). Responses to COVID-19 in higher education: Students’ learning experience using Microsoft Teams versus social network sites. Sustainability, 13(18), 10036. [Google Scholar] [CrossRef]
  78. South, L., Saffo, D., Vitek, O., Dunne, C., & Borkin, M. A. (2022). Effective use of likert scales in visualization evaluations: A systematic review. Computer Graphics Forum, 41(3), 43–55. [Google Scholar] [CrossRef]
  79. Su, J.-M., Tseng, S.-S., Lin, H.-Y., & Chen, C.-H. (2011). A personalized learning content adaptation mechanism to meet diverse user needs in mobile learning environments. User Modeling and User-Adapted Interaction, 21(1), 5–49. [Google Scholar] [CrossRef]
  80. Tang, P., Yao, Z., Luan, J., & Xiao, J. (2022). How information presentation formats influence usage behaviour of course management systems: Flow diagram navigation versus menu navigation. Behaviour & Information Technology, 41(2), 383–400. [Google Scholar] [CrossRef]
  81. Tang, Y., & Hew, K. F. (2022). Effects of using mobile instant messaging on student behavioral, emotional, and cognitive engagement: A quasi-experimental study. International Journal of Educational Technology in Higher Education, 19(1), 3. [Google Scholar] [CrossRef]
  82. Tao, D., Fu, P., Wang, Y., Zhang, T., & Qu, X. (2022). Key characteristics in designing massive open online courses (MOOCs) for user acceptance: An application of the extended technology acceptance model. Interactive Learning Environments, 30(5), 882–895. [Google Scholar] [CrossRef]
  83. Tawafak, R. M., Romli, A. B. T., Arshah, R. bin A., & Malik, S. I. (2020). Framework design of university communication model (UCOM) to enhance continuous intentions in teaching and e-learning process. Education and Information Technologies, 25(2), 817–843. [Google Scholar] [CrossRef]
  84. Theelen, H., & van Breukelen, D. H. J. (2022). The didactic and pedagogical design of e-learning in higher education: A systematic literature review. Journal of Computer Assisted Learning, 38(5), 1286–1303. [Google Scholar] [CrossRef]
  85. Van Wart, M., Ni, A., Medina, P., Canelon, J., Kordrostami, M., Zhang, J., & Liu, Y. (2020). Integrating students’ perspectives about online learning: A hierarchy of factors. International Journal of Educational Technology in Higher Education, 17(1), 53. [Google Scholar] [CrossRef]
  86. Vartiainen, H., Vuojärvi, H., Saramäki, K., Eriksson, M., Ratinen, I., Torssonen, P., Vanninen, P., & Pöllänen, S. (2022). Cross-boundary collaboration and knowledge creation in an online higher education course. British Journal of Educational Technology, 53(5), 1304–1320. [Google Scholar] [CrossRef]
  87. Venkatesh, V., & Bala, H. (2008). Technology acceptance Model 3 and a research agenda on interventions. Decision Sciences, 39(2), 273–315. [Google Scholar] [CrossRef]
  88. Venkatesh, V., & Davis, F. D. (2000). A theoretical extension of the technology acceptance model: Four longitudinal field studies. Management Science, 46(2), 186–204. [Google Scholar] [CrossRef]
  89. Vlachogianni, P., & Tselios, N. (2022). Perceived usability evaluation of educational technology using the System Usability Scale (SUS): A systematic review. Journal of Research on Technology in Education, 54(3), 392–409. [Google Scholar] [CrossRef]
  90. Wang, C., Mirzaei, T., Xu, T., & Lin, H. (2022). How learner engagement impacts non-formal online learning outcomes through value co-creation: An empirical analysis. International Journal of Educational Technology in Higher Education, 19(1), 32. [Google Scholar] [CrossRef] [PubMed]
  91. Wang, M., Ramasamy, S. S., Dawod, A. Y., Wang, S., & Zhang, H. (2025, March 14–16). Development and validation of the online course user experience scale: A multidimensional assessment framework. 2025 14th International Conference on Educational and Information Technology (ICEIT) (pp. 374–379), Guangzhou, China. [Google Scholar] [CrossRef]
  92. Wang, M., Ramasamy, S. S., Yu, X., Liu, M., Dawod, A. Y., & Chen, H. (2024). User sentiment analysis of the shared charging service for China’s G318 route. Electronics, 13(22), 4335. [Google Scholar] [CrossRef]
  93. Wang, M., & Zhao, Z. (2022). A cultural-centered model based on user experience and learning preferences of online teaching platforms for Chinese national university students: Taking teaching platforms of WeCom, VooV meeting, and DingTalk as examples. Systems, 10(6), 6. [Google Scholar] [CrossRef]
  94. Wang, S., & Lee, S.-K. (2024). Exploring the role of learning goal orientation, instructor reputation, parasocial interaction, and tutor intervention in university students’ MOOC retention: A TAM-TRA based examination. PLoS ONE, 19(9), e0299014. [Google Scholar] [CrossRef] [PubMed]
  95. Wang, S., Liu, Y., Song, F., Xie, X., & Yu, D. (2021). Research on Evaluation System of User Experience With Online Live Course Platform. IEEE Access, 9, 23863–23875. [Google Scholar] [CrossRef]
  96. Wang, W.-S., Lin, C.-J., Lee, H.-Y., Wu, T.-T., & Huang, Y.-M. (2024). Feedback mechanism in immersive virtual reality influences physical hands-on task performance and cognitive load. International Journal of Human–Computer Interaction, 40(15), 4103–4115. [Google Scholar] [CrossRef]
  97. Wei, X., Saab, N., & Admiraal, W. (2021). Assessment of cognitive, behavioral, and affective learning outcomes in massive open online courses: A systematic literature review. Computers & Education, 163, 104097. [Google Scholar] [CrossRef]
  98. Weinberger, A., & Fischer, F. (2006). A framework to analyze argumentative knowledge construction in computer-supported collaborative learning. Computers & Education, 46(1), 71–95. [Google Scholar] [CrossRef]
  99. Wong, G.-Z., Wong, K.-H., Lau, T.-C., Lee, J.-H., & Kok, Y.-H. (2024). Study of intention to use renewable energy technology in Malaysia using TAM and TPB. Renewable Energy, 221, 119787. [Google Scholar] [CrossRef]
  100. Wu, B., & Wang, Y. (2022). Formation mechanism of popular courses on MOOC platforms: A configurational approach. Computers & Education, 191, 104629. [Google Scholar] [CrossRef]
  101. Wu, D., Zhang, S., Ma, Z., Yue, X.-G., & Dong, R. K. (2024). Unlocking potential: Key factors shaping undergraduate self-directed learning in AI-enhanced educational environments. Systems, 12(9), 332. [Google Scholar] [CrossRef]
  102. Xiao, Y., & Hew, K. F. T. (2024). Intangible rewards versus tangible rewards in gamified online learning: Which promotes student intrinsic motivation, behavioural engagement, cognitive engagement and learning performance? British Journal of Educational Technology, 55(1), 297–317. [Google Scholar] [CrossRef]
  103. Yan, Y., Zuo, M., & Luo, H. (2024). Investigating co-teaching presence and its impact on student engagement: A mixed-method study on the blended synchronous classroom. Computers & Education, 222, 105153. [Google Scholar] [CrossRef]
  104. Yang, H. (2024). The utility of remote work solutions in the post-pandemic era: Exploring the mediating effects of productivity and work flexibility. Technology in Society, 78, 102613. [Google Scholar] [CrossRef]
  105. Yang, H.-H., & Su, C.-H. (2017). Learner Behaviour in a MOOC practice-oriented course: In empirical study integrating TAM and TPB. International Review of Research in Open and Distributed Learning: IRRODL, 18(5), 35–63. [Google Scholar] [CrossRef]
  106. Yang, M., Shao, Z., Liu, Q., & Liu, C. (2017). Understanding the quality factors that influence the continuance intention of students toward participation in MOOCs. Educational Technology Research and Development, 65(5), 1195–1214. [Google Scholar] [CrossRef]
  107. Yassine, S., Kadry, S., & Sicilia, M.-A. (2022). Detecting communities using social network analysis in online learning environments: Systematic literature review. WIREs Data Mining and Knowledge Discovery, 12(1), e1431. [Google Scholar] [CrossRef]
  108. Yu, Z., & Xu, W. (2022). A meta-analysis and systematic review of the effect of virtual reality technology on users’ learning outcomes. Computer Applications in Engineering Education, 30(5), 1470–1484. [Google Scholar] [CrossRef]
  109. Zardari, B. A., Hussain, Z., Arain, A. A., Rizvi, W. H., & Vighio, M. S. (2021). Development and Validation of User Experience-Based E-Learning Acceptance Model for Sustainable Higher Education. Sustainability, 13(11), 11. [Google Scholar] [CrossRef]
  110. Zhang, J., Huang, Y., Wu, F., Kan, W., & Zhu, X. (2024). Scaling up online professional development through institution-initiated blended learning programs in higher education. The Internet and Higher Education, 65, 100988. [Google Scholar] [CrossRef]
  111. Zhou, L., Xue, S., & Li, R. (2022). Extending the Technology Acceptance Model to Explore Students’ Intention to Use an Online Education Platform at a University in China. Sage Open, 12(1), 21582440221085259. [Google Scholar] [CrossRef]
  112. Zhu, Y., Zhang, J. H., Au, W., & Yates, G. (2020). University students’ online learning attitudes and continuous intention to undertake online courses: A self-regulated learning perspective. Educational Technology Research and Development, 68(3), 1485–1519. [Google Scholar] [CrossRef]
  113. Zuo, M., Hu, Y., Luo, H., Ouyang, H., & Zhang, Y. (2022). K-12 students’ online learning motivation in China: An integrated model based on community of inquiry and technology acceptance theory. Education and Information Technologies, 27(4), 4599–4620. [Google Scholar] [CrossRef] [PubMed]
  114. Zuo, M., Yan, Y., Ma, Y., & Luo, H. (2024). Modeling the factors that influence schoolteachers’ work engagement and continuance intention when teaching online. Education and Information Technologies, 29(8), 9091–9119. [Google Scholar] [CrossRef]
Figure 1. ASU model for online course users.
Figure 2. CFA measurement model path diagram.
Figure 3. SEM analysis model diagram.
Table 1. Building blocks of user experience.
User Experience | Elements of Composition | References
User experience in MOOCsPlatform, resources, community, interaction, behavior control, attitude, emotion, collaboration, instructional design, differentiation, teachers, organization, management, policy(Al-Fraihat et al., 2020; B. Wu & Wang, 2022; Martin et al., 2020)
User experience in online teachingCourse platform, teacher–student interaction, student–student interaction, individual learning, learning motivation, learning effect, learning attitude, learning participation(Zuo et al., 2022; Lazim & Ismail, 2021; Al-Adwan, 2020; Rautela et al., 2024)
User experience in smart learning environmentsLearning activities, physical environment, resource access, lively interaction, content presentation, learning equipment(Huang et al., 2021; Menabo et al., 2022)
User experience in online teaching platformsSensory experience, interaction experience, learning experience, design experience, personalization(Zardari et al., 2021; Tawafak et al., 2020; M. Wang & Zhao, 2022; M. Wang et al., 2024)
User experience in online coursesTeaching objectives, teaching difficulty, teaching burden, emphasis on independence, teacher control, curriculum value(Cho et al., 2024; S. Wang & Lee, 2024; M. Wang et al., 2025)
Table 2. Extended dimensions of TAM-based models.
Model | Variables | References
TAM + TPBRenewable Energy Usage Intention, Attitude Towards Renewable Energy, Subjective Norm, Perceived Behavioral Control, Perceived Ease of Use, Perceived Usefulness(Wong et al., 2024)
TAM + TPACKTeacher Self-Efficacy, Perceived Ease of Use, Perceived Usefulness, Continuous Use Intention, Social Influence, Facilitating Citations, Top Management Support(Al-Adwan et al., 2024)
TAM + ECMPerceptual Normativeness, Perceived Enjoyment, Perceived Interactivity, Perceived Value, Expectation Confirmation, Perceived Usefulness, Customer Satisfaction, Continuous Use Intention, Continuous Use Behavior, Self-Efficacy (SLE), Contributing Factor(W. Liu et al., 2020)
TAM + TPB + IS Success ModelActual Use, Behavioral Intention, Information Quality, Technical System Quality, Service Quality, Perceived Behavioral Control, Perceived Ease of Use, Perceived Usefulness, Subjective Norms(AL-Nuaimi et al., 2022)
TAM + UCOMPerceived Usefulness, Perceived Ease of Use, Interactivity, Teacher Subject Knowledge, Course Content, Technology Integration, Behavior Intention, Academic Performance, Effectiveness, Student Satisfaction, Support-Assessment, Continued Intention to Use(Tawafak et al., 2020)
Table 3. Reliability of each dimension after removing items from pre-survey data.
Item | IE | CQ | LO | TQ | TS | LM | Total
Alpha | 0.883 | 0.869 | 0.834 | 0.854 | 0.796 | 0.885 | 0.908
N of Items | 10 | 6 | 6 | 6 | 6 | 10 | 45
Table 4. Pre-survey data validity analysis: KMO and Bartlett’s test results.
Kaiser–Meyer–Olkin Measure of Sampling Adequacy: 0.784
Bartlett’s Test of Sphericity: Approx. Chi-Square = 4291.41, df = 990, Sig. = 0.000
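For completeness, the pre-survey suitability checks reported in Tables 3 and 4 (Cronbach’s alpha per dimension, the KMO measure, and Bartlett’s test) can be reproduced along the following lines; the factor_analyzer helpers and the `df` DataFrame are assumptions for illustration only.

```python
# Sketch of the pre-survey checks in Tables 3 and 4: per-dimension Cronbach's alpha,
# the KMO measure of sampling adequacy, and Bartlett's test of sphericity.
import pandas as pd
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

def cronbach_alpha(items: pd.DataFrame) -> float:
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

def sampling_adequacy(items: pd.DataFrame) -> dict:
    chi_square, p_value = calculate_bartlett_sphericity(items)
    _, kmo_overall = calculate_kmo(items)
    return {"KMO": kmo_overall, "Bartlett chi-square": chi_square, "Sig.": p_value}

# e.g., cronbach_alpha(df[["CQ1", "CQ2", "CQ4", "CQ5", "CQ7", "CQ9"]]); sampling_adequacy(df)
```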
Table 5. Demographic characteristics of user formal survey respondents.
Provinces Distribution | Frequency | Percentage | Provinces Distribution | Frequency | Percentage
Anhui Province224.81%Jilin Province82.32%
Hainan Province50.97%Jiangsu Province399.46%
Beijing102.12%Jiangxi Province224.25%
Fujian Province152.90%Liaoning Province184.44%
Gansu Province20.39%Inner Mongolia Autonomous Region30.39%
Guangdong Province6715.64%Ningxia Hui Autonomous Region20.39%
Guangxi Zhuang Autonomous Region173.28%Shaanxi Province101.93%
Guizhou Province30.58%Yunnan Province41.16%
Hebei Province185.41%Shandong Province377.14%
Henan Province224.25%Shanxi Province255.79%
Heilongjiang Province101.93%Shanghai City132.51%
Hubei Province132.51%Sichuan Province204.44%
Hunan Province173.28%Zhejiang Province193.67%
Xizang10.19%Tianjin City40.77%
Chongqing City72.12%Xinjiang40.77%
Answer timeShortest100S Age>2418540.48%
Longest1036S 18–2427259.52%
GenderMale14030.63%
Female31769.37%Length of teachingLess than 1 h7314.09%
Level of teachingSpecialist department327.00%1 to 2 h22543.44%
Undergraduate29464.33%2–3 h16932.63%
Master’s degree or above13128.67%More than 3 h519.85%
Table 6. Reliability Analysis Results for Each Dimension of the User Experience Scale.
Item | IE | CQ | LO | TQ | TS | LM | Total
Alpha | 0.884 | 0.854 | 0.844 | 0.854 | 0.843 | 0.89 | 0.931
N of Items | 10 | 6 | 6 | 6 | 6 | 10 | 45
Table 7. Validity analysis: KMO and Bartlett’s test results of the User Experience Scale.
Kaiser–Meyer–Olkin Measure of Sampling Adequacy: 0.899
Bartlett’s Test of Sphericity: Approx. Chi-Square = 11,411.663, df = 990, Sig. = 0.000
Table 8. Results of CFA fitting index analysis.
Statistical Indicators | CMIN/DF | RMSEA | IFI | TLI | CFI | PNFI | PCFI
Index of fit | 1.166 | 0.019 | 0.986 | 0.985 | 0.986 | 0.843 | 0.806
Criteria of fit | 1–3 | <0.05 | >0.9 | >0.9 | >0.9 | >0.5 | >0.5
Is it up to standard? | Yes | Yes | Yes | Yes | Yes | Yes | Yes
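The fit indices above are reported in AMOS-style notation; as a rough open-source analogue (an assumption on our part, not the authors’ procedure), a measurement model can be fitted and the CMIN/DF, CFI, TLI, and RMSEA criteria checked with the semopy package along these lines.

```python
# Sketch of a second-order CFA fit check using semopy; the index names below follow
# semopy's calc_stats output, which does not report IFI/PNFI/PCFI.
import pandas as pd
import semopy

MEASUREMENT_MODEL = """
DOLF =~ IE1 + IE2
LC =~ IE3 + IE4 + IE5 + IE6 + IE7
LTC =~ IE8 + IE9 + IE10
IE =~ DOLF + LC + LTC
"""  # remaining first- and second-order factors omitted for brevity

def check_fit(data: pd.DataFrame) -> None:
    model = semopy.Model(MEASUREMENT_MODEL)
    model.fit(data)
    stats = semopy.calc_stats(model)
    cmin_df = stats["chi2"].iloc[0] / stats["DoF"].iloc[0]
    print(f"CMIN/DF = {cmin_df:.3f} (criterion: 1-3)")
    for name, criterion in (("CFI", "> 0.9"), ("TLI", "> 0.9"), ("RMSEA", "< 0.05")):
        print(f"{name} = {stats[name].iloc[0]:.3f} (criterion: {criterion})")
```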
Table 9. Pearson correlation analysis results.
Dimension | IE | CQ | LO | TQ | TS | LM | PU | PEOU | UE | ASU
IE | 1
CQ | 0.324 ** | 1
LO | 0.347 ** | 0.368 ** | 1
TQ | 0.347 ** | 0.332 ** | 0.390 ** | 1
TS | 0.377 ** | 0.387 ** | 0.392 ** | 0.539 ** | 1
LM | 0.299 ** | 0.274 ** | 0.327 ** | 0.332 ** | 0.385 ** | 1
PU | 0.072 | 0.073 | 0.015 | 0.056 | 0.009 | 0.041 | 1
PEOU | 0.052 | 0.061 | 0.017 | 0.007 | 0.06 | 0.059 | 0.634 ** | 1
UE | 0.034 | 0.098 * | 0.057 | 0.003 | 0.048 | 0.063 | 0.603 ** | 0.616 ** | 1
ASU | 0.021 | 0.114 * | 0.075 | 0.006 | 0.081 | 0.023 | 0.486 ** | 0.506 ** | 0.629 ** | 1
* Correlation is significant at the 0.05 level (2-tailed); ** Correlation is significant at the 0.01 level (2-tailed).
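The starred coefficients in Table 9 combine Pearson’s r with two significance thresholds; a compact sketch of how such a matrix can be produced from dimension-level composite scores (assumed to be stored in a DataFrame `scores`) is shown below.

```python
# Sketch of a Pearson correlation matrix with significance stars, as in Table 9
# (* p < 0.05, ** p < 0.01, two-tailed). `scores` holds one composite score per
# respondent for each dimension (IE, CQ, ..., ASU); only the lower triangle is filled.
import pandas as pd
from scipy.stats import pearsonr

def correlation_with_stars(scores: pd.DataFrame) -> pd.DataFrame:
    cols = list(scores.columns)
    out = pd.DataFrame("", index=cols, columns=cols)
    for i, a in enumerate(cols):
        for b in cols[: i + 1]:
            r, p = pearsonr(scores[a], scores[b])
            stars = "**" if p < 0.01 else "*" if p < 0.05 else ""
            out.loc[a, b] = f"{r:.3f} {stars}".strip()
    return out
```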
Table 10. Results of discriminant validity test.
Item | TS | IE | CQ | LO | TQ | LM
TS | 0.785
IE | 0.386 *** | 0.722
CQ | 0.44 *** | 0.458 *** | 0.755
LO | 0.426 *** | 0.491 *** | 0.518 *** | 0.791
TQ | 0.462 *** | 0.514 *** | 0.497 *** | 0.593 *** | 0.785
LM | 0.4 *** | 0.407 *** | 0.368 *** | 0.451 *** | 0.481 *** | 0.782
AVE | 0.617 | 0.521 | 0.570 | 0.626 | 0.616 | 0.611
*** p < 0.001.
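Table 10 applies the Fornell–Larcker criterion: each diagonal entry is the square root of that construct’s AVE and should exceed its correlations with the other constructs. A worked check on an excerpt of the reported values is given below.

```python
# Worked check of the Fornell–Larcker criterion behind Table 10: sqrt(AVE) on the
# diagonal (e.g., sqrt(0.617) ≈ 0.785 for TS) must exceed the inter-construct correlations.
import math

ave_values = {"TS": 0.617, "IE": 0.521, "CQ": 0.570, "LO": 0.626, "TQ": 0.616, "LM": 0.611}
correlations = {("IE", "TS"): 0.386, ("CQ", "TS"): 0.44, ("CQ", "IE"): 0.458}  # excerpt

for (a, b), r in correlations.items():
    ra, rb = math.sqrt(ave_values[a]), math.sqrt(ave_values[b])
    print(f"{a}-{b}: r = {r:.3f}, sqrt(AVE) = {ra:.3f}/{rb:.3f}, "
          f"discriminant validity: {r < min(ra, rb)}")
```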
Table 11. Reliability Analysis Results for Each Dimension of the Actual System Use Model.
Item | PU | PEOU | UE | ASU | Total
Alpha | 0.856 | 0.849 | 0.839 | 0.865 | 0.926
N of Items | 4 | 4 | 4 | 4 | 16
Table 12. Validity analysis: KMO and Bartlett’s test results of the Actual System Use Model.
Kaiser–Meyer–Olkin Measure of Sampling Adequacy: 0.932
Bartlett’s Test of Sphericity: Approx. Chi-Square = 4004.985, df = 120, Sig. = 0.000
Table 13. Results of SEM fitting index analysis.
Statistical Indicators | CMIN/DF | RMSEA | IFI | TLI | CFI | PNFI | PCFI
Index of fit | 1.564 | 0.035 | 0.934 | 0.93 | 0.934 | 0.796 | 0.888
Criteria of fit | 1–3 | <0.05 | >0.9 | >0.9 | >0.9 | >0.5 | >0.5
Is it up to standard? | Yes | Yes | Yes | Yes | Yes | Yes | Yes
Table 14. SEM path relationship test results.
Path | Estimate | S.E. | C.R. | p
PEOU ← CQ | 0.192 | 0.03 | 5.29 | ***
PEOU ← IE | 0.123 | 0.079 | 2.033 | 0.042
PEOU ← LM | 0.261 | 0.032 | 6.76 | ***
PEOU ← TS | 0.267 | 0.029 | 7.076 | ***
PEOU ← TQ | 0.193 | 0.029 | 5.177 | ***
PEOU ← LO | 0.142 | 0.031 | 3.625 | ***
PU ← PEOU | 0.701 | 0.064 | 11.829 | ***
PU ← CQ | 0.193 | 0.029 | 5.308 | ***
PU ← IE | 0.568 | 0.049 | 13.727 | ***
PU ← LM | 0.401 | 0.05 | 9.077 | ***
PU ← TS | 0.164 | 0.043 | 3.661 | ***
PU ← TQ | 0.114 | 0.059 | 1.971 | 0.049
PU ← LO | 0.246 | 0.029 | 6.235 | ***
UE ← PU | 0.372 | 0.077 | 5.56 | ***
UE ← PEOU | 0.475 | 0.085 | 6.83 | ***
*** p < 0.001.
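The path estimates in Table 14 correspond to the structural relations of the ASU model (the six UE dimensions predicting PEOU, the dimensions and PEOU predicting PU, and PU/PEOU predicting UE). A minimal sketch of this structural part, again assuming semopy and treating dimension-level scores as observed variables for brevity, is given below; it is not the authors’ original AMOS specification.

```python
# Sketch of the structural relations tested in Table 14, assuming semopy and treating
# the dimension scores (IE, CQ, LO, TQ, TS, LM) as observed composites for brevity;
# in the full model they are the latent constructs defined by the measurement part.
import pandas as pd
import semopy

STRUCTURAL_MODEL = """
PEOU ~ CQ + IE + LM + TS + TQ + LO
PU ~ PEOU + CQ + IE + LM + TS + TQ + LO
UE ~ PU + PEOU
"""

def fit_structural(data: pd.DataFrame) -> pd.DataFrame:
    model = semopy.Model(STRUCTURAL_MODEL)
    model.fit(data)
    # inspect() returns the Estimate, Std. Err, z-value (the C.R. analogue), and p-value
    # for each specified relation, comparable in layout to Table 14.
    return model.inspect()
```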
