Article

How to Promote Online Education through Educational Software—An Analytical Study of Factor Analysis and Structural Equation Modeling with Chinese Users as an Example

School of Design, Jiangnan University, Wuxi 214122, China
*
Author to whom correspondence should be addressed.
Systems 2022, 10(4), 100; https://doi.org/10.3390/systems10040100
Submission received: 31 May 2022 / Revised: 2 July 2022 / Accepted: 6 July 2022 / Published: 11 July 2022

Abstract

Online learning has emerged as an effective way to avoid gatherings of teachers and students, and thus the spread of the virus, under the ongoing influence of COVID-19. How to increase users’ willingness to participate in online learning through online class APPs, identify the variables that affect their use, and create a useful assessment scale is a problem worth considering. In this study, combining qualitative and quantitative research, evaluations were collected from 68 students who had used an online class APP, and 200 online questionnaires were distributed to complement the interview findings; on this basis, 328 assessment questionnaires were gathered and 23 valid items were obtained. The factors affecting users’ online learning experience were identified using factor analysis, and the relationships among them were investigated using structural equation modeling. Perceived benefits are the main influencing factor; subjective norms and functional quality directly influence users’ perceived benefits; and self-efficacy is influenced by subjective norms while promoting the perception of functional quality. The factors influencing users’ use of online class APPs are ultimately identified as perceived benefits, functional quality and self-efficacy. To facilitate users’ online learning, user psychological traits, social ties and software functions should be integrated into a cohesive system when designing online class APPs.

1. Introduction

The COVID-19 epidemic disrupted the education of 90 percent of the world’s school-age children, and traditional classroom-based school systems in 26 countries were completely shut down [1]. In China, however, a large portion of educational resources was transferred to online delivery, which has contributed to the development of online education to some extent. With the advancement of mobile Internet technology, online education is seen as a self-directed technique that enables users to complete their learning using small, portable wireless devices [2]. Given the tremendous growth of mobile users globally, online education has a bright future [3]. As a result, it is critical to identify the elements that affect users’ use of online class APPs. Although there have been many studies on online education and online class APPs since the outbreak of COVID-19, most of the relevant studies have focused on specific professional disciplines, such as medical care [4,5] and teacher education [6,7,8,9], while studies on online class APPs have primarily focused on the features of the software itself, such as usability [10,11,12] and interface design [13]. A systematic and comprehensive discussion of the factors influencing users’ use of online class APPs is therefore still lacking.
At the same time, previous research on online class APPs has primarily been quantitative [14,15,16,17]. Such methods can explain how a given factor or theory influences users’ willingness to use, but they cannot show that users actually attend to that element when using the software. Moreover, the questionnaire scales of the original theoretical models may not be entirely suitable for evaluating online class APPs, and single research approaches have several disadvantages. For this reason, a combined quantitative and qualitative approach is taken in this study. Through qualitative methods such as user interviews, open-ended online questionnaires, and summarization and discussion grounded in existing theory, the factors that users truly care about when assessing online class APPs are established and appropriate assessment scales are developed. Quantitative techniques such as data analysis and hypothesis testing then support the qualitative findings, and the relationships among the influencing elements are examined to provide guidance for the design and operation of online class APPs.
The remainder of this study is organized as follows. Section 2 analyzes the development status of online class APPs and the relevant theoretical models in detail, together with the shortcomings of current research, and argues for the significance of this study and the rationality of its methods. Section 3 describes the participants, the research design and methodologies, and briefly summarizes the findings of the data analysis. Section 4 details the results of each analysis and the corresponding judgment indicators. Section 5 draws on the literature to name the factors obtained from the data analysis in Section 4, explores the relationships among the factors using structural equation modeling, and offers management recommendations. Section 6 provides a brief summary of the study, pointing out its existing shortcomings and future directions.

2. Literature

2.1. Current Status of Online Education Development

The Internet has had a significant impact on the evolution of education [18,19], with online education being the most common. Online learning can take place in classrooms with teachers and students face to face, where digital media are used to complete learning tasks, communication and interaction [20], or as distance education conducted through various communication tools, without face-to-face communication or interaction between teachers and students [21]. Distance online education is the emphasis of this study, and Massive Open Online Courses (MOOCs) are the most common tool for instructors to conduct teaching over the Internet [22,23], with features such as video lecture recording, discussion boards and real-time chats [24]. This technology has the potential to benefit students in locations where educational resources are limited [25], while enhancing the influence of educational institutions. It represents an attempt to move away from face-to-face education towards online learning [26]. Although the effectiveness and attendance of online education were initially questioned [27], as the field has grown, more universities are allocating a portion of their course credits to online education while issuing corresponding certificates of completion [28,29], or increasing users’ motivation to participate in online education using the brand influence of top universities [30]. At the same time, open teaching resources help users to carry out online learning at any time [31].
Since COVID-19 was first reported in December 2019, the number of infections has continued to rise [32], prompting the rapid development of online education, which some studies have shown to be an effective means of avoiding the congregation of students and the spread of the virus; these studies indicate that during the epidemic the learning experience can be improved [33,34,35] and users’ willingness to learn can be increased through online education [36]. According to Bojović et al. [37], online education must be built on a unique technological, social and economic context, which may not offer teachers enough time to change their original teaching plans, thereby reducing the efficiency of online education [38,39,40]. During COVID-19, a considerable amount of research on online education was presented; however, the majority focused on certain professional fields, such as health care [4,5] and teacher education [6,7,8,9]. The influence of online courseware, an essential technical instrument for online education, on users’ willingness to engage in online classes has, on the other hand, received less attention. According to research, applied technologies play a significant role in online learning [41], and Phillips et al. [42] suggest that technologies have now become an integral and crucial aspect of higher education. However, the majority of relevant research on online course APPs is still centered on user experience. For example, Liu [43] states that school categorization, course discipline division, text, icon density, interface color saturation and other design components should be prioritized in the user interface design of Chinese university MOOC (Moke) platforms and Coursera in China.
Wang [14] argues that the key elements of live course software user experience are ease of operation, functional completeness, interface rationality and technical reliability; Zaharias [44] evaluates interactive content, instructional feedback, assessment navigation, visual design, learner guidance, the design accessibility of support-learning strategies and learnability. Several studies concentrate on the usability and use of online classroom systems [10,11,12]. However, there is a lack of systematic and thorough discussion on the variables affecting users’ usage of online class APPs and the corresponding assessment scales of their desire to use, while research on the software is still restricted to the programs themselves. As a result, factor analysis and structural equation modeling are used in this study to determine the specific factors that influence users’ use of online class APPs as well as the relationships among the factors, so as to provide some guidance for the design and operation of online class APPs.

2.2. The Use of Pertinent Theories in Online Classes

Online class APPs have developed quickly in tandem with the rapid development of the Internet and the ongoing influence of COVID-19. The factors affecting users’ use of online class APPs have begun to be investigated in numerous studies, to which numerous behavioral theoretical models have been applied. Lew et al. [14] built a model of continued use and examined its correlation with users’ intention to use, drawing on perceived behavioral control from the Theory of Planned Behavior (TPB) [45], perceived usefulness and perceived ease of use from the Technology Acceptance Model (TAM) [46], self-efficacy from Social Cognitive Theory (SCT) [47] and enjoyment from Motivation Theory [48]. In a study on users’ willingness to learn online, Chaker et al. [15] discovered that the mind flow dimension was a key mediator, based on Mind Flow Theory [49] and social intention; to investigate users’ opinions regarding the usage of online education, Xu et al. [16] combined perceived value and TAM from the teachers’ perspective. To investigate the characteristics influencing users’ desire to utilize desktop webinars for online learning, Lakhal and Khechine [50] expanded the Unified Theory of Acceptance and Use of Technology (UTAUT) created by Venkatesh et al. [51]. To create a model of students’ satisfaction with online learning, Kang and Park [52] combined the ISS [53], PELS [54] and EESS [55] models through Generalized Structural Equation Modeling (GSEM) analysis and discovered that the value of instructor feedback was crucial to students’ preferences for online learning. To analyze the links among the aspects influencing the online learning community, Liu et al. [56] added user interface design, experience and perceived interaction features to the TAM model; to create an enhanced TAM explaining users’ purpose and behavior when utilizing online services, Agudo-Peregrina et al. [57] incorporated aspects including personal innovativeness and perceived interaction in the information technology sector. Combining the Information Systems Expectation Confirmation Model [58], the Cognitive Model [59], the Theory of Technology Persistence [60] and the Information Systems Success Model [61], Dahan and Akkoyunlu [62] investigated students’ desire to continue using online learning environments. To investigate Chinese university students’ intentions to continue learning online, Dai et al. [63] additionally introduced curiosity factors from the perspective of the Expectation Confirmation Model. Nikolopoulou et al. [17] expanded the UTAUT model to investigate the intentions influencing teachers’ use of mobile Internet education and discovered that habits, hedonic motivation, performance expectations and knowledge of technological pedagogy significantly predicted teachers’ intentions to use the mobile Internet; Hu et al. [64] used the TAM model to contrast the e-learning intentions of anxious and non-anxious college students.
Multiple theoretical models may be employed for intention-based research on online class APPs, and a review of prior works shows that such research is primarily based on TAM. Previous studies have mainly used quantitative methods, in which hypotheses are developed from social phenomena or a compilation of literature and then validated through the analysis of quantitative data. However, this method cannot be directly applied to the current study, which aims to construct a set of user evaluation scales for online class APPs and determine the influencing factors that really affect users’ willingness to use them. Traditional quantitative methods can only prove, by means of hypothesis validation, that relevant influencing factors or models can explain users’ usage behavior, i.e., that a certain element can influence users’ behavior. They do not establish that such elements are the key to users’ willingness to use, i.e., that users will definitely pay attention to them when evaluating online class APPs. For this reason, a combination of qualitative and quantitative approaches is adopted in this study: user interviews, user research and literature-based discussions in the qualitative stage determine the elements that users really pay attention to when evaluating online class APPs and yield an evaluation scale more compatible with them; data analysis and hypothesis validation in the quantitative stage then provide a relatively objective theoretical and empirical basis for the results, while the influence relationships among the factors are analyzed to make up for the shortcomings of previous studies to some extent. With this research method, the existing questionnaire scales of influencing factors are not required.

3. Materials and Methods

3.1. Participants

From May 2021 to March 2022, the researchers randomly interviewed a sample of 68 students (20 males and 48 females) who had attended online classes using an APP at a university in Wuxi, Jiangsu Province, China. An open-ended questionnaire was then distributed on a Chinese questionnaire-collecting site to obtain random user assessments of education APPs and thus overcome the limitation that the respondents were mostly students. There were 70 men and 130 women, whose ages were mostly concentrated in the 21–30-year-old group (62%) and the 31–40-year-old group (20%). All sorts of employment were represented, with students accounting for 40% and private-business employees for 30%. The data were compiled and summarized, and new questionnaires were then released for a factor analysis. A total of 356 questionnaires were collected; after deleting invalid questionnaires (short answer times, duplicate answers, missing answers and so on), 328 valid questionnaires remained. The response rate was 92 percent, and the sample size was sufficient for factor analysis [65,66,67,68,69]. In total, 31.5 percent of the respondents were men and 68.5 percent were women. The 21–30-year-old group accounted for 58.72 percent of all responses, followed by the 31–40-year-old group at 30.28 percent and the above-40-year-old group at just 6.43 percent. Respondents with a high school diploma or less made up 3.67 percent, those with a college or bachelor’s degree 86.23 percent, and those with a degree above bachelor’s 10.09 percent. The authors performed the initial user interviews, and three graduate students unconnected to this study carried out both the interview summary and the online open-ended questionnaire content summary.
Following the aggregation, the collected information was reviewed and evaluated by two university professors. Since the COVID-19 epidemic, both have frequently used online class APPs for online teaching; they have also studied through such APPs and have a good understanding of online learning. Both professors are from national key discipline institutions of design, have more than three years of teaching experience, and their primary research interests are user behavior and human–computer interaction. They also judged whether the final factor-naming discussion was logical. The expert details are shown in Table 1.

3.2. Measures

This study primarily employed factor analysis, a branch of multivariate statistics commonly used in psychology to address the correlation structure of cognitive variables [69]; it is utilized extensively in psychology [69] and behavioral science [70]. Meanwhile, the factor analysis conducted by Wei et al. [68] provided detailed and specific descriptions of the analysis methods, result corrections and final results in both the initial opinion-collection stage and the later data analysis, obtained satisfactory analysis results, and likewise explored the factors affecting users’ willingness to use through factor analysis. Because this fits the present study well, the research process here mainly follows the experimental method of Wei et al. [68]. The whole factor analysis process is divided into three stages: in the first stage, users’ evaluations of the usage experience of education APPs are collected on a large scale and organized and summarized into evaluation items; in the second stage, the summarized items are used as evaluation scales for secondary questionnaires; and in the third stage, the influencing factors are extracted through data analysis and, based on the content of the items corresponding to each factor and a large body of literature, the factors are named so as to determine the specific factors that affect users’ usage of education APPs.
The research process primarily follows Wei et al.’s [68] experimental method, with offline user interviews added to the original procedure to better understand users’ inner feelings about using online class APPs; the number of interview samples far exceeds the standards proposed by Slaton [69] and Mellor [71]. The sample size in the open-ended online questionnaire collection phase exceeds Wei et al.’s [68] experimental sample criterion, and the sample size for the factor analysis exceeds Comrey et al.’s [72,73] criterion of more than 10 times the number of questions, which has been used in a large number of related studies [61,72,73].
This study was conducted through user interviews and an open-ended questionnaire distributed across a vast region utilizing Credamo, a large Chinese domestic questionnaire-collecting platform, so as to obtain a large number of users’ opinions on using online class APPs at the start of the study. During the offline user interview phase, semi-structured interviews with the key questions presented in Table 2 were conducted with a random sample of respondents. Before each interview began, users were told the aim of the interview, and interviewees were asked to sign an informed consent form; the same form was included as the first page of the online open-ended questionnaire, and respondents had to select the “informed and agreed” option before they could reply. The open-ended questionnaire asked, “What do you believe are the advantages or disadvantages of utilizing an APP to attend online classes (please fill in three or more comments)?” The question “Have you ever used an APP to attend online classes?” was used to assess questionnaire validity. In the content summary stage, three graduate students subjectively evaluated the content and combined identical or similar user comments. The content was then evaluated and reviewed by the university professors, and the summary results were revised by the participants of the study, together with the three graduate students and university professors, based on the results of the professors’ review and discussion, until the results were unobjectionable. The summarized user assessments were referred to as “items” in the study, whereas the grouping results of the later factor analysis were referred to as “factors”.
After combining the results of the two parts, 237 original evaluations yielding 17 items were obtained in the offline interview stage, while 975 original evaluations yielding 21 items were obtained from the open-ended online questionnaire, with the overall items shown in Figure 1; all negative evaluations were adjusted to positive wording.
Figure 1 reveals that Q1–Q4 are each stated more than 100 times in the first phase, which indicates that they are the most fundamental, superficial and easily found advantages and characteristics of online class APPs for users. Q1 is cited 228 times, much more frequently than the other items, which means that “APP-based online classes are free from outside influences” is the most intuitive feature in users’ evaluation of online class APPs. On closer examination, this evaluation is not produced by the functional aspects of an online class APP itself, but is rather an inescapable outcome of the learning behavior that takes place during online classes. Since the outbreak of COVID-19, most users’ online class learning has taken place at home alone, reducing outside factors such as virus transmission and the disruption of classroom order by other students. Therefore, although this point is frequently raised, it is difficult to translate into practical changes to the actual layout and functionality of an online class APP. Q19–Q21 were mentioned the fewest times (no more than 10 each), indicating that these three features are more deeply hidden and less often discovered by users; they are also of less concern to current online class APP operators, and therefore hold greater potential. If APP operators increase investment in this area, the user experience may improve accordingly.
The questionnaire used in the data-collection phase of the factor analysis comprised the basic information of respondents and the content of Figure 1, assessed on a seven-point Likert scale, with 1 indicating complete disagreement and 7 indicating complete agreement.

3.3. Data Analysis

A total of 328 responses were analyzed in this study using the professional data analysis program SPSS 26.0, and four factors with characteristic roots (eigenvalues) greater than one were identified through factor analysis. The post-rotation variance explanations of the four factors were 19.814 percent, 15.985 percent, 11.889 percent and 11.231 percent, respectively, and the total explanation rate was 58.919 percent, with KMO > 0.6 [74]. Both the individual reliability of the factors and the overall reliability of the questionnaire scale were higher than the criterion of 0.6 [75], with a df-value of 120 and a p-value of 0.000, which met the requirements of Bartlett’s test of sphericity (p < 0.05) [76]; all standardized loading coefficients were greater than 0.5 in the confirmatory factor analysis [76,77], with AVE > 0.36 [78] and CR > 0.6 [78,79].
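The extraction rule above (retaining factors whose characteristic roots exceed one) can also be sketched outside SPSS. The following minimal Python/numpy example applies the Kaiser criterion to an illustrative correlation matrix; the matrix and its values are invented for demonstration and are not the study’s data:

```python
import numpy as np

def kaiser_factor_count(R):
    """Retain factors whose characteristic root (eigenvalue of the
    correlation matrix) exceeds one, per the Kaiser criterion."""
    roots = np.sort(np.linalg.eigvalsh(R))[::-1]   # descending order
    return int(np.sum(roots > 1.0)), roots

# Illustrative 4-variable correlation matrix (not the study's data):
# two tightly correlated pairs, so two roots should exceed one.
R = np.array([[1.0, 0.8, 0.1, 0.1],
              [0.8, 1.0, 0.1, 0.1],
              [0.1, 0.1, 1.0, 0.7],
              [0.1, 0.1, 0.7, 1.0]])
n_factors, roots = kaiser_factor_count(R)
print(n_factors)                               # prints 2
print(np.round(100 * roots / roots.sum(), 1))  # % variance per root
```

The percentage line mirrors the “variance explanation” figures reported in the paper: each root divided by the trace of the correlation matrix gives that factor’s share of total variance before rotation.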

4. Results

4.1. Results of Exploratory Factor Analysis

The overall Cronbach alpha coefficient of all items in this study was 0.889, indicating good scale reliability [80]. The KMO and Bartlett’s test results revealed a KMO of 0.903, a Bartlett’s approximate chi-square of 2294.556, a df-value of 210 and a p-value of 0.00, all of which satisfied Bartlett’s sphericity requirement, indicating that the study data were suitable for factor analysis [74].
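For readers without SPSS, the KMO measure and Bartlett’s sphericity statistic reported above can be reproduced from a correlation matrix with a short sketch. The correlation matrix below is illustrative only, not the study’s data; with 21 items the degrees of freedom would be 21 × 20 / 2 = 210, matching the paper’s df-value:

```python
import numpy as np
from scipy.stats import chi2

def bartlett_sphericity(R, n):
    """Bartlett's test of sphericity: is R an identity matrix?"""
    p = R.shape[0]
    stat = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) // 2
    return stat, df, chi2.sf(stat, df)

def kmo(R):
    """Kaiser-Meyer-Olkin measure of sampling adequacy."""
    inv = np.linalg.inv(R)
    # anti-image (partial) correlations from the inverse of R
    partial = -inv / np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
    off = ~np.eye(R.shape[0], dtype=bool)
    r2, p2 = np.sum(R[off] ** 2), np.sum(partial[off] ** 2)
    return r2 / (r2 + p2)

# Illustrative 3-item correlation matrix (not the study's data)
R = np.array([[1.0, 0.6, 0.5],
              [0.6, 1.0, 0.55],
              [0.5, 0.55, 1.0]])
stat, df, p_val = bartlett_sphericity(R, n=328)
print(df, round(p_val, 4))   # df = p(p-1)/2 = 3; p-value near 0
print(round(kmo(R), 3))      # around 0.7 for this matrix
```

A significant Bartlett result (p < 0.05) rejects the hypothesis that the items are uncorrelated, and a KMO above 0.6 indicates that partial correlations are small relative to raw correlations, i.e., the data are factorable.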
In the factor analysis, the correspondence between items and factors was found through varimax rotation. As shown in Table 2, a communality value higher than 0.4 for all research items means that there is a strong correlation between the items and the factors, and an absolute factor-loading coefficient greater than 0.4 [81] means that there is a good correspondence between items and factors. Ideally, each item should correspond to only one factor, with an absolute factor-loading coefficient greater than 0.4 on that factor alone. When an item’s absolute factor-loading coefficient is greater than 0.4 on multiple factors, factor entanglement exists, and some entangled items need to be removed to optimize the results [82,83].
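The cross-loading rule just described (an item is entangled when its absolute loading exceeds 0.4 on more than one factor) amounts to a simple row-wise check over the rotated loading matrix. The matrix below is hypothetical, chosen only to show the check:

```python
import numpy as np

def entangled_items(loadings, threshold=0.4):
    """Indices of items whose absolute loading exceeds the
    threshold on more than one factor (factor entanglement)."""
    hits = np.abs(np.asarray(loadings)) > threshold
    return [i for i, row in enumerate(hits) if row.sum() > 1]

# Hypothetical rotated loading matrix: rows are items, columns factors.
L = np.array([[0.72, 0.10],   # loads cleanly on factor 1
              [0.65, 0.12],
              [0.45, 0.52],   # entangled: above 0.4 on both factors
              [0.08, 0.78]])  # loads cleanly on factor 2
print(entangled_items(L))     # prints [2]
```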
The explained variance after rotation of these four factors was 19.540 percent, 13.380 percent, 10.332 percent and 10.167 percent, respectively (53.420 percent cumulatively), and the factor characteristic roots were 4.103, 2.810, 2.170 and 2.135, all greater than 1. However, some items were found to be factor-entangled when varimax rotation was used to determine the correspondence between items and factors. Therefore, factor-entangled items were removed from this study, following the experiments of Joshi [84] and Wei et al. [68] as a guide. The enumeration method was used in the deletion process: a factor analysis was carried out after each entangled item was removed, and this continued until the factor entanglement vanished, the indexes satisfied the research criteria, and the number of removed items was kept to a minimum.
After deleting items Q1, Q5, Q12, Q15 and Q18 one by one, the results of the factor analysis improved, as shown in Table 3 and Table 4. Factor 1 contains six items (Q2, Q10, Q8, Q6, Q17 and Q9); Factor 2 contains four items (Q11, Q16, Q21 and Q19); Factor 3 contains three items (Q13, Q14 and Q20); and Factor 4 contains three items (Q3, Q4 and Q7). The eigenvalues of these four factors are 3.17, 2.558, 1.902 and 1.797, respectively, with explained variances of 19.814 percent, 15.985 percent, 11.889 percent and 11.231 percent after rotation. The overall explained variance rose from 53.420 percent to 58.919 percent. The aggregate Cronbach alpha coefficient of the remaining items in the reliability study was 0.866, and the independent reliabilities of the four factors were 0.829, 0.759, 0.669 and 0.634, all greater than the criterion of 0.6 [75]. KMO was greater than 0.6, the Bartlett spherical approximation chi-square was 1657.688, the df-value was 120 and the p-value was 0.000, all of which fulfilled the criteria of Bartlett’s test of sphericity (p < 0.05). These findings demonstrate that, after removing some items to optimize the results, the data meet the factor analysis standards [74].

4.2. Confirmatory Factor Analysis

To ensure that the four factors obtained in the factor analysis have good discriminant validity and that the items contained in each factor correlate well, a confirmatory factor analysis was conducted on the factors. The factor-loading coefficients indicate the correlation of items within factors, as shown in Table 5: all standardized loading coefficients are greater than 0.5 [76,77], indicating good significance of the items, while all AVE values are higher than 0.36 [78] and all CR values are higher than 0.6 [78,79], indicating good correspondence and convergent validity between the factors and items. As shown in Table 6, in the AVE square-root test the underlined diagonal entries of the table are the square roots of the AVE values, and all of them exceed the corresponding correlation coefficient values, meeting the validation requirements and indicating that the factors have good discriminant validity.
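The AVE, CR and square-root-of-AVE (Fornell-Larcker) checks used above follow directly from the standardized loadings. A minimal sketch with hypothetical loadings and inter-factor correlations (none of these numbers are taken from the study’s tables):

```python
import numpy as np

def ave_cr(loadings):
    """Average variance extracted (AVE) and composite reliability (CR)
    from the standardized loadings of one factor's items."""
    lam = np.asarray(loadings, dtype=float)
    ave = np.mean(lam ** 2)
    cr = lam.sum() ** 2 / (lam.sum() ** 2 + np.sum(1 - lam ** 2))
    return ave, cr

# Hypothetical standardized loadings for one factor's three items
ave, cr = ave_cr([0.75, 0.68, 0.61])
print(round(ave, 3), round(cr, 3))   # -> 0.466 0.722

# Fornell-Larcker check: sqrt(AVE) should exceed this factor's
# correlations with the other factors (hypothetical values).
print(np.sqrt(ave) > max([0.41, 0.35, 0.28]))   # -> True
```

Here AVE is the mean squared loading (share of item variance captured by the factor), and CR weighs the squared sum of loadings against the summed error variances; the thresholds cited in the paper (AVE > 0.36, CR > 0.6) are then simple comparisons against these two values.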

5. Discussion

5.1. Discussion of the Results of the Factor Analysis

Following the factor analysis and confirmatory factor analysis, the four factors summarized in the study fully met the research criteria. On this basis, this study attempted to name and discuss the factors according to the content of the items they include, so as to identify the factors influencing users’ use of online class APPs.
First, Factor 1 contains six items, each of which demonstrates a positive perception of the software in terms of communication and collaboration, learning efficiency, learning quality, learning atmosphere, course level and price when users take classes through an APP, demonstrating the benefits experienced by users; thus, Factor 1 is referred to as perceived benefits in this study, a cognitive emotion that positively affects an individual’s behavior [85] and a prediction of the positive results of taking recommended actions [86]. Perceived benefits are usually divided into utilitarian benefits and hedonic benefits. Utilitarian benefits are instrumental, i.e., users’ perceptions of the functionality, convenience and value for money of an APP as a means to reach their goals, which aligns with Q2, Q10, Q8, Q6 and Q17; hedonic benefits are non-instrumental and often involve experience and emotions, which aligns with Q9 [87,88].
Second, Factor 2 contains four items involving the main interpersonal relationships of students in the learning process (teachers, parents and classmates), covering learning supervision, relationship maintenance and the influence of others’ attention on one’s own learning; therefore, Factor 2 is named subjective norms in this study, which generally refer to the social pressure an individual feels when considering whether or not to perform a specific action [89]. Because an individual’s intention to perform a specific behavior is easily influenced by the opinions of family, friends, communities and government agencies, a number of researchers have demonstrated a positive relationship between subjective norms and behavioral intentions [90,91].
Third, Factor 3 contains three items reflecting the richness, usefulness and compatibility of online class APPs, i.e., the form in which the technologies contained in the APPs are presented to users, as well as users’ perception of the specific process and experience of receiving the service [92]. This performance is similar to the concept of functional quality proposed by Gronroos [93] in the service sector, interpreted as the process through which customers experience receiving a service and achieving purposes such as production or consumption; likewise, Bernardo [94] describes functional quality in websites as a specific way of helping users achieve their goals, which is close to this study. Factor 3 is therefore named functional quality.
Finally, Factor 4 contains three items suggesting that users feel they can use online class APPs effectively for learning, in terms of operability, course control and their own in-class state, respectively; hence it is named self-efficacy in this study. Bandura was the first to propose self-efficacy theory [95]. Self-efficacy refers to an individual’s appraisal of his or her capacity to perform at a given level, i.e., the degree to which a user feels able to accomplish a task when presented with it [96]. The three items of Factor 4 show that users feel they can quickly grasp the software’s operation, freely manage their course progress and maintain a good learning state, which matches the idea of self-efficacy.
The final naming of these four factors was also approved by the two expert professors mentioned in the previous section.

5.2. Discussion on the Relationship among Factors

Based on the results of the factor analysis, this study further investigates the relationships among the factors. According to an extensive literature review, self-efficacy is essentially a psychological state of the user: confidence in one’s own competence [96]. Some studies have shown that an individual’s intrinsic motivation can significantly affect the predicted use of a particular technology [97], and individuals’ determination and confidence play a role in the adoption and long-term use of relevant technological applications [98]. In the health domain in particular, a large body of research has shown that stronger self-efficacy effectively improves followers’ perceptions of their own bodily functions [99,100].
Self-efficacy can promote individuals’ perceptions of the usefulness of application functions [98], and reliable, effective service functions can in turn promote individuals’ perceptions of service benefits [101,102]; in the educational field, studies have likewise shown that students with higher self-efficacy achieve better outcomes under corresponding teaching methods [103,104]. Maheshwari [105] and Valtonen et al. [106] have indicated that subjective norms positively affect self-efficacy, and subjective norms can also effectively increase perceptions of effectiveness [107,108]. Therefore, the following hypotheses are proposed in this study.
Hypothesis 1:
Users’ self-efficacy positively affects the perceived benefits of online class APPs.
Hypothesis 2:
Users’ self-efficacy positively affects the functional quality of online class APPs.
Hypothesis 3:
The functional quality of an online class APP positively influences the perceived benefits.
Hypothesis 4:
Users’ subjective norms positively influence self-efficacy.
Hypothesis 5:
Users’ subjective norms positively influence the perceived benefits of an online class APP.
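Taken together, H1–H5 define a path model among the four factors. As an illustrative sketch (not the authors’ actual code), the hypothesized structure could be written in the lavaan-style syntax accepted by SEM packages such as semopy; the variable names are placeholders, and H5’s outcome is read as the perceived-benefits construct discussed throughout the paper:

```python
# Hypothesized structural model (H1-H5) in lavaan/semopy-style syntax.
# Variable names are illustrative placeholders, not the authors' labels.
MODEL_DESC = """
perceived_benefits ~ self_efficacy + functional_quality + subjective_norms  # H1, H3, H5
functional_quality ~ self_efficacy                                          # H2
self_efficacy ~ subjective_norms                                            # H4
"""

def regressions(desc):
    """Parse 'outcome ~ predictor + ...' lines into {outcome: [predictors]}."""
    paths = {}
    for line in desc.splitlines():
        line = line.split("#")[0].strip()  # drop trailing comments
        if "~" in line:
            lhs, rhs = line.split("~")
            paths[lhs.strip()] = [p.strip() for p in rhs.split("+")]
    return paths
```

A string in this form can be passed directly to a fitting routine such as `semopy.Model`; the small parser above merely makes the hypothesized paths explicit.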
The structural equation model linking the factors was analyzed using SPSS 26.0 in this study, and the findings are displayed in Table 7. Table 8 shows the overall fit measures: p = 0.075 > 0.05, χ2/df = 1.186 < 3, GFI = 0.954 > 0.9, RMSEA = 0.024 < 0.10, RMR = 0.023 < 0.05, CFI = 0.986 > 0.9, NFI = 0.92 > 0.9, and NNFI = 0.983 > 0.9; all values are within acceptable limits. H2–H5 have p-values below 0.05, and all of these paths are significant. Thus, all hypotheses except H1 are supported. The path model is shown in Figure 2.
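The threshold comparisons quoted from Table 8 can be sketched as a simple acceptance check; both the fitted statistics and the cutoffs below are exactly those quoted in the text:

```python
# Reported global fit statistics (Table 8).
FIT = {"p": 0.075, "chi2_df": 1.186, "GFI": 0.954, "RMSEA": 0.024,
       "RMR": 0.023, "CFI": 0.986, "NFI": 0.92, "NNFI": 0.983}

# (index, cutoff, larger_is_better) as stated in the text.
CUTOFFS = [("p", 0.05, True), ("chi2_df", 3.0, False), ("GFI", 0.9, True),
           ("RMSEA", 0.10, False), ("RMR", 0.05, False), ("CFI", 0.9, True),
           ("NFI", 0.9, True), ("NNFI", 0.9, True)]

def acceptable(fit):
    """True if every fit index meets its conventional cutoff."""
    return all((fit[k] > c) if larger else (fit[k] < c)
               for k, c, larger in CUTOFFS)
```

Running `acceptable(FIT)` confirms that every index reported in Table 8 clears its threshold.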
H1 is not supported, while H2, H3 and H5 are, indicating that in users’ use of APPs for online lessons, users’ subjective norms and the functional quality of the APPs are the direct factors affecting users’ perceived benefits; users’ self-efficacy while learning through an APP cannot directly affect their perceived benefits, but does so indirectly through functional quality. Comparing the path coefficients, that of subjective norms on perceived benefits (0.467) is larger than that of functional quality. Thus, subjective norms, rooted in social relationships, are the main factor affecting users’ perceived benefits when taking online classes through APPs.
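In standard path analysis, the indirect route from self-efficacy to perceived benefits implied by H2 and H3 is the product of the two direct path coefficients. A minimal sketch follows; the coefficients `a` and `b` are hypothetical placeholders, since only the 0.467 coefficient for subjective norms is quoted in the text:

```python
def indirect_effect(*path_coefficients):
    """Indirect effect along a chain of paths = product of its coefficients."""
    effect = 1.0
    for c in path_coefficients:
        effect *= c
    return effect

# Hypothetical standardized coefficients, for illustration only:
a = 0.40  # self-efficacy -> functional quality (H2), placeholder value
b = 0.30  # functional quality -> perceived benefits (H3), placeholder value
# Self-efficacy reaches perceived benefits only via functional quality,
# so its total effect here is indirect_effect(a, b).
```

With these placeholder values the indirect effect is 0.40 × 0.30 = 0.12, smaller than the direct effect of subjective norms (0.467), consistent with the ordering discussed above.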
H4 is also supported, implying that subjective norms positively influence self-efficacy: the stronger a user’s subjective norms, the higher his or her self-efficacy. As stated previously, subjective norms refer to the pressure a user perceives from social relationships; the more a user values the opinions and evaluations of parents, teachers and classmates when using an APP for online lessons, the more likely he or she is to maintain a good learning state under their supervision and influence.

5.3. Discussion on Management Significance and Research Contribution

Factor analysis was used in this study to summarize the factors influencing users’ use of online class APPs: perceived benefits, subjective norms, functional quality and self-efficacy, covering the specific benefits and actual functions users obtain from online class APPs, external concerns at the social level and the building of user confidence. Among these, subjective norms and self-efficacy have received relatively little attention in previous research on educational APPs. Meanwhile, the inter-factor relationship analysis in Section 4.2 clearly shows that social factors, personal factors and APP functional factors in the online learning process influence and reinforce one another to improve users’ perceived benefits, implying that the design of a successful online class APP should be systematic, integrating users’ own psychology, social influence and software functions and attending to every aspect of the online learning process. Perceived benefits explain 19.814% of the variance, more than any other factor, implying that perceived benefits are the most significant element affecting users’ use of online class APPs and confirming the prior finding that perceived benefits are the primary motivator for individuals to engage in a certain activity or action [109]. Comparing the particular items under perceived benefits, high learning efficiency, a good learning environment and high learning quality receive the most favorable assessments among the overall benefits users perceive in an online class APP.
Under ideal circumstances, effective communication and collaboration between teachers and students and high-quality, good-value online classes can, to a degree, improve users’ learning efficiency; online classes also provide opportunities for collaborative learning and discussion, while high course quality and reasonable prices are more concrete evaluations of an online APP’s benefits and a specific basis for its design. Accordingly, online class APPs, particularly those with large libraries of course resources, should strictly control teachers’ teaching level when recording online classes: developers should collaborate with teachers from prestigious schools or with extensive teaching experience, optimize teaching methods, arrange courses reasonably and set clear teaching tasks, while pricing courses reasonably. At the same time, the design of APPs should include functions for students to learn and discuss together, such as forming study groups and holding discussion meetings within the software. To deal with students who are unwilling to participate actively in the discussion stage, a reminder function should be provided; for example, if a discussion group member has not spoken for a long time, the system automatically sends a notification.
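The silence-reminder feature proposed above could work roughly as follows. This is a hypothetical sketch; the ten-minute threshold and all names are illustrative choices, not part of any existing APP:

```python
from datetime import datetime, timedelta

SILENCE_LIMIT = timedelta(minutes=10)  # illustrative threshold

def members_to_remind(last_spoke, now):
    """Return group members whose last message is older than SILENCE_LIMIT."""
    return sorted(member for member, t in last_spoke.items()
                  if now - t > SILENCE_LIMIT)

# Example: two of three students have been silent past the limit.
now = datetime(2022, 7, 1, 10, 30)
last_spoke = {
    "student_a": datetime(2022, 7, 1, 10, 28),  # spoke 2 min ago
    "student_b": datetime(2022, 7, 1, 10, 5),   # silent for 25 min
    "student_c": datetime(2022, 7, 1, 9, 50),   # silent for 40 min
}
# members_to_remind(last_spoke, now) -> ["student_b", "student_c"]
```

The system would run such a check periodically during the discussion meeting and push a notification to each returned member.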
Subjective norms explained the second-largest share of variance after perceived benefits, and the path coefficient between subjective norms and perceived benefits (0.467) was higher than that between functional quality and perceived benefits. This suggests that personally perceived social pressure is the second factor influencing users’ use of online class APPs and the primary factor influencing their perceived benefits, while also strengthening their belief that they can complete learning tasks effectively using an online class APP. As the items grouped under subjective norms show, the focus is on supervision and relationship maintenance. As a result, the design of online class APPs should prioritize functions for students’ learning supervision and communication, together with certain social functions, such as learning exchanges and publishing learning news, so that students can carry out social activities through the software. Users may also monitor not only their personal learning progress but also the average progress of others enrolled in the same courses, with learning-progress rankings arranged routinely to encourage users to increase their learning efficiency after registering for a series of courses. The instructor can also prepare questions ahead of time and ask them at random throughout the courses, or the system can send out random check-in pop-ups; if a user does not perform the check-in within the prescribed period, his or her wandering behavior will be noted. Students’ wandering and question-answering records can be compiled into a newsletter and distributed to their instructors and parents at the end of the courses, so that parents and teachers can understand and monitor students’ condition.
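The random check-in mechanism just described could be sketched like this; the class, its fields and the seeded randomness are hypothetical illustrations, not an existing implementation:

```python
import random

class CheckInMonitor:
    """Issues random in-class check-in prompts and records missed ones
    as wandering behavior for the end-of-course report."""

    def __init__(self, prompts_per_class, seed=None):
        self.prompts_per_class = prompts_per_class
        self.rng = random.Random(seed)
        self.wandering = {}  # student -> number of missed check-ins

    def prompt_times(self, class_minutes):
        """Pick random minutes within the class at which pop-ups appear."""
        return sorted(self.rng.sample(range(class_minutes),
                                      self.prompts_per_class))

    def record(self, student, responded):
        """Note a missed check-in as wandering behavior."""
        if not responded:
            self.wandering[student] = self.wandering.get(student, 0) + 1

    def report(self):
        """Summary distributed to instructors and parents after class."""
        return dict(self.wandering)
```

A usage example: for a 45-minute class, `CheckInMonitor(3).prompt_times(45)` schedules three pop-ups, each missed `record(student, False)` call accumulates, and `report()` yields the per-student wandering counts for the newsletter.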
The quality of software functions is also an important factor affecting users’ perceived benefits. Users’ actual needs should therefore be taken into account in the design of online class APPs, and correspondingly rich, complete and useful functions introduced, such as submitting assignments, sharing learning materials, speaking in class, raising hands, break reminders and related supervision functions. The APPs should also ensure a high degree of compatibility, or offer multiple versions so that the software can be logged into and used on multiple platforms.
Although self-efficacy cannot directly improve users’ perceived benefits of an online class APP, it can effectively promote users’ evaluation of its functional quality. This requires the software to be simple and easy to understand, reducing the extra learning cost users incur during use; the software should also automatically record each live class, allowing users to rewatch or rewind it. At the same time, because subjective norms positively affect users’ self-efficacy, setting specific supervision features in an online class APP can increase users’ mastery of course progress and their own focus.
Additionally, the user research found two main types of common online course software. The first is live-class software used by school students to carry out their prescribed courses, which is relatively simple in function and often relies on social or conferencing software; the second is strongly commercial course software that comes with rich course resources of its own and is typically aimed at users seeking to improve their own level. Because the two barely overlap, an online class APP could be promoted through local or regional college alliances, combining the standard live function for academic courses with high-quality recorded instructor courses. Students who want to study relevant courses further can pay for the premium instructor content, which to some extent also provides teachers with additional income.

6. Conclusions

In the context of the continuing influence of COVID-19, online education has become the new norm; at the same time, existing research on the elements of online class APP usage is not systematic enough, and the elements users really care about when using online class APPs cannot be fully uncovered from existing theories alone. Against this background, user evaluations were obtained in this study from the perspective of the design and development of online class APPs, through offline user interviews and open-ended online questionnaires. These were used as a scale and, through factor analysis and literature discussion, the main elements influencing users’ use of online class APPs were determined to be perceived benefits, subjective norms, functional quality and self-efficacy; an evaluation scale for online class APPs was constructed accordingly. In addition, the relationships among these influencing elements were explored through structural equation modeling, and certain managerial suggestions were proposed.
This study also has limitations. There are two main types of common online course software: live-class software, currently used by school students to conduct their normal academic courses, and course software that comes with its own rich course resources, typically used by users who want to improve themselves. The two differ substantially, and users’ concerns may be concentrated differently in each design. Since the evaluation scale developed in this study is generic in nature, the two types of online class APPs were not studied separately, and different user groups of different online class APPs may also produce different results. Subsequent research can therefore build on the questionnaire scale of this study to further explore users’ experience with different types of online class APPs and the experience of different user groups.

Author Contributions

Investigation, conceptualization, software, validation, formal analysis, visualization, writing—original draft preparation, Z.W.; investigation, conceptualization, Z.L.; methodology, writing—review and editing, Q.J. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the China Postdoctoral Science Foundation, grant number 2021M701460.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Not applicable.

Acknowledgments

Thanks to Jiangnan University for supporting this study.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Human Rights Watch. Pandemic’s Dire Global Impact on Education: Remedy Lost Learning; Make School Free, Accessible; Expand Internet Access. Available online: https://tinyurl.com/4adchm55 (accessed on 23 June 2022).
  2. Kumar, B.A.; Chand, S.S. Mobile learning adoption: A systematic review. Educ. Inf. Technol. 2019, 24, 471–487. [Google Scholar] [CrossRef]
  3. Holst, A. Forecast Number of Mobile Users Worldwide from 2019 to 2023 (in Billion). 2019. Available online: https://www.statista.com/statistics/218984/number-of-global-mobile-users-since-2010/ (accessed on 23 June 2022).
  4. Napp, A.; Kosan, J.; Hoffend, C.; Häge, A.; Breitfeld, P.; Doehn, C.; Daubmann, A.; Kubitz, J.; Beck, S.J.R. Implementation of basic life support training for school children: Online education for potential instructors? Results of a cluster randomised, controlled, non-inferiority trial. Resuscitation 2020, 152, 141–148. [Google Scholar] [CrossRef] [PubMed]
  5. Lamb, L.R.; Baird, G.L.; Roy, I.T.; Choi, P.H.; Lehman, C.D.; Miles, R.C. Are English-language online patient education materials related to breast cancer risk assessment understandable, readable, and actionable? Breast 2022, 61, 29–34. [Google Scholar] [CrossRef] [PubMed]
  6. Brinia, V.; Psoni, P. Online teaching practicum during COVID-19: The case of a teacher education program in Greece. J. Appl. Res. High. Educ. 2021, 14, 610–624. [Google Scholar] [CrossRef]
  7. Gorfinkel, L.; Muscat, T.; Ollerhead, S.; Chik, A. The role of government’s ‘Owned Media’in fostering cultural inclusion: A case study of the NSW Department of Education’s online and social media during COVID-19. Media Int. Aust. 2021, 178, 87–100. [Google Scholar] [CrossRef]
  8. Han, H.; Lien, D.; Lien, J.W.; Zheng, J. Online or face-to-face? Competition among MOOC and regular education providers. Int. Rev. Econ. Financ. 2022, 80, 857–881. [Google Scholar]
  9. Barros-del Río, M.A.; Nozal, C.L.; Mediavilla-Martínez, B. Practicum management and enhancement through an online tool in foreign language teacher education. Soc. Sci. Humanit. Open 2022, 6, 100273. [Google Scholar]
  10. Tsironis, A.; Katsanos, C.; Xenos, M. Comparative usability evaluation of three popular MOOC platforms. In Proceedings of the Global Engineering Education Conference, Abu Dhabi, United Arab Emirates, 10–13 April 2016. [Google Scholar]
  11. Park, B.; Knoerzer, L.; Plass, J.L.; Bruenken, R. Emotional design and positive emotions in multimedia learning: An eyetracking study on the use of anthropomorphisms. Comput. Educ. 2015, 86, 30–42. [Google Scholar] [CrossRef]
  12. Ip, H.H.S.; Li, C.; Leoni, S.; Chen, Y.B.; Ma, K.F.; Wong, C.H.T.; Li, Q. Design and Evaluate Immersive Learning Experience for Massive Open Online Courses (MOOCs). IEEE Trans. Learn. Technol. 2019, 12, 503–515. [Google Scholar] [CrossRef]
  13. Wang, S.R.; Liu, Y.; Song, F.H.; Xie, X.J.; Yu, D. Research on Evaluation System of User Experience With Online Live Course Platform. IEEE Access 2021, 9, 23863–23875. [Google Scholar] [CrossRef]
  14. Lew, S.-L.; Lau, S.-H.; Leow, M.-C. Usability factors predicting continuance of intention to use cloud e-learning application. Heliyon 2019, 5, e01788. [Google Scholar]
  15. Chaker, R.; Bouchet, F.; Bachelet, R. How do online learning intentions lead to learning outcomes? The mediating effect of the autotelic dimension of flow in a MOOC. Comput. Hum. Behav. 2022, 134, 107306. [Google Scholar] [CrossRef]
  16. Xu, Y.; Jin, L.; Deifell, E.; Angus, K. Chinese character instruction online: A technology acceptance perspective in emergency remote teaching. System 2021, 100, 102542. [Google Scholar] [CrossRef]
  17. Nikolopoulou, K.; Gialamas, V.; Lavidas, K. Habit, hedonic motivation, performance expectancy and technological pedagogical knowledge affect teachers’ intention to use mobile internet. Comput. Educ. Open 2021, 2, 100041. [Google Scholar] [CrossRef]
  18. de Freitas, S.I.; Morgan, J.; Gibson, D. Will MOOCs transform learning and teaching in higher education? Engagement and course retention in online learning provision. Br. J. Educ. Technol. 2015, 46, 455–471. [Google Scholar] [CrossRef] [Green Version]
  19. Deng, R.; Benckendorff, P.; Gannaway, D. Progress and new directions for teaching and learning in MOOCs. Comput. Educ. 2018, 129, 48–60. [Google Scholar] [CrossRef]
  20. Kraľovičová, D. Online vs. Face to face teaching in university background. Megatrendy A Médiá 2020, 7, 199–205. [Google Scholar]
  21. Hurajova, A.; Kollarova, D.; Ladislav, H. Trends in education during the pandemic: Modern online technologies as a tool for the sustainability of university education in the field of media and communication studies. Heliyon 2022, 8, e09367. [Google Scholar] [CrossRef]
  22. Ma, L.; Lee, C.S. Understanding the barriers to the use of MOOCs in a developing country: An innovation resistance perspective. J. Educ. Comput. Res. 2019, 57, 571–590. [Google Scholar] [CrossRef]
  23. Veletsianos, G.; Shepherdson, P. A systematic analysis and synthesis of the empirical MOOC literature published in 2013–2015. Int. Rev. Res. Open Distrib. Learn. 2016, 17, 198–221. [Google Scholar] [CrossRef] [Green Version]
  24. Mozahem, N.A. The online marketplace for business education: An exploratory study. Int. J. Manag. Educ. 2021, 19, 100544. [Google Scholar] [CrossRef]
  25. Mangan, K. MOOCs could help 2-year colleges and their students, says Bill Gates. Chron. High. Educ. 2013. Available online: https://www.chronicle.com/article/moocs-could-help-2-year-colleges-and-their-students-says-bill-gates/ (accessed on 23 June 2022).
  26. Lucas, H.C. Can the current model of higher education survive MOOCs and online learning? Educ. Rev. 2013, 48, 54–56. [Google Scholar]
  27. Breslow, L.; Pritchard, D.E.; DeBoer, J.; Stump, G.S.; Ho, A.D.; Seaton, D.T. Studying learning in the worldwide classroom research into edX’s first MOOC. Res. Pract. Assess. 2013, 8, 13–25. [Google Scholar]
  28. Sandeen, C. Assessment’s Place in the New MOOC World. Res. Pract. Assess. 2013, 8, 5–12. [Google Scholar]
  29. Major players in the MOOC Universe. 2013. Available online: https://www.chronicle.com/article/major-players-in-the-mooc-universe/ (accessed on 23 June 2022).
  30. Lněnička, M.; Nikiforova, A.; Saxena, S.; Singh, P. Investigation into the adoption of open government data among students: The behavioural intention-based comparative analysis of three countries. Aslib J. Inf. Manag. 2022, 74, 549–567. [Google Scholar] [CrossRef]
  31. World Health Organization. Q&As on COVID-19 and Related Health Topics. Available online: https://www.who.int/emergencies/diseases/novel-coronavirus-2019/question-and-answers-hub (accessed on 27 May 2022).
  32. Lin, X.; Gao, L.-L. Students’ sense of community and perspectives of taking synchronous and asynchronous online courses. Asian J. Dist. Educ. 2020, 15, 169–179. [Google Scholar]
  33. Basri, M.; Husain, B.; Modayama, W. University students’ perceptions in implementing asynchronous learning during covid-19 era. Metathesis J. Eng. Lang. Lit. Teach 2021, 4, 263–276. [Google Scholar] [CrossRef]
  34. Amelia, A.R.; Qalyubi, I.; Qamariah, Z. Lecturer and students’ perceptions toward synchronous and asynchronous in speaking learning during COVID-19 pandemic. Proc. Int. Conf. Engl. Lang. Teach. 2021, 5, 8–18. [Google Scholar]
  35. Khan, M.; Vivek, V.; Nabi, M.; Khojah, M.; Tahir, M. Students’ Perception towards E-Learning during COVID-19 Pandemic in India: An Empirical Study. Sustainability 2020, 13, 57. [Google Scholar] [CrossRef]
  36. Moorhouse, B.L.; Kohnke, L. Thriving or Surviving Emergency Remote Teaching Necessitated by COVID-19: University Teachers’ Perspectives. Asia-Pac. Educ. Res. 2021, 30, 279–287. [Google Scholar] [CrossRef]
  37. Bojović, Ž.; Bojović, P.D.; Vujošević, D.; Šuh, J. Education in times of crisis: Rapid transition to distance learning. Comput. Appl. Eng. Educ. 2020, 28, 1467–1489. [Google Scholar] [CrossRef]
  38. Cutri, R.M.; Mena, J.; Whiting, E.F. Faculty readiness for online crisis teaching: Transitioning to online teaching during the COVID-19 pandemic. Eur. J. Teach. Educ. 2020, 43, 523–541. [Google Scholar] [CrossRef]
  39. Moorhouse, B.L. Beginning teaching during COVID-19: Newly qualified Hong Kong teachers’ preparedness for online teaching. Educ. Stud. 2021, 1–17. [Google Scholar] [CrossRef]
  40. Almazova, N.; Krylova, E.; Rubtsova, A.; Odinokaya, M. Challenges and opportunities for Russian higher education amid COVID-19: Teachers’ perspective. Educ. Sci. 2020, 10, 368. [Google Scholar] [CrossRef]
  41. Ali, W. Online and remote learning in higher education institutes: A necessity in light of COVID-19 pandemic. High. Educ. Stud. 2020, 10, 16–25. [Google Scholar] [CrossRef]
  42. Phillips, F.; Linstone, H. Key ideas from a 25-year collaboration at technological forecasting & social change. Technol. Forecast. Soc. Change 2016, 105, 158–166. [Google Scholar]
  43. Liu, S.Q.; Liang, T.Y.; Shao, S.; Kong, J. Evaluating Localized MOOCs: The Role of Culture on Interface Design and User Experience. IEEE Access 2020, 8, 107927–107940. [Google Scholar] [CrossRef]
  44. Zaharias, P.; Poylymenakou, A. Developing a Usability Evaluation Method for e-Learning Applications: Beyond Functional Usability. Int. J. Hum. Interact. 2009, 25, 75–98. [Google Scholar] [CrossRef]
  45. Ajzen, I. The theory of planned behavior. Organ. Behav. Hum. Decis. Processes 1991, 50, 179–211. [Google Scholar] [CrossRef]
  46. Davis, F.D. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q. 1989, 13, 319–340. [Google Scholar] [CrossRef] [Green Version]
  47. Bandura, A. Social foundations of thought and action. Englewood Cliffs 1986, 1986, 23–28. [Google Scholar]
  48. Davis, F.D.; Bagozzi, R.P.; Warshaw, P.R. Extrinsic and intrinsic motivation to use computers in the workplace 1. J. Appl. Soc. Psychol. 1992, 22, 1111–1132. [Google Scholar] [CrossRef]
  49. Heutte, J.; Fenouillet, F.; Martin-Krumm, C.; Gute, G.; Raes, A.; Gute, D.; Bachelet, R.; Csikszentmihalyi, M. Optimal experience in adult learning: Conception and validation of the flow in education scale (EduFlow-2). Front. Psychol. 2021, 12, 828027. [Google Scholar] [CrossRef]
  50. Lakhal, S.; Khechine, H. Student intention to use desktop web-conferencing according to course delivery modes in higher education. Int. J. Manag. Educ. 2016, 14, 146–160. [Google Scholar] [CrossRef]
  51. Venkatesh, V.; Morris, M.G.; Davis, G.B.; Davis, F.D. User acceptance of information technology: Toward a unified view. MIS Q. 2003, 27, 425–478. [Google Scholar] [CrossRef] [Green Version]
  52. Kang, D.; Park, M.J. Interaction and online courses for satisfactory university learning during the COVID-19 pandemic. Int. J. Manag. Educ. 2022, 20, 100678. [Google Scholar] [CrossRef]
  53. DeLone, W.H.; McLean, E.R. The DeLone and McLean model of information systems success: A ten-year update. J. Manag. Inf. Syst. 2003, 19, 9–30. [Google Scholar]
  54. Sun, P.-C.; Tsai, R.J.; Finger, G.; Chen, Y.-Y.; Yeh, D. What drives a successful e-Learning? An empirical investigation of the critical factors influencing learner satisfaction. Comput. Educ. 2008, 50, 1183–1202. [Google Scholar] [CrossRef]
  55. Al-Fraihat, D.; Joy, M.; Masa’Deh, R.; Sinclair, J. Evaluating E-learning systems success: An empirical study. Comput. Hum. Behav. 2019, 102, 67–86. [Google Scholar] [CrossRef]
  56. Liu, I.-F.; Chen, M.C.; Sun, Y.S.; Wible, D.; Kuo, C.-H. Extending the TAM model to explore the factors that affect Intention to Use an Online Learning Community. Comput. Educ. 2010, 54, 600–610. [Google Scholar] [CrossRef]
  57. Agudo-Peregrina, Á.F.; Hernández-García, Á. Pascual-Miguel, F.J. Behavioral intention, use behavior and the acceptance of electronic learning systems: Differences between higher education and lifelong learning. Comput. Hum. Behav. 2014, 34, 301–314. [Google Scholar] [CrossRef]
  58. Bhattacherjee, A. Understanding information systems continuance: An expectation-confirmation model. MIS Q. 2001, 25, 351–370. [Google Scholar] [CrossRef]
  59. Oliver, R.L. A cognitive model of the antecedents and consequences of satisfaction decisions. J. Mark. Res. 1980, 17, 460–469. [Google Scholar] [CrossRef]
  60. Liao, C.; Palvia, P.; Chen, J.-L. Information technology adoption behavior life cycle: Toward a Technology Continuance Theory (TCT). Int. J. Inf. Manag. 2009, 29, 309–320. [Google Scholar] [CrossRef]
  61. DeLone, W.H.; McLean, E.R. Information systems success: The quest for the dependent variable. Inf. Syst. Res. 1992, 3, 60–95. [Google Scholar] [CrossRef] [Green Version]
  62. Dağhan, G.; Akkoyunlu, B. Modeling the continuance usage intention of online learning environments. Comput. Hum. Behav. 2016, 60, 198–211. [Google Scholar] [CrossRef]
  63. Dai, H.M.; Teo, T.; Rappa, N.A.; Huang, F. Explaining Chinese university students’ continuance learning intention in the MOOC setting: A modified expectation confirmation model perspective. Comput. Educ. 2020, 150, 103850. [Google Scholar] [CrossRef]
  64. Hu, X.; Zhang, J.; He, S.; Zhu, R.; Shen, S.; Liu, B. E-learning intention of students with anxiety: Evidence from the first wave of COVID-19 pandemic in China. J. Affect. Disord. 2022, 309, 115–122. [Google Scholar] [CrossRef]
  65. Price, B. A First Course in Factor Analysis. Technometrics 1993, 35, 453. [Google Scholar] [CrossRef]
  66. Lawley, D.N.; Maxwell, A.E. Factor Analysis as a Statistical Method. J. R. Stat. Soc. Ser. D 1962, 12, 209. [Google Scholar] [CrossRef]
  67. Lamash, L.; Josman, N. Full-information factor analysis of the daily routine and autonomy (DRA) questionnaire among adolescents with autism spectrum disorder. J. Adolesc. 2020, 79, 221–231. [Google Scholar] [CrossRef] [PubMed]
  68. Wei, W.; Cao, M.; Jiang, Q.; Ou, S.-J.; Zou, H. What Influences Chinese Consumers’ Adoption of Battery Electric Vehicles? A Preliminary Study Based on Factor Analysis. Energies 2020, 13, 1057. [Google Scholar] [CrossRef] [Green Version]
  69. Slaton, J.D.; Hanley, G.P.; Raftery, K.J. Interview-informed functional analyses: A comparison of synthesized and isolated components. J. Appl. Behav. Anal. 2017, 50, 252–277. [Google Scholar] [CrossRef] [PubMed]
  70. Ho, H.-H.; Tzeng, S.-Y. Using the Kano model to analyze the user interface needs of middle-aged and older adults in mobile reading. Comput. Hum. Behav. Rep. 2021, 3, 100074. [Google Scholar] [CrossRef]
  71. Mellor, D.; Cummins, R.A.; Loquet, C. The gold standard for life satisfaction: Confirmation and elaboration using an imaginary scale and qualitative interview. Int. J. Soc. Res. Methodol. 1999, 2, 263–278. [Google Scholar] [CrossRef]
  72. Comrey, A.L.; Lee, H.B. A First Course in Factor Analysis; Göttingen University Press: Göttingen, Germany, 1992. [Google Scholar]
  73. Hair, J.F., Jr.; Hult, G.T.M.; Ringle, C.M.; Sarstedt, M. A Primer on Partial Least Squares Structural Equation Modeling (PLS-SEM); Sage Publications: Thousand Oaks, CA, USA, 2021. [Google Scholar]
  74. Nunnally, J.C. Psychometric Theory 3E; Tata McGraw-Hill: New York, NY, USA, 1994. [Google Scholar]
  75. Eisinga, R.; Grotenhuis, M.T.; Pelzer, B. The reliability of a two-item scale: Pearson, Cronbach, or Spearman-Brown? Int. J. Public Health 2012, 58, 637–642. [Google Scholar] [CrossRef]
  76. Muilenburg, L.Y.; Berge, Z.L. Student barriers to online learning: A factor analytic study. Distance Educ. 2005, 26, 29–48. [Google Scholar] [CrossRef]
  77. Shevlin, M.; Miles, J. Effects of sample size, model specification and factor loadings on the GFI in confirmatory factor analysis. Pers. Individ. Differ. 1998, 25, 85–90. [Google Scholar] [CrossRef]
  78. Fornell, C.; Larcker, D.F. Evaluating structural equation models with unobservable variables and measurement error. J. Mark. Res. 1981, 18, 39–50. [Google Scholar] [CrossRef]
  79. Ahmad, S.; Zulkurnain, N.N.A.; Khairushalimi, F.I. Assessing the Validity and Reliability of a Measurement Model in Structural Equation Modeling (SEM). Br. J. Math. Comput. Sci. 2016, 15, 1–8. [Google Scholar] [CrossRef] [PubMed]
  80. Graham-Rowe, E.; Gardner, B.; Abraham, C.; Skippon, S.; Dittmar, H.; Hutchins, R.; Stannard, J. Mainstream consumers driving plug-in battery-electric and plug-in hybrid electric cars: A qualitative analysis of responses and evaluations. Transp. Res. Part A Policy Pract. 2012, 46, 140–153. [Google Scholar] [CrossRef]
  81. Kaiser, H.F. The varimax criterion for analytic rotation in factor analysis. Psychometrika 1958, 23, 187–200. [Google Scholar] [CrossRef]
  82. Hair, J.F.; Black, B.; Babin, B.J.; Anderson, R. Multivariate Data Analysis; Upper Saddle River: Prentice Hall: New York, NY, USA, 2011. [Google Scholar]
  83. Costello, A.B.; Osborne, J.W. Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis. Pract. Assess. 2005, 10, 7. [Google Scholar]
  84. Joshi, M.A.; Krishnappa, P.; Prabhu, A.V. Faculty satisfaction and perception regarding emergency remote teaching: An exploratory study. Med. J. Armed. Forces India 2022. [Google Scholar] [CrossRef]
  85. Tsujikawa, N.; Tsuchida, S.; Shiotani, T. Changes in the Factors Influencing Public Acceptance of Nuclear Power Generation in Japan Since the 2011 Fukushima Daiichi Nuclear Disaster. Risk Anal. 2015, 36, 98–113. [Google Scholar] [CrossRef] [PubMed]
  86. Orbell, S.; Crombie, I.; Johnston, G. Social cognition and social structure in the prediction of cervical screening uptake. Br. J. Health Psychol. 1996, 1, 35–50. [Google Scholar] [CrossRef]
  87. Childers, T.L.; Carr, C.L.; Peck, J.; Carson, S. Hedonic and utilitarian motivations for online retail shopping behavior—ScienceDirect. J. Retail. 2001, 77, 511–535. [Google Scholar] [CrossRef]
  88. Kahn, M.B. Cross-category effects of induced arousal and pleasure on the internet shopping experience. J. Retail. 2002, 78, 31–40. [Google Scholar]
  89. Ajzen, I.; Fishbein, M. Understanding attitudes and predicting social behavior; Prentice-Hall: Hoboken, NJ, USA, 1980. [Google Scholar]
  90. Khan, F.; Ahmed, W.; Najmi, A. Understanding consumers’ behavior intentions towards dealing with the plastic waste: Perspective of a developing country. Resour. Conserv. Recycl. 2018, 142, 49–58. [Google Scholar] [CrossRef]
  91. Piazza, A.J.; Knowlden, A.P.; Hibberd, E.; Leeper, J.; Paschal, A.M.; Usdan, S. Mobile device use while crossing the street: Utilizing the theory of planned behavior. Accid. Anal. Prev. 2019, 127, 9–18. [Google Scholar] [CrossRef] [PubMed]
  92. Arora, R.; Stoner, C. The effect of perceived service quality and name familiarity on the service selection decision. J. Serv. Mark. 1996, 10, 22–34. [Google Scholar] [CrossRef]
  93. Gronroos, C. Service quality: The six criteria of good perceived service. Rev. Bus. 1988, 9, 10. [Google Scholar]
  94. Bernardo, M.; Marimon, F.; Alonso-Almeida, M.D.M. Functional quality and hedonic quality: A study of the dimensions of e-service quality in online travel agencies. Inf. Manag. 2012, 49, 342–347. [Google Scholar] [CrossRef]
  95. Bandura, A. Self-efficacy: Toward a unifying theory of behavioral change. Psychol. Rev. 1977, 84, 191. [Google Scholar] [CrossRef]
  96. Bandura, A.; Freeman, W.H.; Lightsey, R. Self-efficacy: The exercise of control. J. Cogn. Psychother. 1999, 13, 158–166. [Google Scholar] [CrossRef]
  97. Kim, J.; Park, H.-A. Development of a Health Information Technology Acceptance Model Using Consumers’ Health Behavior Intention. J. Med. Internet Res. 2012, 14, e133. [Google Scholar] [CrossRef]
  98. Huang, G.; Ren, Y. Linking technological functions of fitness mobile apps with continuance usage among Chinese users: Moderating role of exercise self-efficacy. Comput. Hum. Behav. 2020, 103, 151–160. [Google Scholar] [CrossRef]
  99. Alirezaee, S.; Ozgoli, G.; Majd, H.A. Comparison of sexual self-efficacy and sexual function in fertile and infertile women referred to health centers in Mashhad in 1392. Pajoohandeh J. 2014, 19, 131–136. [Google Scholar]
  100. Estévez-López, F.; Álvarez-Gallardo, I.C.; Segura-Jiménez, V.; Soriano-Maldonado, A.; Borges-Cosic, M.; Pulido-Martos, M.; Aparicio, V.A.; Carbonell-Baeza, A.; Delgado-Fernández, M.; Geenen, R. The discordance between subjectively and objectively measured physical function in women with fibromyalgia: Association with catastrophizing and self-efficacy cognitions. The al-Ándalus project. Disabil. Rehabil. 2016, 40, 1–9. [Google Scholar] [CrossRef]
  101. Lovett, A.A.; Brainard, J.S.; Bateman, I.J. Improving Benefit Transfer Demand Functions: A GIS Approach. J. Environ. Manag. 1997, 51, 373–389. [Google Scholar] [CrossRef]
  102. Rouhani, S.; Ashrafi, A.; ZareRavasan, A.; Afshari, S. The impact model of business intelligence on decision support and organizational benefits. J. Enterp. Inf. Manag. 2016, 29, 19–50. [Google Scholar] [CrossRef]
  103. Landis, B.D.; Altman, J.D.; Cavin, J.D. Underpinnings of Academic Success: Effective Study Skills Use as a Function of Academic Locus of Control nd Self-Efficacy. Psi Chi J. Undergrad. Res. 2007, 12, 126–130. [Google Scholar] [CrossRef]
  104. Pajares, F.; Valiante, G. Students’ self-efficacy in their self-regulated learning strategies: A developmental perspective. Psychologia 2002, 45, 211–221. [Google Scholar] [CrossRef] [Green Version]
  105. Maheshwari, G.; Kha, K.L. Investigating the relationship between educational support and entrepreneurial intention in Vietnam: The mediating role of entrepreneurial self-efficacy in the theory of planned behavior. Int. J. Manag. Educ. 2022, 20. [Google Scholar] [CrossRef]
  106. Valtonen, T.; Kukkonen, J.; Kontkanen, S.; Sormunen, K.; Dillon, P.; Sointu, E. The impact of authentic learning experiences with ICT on pre-service teachers’ intentions to use ICT for teaching and learning. Comput. Educ. 2015, 81, 49–58. [Google Scholar] [CrossRef]
  107. Gong, Z.; Han, Z.; Li, X.; Yu, C.; Reinhardt, J.D. Factors Influencing the Adoption of Online Health Consultation Services: The Role of Subjective Norm, Trust, Perceived Benefit, and Offline Habit. Front. Public Health 2019, 7, 286. [Google Scholar] [CrossRef]
  108. Cislaghi, B.; Heise, L. Theory and practice of social norms interventions: Eight common pitfalls. Glob. Health 2018, 14, 1–10. [Google Scholar] [CrossRef] [Green Version]
  109. Forsythe, S.; Liu, C.; Shannon, D.; Gardner, L.C. Development of a scale to measure the perceived benefits and risks of online shopping. J. Interact. Mark. 2006, 20, 55–75. [Google Scholar] [CrossRef]
Figure 1. Overall summary items.
Figure 2. Path coefficient diagram.
Table 1. Basic information of experts.

| Expert | Gender | Age | Working Time | Research Direction | Professional Ranks |
|---|---|---|---|---|---|
| 1 | Male | 48 | 22 years | Product design; human–computer interaction | Doctoral supervisor; professor |
| 2 | Female | 32 | 4 years | User perception and preference | Associate professor |
Table 2. Semi-structured interview outline.

| Number | Question |
|---|---|
| Q1 | What are the benefits of taking online lessons through an APP, in your opinion? |
| Q2 | What are the downsides of taking online classes through an APP, in your opinion? |
| Q3 | What additional features do you believe could be added to the present online class software to make it more efficient? |
Table 3. Factor analysis results.

| Number | Item | Factor 1 | Factor 2 | Factor 3 | Factor 4 | Communality |
|---|---|---|---|---|---|---|
| Q11 | Teachers can understand and monitor students' learning effectively by using the APP for online lessons | 0.233 | 0.755 | 0.122 | 0.149 | 0.661 |
| Q16 | The APP is good for parental supervision | 0.080 | 0.756 | 0.077 | 0.091 | 0.592 |
| Q21 | You can learn about other students' lessons | 0.322 | 0.678 | −0.076 | 0.032 | 0.570 |
| Q19 | It can maintain the relationship between teachers and students | 0.225 | 0.700 | 0.148 | 0.075 | 0.568 |
| Q2 | Students can communicate and collaborate with each other using the APP | 0.648 | 0.325 | 0.133 | 0.028 | 0.543 |
| Q10 | High learning efficiency with the APP | 0.687 | 0.157 | 0.178 | 0.207 | 0.571 |
| Q8 | Online lessons on the APP are of high quality | 0.763 | 0.092 | 0.132 | 0.178 | 0.640 |
| Q6 | Online lessons on the APP are good value for money | 0.486 | 0.248 | 0.370 | 0.042 | 0.437 |
| Q17 | The quality of learning is good with the APP | 0.770 | 0.187 | 0.120 | 0.150 | 0.665 |
| Q9 | Good learning atmosphere with the APP | 0.619 | 0.284 | 0.098 | 0.128 | 0.490 |
| Q3 | High personal control and concentration in online classes with the APP | 0.217 | 0.324 | 0.056 | 0.633 | 0.556 |
| Q7 | The operation of the functions of the online class APP is easy for me | 0.163 | 0.055 | 0.131 | 0.744 | 0.600 |
| Q4 | Students can control their learning progress by fast-forwarding, pausing and re-watching | 0.107 | 0.012 | 0.237 | 0.773 | 0.665 |
| Q13 | The functions on the APP are rich and complete | 0.262 | 0.134 | 0.765 | 0.087 | 0.680 |
| Q14 | The functions on the APP are practical | 0.339 | −0.071 | 0.579 | 0.139 | 0.475 |
| Q20 | The APP can be used on multiple platforms such as cell phones and computers | −0.007 | 0.141 | 0.793 | 0.255 | 0.713 |
| Explained variance before rotation (%) | | 34.417 | 10.570 | 7.641 | 6.291 | |
| Explained variance after rotation (%) | | 19.814 | 15.985 | 11.889 | 11.231 | |
| Eigenvalues | | 3.170 | 2.558 | 1.902 | 1.797 | |
| Total variance explained (%) | | 58.919 | | | | |
| Cronbach's α per factor | | 0.829 | 0.759 | 0.669 | 0.634 | |
| Overall Cronbach's α | | 0.866 | | | | |
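As a quick arithmetic cross-check on Table 3, the "explained variance after rotation" row can be recovered from the eigenvalues: with 16 retained items, each factor explains eigenvalue / 16 of the total variance. The sketch below uses only the values transcribed from Table 3.

```python
# Cross-check Table 3: percentage of variance explained per factor is
# the rotated eigenvalue divided by the number of items (16 here).
eigenvalues = [3.17, 2.558, 1.902, 1.797]  # rotated eigenvalues from Table 3
n_items = 16

percentages = [ev / n_items * 100 for ev in eigenvalues]  # ~19.81, 15.99, 11.89, 11.23
total_pct = sum(percentages)                              # ~58.92, matching the table
```

The small discrepancies in the last decimal place come from the eigenvalues themselves being reported to three decimals.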
Table 4. KMO and Bartlett's test.

| KMO | 0.890 |
|---|---|
| Bartlett's test of sphericity (approx. χ²) | 1657.688 |
| df | 120 |
| p | 0.000 |
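The degrees of freedom in Table 4 follow from the number of items: Bartlett's sphericity test has p(p−1)/2 degrees of freedom for p observed variables, which gives 120 for the 16 retained items. A minimal sketch, with a toy two-variable illustration of the statistic itself (the values n and r in the toy case are made up, not the study's data):

```python
import math

# Bartlett's test of sphericity: df = p*(p-1)/2 for p observed variables.
p = 16
df = p * (p - 1) // 2  # 120, as reported in Table 4

# Toy illustration of the statistic on a 2-variable correlation matrix,
# where det(R) = 1 - r**2. These n and r values are illustrative only.
n, p_toy, r = 328, 2, 0.5
chi2 = -(n - 1 - (2 * p_toy + 5) / 6) * math.log(1 - r ** 2)
```

A large χ² relative to the df (here 1657.688 against 120) rejects the hypothesis that the correlation matrix is an identity matrix, so the data are suitable for factor analysis.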
Table 5. Factor loading table.

| Factor | Item | Coef. | Std. Error | z | p | Std. Estimate | AVE | CR |
|---|---|---|---|---|---|---|---|---|
| Factor 1 | Q2 | 1 | - | - | - | 0.654 | 0.465 | 0.775 |
| Factor 1 | Q10 | 1.039 | 0.098 | 10.604 | 0.000 | 0.698 | | |
| Factor 1 | Q8 | 1.123 | 0.099 | 11.308 | 0.000 | 0.758 | | |
| Factor 1 | Q6 | 1.066 | 0.099 | 10.777 | 0.000 | 0.712 | | |
| Factor 1 | Q17 | 0.728 | 0.081 | 9.017 | 0.000 | 0.575 | | |
| Factor 1 | Q9 | 1.025 | 0.104 | 9.810 | 0.000 | 0.634 | | |
| Factor 2 | Q11 | 1 | - | - | - | 0.790 | 0.455 | 0.832 |
| Factor 2 | Q16 | 1.112 | 0.108 | 10.294 | 0.000 | 0.628 | | |
| Factor 2 | Q19 | 0.845 | 0.079 | 10.700 | 0.000 | 0.655 | | |
| Factor 2 | Q21 | 1.043 | 0.099 | 10.503 | 0.000 | 0.642 | | |
| Factor 3 | Q13 | 1 | - | - | - | 0.730 | 0.379 | 0.646 |
| Factor 3 | Q14 | 0.655 | 0.085 | 7.695 | 0.000 | 0.535 | | |
| Factor 3 | Q20 | 0.948 | 0.108 | 8.780 | 0.000 | 0.652 | | |
| Factor 4 | Q7 | 1 | - | - | - | 0.594 | 0.415 | 0.677 |
| Factor 4 | Q3 | 1.360 | 0.184 | 7.410 | 0.000 | 0.620 | | |
| Factor 4 | Q4 | 1.073 | 0.144 | 7.469 | 0.000 | 0.632 | | |
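The AVE and CR columns are computed from standardized loadings in the usual Fornell and Larcker (ref. 78) way: AVE is the mean of the squared loadings, and CR is the squared sum of loadings over that squared sum plus the summed error variances. A minimal sketch of the two formulas; the loadings below are illustrative values on the scale of Table 5, not a reproduction of any single factor's reported row.

```python
# AVE and composite reliability (CR) from standardized loadings,
# following Fornell and Larcker (1981).

def ave(loadings):
    # Average variance extracted: mean of squared standardized loadings.
    return sum(l ** 2 for l in loadings) / len(loadings)

def composite_reliability(loadings):
    # CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances),
    # where each item's error variance is 1 - loading^2.
    s2 = sum(loadings) ** 2
    err = sum(1 - l ** 2 for l in loadings)
    return s2 / (s2 + err)

loads = [0.790, 0.628, 0.655, 0.642]              # illustrative loadings
ave_val = round(ave(loads), 3)                    # 0.465
cr_val = round(composite_reliability(loads), 3)   # 0.775
```

A common rule of thumb is CR above 0.6 for acceptable construct reliability, with AVE values somewhat below 0.5 (as for Factors 3 and 4 here) tolerated when CR is adequate.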
Table 6. Square root value of AVE. The diagonal entries are the square roots of each factor's AVE; the off-diagonal entries are the inter-factor correlations.

| | Factor 1 | Factor 2 | Factor 3 | Factor 4 |
|---|---|---|---|---|
| Factor 1 | 0.668 | | | |
| Factor 2 | 0.546 | 0.677 | | |
| Factor 3 | 0.339 | 0.446 | 0.616 | |
| Factor 4 | 0.262 | 0.475 | 0.425 | 0.652 |
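The Fornell–Larcker criterion behind Table 6 requires each factor's √AVE (the diagonal) to exceed its correlations with every other factor. The check can be sketched directly from the table's values:

```python
# Fornell-Larcker discriminant-validity check on the Table 6 values:
# each diagonal sqrt(AVE) must exceed that factor's correlations
# with all other factors.
sqrt_ave = [0.668, 0.677, 0.616, 0.652]   # diagonal of Table 6
corr = {                                   # off-diagonal correlations
    (0, 1): 0.546, (0, 2): 0.339, (0, 3): 0.262,
    (1, 2): 0.446, (1, 3): 0.475,
    (2, 3): 0.425,
}

def discriminant_ok(i):
    # True when factor i's sqrt(AVE) exceeds every correlation involving it.
    return all(r < sqrt_ave[i]
               for (a, b), r in corr.items() if i in (a, b))

results = [discriminant_ok(i) for i in range(4)]  # all True here
```

Since every factor passes, the four constructs show adequate discriminant validity.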
Table 7. Summary of model regression coefficients.

| Hypothesis | X | Y | Unstd. | SE | CR | p | Std. |
|---|---|---|---|---|---|---|---|
| H1 | Self-efficacy | Perceived benefits | 0.195 | 0.142 | 1.372 | 0.170 | 0.149 |
| H2 | Self-efficacy | Functional quality | 0.681 | 0.106 | 6.438 | 0.000 | 0.655 |
| H3 | Functional quality | Perceived benefits | 0.443 | 0.124 | 3.573 | 0.000 | 0.352 |
| H4 | Subjective norms | Self-efficacy | 0.327 | 0.056 | 5.857 | 0.000 | 0.505 |
| H5 | Subjective norms | Perceived benefits | 0.396 | 0.066 | 6.044 | 0.000 | 0.467 |
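Reading Table 7 off against the usual significance rule: a path is supported when its critical ratio (CR, a z-value) exceeds 1.96 in magnitude, i.e. p < 0.05. By that rule H1 is not supported (CR = 1.372, p = 0.170) while H2–H5 are. A sketch using the table's values:

```python
# Significance check on the Table 7 paths: |CR| > 1.96 corresponds to
# p < 0.05 for a two-tailed z-test.
paths = {
    "H1": (1.372, 0.170),  # Self-efficacy -> Perceived benefits
    "H2": (6.438, 0.000),  # Self-efficacy -> Functional quality
    "H3": (3.573, 0.000),  # Functional quality -> Perceived benefits
    "H4": (5.857, 0.000),  # Subjective norms -> Self-efficacy
    "H5": (6.044, 0.000),  # Subjective norms -> Perceived benefits
}
supported = {h: abs(cr) > 1.96 for h, (cr, p) in paths.items()}
```

This is consistent with the abstract's finding that self-efficacy influences perceived benefits only indirectly, through functional quality.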
Table 8. Model-fitting indexes.

| Indicator | χ² | df | χ²/df | GFI | RMSEA | CFI | NNFI |
|---|---|---|---|---|---|---|---|
| Judgment criterion | - | - | <3 | >0.9 | <0.10 | >0.9 | >0.9 |
| Value | 175.967 | 99 | 1.777 | 0.939 | 0.049 | 0.951 | 0.941 |
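Each fitted value in Table 8 can be compared against its judgment criterion directly (the χ²/df ratio is itself just 175.967 divided by the 99 degrees of freedom). A minimal sketch of that comparison:

```python
import operator

# Compare each Table 8 fit index against its judgment criterion.
chi2, dof = 175.967, 99
chi2_df = round(chi2 / dof, 3)  # 1.777, as reported

criteria = {
    "chi2/df": (chi2_df, operator.lt, 3),
    "GFI":     (0.939, operator.gt, 0.9),
    "RMSEA":   (0.049, operator.lt, 0.10),
    "CFI":     (0.951, operator.gt, 0.9),
    "NNFI":    (0.941, operator.gt, 0.9),
}
fit_ok = {name: op(value, cutoff)
          for name, (value, op, cutoff) in criteria.items()}
```

All five indices meet their cutoffs, so the structural model is an acceptable fit to the data.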
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Wang, Z.; Jiang, Q.; Li, Z. How to Promote Online Education through Educational Software—An Analytical Study of Factor Analysis and Structural Equation Modeling with Chinese Users as an Example. Systems 2022, 10, 100. https://doi.org/10.3390/systems10040100
