Article

A Study of Factors Influencing the Use of the DingTalk Online Lecture Platform in the Context of the COVID-19 Pandemic

1 Social Innovation Design Research Centre, Anhui University, Hefei 203106, China
2 Scientific Research Division, University of Science and Technology of China, Hefei 203106, China
3 College of Environmental Science and Engineering, Ocean University of China, Qingdao 266100, China
* Authors to whom correspondence should be addressed.
Sustainability 2023, 15(9), 7274; https://doi.org/10.3390/su15097274
Submission received: 7 February 2023 / Revised: 24 April 2023 / Accepted: 26 April 2023 / Published: 27 April 2023
(This article belongs to the Section Sustainable Engineering and Science)

Abstract: Online classes quickly became a hot topic in education during the effort to prevent and manage the COVID-19 outbreak. This paper analyzes the factors influencing online course delivery from a social-acceptance perspective, taking online courses on the DingTalk platform as the research object. The researchers established the conceptual basis of the survey on the technology acceptance model (TAM) and developed the corresponding survey questions. The questionnaire was distributed to 528 respondents, of which 495 were valid data samples; the sample validity rate of the returned questionnaires was thus 93.75%. The data were analyzed with SPSSAU software for reliability (Cronbach's alpha: 0.967) and validity, ANOVA was used to test for group differences, and structural equation modeling (SEM) was applied in order to understand the impact of using the DingTalk platform for online classroom teaching, to study user satisfaction with its use, and to make suggestions for continuing to use the online platform for classes in terms of campus management for online/offline hybrid teaching.

1. Introduction

The COVID-19 outbreak, which began in December 2019 [1], quickly spread in China and worldwide, causing global concern [2]. In addition to transmission through droplets, the novel coronavirus can also be transmitted through aerosols in confined spaces, and it has been found in the excrement of infected patients [3,4]. To control the spread of the pandemic, many countries implemented "home isolation" measures, leading numerous universities to adopt online teaching approaches to prioritize the safety of their students [5,6]. Remote learning has been a global trend in higher education, with evidence of its use dating back to the 1840s, when instructors communicated with their students by mail [7,8]. Since the mid-1990s, online learning has been used for training [9], and it has continued to evolve to offer numerous opportunities for sharing information and importing documents through web-based platforms equipped with features to enhance learning efficiency [10,11]. Webinars and internet-based learning platforms have become essential tools for teaching and communication between faculty and students, offering flexibility in both time and location and enabling college students to attend classes safely from home, mitigating the risks of COVID-19 [12,13,14]. These online platforms also enable instructors to provide additional materials not available in offline courses, while facilitating timely communication between students and faculty [15,16,17,18]. However, the impact of the COVID-19 pandemic on higher education institutions has raised concerns among researchers about the quality of teaching; while online education has been useful and beneficial at some stages of the pandemic, it also poses challenges [19,20].
In accordance with the guidelines of the Chinese Ministry of Education, educational institutions responded to the demand for free online streaming platforms [21]. During the pandemic lockdown, the Ministry of Education and school governing bodies issued guidelines stating that online education is not limited to the classroom: online lectures, student-led mastery courses, and televised on-air classroom instruction can also serve as alternatives [22]. Online classroom instruction can play a greater role at this stage, as it is no longer restricted by time and location and is more widely used by teachers [23]. With the rapid development of the Internet, online education has gradually improved, and popular platforms include Tencent Meeting, the DingTalk platform, and Fishu. The DingTalk platform, in particular, has received high acclaim from users since its launch. It is not only popular among companies, hospitals, and restaurants, but it also provides a great deal of help to individual college teachers [24,25]. University teachers can use the DingTalk platform to teach online during pandemic prevention and control measures [26]. Additionally, the platform can help college teachers improve information communication and work efficiency, significantly improving their information management and office-level communications [27]. The platform offers various features, such as the DING function for improving communication efficiency, real-name authentication, a continuous microphone for online dialogue and information storage between teachers and students, a sticky note function for easy access to important information, and a Cloud drive for large-capacity storage of personal files [28].
Therefore, the authors identified a suitable research model based on satisfaction factors. The willingness of students to use the online teaching platform, DingTalk, is influenced by various factors. Firstly, the platform's usefulness and ease of use significantly affect students' willingness to use it. If the platform is easy to operate, has a friendly interface, and provides clear and convenient learning resources and interactive tools, students will be more inclined to use it. Secondly, the quality and appeal of the teaching content are also crucial in determining students' willingness to use the platform. If the lessons on the platform are engaging, informative, and linked to classroom teaching, meeting the learning needs and interests of students, students will be more likely to participate actively in online learning. Additionally, the mode of interaction between teachers and students plays a vital role in students' willingness to use the platform. If teachers can provide engaging interaction, timely feedback, and personalized guidance through the platform, stimulating students' interest and motivation, students will be more eager to use the platform for online learning. Overall, factors such as usefulness, ease of use, teaching content, interaction, and individual student characteristics combine to influence students' willingness to use the DingTalk platform. This study adopts the technology acceptance model as the fundamental model, constructs the questionnaire from indicator items, uses an online questionnaire to survey college students' willingness to use the DingTalk platform, and then analyzes the questionnaire results to derive the findings.

2. Materials and Methods

2.1. Research Methods

2.1.1. Technology Acceptance Model

In studies on the use of online classes by college students through the DingTalk platform, researchers have examined student attitudes towards online education, as well as their opinions on the roles that teachers should play in the online classroom. They have also looked at student behavior and attitudes towards using the DingTalk platform. There are many theoretical approaches to studying the adoption of information technology, including the technology acceptance model (TAM) [29,30]. This model was pioneered by Professor Davis in 1989 and helps explain how users come to accept and use information systems. Thanks to subsequent refinements, the explanatory power of the TAM has improved from about 10% to 60% [31], making it a valuable tool for studying online education. The technology acceptance model is the core theoretical basis of this study. The model applies the theory of reasoned action (TRA) to users' acceptance of information content, considering personal beliefs, subjective attitudes, willingness to act, and external factors [32]. In this study, the TAM is applied to examine the degree to which users accept information content and how they receive it [33].
The model is widely recognized for its clear structure, simple questions, and analytical and predictive power (Figure 1). It contains five key elements: perceived usefulness, perceived ease of use, usage behavior, attitude toward use, and behavioral intention, where perceived usefulness and perceived ease of use are influenced by external factors [34]. Usage behavior is determined by behavioral intention, and behavioral intention is determined by usage attitude. In 2003, Venkatesh and Davis extended the model to include elements such as behavioral norms, performance assessments, and competence expectations, and they emphasized the indirect effects of moderators such as personality, age, and experience. Using the technology acceptance model as its theoretical basis, this study analyzes the factors associated with the development of online education in colleges, and it aims to provide a reference for the improvement of the online teaching model.
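The causal chain described above, from external factors through the two perception constructs to actual usage, can be sketched as a small directed graph. The following is a minimal illustration of the path structure as TAM is commonly drawn (the construct names are shorthand, not part of any formal specification):

```python
# TAM path structure: each construct maps to the constructs it influences.
tam_paths = {
    "external factors":      ["perceived usefulness", "perceived ease of use"],
    "perceived ease of use": ["perceived usefulness", "attitude toward use"],
    "perceived usefulness":  ["attitude toward use", "behavioral intention"],
    "attitude toward use":   ["behavioral intention"],
    "behavioral intention":  ["usage behavior"],
}

def reachable(start: str) -> set:
    """All constructs influenced, directly or indirectly, by `start`."""
    seen, stack = set(), [start]
    while stack:
        for nxt in tam_paths.get(stack.pop(), []):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

# External factors ultimately propagate to usage behavior.
print("usage behavior" in reachable("external factors"))  # True
```

This makes explicit that perceived usefulness and perceived ease of use act only indirectly on usage behavior, through attitude and intention.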

2.1.2. Structural Equation Model

Structural equation modeling (SEM) is a statistical approach that describes a hypothesized model through a system of related linear equations. Compared to exploratory methods, SEM is more capable of specifying the potential effects of correlated or uncorrelated factors, and it is better adapted to psychological practice. It also allows each observation to load on a single latent variable, making the structure clearer. Additionally, SEM supports multimethod testing (using multitrait-multimethod models, also known as MTMM), which can eliminate other sources of error. Rather than assuming that measurement errors are uncorrelated, SEM can explicitly model correlations between the errors of specific variables. Second, the correlations between variables are established while taking measurement error into account. In this step, the observed (manifest) and latent variables are defined statistically, such that the observed phenomena can be used to infer latent concepts. Researchers can employ latent factors in their analysis of research pathways, allowing them to investigate the direct effects of different factors, the combined effects of indirect factors, and mediation effects. Furthermore, this approach can also help assess test bias in the target variables, leading to more accurate research conclusions [35].
In summary, the SEM technique has the following advantages: (1) it allows relationships between independent and dependent variables to be tested; (2) it allows multiple observed indicator variables to represent the latent variables (similar to factor analysis) and permits assessment of the reliability and validity of the target variables; (3) it provides more flexible measurement models than traditional methods (in traditional methods, an item usually depends on only one factor, but in SEM, an indicator variable can depend on two or more latent factors at the same time); and (4) it allows one to establish associations between latent variables and ensures consistency between predictive models and results [36].

2.2. Questionnaire Design

Through a literature review, the theoretical foundations, the definitions of the core concepts, and the user characteristics of the online delivery platform, including the richness of its content, were established. Combined with the structural equation model, the completion-quality statistics and the analysis of the results for each index constitute the specific content to be evaluated. On this basis, the factor index system for the DingTalk online teaching platform can be roughly divided into five levels: experienced usefulness, experienced availability, external environment, service satisfaction, and long-term development needs. We developed our own scale based on the specific scenarios and relevant research methods (Table 1).
The design of the scale draws on Davis et al.'s theoretical formulation of perceived usefulness, perceived ease of use, usage behavior, intention to use, and behavioral attitudes. External influences on online education are captured through 23 question items organized into 5 basic dimensions (Table 2). The test scale was a 9-point scale (very dissatisfied to very satisfied) with scores ranging from 1 to 9 in that order, and respondents answered according to their degree of personal agreement with the statements in each target question. The basic information section of the questionnaire recorded the respondent's name, age, position, and whether they had experience with online education. After the questionnaire design was completed, a pilot test with peers was used to identify and correct inappropriate or unclear questions. To determine the reliability of the results, exploratory and confirmatory factor evaluation methods were applied to the checklist to test the rationality of the scale layout and the underlying technology acceptance model.

3. Questionnaire Statistics and Data Analysis

3.1. Data from Data Collection

Through online research, the authors downloaded 528 samples from the questionnaire platform for further processing and analysis. The study targeted students who attended classes using the DingTalk platform software. A total of 20 respondents did not use the DingTalk platform to attend classes online, so these 20 questionnaires were invalid. Because the questionnaire platform only allows submission when all questions are answered, the remaining 508 samples were complete, without missing values. The authors then screened these 508 records for invalid sample data to ensure that the samples used for the empirical analysis were of acceptable quality. The 508 completed questionnaires were manually inspected, and 13 samples with inconsistent indicator answers were eliminated. Finally, after sorting the sample data from the 528 responses, the authors obtained 495 valid data samples for empirical analysis, and the sample validity rate of the recovered questionnaires was 93.75%.
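The screening steps above reduce to simple arithmetic; a quick check, using the counts reported in the text, confirms the stated validity rate:

```python
total = 528           # questionnaires downloaded from the platform
non_users = 20        # respondents who never used DingTalk for classes
inconsistent = 13     # samples with contradictory indicator answers

valid = total - non_users - inconsistent
rate = valid / total
print(valid, f"{rate:.2%}")  # 495 93.75%
```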

3.2. Analysis of the Basic Information of the Questionnaire

Table 3 displays the numerical attributes of the demographic variables, which reflect the demographic distribution of the subjects examined in this survey. The frequencies describe the characteristics of the sample, while the frequency deviations reflect its variability. From the frequency assessment of the survey respondents by gender and age, the distribution profile is essentially in line with the characteristics of the sample survey. The gender results show that males account for 41% and females for 59%; the age results show that 69.50% of respondents are aged 18–30, indicating that the questionnaire is essentially oriented toward young and middle-aged groups (Table 3). Those who are not yet 18 years old are freshmen who have just entered college; those aged 18–30 are roughly undergraduate and graduate students who have been at college for some time; those aged 31–40 are roughly graduate and doctoral students still pursuing research; those aged 41–50 are roughly school teachers and part-time teachers in colleges and universities; and those aged 51 and above are roughly teachers who have been teaching in schools for many years.

3.3. Data Analysis

3.3.1. Reliability Analysis

The consistency and reliability of the questionnaire were measured using a Cronbach's alpha reliability assessment, with data analysis performed in SPSSAU. Because the questionnaire used a Likert-type scale, Cronbach's alpha was an appropriate measure of its reliability. Table 4 displays the reliability coefficients, as computed by the Cronbach reliability procedure in SPSSAU. The standardized Cronbach coefficient was found to be 0.967, indicating high reliability. The reliability coefficient generally ranges between 0 and 1, with higher values indicating higher reliability. Since the reliability of this questionnaire is notably higher than 0.9, it is suitable for subsequent analysis.
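For reference, Cronbach's alpha can be computed directly from an item-score matrix using the standard formula; the sketch below mirrors the reliability check performed in SPSSAU (the score data are illustrative, not from the survey):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

# Illustrative 9-point-scale answers from four respondents on three items.
scores = np.array([
    [9, 8, 9],
    [7, 7, 8],
    [5, 4, 5],
    [3, 2, 2],
])
print(round(cronbach_alpha(scores), 3))  # close to 1: high internal consistency
```

Values above 0.9, such as the 0.967 reported here, indicate that the items measure their construct very consistently.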

3.3.2. Validity Analysis

Validity is a crucial aspect in evaluating the accuracy and reliability of survey instruments and data. It determines whether the questionnaire results are genuine and whether respondents' evaluations are impartial. Content validity and construct validity are commonly used indicators to assess the validity of surveys.
The structural validity of this study was tested using SPSSAU, and exploratory factor analysis was conducted for the structural validity test. According to the results of the exploratory factor analysis in Table 5, the KMO coefficient was calculated to be 0.98. A KMO coefficient between 0 and 1 indicates that the questionnaire has validity, and the closer the coefficient is to 1, the higher the validity. The accompanying significance test also showed that the survey had good validity, as its p-value was infinitesimally close to zero and the corresponding null hypotheses were rejected.
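The KMO statistic compares the squared correlations between items against their squared partial correlations; a minimal NumPy sketch of the overall KMO measure follows (standard formula, with synthetic single-factor data for illustration):

```python
import numpy as np

def kmo_overall(data: np.ndarray) -> float:
    """Overall Kaiser-Meyer-Olkin measure for an (n_obs, n_vars) matrix."""
    corr = np.corrcoef(data, rowvar=False)
    inv = np.linalg.inv(corr)
    # Partial correlations come from the inverse of the correlation matrix.
    d = np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
    partial = -inv / d
    np.fill_diagonal(corr, 0.0)
    np.fill_diagonal(partial, 0.0)
    r2, p2 = (corr ** 2).sum(), (partial ** 2).sum()
    return r2 / (r2 + p2)

rng = np.random.default_rng(0)
common = rng.normal(size=(300, 1))                 # shared latent factor
items = common + 0.5 * rng.normal(size=(300, 5))  # five correlated items
print(round(kmo_overall(items), 2))  # close to 1: suitable for factor analysis
```

A value such as the 0.98 reported in Table 5 indicates that the correlation structure is dominated by shared variance, which is exactly what factor analysis requires.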

3.3.3. Test of Variability

The main purpose of the variability analysis was to study differences in the variables at different levels by using tests such as the independent-sample t-test, the chi-square test, and one-way ANOVA. In this analysis, we mainly used the independent-sample t-test and one-way ANOVA to analyze the survey data according to their characteristics. The results of the independent-sample t-test, shown in Table 6, indicate differences in the various satisfaction levels across genders, suggesting that the gender factor has a certain influence on satisfaction. However, the result of the significance test was greater than 0.05; as such, the original hypothesis cannot be rejected.
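Assuming SciPy is available, the two tests named above can be run as follows on illustrative satisfaction scores (the data are synthetic; the group labels mirror the survey's gender and age factors):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical satisfaction scores on the 9-point scale, by gender.
male = rng.integers(4, 10, size=200)
female = rng.integers(4, 10, size=290)

# Independent-sample t-test for a two-level factor (gender).
t_stat, t_p = stats.ttest_ind(male, female, equal_var=False)

# One-way ANOVA for a factor with more than two levels (e.g., age group).
g1 = rng.integers(4, 10, size=100)
g2 = rng.integers(4, 10, size=150)
g3 = rng.integers(4, 10, size=120)
f_stat, f_p = stats.f_oneway(g1, g2, g3)

print(f"t-test p = {t_p:.3f}, ANOVA p = {f_p:.3f}")
# A p-value above 0.05 means the null hypothesis of equal group means
# cannot be rejected at the 5% level.
```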

3.3.4. Structural Equation Model Analysis

The basic idea of structural equation modeling (SEM) is to establish a pattern of interconnections among the factors of interest based on prior concepts and existing understanding. This pattern is derived and assumed, followed by measurement, which estimates the coefficients of the exogenous variables and constructs a covariance matrix known as the pattern matrix. Testing the correctness of the equation model involves assessing how well the constructed hypothetical model and the data matrix fit each other. If the hypothesized model can fit the actual sample data, this indicates the correctness of the modeling [40].
The main purpose of structural equation modeling is to construct specific outcome models to study the linkages between variables and their changes during the hypothesis period. The model is usually described by a system of linear equations and consists of two parts: a measurement model and a structural model. The measurement model describes the relationship between the latent and observed variables, through which the latent variables can be defined based on the observed factors, whereas the structural model describes the associations between the latent variables. The relationship between the measurement model and the structural model can be represented in matrix form as follows [41]:
Measurement models: Y = Λ_Y η + ε, X = Λ_X ξ + δ
Structural model: η = Bη + Γξ + ζ
Structural equation modeling considers phenomena that cannot be directly observed, namely the latent variables that one wants to explore in depth. One then uses certain variables (indicators) that can be directly observed in order to express such latent variables and to determine the structural links among them. It is a type of statistical instrument that studies macro-level change patterns from micro-level individual data. The association between the latent variables can typically be written as the following structural equation:
η = Bη + Γξ + ζ
where η is the endogenous latent variable, B describes the relationships between the endogenous latent variables, Γ is the impact of the exogenous latent variables on the endogenous latent variables, ξ is the exogenous latent variable, and ζ is the residual term of the structural equation [42,43]. The relationships between the observed and latent variables are frequently written as the following measurement equations [44,45]:
X = Λ_X ξ + δ
Y = Λ_Y η + ε
Structural equation modeling has the advantages of allowing for measurement error in both the independent and dependent variables, as well as handling multiple independent variables at the same time, when compared with other models. Therefore, structural equation modeling is a wide-ranging mathematical model that can address many problems in higher education, business management, market economics, tourism, psychology, sociology, and other fields [45].
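To make the measurement and structural equations concrete, the sketch below simulates data from a toy model with one exogenous and one endogenous latent variable; the loadings Λ and the structural coefficient Γ are hypothetical values chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000

# Structural equation η = Bη + Γξ + ζ; with a single endogenous latent
# variable, B = 0, so η = Γξ + ζ.
Gamma = 0.7
xi = rng.normal(size=n)
eta = Gamma * xi + np.sqrt(1 - Gamma**2) * rng.normal(size=n)

# Measurement equations X = Λ_X ξ + δ and Y = Λ_Y η + ε,
# with three observed indicators per latent variable (hypothetical loadings).
Lx = np.array([0.9, 0.8, 0.7])
Ly = np.array([0.85, 0.8, 0.75])
X = np.outer(xi, Lx) + 0.3 * rng.normal(size=(n, 3))
Y = np.outer(eta, Ly) + 0.3 * rng.normal(size=(n, 3))

# The simulated latent variables recover the structural effect Γ.
print(round(np.corrcoef(xi, eta)[0, 1], 1))  # ≈ 0.7
```

In a real analysis the direction is reversed: only X and Y are observed, and the SEM software estimates Λ_X, Λ_Y, B, and Γ from their covariance matrix.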
The above table provides an assessment of the impact of the latent variables and of the impact results (Table 7). In terms of impact results, the following conclusions can be drawn. The standardized coefficients of perceived usefulness and perceived ease of use on usage attitude were 0.287 and 0.701, respectively, which suggests that perceived ease of use has a strong effect on usage attitude [46]. The attitude toward use variable has a strong influence on user loyalty. Meanwhile, the "Summary Table of Model Regression Coefficients" shows all the measured relationships, where "-" means that the item is a reference item and is not output [46]. The measurement relationships are good, as the standardized loading coefficients are essentially all higher than 0.6.
The table above shows the model fit results (Table 8). There are many model fit indicators, and SPSSAU lists all of them. Few models can satisfy all the fit indices at once; however, the most commonly used fit indices are recommended to fall within the acceptable range. There are many SEM fit metrics, and there are no fixed requirements as to which metrics to use; most studies report only a few of them.
In this study, the chi-square to degrees of freedom ratio was 2.219, which is less than 3, indicating a good model structure. Additionally, the RMSEA value is 0.050, which is equal to the 0.05 threshold, while the RMR value is 0.034, which is less than 0.05. Meanwhile, the GFI, CFI, and AGFI values were all larger than 0.9, the NFI value was 0.955, which is higher than 0.9, and all the other parameters were also within the normal range. For these reasons, the model is well established and its results are credible.
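The threshold checks described above can be written down explicitly. The sketch below applies the commonly cited cutoffs to the fitted values reported in this section (GFI, CFI, and AGFI are stated only as being above 0.9, so they are omitted; the cutoffs themselves are conventions, not fixed requirements):

```python
# Fitted values reported in the text.
fit = {"chi2/df": 2.219, "RMSEA": 0.050, "RMR": 0.034, "NFI": 0.955}

checks = {
    "chi2/df": fit["chi2/df"] < 3,     # < 3 suggests acceptable fit
    "RMSEA":   fit["RMSEA"] <= 0.05,   # <= 0.05 suggests close fit
    "RMR":     fit["RMR"] < 0.05,      # < 0.05 suggests small residuals
    "NFI":     fit["NFI"] > 0.9,       # > 0.9 suggests good incremental fit
}
print(all(checks.values()))  # True: every reported index meets its cutoff
```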
The above table shows the results of the covariance relationship MI (modification index) indicator, with the output threshold set to MI > 10 (Table 9). If the model fits the indicator poorly, one should consider the need to establish the covariance correlation between B1 and D1 after re-analysis. This table contains the relationships between several variables, whose strength is measured by the MI value: the higher the MI value, the stronger the suggested correlation between the two variables. Each relationship is also given a 'parameter change', which indicates how the estimate would change if the corresponding parameter were freed. These relationships include the interactions between PE0U2 and UB3, PE0U1 and UB2, A2 and PU3, A1 and BI3, A1 and BI1, UB6 and UB3, UB6 and PU2, UB6 and BI1, UB6 and A2, and UB5 and PE0U2.
The table above displays the 'Influence Relationship MI Metric', for which the current parameter setting only generates output when the MI value is greater than 10 (Table 10). An absence of data in the table would imply that no MI values in the influence relationships exceed 10. However, researchers can reset the threshold to MI > 3 to output more MI data, although single-digit values are usually considered small. If there is a large MI value (e.g., greater than 20), it may be necessary to rebuild the influence relationship model. The relationship in the first row, 'Effect of Usage Behavior on Attitude to Use', has an MI value of 10.266, with a change to one of the parameters of Usage Behavior producing a parameter change of 10.937. The relationship in the second row, 'Effect of Usage Attitude on Perceived Usefulness', has an MI value of 12.495, with a change to one of the parameters of Usage Attitude producing a parameter change of 4.514.
Whether to incorporate the influence relationships shown in the table into the model (i.e., whether to rebuild the model) depends on two factors: (1) whether professional knowledge allows it, and (2) whether the MI values and the model evaluation change. If expert understanding does not currently permit it, the model cannot be modified, even if the MI value is large. It is not advisable to adjust the model arbitrarily for the sake of meeting standards. The absence of further entries among the MI values implies that the remaining MI values were far less than 10 and that there was no need to adjust the model further (as doing so would not improve the results). Therefore, the current model is the final result for the influence relationships.
The table above displays the fitted R-squared values; this indicator reflects how strongly each influenced term, i.e., Y, is explained by the influencing term X (Table 11). The table shows a series of independent variables and their respective R-squared values in relation to the dependent variable. In this table, 'perceived ease of use' is an independent variable with a relatively high R-squared value (0.991) with respect to the dependent variable. This means that there is a strong correlation between perceived ease of use and the dependent variable in the data sample, and that perceived ease of use can be used to predict the dependent variable. In other words, this R-squared value suggests that perceived ease of use has high explanatory power for changes in the dependent variable in this particular study, while the other independent variables may have lower explanatory power. This suggests that perceived ease of use is more important to the dependent variable than the other independent variables in this study.
Summarizing the above, the final coefficients of each result are organized and presented in a diagram, which facilitates the display of the results; the path diagram was plotted in SPSSAU and takes the following form.
The figure above was drawn with SPSSAU and shows the 'measurement relationships' and 'impact relationships' (i.e., the standardized regression coefficients in the summary table of model regression coefficients), as well as the covariance relationships (i.e., the standardized estimated coefficients in the table of covariance relationships). The model contains a number of variables and their interrelationships, each represented by a box. The arrows indicate the paths in the model, i.e., the relationships between the variables. The number on each arrow is the path coefficient, or standardized regression coefficient, which indicates the strength and direction of the interaction between the two variables. As the figure shows, perceived usefulness, perceived ease of use, attitude toward use, and behavioral intention all influence the final usage behavior, and it can also be seen that perceived usefulness and perceived ease of use are the first to influence users' intention to use.
Users’ perceived usefulness of the DingTalk platform was positively correlated with their willingness to use it. It can be seen from the questionnaire that the perceived usefulness was directly related to the users’ willingness to use the Internet product, as is shown in Figure 2. Furthermore, the DingTalk platform needs to make users aware that it can provide the information and services needed to help solve problems and can meet their usage needs, and only then will users be willing to use it; otherwise, users may choose traditional Internet products, such as WeChat and QQ.
DingTalk users’ perceived ease of use and willingness to use are closely related, as is shown in Figure 2 from the collected questionnaires. This shows that perceived ease of use is closely related to the users’ willingness to use. Users who feel comfortable using the DingTalk platform are more likely to accept the benefits of the service and are therefore more willing to use it. As the DingTalk platform is easy to use, users rated the ease of use higher [47].

4. Conclusions

4.1. Factors Influencing Willingness to Learn Online

In the special stage of COVID-19 pandemic prevention and control, schools need to establish an information management system to effectively and accurately monitor students' virus prevention and control status by improving the student group management system. The DingTalk platform is suitable for teaching and for implementing various COVID-19 pandemic prevention methods. Furthermore, it aids in the management of knowledge and enables the rapid delivery of teaching content. An Internet interface environment is mainly embodied in pictures, sound, video, and text. Building on conventional information forms, such as picture forwarding, graphic notifications, and live video screenshots, this paper applies the research model of the technology acceptance model, combined with a user experience questionnaire on the DingTalk platform, to structural equation model (SEM) analysis and testing. As a result, the authors sought to identify the factors affecting users' willingness to use the DingTalk platform, treating the factors affecting users' perceptions as independent variables and willingness to use as the dependent variable, in terms of perceived usefulness, perceived ease of use, usage behavior, and usage attitudes, as follows:
(1)
Perceived usefulness has a significant indirect effect on students’ willingness to use the DingTalk online teaching platform, acting on willingness to use through usage attitude rather than directly, and students are largely influenced by outside factors. As the impact of COVID-19 grew, schools adopted the DingTalk platform so that students could be managed uniformly on the Internet while teaching quality was assured; because most schools chose the platform, students used it for their classes. Decisions about the platform are therefore strongly constrained by its efficiency, so a good, efficient teaching platform is crucial.
(2)
Perceived ease of use also has a significant indirect effect on students’ willingness to use the DingTalk online teaching platform. Students who find the platform easy to use tend to evaluate its features, performance, and design positively, which increases their satisfaction and motivates them to use it more actively.
(3)
Behavioral attitudes have a direct effect on students’ willingness to use the DingTalk online teaching platform. Online teaching uses the Internet and digital technology for distance education, with students and teachers interacting and learning in a virtual environment. The questionnaire analysis shows that students’ behavioral attitudes, i.e., their perceptions, attitudes, and beliefs about online teaching and learning, affect their willingness to adopt and participate in it.
(4)
Students’ behavior toward the DingTalk platform is directly influenced by their willingness to use it. Because the platform provides rich, practical functions and a simple, friendly interface, students are more inclined to use it actively. Students who report higher expectations of the effectiveness of online teaching on the platform also report stronger usage behavior. Active students may use the platform more frequently to meet social needs, and students are likely to use it more actively when the school adopts it as the primary tool for learning and communication and when it is positively evaluated and supported.
(5)
In summary, the study found that students’ willingness to use the DingTalk online teaching platform was influenced by several factors. First, perceived usefulness has an indirect effect on willingness to use: students are largely influenced by external factors, such as the COVID-19 epidemic, which led schools to choose the platform for online teaching, so an efficient teaching platform is crucial. Second, perceived ease of use also has an indirect effect: perceiving the platform as easy to use increases students’ satisfaction and motivates more active use. Third, behavioral attitudes have a direct impact on willingness to use; students’ perceptions, attitudes, and beliefs about online teaching influence their willingness to adopt and participate in it. Finally, usage behavior is directly influenced by intention to use: students with high expectations of the platform are likely to use it more actively, especially where social needs are high and the platform is positively evaluated and supported.
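The “indirect effect” language in points (1) and (2) can be made concrete: in a path model, an indirect effect is the product of the standardized coefficients along the mediating path. A minimal sketch of the arithmetic, using the standardized estimates reported in Table 7 (an illustration only; testing the significance of the indirect effect would require, e.g., bootstrapped standard errors):

```python
# Standardized path coefficients taken from Table 7.
pu_to_attitude = 0.287    # perceived usefulness -> usage attitude (p = 0.008)
attitude_to_bi = 1.059    # usage attitude -> behavioral intention (p < 0.001)
pu_to_bi_direct = -0.117  # direct path, non-significant (p = 0.474)

# The indirect effect is the product of the paths it traverses;
# the total effect adds the (non-significant) direct path.
indirect_effect = pu_to_attitude * attitude_to_bi
total_effect = indirect_effect + pu_to_bi_direct

print(f"indirect: {indirect_effect:.3f}, total: {total_effect:.3f}")
```

The non-significant direct path combined with a non-trivial product term is what motivates describing perceived usefulness as acting on behavioral intention through usage attitude.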

4.2. Recommendations

According to the analysis of the study results, students’ willingness to use online learning platforms was significantly influenced by the quality of platform services and the convenience of learning. A learning platform should therefore have a simple design, clear functions, and easy operation, and should provide convenient, quick access so that students can easily view assignments, submit assignments, and retrieve learning materials. As the COVID-19 pandemic recedes, many colleges and universities are gradually shifting back from online administration and learning to offline instruction. However, integrating online and offline formats for managing instruction was found to challenge the administrative side of the school: these changes involve not only a shift in class format but also adaptation to new teaching models, producing broader social shifts. In this context, technological conditions, resources, and a sense of innovation can help create more favorable conditions for convenient learning and living. The authors therefore propose the following recommendations to facilitate the management of blended online and offline learning and teaching in schools:
(1)
Strengthening the top-level design and systematic planning of course teaching implementation.
This is crucial for full-time higher vocational education’s goal of cultivating high-quality, technically skilled talent. Most basic and specialized courses in the curriculum system can be transformed into an online and offline hybrid format. Certain engineering internship and practical training courses still need to be taught in person with equipment and facilities and thus currently suit only an offline format; however, with the continuous progress of simulation technology and virtual instruments, even these courses may become amenable to hybrid teaching. When implementing hybrid teaching, the offline format cannot simply be copied online. Instead, courses should be systematically sorted and planned according to teaching principles: take into account the knowledge and skill objectives; analyze which objectives suit online teaching, which suit offline teaching, and which teaching forms, methods, and resources each requires; and then reasonably allocate the online and offline content and carry out the instructional design.
(2)
Integrate and improve the teaching supervision and inspection mechanism.
The online and offline hybrid teaching quality assurance system is part of the university’s overall teaching quality assurance system. Given the diversified teaching forms, flexible teaching times, and the variety of online platforms and interaction formats, supervision is difficult to manage. The primary task is therefore to formulate supervision rules and regulations for each teaching mode so that supervision is standardized, systematic, and operable. The supervisory team should receive targeted training and improve its level of information-based teaching and supervision to ensure that the work is properly implemented. In the concrete implementation process, the following issues should be noted: first, quality supervision should focus on the coherence, articulation, and effectiveness of online and offline teaching; second, teaching design and organization should be student-centered, focusing on students’ learning status and evaluation feedback; third, teachers should be supported in innovating teaching methods and means, with supervision carried out through a “feedback–improvement–feedback” cycle; and fourth, attention should be paid to building foundational databases of online teaching and to how the data are extracted, analyzed, evaluated, and quantified, all of which helps to accurately identify deficiencies and problems, helps teachers improve their teaching quality, and provides the school with direct data as an effective basis for teaching supervision and research.

Author Contributions

Conceptualization, F.Z. and Y.G.; methodology, F.Z. and Y.Z.; software, J.P.; validation, H.Z. and J.P.; formal analysis, H.Z.; investigation, F.Z. and J.P.; resources, Y.G.; data curation, J.P.; writing—original draft preparation, Y.G. and J.P.; writing—review and editing, Y.Z.; visualization, J.P.; supervision, F.Z.; project administration, F.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This study was conducted through the Anhui University 2020 Talent Introduction Scientific Research Start-up Fund Project (Project No. S020318019/001).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The experiment data used to support the findings of this study are included in the article.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Moftakhar, L.; Seif, M. The exponentially increasing rate of patients infected with COVID-19 in Iran. Arch. Iran. Med. 2020, 23, 235–238. [Google Scholar] [CrossRef]
  2. Chen, N.; Zhou, M.; Dong, X.; Qu, J.; Gong, F.; Han, Y.; Zhang, L. Epidemiological and clinical characteristics of 99 cases of 2019 novel coronavirus pneumonia in Wuhan, China: A descriptive study. Lancet 2020, 395, 507–513. [Google Scholar] [CrossRef] [PubMed]
  3. Yu, L.; Huang, L.; Tang, H.-R.; Li, N.; Rao, T.-T.; Hu, D.; Wen, Y.-F.; Shi, L.-X. Analysis of factors influencing the network teaching effect of college students in a medical school during the COVID-19 epidemic. BMC Med. Educ. 2021, 21, 397. [Google Scholar] [CrossRef]
  4. Plaza-Ccuno, J.N.R.; Vasquez Puri, C.; Calizaya-Milla, Y.E.; Morales-García, W.C.; Huancahuire-Vega, S.; Soriano-Moreno, A.N.; Saintila, J. Physical Inactivity Is Associated with Job Burnout in Health Professionals during the COVID-19 Pandemic. Risk Manag. Healthc. Policy 2023, 16, 725–733. [Google Scholar] [CrossRef]
  5. Sobaih, A.E.E.; Hasanein, A.M.; Abu Elnasr, A.E. Responses to COVID-19 in higher education: Social media usage for sustaining formal academic communication in developing countries. Sustainability 2020, 12, 6520. [Google Scholar] [CrossRef]
  6. Ali, W. Online and remote learning in higher education institutes: A necessity in light of COVID-19 pandemic. High. Educ. Stud. 2020, 10, 16–25. [Google Scholar] [CrossRef]
  7. Bezovski, Z.; Poorani, S. The evolution of e-learning and new trends. IISTE 2016, 6, 50–57. [Google Scholar]
  8. Coman, C.; Țîru, L.G.; Meseșan-Schmitz, L.; Stanciu, C.; Bularca, M.C. Online Teaching and Learning in Higher Education during the Coronavirus Pandemic: Students’ Perspective. Sustainability 2020, 12, 10367. [Google Scholar] [CrossRef]
  9. Lee, B.C.; Yoon, J.O.; Lee, I. Learners’ acceptance of e-learning in South Korea: Theories and results. Comput. Educ. 2009, 53, 1320–1329. [Google Scholar] [CrossRef]
  10. Sangrà, A.; Vlachopoulos, D.; Cabrera, N. Building an inclusive definition of e-learning: An approach to the conceptual framework. Int. Rev. Res. Open Distrib. Learn. 2012, 13, 145–159. [Google Scholar] [CrossRef]
  11. Khan, M.A.; Raad, B. The Role of E-Learning in COVID-19 Crisis. 2020. Available online: https://www.researchgate.net/publication/340999258_THE_ROLE_OF_E-LEARNING_IN_COVID-19_CRISIS (accessed on 7 October 2022).
  12. Anaraki, F. Developing an Effective and Efficient eLearning Platform. Int. J. Comput. Internet Manag. 2004, 12, 57–63. Available online: https://dialnet.unirioja.es/servlet/articulo?codigo=5823402 (accessed on 27 October 2022).
  13. Costa, C.; Alvelos, H.; Teixeira, L. The use of Moodle e-learning platform: A study in a Portuguese University. Procedia Technol. 2012, 5, 334–343. [Google Scholar] [CrossRef]
  14. Cacheiro-Gonzalez, M.L.; Medina-Rivilla, A.; Dominguez-Garrido, M.C.; Medina-Dominguez, M. The learning platform in distance higher education: Student’s perceptions. Turk. Online J. Distance Educ. 2019, 20, 71–95. [Google Scholar] [CrossRef]
  15. Marinoni, G.; Van’t Land, H.; Jensen, T. The Impact of COVID-19 on Higher Education around the World. International Association of Universities. Available online: https://www.iau-aiu.net/IMG/pdf/iau_covid19_and_he_survey_report_final_may_2020.pdf (accessed on 14 November 2022).
  16. Ouadoud, M.; Nejjari, A.; Chkouri, M.Y.; El-Kadiri, K.E. Learning management system and the underlying learning theories. In Proceedings of the Mediterranean Symposium on Smart City Applications, Tangier, Morocco, 25–27 October 2017; Springer International Publishing: Cham, Switzerland, 2018; pp. 732–744. [Google Scholar]
  17. Qiu, T.S.; Wang, H.Y. CMS, LMS and LCMS for elearning. J. e-Sci. 1994, 16, 569–575. [Google Scholar]
  18. Martín-Blas, T.; Serrano-Fernández, A. The role of new technologies in the learning process: Moodle as a teaching tool in physics—Sciencedirect. Comput. Educ. 2009, 52, 35–44. [Google Scholar] [CrossRef]
  19. Rawat, R.S.; Kothari, H.C.; Chandra, D. Role of the Digital Technology in Accelerating the Growth of Micro, Small and Medium Enterprises in Uttarakhand: Using TAM (Technology Acceptance Model). Int. J. Technol. Manag. Sustain. Dev. 2022, 21, 205–227. [Google Scholar] [CrossRef]
  20. Allo, M.D.G. Is the online learning good in the midst of COVID-19 Pandemic? The case of EFL learners. J. Sinestesia 2020, 10, 1–10. [Google Scholar]
  21. Nieto-Márquez, N.L.; Baldominos, A.; Soilán, M.I.; Dobón, E.M.; Arévalo, J.A.Z. Assessment of COVID-19′s Impact on EdTech: Case Study on an Educational Platform, Architecture and Teachers’ Experience. Educ. Sci. 2020, 12, 681. [Google Scholar] [CrossRef]
  22. Alqahtani, A.Y.; Rajkhan, A.A. E-learning critical success factors during the covid-19 pandemic: A comprehensive analysis of e-learning managerial perspectives. Educ. Sci. 2020, 10, 216. [Google Scholar] [CrossRef]
  23. Vogel-Walcutt, J.J.; Fiorella, L.; Malone, N. Instructional strategies framework for military training systems. Comput. Hum. Behav. 2013, 29, 1490–1498. [Google Scholar] [CrossRef]
  24. Cesari, V.; Galgani, B.; Gemignani, A.; Menicucci, D. Enhancing qualities of consciousness during online learning via multisensory interactions. Behav. Sci. 2021, 11, 57. [Google Scholar] [CrossRef] [PubMed]
  25. Huang, D. Analysis of the Application of Ali Nails in University Management Departments. Educ. Mod. 2018, 5, 338–339. [Google Scholar] [CrossRef]
  26. Lin, A. The design and implementation of the office automation system of the second-level college based on “nail”. Comput. Knowl. Technol. 2020, 16, 89–90+95. [Google Scholar] [CrossRef]
  27. Wang, M.; Zhao, Z. A Cultural-Centered Model Based on User Experience and Learning Preferences of Online Teaching Platforms for Chinese National University Students: Taking Teaching Platforms of WeCom, VooV Meeting, and DingTalk as Examples. Systems 2022, 10, 216. [Google Scholar] [CrossRef]
  28. Linke, M.; Landenfeld, K. Competence-Based Learning in Engineering Mechanics in an Adaptive Online Learning Environment. Teach. Math. Appl. Int. J. IMA 2019, 38, 146–153. [Google Scholar] [CrossRef]
  29. Chuenyindee, T.; Montenegro, L.D.; Ong, A.K.S.; Prasetyo, Y.T.; Nadlifatin, R.; Ayuwati, I.D.; Sittiwatethanasiri, T.; Robas, K.P.E. The perceived usability of the learning management system during the COVID-19 pandemic: Integrating system usability scale, technology acceptance model, and task-technology fit. Work 2022, 73, 41–58. [Google Scholar] [CrossRef] [PubMed]
  30. Alturki, U.; Aldraiweesh, A. Application of Learning Management System (LMS) during the COVID-19 Pandemic: A Sustainable Acceptance Model of the Expansion Technology Approach. Sustainability 2021, 13, 10991. [Google Scholar] [CrossRef]
  31. Navarro, M.M.; Prasetyo, Y.T.; Young, M.N.; Nadlifatin, R.; Redi, A.A.N.P. The Perceived Satisfaction in Utilizing Learning Management System among Engineering Students during the COVID-19 Pandemic: Integrating Task Technology Fit and Extended Technology Acceptance Model. Sustainability 2021, 13, 10669. [Google Scholar] [CrossRef]
  32. Zhang, S.; Li, Y.F. A study of college teachers’ online teaching behavior based on technology acceptance model. J. Distance Educ. 2014, 3, 56–63. [Google Scholar] [CrossRef]
  33. Xiao, R.X.; Wang, H.; Qu, J.P. A study of online teaching behavior of college teachers based on technology acceptance model. Chin. J. Multimed. Web-Based Teach. Learn. 2021, 49, 27–30. (In Chinese) [Google Scholar]
  34. Davis, F.D. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q. 1989, 13, 319–340. [Google Scholar] [CrossRef]
  35. Browne, M.W.; Cudeck, R. Testing Structural Equation Models. 1993. Available online: https://www.researchgate.net/publication/284653185_Testing_Structural_Equation_Models (accessed on 29 October 2022).
  36. Davis, F.D.; Bagozzi, R.P.; Warshaw, P.R. User acceptance of computer technology: A comparison of two theoretical models. Manag. Sci. 1989, 35, 982–1003. [Google Scholar] [CrossRef]
  37. Bhattacherjee, A. This paper examines cognitive beliefs and affect influencing one’s intention to continue using (continuance) information systems (is) expectation-confirmat. SBPM 2010, 48, 162–164. [Google Scholar] [CrossRef]
  38. Venkatesh, V.; Davis, F.D. A theoretical extension of the technology acceptance model: Four longitudinal field studies. Manag. Sci. 2000, 46, 186–204. [Google Scholar] [CrossRef]
  39. Bhattacherjee, A. Understanding information systems continuance: An expectation-confirmation model. MIS Q. 2001, 25, 351–370. [Google Scholar] [CrossRef]
  40. Bentler, P.M. EQS Structural Equations Program Manual (Vol. 6). Encino, CA: Multivariate Software. 1995. Available online: https://www.mvsoft.com/wp-content/uploads/2021/04/EQS_6_Prog_Manual_422pp.pdf (accessed on 25 October 2022).
  41. Bagozzi, R.P.; Heatherton, T.F. A general approach to representing multifaceted personality constructs: Application to state self-esteem. Struct. Equ. Model. Multidiscip. J. 1994, 1, 35–67. [Google Scholar] [CrossRef]
  42. Zhang, D.; Huang, G.; Yin, X.; Gong, Q. Residents’ Waste Separation Behaviors at the Source: Using SEM with the Theory of Planned Behavior in Guangzhou, China. Int. J. Environ. Res. Public Health 2015, 12, 9475–9491. [Google Scholar] [CrossRef]
  43. Yuan, Q.; Gu, Y.; Wu, Y.; Zhao, X.; Gong, Y. Analysis of the Influence Mechanism of Consumers’ Trading Behavior on Reusable Mobile Phones. Sustainability 2020, 12, 3921. [Google Scholar] [CrossRef]
  44. Shangguan, Z.; Wang, M.Y.; Huang, J.; Shi, G.; Song, L.; Sun, Z. Study on Social Integration Identification and Characteristics of Migrants from “Yangtze River to Huaihe River” Project: A Time-Driven Perspective. Sustainability 2019, 12, 211. [Google Scholar] [CrossRef]
  45. Shuangli, P.; Guijun, Z.; Qun, C. The Psychological Decision-Making Process Model of Giving up Driving under Parking Constraints from the Perspective of Sustainable Traffic. Sustainability 2020, 12, 7152. [Google Scholar] [CrossRef]
  46. Maaravi, Y.; Heller, B. Digital Innovation in Times of Crisis: How Mashups Improve Quality of Education. Sustainability 2021, 13, 7082. [Google Scholar] [CrossRef]
  47. Lee, J.; Hwang, C.; Kwon, D. On the Effect of Perceived Security, Perceived Privacy, Perceived Enjoyment, Perceived Interactivity on Continual Usage Intention through Perceived Usefulness in Mobile Instant Messenger for business. J. Korea Soc. Digit. Ind. Inf. Manag. 2015, 11, 159–177. [Google Scholar] [CrossRef]
Figure 1. The technology acceptance model.
Figure 2. Model structure diagram.
Table 1. The measurement model of factors influencing user intention, and the measurement questions.

| Variables | Source | Indicator Content | Indicator |
|---|---|---|---|
| Perceived usefulness | Davis, F. D. (1989) [34]; Bhattacherjee, A. (2001) [37] | Timeliness of posting news | PU1 |
| | | Effectiveness of sharing resources | PU2 |
| | | Ease of replying to messages | PU3 |
| | | Degree of improvement of teaching efficiency | PU4 |
| Perceived ease of use | Davis, F. D. et al. (1989) [38] | Simple and easy-to-understand software pages | PE0U1 |
| | | Extent to which use is restricted by time zone | PE0U2 |
| | | Ease of use of software tools | PE0U3 |
| | | Degree of familiarity with software functions | PE0U4 |
| Usage behavior | Davis, F. D. (1989) [31]; Bhattacherjee, A. (2010) [37] | Stability of platform operation | UB1 |
| | | Clarity of screen and audio | UB2 |
| | | Timeliness of teacher–student interaction | UB3 |
| | | Smoothness of file transfer | UB4 |
| | | Satisfaction with the course playback function | UB5 |
| | | Satisfaction with the classroom effect | UB6 |
| Usage attitude | Davis, F. D. et al. (1989) [38] | High expectations of the teaching effect | A1 |
| | | Satisfaction of learning needs | A2 |
| | | Satisfaction with DingTalk platform online teaching | A3 |
| Behavioral intentions | Bhattacherjee, A. (2001) [39] | Supporting the continuation of the online teaching format | BI1 |
| | | Willingness to participate in teaching activities conducted by the software | BI2 |
| | | Willingness to share resources with teachers and students | BI3 |
Table 2. A description of the indicator conversion questionnaire.

| Indicator | Indicator Description |
|---|---|
| PU1 | The DingTalk platform allows one to watch important information posted by teachers in a timely manner, which helps one actively integrate it into one’s learning. |
| PU2 | Students and teachers can share learning resources with each other on the DingTalk platform, which is helpful for their learning in the course. |
| PU3 | Students and teachers can respond to messages quickly through the DingTalk platform, which helps one actively participate in the course. |
| PU4 | Online teaching can improve the efficiency of the classroom and can help learning progress. |
| PE0U1 | The DingTalk platform’s pages are simple and easy to understand, which helps one in practice. |
| PE0U2 | One thinks the DingTalk platform software is less restricted by time zones. |
| PE0U3 | One thinks the DingTalk platform software is easy to use. |
| PE0U4 | One is familiar with the functions of the DingTalk platform software. |
| UB1 | One thinks the DingTalk platform software is relatively stable. |
| UB2 | One thinks the DingTalk platform software has a clear screen and audio. |
| UB3 | One can get timely responses on the DingTalk platform when teachers and students interact with each other. |
| UB4 | One thinks the file transfer of the DingTalk platform software is smooth and fast. |
| UB5 | One can review class content through the DingTalk platform’s lesson playback function, which offers a convenient way to review lessons. |
| UB6 | One thinks the online classroom effect of the DingTalk platform software is no worse than the offline class effect. |
| A1 | One has high expectations for the effectiveness of DingTalk platform online teaching. |
| A2 | The DingTalk platform online classroom can meet one’s learning needs. |
| A3 | One is satisfied with the DingTalk platform’s capacity for online teaching. |
| BI1 | One supports the continuation of the online teaching format. |
| BI2 | One is willing to participate in the teaching activities conducted by the software. |
| BI3 | One is willing to share learning resources with teachers and students on the DingTalk platform. |
Table 3. The frequency analysis of the demographic variables.

| Variables | Options | Frequency | Percentage | Average Value | Standard Deviation |
|---|---|---|---|---|---|
| Gender | Male | 203 | 41% | 1.59 | 0.492 |
| | Female | 292 | 59% | | |
| Age | Under 18 years old | 43 | 8.70% | 2.22 | 0.739 |
| | 18–30 years old | 344 | 69.50% | | |
| | 31–40 years old | 74 | 14.90% | | |
| | 41–50 years old | 26 | 5.30% | | |
| | 51 years old and above | 8 | 1.60% | | |
| Education level | Junior high school and below | 15 | 3% | 2.99 | 0.727 |
| | High school | 89 | 18% | | |
| | Undergraduate | 279 | 56.40% | | |
| | Graduate student | 112 | 22.60% | | |
Table 4. Reliability statistics.

| Cronbach Alpha | Cronbach Alpha Based on Standardized Items | Number of Items | Number of Samples |
|---|---|---|---|
| 0.967 | 0.967 | 20 | 495 |
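The Cronbach’s alpha in Table 4 follows the standard formula α = k/(k − 1) · (1 − Σs²ᵢ/s²ₜ), where s²ᵢ are the item variances and s²ₜ is the variance of respondents’ total scores. A minimal pure-Python sketch on a small made-up response matrix (the study’s 495 × 20 data are not reproduced here):

```python
def cronbach_alpha(items):
    """items: list of respondents, each a list of k item scores."""
    k = len(items[0])

    def sample_var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Variance of each item across respondents.
    item_vars = [sample_var([row[j] for row in items]) for j in range(k)]
    # Variance of each respondent's total score.
    total_var = sample_var([sum(row) for row in items])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Illustrative data: 5 respondents, 3 Likert items (not the study's data).
responses = [
    [4, 5, 4],
    [3, 3, 3],
    [5, 5, 5],
    [2, 2, 3],
    [4, 4, 4],
]
print(round(cronbach_alpha(responses), 3))  # -> 0.956
```

SPSSAU computes the same quantity on the full 20-item questionnaire; in Python, pingouin’s `cronbach_alpha` is an equivalent off-the-shelf option.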
Table 5. KMO and Bartlett’s test.

| KMO sampling adequacy measure | | 0.98 |
|---|---|---|
| Bartlett’s test of sphericity | Approximate chi-square | 7925.11 |
| | Degrees of freedom | 190 |
| | Significance | 0.000 |
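Bartlett’s test of sphericity in Table 5 checks whether the item correlation matrix R departs from the identity matrix; its statistic is χ² = −(n − 1 − (2p + 5)/6) · ln|R| with p(p − 1)/2 degrees of freedom, which for the study’s p = 20 items gives the reported df = 190. A sketch with an illustrative 3-variable correlation matrix (not the study’s):

```python
import numpy as np
from math import log

def bartlett_sphericity(corr, n):
    """corr: p x p correlation matrix; n: sample size.
    Returns (chi_square, degrees_of_freedom)."""
    p = corr.shape[0]
    chi2 = -(n - 1 - (2 * p + 5) / 6) * log(np.linalg.det(corr))
    df = p * (p - 1) // 2
    return chi2, df

# Illustrative 3-variable correlation matrix (not the study's 20-item
# matrix, which gave chi-square = 7925.11 with df = 190 at n = 495).
R = np.array([
    [1.0, 0.6, 0.5],
    [0.6, 1.0, 0.4],
    [0.5, 0.4, 1.0],
])
chi2, df = bartlett_sphericity(R, n=495)
print(round(chi2, 2), df)
```

For an identity matrix the determinant is 1 and the statistic is 0, i.e., no evidence of correlation; large values, as in Table 5, indicate the data are suitable for factor analysis.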
Table 6. Analysis of gender differences for each dimension.

| Variables | Gender | Number of Cases | Average Value | Standard Deviation | t | Sig. |
|---|---|---|---|---|---|---|
| PU | Male | 203 | 27.23 | 6.268 | -0.796 | 0.426 |
| | Female | 292 | 27.69 | 6.400 | | |
| PE0U | Male | 203 | 27.09 | 6.263 | -0.523 | 0.601 |
| | Female | 292 | 27.39 | 6.353 | | |
| UB | Male | 203 | 40.80 | 9.404 | -0.310 | 0.757 |
| | Female | 292 | 41.06 | 9.035 | | |
| A | Male | 203 | 20.34 | 4.754 | 0.202 | 0.840 |
| | Female | 292 | 20.25 | 4.957 | | |
| BI | Male | 203 | 20.41 | 4.825 | 0.068 | 0.946 |
| | Female | 292 | 20.38 | 4.897 | | |
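Each row of Table 6 is an independent-samples t-test comparing male and female scores on one dimension. A minimal pure-Python sketch of the pooled-variance (Student’s) form on made-up scores (`scipy.stats.ttest_ind` gives the same statistic):

```python
from math import sqrt

def independent_t(a, b):
    """Pooled-variance (Student's) two-sample t statistic."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    var_a = sum((x - ma) ** 2 for x in a) / (na - 1)
    var_b = sum((x - mb) ** 2 for x in b) / (nb - 1)
    pooled = ((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2)
    return (ma - mb) / sqrt(pooled * (1 / na + 1 / nb))

# Illustrative male/female PU scores (not the study's raw data).
male = [27, 25, 30, 26]
female = [28, 29, 27, 31]
print(round(independent_t(male, female), 3))  # -> -1.271
```

In Table 6 every p-value exceeds 0.05, so none of the gender differences is statistically significant.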
Table 7. A summary table of the model regression coefficients.

| X | Y | Unstandardized Regression Coefficient | SE | z (CR Value) | p | Standardized Regression Coefficient |
|---|---|---|---|---|---|---|
| Perceived usefulness | Usage attitude | 0.283 | 0.106 | 2.661 | 0.008 | 0.287 |
| Perceived usefulness | Behavioral intention | -0.108 | 0.151 | -0.715 | 0.474 | -0.117 |
| Perceived ease of use | Perceived usefulness | -2.177 | 4.440 | -0.490 | 0.624 | -2.036 |
| Perceived ease of use | Usage attitude | 0.737 | 0.118 | 6.266 | 0.000 | 0.701 |
| Usage behavior | Perceived usefulness | 2.960 | 4.125 | 0.718 | 0.473 | 2.981 |
| Usage behavior | Perceived ease of use | 0.925 | 0.047 | 19.768 | 0.000 | 0.996 |
| Usage attitude | Behavioral intention | 0.996 | 0.161 | 6.196 | 0.000 | 1.059 |
Table 8. The model-fitting indicators.

| Commonly Used Indicators | χ2 | df | p | χ2/df | GFI | RMSEA | RMR | CFI | NFI | NNFI |
|---|---|---|---|---|---|---|---|---|---|---|
| Judgment criterion | - | - | >0.05 | <3 | >0.9 | <0.10 | <0.05 | >0.9 | >0.9 | >0.9 |
| Value | 361.756 | 163 | 0.000 | 2.219 | 0.930 | 0.050 | 0.092 | 0.975 | 0.955 | 0.971 |

| Other Indicators | TLI | AGFI | IFI | PGFI | PNFI | SRMR | RMSEA 90% CI |
|---|---|---|---|---|---|---|---|
| Judgment criterion | >0.9 | >0.9 | >0.9 | >0.9 | >0.9 | <0.1 | - |
| Value | 0.971 | 0.910 | 0.975 | 0.722 | 0.819 | 0.026 | 0.043–0.057 |

Default model: χ2(190) = 8063.576, p = 1.000.
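Several indices in Table 8 follow directly from the reported chi-square values: the fitted model’s χ²(163) = 361.756, the default (null) model’s χ²(190) = 8063.576, and n = 495. The sketch below reproduces χ²/df, RMSEA, NFI, CFI, and TLI (NNFI) with the standard formulas; each result matches the tabulated value to three decimals:

```python
from math import sqrt

# Values reported in Table 8.
chi2_m, df_m = 361.756, 163    # fitted model
chi2_0, df_0 = 8063.576, 190   # default (null/independence) model
n = 495                        # valid sample size

chi2_df = chi2_m / df_m
rmsea = sqrt(max(chi2_m - df_m, 0.0) / (df_m * (n - 1)))
nfi = 1 - chi2_m / chi2_0
cfi = 1 - max(chi2_m - df_m, 0.0) / max(chi2_0 - df_0, chi2_m - df_m, 0.0)
tli = (chi2_0 / df_0 - chi2_m / df_m) / (chi2_0 / df_0 - 1)  # reported as NNFI

print(f"chi2/df={chi2_df:.3f}, RMSEA={rmsea:.3f}, NFI={nfi:.3f}, "
      f"CFI={cfi:.3f}, TLI={tli:.3f}")
# chi2/df=2.219, RMSEA=0.050, NFI=0.955, CFI=0.975, TLI=0.971
```

Indices such as GFI, AGFI, and SRMR additionally depend on the fitted and observed covariance matrices, so they cannot be recomputed from the chi-square values alone.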
Table 9. Covariance relations: MI indicators.

| Item | Relationship | Item | MI Value | Par Change |
|---|---|---|---|---|
| PE0U2 | ↔ | UB3 | 15.281 | -0.331 |
| PE0U1 | ↔ | UB2 | 12.338 | -0.196 |
| A2 | ↔ | PU3 | 11.390 | -0.191 |
| A1 | ↔ | BI3 | 13.450 | -0.228 |
| A1 | ↔ | BI1 | 12.647 | 0.234 |
| UB6 | ↔ | UB3 | 11.083 | 0.309 |
| UB6 | ↔ | PU2 | 10.629 | -0.228 |
| UB6 | ↔ | BI1 | 14.364 | 0.314 |
| UB6 | ↔ | A2 | 17.067 | 0.307 |
| UB5 | ↔ | PE0U2 | 15.095 | 0.276 |

Note: All MI values in the table are greater than 10.
Table 10. Impact relationships: MI indicators.

| Item | Relationship | Item | MI Value | Par Change |
|---|---|---|---|---|
| Usage behavior | → | Usage attitude | 10.266 | 10.937 |
| Usage attitude | → | Perceived usefulness | 12.495 | -4.514 |

Note: All MI values in the table are greater than 10.
Table 11. Summary table of model fit R2.

| Item | R-Squared Value | Item | R-Squared Value |
|---|---|---|---|
| Perceived usefulness | 0.945 | PU4 | 0.576 |
| Perceived ease of use | 0.991 | PU3 | 0.726 |
| Usage attitude | 0.949 | PU2 | 0.659 |
| Behavioral intentions | 0.903 | BI2 | 0.646 |
| BI3 | 0.615 | PU1 | 0.850 |
| UB3 | 0.496 | BI1 | 0.566 |
| UB2 | 0.767 | A3 | 0.636 |
| UB1 | 0.744 | A2 | 0.642 |
| PE0U4 | 0.603 | A1 | 0.697 |
| PE0U3 | 0.563 | UB6 | 0.543 |
| PE0U2 | 0.533 | UB5 | 0.522 |
| PE0U1 | 0.548 | UB4 | 0.604 |

Zhang, F.; Pang, J.; Guo, Y.; Zhu, Y.; Zhang, H. A Study of Factors Influencing the Use of the DingTalk Online Lecture Platform in the Context of the COVID-19 Pandemic. Sustainability 2023, 15, 7274. https://doi.org/10.3390/su15097274