
A Study on the Instructor Role in Dealing with Mixed Contents: How It Affects Learner Satisfaction and Retention in e-Learning

1 College of Hotel and Tourism Management, Kyung Hee University, Seoul 02447, Korea
2 Department of Hotel Management, Graduate School, Kyung Hee University, Seoul 02447, Korea
* Author to whom correspondence should be addressed.
Sustainability 2018, 10(3), 850; https://doi.org/10.3390/su10030850
Received: 28 December 2017 / Revised: 21 February 2018 / Accepted: 22 February 2018 / Published: 16 March 2018

Abstract

Information and communication technology has become an indispensable part of modern education. This paradigm shift in the educational environment prompts instructors to reconsider their traditional classroom roles and adjust their responsibilities to a transformed pedagogy and new learner expectations. This paper focuses on the instructor's role in on-line education and examines how instructor involvement affects learner satisfaction. Modifying the information system success model, the learning-environment qualities are rearranged into a two-tiered format (rigid and flexible contents) depending on the instructor's manageability. A partial least squares analysis was used to examine the structural relationships among the rigid and flexible contents qualities (i.e., technology-assisted learning-environment qualities), learner satisfaction, and learner retention. Instructor involvement was found to moderate the flexible contents qualities (test and activity): its effect was positive when involvement in tests was high and when involvement in activities was low. Consequently, this paper confirms the relationships among learning-environment qualities, learner satisfaction, and instructor involvement. Empirically, it substantiates the instructor's role in on-line education and the appropriate degree of instructor involvement in higher education; the results should also contribute in practical ways to e-learning design and content delivery system development.
Keywords: on-line education; rigid contents quality; flexible contents quality; learner satisfaction; learner retention; instructor involvement

1. Introduction

Information technology has made a significant impact on the educational paradigm, and has created a virtual education community. According to data from the Spanish Conference of University Rectors (CRUE, 2009), 71.7% of university teaching and research staff use the institutional virtual teaching platform, and 92.5% of students make use of this platform [1].
E-learning can be defined as technology-based learning in which learning materials are delivered electronically to remote learners via a computer network [2,3]. E-learning encompasses learning resources (contents), tools for implementation (systems), and execution processes (practices) that are implemented with temporal flexibility and cost effectiveness, free of the inflexibility of traditional learning [4]. The Internet has reconfigured the landscape of higher education [5,6], and online education has affected all aspects of higher education, from teaching and learning to research and administration. It provides temporal freedom that allows students to learn at their own pace [7] and lets facilitators track each learner's basic trajectory. It enriches course content with various interfaces and links, and makes a learner-centered approach highly plausible [8]. Despite these advantages, the early stages of e-learning raised various problems, such as low retention rates, inadequate interaction, unfamiliarity, and a lack of discipline in students [9,10,11]. The quality and technology of e-learning have constantly improved in design and effectiveness in recent decades [12,13], and e-learning has expanded to the point that some forecast it may replace classroom learning [2,3,14,15]. As the available learning environments have expanded, a substantial amount of research on mixing the traditional classroom with technology has been conducted [16,17]. “Blended learning” is a means of education that blends face-to-face and online delivery, where 30–79% of content is delivered online [18,19].
A considerable number of researchers have investigated the underlying reasons for the success of the mixed approach of traditional and online education [1,20,21,22], drawn from the information system success model (IS Success Model) developed and updated by DeLone and McLean [23,24]. Liaw [10], integrating the theory of planned behavior and the technology acceptance model, analyzed how learner characteristics and environmental factors affect user satisfaction, behavioral intention, and effectiveness toward e-learning using the three-tier use model.
The paradigm shifts in educational environments have prompted a reconsideration of the instructor's role in education policy, process, and implementation [25]. Instructors in the traditional class were the main stakeholders of education and were chiefly in charge of students' learning experience, but the changed educational environment demands different instructor roles, including scaffolding [26,27,28] or coaching [29]. Instructors in blended learning implement two different pedagogies at the same time, addressing both learners and the learning environment [30,31]. However, little research has examined the relationship between the instructor and the learner experience, and studies conducted in campus-based higher education, where enrollment is high, remain rare [32,33,34].
Grounded in the above backdrop, this study examines learner satisfaction and behavioral intention toward e-learning, by considering how the instructor moderates the environmental factors using the two-tiered formats of rigid contents vs. flexible contents qualities. Based on the IS Success Model, this study serves three purposes: (1) to investigate the impacts of success factors (rigid and flexible contents qualities) on learner satisfaction; (2) to identify the moderating effect of instructor involvement in these relationships; and (3) to assess the influence of learner satisfaction on learner retention.

Case Study

To serve the aforementioned purposes of this study, a brief introduction of the course is necessary. The course observed for this study is a preparatory course for the Hospitality and Tourism major. Its main contents are delivered by 6 different professors, each representing a major, and each class is pre-recorded and uploaded to the site. Using the pre-recorded contents, the instructor designs a 16-week blended-learning course: 12 weeks of e-learning on the pre-recorded contents and 4 weeks of face-to-face sessions, e.g., the opening and closing sessions and two in-class evaluations (mid-term and final paper tests), along with guest speaker lectures. The opening session is conducted by the instructor in the off-line class. As in a traditional class, the instructor introduces the learning objectives and overviews the general concepts relevant to the overall course; in addition, the instructor introduces the learning environment, such as how students access the contents in the learning management system (LMS), how attendance is checked in e-learning, and when and how quizzes are taken on-line.
The closing session is similar, but the instructor wraps up the course with the relevant contents and concepts while confirming students' personal achievements, such as attendance, personal assignments, and quizzes.
Besides these, two in-class evaluations are conducted in a face-to-face class environment. The evaluations include short-answer questions as well as essay questions, as in traditional classes. Each 50-min paper test is followed by a guest speaker session, conducted in the off-line class by professionals working in the field to provide hands-on industry experience.
The main body of the on-line lectures comprises 12 unit classes delivered by 6 professionals covering 6 sub-majors in Hospitality and Tourism. After each lecture, students take a quiz to recollect the pre-recorded contents and submit a short report (about 2000 words) via the Internet to reflect on what they have learned.
The on-line learning environment is generally solitary and limits the learner's social engagement and interaction. To integrate the students' reflection and meaningful engagement socially and communicatively [34], collaborative activities toward a whole-group achievement follow as the final project, creating an authentic learning environment. Groups are formed in advance according to student interest and the instructor's guidance. Once the members of a group are decided, they meet on discussion boards or a social network system and develop ideas for a collaborative work: a 5–10-min piece of user-created content (UCC). The idea development is followed by a site visit to the field with all group members together. The whole production process for the final project is recorded and edited as evidence of the collaborative activity. Finally, the collaborative activity is completed by uploading the finished UCC via the Internet.
To keep students aware, the instructor sends text messages throughout the course, e.g., notifying them of lecture times, quizzes, and assignments, and responds to students' questions via e-mail or bulletin boards. The students' personal reflective questions included in the weekly assignments are also answered via the Internet.
In this way, the instructor performs a blended form of instruction and learning, balancing rigid contents, such as the pre-recorded materials, against flexible contents, such as weekly assignments and quizzes, collaborative activities, personal interactions through on- and off-line classes, and face-to-face in-class activities. As seen from the course design and the implementation process, the pre-determined contents in the LMS are incorporated through the instructor's constant engagement into an authentic learning environment that reflects real-world issues.

2. Theoretical Background

2.1. Blended-Learning and Mixed Contents

As technology has expanded the learning environment, a variety of technology-assisted forms of learning, e.g., blended learning, have been introduced [7,35,36,37]. Blended learning is a hybrid of two different pedagogies [38]: traditional teaching and autonomous e-learning. A similar, yet distinct, characterization is the convergence of text-based asynchronous on-line learning with face-to-face approaches, in which the instructor handles two kinds of contents, flexible and inflexible [30]. Flexible contents, such as tests, assignments, and discussions, are easily adjusted by the instructor to the students' needs or situations, whereas the pre-recorded contents are a rigid, inflexible set of information, including media-streamed contents, that directs a learner down a specific path toward learning outcomes. Blended learning, as a more learner-centered educational approach, allows for communication, collaboration, and interaction [39,40,41], as well as the flexibility for the instructor to overcome the drawbacks of unilateral information transfer, lack of interaction, isolation, etc. [42]. Despite the improved e-learning environment, the LMS of the e-learning platform is still inflexible, mainly because the pre-recorded video contents are rigidly hardwired and their details are difficult to modify. The course observed for this study employs team teaching with six professionals from different majors, so altering the pre-recorded contents is beyond the instructor's manageability. Thus, while blended learning offers much flexibility in educational environments, the pre-recorded contents remain technically ‘rigid’.

2.2. IS Success Model

The IS Success Model [24], presented in Figure 1, conceptualizes the success factors of information systems with three independent quality variables (system quality, information quality, and service quality), which affect “use” and “user satisfaction” either individually or jointly; each in turn affects “individual or organizational” net benefits, which help determine the success of the information system.
The LMS in e-learning convenes instructors’ pedagogical tasks and learners’ acquisition tasks. Applying the IS Success Model (Figure 1) to e-learning, three independent variables in the quality dimension are modified; system quality, being the technology relevant characteristics, is commonly applied and measured by accessibility, portability (downloadable course material), and stability; information quality in e-learning is modified to employ the quality of the learning contents, and measured by video quality, lecture time, and lecture notes.
Web-based learning is on-demand learning, so the information delivered is more packed and concise than in the traditional classroom. Most experts in web design recommend lectures shorter than 20 min, in consideration of learners' concentration spans [43,44]. Allowing for differences in course structure, and because students' motivation to complete a course in their major is high, each lecture was prearranged as three sets of approximately 25-min lectures, comparatively longer than in Massive Open Online Courses or other web-based continuing education for adults. Therefore, lecture time was included in evaluating information quality.
Due to the inflexibility of the system and the asynchrony of the pre-set contents, the system quality and the pre-set contents quality are beyond the instructor's manageability. Instructors cannot deviate from the pre-set contents or make further alterations for the learners' convenience; these thus constitute the so-called ‘rigid’ contents. From the instructor's perspective, blended learning can be described as adjusting the balance between flexibility and inflexibility: the inflexible parts of blended learning are the Internet system and the pre-set contents, whereas tests and student activities are within the instructor's control and are therefore easily manageable. This difference in manageability on the platform is observed as a difference in the degree of the instructor's involvement between the flexible and the inflexible contents.
Service quality, converted to e-learning, becomes the interface between the instructor and the learner; it is the instructor's scope for facilitating teaching objectives and the learner's execution in achieving learning objectives. All learning-relevant elements other than the pre-recorded contents, e.g., tests, assignments, and student activities, count as service quality. The constructs of service quality fall on a spectrum depending on their property, structure, and relevance to learner achievement (e.g., more or less structured) and serve to measure the learner evaluation. Tests are a more structured, stronger, and more objective construct than student activities, which are less structured, collaborative, and interactive. Thus, service quality in e-learning is replaced with tests and student activities, both of which are flexibly managed by the instructor.
User satisfaction and net benefits are replaced by learner satisfaction in e-learning, captured in outcomes such as the learner's recommendation or the learner's future intention to return to e-learning. Adopting the IS Success Model (Figure 1), we develop the modified IS Success Model for e-learning shown in Figure 2.
The intrinsic properties of the constructs measuring the qualities of e-learning (Figure 2) are hybrid with respect to the instructor's manageability and contents flexibility. These hybrid constructs are represented as a two-tier format with four constructs in the learning environment. To observe learners' behavior toward the hybrid contents, the online platform of the university learning archive system, where students can register for and evaluate their lectures, was chosen, and the qualities of the website were measured.

3. Involvement

Involvement refers to the degree of personal interest or perceived importance, characterized as either high or low, through which individuals maximize benefits and minimize risks. Wright [45] applies the concept to teacher involvement in the educational context, but little research has addressed instructor involvement at the level of actual implementation, namely, educating the students. To substantiate the degree of instructor involvement across the other constructs, its moderating effect is considered at the implementation level. The research design in Figure 3 represents the level of instructor involvement.

3.1. Research Model and Hypotheses

This study investigates the following questions in the course “Introduction to Hospitality & Tourism (H&T)” implemented in the university learning archive system: how the four constructs comprising the quality of the e-learning environment (system, content, test, and activity) affect learner satisfaction; how instructor involvement and learner satisfaction are related; and how learner satisfaction affects further commitment, such as recommendation to others, intention to take other similar types of courses, or intention to take the instructor's other classes. Based on the aforementioned theoretical background, we propose the following research model (Figure 4).

3.2. Learning Environments and Learner Satisfaction

3.2.1. System Quality and Learner Satisfaction

E-learning uses the LMS platform as the primary environment for transferring knowledge. As a virtual classroom, it not only delivers lecture contents but also checks attendance, evaluates students, and even promotes student interaction to achieve the learning objectives. This requires several system quality attributes [46]. Of these, Internet compatibility (with the Internet Explorer or Google Chrome browsers) is required for the learner to access the platform, which is maintained by the Center for Teaching and Learning at the University. Execution is performed by downloading the pre-recorded video files, and stability in viewing the downloaded resources is governed by the system quality of the LMS platform. Accessibility, downloading time, and stability are therefore considered in this survey as the basic system quality attributes. These system qualities are applied to the IS Success Model, and the relation between system qualities and learner satisfaction is examined. This paper assumes that satisfying the system attributes has a positive effect on learner satisfaction. H1 is thus formulated as follows:
Hypothesis 1.
System quality has a positive impact on learner satisfaction.

3.2.2. Content Quality and Learner Satisfaction

The asynchronous information from the LMS complies with the learning objectives. To achieve learner satisfaction, the knowledge transferred should be coherently relevant to the learning objectives and facilitate the learners' comprehension clearly; the lecture notes provided and the amount of lecture per class should be appropriate. Therefore, we formulate H2, that content quality can affect learner satisfaction.
Hypothesis 2.
Content quality has a positive impact on learner satisfaction.

3.2.3. Test Quality and Learner Satisfaction

Learner achievement, even in a virtual classroom, is the ultimate learning objective, and it is assessed by online or in-class tests. Tests are the evaluation process of the e-learning model. Testing is an important phase because it checks the learners' understanding of the learning objectives, encourages improvement, and grades learner achievement. Unlike the rigid contents (system and content), testing provides scaffolding through which the instructor guides and activates the learners' self-directed learning. Therefore, test quality in e-learning is assumed to affect learner satisfaction, and H3 is formulated:
Hypothesis 3.
Test quality has a positive impact on learner satisfaction.

3.2.4. Collaborative Activity Quality and Learner Satisfaction

Collaborative activity is a form of engagement and socialization that induces participation, interaction, and communication, either in the virtual community or face-to-face, to compensate for the reduced interaction in e-learning. The instructor in this study organizes the class into groups and assigns specific tasks or projects. Each group works autonomously on its assignment, members support each other, and the results are shared among the group members and evaluated as a group activity. These activities are essential to engender commitment and ensure learners' higher-order thinking and sustainable progress [47,48]. Although comparatively less structured, they are nonetheless an essential part of learning, and the instructor's guidance toward the learners' substantial commitment should significantly shape a meaningful educational experience. Therefore, H4 is formulated:
Hypothesis 4.
Activity quality has a positive impact on learner satisfaction.
The collaborative activities, together with tests, are comparatively more negotiable than the system and content qualities, particularly from the instructor's perspective; we call them flexible contents.

3.3. Learner Satisfaction

Satisfaction is the cognitive accordance between website qualities and user expectations [49]. Research by Roca, Chiu, and Martínez [50] on e-learning classes shows that service quality, system quality, and especially information quality are positively related to class satisfaction; Lu and Chiou's [22] study on e-learning reveals that individual differences in learning styles can affect satisfaction. With reference to these models, the qualities of the learning environment should satisfy learners; this study hypothesizes that system quality, content quality, test quality, and activity quality can affect learner satisfaction.

3.4. Learner Retention

A satisfied customer moves a step toward customer retention [51], and satisfaction is a strong predictor of retention. Applied to e-learning, the concept of retention means that when learners are satisfied with the learning environment (system, contents, tests, and collaborative activity qualities), they may recommend it to others or retake similar e-learning courses, similar to confirmation or use intention. Thus, we hypothesize a positive association between learner satisfaction and learner retention.
Hypothesis 5.
Learner satisfaction has a positive impact on learner retention.

3.5. Instructor Involvement

Traditionally, the instructor's role was that of a classroom manager responsible for learners' academic performance. Less attention has been paid to instructors' roles in creating learning experiences, or to how they develop new teaching styles, than to learners' educational practice or the learning subject. Low retention has always been a major problem, notably in purely online forms of e-learning, but the adoption of complementary learning [1,52] has improved retention rates and enhanced achievement in evaluations. Similar research shows that an instructor's active academic involvement tends to increase learner interest and the willingness to achieve and self-improve [53]. Therefore, instructor involvement in academic monitoring reduces student drop-out rates and, eventually, is positively related to learners' academic achievement. With this in mind, we assume that the instructor, as a facilitator of the rigid contents qualities (the system and the content) and at the same time a manager of the flexible contents qualities (test and activity), is involved in the relationship between the learning environment and learner satisfaction. Thus, we propose the following hypotheses:
Hypothesis 6.
Instructor involvement moderates the relationship between system quality and learner satisfaction.
Hypothesis 7.
Instructor involvement moderates the relationship between content quality and learner satisfaction.
Hypothesis 8.
Instructor involvement moderates the relationship between test quality and learner satisfaction.
Hypothesis 9.
Instructor involvement moderates the relationship between activity quality and learner satisfaction.

4. Methods

4.1. Data Collection

LMS-based e-learning was carried out in the course “Introduction to H&T”, offered by K-University in Seoul as a first-year undergraduate course. The model adopted in this study combines online lectures with group activities and comprehension tests. The survey questionnaire was first developed in Korean and then translated into English by a researcher proficient in both languages with academic specialization in e-learning. The survey was conducted on the LMS platform (20–30 June 2015), with a response rate of 62% (208 out of 333). Of the 208 questionnaires collected, 4 were eliminated because of missing data. Of the remaining respondents, 82 (40.2%) are male and 122 (59.8%) are female; most are between 20 and 22 years old (154, 75.5%) and major in hospitality and tourism (179, 87.7%), as shown in Table 1.

4.2. Measures

Research instruments measuring the study variables were developed primarily on the basis of previous literature [22,24,48,50,54,55,56,57], with minor modifications to suit the research context. All constructs were measured with multiple items on a five-point scale ranging from (1) “strongly disagree” to (5) “strongly agree”. Instructor involvement, system quality, content quality, test quality, activity quality, learner satisfaction, and learner retention were measured with nine, five, five, four, four, nine, and three items, respectively.

5. Analysis and Results

Since partial least squares (PLS) analysis has advantages such as tolerance of small sample sizes and few assumptions about measurement scales and normal distribution [58], we used PLS-Graph 3.0 and SPSS 21.0 to analyze the data. First, we checked the reliability and validity of the constructs in our research model to validate the measurement model; then we tested the hypothesized relationships among the rigid and flexible contents qualities, learner satisfaction, and learner retention [59]. Finally, we tested the moderating effect of instructor involvement between the contents qualities and learner satisfaction by conducting a hierarchical moderated regression analysis.

5.1. Measurement Model

We calculated the composite reliability (CR), Cronbach's alpha, and average variance extracted (AVE) of each construct. To establish reliability and convergent validity, CR must exceed 0.7 and AVE must exceed 0.5 [59], and Cronbach's alpha must exceed 0.7 [60]; all measurement items should also be significant. We eliminated three measurement items (two for system and one for activity) that hindered reliability and convergent validity. As shown in Table 2, all of the constructs satisfied these requirements; thus, the results established the reliability and convergent validity of our constructs.
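The CR and AVE thresholds above can be computed directly from standardized factor loadings. The sketch below illustrates the standard formulas; the loadings are hypothetical values for one illustrative construct, not the study's actual estimates from Table 2.

```python
# CR and AVE from standardized factor loadings (hypothetical example).

def composite_reliability(loadings):
    # CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances),
    # where each item's error variance is 1 - loading^2.
    sum_l = sum(loadings)
    errors = sum(1 - l**2 for l in loadings)
    return sum_l**2 / (sum_l**2 + errors)

def average_variance_extracted(loadings):
    # AVE = mean of the squared loadings.
    return sum(l**2 for l in loadings) / len(loadings)

loadings = [0.82, 0.79, 0.85, 0.77]  # hypothetical items for one construct
cr = composite_reliability(loadings)
ave = average_variance_extracted(loadings)
print(f"CR = {cr:.3f}, AVE = {ave:.3f}")  # both clear the 0.7 / 0.5 thresholds
```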
We compared the square root of the AVE of each construct with the correlations between that construct and the others to check discriminant validity [59]. As shown in Table 3, all of the square roots of the AVE exceeded the corresponding inter-construct correlations, except for that of content (between content quality and learner satisfaction). We re-checked the discriminant validity between content quality and learner satisfaction using the χ2 difference test [61]. The χ2 difference (41.722) between content quality and learner satisfaction was significant (p < 0.001). Therefore, these results provide evidence of discriminant validity.
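The Fornell-Larcker comparison used above can be sketched as follows: for each construct, the square root of its AVE should exceed its correlation with every other construct. The AVE values and correlation matrix below are hypothetical placeholders, not the study's Table 3.

```python
import numpy as np

# Fornell-Larcker discriminant validity check on hypothetical values.
constructs = ["system", "content", "test", "activity"]
ave = np.array([0.65, 0.58, 0.70, 0.62])   # hypothetical AVEs
corr = np.array([                           # hypothetical inter-construct correlations
    [1.00, 0.45, 0.50, 0.40],
    [0.45, 1.00, 0.55, 0.48],
    [0.50, 0.55, 1.00, 0.52],
    [0.40, 0.48, 0.52, 1.00],
])

sqrt_ave = np.sqrt(ave)
results = {}
for i, name in enumerate(constructs):
    others = np.delete(corr[i], i)          # correlations with the other constructs
    results[name] = sqrt_ave[i] > others.max()
    print(f"{name}: sqrt(AVE)={sqrt_ave[i]:.3f}, max r={others.max():.2f}, pass={results[name]}")
```

When a pair fails this check, the paper falls back on a χ2 difference test between the constrained and unconstrained two-construct models, as done for content quality and learner satisfaction.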

5.2. PLS Analysis and Moderating Effect of Involvement

We used the structural equation model to test the relationships among the rigid and flexible contents qualities, learner satisfaction, and learner retention. The bootstrap sample size used in the PLS analysis was 500. Figure 5 shows the test results for the direct-effect hypotheses (H1–H5); all were supported. Specifically, H1 and H2, postulating the impacts of the rigid contents qualities, system quality (β = 0.120, t = 2.732, p < 0.01) and content quality (β = 0.297, t = 4.052, p < 0.001), were supported. H3 and H4, postulating the impacts of the flexible contents qualities, test quality (β = 0.331, t = 4.388, p < 0.001) and activity quality (β = 0.287, t = 6.155, p < 0.001), were also supported. The results indicate that the impact of the flexible contents qualities on learner satisfaction is slightly greater than that of the rigid contents qualities. Further, learner satisfaction was found to be a strong predictor of learner retention (β = 0.837, t = 28.980, p < 0.001). Thus, H5 was also empirically supported.
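The bootstrap t-values reported above follow the usual PLS logic: resample respondents with replacement, re-estimate each path coefficient, and divide the original estimate by the bootstrap standard error. A minimal numpy-only sketch on synthetic data with a single path is shown below; the true coefficient 0.8 and noise level are invented for illustration, and the study itself used PLS-Graph with 500 resamples.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 204                                         # usable responses in the study
x = rng.normal(size=n)                          # stand-in predictor
y = 0.8 * x + rng.normal(scale=0.6, size=n)     # synthetic outcome with known slope

def slope(x, y):
    # simple OLS slope (covariance over variance)
    return np.cov(x, y, bias=True)[0, 1] / np.var(x)

boot = []
for _ in range(500):                            # 500 bootstrap resamples, as in the paper
    idx = rng.integers(0, n, size=n)            # resample respondents with replacement
    boot.append(slope(x[idx], y[idx]))

b = slope(x, y)
t = b / np.std(boot, ddof=1)                    # bootstrap t-value for the path
print(f"beta = {b:.3f}, t = {t:.2f}")
```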
Finally, we tested the moderating effect of instructor involvement on the relationship between the contents qualities and learner satisfaction. As shown in Table 4, instructor involvement moderates the effects of test quality (test quality × instructor involvement: β = 0.168, t = 3.184, p < 0.01) and activity quality (activity quality × instructor involvement: β = −0.126, t = −2.829, p < 0.01) on learner satisfaction. Specifically, the more deeply the instructor was involved in the test, the more positively it affected learner satisfaction, whereas instructor involvement in the activity was found to negatively influence learner satisfaction. In other words, when instructors were involved in the class test, learner satisfaction increased, but this was not the case in collective activities. These results supported H8 (the moderating effect of instructor involvement on the relationship between test quality and learner satisfaction) and H9 (the moderating effect on the relationship between activity quality and learner satisfaction).
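The hierarchical moderated regression used here enters the main effects in step 1 and adds the product term in step 2; a significant interaction coefficient indicates moderation. The sketch below reproduces this logic with numpy-only OLS on synthetic data (the built-in interaction effect of 0.15 and all variable values are invented, not the study's data, which was analyzed in SPSS).

```python
import numpy as np

def ols(X, y):
    # ordinary least squares via numpy; returns coefficients and R^2
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    r2 = 1 - resid.var() / y.var()
    return beta, r2

rng = np.random.default_rng(1)
n = 204
quality = rng.normal(size=n)                    # mean-centered quality score
involve = rng.normal(size=n)                    # mean-centered instructor involvement
satisfaction = (0.35 * quality + 0.20 * involve
                + 0.15 * quality * involve      # built-in moderating effect
                + rng.normal(scale=0.8, size=n))

ones = np.ones(n)
X1 = np.column_stack([ones, quality, involve])                      # step 1: main effects
X2 = np.column_stack([ones, quality, involve, quality * involve])   # step 2: add interaction

b1, r2_1 = ols(X1, satisfaction)
b2, r2_2 = ols(X2, satisfaction)
print(f"step 1 R2 = {r2_1:.3f}; step 2 R2 = {r2_2:.3f}; interaction beta = {b2[3]:.3f}")
```

The R2 increase from step 1 to step 2, together with the significance of the interaction coefficient, is what Table 4 reports for the test x involvement and activity x involvement terms.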

6. Discussion and Implications

This study shows the positive association of learning-environment qualities with learner satisfaction and learner retention. It further notes the association between instructor involvement in the e-learning environment and learner satisfaction; both rigid and flexible contents qualities impact learner satisfaction and are consequently associated with learner retention. The degree of instructor involvement differed slightly with the quality of contents, i.e., between the flexible contents qualities (test and activity) and the rigid contents qualities (system and content), and the impact of the flexible contents qualities was slightly greater than that of the rigid contents (system quality: β = 0.115, content quality: β = 0.284 vs. test quality: β = 0.322, activity quality: β = 0.319). Moreover, the behavior of the moderating variable on the flexible contents qualities also depended on the construct's nature, i.e., whether it is structured (tests) or less structured (activities). More specifically, the more deeply instructors were involved in the learning process, the more strongly tests influenced learner satisfaction (β = 0.168), whereas lesser involvement in activities positively influenced learner satisfaction (β = −0.126).
Interestingly, this result empirically demonstrates the changed learning environment and the basic formulation of blended learning. Since education adopted technology, the learning environment has expanded from traditional classrooms to websites, and it will expand further as new and innovative technologies, such as smartphones and mobile applications, are incorporated [32]. In this changing educational environment, the status of the instructor, traditionally the major agent of teaching, is worth reconsidering. In this respect, the instructor-involvement effect observed for the flexible contents qualities deserves attention, in contrast to its limited influence on the rigid contents qualities.
The dual behavior of instructor involvement reflects the dual properties of the blended learning environment, captured by the two-tiered format of the modified IS Success Model. In implementing e-learning contents, instructors inevitably face the issue of manageability, for both flexible and rigid contents. Without the assistance of technology, e-learning objectives cannot be attained; in this regard, the instructor's active involvement in system design, or cooperation with system developers, is recommended. For the rigid contents qualities (system and pre-set contents), the instructor is a relevant third party who is actively involved in, but governed by, the system; for the flexible contents qualities (tests and activities), the instructor is fully in charge, which appears as the moderating effect of instructor involvement. The instructor's status is thus modulated to that of a facilitator in e-learning, interfacing with the environment in these two ways.
As a practical implication, this study points to a paradigm shift in the instructor–technology relationship in technology-assisted blended learning. Teaching is no longer the sole responsibility of the instructor; rather, technology, as a medium of teaching, should be treated as a relevant partner in e-learning. In this changed educational environment, the optimum teaching outcome is obtained only through the cooperative engagement and proper collaboration of the two agents of teaching, technology and the instructor. The system developer's invisible role in dealing with hybrid contents should therefore be taken seriously when improving learning-environment qualities, especially in designing rigid contents, which subsequently affect learner satisfaction.
The effect of instructor involvement in flexible contents on learner satisfaction carries another practical implication, this time for the instructor–learner relationship, which should be reconsidered in light of learner motivation and autonomy, especially in higher education. The dual moderating effects found in this study, high involvement in tests and low involvement in activities, reflect the features of a meaningful learning experience in higher education: higher motivation toward the personal assessment of tests and more autonomy in collective activities [43,62]. This result suggests that learner activities should be treated as a distinct learning phase and kept autonomous. The finding is also significant for the paradigm shift in the instructor–learner relationship, implying proper and sensitive instructor involvement, and it has practical value for online learning system developers and instructors in designing and operating e-learning programs.
Moreover, this study makes a meaningful theoretical contribution, as it is the first study of this kind undertaken in a large-scale classroom at the university level. As one of the few such studies in information systems, it classified the learning environment into two types, rigid and flexible contents qualities, proposed the modified IS Success Model, and tested the two contents qualities through the moderating effect of instructor involvement. The two-tiered classification of e-learning environment qualities was thereby confirmed as a representation of the changed educational environment of blended learning. Further, most research on e-learning has focused on the agent of learning, that is, students' learning effects or the variables that affect their learning or satisfaction; this study instead focuses on the instructor, the agent managing the learning environment, and finds a moderating role in e-learning that merits further study as the educational paradigm shifts [63]. A learner's positive perception of the e-learning experience reinforces a positive attitude towards learning, and strong retention intention will eventually contribute to self-directed learning. The results therefore suggest that a successful e-learning experience matters not only for the pedagogies of higher education, but also for student commitment to university learning.
However, the present study also has some limitations. First, most participants majored in a single subject at K-University, which limits the generalizability of the results to all e-learners. Second, whereas the discriminant validity between content quality and learner satisfaction was established using the χ² difference test [61], it was not supported by the AVE test [59], which is more rigorous than the χ² difference test [61]. Future research should therefore analyze a wider range of respondents and re-examine the discriminant validity of the constructs. Finally, the instructor observed in this course had taught it more than six times since 2012 before the survey was conducted in 2015, yet the instructor's e-learning experience was not included in this study, which adds to its limitations. Future studies should measure how instructors' e-learning experience affects student satisfaction, because prior experience with e-learning can significantly influence the results.
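The AVE-based (Fornell–Larcker [59]) check mentioned in the limitations can be reproduced directly from Table 3: discriminant validity holds for a pair of constructs when their correlation is lower than each construct's square-root AVE. A minimal sketch using the reported values (the short variable names are ours, not the authors'):

```python
# Fornell-Larcker discriminant-validity check using the square-root AVEs
# (diagonal) and inter-construct correlations reported in Table 3.
import numpy as np

names = ["involvement", "system", "content", "test",
         "activity", "satisfaction", "retention"]
sqrt_ave = np.array([0.778, 0.897, 0.807, 0.843, 0.881, 0.839, 0.892])

# Lower-triangular correlations from Table 3, indexed (row, column).
lower = {
    (1, 0): 0.471,
    (2, 0): 0.728, (2, 1): 0.670,
    (3, 0): 0.755, (3, 1): 0.591, (3, 2): 0.799,
    (4, 0): 0.436, (4, 1): 0.564, (4, 2): 0.673, (4, 3): 0.612,
    (5, 0): 0.696, (5, 1): 0.675, (5, 2): 0.833, (5, 3): 0.812, (5, 4): 0.772,
    (6, 0): 0.664, (6, 1): 0.550, (6, 2): 0.712, (6, 3): 0.744,
    (6, 4): 0.599, (6, 5): 0.831,
}
r = np.zeros((7, 7))
for (i, j), v in lower.items():
    r[i, j] = r[j, i] = v

# A pair fails the criterion if its correlation exceeds either sqrt(AVE).
violations = [(names[i], names[j])
              for i in range(7) for j in range(i)
              if r[i, j] > min(sqrt_ave[i], sqrt_ave[j])]
print(violations)  # → [('satisfaction', 'content')]
```

Running this flags only the content quality–learner satisfaction pair (correlation 0.833 vs. √AVE 0.807), which is the pair identified in the limitations above.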

7. Conclusions

The present study combines asynchronous e-learning with the instructor role. It explores the learning experience by extending the learning environment to the LMS platform and by incorporating the notion of involvement into the traditional instructor role in the IS Success Model. Despite differences in the knowledge-delivery mechanism, blended learning is versatile enough to combine authentic learning pedagogy with modern technologies, creating a hybrid environment of the traditional classroom and e-learning.
As a study of the transformation of traditional classroom learning, this study reconsiders broader educational issues. The changed learning environment prompts a reconsideration of the paradigm shifts in the instructor–learner relationship on the one hand and the instructor–technology relationship on the other, and implies a consequent pedagogical change in educational goals, structural models and design, and learning processes to enhance educational outcomes (learner satisfaction, retention, evaluation, etc.) for a more meaningful learning experience.

Acknowledgments

We would like to express our sincere gratitude to Chung, N.H.

Author Contributions

Seung Jae Lee conceived, designed, and collected the data; Hyunae Lee and Taegoo Terry Kim analyzed the data. Seung Jae Lee carried the overall responsibility of completing the paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. López-Pérez, M.V.; Pérez-López, M.C.; Rodríguez-Ariza, L. Blended learning in higher education: Students’ perceptions and their relation to outcomes. Comput. Educ. 2011, 56, 818–826.
  2. Zhang, D.; Zhao, J.L.; Zhou, L.; Nunamaker, J.F. Can e-learning replace classroom learning? Commun. ACM 2004, 47, 75–79.
  3. Clark, R.C.; Mayer, R.E. E-Learning and the Science of Instruction: Proven Guidelines for Consumers and Designers of Multimedia Learning; John Wiley & Sons: Hoboken, NJ, USA, 2016.
  4. Chen, D.T. Uncovering the provisos behind flexible learning. J. Educ. Technol. Soc. 2003, 6, 25–30.
  5. Herrington, J.; Oliver, R. An instructional design framework for authentic learning environments. Educ. Technol. Res. Dev. 2000, 48, 23–48.
  6. Levine, A.; Sun, J.C. Barriers to Distance Education; American Council on Education: Washington, DC, USA, 2002.
  7. Graham, C.R. Blended learning system: Definition, current trends, and future directions. In Handbook of Blended Learning: Global Perspectives; Bonk, C.J., Graham, C.R., Eds.; Pfeiffer Publishing: Zurich, Switzerland, 2004.
  8. Masiello, I.; Ramberg, R.; Lonka, K. Attitudes to the application of a web-based learning system in a microbiology course. Comput. Educ. 2005, 45, 171–185.
  9. Lim, D.H.; Morris, M.L. Learner and instructional factors influencing learning outcomes within a blended learning environment. J. Educ. Technol. Soc. 2009, 12, 282–293.
  10. Liaw, S.S. Investigating students’ perceived satisfaction, behavioral intention, and effectiveness of e-learning: A case study of the Blackboard system. Comput. Educ. 2008, 51, 864–873.
  11. Wagner, N.; Hassanein, K.; Head, M. Who is responsible for e-learning success in higher education? A stakeholders’ analysis. J. Educ. Technol. Soc. 2008, 11, 26–36.
  12. Johnson, R.D.; Gueutal, H.; Falbe, C.M. Technology, trainees, metacognitive activity and e-learning effectiveness. J. Manag. Psychol. 2009, 24, 545–566.
  13. Lee, M.C. Explaining and predicting users’ continuance intention toward e-learning: An extension of the expectation–confirmation model. Comput. Educ. 2010, 54, 506–516.
  14. Welsh, E.T.; Wanberg, C.R.; Brown, K.G.; Simmering, M.J. E-learning: Emerging uses, empirical results and future directions. Int. J. Train. Dev. 2003, 7, 245–258.
  15. Tallent-Runnels, M.K.; Thomas, J.A.; Lan, W.Y.; Cooper, S.; Ahern, T.C.; Shaw, S.M.; Liu, M. Teaching courses online: A review of the research. Rev. Educ. Res. 2006, 76, 93–135.
  16. Dowling, C.; Godfrey, J.M.; Gyles, N. Do hybrid flexible delivery teaching methods improve accounting students’ learning outcomes? Acc. Educ. 2003, 12, 373–391.
  17. Osguthorpe, R.T.; Graham, C.R. Blended learning environments: Definitions and directions. Q. Rev. Distance Educ. 2003, 4, 227–233.
  18. Allen, I.E.; Seaman, J.; Garrett, R. Blending in: The Extent and Promise of Blended Education in the United States; Sloan Consortium: Needham, MA, USA, 2007.
  19. Alonso, F.; López, G.; Manrique, D.; Viñes, J.M. An instructional model for web-based e-learning education with a blended learning process approach. Br. J. Educ. Technol. 2005, 36, 217–235.
  20. Bhatiasevi, V. Acceptance of e-learning for users in higher education: An extension of the technology acceptance model. Soc. Sci. 2011, 6, 513–520.
  21. Cheng, B.; Wang, M.; Yang, S.J.; Peng, J. Acceptance of competency-based workplace e-learning systems: Effects of individual and peer learning support. Comput. Educ. 2011, 57, 1317–1333.
  22. Lu, H.P.; Chiou, M.J. The impact of individual differences on e-learning system satisfaction: A contingency approach. Br. J. Educ. Technol. 2010, 41, 307–323.
  23. DeLone, W.H.; McLean, E.R. Information systems success: The quest for the dependent variable. Inf. Syst. Res. 1992, 3, 60–95.
  24. DeLone, W.H.; McLean, E.R. The DeLone and McLean model of information systems success: A ten-year update. J. Manag. Inf. Syst. 2003, 19, 9–30.
  25. Garrison, D.R.; Kanuka, H. Blended learning: Uncovering its transformative potential in higher education. Internet High. Educ. 2004, 7, 95–105.
  26. Salomon, G.; Perkins, D.N.; Globerson, T. Partners in cognition: Extending human intelligence with intelligent technologies. Educ. Res. 1991, 20, 2–9.
  27. Perkins, D.N. What constructivism demands of the learner. Educ. Technol. 1991, 31, 18–23.
  28. Perkins, D.N. Person-plus: A distributed view of thinking and learning. In Distributed Cognitions: Psychological and Educational Considerations; Salomon, G., Ed.; Cambridge University Press: Cambridge, UK, 1993; pp. 88–110.
  29. Harley, S. Situated learning and classroom instruction. Educ. Technol. 1993, 33, 46–51.
  30. Harding, A.; Engelbrecht, J.; Lazenby, K.; le Roux, I. Blended learning in mathematics: Flexibility and taxonomy. In Handbook of Blended Learning Environments: Global Perspectives, Local Designs; Bonk, C., Graham, C., Eds.; Pfeiffer Publishing: San Francisco, CA, USA, 2005; pp. 400–415.
  31. Owston, R.; Lupshenyuk, D.; Wideman, H. Lecture capture in large undergraduate classes: What is the impact on the teaching and learning environment? Internet High. Educ. 2011, 14, 262–268.
  32. Wang, M.; Shen, R.; Novak, D.; Pan, X. The impact of mobile learning on students’ learning behaviors and performance: Report from a large blended classroom. Br. J. Educ. Technol. 2009, 40, 673–695.
  33. Bliuc, A.M.; Goodyear, P.; Ellis, R.A. Research focus and methodological choices in studies into students’ experiences of blended learning in higher education. Internet High. Educ. 2007, 10, 231–244.
  34. Herrington, A.; Herrington, J. What is an authentic learning environment? In Authentic Learning Environments in Higher Education; Herrington, A., Herrington, J., Eds.; Information Science Publishing: Hershey, PA, USA, 2006; pp. 1–13.
  35. Young, J. ‘Hybrid’ teaching seeks to end the divide between traditional and online instruction. Chron. High. Educ. 2002, 48, A33–A34.
  36. Wu, D.; Hiltz, R. Predicting learning from asynchronous online discussions. J. Asynchronous Learn. Netw. 2004, 8, 139–152.
  37. Bhatti, A.; Tubaisahat, A.; El-Qawasmeh, E. Using technology-mediated learning environment to overcome social and cultural limitations in higher education. Issues Inf. Sci. Inf. Technol. 2005, 2, 67–76.
  38. Buzzetto-More, N.; Sweat-Guy, R. Incorporating the hybrid learning model into minority education at a historically black university. J. Inf. Technol. Educ. 2006, 5, 153–164.
  39. Aspden, L.; Helm, P. Making the connection in a blended learning environment. Educ. Media Int. 2004, 41, 245–252.
  40. Khine, M.S.; Lourdusamy, A. Blended learning approach in teacher education: Combining face-to-face instruction, multimedia viewing and online discussion. Br. J. Educ. Technol. 2003, 34, 671–675.
  41. Lambropoulos, N.; Faulkner, X.; Culwin, F. Supporting social awareness in collaborative e-learning. Br. J. Educ. Technol. 2012, 43, 295–306.
  42. Ludwig-Hardman, S.; Dunlap, J.C. Learner Support Services for Online Students: Scaffolding for Success. International Review of Research in Open and Distance Learning. Available online: http://www.irrodl.org/content/v4.1/dunlap.html (accessed on 25 November 2017).
  43. Barron, B.; Darling-Hammond, L. Teaching for Meaningful Learning: A Review of Research on Inquiry-Based and Cooperative Learning; Stanford University: Stanford, CA, USA, 2008.
  44. Carr-Chellman, A.; Duchastel, P. The ideal online course. Br. J. Educ. Technol. 2000, 31, 229–241.
  45. Wright, P. Astrology and science in seventeenth-century England. Soc. Stud. Sci. 1975, 5, 399–422.
  46. Wiegers, K.E. Software Requirements: Practical Techniques for Gathering and Managing Requirements throughout the Product Development Cycle; Microsoft Press: Redmond, WA, USA, 2003.
  47. Stacey, E. Collaborative learning in an online environment. Int. J. E-Learn. Distance Educ. 2007, 14, 14–33.
  48. Pawan, F.; Paulus, T.M.; Yalcin, S.; Chang, C. Online learning: Patterns of engagement and interaction among in-service teachers. Lang. Learn. Technol. 2003, 7, 119–140.
  49. Bhattacherjee, A. Understanding information systems continuance: An expectation-confirmation model. MIS Q. 2001, 25, 351–370.
  50. Roca, J.C.; Chiu, C.M.; Martínez, F.J. Understanding e-learning continuance intention: An extension of the Technology Acceptance Model. Int. J. Hum. Comput. Stud. 2006, 64, 683–696.
  51. Stauss, B.; Chojnacki, K.; Decker, A.; Hoffmann, F. Retention effects of a customer club. Int. J. Serv. Ind. Manag. 2001, 12, 7–19.
  52. Berge, Z.L.; Huang, Y.P. A model for sustainable student retention: A holistic perspective on the student dropout problem with special attention to e-learning. DEOSNEWS 2004, 13, 1–26.
  53. Midgley, C.; Feldlaufer, H.; Eccles, J.S. Change in teacher efficacy and student self- and task-related beliefs in mathematics during the transition to junior high school. J. Educ. Psychol. 1989, 81, 247–258.
  54. Lee, K.C.; Chung, N. Understanding factors affecting trust in and satisfaction with mobile banking in Korea: A modified DeLone and McLean’s model perspective. Interact. Comput. 2009, 21, 385–392.
  55. Chung, N.; Lee, H.; Lee, S.J.; Koo, C. The influence of tourism website on tourists’ behavior to determine destination selection: A case study of creative economy in Korea. Technol. Forecast. Soc. Chang. 2015, 96, 130–143.
  56. Bru, E.; Stephens, P.; Torsheim, T. Students’ perceptions of teachers’ class management styles and their report of misbehavior. J. Sch. Psychol. 2002, 40, 287–307.
  57. Stornes, T.; Bru, E.; Idsoe, T. Classroom social structure and motivational climates: On the influence of teachers’ involvement, teachers’ autonomy support and regulation in relation to motivational climates in school classrooms. Scand. J. Educ. Res. 2008, 52, 315–329.
  58. Ahuja, M.K.; Thatcher, J.B. Moving beyond intentions and toward the theory of trying: Effects of work environments and gender on post-adoption information technology use. MIS Q. 2005, 29, 427–459.
  59. Fornell, C.; Larcker, D.F. Evaluating structural equation models with unobservable variables and measurement error. J. Mark. Res. 1981, 18, 39–50.
  60. Nunnally, J.C. Psychometric Theory, 2nd ed.; McGraw-Hill: New York, NY, USA, 1978.
  61. Anderson, J.C.; Gerbing, D.W. Structural equation modeling in practice: A review and recommended two-step approach. Psychol. Bull. 1988, 103, 411–423.
  62. Black, A.E.; Deci, E.L. The effects of instructors’ autonomy support and students’ autonomous motivation on learning organic chemistry: A self-determination theory perspective. Sci. Educ. 2000, 84, 740–756.
  63. Barr, R.B.; Tagg, J. From teaching to learning—A new paradigm for undergraduate education. Chang. Mag. Higher Learn. 1995, 27, 12–26.
Figure 1. IS Success Model [25].
Figure 2. Modified IS Success Model for e-learning.
Figure 3. Research design.
Figure 4. Hypothesized research model.
Figure 5. Standardized structural estimates and testing of direct effect hypotheses.
Table 1. Subject profile (N = 204).

                                    Frequency   Percentage
Gender
  Male                                   82        40.2
  Female                                122        59.8
Age
  19                                      6         2.9
  20                                     78        38.2
  21                                     50        24.5
  22                                     26        12.7
  ≥23                                    44        21.6
Major
  Hospitality & tourism                 179        87.7
  Business                               10         4.9
  Foreign language & culture              6         2.9
  Natural sciences & engineering          5         2.5
  Undeclared                              4         2.0
Table 2. Reliability and convergent validity testing.

Construct and indicator (loading; t value)

Instructor involvement (CR 1 = 0.932; α 2 = 0.918; AVE 3 = 0.605)
  Has the instructor appropriately managed the students? (0.840; 33.458)
  Has the instructor fulfilled his or her role as counselor by understanding the needs of the students and responding in a professional manner? (0.791; 28.230)
  Did the instructor promptly fix any problems related to the online system? (0.814; 25.426)
  Were the text messages for lecture notification helpful for taking the lectures? (0.755; 21.567)
  Were the text messages for quiz notification helpful in preparing for the quizzes? (0.769; 21.337)
  Has the teaching assistant promptly attended to problems in the KLAS system and fixed them? (0.696; 14.203)
  Has the teaching assistant promptly responded to students’ questions? (0.743; 18.179)
  Did the bulletin board for the course provide information punctually? (0.836; 27.930)
  This lecture publishes class attendance and mid-term evaluations. Were these resources useful for taking the course? (0.746; 13.396)
System quality (CR = 0.925; α = 0.877; AVE = 0.805)
  Were you able to take lectures without difficulties via the KLAS system? (0.920; 67.013)
  Were you able to easily access course materials via the KLAS system while taking lectures? (0.889; 47.707)
  Was the KLAS system stable while taking lectures? (0.882; 36.590)
  Did you have any trouble connecting to the KLAS 4 system to evaluate courses? R (–; –)
  Have you experienced any inconvenience during course evaluation due to instability of the KLAS system? R (–; –)
Content quality (CR = 0.903; α = 0.864; AVE = 0.651)
  Were the six major courses offered this term helpful for understanding the respective individual majors? (0.835; 41.595)
  Was the special course on the six majors helpful for understanding the respective majors? (0.805; 26.573)
  Was the video quality good enough to follow the lecture? (0.744; 15.215)
  Were the course notes suitable for the lectures? (0.829; 27.153)
  Was the course load suitable for the lectures? (0.815; 26.761)
Test quality (CR = 0.907; α = 0.865; AVE = 0.711)
  Were the quizzes helpful in understanding the lectures? (0.888; 48.980)
  Were the quizzes for the major courses manageable in difficulty? (0.861; 32.695)
  Was there enough time to finish the quizzes? (0.739; 12.935)
  Were the problems in the quiz suitable for evaluating your understanding of the subject? (0.876; 30.715)
Activity quality (CR = 0.912; α = 0.855; AVE = 0.776)
  Was the group activity helpful in choosing a major? (0.884; 41.460)
  Were your teammates for the group activity cooperative? (–; –)
  Do you think group activities can overcome the individualistic nature of the online course? (0.891; 51.686)
  Do you think the UCC 5 production as the assessment was helpful in deciding your major? (0.868; 35.230)
Learner satisfaction (CR = 0.955; α = 0.944; AVE = 0.704)
  Did you gain a better understanding of H&T 6 by taking this course? (0.867; 37.409)
  Are you satisfied with the course design? (0.864; 43.224)
  Are you satisfied with the KLAS system? (0.854; 27.583)
  Are you satisfied with the evaluation system for this course? (0.897; 52.477)
  Are you satisfied with the group activity in this course? (0.777; 21.084)
  Do you think you have successfully completed this self-directed course? (0.856; 33.372)
  Do you think the off-line activities are sufficiently complementary for the online part of the course? (0.768; 16.938)
  Do you think the involvement of the instructor had a positive effect overall on the course? (0.817; 22.186)
  Was this course helpful for deciding your major? (0.847; 27.618)
Learner retention (CR = 0.921; α = 0.871; AVE = 0.795)
  Would you refer this course to someone else? (0.895; 42.790)
  Would you take a similar course, if one exists? (0.905; 45.607)
  Would you take other courses offered by the same instructor? (0.875; 31.916)
1 Composite reliability; 2 Cronbach’s α; 3 average variance extracted; 4 Kyung Hee Learning Archive System; 5 user-generated contents; 6 Hospitality & Tourism; R reverse-coded item; – loading not reported. All reported loadings are significant (p < 0.01).
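The CR and AVE figures in Table 2 follow the standard formulas over standardized loadings [59]: AVE = Σλ²/k and CR = (Σλ)² / ((Σλ)² + Σ(1 − λ²)). A quick sketch (ours, not the authors' code), shown here for the system-quality items:

```python
# Composite reliability (CR) and average variance extracted (AVE)
# computed from the standardized loadings of the system-quality items
# in Table 2 (Fornell & Larcker, 1981).
loadings = [0.920, 0.889, 0.882]

# AVE: mean of the squared standardized loadings.
ave = sum(l ** 2 for l in loadings) / len(loadings)

# CR: (sum of loadings)^2 over itself plus the summed error variances.
s = sum(loadings)
cr = s ** 2 / (s ** 2 + sum(1 - l ** 2 for l in loadings))

print(round(ave, 3), round(cr, 3))  # → 0.805 0.925, matching Table 2
```

The same two formulas reproduce each construct's CR and AVE row in Table 2 from its reported loadings.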
Table 3. Summary statistics, correlations, and discriminant validity testing.

Construct                   Mean    SD 1      1        2        3        4        5        6        7
1. Instructor involvement   4.349   0.606  (0.778)
2. System quality           3.920   0.866   0.471  (0.897)
3. Content quality          4.056   0.703   0.728   0.670  (0.807)
4. Test quality             4.130   0.649   0.755   0.591   0.799  (0.843)
5. Activity quality         3.660   0.953   0.436   0.564   0.673   0.612  (0.881)
6. Learner satisfaction     3.924   0.767   0.696   0.675   0.833   0.812   0.772  (0.839)
7. Learner retention        4.041   0.842   0.664   0.550   0.712   0.744   0.599   0.831  (0.892)
1 Standard deviation. Correlations among study variables are presented below the diagonal; square roots of the average variance extracted (AVE) are presented in parentheses along the diagonal. All correlations among study variables are significant (p < 0.01).
Table 4. Moderated regression analysis of the effect of instructor involvement.

Dependent variable: Learner satisfaction

                                                Model 1             Model 2             Model 3
                                              β       t           β       t           β       t
Independent variables
  System quality                            0.115   2.774 **    0.122   2.998 **    0.108   2.688 **
  Content quality                           0.284   4.979 ***   0.219   3.648 ***   0.185   3.165 **
  Test quality                              0.322   6.351 ***   0.248   4.450 ***   0.286   5.260 ***
  Activity quality                          0.319   7.639 ***   0.342   8.198 ***   0.355   8.471 ***
Moderating variable
  Instructor involvement                                        0.143   2.963 **    0.158   3.154 **
Interactions
  System quality × Instructor involvement                                           0.051   1.226
  Content quality × Instructor involvement                                         −0.092  −1.540
  Test quality × Instructor involvement                                             0.168   3.184 **
  Activity quality × Instructor involvement                                        −0.126  −2.829 **
R²                                          0.822               0.830               0.846
Adjusted R²                                 0.818               0.825               0.838
F                                         229.666 ***         192.669 ***         118.005 ***
ΔR²                                           –                 0.008               0.016
** p < 0.01, *** p < 0.001.