Article

Factors in Evaluating Online Learning in Higher Education in the Era of a New Normal Derived from an Analytic Hierarchy Process (AHP) Based Survey in South Korea †

Yoon Y. Cho and Hyunju Woo
College of Liberal Arts and Interdisciplinary Studies, Kyonggi University, Suwon 16227, Korea
* Author to whom correspondence should be addressed.
† This study was conducted with the ‘University Innovation Support Project’ supported by the Center for Teaching & Learning of Kyonggi University in the academic year of 2021.
Sustainability 2022, 14(5), 3066; https://doi.org/10.3390/su14053066
Submission received: 4 February 2022 / Revised: 2 March 2022 / Accepted: 3 March 2022 / Published: 6 March 2022
(This article belongs to the Section Sustainable Education and Approaches)

Abstract

Before COVID-19, online learning in higher education was more of a choice than a requirement. The majority of universities in South Korea currently do not use an evaluation index tailored specifically to online courses and instead apply the traditional in-person class evaluation standards. This study therefore examines the factors that could be used to evaluate the quality of online learning in higher education conducted due to the COVID-19 pandemic, from the perspectives of the main subjects of online education: e-learning system administrators, instructors, and students. The analytic hierarchy process (AHP) method was used to determine the relative importance of the factors in evaluating online learning. The conclusions derived from this research can serve as foundational material for developing evaluation factors for online learning in higher education.

1. Introduction

The COVID-19 pandemic has caused the world to move online, and the educational system in Korea has responded in kind by reorganizing university coursework to be conducted virtually [1]. The switch from in-person to virtual classrooms caused much difficulty for instructors, students, and online learning system administrators alike when it came to virtual coursework design and instruction [2,3]. The sudden change left no time for preparation, and instructors were given a large work burden while being forced, along with their students, to adjust to the change in environment [4].
While changes made due to the COVID-19 pandemic were considered temporary, online learning in higher education has continued for two years. The common view from research in education acknowledges that the current remote teaching might not be temporary and will even evolve after COVID-19 [5]. From a sustainable educational design perspective, campus-based education and online education must blend seamlessly and, above all, digital technologies and approaches pioneered in flexible distance education must play a continuing and expanding role [5]. In response to these changes, this study focuses on examining online evaluation standards which also can apply to a sustainable learning solution. A sustainable learning solution requires key competencies including holistic understanding, collaboration skills, ability and willingness for critical and reflective thinking, creativity, and innovativeness [6,7].
This study is based on the premise that these competencies should be appropriately reflected in learning evaluation factors so that a sustainable solution for online learning can proceed. E-learning system administrators, instructors, and students are the essential subjects of online learning, yet relatively little research has taken a comprehensive view of all three. In particular, e-learning system administrators are often excluded from the previous literature. The main discussions of this study are as follows. (1) This study takes a comprehensive view of class evaluations for online classes to include the necessary factors related to the three primary roles of online learning: e-learning system administrators, instructors, and students. (2) Using the analytic hierarchy process (AHP), various factors relevant to qualitative evaluations of online learning are examined to understand their relative importance. (3) The study suggests sustainable solutions applicable to evaluation designs for online learning in higher education.

2. Literature Review and Research Questions

2.1. Online Learning in Higher Education during the COVID-19 Pandemic

Online learning, also called ‘e-learning’, refers to methods of education delivered via the internet, including teaching, coaching, and studying digitized educational content [8]. Classes were brought entirely online regardless of the will of those directly involved in them due to the COVID-19 pandemic, creating a sort of live experiment in which instructors, students, and administrators experienced negative aspects of this style of learning. Instructors were forced to quickly re-purpose their classes for the online format with limited understanding of how to conduct them virtually, while their communication with students was restricted, making it harder to guarantee a certain level of quality [2]. Students were required to self-direct their learning for the duration of the courses. Compared to in-person courses, students generally received fewer clear explanations from their instructors and delays in feedback led to lower study motivation. Students found themselves feeling isolated from their fellow students and professors [9,10]. Administrators responsible for managing online education were also unprepared when classes moved online, causing system connection, copyright, and privacy issues, and were faced with a completely new reality different to the previous one in which online classes were only a portion of total classes [11]. The existing LMS (Learning Management System) for online classes became foundational for virtual learning, creating a great need for tailor-made systems specifically for the virtual learning environment. Such a task requires expanding human resources for management and training.
Previous studies on online education largely focus on student satisfaction with online learning [12]. However, this study aims to fill the need for foundational research on class evaluations of full-scale online learning conducted due to a global health crisis. In this regard, this study approaches the analysis of online education slightly differently than previous literature. This study looks at the special circumstances created by the COVID-19 pandemic leading to virtually conducted education at universities, specifically from the perspectives of its three major subjects, and how standards for class evaluations of virtual learning at the university-level should be set. E-Learning system administrators who manage the online learning system seamlessly, instructors who create lesson plans for virtual classes and communicate with students online, and students who are the receivers of the education are the major subjects involved in the many aspects of virtual education. This study examines the perceptions they have of online classes at the university level and how such perceptions may differ, which will inform the relative importance of various factors included in the evaluation index of online classes.

2.2. Online Class Evaluations at Universities

2.2.1. E-Learning System Administrator Competencies

The purpose of online learning systems is to make online education easier to use and more convenient to access for both instructors and students, helping them gain better access to content, allowing easy instructor-student and student-student interactions, and managing attendance software conveniently [13]. The role of e-learning system administration thus becomes crucial to improving the quality of online education. Additionally, both instructors and students reported that the rapid switch to online classes required far more independent study. This is especially true when users must work with an online learning system they are unfamiliar with, which takes time and effort to grow accustomed to and thus lowers the quality of learning [14].
According to previous studies, the aspect reported to be the most difficult was the perceived lack of support from administrators [15,16]. Instructors who are unable to receive any technical support may be anxious about how to plan for online learning or may lose the desire to do so [15]. However, if instructors are given adequate support to use the proper technologies by the online learning system administrator, they are much better able to plan for online classes and experience less stress [17]. Online learning departments must offer regular training sessions on how to utilize their virtual learning platforms and work cooperatively with instructors on coursework and lesson planning. Developing a cooperative relationship with instructors can even lead to higher utilization rates of the online learning system itself [14].
E-learning administrators must also possess adequate knowledge and technical skill to carry out their work. This is not only related to system stability. In order to provide high-quality education and learning opportunities, the system must be accessible on various devices, connected to the latest solutions, and feedback must be ongoing for the human resources working in administrative positions to increase their expertise [3]. If the professionalism of the educational support team can be guaranteed, both students and instructors have access to an optimized LMS, and this can be the driving force behind better utilization.

2.2.2. Instructor Competencies

Instructor competency has long been considered an important factor for student immersion and learning experience, as well as overall learning effectiveness for traditional classes, and this has carried over into online learning [14,18,19]. After university classes were moved online, the skill most often mentioned as of high importance has been the interaction between instructors and students [3,20]. Instructor-student interaction not only adds to the learning experience for students but influences levels of teaching satisfaction [19]. In online classes, interaction is lacking in comparison to traditional in-person methods, and this leads to the possibility of a negative effect on learning outcomes, experience, and academic achievement [21,22]. Consequently, the various strategies for increasing instructor-student interaction in online learning are gaining importance [14]. Timely feedback is paramount as is the instructor’s ability to lead discussion among students [23,24]. Instructors’ use of diverse communication methods with students, such as email, LMS messenger, text message, and instant messenger [12], and ability to take on the role of counselor, coach, and facilitator [25] are key factors enhancing instructor-student interactions.
In order to facilitate instructor-student and student-student interaction, discussion can be built into class time, online forums can be used for assignment submissions where student can give and receive feedback from others, or instructors can consistently check the status of assigned tasks for the purposes of grading [11]. Instructor-student interaction in online classes may be considered lacking in comparison to in-person teaching, but if the flexibility and convenience of online learning is fully utilized, such interaction can take place through built-for-purpose communication channels at any time and place [26].
While planning and designing courses that can facilitate online interaction, instructors must decide accordingly the assessment methods that can persuade and motivate students to accept these classroom environments [27]. In order to pursue higher levels of interaction, the teaching methods applied to in-person classes cannot be identical to those used in online courses. There is a need to develop assessments and assignments tailored to the online class environment, and they must be explained clearly from the beginning [27,28].
Some effective strategies for creating a positive learning experience in online classes are enhancing instructors’ interpersonal competencies, such as addressing students by their names, providing quick feedback on students’ ideas, maintaining an attitude of openness and nonjudgment in relation to student ideas, and using appropriate humor and other communication methods to build rapport [29].

2.2.3. Student Competencies

The majority of studies on student competencies in online learning have focused on the student’s capacity for self-directed learning [30]. This is especially true for pre-recorded lectures, as the coursework can be completed flexibly at any time or place. Such classes, as they are entirely instructor-led in terms of instruction, have slow feedback and low levels of interaction, leading to lower ability for students to focus, and therefore produce limited academic achievement [24]. This reality is reflected in studies that show that compared to blended or in-person classes, online classes have higher rates of withdrawals [31]. Pre-recorded classes allow for better time management and convenience and allow students to continue through the coursework at individual speeds and at different times, making them highly customizable. However, the lack of immediate instructor feedback or lowered rates of interaction are consequences of this method of instruction, making it necessary for students to utilize self-directed learning skills [32]. Even for real-time virtual classes, the lack of social presence of the other students taking the class leads to some individuals losing concentration and becoming distracted [33]. In these circumstances, students are largely left to their own devices to study independently, requiring higher motivation and self-directed learning skills on the part of the students compared to traditional in-person learning [27].
Students who do well in online courses have literacy skills for understanding class content and are able to use the provided content to answer questions posed through critical and analytical thinking skills [34,35]. The majority of provided content is digitized and shared online, so students more than before must have the skills to independently read such content, understand it, and complete related tasks [34]. Learning may take place through a variety of digital mediums, such as video, news infographics, and podcasts, or the homework assigned may require use of such multimedia, thus necessitating digital literacy for utilizing online mediums.
Finally, among the skills required for effective online learning, the desire for student-student interaction is crucial. Most LMSs include platforms for carrying out group projects, but while study remains entirely virtual, student interaction and rapport are difficult to establish, so it is not possible to rely entirely on these systems [2]. The majority of incoming first- and second-year students now have very little experience in attending traditional in-person classes, making it difficult for them to engage in student-student interaction during online classes, and they struggle to feel a sense of camaraderie with other students and the instructor. The willingness of students to interact in class greatly contributes to enhancing the effectiveness of online learning [36]. Student-student interaction helps them form relationships, which increases their interest in the class, acting as further motivation, and is an important factor in class satisfaction and academic achievement [2,12].
Given the literature review above, this study focuses on the major subjects involved in the many aspects of online education. Their respective perceptions and competencies were examined for the purposes of comparison. Based on the above discussion, the research questions were derived as follows:
Research Question 1: What are the factors used for evaluating online education in universities?
Research Question 2: How do university-level e-learning system administrators, instructors, and students perceive the relative importance of each factor in evaluating online learning in higher education and what order of priority is given to each one?

3. Research Method

3.1. Evaluation Factors and Establishing the Hierarchy

This study examines several factors that can evaluate the quality of online education at universities. In order to understand their relative importance, the AHP was conducted in relation to e-learning systems administrators, instructors, and students. AHP is a theory of measurement for decision-making that helps experts to judge how much one element dominates another through pairwise comparison [37]. This technique has been widely used in government and business decision-making studies [38].
This study followed the three steps to build a hierarchical structure to generate priorities to make the decision [39]. Using AHP, decision-makers arrange determinants in a hierarchical structure, starting with the overall goal and descending to criteria, sub-criteria, and alternatives on successive levels. This technique allows the comparison of all mutually comparable components in structuring the content of a specific domain, and measures and reflects the consistency of judgments [39].
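For illustration, the sketch below shows how AHP derives priority weights from a single reciprocal pairwise-comparison matrix via its principal eigenvector. It is a minimal Python example with hypothetical judgments; it does not use the study's data, which were processed with Expert Choice 11.

```python
# Minimal AHP priority sketch (illustrative judgments, not the study's data).
import numpy as np

def ahp_priorities(matrix: np.ndarray) -> np.ndarray:
    """Return normalized priority weights from the principal eigenvector."""
    eigenvalues, eigenvectors = np.linalg.eig(matrix)
    principal = eigenvectors[:, np.argmax(eigenvalues.real)].real
    return principal / principal.sum()

# Hypothetical subject-level judgments on Saaty's 1-9 scale:
# rows/columns = administrator, instructor, student competency.
pairwise = np.array([
    [1.0, 1/3, 3.0],   # administrator competency vs. each subject
    [3.0, 1.0, 5.0],   # instructor competency judged dominant
    [1/3, 1/5, 1.0],   # student competency
])
print(ahp_priorities(pairwise).round(3))  # approximately [0.26, 0.64, 0.10]
```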
First, relevant literature was reviewed to understand the factors influencing the quality of university-level online education. These factors were categorized into sub-categories that were revised and supplemented from recommendations from the advisory group consisting of three e-learning administrators, three instructors, and four students. Second, with the help of the advisory group, the names of unclear factors were revised, and factors were modified to be mutually exclusive to each other. The final step was to create a hierarchical structure of evaluation factors derived from the expert participants with their consent and re-organized into three stages of evaluation criteria.
The first-stage evaluation criteria were divided into the three major subjects, ‘e-learning system administrator competency’, ‘instructor competency’, and ‘student competency’. The e-learning system administrator competency refers to the ability to promote the technical stability of the online learning system required for online classes and proper system administration to ensure that both instructors and students can use the system without issues. Instructor competency refers to the ability of the instructor to effectively design online classes, communicate with students, and utilize effective grading methods. Student competency refers to the ability to engage in proactive learning and active participation in class, utilizing the educational content provided by the virtual learning system for university-level online classes. Second and third-stage evaluation criteria were structured hierarchically, as shown in Table 1.

3.2. Questionnaire

Based on the hierarchical structure developed, this study created a questionnaire following the AHP evaluation model for online education. The questionnaire consists of an overview, an explanation of terms for each evaluation factor and item, an indication of the priority of each evaluation factor (subject, category, and sub-category levels), comparative questions within each classification, and brief demographic questions for statistical purposes. A 17-point scale was used for the comparative questions [37]. The scale displays up to nine points on both the left and the right, with the number one in the middle position. The middle position indicates ‘equal importance’, in which the two items being compared are equally important; three indicates that one factor is ‘slightly more important’; five indicates ‘important’, in which one factor is strongly more important than the other; seven indicates ‘very important’, in which one factor is clearly more important than the other; and nine on either side indicates ‘absolute importance’, in which one element fully dominates the other.
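As an illustration of how a single two-sided judgment on this scale becomes an entry in the reciprocal comparison matrix, the following sketch (a hypothetical helper, not part of the study's instrument) maps one response to the pair of matrix values a_ij and a_ji.

```python
# Hypothetical helper: convert one two-sided 1-9 judgment into reciprocal
# pairwise-comparison matrix entries (a_ij, a_ji).
def to_matrix_entries(favored: str, intensity: int) -> tuple[float, float]:
    if intensity == 1:                         # middle of the scale: equal importance
        return 1.0, 1.0
    if favored == "i":                         # left side of the scale favors factor i
        return float(intensity), 1.0 / intensity
    return 1.0 / intensity, float(intensity)   # right side favors factor j

print(to_matrix_entries("i", 5))  # (5.0, 0.2): factor i is strongly more important
```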
The survey was conducted with ten instructors working at universities in Seoul and the surrounding metropolitan area, ten administrators working in e-learning departments at universities in Seoul and the surrounding metropolitan area, and ten undergraduate students. This study used a purposive sampling method. Several e-learning system experts working in universities were initially contacted and asked to recommend one expert who had sufficient experience with the e-learning system. Likewise, several instructors and students in higher education were contacted and asked to recommend others in their respective groups. Instructors and students who had been experiencing online classes since the start of the transition were selected, and administrators were selected mainly from among the staff of the teaching and learning support centers where online learning systems are managed. So that traditional and online classes could be compared, students who enrolled in university before 2020 were selected from among those who maintained close communication with their instructors, having participated in voluntary feedback with the instructor at least five times. The survey participants were then contacted, informed about the purpose of the study, and invited to take the survey. If a person opted not to participate, that person was asked to recommend another. Through this process, 30 participants were recruited for the survey. A major advantage of AHP is that it does not require a large sample size to achieve sound and statistically robust results [40]. After the participants were confirmed, the survey was conducted over 30 days, from 15 September to 14 October 2021.
In order to increase the consistency of responses, the research team contacted each respondent ahead of time, via videoconference or in-person meetings, to explain the details of the questionnaire and how to mark the priority of each item. A separate orientation was held to help respondents use their priority rankings to answer the comparison questions. After the questionnaire was completed, consistency analysis was performed on the responses of individual respondents to determine reliability. Expert Choice 11, an analytical program specialized for AHP, was used to check this consistency. Responses with CR values higher than 0.1 were to be excluded from the analysis; in the event, no respondents were excluded, and all 30 responses were included in the final analysis.
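The consistency check itself is straightforward to reproduce. The sketch below is a minimal illustration (the study used Expert Choice 11): it computes the consistency index CI = (λmax − n)/(n − 1) and the consistency ratio CR = CI/RI with Saaty's random index RI, flagging a matrix when CR exceeds 0.1.

```python
# Minimal consistency-ratio sketch for an AHP pairwise-comparison matrix.
import numpy as np

RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
                6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}  # Saaty's RI values

def consistency_ratio(matrix: np.ndarray) -> float:
    n = matrix.shape[0]
    lambda_max = np.linalg.eigvals(matrix).real.max()
    ci = (lambda_max - n) / (n - 1)              # consistency index
    ri = RANDOM_INDEX[n]
    return ci / ri if ri else 0.0                # CR = CI / RI

pairwise = np.array([[1.0, 1/3, 3.0],
                     [3.0, 1.0, 5.0],
                     [1/3, 1/5, 1.0]])            # illustrative judgments
print(consistency_ratio(pairwise) < 0.1)          # True: acceptably consistent
```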

4. Results

4.1. Demographic Characteristics

Of the 30 respondents, 16 were men (53.3%) and 14 were women (46.7%). Regarding years of experience in their respective fields, e-learning administrators averaged 9.4 years, instructors 10.6 years, and students 2.6 years. The average age of the respondents was 35.4 years, and their average years of experience was 5.9 years (Table 2).

4.2. E-Learning System Administrators

According to the responses from the administrator cohort, all ten selected ‘instructor competency’ (0.455) as the most important factor at the subject level (Table 3). ‘E-learning system administrator competency’ (0.416) ranked second, and ‘student competency’ (0.130) ranked third.
Within the categories of ‘instructor competency’, ‘communication skills’ (0.606) was the most important factor, and ‘designing a course tailored to online delivery’ (0.232) and ‘fair assessment tailored to online learning’ (0.162) ranked second and third, respectively (Table 4). ‘Ease of use’ (0.475) was the highest ranked category under ‘e-learning system administrator competency’, while ‘technical stability’ (0.384) and ‘management and feedback’ (0.141) were ranked second and third, respectively. These results show that administrators value an easy-to-use and convenient online learning system. The instructor’s communication skills are considered to have priority over designing a course or fair assessment. Among the categories of ‘student competency’, ‘participation’ (0.667) was considered more important than ‘learning skills’ (0.333).
Table 5 explains the priority of each subcategory item as follows. ‘Ease of downloading’ class materials (0.351) was considered the most important in the category of ‘ease of use’ of the online learning system, and ‘instructor-student interaction’ (0.626) was considered most important in the subcategory of ‘communication skills’ of instructors. Lastly, among the subcategories of ‘participation’, ‘student-student and student-instructor interaction’ (0.781) was selected as the most important evaluation factor.
Although the relative importance analysis is useful for examining the priority of specific evaluation items within each category or subcategory, it does not provide a comprehensive answer as to which of the individual items are overall more or less important among all sub-categories. Therefore, the total weight was calculated for all items in the model. Global weights were obtained by multiplying the weights of the sub-category items by the weights of the category and subject level factors, respectively (Table 5).
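As a worked check of this calculation, the snippet below reproduces the top-ranked administrator-group item in Table 5 from the local weights reported in Tables 3 and 4: the global weight of a sub-category item is the product of its own weight and the weights of its category and subject-level factors.

```python
# Global weight of 'instructor-student interaction' for the administrator group,
# using the weights reported in Tables 3-5.
subject_weight  = 0.455   # 'instructor competency' (subject level, Table 3)
category_weight = 0.606   # 'communication skills' (category level, Table 4)
item_weight     = 0.626   # 'instructor-student interaction' (sub-category level, Table 5)

global_weight = subject_weight * category_weight * item_weight
print(round(global_weight, 3))   # 0.173, i.e., 17.3%, the top-ranked item in Table 5
```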
A comprehensive comparison of the evaluation factors showed that the three highest-ranked items based on the importance of online learning evaluation criteria were ‘instructor-student interaction’ (0.173), ‘stability of web server’ (0.088), and the instructor’s ‘role as a facilitator’ (0.073). These results show that among the major factors used for evaluating online learning, administrators value communication between instructors and students, the stability of the internet network over which classes are conducted, and the role of instructors in assisting students in their studies.

4.3. Instructors

According to responses from the instructor cohort consisting of ten individuals, ‘instructor competency’ (0.460) was selected as the most important factor at the subject level (Table 6). ‘E-learning system administrator competency’ (0.339) ranked second, and ‘student competency’ (0.202) ranked third.
Table 7 explains the relative importance within the category level. Within the categories under ‘instructor competency’, ‘communication skills’ (0.606) was the most important, and ‘designing a course tailored to online delivery’ (0.232) and ‘fair assessment tailored to online learning’ (0.162) ranked second and third, respectively. Instructors, similar to administrators, felt that the instructor’s communication skills take priority in online classes. ‘Ease of use’ (0.491) was ranked first among the categories of ‘e-learning system administrator competency’, followed by ‘technical stability’ (0.360) and ‘management and feedback’ (0.149). These results show that instructors are similar to system administrators in valuing ease and convenience of the online learning system. In the ‘student competency’ categories, the instructors understood ‘participation’ (0.667) to take priority over ‘learning skills’ (0.217).
Then, among the subcategories, ‘accessibility of applications in LMS’ (0.541) was considered the most important in the category of ‘ease of use’, while ‘instructor-student interaction’ (0.598) was considered most important in the category of ‘communication skills.’ Lastly, under the category of ‘participation’, ‘student-student and instructor-student interaction’ (0.669) was selected as the most important evaluation factor. All rankings based on the weight for each item are shown in Table 8. The results of the comprehensive analysis show that the three highest ranked items were ‘instructor-student interaction’ (0.138), ‘accessibility of applications within LMS’ (0.090), and ‘stability of web server’ (0.080). These results reveal that instructors value communication between themselves and students, having a variety of applications easily available to them for online classes, and the stability of the internet network in which classes are conducted.

4.4. Students

According to the responses from the student cohort consisting of ten individuals, ‘instructor competency’ (0.414) was selected as the most important factor in online learning at universities (Table 9). ‘E-Learning system administrator competency’ (0.382) ranked second, and ‘student competency’ (0.203) ranked third.
As for the categories of ‘e-learning system administrator competency’, ‘technical stability’ (0.497) took priority, followed by ‘ease of use’ (0.355) and ‘management and feedback’ (0.148). Among the ‘instructor competency’ categories, ‘designing a course tailored to online delivery’ (0.499) was ranked first, while ‘communication skills’ (0.317) and ‘fair assessment tailored to online learning’ (0.184) ranked second and third, respectively. From this, it appears that students value classes planned with online delivery in mind. Students gave the highest priority to the stability of the online learning system. In the ‘student competency’ categories, ‘participation’ (0.651) was ranked higher than ‘learning skills’ (0.349) (Table 10). The priority of each subcategory item is as follows.
As shown in Table 11, among the subcategories of ‘technical stability’ of the online learning system, students ranked ‘stability of web server’ (0.629) as the most important, and among the subcategories of ‘designing a course tailored to online classes’, ‘utilizing up-to-date course materials’ suitable for online classes (0.271) was chosen as the most important factor among the instructor competencies. Finally, looking at the subcategories under ‘participation’, ‘student-student and instructor-student interaction’ (0.724) was selected as the most important evaluation factor. As a result of comprehensively analyzing all 22 evaluation factors, ‘stability of web server’ (0.119) ranked first, followed by ‘student-student and instructor-student interaction’ (0.096) and ‘accessibility of LMS via various devices’ (0.070).

4.5. Summary of Results

The results confirmed that each subject held different priorities for online education. From the perspective of the e-learning system administrator, communication between instructors and students was selected as the most important evaluation standard, and the stability of the web server and the assistive role of the instructor in student learning were selected as the next most important. Thus, the results revealed that from the administrator’s point of view, instructor competencies are the most important factor. The role of the administrator, as perceived by the administrator, is to manage the online learning system so that instructors and students can enjoy online classes without interruptions. One of the interesting findings of this study is that administrators valued instructor-student interaction over technical stability of the online learning system.
From the instructor’s perspective, communication with their students was the most important factor. However, results show how instructors perceive administrator competency to be a close second. Using various applications within the system and stability of the system server so that classes can be provided without interruption were selected as important factors. This can be explained by the convenience of consistently well-connected classes. Linking various applications to the LMS such as Zoom, Google Classroom, and Google Meet allows for more flexible and convenient classes. As the fourth most important factor, the instructors selected student participation, which is essentially the student’s willingness to interact with the class. It appears that while instructors value their own communication skills, the desire and ability of students to actively engage in the class is more important. They recognize that proactive participation on the part of the students contributes to improving the quality of online classes.
Among the three main subjects, students appear to hold the most critical views of online class evaluation standards, as they are the consumers of e-learning management systems, the targets of the educational services, the users of content, and participants in the instructor-student relationship. Students ranked network stability as their highest priority. This was followed by self-directed interaction with other students and the instructor, and by the ability to access the online learning system through various devices so that online learning is possible regardless of place or time. These results show that the most important factor for students in their evaluation of online classes is whether they can participate in their learning as consistently and conveniently as possible, without disconnections during class time. One reason for this is that, because students already have high levels of digital literacy, most are unlikely to find it necessary to learn a new online LMS or read user manuals.

5. Discussion and Conclusions

5.1. Discussion

The differences between the results of this study and those of previous studies are as follows. Most previous studies investigated satisfaction with online classes from the perspectives of students and instructors [16,17,32,34]. The interaction of students and instructors in both offline and online classes was the main factor determining class satisfaction [21,33]. A distinctive finding of this study, however, was that from the student’s point of view the determinant of online class satisfaction was network stability. This means that, as mentioned earlier, when participating in online classes it has become most important to be able to participate without any system malfunction or internet disconnection. Technical stability falls under the responsibility of the e-learning system administrator, which we discuss next.
The purpose of this study was to overcome limitations of online coursework evaluations, particularly in that they focus solely on the perspectives of instructors and students [2,3,10,11]. Despite the COVID-19 pandemic forcing all university classes online, thus expanding the influence of e-learning management systems administrators, studies of the evaluation index from their perspective are lacking. System administrators need to be included in any evaluation index because they are responsible for directly updating and maintaining the learning management systems, and their role must be objectively evaluated and disclosed to both instructors and students.
Although both e-learning system administrators and instructors are in the position of providers and/or operators of online learning, the study confirmed that there was a difference in perception between the two. E-learning system administrators value the role of instructors in assisting students in using online learning management systems, while instructors place priority on having a diversity of applications available to supplement their online classes. It can be inferred from the results that instructors demand various means of communication to interact with students. In response to this demand, e-learning system administrators need to develop the system so that interaction between instructors and students can occur actively. This study re-confirmed that developing a cooperative relationship with instructors is one of the key competencies for e-learning system administrators, and this leads to higher utilization rates of the online learning system itself [14].

5.2. Academic and Practical Implications

An online communication channel needs to be provided for these three subjects through which their differing needs can be communicated. This responsibility lies primarily with the overall university administration. Of course, each university has been operating a temporary ‘help desk’ window to handle the flood of complaints arising from the sudden move to online classes in response to COVID-19. This system, however, merely delivers complaints to a small group of non-experts, i.e., teaching assistants, who then pass them on to the system administrators and professors in charge. For example, when problems related to the system and course design occur simultaneously, each responsible party blames the other, which only results in students suffering the consequences. Rather than just a help desk for filing complaints, an official space for public discourse needs to be provided within the system in which administrators, instructors, and students actively collaborate to raise issues and find resolutions.
Online communication channels such as LMS messenger, email, external messenger applications, and video conferencing have grown more diverse, but the depth of communication still falls short of that in in-person classes. Students should always be given the opportunity to express their opinions through personal, immediate communication channels as well as formal channels such as online class evaluations. Because course evaluations have so far been somewhat biased toward administrative procedures, leaving the results inflexible and the standards arbitrary, they should be replaced with a relative priority indicator centered on smooth communication among instructors, students, and system administrators so that feedback can be reflected for improvement.
As mentioned in the introduction of this study, because online learning is now fully utilized at universities, the class evaluation index must be revised so that it is more appropriate for the delivery medium. This study is meaningful in that it focuses on the three main subjects of online learning, but practical application of the indexes developed here would need to address several considerations, such as the number of students per course, teaching methods, and subject matter. Nevertheless, given that this study took on these perspectives, it has provided an opportunity to think about which issues fundamentally affect online classes, and it is hoped that this will inform the individual items to be included in evaluations developed by universities in the future.
As we are now in the second full year of the forced shift to online classes, the structure of the evaluation index for online courses must adapt in the same way that education itself has in the face of pandemic instability. While this study is meaningful as a means to achieve this adaptation, further research is needed. Post-pandemic courses will inevitably be delivered both in-person and in a hybrid manner, combining traditional offline classes with online content, which may require further revision of evaluation standards. As a stepping stone between offline and hybrid methods, it is expected that the limitations of this study will be extended and revised in the future and utilized as an integrated research method.

Author Contributions

Conceptualization, Y.Y.C. and H.W.; methodology, Y.Y.C.; formal analysis, Y.Y.C.; investigation, H.W.; writing—original draft preparation, Y.Y.C. and H.W.; writing—review and editing, Y.Y.C. and H.W.; visualization, Y.Y.C. and H.W.; funding acquisition, H.W. All authors have read and agreed to the published version of the manuscript.

Funding

This study was conducted with the ‘University Innovation Support Project’ supported by the Center for Teaching & Learning of Kyonggi University in the academic year of 2021.

Institutional Review Board Statement

IRB approval for the study was not required in accordance with Article 23 of the Personal Information Protection Act (personal information controllers are not likely to significantly infringe on the privacy of the data subject, including information on ideology and belief, union or political party membership or withdrawal, political opinion, health, and/or sexual orientation).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Not applicable.

Acknowledgments

We acknowledge the Center for Teaching & Learning of Kyonggi University for its support of our research. We are also grateful for the peer reviewers’ valuable comments, which have helped us to improve the paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ministry of Education Republic of Korea. Responding to COVID-19: Online Classes in Korea (A Challenge toward the Future Education). Available online: https://kosis.kr/files/covid/Responding_to_COVID-19_ONLINE_CLASSES_IN_KOREA.pdf (accessed on 23 December 2021).
  2. Kim, J.; Park, Y.; Kim, K.Y.; Yang, K. An analysis of college professors’ and students’ perceptions and experiences of online classes under the COVID-19 situation. Educ. Res. 2021, 80, 33–58.
  3. Sari, T.; Nayır, F. Challenges in Distance Education During the (COVID-19) Pandemic Period. Qual. Res. Educ. 2020, 9, 328–360.
  4. Allen, J.; Rowan, L.; Singh, P. Teaching and teacher education in the time of COVID-19. Asia-Pac. J. Teach. Educ. 2020, 48, 233–236.
  5. Portuguez Castro, M.; Gómez Zermeño, M.G. Challenge based learning: Innovative pedagogy for sustainability through e-learning in higher education. Sustainability 2020, 12, 4063.
  6. Takala, A.; Korhonen-Yrjänheikki, K. A decade of Finnish engineering education for sustainable development. Int. J. Sustain. High. Educ. 2019, 20, 170–186.
  7. Portuguez Castro, M.; Ross Scheede, C.; Gómez Zermeño, M. The impact of higher education in entrepreneurship and the innovation ecosystem: A case study in Mexico. Sustainability 2019, 20, 5597.
  8. Fallon, C.; Brown, S. E-Learning Standards: A Guide to Purchasing, Developing and Deploying Standards-Conformant e-Learning, 1st ed.; St. Lucie Press: Boca Raton, FL, USA, 2002.
  9. Karkar-Esperat, T.M. International Graduate Students’ Challenges and Learning Experiences in Online Classes. J. Int. Stud. 2018, 8, 1722–1735.
  10. Yim, Y.-K.K. Second Language Students’ Discourse Socialization in Academic Online Communities. Can. Mod. Lang. Rev. 2011, 67, 1–27.
  11. Lee, D.J.; Kim, M. University students’ perceptions on the practices of online learning in the COVID-19 situation and future directions. Mult-Assist. Lang. Learn. 2020, 23, 359–377.
  12. Naseer, S.; Rafique, S. Moderating Role of Teachers’ Academic Support between Students’ Satisfaction with Online Learning and Academic Motivation in Undergraduate Students during COVID-19. Educ. Res. Int. 2021, 2021, 7345579.
  13. Sivo, S.A.; Ku, C.-H.; Acharya, P. Understanding how university student perceptions of resources affect technology acceptance in online learning courses. Australas. J. Educ. Technol. 2018, 34.
  14. Tanner, J.R.; Noser, T.C.; Totaro, M.W.; Bruno, M.S. Business school administrators’ and faculty perceptions of online learning: A comparative study. Proc. Acad. Educ. Leader. 2008, 13, 76–80.
  15. Luongo, N. An Examination of Distance Learning Faculty Satisfaction Levels and Self-Perceived Barriers. J. Educ. Online 2018, 15.
  16. Wingo, N.P.; Ivankova, N.V.; Moss, J.A. Faculty Perceptions about Teaching Online: Exploring the Literature Using the Technology Acceptance Model as an Organizing Framework. Online Learn. 2017, 21, 15–35.
  17. Merillat, L.; Scheibmeier, M. Developing a quality improvement process to optimize faculty success. Online Learn. 2016, 20, 159–172.
  18. Roache, D.; Rowe-Holder, D.; Muschette, R. Transitioning to online distance learning in the COVID-19 era: A call for skilled leadership in higher education institutions. Int. Stud. Educ. Adm. 2020, 48, 103–110.
  19. You, S.; Park, H. University public relations and academic achievement. Korean J. Journal. Commun. Stud. 2018, 62, 329–363.
  20. Alomyan, H. The Impact of Distance Learning on the Psychology and Learning of University Students during the COVID-19 Pandemic. Int. J. Instr. 2021, 14, 585–606.
  21. Lee, H.; Rha, I. Influence of structure and interaction on student achievement and satisfaction in Web-based distance learning. Educ. Tech. Soc. 2009, 12, 372–382.
  22. Morrison, J.S. Getting to Know You: Student-Faculty Interaction and Student Engagement in Online Courses. J. High. Educ. Theory Pract. 2021, 21, 38–44.
  23. Everett, D.R. Adding value: Online student engagement. Inf. Sys. Educ. J. 2015, 13, 68–76.
  24. Hong, K.S.; Lai, K.W.; Holton, D. Students’ satisfaction and perceived learning with a Web-based course. J. Educ. Tech. Soc. 2003, 6, 116–124.
  25. Costley, J.; Lange, C. The effects of instructor control of online learning environments on satisfaction and perceived Learning. Electron. J. e-Learn. 2016, 14, 169–180.
  26. Ali, A.; Ahmad, I. Key Factors for Determining Student Satisfaction in Distance Learning Courses: A Study of Allama Iqbal Open University. Contemp. Educ. Technol. 2011, 2, 118–134.
  27. Banerjee, M.; Brinckerhoff, L.C. Assessing Student Performance in Distance Education Courses: Implications for Testing Accommodations for Students with Learning Disabilities. Assess. Eff. Interv. 2002, 27, 25–35.
  28. George, M.L. Effective Teaching and Examination Strategies for Undergraduate Learning during COVID-19 School Restrictions. J. Educ. Technol. Syst. 2020, 49, 23–48.
  29. Arbaugh, J.B. How Instructor Immediacy Behaviors Affect Student Satisfaction and Learning in Web-Based Courses. Bus. Commun. Q. 2001, 64, 42–54.
  30. Carter, R.A., Jr.; Rice, M.; Yang, S.; Jackson, H.A. Self-regulated learning in online learning environments: Strategies for remote learning. Inf. Learn Sci. 2020, 121, 311–319.
  31. Harker, M.; Koutsantoni, D. Can it be as effective? Distance versus blended learning in a web-based EAP programme. ReCALL 2005, 17, 197–216.
  32. Yang, Y.; Cornelius, L.F. Students’ perceptions towards the quality of online education: A qualitative approach. Assoc. Educ. Commun. Technol. 2004, 27, 861–877.
  33. Lee, S.J.; Huang, K. Online interactions and social presence in online learning. J. Interact. Learn. Res. 2018, 29, 113–128.
  34. Cranfield, D.J.; Tick, A.; Venter, I.M.; Blignaut, R.J.; Renaud, K. Higher Education Students’ Perceptions of Online Learning during COVID-19—A Comparative Study. Educ. Sci. 2021, 11, 403.
  35. Tang, C.M.; Chaw, L.Y. Digital literacy: A prerequisite for effective learning in a blended learning environment? Electron. J. e-Learn. 2016, 14, 54–65.
  36. Sher, A. Assessing the relationship of student-instructor and student-student interaction to student learning and satisfaction in Web-based online learning environment. J. Interact. Online Learn. 2009, 8, 102–120.
  37. Saaty, T.L.; Vargas, L.G. The possibility of group choice: Pairwise comparisons and merging functions. Soc. Choice Welf. 2012, 38, 481–496.
  38. Wind, Y.; Saaty, T.L. Marketing applications of the analytic hierarchy process. Manag. Sci. 1980, 26, 641–658.
  39. Saaty, T.L. Decision making with the analytic hierarchy process. Int. J. Serv. Sci. 2008, 1, 83–98.
  40. Doloi, H. Application of AHP in improving construction productivity from a management perspective. Constr. Manag. Econ. 2008, 26, 841–854.
Table 1. Hierarchical structure of evaluation index for online learning in higher education.

Subject Level | Category Level | Sub-Category Level | Detailed Explanation
E-Learning system administrator competency | Technical stability | Stability of web server | Maintaining a stable communication network, prevention of system overload
E-Learning system administrator competency | Technical stability | Accessibility via various devices | Accessibility of LMS through various digital devices such as smartphone, laptop, and tablet
E-Learning system administrator competency | Ease of use | Accessibility of apps | Applications within the LMS (Zoom, Google Classroom, Google Hangout, Google Meet, etc.)
E-Learning system administrator competency | Ease of use | Ease of downloading | System for convenient downloading of class materials, academic information, and assignments
E-Learning system administrator competency | Ease of use | Updates of manual | Updates to the LMS manual whenever the system is updated
E-Learning system administrator competency | Management and feedback | LMS tutorials | LMS tutorials for instructors, students, and administrators to use the LMS fully
E-Learning system administrator competency | Management and feedback | Management of complaints | Proper handling of complaints and thorough feedback
E-Learning system administrator competency | Management and feedback | Support expertise | Professional training and recruitment of staff for online complaint feedback
Instructor competency | Fair assessment tailored to online learning | Fair grading | Developing a fair grading method for tests and assignments suitable for online classes
Instructor competency | Fair assessment tailored to online learning | Clarity of grading standards | Disclosing objective indicators for tests, assignments, and peer evaluation (from group members)
Instructor competency | Communication skills | Instructor-student interaction | Instructor-student communication via email, LMS messages, texting, and blogs
Instructor competency | Communication skills | Role as facilitator | Systematic management according to learning level; coach, facilitator, and counselor
Instructor competency | Communication skills | Nondiscriminatory manner | Ability to conduct classes without prejudice regarding race, gender, religion, disability, etc.
Instructor competency | Designing a course tailored to online delivery | Clear class goals | Class goals are suitable for online classes
Instructor competency | Designing a course tailored to online delivery | Non-linguistic components | Quality of sound and video playback speed of real-time and recorded lectures
Instructor competency | Designing a course tailored to online delivery | Utilizing up-to-date course materials | Reflecting the latest trends in various content fields such as K-MOOC (open lectures), electronic journals, video clips, popular culture, classical literature, etc.
Instructor competency | Designing a course tailored to online delivery | Teaching methods | Using teaching methods such as PBL (problem-based learning), discussion, debate, and flipped learning suitable for online learning
Student competency | Learning skills | Digital literacy | Skills related to online system use, such as the ability to read, write, and understand digital content
Student competency | Learning skills | Self-directed learning | Ability to self-regulate the quantity and quality of online learning, motivation for academic achievement
Student competency | Learning skills | Critical thinking | Problem solving, analytical, discussion, and debate skills using online learning materials
Student competency | Participation | Student-student and instructor-student interaction | Student-student and instructor-student communication skills
Student competency | Participation | Use of online mediums | Using various multimedia such as video, podcast, or instant messaging to improve participation
Table 2. Demographics of respondents.

Subject | Gender (Female/Male) | Age (Mean) | Experience in Years (Mean) | N
E-Learning system administrators | 5/5 | 38.1 | 9.4 | 10
Instructors | 3/7 | 45.0 | 10.6 | 10
Students | 6/4 | 23.1 | 2.6 | 10
Total | 14/16 | 35.4 | 5.9 | 30
Table 3. Administrator group: Relative importance of subject level factor.

Subject Level | Relative Importance (%) | Importance Order
E-Learning system administrator competency | 41.6 | 2
Instructor competency | 45.5 | 1
Student competency | 13.0 | 3
(Consistency Index C.I. = 0.00; CR < 0.01)
Table 4. Administrator group: Relative importance of category level factor.

Subject Level | Category Level | Relative Importance (%) | Importance Order
E-Learning system administrator competency | Technical stability | 38.4 | 2
E-Learning system administrator competency | Ease of use | 47.5 | 1
E-Learning system administrator competency | Management and feedback | 14.1 | 3
Instructor competency | Fair assessment tailored to online learning | 16.2 | 3
Instructor competency | Communication skills | 60.6 | 1
Instructor competency | Designing a course tailored to online delivery | 23.2 | 2
Student competency | Learning skills | 33.3 | 2
Student competency | Participation | 66.7 | 1
(Consistency Index C.I. = 0.03 for the administrator competency categories and 0.00 for the instructor and student competency categories; CR < 0.01 in all cases)
Table 5. Administrator group: Relative importance (global weight) of each component of online learning evaluation.

Subject Level | Category Level | Sub-Category Level | Relative Importance in % (Order *) | Global Weight in % (Order **)
E-Learning system administrator competency | Technical stability | Stability of web server | 55.3 (1) | 8.8 (2)
E-Learning system administrator competency | Technical stability | Accessibility via various devices | 44.7 (2) | 7.1 (4)
E-Learning system administrator competency | Ease of use | Accessibility of apps | 30.1 (3) | 5.9 (8)
E-Learning system administrator competency | Ease of use | Ease of downloading | 35.1 (1) | 6.9 (5)
E-Learning system administrator competency | Ease of use | Updates to manuals | 34.8 (2) | 6.9 (5)
E-Learning system administrator competency | Management and feedback | LMS tutorials | 34.4 (2) | 2.0 (16)
E-Learning system administrator competency | Management and feedback | Management of complaints | 39.8 (1) | 2.3 (15)
E-Learning system administrator competency | Management and feedback | Support expertise | 25.8 (3) | 1.5 (19)
Instructor competency | Fair assessment tailored to online learning | Fair grading | 53.7 (1) | 4.0 (10)
Instructor competency | Fair assessment tailored to online learning | Clarity of grading standards | 46.3 (2) | 3.4 (11)
Instructor competency | Communication skills | Instructor-student interaction | 62.6 (1) | 17.3 (1)
Instructor competency | Communication skills | Role as a facilitator | 26.4 (2) | 7.3 (3)
Instructor competency | Communication skills | Nondiscriminatory manner | 11.0 (3) | 3.0 (12)
Instructor competency | Designing a course tailored to online delivery | Clear class goal | 23.3 (2) | 2.5 (13)
Instructor competency | Designing a course tailored to online delivery | Non-linguistic components | 6.3 (4) | 0.7 (20)
Instructor competency | Designing a course tailored to online delivery | Utilizing up-to-date course materials | 23.1 (3) | 2.4 (14)
Instructor competency | Designing a course tailored to online delivery | Teaching method | 47.2 (1) | 5.0 (9)
Student competency | Learning skills | Digital literacy | 24.7 (2) | 0.7 (20)
Student competency | Learning skills | Self-directed learning | 56.8 (1) | 1.6 (18)
Student competency | Learning skills | Critical thinking | 18.4 (3) | 0.5 (22)
Student competency | Participation | Student-student and student-instructor interactions | 78.1 (1) | 6.8 (7)
Student competency | Participation | Use of online medium | 21.9 (2) | 1.9 (17)
* Order of relative importance within each category is given in parentheses. ** Overall rank of global weight across all 22 items is given in parentheses.
Table 6. Instructor group: Relative importance of subject level factor.

Subject Level | Relative Importance (%) | Importance Order
E-Learning system administrator competency | 33.9 | 2
Instructor competency | 46.0 | 1
Student competency | 20.2 | 3
(Consistency Index C.I. = 0.00; CR < 0.01)
Table 7. Instructor group: Relative importance of category level factor.

Subject Level | Category Level | Relative Importance (%) | Importance Order
E-Learning system administrator competency | Technical stability | 36.0 | 2
E-Learning system administrator competency | Ease of use | 49.1 | 1
E-Learning system administrator competency | Management and feedback | 14.9 | 3
Instructor competency | Fair assessment tailored to online learning | 16.2 | 3
Instructor competency | Communication skills | 60.6 | 1
Instructor competency | Designing a course tailored to online delivery | 23.2 | 2
Student competency | Learning skills | 21.7 | 2
Student competency | Participation | 66.7 | 1
(Consistency Index C.I. = 0.01 for the administrator competency categories and 0.00 for the instructor and student competency categories; CR < 0.01 in all cases)
Table 8. Instructor group: Relative importance (global weight) of each component of online learning evaluation.

Subject Level | Category Level | Sub-Category Level | Relative Importance in % (Order *) | Global Weight in % (Order **)
E-Learning system administrator competency | Technical stability | Stability of web server | 65.8 (1) | 8.0 (3)
E-Learning system administrator competency | Technical stability | Accessibility via various devices | 34.2 (2) | 4.2 (10)
E-Learning system administrator competency | Ease of use | Accessibility of apps | 54.1 (1) | 9.0 (2)
E-Learning system administrator competency | Ease of use | Ease of downloading | 26.0 (2) | 4.3 (9)
E-Learning system administrator competency | Ease of use | Updates to manuals | 19.9 (3) | 3.3 (14)
E-Learning system administrator competency | Management and feedback | LMS tutorials | 32.5 (3) | 1.6 (21)
E-Learning system administrator competency | Management and feedback | Management of complaints | 39.0 (2) | 2.0 (20)
E-Learning system administrator competency | Management and feedback | Support expertise | 47.0 (1) | 2.4 (18)
Instructor competency | Fair assessment tailored to online learning | Fair grading | 47.2 (2) | 3.7 (12)
Instructor competency | Fair assessment tailored to online learning | Clarity of grading standards | 52.8 (1) | 4.1 (11)
Instructor competency | Communication skills | Instructor-student interaction | 59.8 (1) | 13.8 (1)
Instructor competency | Communication skills | Role as a facilitator | 27.1 (2) | 6.3 (5)
Instructor competency | Communication skills | Nondiscriminatory manner | 13.1 (3) | 3.0 (16)
Instructor competency | Designing a course tailored to online delivery | Clear class goal | 22.8 (3) | 3.4 (13)
Instructor competency | Designing a course tailored to online delivery | Non-linguistic components | 9.1 (4) | 1.4 (22)
Instructor competency | Designing a course tailored to online delivery | Utilizing up-to-date course materials | 31.7 (2) | 4.8 (8)
Instructor competency | Designing a course tailored to online delivery | Teaching method | 36.4 (1) | 5.5 (7)
Student competency | Learning skills | Digital literacy | 24.1 (2) | 2.5 (17)
Student competency | Learning skills | Self-directed learning | 53.9 (1) | 5.7 (6)
Student competency | Learning skills | Critical thinking | 22.0 (3) | 2.3 (19)
Student competency | Participation | Student-student and student-instructor interactions | 66.9 (1) | 6.4 (4)
Student competency | Participation | Use of online medium | 33.1 (2) | 3.2 (15)
* Order of relative importance within each category is given in parentheses. ** Overall rank of global weight across all 22 items is given in parentheses.
Table 9. Student group: Relative importance of subject level factor.

Subject Level | Relative Importance (%) | Importance Order
E-Learning system administrator competency | 38.2 | 2
Instructor competency | 41.4 | 1
Student competency | 20.3 | 3
(Consistency Index C.I. = 0.00; CR < 0.01)
Table 10. Student group: Relative importance of category level factor.

Subject Level | Category Level | Relative Importance (%) | Importance Order
E-Learning system administrator competency | Technical stability | 49.7 | 1
E-Learning system administrator competency | Ease of use | 35.5 | 2
E-Learning system administrator competency | Management and feedback | 14.8 | 3
Instructor competency | Fair assessment tailored to online learning | 18.4 | 3
Instructor competency | Communication skills | 31.7 | 2
Instructor competency | Designing a course tailored to online delivery | 49.9 | 1
Student competency | Learning skills | 34.9 | 2
Student competency | Participation | 65.1 | 1
(Consistency Index C.I. = 0.00 for all category groups; CR < 0.01)
Table 11. Student group: Relative importance (global weight) of each component of online learning evaluation.

Subject Level | Category Level | Sub-Category Level | Relative Importance in % (Order *) | Global Weight in % (Order **)
E-Learning system administrator competency | Technical stability | Stability of web server | 62.9 (1) | 11.9 (1)
E-Learning system administrator competency | Technical stability | Accessibility via various devices | 37.1 (2) | 7.0 (3)
E-Learning system administrator competency | Ease of use | Accessibility of apps | 31.0 (2) | 4.2 (13)
E-Learning system administrator competency | Ease of use | Ease of downloading | 49.8 (1) | 6.8 (4)
E-Learning system administrator competency | Ease of use | Updates to manuals | 19.2 (3) | 2.6 (17)
E-Learning system administrator competency | Management and feedback | LMS tutorials | 30.4 (3) | 1.7 (21)
E-Learning system administrator competency | Management and feedback | Management of complaints | 42.3 (1) | 2.4 (18)
E-Learning system administrator competency | Management and feedback | Support expertise | 42.0 (2) | 2.4 (18)
Instructor competency | Fair assessment tailored to online learning | Fair grading | 43.1 (2) | 3.3 (16)
Instructor competency | Fair assessment tailored to online learning | Clarity of grading standards | 56.9 (1) | 4.3 (11)
Instructor competency | Communication skills | Instructor-student interaction | 37.8 (1) | 5.0 (8)
Instructor competency | Communication skills | Role as a facilitator | 37.1 (2) | 4.9 (9)
Instructor competency | Communication skills | Nondiscriminatory manner | 25.2 (3) | 3.3 (15)
Instructor competency | Designing a course tailored to online delivery | Clear class goal | 23.8 (4) | 4.9 (9)
Instructor competency | Designing a course tailored to online delivery | Non-linguistic components | 24.1 (3) | 5.0 (7)
Instructor competency | Designing a course tailored to online delivery | Utilizing up-to-date course materials | 27.1 (1) | 5.6 (5)
Instructor competency | Designing a course tailored to online delivery | Teaching method | 25.1 (2) | 5.2 (6)
Student competency | Learning skills | Digital literacy | 10.0 (3) | 0.7 (22)
Student competency | Learning skills | Self-directed learning | 60.8 (1) | 4.3 (12)
Student competency | Learning skills | Critical thinking | 29.3 (2) | 2.1 (20)
Student competency | Participation | Student-student and student-instructor interactions | 72.4 (1) | 9.6 (2)
Student competency | Participation | Use of online medium | 27.6 (2) | 3.7 (14)
* Order of relative importance within each category is given in parentheses. ** Overall rank of global weight across all 22 items is given in parentheses.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
