Article

A Fuzzy-Based Evaluation of E-Learning Acceptance and Effectiveness by Computer Science Students in Greece in the Period of COVID-19

by Konstantina Chrysafiadi *, Maria Virvou and George A. Tsihrintzis
Department of Informatics, University of Piraeus, 80, M. Karaoli & A. Dimitriou St., 18534 Piraeus, Greece
* Author to whom correspondence should be addressed.
Electronics 2023, 12(2), 428; https://doi.org/10.3390/electronics12020428
Submission received: 23 December 2022 / Revised: 9 January 2023 / Accepted: 12 January 2023 / Published: 13 January 2023
(This article belongs to the Section Computer Science & Engineering)

Abstract
In this paper, a fuzzy-based evaluation method is presented for the impact of e-learning on several aspects of the learning lives of academic students of Information and Communication Technologies (ICT) during the COVID-19 pandemic. Specifically, the academic year 2020–2021 was considered, when a general lockdown was imposed in Greece and all courses were delivered exclusively through the web. The evaluation criteria are (i) e-learning acceptance, (ii) learning effectiveness, (iii) students’ engagement and (iv) socializing and interpersonal relationships in the educational community. The evaluation was conducted through questionnaires. Ninety-two (92) undergraduate and postgraduate students and ten (10) computer science instructors of the Department of Informatics of the University of Piraeus, Greece, participated in this survey. The questionnaire answers were analyzed using a fuzzy-based mechanism. Particularly, five fuzzy sets were used to describe the 5-point Likert scale answers to the questionnaires with linguistic values, and four other fuzzy sets were used for the description of the evaluation results concerning the four defined evaluation criteria. Moreover, 209 fuzzy rules were created to analyze and calculate the evaluation results per criterion, considering both the students’ answers (119 fuzzy rules) and the instructors’ answers (90 fuzzy rules) to the questionnaires. The gain of this approach is that the analysis of evaluation data with fuzzy rules imitates the human way of thinking and, thus, makes the process more explainable. The evaluation results showed a positive impact of e-learning on student confidence, self-discipline and active participation and a negative impact on student interpersonal relationships. The impact of e-learning on other learning issues was neutral.

1. Introduction

The COVID-19 pandemic dramatically changed many aspects of humans’ everyday activities. Major changes occurred in the field of education, where course delivery needed to be performed through the internet and e-learning platforms and applications. Particularly, COVID-19 is a contagious disease that has spread all over the world since December 2019, causing millions of cases, a significant percentage of which were severe. Moreover, COVID-19 had caused 6,365,510 confirmed deaths worldwide by 20 July 2022 [1]. That is the reason why many governments took a series of measures and restrictions, which included the use of protective masks, hand hygiene, surface disinfection and social distancing [2,3].
Furthermore, in March 2020, many governments decided to shut down various economic, social and educational activities. Therefore, schools, colleges, universities and other educational institutions suspended their traditional educational practices and services. As a result, about 1 billion students were affected [4]. According to the United Nations [5], about 1.725 billion children and youth (about 98.6% of students worldwide) in 200 countries were affected by the closing of educational institutions. Therefore, governments and educational institutions “transferred” their teaching and educational practices and services to the internet [6]. Particularly, during the lockdown period, e-learning constituted the only means of delivering courses and providing educational services, enabling students to continue with their learning process [5]. According to Toth-Stub [4,7], e-learning was growing at about 15.4% each year in educational institutions before the COVID-19 pandemic, while over 60% of students around the world participated in e-learning programs during the COVID-19 pandemic.
Consequently, the rapid and sudden transition from face-to-face courses and traditional educational practices to online courses and e-learning practices caused changes, problems, challenges and opportunities. It has been noticed that student anxiety, stress and depression increased; students faced problems with internet connectivity and e-learning tools and platform accessibility and navigation; they also had problems concerning their eyes and neck and back pain due to increased screen time and prolonged computer and digital device use [5,6,8,9,10,11]. However, the use of e-learning platforms during the lockdown created the opportunity to promote and advance the use of e-learning practices [6,10,12].
Considering the above, there are several studies in the literature on the impact of the COVID-19 pandemic on teaching and learning [10,13,14]. These studies analyze the effectiveness of e-learning during the COVID-19 pandemic [11,12,15] and evaluate the e-learning outcomes, challenges and opportunities in the era of the COVID-19 pandemic [10,16,17,18] and the students’ acceptance of e-learning [9,19]. The evaluations of these studies were mainly based on questionnaires. The answers to the questionnaires, which depict the participants’ opinions, are either numeric or verbal and cover a range of opinions on a topic. For example, a range of answers can be “very dissatisfied”, “somewhat dissatisfied”, “neither dissatisfied nor satisfied”, “somewhat satisfied”, “very satisfied” or a number from 1 to 5, which depicts the degree of satisfaction (i.e., 1 is for “not at all satisfied” and 5 is for “absolutely satisfied”). A very popular scale of opinion answers for constructing questionnaires that measure opinions and perceptions is the Likert scale [20]. Therefore, the analysis of the answers to a questionnaire is a process of computing with words. This is performed effectively using fuzzy logic [21,22], which allows the description of questionnaire answers and evaluation results in a more realistic way. Furthermore, the use of linguistic values and the application of fuzzy rules over them imitate the human way of thinking [21,23,24,25] and make the analysis of questionnaire results more explainable.
Regarding the previous, in this paper, we conducted an evaluation of e-learning acceptance and effectiveness during the COVID-19 pandemic by academic students of Information and Communication Technologies (ICT) in Greece. We chose this discipline since ICT knowledge is critical for new Information Systems developers, and the education of new programmers and information systems experts is an issue that concerns the scientific community. The survey concerns the academic year 2020–2021, when a general lockdown was imposed in Greece, and all courses were delivered only through the web. The evaluation is performed through questionnaires, which were answered by 92 students and 10 instructors of an undergraduate and a postgraduate program in Informatics at the University of Piraeus, Greece. The questionnaire answers were analyzed using a fuzzy-based mechanism. Moreover, the questionnaire results were grouped into four factors, namely acceptance, learning effectiveness, engagement and interpersonal relationships in the educational community. The results concerning these four factors were described via four fuzzy sets. For the calculation of the membership values of the four factors in the four fuzzy sets, 209 fuzzy rules were used, which include all the cases of possible answers for each question of the questionnaires, as analyzed in Section 3.3. The gain of this approach is that the applied analysis mechanism of questionnaire answers models the human way of thinking and, thus, is more understandable. Additionally, the presented mechanism facilitates the evaluation data analysis process since fuzzy logic allows the application of operations over words and the computation with them.

2. Related Work

In the literature, there are several studies that assess the impact of the COVID-19 pandemic on education and the acceptance and effectiveness of distance learning in the era of the pandemic. In [19], quantitative and qualitative data were collected from 270 college students to assess their perceptions of the use and acceptance of online learning and how it affected their cognitive engagement and academic performance. Moreover, in [5], the authors described the impact of e-learning, the students’ interest in using e-learning resources and their performance during the period of the lockdown due to the COVID-19 pandemic. The research was conducted through questionnaires completed by 175 students across the world who were familiar with web-based technology. The aim of this study was to discover the global trend of using e-learning resources, identify the interest and attitude of students towards using e-learning resources and suggest prospects for using e-learning resources.
The authors in [26] investigated and evaluated the learners’ perception in a higher education institution in India and compared the difference in the perception of the same students in the pre- and post-COVID-19 periods. In [9], a study is presented that evaluates the impact on undergraduate students of shifting from traditional learning to online learning during the COVID-19 pandemic. In more detail, the authors of this study conducted two online surveys to examine undergraduate student satisfaction with online learning and identify the positive and negative aspects of online learning. The first survey was performed directly after the transition to online learning with 483 participants, while the second survey, with 853 undergraduate student participants, was performed after the students’ experience with distance learning. The survey showed that more than one-third of the students were dissatisfied with the e-learning experience, mainly due to their distraction, reduced focus, and psychological and management issues.
Furthermore, in [27], an online survey was presented, which was conducted through a questionnaire that was completed by 232 students of undergraduate and postgraduate programs. This survey examined the learning status of the students and the problems and challenges they faced during their education in the era of the COVID-19 pandemic. In addition, the authors of [28] explored the impact of e-learning on the lives and mental health of students during the period of the COVID-19 pandemic. The survey data were collected from 1182 students of different ages from various educational institutes in Delhi, National Capital Region (NCR), India. Moreover, in [29], a descriptive method and an online questionnaire were used to examine the issues and challenges with which university students were confronted and were associated with e-learning during the COVID-19 pandemic lockdown. A total of 216 students from six faculties at the Libyan International Medical University (LIMU) participated in this survey.
Furthermore, a survey is presented in [30] about the challenges and experiences of university students concerning online courses during the COVID-19 pandemic. Particularly, this study presents the learning outcomes, the students’ satisfaction and experience, and the reflections from the instructors for six civil engineering courses that were taught in two universities in Spain and one university in Peru. Furthermore, Alhammadi [12] evaluated the e-learning effect on academic students’ performance and engagement during the COVID-19 pandemic. Finally, the authors of [31] evaluated the attitude and the readiness of university students with regard to distance learning during the lockdown; clarified the difficulties, the possible changes and the future expectations from distance learning; and proposed recommendations and measures for improving the higher education environment.
With regard to the impact of the COVID-19 pandemic on students of computer science, there are also several studies in the literature. Particularly, the authors in [32] found that the performance of students in computer science education decreased during online lessons. Furthermore, the results of the study in [33] indicate that the pandemic negatively affected the students’ interactions with eBooks and the amount of studying in an introductory computer science course. In addition, the certification rate of students in online courses on Python fell during the pandemic [34]. Moreover, the same study revealed that the learners’ behavior changed concerning their interactions with one another on the course discussion forum. This change disclosed that the COVID-19 pandemic negatively affected the students’ participation in the course forums. Furthermore, in [35], an experimental survey showed that Electrical Engineering, Mathematics and Computer Science students at Delft University of Technology suffered from mental problems, loneliness and a lack of study motivation during the COVID-19 pandemic. Moreover, the authors in [36] discovered that, during the COVID-19 pandemic, computer science students mainly chose work division or distributed peer programming approaches rather than synchronous distributed collaboration tools for performing group programming tasks. Therefore, their interaction with each other was negatively affected.
Taking the previous into account, we conclude that there is a significant number of research papers that examine the challenges and problems with which academic students from all over the world were confronted during the period of lockdown due to the COVID-19 pandemic. Moreover, the students’ satisfaction and engagement while participating in online courses and the impact of e-learning on students’ performance are also investigated in these previous works. Regarding computer science education, the corresponding surveys investigate students’ performance, amount of studying, participation and interactions. Most surveys were conducted through interviews and questionnaires. The analysis of the questionnaire answers was not based on a common statistical method. Furthermore, the survey participants’ answers were usually verbal. Therefore, the task of analyzing interviews and questionnaire answers in order to produce evaluation results involves working with words and linguistic values. An ideal way of computing with words is provided by fuzzy logic, which was introduced by Lotfi Zadeh [37]. Furthermore, fuzzy logic models the human way of thinking. Therefore, it is ideal for analyzing and computing the results of a survey that is based on opinions and perceptions. After a thorough literature review, we found that, although there is research concerning the use of fuzzy logic for the evaluation of vague and subjective data, such as students’ performance [38], employees’ performance [39,40], enterprise agility [41], the sensory quality of food products [42] or health indexes [43], there is no research on using fuzzy logic for analyzing the data of a questionnaire-based survey. As a consequence, in this paper, we present a fuzzy-based evaluation method for the assessment of e-learning acceptance and effectiveness during the COVID-19 pandemic period.

3. Evaluation

3.1. The Method

The most common techniques for an e-learning system evaluation are observations, questionnaires and experiments. In this paper, a survey, conducted through questionnaires, is presented. Two questionnaires were used: the first one was addressed to students, and the second was addressed to instructors. The questionnaire (stud_quest), which is presented in Table 1, was completed by 92 students who attended e-learning courses in an undergraduate and a postgraduate program in Informatics at the University of Piraeus in Greece during the lockdown due to the COVID-19 pandemic. The questionnaire contains 23 closed-ended questions, the answers to which follow the 5-point Likert scale [18]. The questionnaire also contains one more question (Q4), whose answers are in a multiple-choice format. In Table 2, the questionnaire (instructor_quest) is presented, which was completed by 10 computer science instructors of the Department of Informatics of the University of Piraeus, Greece. This questionnaire contains 18 closed-ended questions, the answers to which follow the 5-point Likert scale.
The students and the instructors completed the questionnaires anonymously after their participation in the e-learning courses. The answers to the questionnaires were gathered, and their mean value per question was measured. Then, they were grouped into four evaluation criteria, which are described in Section 3.2. Furthermore, a fuzzy-based mechanism was used for the interpretation of the questionnaire results. Particularly, we used four linguistic values: “Low”, “Medium”, “High” and “Very High” to describe in a more realistic way the different degrees of students’ and instructors’ satisfaction and opinion. More details about the fuzzy-based evaluation mechanism are given in Section 3.3.
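The following minimal sketch in Python (our own illustration, not part of the original study) shows how the mean 5-point Likert answer per question could be computed from a matrix of responses before the fuzzy-based interpretation; the response values shown are invented for illustration.

```python
# Minimal sketch (illustrative, not the authors' code): computing the mean
# 5-point Likert answer per question before fuzzification.
import numpy as np

# rows = respondents, columns = questions; values are 1..5 Likert answers
# (the numbers below are invented for illustration)
answers = np.array([
    [4, 5, 3, 4],
    [3, 4, 2, 5],
    [5, 5, 4, 4],
])

mean_per_question = answers.mean(axis=0)
for q, mean_value in enumerate(mean_per_question, start=1):
    print(f"Q{q}: mean answer = {mean_value:.2f}")
```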

3.2. Criteria

In this section, the criteria of the evaluation are described. The questionnaire questions are grouped into the following four evaluation criteria (a minimal grouping sketch in code follows the list).
  • Acceptance: It examines how much the learners like e-learning programs, how willing they are to participate in such programs and how they feel during their enrolment in e-learning programs. This criterion is measured through questions Q1, Q2, Q3 and Q4 of stud_quest and through questions Q1, Q2 and Q3 of instructor_quest.
  • Learning effectiveness: It examines how much e-learning improves learning outcomes. Particularly, it concerns performance improvement, the learners’ fatigue during the learning process, the e-learning contribution to the learners’ confidence and self-discipline, and its effect on the course and learning planning and in education in general. This criterion is measured through questions Q5, Q6, Q7, Q8, Q9, Q10, Q11, Q12 and Q13 of stud_quest and through questions Q4, Q5, Q6, Q7, Q8 and Q9 of instructor_quest.
  • Engagement: It examines how much e-learning improves the students’ engagement in the learning process and how focused they are while attending an online course. This criterion is measured through questions Q14, Q15, Q16, Q17, Q18 and Q19 of stud_quest and through questions Q10, Q11, Q12, Q13 and Q14 of instructor_quest.
  • Socializing and interpersonal relationships in the educational community: It examines the immediacy and the quality of the interpersonal relationships among fellow students and among students and tutors. This criterion is measured through questions Q20, Q21, Q22, Q23 and Q24 of stud_quest and through questions Q15, Q16, Q17 and Q18 of instructor_quest.
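As a purely illustrative aid, the grouping above can be expressed as a mapping from criteria to question identifiers. The question indices follow the groupings listed above; the dictionary layout and names are our own assumptions.

```python
# Hypothetical data structure: criterion -> question IDs, following the
# groupings listed above for the two questionnaires.
criteria_stud_quest = {
    "acceptance": ["Q1", "Q2", "Q3", "Q4"],
    "learning_effectiveness": [f"Q{i}" for i in range(5, 14)],      # Q5-Q13
    "engagement": [f"Q{i}" for i in range(14, 20)],                 # Q14-Q19
    "socializing_interpersonal": [f"Q{i}" for i in range(20, 25)],  # Q20-Q24
}

criteria_instructor_quest = {
    "acceptance": ["Q1", "Q2", "Q3"],
    "learning_effectiveness": [f"Q{i}" for i in range(4, 10)],      # Q4-Q9
    "engagement": [f"Q{i}" for i in range(10, 15)],                 # Q10-Q14
    "socializing_interpersonal": [f"Q{i}" for i in range(15, 19)],  # Q15-Q18
}
```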

3.3. Fuzzy-Based Evaluation Mechanism

The results of the questionnaires, which are based on the Likert scale, are a number between 1 and 5 for each question. For example, the mean answer to a question can be 2.83, 3.45, 4.18, etc. This format of results is not very meaningful to the recipients of the evaluation. For instance, the mean answer of 4.24 to question Q2 of stud_quest, “Would you like to participate in an e-learning program again?”, indicates that the students like to attend an e-course to a degree between “much” and “very much”. Similarly, the mean answer of 3.85 to question Q13 of stud_quest, “Evaluate the impact that distance learning has had on your education as a whole”, indicates that the students believe that e-learning affects their education to a degree between “neutral” and “positive”. This feedback from the questionnaires is not easily understood. It would be more meaningful to the recipients of the survey results if the feedback were of the following format: “The students’ preference for attending an e-learning course is high” or “The students believe that the impact of e-learning on their education is medium”. That is the reason for using fuzzy logic for the interpretation and presentation of the questionnaire results.
In more detail, a Mamdani Fuzzy Inference System was constructed for interpreting the questionnaire results (Figure 1). The Mamdani Fuzzy Inference System was chosen because (i) it better imitates the human way of thinking, (ii) its fuzzy rules are more interpretable and (iii) it is the most commonly used Fuzzy Inference System [44,45,46,47]. Particularly, five triangular fuzzy sets are used for describing the 5-point Likert scale answers with linguistic values, which is closer to the human way of thinking and expression (Table 3 and Figure 2). For the answers to Q4 of stud_quest, which are in a multiple-choice format, the mapping to fuzzy sets is based on the percentage of positive emotions in relation to negative ones. The fuzzy sets for Q4 of stud_quest and their partition are presented in Table 4 and Figure 3. Furthermore, four trapezoidal fuzzy sets are used to describe the degree of opinion and/or satisfaction of the survey participants (students and instructors) for each evaluation criterion (Table 5 and Figure 4).
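The following minimal sketch illustrates triangular and trapezoidal membership functions of the kind described above and the fuzzification of a mean Likert answer. The exact partitions are defined in Table 3/Figure 2 and Table 5/Figure 4, which are not reproduced here, so the breakpoints used below are illustrative assumptions only.

```python
# Minimal sketch of the two membership function shapes used in the paper.
# The breakpoints below are assumed for illustration; the actual partitions
# are given in Table 3 (answers) and Table 5 (criteria).
def triangular(x, a, b, c):
    """Triangular membership: support [a, c], peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def trapezoidal(x, a, b, c, d):
    """Trapezoidal membership: support [a, d], plateau [b, c]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# Assumed partition of the 1..5 Likert axis into five triangular sets:
answer_sets = {
    "Very Low": (0.0, 1.0, 2.0),
    "Low": (1.0, 2.0, 3.0),
    "Neutral": (2.0, 3.0, 4.0),
    "High": (3.0, 4.0, 5.0),
    "Very High": (4.0, 5.0, 6.0),
}

mean_answer = 4.24  # e.g., the mean answer of Q2 of stud_quest
memberships = {name: triangular(mean_answer, *abc) for name, abc in answer_sets.items()}
# -> {'Very Low': 0.0, 'Low': 0.0, 'Neutral': 0.0, 'High': 0.76, 'Very High': 0.24}

# Criteria are described by trapezoidal sets over an assumed 0..100 scale, e.g.:
high_criterion_degree = trapezoidal(69.84, 55, 65, 80, 90)  # -> 1.0
```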
In addition, 119 fuzzy rules were constructed to calculate the evaluation results per criterion considering the students’ answers to the questionnaire. Below, the fuzzy rules for the estimation of each evaluation criterion are presented:
  • There are 19 fuzzy rules for the estimation of “acceptance”. Particularly, the format of the fuzzy rules used for questions Q1, Q2 and Q3 is the following:
    If Qi is “Very Low”, then “acceptance” is “Low”.
    If Qi is “Low”, then “acceptance” is “Low”.
    If Qi is “Neutral”, then “acceptance” is “Medium”.
    If Qi is “High”, then “acceptance” is “High”.
    If Qi is “Very High”, then “acceptance” is “Very High”.
where i ∈ {1, 2, 3}.
  • Furthermore, the fuzzy rules for Q4 are:
    If Q4 is “Low”, then “acceptance” is “Low”.
    If Q4 is “Medium”, then “acceptance” is “Medium”.
    If Q4 is “High”, then “acceptance” is “High”.
    If Q4 is “Very High”, then “acceptance” is “Very High”.
  • There are 45 fuzzy rules for the estimation of “learning effectiveness”. Particularly, the fuzzy rules are of the following format:
    If Qi is “Very Low”, then “learning effectiveness” is “Low”.
    If Qi is “Low”, then “learning effectiveness” is “Low”.
    If Qi is “Neutral”, then “learning effectiveness” is “Medium”.
    If Qi is “High”, then “learning effectiveness” is “High”.
    If Qi is “Very High”, then “learning effectiveness” is “Very High”.
where i = 5 or i ∈ [8, 13].
  • The fuzzy rules for questions Q6 and Q7 are of the following format:
    If Qi is “Very Low”, then “learning effectiveness” is “Very High”.
    If Qi is “Low”, then “learning effectiveness” is “High”.
    If Qi is “Neutral”, then “learning effectiveness” is “Medium”.
    If Qi is “High”, then “learning effectiveness” is “Low”.
    If Qi is “Very High”, then “learning effectiveness” is “Low”.
  • There are 30 fuzzy rules for the estimation of “engagement”. Particularly, the fuzzy rules are of the following format:
    If Qi is “Very Low”, then “engagement” is “Low”.
    If Qi is “Low”, then “engagement” is “Low”.
    If Qi is “Neutral”, then “engagement” is “Medium”.
    If Qi is “High”, then “engagement” is “High”.
    If Qi is “Very High”, then “engagement” is “Very High”.
where i = 14 or i = 15 or i = 17.
  • The fuzzy rules for questions Q16, Q18 and Q19 are of the following format:
    If Qi is “Very Low”, then “engagement” is “Very High”.
    If Qi is “Low”, then “engagement” is “High”.
    If Qi is “Neutral”, then “engagement” is “Medium”.
    If Qi is “High”, then “engagement” is “Low”.
    If Qi is “Very High”, then “engagement” is “Low”.
  • There are 25 fuzzy rules for the estimation of “socializing & interpersonal relationships in the educational community”. Particularly, the fuzzy rules are of the following format:
    If Qi is “Very Low”, then “socializing & interpersonal relationships” is “Low”.
    If Qi is “Low”, then “socializing & interpersonal relationships” is “Low”.
    If Qi is “Neutral”, then “socializing & interpersonal relationships” is “Medium”.
    If Qi is “High”, then “socializing & interpersonal relationships” is “High”.
    If Qi is “Very High”, then “socializing & interpersonal relationships” is “Very High”.
where i ∈ [20, 24].
Similarly, 90 fuzzy rules were constructed to calculate the evaluation results per criterion considering the instructors’ answers to the questionnaire. Below, the fuzzy rules for the estimation of each evaluation criterion are presented (a code sketch after this list illustrates how the two rule patterns are applied).
  • There are 15 fuzzy rules for the estimation of “acceptance”. Particularly, the fuzzy rules are of the following format:
    If Qi is “Very Low”, then “acceptance” is “Low”.
    If Qi is “Low”, then “acceptance” is “Low”.
    If Qi is “Neutral”, then “acceptance” is “Medium”.
    If Qi is “High”, then “acceptance” is “High”.
    If Qi is “Very High”, then “acceptance” is “Very High”.
where i ∈ {1, 2, 3}.
  • There are 30 fuzzy rules for the estimation of “learning effectiveness”. Particularly, the fuzzy rules are of the following format:
    If Qi is “Very Low”, then “learning effectiveness” is “Low”.
    If Qi is “Low”, then “learning effectiveness” is “Low”.
    If Qi is “Neutral”, then “learning effectiveness” is “Medium”.
    If Qi is “High”, then “learning effectiveness” is “High”.
    If Qi is “Very High”, then “learning effectiveness” is “Very High”.
where i = 4 or i ∈ [6, 9].
  • The fuzzy rules for question Q5 are of the following format:
    If Qi is “Very Low”, then “learning effectiveness” is “Very High”.
    If Qi is “Low”, then “learning effectiveness” is “High”.
    If Qi is “Neutral”, then “learning effectiveness” is “Medium”.
    If Qi is “High”, then “learning effectiveness” is “Low”.
    If Qi is “Very High”, then “learning effectiveness” is “Low”.
  • There are 25 fuzzy rules for the estimation of “engagement”. Particularly, the fuzzy rules are of the following format:
    If Qi is “Very Low”, then “engagement” is “Low”.
    If Qi is “Low”, then “engagement” is “Low”.
    If Qi is “Neutral”, then “engagement” is “Medium”.
    If Qi is “High”, then “engagement” is “High”.
    If Qi is “Very High”, then “engagement” is “Very High”.
where i ∈ [10, 13].
  • The fuzzy rules for question Q14 are of the following format:
    If Qi is “Very Low”, then “engagement” is “Very High”.
    If Qi is “Low”, then “engagement” is “High”.
    If Qi is “Neutral”, then “engagement” is “Medium”.
    If Qi is “High”, then “engagement” is “Low”.
    If Qi is “Very High”, then “engagement” is “Low”.
  • There are 20 fuzzy rules for the estimation of “socializing & interpersonal relationships in the educational community”. Particularly, the fuzzy rules are of the following format:
    If Qi is “Very Low”, then “socializing & interpersonal relationships” is “Low”.
    If Qi is “Low”, then “socializing & interpersonal relationships” is “Low”.
    If Qi is “Neutral”, then “socializing & interpersonal relationships” is “Medium”.
    If Qi is “High”, then “socializing & interpersonal relationships” is “High”.
    If Qi is “Very High”, then “socializing & interpersonal relationships” is “Very High”.
where i ∈ [15, 18].
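The sketch below (our own simplification, not the authors’ implementation) illustrates how the two rule patterns above, the “direct” one and the “reversed” one used for negatively phrased questions (e.g., Q6 and Q7 of stud_quest or Q5 of instructor_quest), propagate a fuzzified answer onto a criterion. A full Mamdani system clips the output fuzzy sets by the firing strengths; here this is reduced to per-label strengths aggregated with the max operator.

```python
# Simplified, hypothetical rule tables reflecting the two patterns above.
DIRECT = {
    "Very Low": "Low", "Low": "Low", "Neutral": "Medium",
    "High": "High", "Very High": "Very High",
}
REVERSED = {
    "Very Low": "Very High", "Low": "High", "Neutral": "Medium",
    "High": "Low", "Very High": "Low",
}

def apply_rules(answer_memberships, rule_table):
    """Map answer-set membership degrees to criterion-set firing strengths."""
    criterion = {"Low": 0.0, "Medium": 0.0, "High": 0.0, "Very High": 0.0}
    for linguistic_value, degree in answer_memberships.items():
        target = rule_table[linguistic_value]
        criterion[target] = max(criterion[target], degree)  # max aggregation
    return criterion

fuzzified_answer = {"Very Low": 0.0, "Low": 0.0, "Neutral": 0.0,
                    "High": 0.76, "Very High": 0.24}
print(apply_rules(fuzzified_answer, DIRECT))    # mostly "High" for, e.g., Q1
print(apply_rules(fuzzified_answer, REVERSED))  # mostly "Low" for, e.g., Q6
```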
For each criterion, the following steps are followed:
  1. Check which questions concern the criterion.
  2. For each question:
    2.1. Calculate the mean answer.
    2.2. Calculate the fuzzy sets to which it belongs.
    2.3. Apply the rules to estimate the degree of the participants’ opinion for the criterion.
    2.4. If more than one rule is applied, aggregate the fuzzy rules’ results.
  3. Use the centroid method [48] to defuzzify the aggregated result and calculate a crisp value that represents the estimated degree of the criterion. The basic principle of the centroid defuzzification method is to find the crisp value x*, which corresponds to the centre of gravity of the membership function distribution representing the aggregated fuzzy rules (Figure 5). The crisp value x* is calculated by the following formula, where n is the number of samples into which the universe of discourse is divided, x_i is a sample value of x and μ(x_i) is the corresponding value of the membership function:
$$x^{*} = \frac{\sum_{i=1}^{n} x_i \,\mu(x_i)}{\sum_{i=1}^{n} \mu(x_i)}$$
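A minimal sketch of the centroid defuzzification step, following the formula above, is given below. The 0–100 universe of the criteria and the clipped “High” trapezoid are illustrative assumptions; in the actual system, the aggregated membership function comes from the fired fuzzy rules.

```python
# Minimal sketch of centroid defuzzification on a sampled membership function.
import numpy as np

def centroid_defuzzify(xs, mu):
    """x* = sum(x_i * mu(x_i)) / sum(mu(x_i)) over the sample grid."""
    xs, mu = np.asarray(xs, float), np.asarray(mu, float)
    if mu.sum() == 0.0:
        raise ValueError("aggregated membership function is identically zero")
    return float((xs * mu).sum() / mu.sum())

xs = np.linspace(0, 100, 1001)
# An assumed "High" trapezoid clipped at an assumed firing strength of 0.76:
high = np.clip(np.minimum((xs - 55) / 10.0, (90 - xs) / 10.0), 0.0, 1.0)
mu_aggregated = np.minimum(high, 0.76)
print(f"x* = {centroid_defuzzify(xs, mu_aggregated):.2f}")  # crisp criterion value
```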

3.4. The Testbed

Ninety-two students participated in the evaluation process. Thirty of them attended e-learning courses in a postgraduate program in Informatics at the University of Piraeus in Greece. The remaining sixty-two students attended e-learning courses in an undergraduate program in Informatics at the University of Piraeus in Greece. Sixty of the ninety-two students were males, and the other thirty-two were females. Their age varied from 20 to 40 years old. Fifty-three students were between 20 and 25 years old, twenty-six students were between 26 and 30 years old, eight students were between 31 and 35 years old, and five students were between 36 and 40 years old. Concerning the participants’ familiarity with computer and software use, it was characterized as “bad” for 3.85% of the students, “mediocre” for 7.69%, “good” for 38.46% and “excellent” for 50% of the participants. Moreover, all participants had experience in face-to-face courses.
Furthermore, 10 computer science instructors of the Department of Informatics of the University of Piraeus in Greece participated in the survey. All of them had at least seven years of experience in teaching computer science lessons. They had experience in both e-learning and face-to-face courses. Furthermore, each of the instructors had given lectures to the participating students for at least three semesters.

3.5. Reliability and Validity

In this subsection, the reliability and validity of the survey questionnaires are presented. More specifically, for the reliability and validity measurement, the following three methods were used: (i) Cronbach’s alpha coefficient, (ii) the average variance extracted (AVE) indicators and (iii) the composite reliability (CR) indicator [49]. Cronbach’s alpha coefficient measures internal consistency [50]. A Cronbach’s alpha value over 0.70 is acceptable, a value over 0.8 is good, and a value over 0.9 is excellent. AVE is a measure of the amount of variance that is captured by a construct in relation to the amount of variance due to measurement error [51]. An AVE value over 0.5 is acceptable. CR measures the overall reliability of a set of items loaded on a latent construct [52]. Its value has to be higher than 0.7 to be acceptable.
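For readers who wish to reproduce these indicators, the sketch below implements the standard formulas for Cronbach’s alpha, CR and AVE (it is not the authors’ analysis script). The item scores and standardized factor loadings shown are hypothetical placeholders; in the study they are derived from the questionnaire data.

```python
# Minimal sketch of the three reliability/validity indicators (standard formulas).
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = items of one criterion."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

def composite_reliability(loadings):
    """CR = (sum L)^2 / ((sum L)^2 + sum(1 - L^2)) for standardized loadings L."""
    loadings = np.asarray(loadings, dtype=float)
    error_variances = 1.0 - loadings ** 2
    return loadings.sum() ** 2 / (loadings.sum() ** 2 + error_variances.sum())

def average_variance_extracted(loadings):
    """AVE = mean of the squared standardized loadings."""
    return float((np.asarray(loadings, dtype=float) ** 2).mean())

scores = np.array([[4, 5, 4], [3, 4, 3], [5, 5, 4], [2, 3, 2]])  # hypothetical
loadings = [0.82, 0.78, 0.74]                                     # hypothetical
print(cronbach_alpha(scores), composite_reliability(loadings),
      average_variance_extracted(loadings))
```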
The results for both questionnaires of the presented survey are depicted in Table 6 and Table 7. We noticed that the AVE and CR values of all evaluation criteria for both questionnaires are acceptable. Concerning Cronbach’s alpha coefficient, it is excellent for “acceptance” in both questionnaires; it is good for the students’ interpersonal relationships with their fellow students and for the “learning effectiveness” of the instructors’ questionnaire; and it is acceptable for the “learning effectiveness” and the “interpersonal relationships with instructors” of the students’ questionnaire and for the “interpersonal relationships” of the instructors’ questionnaire. The value of Cronbach’s alpha coefficient is not acceptable for the “engagement” of either questionnaire. The reason for this is the heterogeneity of the questions. Particularly, a small value as an answer to some questions is interpreted as a negative effect on engagement (i.e., Q14 of stud_quest and Q10 of instructor_quest), while for others, such an answer has a positive effect on engagement (i.e., Q16 of stud_quest and Q14 of instructor_quest).

3.6. Results and Discussion

In this section, the evaluation results per criterion are presented.
  • Acceptance:
By taking into account the students’ answers to the questionnaire stud_quest, the evaluation results concerning the students’ acceptance of e-learning are described in Table 8 and Table 9. Specifically, Table 9 presents the number of students who experienced each one of the defined emotions and the corresponding percentage. Thus, 72% are positive emotions, and 28% are negative emotions. Therefore, the quadruplet, which depicts the fuzzy set partition concerning the students’ emotions, is (0, 0, 1, 0). Applying Fuzzy Rule 17, we conclude that the acceptance according to Q4 is “High” with (0, 0, 1, 0). Then, the centroid method [48] was used to estimate the final value of acceptance. The result showed that the acceptance of e-learning is 69.84%. This is a high value; however, we noticed that, although the students were satisfied with their participation in the e-learning courses, their willingness to choose an e-learning course again instead of a face-to-face course was not very high (Q3).
By taking into account the instructors’ answers to the questionnaire instructor_quest, the evaluation results concerning the students’ acceptance of e-learning are described in Table 10. We used the centroid method [48] to aggregate the fuzzy outcomes of the questions, which are depicted in Table 10, to estimate the final value of e-learning acceptance. The result showed that the e-learning acceptance is 67.95%. This is a slightly high value, which indicates that there is an increased interest in e-learning courses after the COVID-19 pandemic, but the major preference of students is to attend face-to-face courses. Therefore, the instructors’ opinion coincides with the learners’ evaluation results.
  • Learning effectiveness:
The evaluation results concerning the students’ opinions about the learning effectiveness of e-learning are described in Table 11. We used the centroid method [48] to aggregate the fuzzy outcomes of the questions, which are depicted in Table 11, to estimate the final value of learning effectiveness. The result showed that the learning effectiveness of e-learning is 61.53%. This is a slightly high value. We noticed that the students believe that their performance was slightly improved during their participation in e-learning courses (Q5), while their fatigue increased somewhat compared with attending a face-to-face course (Q8). However, the students considered that their confidence and self-discipline increased significantly during their participation in e-learning courses (Q9, Q10, Q11, Q12). This can be explained by the fact that the anonymity and the usually turned-off cameras in online courses decrease the inhibitions and the feelings of shame and embarrassment that a student may feel when physically present in a class. Moreover, the students believe that distance education helps them significantly in the planning of the courses (Q8).
The evaluation results concerning the instructors’ opinions about the learning effectiveness of e-learning are described in Table 12. We used the centroid method [48] to aggregate the fuzzy outcomes of the questions, which are depicted in Table 12, to estimate the final value of learning effectiveness. The result showed that the e-learning effectiveness is 50.1%. This is a medium value, which indicates that there was only a slight increase in learning effectiveness during e-learning courses. Concerning the students’ performance in particular, the instructors did not notice a significant improvement during the e-learning courses in the period of the COVID-19 pandemic. A slight increase was noticed in the students’ confidence and active participation while they were attending online courses. This happened due to the turned-off cameras and the fact that the students were in a familiar and friendly environment (usually their home). Moreover, the teaching of the learning objectives of the courses, which included computer programming and the demonstration of computer applications, was facilitated by the ability to share screens and by the fact that all the participants in the learning process interacted with their personal computers during the online lessons. This also resulted in a significant improvement in the learners’ self-discipline. Furthermore, the instructors noticed that the online courses caused extra fatigue to the learners.
  • Engagement
The evaluation results concerning the students’ engagement in the learning process during an online course, as derived from the answers to the questionnaire stud_quest, are described in Table 13. In order to estimate the final value of students’ engagement, we used the centroid method [48] to aggregate the fuzzy outcomes of the questions depicted in Table 13. The result showed that the students’ engagement in online courses is 57.73%. This is a moderate value. Therefore, we can conclude that e-learning does not increase students’ engagement in the learning process. We noticed that e-learning did not allow students to be more focused on the course (Q14, Q15). Indeed, e-learning increases the students’ distraction (Q16), while it does not cause any significant change in the students’ absences from the courses (Q18, Q19).
Regarding the instructors’ answers, the evaluation results concerning the students’ engagement in the learning process are described in Table 14. In order to estimate the final value of students’ engagement, we used the centroid method [48] to aggregate the fuzzy outcomes of the questions, which are depicted in Table 14. The result showed that the e-learning engagement is 59%. This is a moderate value, which indicates that e-learning does not significantly affect the learners’ engagement in the learning process. Although e-learning courses do not significantly affect the students’ focus on lessons and their absences from them, a slight improvement was noticed in their participation in learning activities and discussions. The reasons for this are the same as those behind the improvement in the learners’ confidence and self-discipline.
  • Socializing and interpersonal relationships in the educational community
The evaluation results, according to the answers to the questionnaire stud_quest, concerning the students’ socializing and interpersonal relationships in the educational community during an online course are described in Table 15. We used the centroid method [48] to aggregate the fuzzy outcomes of the questions in Table 15 to estimate the final value of students’ socializing and interpersonal relationships in the educational community. The result showed that students’ socializing and interpersonal relationships in the educational community were estimated at 37.52%. Consequently, e-learning has a negative effect on students’ interpersonal relationships in the educational community. Particularly, the students reported that communication with their fellow students declined (Q20, Q21). Only the immediacy in their communication with the instructor improved a little (Q22, Q23).
Regarding the instructors’ answers, the evaluation results concerning socializing and interpersonal relationships in the educational community during an online course are described in Table 16. The result showed that socializing and interpersonal relationships in the educational community were estimated at 62.06%. This is a slightly high value, which indicates that the immediacy in communication among the learners and instructors increased during the e-learning courses. However, this does not mean that their relationships have improved. Moreover, the great difference between the interpersonal relationships assessment results of students and instructors arose because students additionally evaluated their relationships and communication with their fellow students, which was not evaluated by the instructors.

4. Discussion and Implications

Due to the pandemic, a large-scale experiment was carried out involving e-learning as the exclusive way of delivering lessons and educational services for at least one year. Therefore, we had the opportunity to discover the advantages and disadvantages of e-learning in the educational process, as well as its effect on the lives and outcomes of academic students. According to the findings of the presented survey, the disadvantages include the following observations:
  • The students’ fatigue increases during online courses compared with attending a face-to-face course.
  • E-learning causes the students’ distraction to increase.
  • The students’ communication with their fellow students and their interpersonal relationships deteriorate during e-learning.
On the other hand, according to the presented study, the advantages of e-learning in the education of academic students in the field of computer science include the following observations:
  • A large percentage of academic students (namely 72%) experienced positive emotions during e-learning, which indicates a positive learning experience.
  • E-learning encourages students to be more confident.
  • E-learning increases students’ self-discipline.
  • E-learning tools assist students in planning their courses.
  • E-learning facilitates students’ participation in courses, since students are absent from lessons less often.
  • The teaching of computer programming and of lessons that include demonstrations of computer applications is facilitated by e-learning.
Finally, the educational aspects that remain essentially invariant are the following:
  • E-learning does not affect the students’ performance positively or negatively.
  • E-learning does not affect the students’ focus on the course positively or negatively.
Consequently, the evaluation results are in accordance with the previous related literature regarding the e-learning acceptance and the effect of e-learning on students’ performance, studying behavior, distraction and interpersonal relationships during lockdown due to the COVID-19 pandemic [9,12,15,29,30]. However, previous studies did not outline how the students’ confidence and self-discipline were affected by e-learning. Furthermore, this study showed that, although the distraction of students during online courses increased, the ability of screen sharing provided by the tools and applications of e-learning, along with the fact that all students use a computer during an online course, facilitated the teaching of computer science lessons (such as teaching a programming language) and motivated students to participate more actively.
The aforementioned advantages satisfy the students and contribute to increasing their interest in e-learning courses. However, due to the aforementioned disadvantages, the students still prefer to attend a face-to-face course rather than an online course. Therefore, the challenge is to use the findings of the presented study in order to improve academic education by integrating e-learning into teaching techniques in such a way as to benefit from its advantages. Regarding the previous, a blended teaching method, which includes face-to-face lessons in combination with e-learning courses (especially for lessons that concern programming languages or computer program demonstrations), would be useful for higher education in the field of Information and Communication Technologies (ICT).
Another important contribution of the presented study is the use of fuzzy logic for analyzing the survey data. This method has not been applied in previous studies in the literature. The use of fuzzy rules in the analysis of the evaluation results better imitates the human way of thinking, making the process more explainable and the results more interpretable. Furthermore, fuzzy rules allow handling the heterogeneous questions of a questionnaire whose answers follow a particular scale representing the level of the respondents’ agreement. In such questionnaires, the lower value of the answer scale often implies a negative answer to some questions (e.g., questions Q2, Q10 and Q24 of stud_quest and questions Q1, Q4, Q6 and Q9 of instructor_quest), while it implies a positive answer for some others (e.g., questions Q6, Q16 and Q19 of stud_quest and questions Q5 and Q14 of instructor_quest). Therefore, fuzzy rules such as “If the answer is low, then the result is low”, “If the answer is low, then the result is high”, “If the answer is high, then the result is high” or “If the answer is high, then the result is low” make the analysis of the answers more interpretable and facilitate the extraction of the analysis results.
Consequently, the findings of the presented study are very important for e-learning and data analysis. They contribute to the effective integration of e-learning into the education process of academic institutions in the discipline of Information and Communication Technologies (ICT). Furthermore, this work contributes to methodologies for analyzing questionnaire data and extracting the corresponding results using fuzzy logic.

5. Conclusions

This paper has described an evaluation of the impact of e-learning on academic students of computer science during the 2020–2021 academic year lockdown caused by the COVID-19 pandemic. The evaluation was performed through questionnaires concerning four different factors: e-learning acceptance, learning effectiveness, students’ engagement, and socializing and interpersonal relationships in the educational community. The analysis of the questionnaire answers and their interpretation with respect to the four factors were performed through a fuzzy-based mechanism. In particular, fuzzy sets were used to describe the evaluation results. Moreover, 209 fuzzy rules were used to calculate the evaluation results concerning the four factors, modeling the human way of thinking and making the process of evaluation data analysis more explainable.
The evaluation results showed that although the students were very satisfied with online courses, they preferred to attend courses with a physical presence. Moreover, students believe that their performance improved during e-learning, although the instructors did not notice a significant improvement in the learners’ performance. The students’ fatigue increased. The evaluation results show a significant improvement in students’ confidence and self-discipline. Furthermore, the students’ distraction increased, but their participation in learning activities and discussions became more active. Finally, e-learning had a negative effect on students’ socializing and interpersonal relationships in the educational community. However, the immediacy in communication among the learners and instructors increased during the e-learning courses. Regarding the findings of the presented study, a blended teaching technique, which combines the advantages of courses with physical presence with the benefits of e-learning, would improve the academic education process and outcomes.
The contribution of the presented study is significant for e-learning and data analysis. Particularly, the findings of the presented study can be used by researchers and experts in the fields of e-learning and education to integrate e-learning into the education process of Information and Communication Technologies courses to improve higher education processes, outcomes and environments. Furthermore, the fuzzy-based data analysis method, which is used in this study, can also be used by other researchers in data analysis.
The limitations of the presented survey are the number of participants, the fact that most of them were familiar with computer use, and the knowledge domain of their studies and of the delivered courses, which allows teaching through digital means.

Author Contributions

Conceptualization, K.C., M.V. and G.A.T.; methodology, K.C. and M.V.; validation, K.C., M.V. and G.A.T.; writing—original draft preparation, K.C.; writing—review and editing, K.C., M.V. and G.A.T.; supervision, M.V. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. World Health Organization. WHO Coronavirus Disease (COVID-19) Dashboard. Available online: https://covid19.who.int/ (accessed on 21 July 2022).
  2. World Health Organization. Coronavirus Disease (COVID-19) Advice for the Public. Available online: https://www.who.int/emergencies/diseases/novel-coronavirus-2019/advice-forpublic (accessed on 29 April 2020).
  3. World Health Organization. Available online: https://www.who.int/ (accessed on 20 July 2022).
  4. UNESCO; Global Education Coalition. COVID-19 Education Response. Available online: https://en.unesco.org/covid19/educationresponse/globalcoalition (accessed on 28 May 2020).
  5. Dhawan, S. Online learning: A panacea in the time of COVID-19 crisis. J. Educ. Technol. Syst. 2020, 49, 5–22. [Google Scholar] [CrossRef]
  6. Radha, R.; Mahalakshmi, K.; Kumar, V.S.; Saravanakumar, A.R. E-Learning during lockdown of Covid-19 pandemic: A global perspective. Int. J. Control Autom. 2020, 13, 1088–1099. [Google Scholar]
  7. Toth-Stub, S. Countries Face an Online Education Learning Curve: The Coronavirus Pandemic has Pushed Education Systems Online, Testing Countries’ Abilities to Provide Quality Learning for All. Available online: https://www.usnews.com/news/best-countries/articles/2020-04-02/coronaviruspandemic-tests-countries-abilities-to-create-effective-online-education (accessed on 27 April 2020).
  8. Murgatrotd, S. COVID-19 and Online Learning. In Alberta, Canada. 2020. Available online: https://www.researchgate.net/publication/339784057_COVID-19_and_Online_Learning (accessed on 23 December 2022).
  9. Maqableh, M.; Alia, M. Evaluation online learning of undergraduate students under lockdown amidst COVID-19 Pandemic: The online learning experience and students’ satisfaction. Child. Youth Serv. Rev. 2021, 128, 106160. [Google Scholar] [CrossRef]
  10. Pokhrel, S.; Chhetri, R. A literature review on impact of COVID-19 pandemic on teaching and learning. High. Educ. Future 2021, 8, 133–141. [Google Scholar] [CrossRef]
  11. Abdull Mutalib, A.A.; Jaafar, M.H. A systematic review of health sciences students’ online learning during the COVID-19 pandemic. BMC Med. Educ. 2022, 22, 524. [Google Scholar] [CrossRef]
  12. Alhammadi, S. The effect of the COVID-19 pandemic on learning quality and practices in higher education—Using deep and surface approaches. Educ. Sci. 2021, 11, 462. [Google Scholar] [CrossRef]
  13. García-Peñalvo, F.J.; Corell, A.; Rivero-Ortega, R.; Rodríguez-Conde, M.J.; Rodríguez-García, N. Impact of the COVID-19 on higher education: An experience-based approach. In Information Technology Trends for a Global and Interdisciplinary Research Community; García-Peñalvo, F.J., Ed.; IGI Global: Hershey, PA, USA, 2021; pp. 1–18. [Google Scholar]
  14. Aivazidi, M.; Michalakelis, C. Exploring Primary School Teachers’ Intention to Use E-Learning Tools during the COVID-19 Pandemic. Educ. Sci. 2021, 11, 695. [Google Scholar] [CrossRef]
  15. Alkhwaldi, A.F.; Absulmuhsin, A.A. Crisis-centric distance learning model in Jordanian higher education sector: Factors influencing the continuous use of distance learning platforms during COVID-19 pandemic. J. Int. Educ. Bus. 2021, 15, 250–272. [Google Scholar] [CrossRef]
  16. Adedoyin, O.B.; Soykan, E. Covid-19 pandemic and online learning: The challenges and opportunities. Interact. Learn. Environ. 2020, 1–13. [Google Scholar] [CrossRef]
  17. Daniel, S.J. Education and the COVID-19 pandemic. Prospects 2020, 49, 91–96. [Google Scholar] [CrossRef]
  18. Maatuk, A.M.; Elberkawi, E.K.; Aljawarneh, S.; Rashaideh, H.; Alharbi, H. The COVID-19 pandemic and E-learning: Challenges and opportunities from the perspective of students and instructors. J. Comput. High. Educ. 2022, 34, 21–38. [Google Scholar] [CrossRef] [PubMed]
  19. Aguilera-Hermida, A.P. College students’ use and acceptance of emergency online learning due to COVID-19. Int. J. Educ. Res. Open 2020, 1, 100011. [Google Scholar] [CrossRef] [PubMed]
  20. Schrum, M.L.; Johnson, M.; Ghuy, M.; Gombolay, M.C. Four years in review: Statistical practices of likert scales in human-robot interaction studies. In Proceedings of the Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, Cambridge, UK, 23–26 March 2020. [Google Scholar]
  21. Chrysafiadi, K.; Virvou, M. Fuzzy Logic for Adaptive Instruction in an E-learning Environment for Computer Programming. IEEE Trans. Fuzzy Syst. 2015, 23, 164–177. [Google Scholar] [CrossRef]
  22. Bhardwaj, N.; Sharma, P. An advanced uncertainty measure using fuzzy soft sets: Application to decision-making problems. Big Data Min. Anal. 2021, 4, 94–103. [Google Scholar] [CrossRef]
  23. Iqbal, A.; Zhao, G.; Cheok, Q.; He, N. Estimation of Machining Sustainability Using Fuzzy Rule-Based System. Materials 2021, 14, 5473. [Google Scholar] [CrossRef]
  24. Yang, L.H.; Ye, F.F.; Liu, J.; Wang, Y.M.; Hu, H. An improved fuzzy rule-based system using evidential reasoning and subtractive clustering for environmental investment prediction. Fuzzy Sets Syst. 2021, 421, 44–61. [Google Scholar] [CrossRef]
  25. Chrysafiadi, K.; Papadimitriou, S.; Virvou, M. Cognitive-based adaptive scenarios in educational games using fuzzy reasoning. Knowl.-Based Syst. 2022, 250, 10911. [Google Scholar] [CrossRef]
  26. Sharma, A.; Alvi, I. Evaluating pre and post COVID-19 learning: An empirical study of learners’ perception in higher education. Educ. Inf. Technol. 2021, 26, 7015–7032. [Google Scholar] [CrossRef]
  27. Kapasia, N.; Paul, P.; Roy, A.; Saha, J.; Zaveri, A.; Mallick, R.; Barman, B.; Das, P.; Chouhan, P. Impact of lockdown on learning status of undergraduate and postgraduate students during COVID-19 pandemic in West Bengal, India. Child. Youth Serv. Rev. 2020, 116, 105194. [Google Scholar] [CrossRef]
  28. Chaturvedi, K.; Vishwakarma, D.K.; Singh, N. COVID-19 and its impact on education, social life and mental health of students: A survey. Child. Youth Serv. Rev. 2021, 121, 105866. [Google Scholar] [CrossRef]
  29. Elberkawi, E.K.; Maatuk, A.M.; Elharish, S.F.; Eltajoury, W.M. Online Learning during the COVID-19 Pandemic: Issues and Challenges. In Proceedings of the 2021 IEEE 1st International Maghreb Meeting of the Conference on Sciences and Techniques of Automatic Control and Computer Engineering MI-STA, Tripoli, Libya, 25–27 May 2021. [Google Scholar]
  30. García-Alberti, M.; Suárez, F.; Chiyón, I.; Mosquera Feijoo, J.C. Challenges and experiences of online evaluation in courses of civil engineering during the lockdown learning due to the COVID-19 pandemic. Educ. Sci. 2021, 11, 59. [Google Scholar] [CrossRef]
  31. Ilieva, G.; Yankova, T.; Klisarova-Belcheva, S.; Ivanova, S. Effects of COVID-19 pandemic on university students’ learning. Information 2021, 12, 163. [Google Scholar] [CrossRef]
  32. Cindy, N.; Fenwick, J.B., Jr. Experiences with online education during COVID-19. In Proceedings of the 2022 ACM Southeast Conference, Virtual, 18–20 April 2022. [Google Scholar]
  33. YeckehZaare, I.; Grot, G.; Dimovski, I.; Pollock, K.; Fox, E. Another Victim of COVID-19: Computer Science Education. In Proceedings of the 53rd ACM Technical Symposium on Computer Science Education V. 1, Providence, RI, USA, 3–5 March 2022. [Google Scholar]
  34. Roy, A.; Yee, M.; Perdue, M.; Stein, J.; Bell, A.; Carter, R.; Miyagawa, S. How COVID-19 Affected Computer Science MOOC Learner Behavior and Achievements: A Demographic Study. In Proceedings of the Ninth ACM Conference on Learning@ Scale, New York, NY, USA, 1–3 June 2022. [Google Scholar]
  35. Rothkrantz, L. The impact of COVID-19 Epidemic on Teaching and Learning. In Proceedings of the 23rd International Conference on Computer Systems and Technologies, Ruse, Bulgaria, 17–18 June 2022. [Google Scholar]
  36. Lacave, C.; Molina, A.I. The Impact of COVID-19 in Collaborative Programming. Understanding the Needs of Undergraduate Computer Science Students. Electronics 2021, 10, 1728. [Google Scholar] [CrossRef]
  37. Zadeh, L.A. Fuzzy Logic = Computing with Words. In Computing with Words in Information/Intelligent Systems 1; Studies in Fuzziness and Soft Computing; Zadeh, L.A., Kacprzyk, J., Eds.; Physica: Heidelberg, Germany, 1999. [Google Scholar]
  38. Gokmen, G.; Akinci, T.Ç.; Tektaş, M.; Onat, N.; Kocyigit, G.; Tektaş, N. Evaluation of student performance in laboratory applications using fuzzy logic. Procedia-Soc. Behav. Sci. 2010, 2, 902–909. [Google Scholar] [CrossRef]
  39. Izquierdo, N.V.; Lezama, O.B.P.; Dorta, R.G.; Viloria, A.; Deras, I.; Hernández-Fernández, L. Fuzzy logic applied to the performance evaluation. Honduran coffee sector case. In Proceedings of the International Conference on Swarm Intelligence, Shanghai, China, 17–22 June 2018. [Google Scholar]
  40. Derebew, B.; Thota, S.; Shanmugasundaram, P.; Asfetsami, T. Fuzzy logic decision support system for hospital employee performance evaluation with maple implementation. Arab. J. Basic Appl. Sci. 2021, 28, 73–79. [Google Scholar] [CrossRef]
  41. Lin, C.T.; Chiu, H.; Tseng, Y.H. Agility evaluation using fuzzy logic. Int. J. Prod. Econ. 2006, 101, 353–368. [Google Scholar] [CrossRef]
  42. Vivek, K.; Subbarao, K.V.; Routray, W.; Kamini, N.R.; Dash, K.K. Application of fuzzy logic in sensory evaluation of food products: A comprehensive study. Food Bioprocess Technol. 2020, 13, 1–29. [Google Scholar] [CrossRef]
  43. Idrees, M.; Riaz, M.T.; Waleed, A.; Paracha, Z.J.; Raza, H.A.; Khan, M.A.; Hashmi, W.S. Fuzzy logic based calculation and analysis of health index for power transformer installed in grid stations. In Proceedings of the 2019 International Symposium on Recent Advances in Electrical Engineering (RAEE), Islamabad, Pakistan, 28–29 August 2019. [Google Scholar]
  44. Mamdani, E.H.; Assilian, S. An Experiment in Linguistic Synthesis with a Fuzzy Logic Controller. Int. J. Man-Mach. Stud. 1975, 7, 1–13. [Google Scholar] [CrossRef]
  45. Izquierdo, S.; Izquierdo, L.R. Mamdani Fuzzy Systems for Modelling and Simulation: A Critical Assessment. 2017. Available online: http://dx.doi.org/10.2139/ssrn.2900827 (accessed on 23 December 2022).
  46. Hamam, A.; Georganas, N.D. A comparison of Mamdani and Sugeno fuzzy inference systems for evaluating the quality of experience of Hapto-Audio-Visual applications. In Proceedings of the 2008 IEEE International Workshop on Haptic Audio visual Environments and Games, Ottawa, ON, Canada, 18–19 October 2008. [Google Scholar]
  47. Gilda, K.; Satarkar, S. Review of Fuzzy Systems through various jargons of technology. Int. J. Emerg. Technol. Innov. Res. 2020, 7, 260–264. [Google Scholar]
  48. Klir, G.; Yuan, B. Fuzzy Sets and Fuzzy Logic; Prentice Hall: Hoboken, NJ, USA, 1995; Volume 4. [Google Scholar]
  49. Pallant, J. SPSS Survival Manual: A Step by Step Guide to Data Analysis Using IBM SPSS, 6th ed.; Open University Press: Maidenhead, UK, 2016. [Google Scholar]
  50. Cronbach, L.J. Coefficient alpha and the internal structure of tests. Psychometrika 1951, 16, 297–334. [Google Scholar] [CrossRef] [Green Version]
  51. Shrestha, N. Factor analysis as a tool for survey analysis. Am. J. Appl. Math. Stat. 2021, 9, 4–11. [Google Scholar] [CrossRef]
  52. Bacon, D.R.; Sauer, P.L.; Young, M. Composite reliability in structural equations modeling. Educ. Psychol. Meas. 1995, 55, 394–406. [Google Scholar] [CrossRef]
Figure 1. The Mamdani Inference System.
Figure 2. The membership functions of the answers.
Figure 3. The membership functions of the answers to question Q4.
Figure 4. The membership functions of the evaluation criteria.
Figure 5. The aggregated active fuzzy rules and their centroid x*.
Table 1. Stud_quest: the survey questionnaire addressed to students.
Question | Answers
Q1. How did your opinion about e-learning change after your participation in e-learning programs during the COVID-19 pandemic? | 1—it got extremely worse, 2, 3, 4, 5—significantly improved
Q2. Would you like to participate in an e-learning program again? | 1—not at all, 2, 3, 4, 5—very much
Q3. Would you choose to attend an e-learning course instead of a face-to-face course? | 1—not at all, 2, 3, 4, 5—very much
Q4. Note how you felt while attending an online course. | (selection from a list of feelings; see Table 9)
Q5. How do you think distance learning has affected your performance? | 1—absolutely negative, 2, 3, 4, 5—absolutely positive
Q6. Evaluate your fatigue while attending an online course. | 1—not at all, 2, 3, 4, 5—very much
Q7. Evaluate your fatigue while attending an online course in relation to attending a face-to-face course. | 1—extremely decreased, 2, 3, 4, 5—extremely increased
Q8. Did e-learning help you in the planning of the courses? | 1—not at all, 2, 3, 4, 5—very much
Q9. How confident were you while attending an online course? | 1—not at all, 2, 3, 4, 5—very much
Q10. How do you think distance learning has affected your confidence in your participation in the learning process? | 1—absolutely negative, 2, 3, 4, 5—absolutely positive
Q11. How self-disciplined were you while attending an online course? | 1—not at all, 2, 3, 4, 5—very much
Q12. How do you think distance learning has affected your self-discipline in your participation in the learning process? | 1—absolutely negative, 2, 3, 4, 5—absolutely positive
Q13. Evaluate the impact that distance learning has had on your education as a whole. | 1—absolutely negative, 2, 3, 4, 5—absolutely positive
Q14. Evaluate how focused you are when attending an online course. | 1—not at all, 2, 3, 4, 5—very much
Q15. Did e-learning help you to be more focused on the learning process? | 1—not at all, 2, 3, 4, 5—very much
Q16. Evaluate how much your attention is distracted while attending an online course. | 1—not at all, 2, 3, 4, 5—very much
Q17. Did e-learning contribute to your distraction from the educational process? | 1—not at all, 2, 3, 4, 5—very much
Q18. How many absences did you make during your participation in online courses? | 1—not at all, 2, 3, 4, 5—very much
Q19. Evaluate the absences you made during your participation in e-learning in relation to the absences you usually make when attending a face-to-face course. | 1—extremely decreased, 2, 3, 4, 5—extremely increased
Q20. Evaluate the communication with your fellow students during e-learning. | 1—extremely bad, 2, 3, 4, 5—excellent
Q21. During your participation in e-learning, the communication with your fellow students: | 1—it got extremely worse, 2, 3, 4, 5—significantly improved
Q22. Was the communication with the tutor direct during the e-learning period? | 1—not at all, 2, 3, 4, 5—very much
Q23. Evaluate the immediacy in solving questions/problems during the e-learning period. | 1—not at all, 2, 3, 4, 5—very much
Q24. During your participation in e-learning, the communication with the instructor: | 1—it got extremely worse, 2, 3, 4, 5—significantly improved
Table 2. Instructor_quest: the survey questionnaire addressed to instructors.
Question | Answers
Q1. How do you think that the learners' opinion about e-learning has changed after their participation in e-learning programs during the COVID-19 pandemic? | 1—it got extremely worse, 2, 3, 4, 5—significantly improved
Q2. Do learners ask you to offer e-learning courses? | 1—not at all, 2, 3, 4, 5—very much
Q3. Evaluate the current requests for e-learning courses in relation to the corresponding requests before the COVID-19 pandemic. | 1—extremely decreased, 2, 3, 4, 5—extremely increased
Q4. How do you think distance learning has affected the learners' performance? | 1—absolutely negative, 2, 3, 4, 5—absolutely positive
Q5. Evaluate the learners' fatigue while attending an online course in relation to attending a face-to-face course. | 1—extremely decreased, 2, 3, 4, 5—extremely increased
Q6. How confident do you think that the learners were while attending an online course in relation to attending a face-to-face course? | 1—not at all, 2, 3, 4, 5—very much
Q7. Evaluate how active the learners were while attending an online course in relation to attending a face-to-face course. | 1—not at all, 2, 3, 4, 5—very much
Q8. How do you think distance learning has affected learners' self-discipline? | 1—absolutely negative, 2, 3, 4, 5—absolutely positive
Q9. Evaluate the impact that distance learning has had on the learning process as a whole. | 1—absolutely negative, 2, 3, 4, 5—absolutely positive
Q10. Evaluate the learners' participation in learning activities while attending an online course in relation to attending a face-to-face course. | 1—extremely decreased, 2, 3, 4, 5—extremely increased
Q11. Evaluate the learners' participation in discussions while attending an online course in relation to attending a face-to-face course. | 1—extremely decreased, 2, 3, 4, 5—extremely increased
Q12. Evaluate how focused the learners are when attending an online course. | 1—not at all, 2, 3, 4, 5—very much
Q13. Did e-learning help the learners to be more focused on the learning process? | 1—not at all, 2, 3, 4, 5—very much
Q14. Evaluate the absences that the learners made during their participation in e-learning courses in relation to the absences they usually make while attending a face-to-face course. | 1—extremely decreased, 2, 3, 4, 5—extremely increased
Q15. Was the communication with the learners direct during the e-learning period? | 1—not at all, 2, 3, 4, 5—very much
Q16. Evaluate the immediacy in solving questions/problems during the e-learning period. | 1—not at all, 2, 3, 4, 5—very much
Q17. How willing were the learners to express their opinion and questions? | 1—not at all, 2, 3, 4, 5—very much
Q18. During e-learning courses, the communication between the instructor and the learners: | 1—it got extremely worse, 2, 3, 4, 5—significantly improved
Table 3. Fuzzy sets for answers.
Linguistic Value | Fuzzy Partition
Very Low | (0, 1, 2)
Low | (1, 2, 3)
Neutral | (2, 3, 4)
High | (3, 4, 5)
Very High | (4, 5, 5)
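The triangular partitions in Table 3 are applied to the mean Likert value of each question. The following Python fragment is a minimal illustrative sketch, not the authors' implementation; the helper name tri_mf and the example value 3.88 (the mean answer to Q1 in Table 8) are ours, while the partitions are exactly those of Table 3.

```python
def tri_mf(x, a, b, c):
    """Triangular membership function for the partition (a, b, c).
    When b == c (e.g., the 'Very High' set), the right side acts as a shoulder."""
    left = 1.0 if x >= b else ((x - a) / (b - a) if b > a else 0.0)
    right = 1.0 if x <= b else ((c - x) / (c - b) if c > b else 0.0)
    return max(0.0, min(left, right))

# Fuzzy partitions of Table 3 for the 5-point Likert answers
answer_sets = {
    "Very Low": (0, 1, 2),
    "Low": (1, 2, 3),
    "Neutral": (2, 3, 4),
    "High": (3, 4, 5),
    "Very High": (4, 5, 5),
}

mean_answer = 3.88  # mean Likert value of Q1 (students' questionnaire, Table 8)
degrees = {name: round(tri_mf(mean_answer, *abc), 2) for name, abc in answer_sets.items()}
print(degrees)
# {'Very Low': 0.0, 'Low': 0.0, 'Neutral': 0.12, 'High': 0.88, 'Very High': 0.0}
# i.e., the fuzzy set (0, 0, 0.12, 0.88, 0) reported for Q1 in Table 8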
Table 4. Fuzzy sets for answers of Q4.
Linguistic Value | Fuzzy Partition
Low | (0, 0, 40, 50)
Medium | (40, 50, 60, 70)
High | (60, 70, 80, 90)
Very High | (80, 90, 90, 100)
Table 5. Fuzzy sets that describe each evaluation criterion.
Linguistic Value | Fuzzy Partition
Low | (0, 0, 30, 50)
Medium | (30, 50, 60, 70)
High | (60, 70, 80, 90)
Very High | (80, 90, 100, 100)
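The partitions in Tables 4 and 5 are trapezoidal, and the final score per criterion is obtained by centroid defuzzification of the aggregated rule outputs (cf. Figure 5). The sketch below is illustrative only and assumes Mamdani-style clipping; the helper names trap_mf and centroid, and the example firing strengths 0.12 and 0.88, are ours and do not reproduce the paper's 209-rule base.

```python
def trap_mf(x, a, b, c, d):
    """Trapezoidal membership function for the partition (a, b, c, d)."""
    if x < a or x > d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a) if b > a else 1.0
    return (d - x) / (d - c) if d > c else 1.0

# Fuzzy partitions of Table 5 for each evaluation criterion (scale 0-100)
criterion_sets = {
    "Low": (0, 0, 30, 50),
    "Medium": (30, 50, 60, 70),
    "High": (60, 70, 80, 90),
    "Very High": (80, 90, 100, 100),
}

def centroid(mu, lo=0.0, hi=100.0, step=0.5):
    """Discretised centroid (centre of gravity) of an aggregated output mu(x)."""
    xs = [lo + i * step for i in range(int((hi - lo) / step) + 1)]
    num = sum(x * mu(x) for x in xs)
    den = sum(mu(x) for x in xs)
    return num / den if den else None

# Hypothetical aggregated output: 'Medium' clipped at 0.12 and 'High' clipped at 0.88
mu = lambda x: max(min(trap_mf(x, *criterion_sets["Medium"]), 0.12),
                   min(trap_mf(x, *criterion_sets["High"]), 0.88))
print(round(centroid(mu), 1))  # crisp criterion score x*
```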
Table 6. Reliability and validity results of the students' questionnaire.
Evaluation Criterion | Questions | Cronbach's Alpha | AVE | CR
Acceptance | Q1–Q3 | 1.49 | 0.69 | 0.87
Learning effectiveness | Q5–Q13 | 0.77 | 0.82 | 0.97
Engagement | Q14–Q19 | 0.49 | 0.71 | 0.85
Socializing and interpersonal relationships: with fellows | Q20–Q21 | 0.80 | 0.85 | 0.83
Socializing and interpersonal relationships: with instructors | Q22–Q24 | 0.72 | 0.78 | 0.82
Table 7. Reliability and validity results of the instructors' questionnaire.
Evaluation Criterion | Questions | Cronbach's Alpha | AVE | CR
Acceptance | Q1–Q3 | 1.21 | 0.63 | 0.84
Learning effectiveness | Q4–Q9 | 0.85 | 0.63 | 0.91
Engagement | Q10–Q14 | 0.54 | 0.59 | 0.72
Interpersonal relationships | Q15–Q18 | 0.78 | 0.51 | 0.80
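For reference, the Cronbach's alpha values reported in Tables 6 and 7 follow the standard formula alpha = (k / (k - 1)) * (1 - sum of item variances / variance of the total score) [50]. A minimal sketch, assuming hypothetical Likert responses rather than the study's data:

```python
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array-like, rows = respondents, columns = questionnaire items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # sample variance per item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical 5-point Likert answers of five respondents to a three-item block (e.g., Q1-Q3)
demo = [[4, 5, 3],
        [3, 4, 3],
        [5, 5, 4],
        [4, 4, 2],
        [2, 3, 2]]
print(round(cronbach_alpha(demo), 2))
```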
Table 8. Analysis of questions concerning the e-learning acceptance: the students' opinions.
Question | Likert Value | Mean Answer (Fuzzy Set) | Mean Answer (Linguistic Value) | Rules | Acceptance (Fuzzy Set) | Acceptance (Linguistic Value)
Q1 | 3.88 | (0, 0, 0.12, 0.88, 0) | Neutral and High | 3 and 4 | (0, 0, 1, 0) | High
Q2 | 4.24 | (0, 0, 0, 0.76, 0.24) | High and Very High | 9 and 10 | (0, 0, 1, 0) | High
Q3 | 3.44 | (0, 0, 0.56, 0.44, 0) | Neutral and High | 13 and 14 | (0, 1, 0, 0) | Medium
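Each row of Tables 8 and 10–16 couples the fuzzified mean answer with the fuzzy rules that fire for it. The fragment below illustrates only the min-max (Mamdani) firing step; the two rules shown are hypothetical stand-ins, not rules 3 and 4 of the paper's rule base, and the resulting degrees are not meant to reproduce the Acceptance column.

```python
# Degrees of the fuzzified mean answer to Q1 (Table 8): (Very Low, Low, Neutral, High, Very High)
answer_degrees = {"Very Low": 0.0, "Low": 0.0, "Neutral": 0.12, "High": 0.88, "Very High": 0.0}

# Hypothetical rules of the form "IF answer IS <label> THEN acceptance IS <label>"
rules = [("Neutral", "Medium"), ("High", "High")]

# Max aggregation of the clipped (min) consequents over the acceptance sets of Table 5
acceptance = {"Low": 0.0, "Medium": 0.0, "High": 0.0, "Very High": 0.0}
for antecedent, consequent in rules:
    acceptance[consequent] = max(acceptance[consequent], answer_degrees[antecedent])
print(acceptance)  # {'Low': 0.0, 'Medium': 0.12, 'High': 0.88, 'Very High': 0.0}
```

Defuzzifying such an aggregated output with the centroid helper sketched after Table 5 yields the crisp score of the corresponding criterion.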
Table 9. Results concerning the students' feelings.
Feeling | Crisp Number | Percentage
None | 5 | 6.25%
Sadness | 2 | 2.5%
Fear | 2 | 2.5%
Anger | 4 | 5%
Anxiety | 5 | 6.25%
Pleasure | 12 | 15%
Disappointment | 8 | 10%
Interest | 16 | 20%
Satisfaction | 14 | 17.5%
Excitement | 12 | 15%
Table 10. Analysis of questions concerning the e-learning acceptance: the instructors' opinions.
Question | Likert Value | Mean Answer (Fuzzy Set) | Mean Answer (Linguistic Value) | Rules | Acceptance (Fuzzy Set) | Acceptance (Linguistic Value)
Q1 | 4 | (0, 0, 0, 1, 0) | High | 123 | (0, 0, 1, 0) | High
Q2 | 3.5 | (0, 0, 0.5, 0.5, 0) | Neutral and High | 127 and 128 | (0, 0.83, 0.17, 0) | Medium and High
Q3 | 4.8 | (0, 0, 0, 0.2, 0.8) | High and Very High | 133 and 134 | (0, 0, 0.358, 0.642) | High and Very High
Table 11. Analysis of questions concerning the learning effectiveness: the students' opinions.
Question | Likert Value | Mean Answer (Fuzzy Set) | Mean Answer (Linguistic Value) | Rules | Learning Effectiveness (Fuzzy Set) | Learning Effectiveness (Linguistic Value)
Q5 | 3.56 | (0, 0, 0.44, 0.56, 0) | Neutral and High | 22 and 23 | (0, 0.765, 0.235, 0) | Medium and High
Q6 | 2.76 | (0, 0.76, 0.24, 0, 0) | Low and Neutral | 26 and 27 | (0, 0.315, 0.685, 0) | Medium and High
Q7 | 3.53 | (0, 0, 0.47, 0.53, 0) | Neutral and High | 32 and 33 | (0.86, 0.14, 0, 0) | Low and Medium
Q8 | 3.88 | (0, 0, 0.12, 0.88, 0) | Neutral and High | 37 and 38 | (0, 0, 1, 0) | High
Q9 | 3.71 | (0, 0, 0.29, 0.71, 0) | Neutral and High | 42 and 43 | (0, 0.44, 0.56, 0) | Medium and High
Q10 | 3.82 | (0, 0, 0.18, 0.82, 0) | Neutral and High | 47 and 48 | (0, 0.59, 0.41, 0) | Medium and High
Q11 | 3.74 | (0, 0, 0.26, 0.74, 0) | Neutral and High | 52 and 53 | (0, 0.37, 0.63, 0) | Medium and High
Q12 | 3.94 | (0, 0, 0.06, 0.94, 0) | Neutral and High | 57 and 58 | (0, 0, 1, 0) | High
Q13 | 3.85 | (0, 0, 0.15, 0.85, 0) | Neutral and High | 62 and 63 | (0, 0.06, 0.94, 0) | Medium and High
Table 12. Analysis of questions concerning the learning effectiveness: the instructors' opinions.
Question | Likert Value | Mean Answer (Fuzzy Set) | Mean Answer (Linguistic Value) | Rules | Learning Effectiveness (Fuzzy Set) | Learning Effectiveness (Linguistic Value)
Q4 | 3.4 | (0, 0, 0.6, 0.4, 0) | Neutral and High | 137 and 138 | (0, 1, 0, 0) | Medium
Q5 | 3.3 | (0, 0, 0.7, 0.3, 0) | Neutral and High | 142 and 143 | (0.49, 0.51, 0, 0) | Low and Medium
Q6 | 3.7 | (0, 0, 0.3, 0.7, 0) | Neutral and High | 147 and 148 | (0, 0.46, 0.54, 0) | Medium and High
Q7 | 3.7 | (0, 0, 0.3, 0.7, 0) | Neutral and High | 152 and 153 | (0, 0.46, 0.54, 0) | Medium and High
Q8 | 3.9 | (0, 0, 0.1, 0.9, 0) | Neutral and High | 157 and 158 | (0, 0, 1, 0) | High
Q9 | 3.6 | (0, 0, 0.4, 0.6, 0) | Neutral and High | 162 and 163 | (0, 0.68, 0.32, 0) | Medium and High
Table 13. Analysis of questions concerning the students' engagement: the students' opinions.
Question | Likert Value | Mean Answer (Fuzzy Set) | Mean Answer (Linguistic Value) | Rules | Engagement (Fuzzy Set) | Engagement (Linguistic Value)
Q14 | 3.15 | (0, 0, 0.85, 0.15, 0) | Neutral and High | 67 and 68 | (0, 1, 0, 0) | Medium
Q15 | 2.82 | (0, 0.18, 0.82, 0, 0) | Low and Neutral | 71 and 72 | (0.27, 0.73, 0, 0) | Low and Medium
Q16 | 3.53 | (0, 0, 0.47, 0.53, 0) | Neutral and High | 77 and 78 | (0.86, 0.14, 0, 0) | Low and Medium
Q17 | 2.94 | (0, 0.06, 0.94, 0, 0) | Low and Neutral | 81 and 82 | (0.03, 0.97, 0, 0) | Low and Medium
Q18 | 1.82 | (0.18, 0.82, 0, 0, 0) | Very Low and Low | 85 and 86 | (0, 0, 1, 0) | High
Q19 | 2.38 | (0, 0.62, 0.38, 0, 0) | Low and Neutral | 91 and 92 | (0, 0.64, 0.36, 0) | Medium and High
Table 14. Analysis of questions concerning the students' engagement: the instructors' opinions.
Question | Likert Value | Mean Answer (Fuzzy Set) | Mean Answer (Linguistic Value) | Rules | Engagement (Fuzzy Set) | Engagement (Linguistic Value)
Q10 | 3.5 | (0, 0, 0.5, 0.5, 0) | Neutral and High | 167 and 168 | (0, 0.88, 0.12, 0) | Medium and High
Q11 | 3.5 | (0, 0, 0.5, 0.5, 0) | Neutral and High | 172 and 173 | (0, 0.88, 0.12, 0) | Medium and High
Q12 | 3.2 | (0, 0, 0.8, 0.2, 0) | Neutral and High | 177 and 178 | (0, 1, 0, 0) | Medium
Q13 | 3 | (0, 0, 1, 0, 0) | Neutral | 182 | (0, 1, 0, 0) | Medium
Q14 | 2.3 | (0, 0.7, 0.3, 0, 0) | Low and Neutral | 186 and 187 | (0, 0.53, 0.47, 0) | Medium and High
Table 15. Analysis of questions concerning students' socializing and interpersonal relationships in the educational community: the students' opinions.
Question | Likert Value | Mean Answer (Fuzzy Set) | Mean Answer (Linguistic Value) | Rules | Socializing and Interpersonal Relationships (Fuzzy Set) | Socializing and Interpersonal Relationships (Linguistic Value)
Q20 | 2.68 | (0, 0.32, 0.68, 0, 0) | Low and Neutral | 96 and 97 | (0.52, 0.48, 0, 0) | Low and Medium
Q21 | 2.5 | (0, 0.5, 0.5, 0, 0) | Low and Neutral | 101 and 102 | (0.81, 0.19, 0, 0) | Low and Medium
Q22 | 3.62 | (0, 0, 0.38, 0.62, 0) | Neutral and High | 107 and 108 | (0, 0.64, 0.36, 0) | Medium and High
Q23 | 3.94 | (0, 0, 0.06, 0.94, 0) | Neutral and High | 112 and 113 | (0, 0.1, 0.9, 0) | Medium and High
Q24 | 3.03 | (0, 0, 0.97, 0.03, 0) | Neutral and High | 117 and 118 | (0, 1, 0, 0) | Medium
Table 16. Analysis of questions concerning students' socializing and interpersonal relationships in the educational community: the instructors' opinions.
Question | Likert Value | Mean Answer (Fuzzy Set) | Mean Answer (Linguistic Value) | Rules | Socializing and Interpersonal Relationships (Fuzzy Set) | Socializing and Interpersonal Relationships (Linguistic Value)
Q15 | 3.7 | (0, 0, 0.3, 0.7, 0) | Neutral and High | 192 and 193 | (0, 0.46, 0.54, 0) | Medium and High
Q16 | 4 | (0, 0, 0, 1, 0) | High | 198 | (0, 0, 1, 0) | High
Q17 | 3.7 | (0, 0, 0.3, 0.7, 0) | Neutral and High | 202 and 203 | (0, 0.46, 0.54, 0) | Medium and High
Q18 | 3 | (0, 0, 1, 0, 0) | Neutral | 207 | (0, 1, 0, 0) | Medium
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
