Article

Two Years of Hybrid Education in Romania: A Comparative Overview of the Students’ Expectations for the Online Educational Platforms

by Mădălin-Dorin Pop 1, Adrian Pavel Pugna 2,3, Vladimir-Ioan Crețu 1 and Sabina Alina Potra 2,3,*

1 Computer and Information Technology Department, Politehnica University of Timisoara, 300223 Timisoara, Romania
2 Management Department, Politehnica University of Timisoara, 300223 Timisoara, Romania
3 Research Centre in Engineering and Management, Politehnica University Timisoara, 300223 Timisoara, Romania
* Author to whom correspondence should be addressed.
Sustainability 2022, 14(21), 14241; https://doi.org/10.3390/su142114241
Submission received: 9 October 2022 / Revised: 26 October 2022 / Accepted: 28 October 2022 / Published: 31 October 2022

Abstract

During the COVID-19 pandemic, high infection rates forced education to move, for certain periods, to a hybrid or fully online system. Both fully online and hybrid educational processes rely on online educational platforms. This research provides a comparative overview of the expectations for these platforms by applying the Kano methodology to the responses of two successive generations of students. Furthermore, this article applies the Fong test as a relevance check to identify the features for which the Kano analysis results are statistically significant. The results of the analysis show that the expectations of students relate mostly to the features describing their access to administrative resources and to technical features. This can be explained by a permanent need that is independent of the chosen educational system (i.e., face-to-face, fully online, or hybrid) and of the field of study. Among the proposed features in the engagement category, the use of virtual reality (VR) technology in creating laboratory experimental lessons is identified as the only mechanism likely to keep students engaged during periods when fully online and hybrid educational processes are needed.

1. Introduction

The COVID-19 pandemic forced the transition to a fully online educational system throughout the world. This led to a decrease in student engagement and motivation, while anxiety increased because of this fast change. Anxiety continued to grow as students experienced the lack of a learning atmosphere during online classes, the lack of a collaborative face-to-face environment, and feelings of loneliness caused by the lack of socialization [1]. From this comes the need for online educational platforms that address these issues.
Compared to other countries that continued the fully online educational process, Romania adopted a hybrid educational system after the first six months of the COVID-19 pandemic. Universities were allowed to manage the return of students to campus and, in the case of an increase in the number of COVID-19 infections, to impose the online educational system for all students. Some similar works analyze the perceptions of students related to the COVID-19 pandemic in Romania [2,3,4,5], but there is a lack of studies proposing new solutions regarding the future use and development of new online educational platforms. Regarding the experiences of other countries from the same region of Central and Eastern Europe as Romania, most of them faced COVID-19 with “emergency remote teaching” at the beginning (spring 2020), when teachers and students were trained to use the Microsoft Teams, Google Classroom, and Zoom platforms. In the Czech Republic and Slovakia, the learning management system (LMS) Moodle was afterward commonly adopted (autumn 2020), as argued by Guncaga et al. [6]. Although Ukraine had experienced the need to implement different forms of distance learning in the past, with the 2009 flu quarantine and the 2015–2016 online teaching due to low temperatures, it did not fully incorporate distance learning [7]. Croatia struggled with too many ICT tools and platforms, such as Loomen, Google Classroom, Yammer, Microsoft Teams, YouTube videos, Office 365 tools, Genially, wizer.me, BookWidgets, Powtoon, GeoGebra, and so on. Baksa and Luić [8] considered that a correlation between remote teaching, the duration of the tasks assigned to students, and the real-time lesson plan is mandatory. In Hungary and Serbia, the popularity of online education has grown, and completely online environments had even been developed before the pandemic. In Hungary, the K-MOOC project (Carpathian Basin Online Education Center) offered online courses in Hungarian for the higher education system, totaling 46 courses between 2016 and 2020, and in Serbia, the ITAcademy also provided completely online courses [9].
Mitescu-Manea et al. chose Romania, Bulgaria, the Republic of Moldova, and Hungary in their analysis of the education policy debates during the first wave of COVID-19. In all of these countries in the region of Central and Eastern Europe, the policies followed two main directions, one focusing on governing “through” the crisis with a strong orientation toward complying with sanitary measures, and the other using the crisis as an opportunity to intervene to reduce inequities in the educational system [10]. The authorities of Bulgaria, the Czech Republic, and Hungary made changes through new guidelines for the adaptation of the learning process and evaluation, while the education-oriented guidelines of Romania focused most on addressing delays in student learning [11]. However, there is still no clear overview of which Central and Eastern European country adapted its educational system most successfully during the COVID-19 pandemic. The results of the decisions taken by the authorities can be better evaluated in the future, especially once the students who started university during the pandemic graduate and face the demands of the labor market. This research continues the work of Pop et al. [12] and uses the same online questionnaire (Table A1) for further data collection and analysis. The study of Pop et al. [12], carried out among students of the Politehnica University of Timisoara, received 57 responses from five faculties and various years of study (from the first year of the Bachelor’s study program to the second year of the Master’s study program). The initial study analyzed the results collected between November and December 2021. The questionnaire was shared with the students via Zoom during online activities and through Facebook groups. The interesting results of the analysis performed by Pop et al. [12] opened the opportunity to extend that study and improve the applied methodology.
The current research presents the results collected in the same manner from November 2021 to April 2022 among the students of the Politehnica University of Timisoara and additionally discusses the characteristics of the respondents from a statistical perspective. In this regard, this study excluded the faculties for which the number of responses was not considered statistically valid relative to the corresponding population (i.e., the total number of students enrolled at that faculty).

1.1. Context and Research Purpose

The COVID-19 pandemic caused a forced and immediate change in the Romanian educational system at all levels. The Politehnica University of Timisoara also applied legal policies complying with the pandemic restrictions and moved all educational activities of the university year 2019–2020 to a fully online format, initially for two weeks starting from 11 March 2020 [13]. The increased number of COVID-19 infections led to an extension of this fully online education period until the end of the university year. The Politehnica University of Timisoara permanently monitored the national evolution of the pandemic and adapted its decisions regarding the educational process according to the legal restrictions decided by the national government. In this sense, the students joined a hybrid educational process aimed at increasing student engagement and developing collaborative relationships among students. This hybrid educational system was first adopted on 14 September 2020 by the Senate of the Politehnica University of Timisoara for the university year 2020–2021 [14]. Due to several waves of COVID-19 infections, it was necessary to switch several times between the fully online and hybrid educational processes.
This context shows that the online educational aspect is present in both educational processes. Here arises the need for online educational platforms that can offer access to educational materials, support the evaluation process, and, at the same time, keep students engaged, especially in the time of a pandemic.
The purpose of this research is to evaluate the proposal for online education that contains the features described previously by Pop et al. [12] and to address the following research questions:
  • How are the proposed features perceived by students from the Politehnica University of Timisoara?
  • Are there significant differences in perceptions based on the students’ field of study?
  • Is there a correlation between student perceptions and their previous experience with the online educational process?

1.2. Related Works

The Kano methodology [15] applies to various domains as a mechanism to identify the features of a product or a service that will delight potential customers. Wu and Lin [16] mentioned this method as suitable for future use in assessing student expectations with e-learning. Fujs et al. [17] applied the Kano methodology to analyze possible efficiency improvements in the use of remote conference tools in higher education. Even if a statistical test was not performed on the results of that analysis, a notable contribution is the improvement of the Berger et al. [18] approach through the incorporation of reverse Kano quality categories in the calculation of the satisfaction and dissatisfaction coefficients. Several studies neglected statistical validation of Kano evaluation results [12,17,19,20,21,22]. However, a small number of studies applied a validation based on the Fong test [23,24] or on Pearson’s correlation coefficient [25].
Some recent studies discussed the psychological impact of the COVID-19 pandemic and the use of online educational platforms. The rapid change to the online educational system caused anxiety in the student community. This can be seen as the result of student concerns about their health, the social distancing restrictions that deprived them of university life experiences [5], the existence of computer anxiety (technophobia) in the case of e-learning [26], or the lack of a learning atmosphere in online classes [1]. Anxiety also has a strong connection to online educational platforms, especially with the use of the gamification concept. Wang and Tahir [27] took as a case study the use of Kahoot!, whose influence on student anxiety is debated among researchers. They found that many existing studies reported a reduction in student anxiety, while only two statistically validated studies “showed a statistically significant reduction of student anxiety, and one study reported that Kahoot! could produce agitation” [27].
The impact of the use of online educational platforms on student achievements represents a key factor in an online or hybrid/blended educational system. Learning motivation and student engagement directly influence their achievements, especially on educational platforms based on VR [28] or augmented reality (AR) [29,30]. Furthermore, these learning platforms facilitate the achievement and improvement of student soft skills such as time management, commitment, collaboration, teamwork, and creative, critical, and affective thinking [31,32,33,34,35].
Other studies focus on the practical implementation of online educational platforms from a technical perspective. Liu et al. [36] proposed the use of cloud computing in the development of a new online educational platform as an efficient mechanism to store and provide educational resources through large-scale distributed environments for student communities. The “Hstar” platform also used cloud services to provide hardware infrastructure services and the main functional features (e.g., online forum, assignment, curriculum resources, classroom management, etc.) [37]. “Aoki Cloud” uses cloud technology in the development process of an online educational platform customized to address business domain needs [38]. This customized platform allows small and medium companies to train their future employees at relatively affordable costs, providing a win-win environment for both trainees and businesses.
Sometimes, it is better to focus on needs and development costs rather than blindly following recent trends in technical solutions. As Yuan [39] argued, “technology application should focus on the time demand, effective application and cost-effectiveness”, and Web-based solutions are preferred in some cases instead of the current trend of cloud computing resource platforms. Chen et al. [40] addressed the need for decentralized online education platforms that use blockchain technology to manage educational resources. The strengths of this solution consist of providing a more secure environment for users and resource storage and solving existing issues on many of the current platforms related to centralized management of educational resources.
Abuhassna et al. [41] applied “Transactional Distance Theory (TDT) and Bloom’s Taxonomy Theory (BTT) to investigate the levels of student satisfaction and academic achievements by analyzing the background, experience, collaborations, interaction, and autonomy of students” [12]. The outcome of this research was to propose a guideline for the development of online educational platforms. They also highlighted the importance of previous experience with such a platform and the good knowledge of the class instructor about the features of the online platform. Zhou et al. [42] analyzed the intention of students to use an online educational platform from the perspective of four external variables “Online Course Design, Perceived System Quality, and Perceived Enjoyment, along with an additional perceived variable (Perceived Interaction)”.
An important step in identifying student perceptions is the chosen data acquisition method. Many researchers prefer the use of questionnaires as a method of collecting data for the analysis of student perceptions related to the evaluation of the learning process. Emanuel et al. [32] used questionnaires to assess the influence of an online educational platform on the achievement of soft skills. Ibáñez et al. [29] also used this method to collect data needed to characterize the impact of AR on the academic achievement and motivation of students. In addition to collecting data for a quantitative analysis, Teo et al. [43] used the focus-group interview method for the qualitative analysis. Potra et al. [5] performed a qualitative analysis based on the grounded theory approach and adopted the asynchronous online interview method for data acquisition for safety reasons during the COVID-19 pandemic. Iosif et al. [4] used a Google Forms questionnaire to identify the perceptions of students related to psychological and educational impact during the COVID-19 pandemic.

2. Materials and Methods

To provide a better overview of the applied methodology, Figure 1 illustrates the steps followed in this research, from the formulation of the research questions to their completion. After defining the research questions, the next step is to identify the potential features of a new online educational platform. The list of these features is also used in the preparation of an online questionnaire that contains questions and possible response options consistent with the Kano methodology, which is described in detail in Section 2.3.
After sharing the designed questionnaire with the student community, the analysis of the collected data begins. The first step in this analysis consists of identifying the total population and calculating the sample size, which provides a better statistical overview of the relevance of this study from the perspective of confidence level and average error. Section 2.4 describes this statistical methodology.
The Kano evaluation represents the second step of the analysis of the results and aims to classify the proposed features according to the approach detailed in Section 2.3. This classification provides an overview of the students’ expectations regarding the online educational process and emphasizes the features worth investing in. In addition to the previous statistical analysis, this Kano evaluation performs a statistical test on the classification of features using the Kano quality attributes. This significance test, usually neglected by most Kano-based evaluation studies, aims to validate the Kano classification. In this way, this research provides a double validation of the results obtained (a statistical check based on population characteristics and a statistical check of the relevance of the collected results by applying the Fong test [44]).
The last step in the proposed research methodology is to verify the relevance of the significant results obtained from the analysis of the collected data against the formulated research questions. Discussions in this regard are presented in Section 4.

2.1. Identification of Potential Features

This article uses the potential features defined in a previous study by Pop et al. [12]. Table 1 presents the list of features, their descriptions, and their classification into four categories according to their purpose. In addition to standard features, such as availability on multiple types of devices and custom profiles, communication-related features and features related to administrative resources were proposed. Furthermore, other functionalities that influence student engagement and motivation were identified based on the related scientific literature; the reasoning behind these choices is presented below.
Student engagement represents a big challenge in the design of a new online educational platform. The difficulty of keeping students engaged becomes greater in a fully online educational system because it is harder to monitor and control their activities. The face-to-face learning system facilitates the identification of specific characteristics that influence student engagement, and professors can directly appeal to their didactic and pedagogical competencies.
A good way to engage students during online learning periods is to implement game-based features that have proven their efficiency in various cases [43,45,46,47]. For this reason, this research aims to address the intrinsic motivation of students by implementing a reward system based on quizzes. This allows the students to answer questions during the courses (a feature also available to students who are watching the course recording) and gives them the chance to win badges. In this way, student attention and engagement during course activities can be increased from a competitive perspective. Ross et al. [48] proposed the use of adaptive quizzes that attempt to address the individual needs of each student. As expected, this approach did not influence student scores and achievements, because final evaluations consider all the concepts presented for a discipline, while the customized quizzes allowed students to neglect the concepts that did not arouse their curiosity. However, an increase in student motivation and engagement was observed. According to the research by Göksün and Gürsoy [47], who compared two quiz-based assessment tools (i.e., Kahoot! and Quizizz), characteristics such as “presentation of questions, feedbacks, progression speed and method of the questions, technical requirements” can influence student achievement and engagement.
The level of student satisfaction, as well as student skills, can be increased by implementing the virtual reality feature. Even if VR usage increases engagement through spatial presence in a mediated learning environment, it has a negative impact on information recall [28]. As argued by Ahn et al. [28], this has a strong connection with individual differences in terms of user experience with spatial presence and is independent of VR content segmentation. The application of this approach to engineering classes showed that it positively influenced communication between students and teamwork skills and increased their motivation to develop their practical skills [49]. A similar study applied the theory of intrinsic motivation in the case of using AR [30]. The researchers developed an AR-based mobile educational application and collected feedback from 78 students using questionnaires before and after the use of the created application. The results show that the use of AR technology increased user satisfaction, attention, and confidence, keeping users motivated during the learning process [29,30].
The availability of video recordings after the courses take place is also an important topic, even if there are concerns from both lecturing staff and students about this functionality [50]. During the COVID-19 pandemic, Schmitz et al. [51] compared video-based learning in the medical field with learning based only on the provided textbooks in terms of student achievement. Video recordings were uploaded and made available to students on their university’s video-based online platform. The exam results show significantly higher scores for students who used the video-based online platform in their preparation for the exam instead of the classical method based on textbooks. The experiment performed by Fujs et al. [17] showed that the availability of course recordings did not significantly influence attendance at live online presentations. On the other hand, the research by Morris et al. [52] identified a significant increase in attendance at lectures that were not recorded. Furthermore, they observed that “students have high expectations about the availability and quality of recordings” [52].

2.2. Design of the Questionnaire

The questionnaire was designed according to the Kano methodology (Section 2.3), creating two questions for each feature proposed in Table 1 to cover both its functional and dysfunctional characteristics, in short, the implications of its presence and of its absence.
In addition to the Kano-related questions, this questionnaire (see Table A1) asks for details about the faculty and year of study. Furthermore, the questionnaire includes an informed consent statement specifying that no personal data will be collected and that the responses will be used only for research purposes.

2.3. Kano-Based Analysis

Kano et al. [15] proposed a methodology to characterize the non-linear relationship between customer satisfaction and the features of a product or service. This approach allows classification of the proposed features using the following quality attributes: attractive (A), one-dimensional (O), must-be (M), indifferent (I), reverse (R), and questionable (Q).
The attractive quality attribute (A) describes the customer’s excitement about a feature. Although it is unexpected and manages to surprise the potential customer, the implementation of this functionality will certainly lead to greater customer satisfaction. A one-dimensional quality attribute (O) emphasizes the clearest perception expressed by a potential customer through a straightforward reaction that describes that the absence of it will bring dissatisfaction; the potential customer likes to have this feature and dislikes not having it. The must-be quality attribute (M) shows the cases where the presence of a feature is tolerated, expected, or the potential customer feels neutral with respect to its presence but expresses straightforward disagreement about its absence. An indifferent quality attribute (I) describes the case of a feature that has no positive or negative impact on customer satisfaction, a clear reaction of “like” or “dislike” not being expressed by the potential customer for the presence or absence of a feature. The reverse quality attribute (R) expresses a backward influence on customer satisfaction [53], the reactions of a potential customer showing that he or she likes not having the feature or dislikes having it. The questionable quality attribute (Q) shows contradictions in the reactions of a potential customer to the existence or absence of a feature. According to Wittel et al. [54], this situation is related to skeptical reactions, and the understanding of the feature by a potential customer is debatable.
The Kano evaluation process illustrated in Figure 2 shows that the first step is to formulate two questions for each feature proposed for a new product or service to capture the reaction of a potential customer to the existence (functional question) or absence (dysfunctional question) of that feature. Each potential customer should answer both questions by rating their reaction using the following options: “I like it that way”, “It must be that way”, “I am neutral”, “I can live with it that way”, and “I dislike it that way”. The evaluation of each potential customer reaction is analyzed on the basis of the pair of reactions (functional, dysfunctional).
A pair (“I like it that way”, x), where x = {“It must be that way”, “I am neutral”, “I can live with it that way”}, leads to the categorization of that feature as attractive (A). The pair (“I like it that way”, “I dislike it that way”) categorizes the evaluated feature as one-dimensional (O). A pair (x, “I dislike it that way”), where x = {“It must be that way”, “I am neutral”, “I can live with it that way”}, classifies the feature as must-be (M). Classification of a feature as reverse (R) results from the pairs (“I dislike it that way”, x), where x = {“I like it that way”, “It must be that way”, “I am neutral”, “I can live with it that way”}, or (x, “I like it that way”), where x = {“It must be that way”, “I am neutral”, “I can live with it that way”}. A feature is classified as questionable (Q) if the reactions consist of the pairs (“I like it that way”, “I like it that way”) or (“I dislike it that way”, “I dislike it that way”). The remaining pairs of reactions lead to the classification of that feature as indifferent (I).
After counting the frequency of each quality attribute for each feature, the quality attribute with the maximum frequency will generally describe the reactions of potential customers for a certain feature. If two quality attributes are tied in the scoring of a given feature, Berger et al. [18] recommended the selection of the category that would have the greatest impact on customer satisfaction, determined by the following ordering: M > O > A > I. In addition, identification of market segmentation differences and follow-ups with potential customers can be carried out to identify additional information.
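For illustration only (the labels and function names below are ours and not part of the original study), the following Python sketch implements this mapping from a (functional, dysfunctional) answer pair to a Kano quality attribute and selects the dominant attribute of a feature, applying the M > O > A > I ordering recommended by Berger et al. [18] in the case of ties:

```python
from collections import Counter

# Answer options of the Kano questionnaire (abbreviated labels, chosen here for illustration).
LIKE, MUST, NEUTRAL, LIVE_WITH, DISLIKE = (
    "like", "must-be", "neutral", "live-with", "dislike"
)

def kano_category(functional: str, dysfunctional: str) -> str:
    """Map one respondent's (functional, dysfunctional) answer pair to a Kano attribute."""
    if functional == dysfunctional == LIKE or functional == dysfunctional == DISLIKE:
        return "Q"                                  # questionable (contradictory answers)
    if functional == LIKE:
        return "O" if dysfunctional == DISLIKE else "A"
    if dysfunctional == DISLIKE:
        return "M"                                  # functional is must-be/neutral/live-with
    if functional == DISLIKE or dysfunctional == LIKE:
        return "R"                                  # reversed preference
    return "I"                                      # all remaining combinations

def dominant_attribute(pairs):
    """Count the attribute frequencies for one feature and pick the dominant one,
    breaking ties with the ordering M > O > A > I (Berger et al.)."""
    counts = Counter(kano_category(f, d) for f, d in pairs)
    tie_break = {"M": 3, "O": 2, "A": 1, "I": 0, "R": -1, "Q": -2}
    return max(counts, key=lambda c: (counts[c], tie_break[c])), counts

# Example: four respondents answering the pair of questions for one feature.
answers = [(LIKE, DISLIKE), (MUST, DISLIKE), (LIKE, NEUTRAL), (LIVE_WITH, LIKE)]
print(dominant_attribute(answers))   # ('M', ...) because all counts tie and M has priority
```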
Berger et al. [18] proposed the usage of the satisfaction index (SI), also known as better, and the dissatisfaction index (DI), also known as worse, in the ranking of the features proposed for a new product or service. The results obtained from the calculation of SI using Equation (1) [18] indicate the level of satisfaction corresponding to each feature, the highest values being achieved by providing the features classified as A and O after performing the Kano evaluation.
$$SI = \frac{A + O}{A + O + M + I}$$ (1)
Equation (2) [18] describes the calculation of DI, the lowest values obtained being strongly related to the missing features classified as O and M according to the Kano methodology.
$$DI = -1 \times \frac{O + M}{A + O + M + I}$$ (2)
A general overview of the proposed product or service is determined by computing the total satisfaction index (TSI), defined in Equation (3) [18]. This will provide a more accurate overview of the features that deserve investment in the development stage and should be present in the final product or service. Negative results indicate that the lack of those features will bring dissatisfaction to the potential customer, while positive results are related to the features that will provide satisfaction through their presence. Following this approach, the features with higher TSI values yield a greater impact.
$$TSI = SI + DI$$ (3)
The interpretation of the TSI values also provides a validation of the Kano evaluation results by considering the values of the other categories in addition to the category that received the most reactions. Following this approach, a TSI < 0 describes a one-dimensional (O) feature, a TSI = 0 shows a reverse (R) feature, and if TSI > 0.1, the feature is attractive (A).
These metrics have been further researched and improved. Matzler et al. [56] introduced the notion of positive customer satisfaction (CS) for SI and negative CS for DI. Later, Jang et al. [57] introduced the average satisfaction coefficient (ASC) to describe the general impact of each feature. A drawback of this metric, defined by Equation (4), arises in the case of functionalities with low SI and high DI, for which the ASC will be higher than for other features with higher SI and lower DI. For this reason, the results of this metric are not analyzed in this research.
$$ASC = \frac{SI + |DI|}{2}$$ (4)
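As a worked illustration of Equations (1)–(4) (a minimal sketch with hypothetical frequency counts; the use of the absolute value of DI in the ASC follows the reconstruction of Equation (4) above), the satisfaction metrics for one feature can be computed as follows:

```python
def satisfaction_metrics(A: int, O: int, M: int, I: int) -> dict:
    """SI, DI, TSI, and ASC for one feature from its Kano attribute frequencies
    (reverse and questionable answers are excluded, as in Equations (1)-(4))."""
    total = A + O + M + I
    si = (A + O) / total                 # Equation (1)
    di = -1 * (O + M) / total            # Equation (2)
    tsi = si + di                        # Equation (3)
    asc = (si + abs(di)) / 2             # Equation (4), using the magnitude of DI
    return {"SI": si, "DI": di, "TSI": tsi, "ASC": asc}

# Hypothetical counts for one feature: 65 attractive, 28 one-dimensional,
# 14 must-be, and 70 indifferent answers.
print(satisfaction_metrics(A=65, O=28, M=14, I=70))
# SI is about 0.53, DI about -0.24, and TSI about 0.29
```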
The statistical significance of the Kano evaluation is another important aspect, which is unfortunately missing in several studies aiming to assess the quality attributes of various products or services. This test was proposed by Fong [44] and is defined as follows [24]:
$$|a - b| < z_{\alpha} \times \sqrt{\frac{(a + b) \times (2n - a - b)}{2n}}$$ (5)
where $a$ represents the total frequency of the Kano attribute chosen most often, $b$ represents the total frequency of the Kano attribute with the second highest frequency, $z_{\alpha}$ is the standard normal critical value corresponding to the confidence level $\alpha$, and $n$ represents the number of respondents.
Only the features for which the difference $a - b$ exceeds the threshold on the right-hand side of Equation (5) are identified as significant, showing that the study is relevant to those features and that their quality attributes were identified with high precision.
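The sketch below illustrates this significance check under the assumption stated above, namely that a categorization is accepted as significant only when the difference $a - b$ exceeds the right-hand side of Equation (5); the numeric values in the example call are illustrative:

```python
import math

def fong_test(a: int, b: int, n: int, z_alpha: float = 1.96) -> bool:
    """Fong significance check for a Kano categorization.

    a: frequency of the most frequently chosen attribute,
    b: frequency of the second most frequently chosen attribute,
    n: number of respondents, z_alpha: normal critical value (1.96 for 95%)."""
    threshold = z_alpha * math.sqrt((a + b) * (2 * n - a - b) / (2 * n))
    return abs(a - b) > threshold

# Illustrative call: a clear winner (a = 127, b = 50) among n = 183 respondents
# yields a threshold of about 18.7, so the categorization is significant.
print(fong_test(a=127, b=50, n=183))     # True
```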

2.4. Data Acquisition and Analysis

Data collection was carried out using the online questionnaire Table A1 designed according to the first step of the Kano methodology previously presented. The online questionnaire was shared with students via Zoom during laboratory activities and through Facebook groups. In the university year 2021–2022, the online questionnaire was shared with the students only during laboratory activities (Zoom video conferences or face-to-face in some cases, depending on university decisions related to the educational process [13,14]) to avoid receiving answers from the same students interviewed in the university year 2020–2021.
Taking into account the research questions defined previously for this study, it is mandatory to provide a statistical overview of the interviewed population. The determination of the sample size uses an optimal stratified survey similar to the methodology applied by Potra et al. [5]. Equation (6) defines the division of a population N into k sub-populations, further called layers.
$$N = \sum_{i=1}^{k} N_i = N_1 + N_2 + \dots + N_k$$ (6)
The stratified survey depends on the dimension $n_k$ of the random levels of each layer according to Equation (7).
$$n_1 + n_2 + \dots + n_k = \sum_{i=1}^{k} n_i = n$$ (7)
The determination of the final value of the sample size n considers the linear characteristics as follows [58]:
$$n \geq \frac{N \times z_{1-\alpha}^2 \times \sigma_b^2}{N \times \delta^2 + z_{1-\alpha}^2 \times \sigma_b^2}$$ (8)
where:
  • $z_{1-\alpha}$ is the value of the Laplace variable for a probability $1 - \alpha$, $\alpha$ representing the significance level;
  • $\sigma_b^2$ is the variance of the binary characteristic, computed as $\sigma_b^2 = p \times (1 - p)$;
  • $\delta$ represents the probable error.
To reduce statistical uncertainty, this research applies maximum variance, and the sample design considers the worst-case scenario, consisting of a binary characteristic variance of $\sigma_b^2 = 0.5 \times (1 - 0.5) = 0.25$.
The optimal value results from dimensioning the $n_j$ levels at the maximum efficiency level using Equation (9).
$$n_j = n \times \frac{N_j \times \hat{\sigma}_j}{\sum_{i=1}^{k} N_i \times \hat{\sigma}_i}, \quad j = 1, 2, \dots, k$$ (9)
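The following sketch (with hypothetical layer sizes and standard deviations, not the values of this study) illustrates how Equations (8) and (9) can be combined: the overall sample size is obtained from the finite-population formula and is then allocated across the layers proportionally to $N_j \times \hat{\sigma}_j$:

```python
import math

def sample_size(N: int, z: float, sigma2: float, delta: float) -> int:
    """Minimum sample size for a finite population, Equation (8)."""
    return math.ceil(N * z**2 * sigma2 / (N * delta**2 + z**2 * sigma2))

def allocate_layers(n: int, layer_sizes, layer_sigmas):
    """Optimal allocation of n across k layers, Equation (9)."""
    weights = [N_j * s_j for N_j, s_j in zip(layer_sizes, layer_sigmas)]
    total = sum(weights)
    return [round(n * w / total) for w in weights]

# Hypothetical example: N = 6000 students, 95% confidence (z = 1.96),
# worst-case variance 0.25, and a target error of 5%.
n = sample_size(N=6000, z=1.96, sigma2=0.25, delta=0.05)
print(n)                                               # about 362 respondents
print(allocate_layers(n, [2500, 1500, 1200, 800], [0.5, 0.5, 0.5, 0.5]))
```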

3. Results

This section provides an overview of the results obtained. It starts with a statistical characterization of the students who answered the questionnaire in Table A1 and continues with the presentation of the collected data and their classification based on the Kano methodology. Furthermore, this section also presents the calculated satisfaction metrics and emphasizes the relevance of this study based on the application of the Fong test [24,44].

3.1. Distribution of the Collected Data

Similarly to the research by Potra et al. [5], the analysis of the results begins with the calculation of the sample size, which provides a better description of the distribution of the respondents. Table 2 shows the distribution of the students in four layers, $n_1$, $n_2$, $n_3$, and $n_4$; the calculated sample size is $n = 394$ students (the number of respondents to the questionnaire in Table A1).
The total population is approximately $N = 6000$ students if the total numbers of AC and MPT students are summed. This approximation reflects the assumption that the number of students remained roughly constant throughout the study (the students who graduated after the university year 2020–2021 were replaced by the same number of first-year students in the university year 2021–2022). This research considers a significance level $\alpha = 0.05$, the confidence level being $1 - \alpha = 95\%$, and therefore $z_{0.95} = 1.96$. The worst-case scenario was considered in the sample design process; therefore, the variance of the binary characteristic is $\sigma_b^2 = 0.50 \times (1 - 0.50) = 0.25$. This approach leads to an average error of 4.77% for our study.
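As a quick cross-check (a back-of-the-envelope sketch, not part of the original analysis), solving Equation (8) for the error term with $N = 6000$, $n = 394$, $z_{0.95} = 1.96$, and $\sigma_b^2 = 0.25$ reproduces the reported average error of approximately 4.77%:

```python
import math

# Reported study figures: total population, respondents, critical value, worst-case variance.
N, n, z, sigma2 = 6000, 394, 1.96, 0.25

# Rearranging Equation (8): delta^2 = z^2 * sigma2 * (N - n) / (n * N)
delta = math.sqrt(z**2 * sigma2 * (N - n) / (n * N))
print(f"average error = {delta:.2%}")                  # average error = 4.77%
```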

3.2. Kano Evaluation Results

The collected data were analyzed according to the Kano methodology described in Section 2.3. Table 3 shows a parallel overview of the data collected in two successive university years for two engineering study programs, one more technical and the other oriented toward the management area, according to the description in Table 2. A detailed overview of the distribution for each layer is available in Appendix A (Table A2 and Table A3).
The frequency of each Kano quality attribute for each feature was counted by analyzing the entire sample of $n = 394$ students, considered as potential customers of the proposed online educational platform. The Kano category identified for each feature over the entire sample is shown in Table 4. Additionally, this table contains the satisfaction metrics (SI, DI, and TSI) calculated according to Equations (1), (2), and (3). The TSI is considered the main metric in the section dedicated to the discussion of the results of this research. The Kano categories and the calculated metrics for each feature in the four layers of the sample are available in Appendix A (Table A4 and Table A5).
In addition to the statistical overview of the population interviewed using the Kano-based questionnaire (Table 2), this research also performs a significance test of the results obtained after the Kano evaluation by applying the Fong test according to Equation (5). This calculation considers the critical value $z_{0.95} = 1.96$ corresponding to a confidence level of 95%. For each layer of the sample, the worst-case scenario was assumed by taking the total population as $N = 6000$ students. Consequently, the statistical evaluation in Table 5 shows the relevance, also called “significance”, of the Kano categories identified for each feature. This validates the current research by focusing on both positive and negative results from a relevance perspective. In practice, this highlights the features that need further research (e.g., market research, cost analysis, etc.) before making a final decision regarding their implementation.

4. Discussion

This study used as input the responses of 394 students, compared to the 57 responses collected and analyzed in the previous study by Pop et al. [12]. The results of the Kano evaluation for the entire study are illustrated in Table 3 and Table 4. The increase in sample size led to greater precision in identifying the right quality attribute. The previous research had deficiencies in the classification of feature 1 (i.e., New educational resources suggestion) and feature 13 (i.e., Groups creation service) because the reduced number of responses led to ties between two quality attributes, attractive (A) and indifferent (I) for the former and one-dimensional (O) and indifferent (I) for the latter. In such cases, Berger et al. [18] recommended selecting the Kano category according to the ordering M > O > A > I in terms of impact on customer satisfaction, which here would have led to a wrong implementation decision. The current study shows that features 1 and 13 can be clearly categorized as indifferent (I), and implementation costs can be reduced by neglecting these features. Feature 9 (i.e., Automatic presence after full visualisation of live/recorded course) and feature 17 (i.e., Administrative responsible-based feedback), both classified as attractive (A) by Pop et al. [12] but with small margins over the second most chosen quality attribute (i.e., indifferent (I) in both cases), also became indifferent (I) after the current Kano evaluation. This last change can also be an outcome of the alternation between the fully online and hybrid education systems: students’ lived experiences in both cases made them no longer interested in these features.
Figure 3 shows the general comparative overview of the expectations for the online educational platforms of the layers $n_1$, $n_2$, $n_3$, and $n_4$ of the chosen sample for the features that were significant after applying the Fong test (Table 5). In Appendix A, Figure A1, Figure A2, Figure A3 and Figure A4 illustrate different comparisons between layers that are discussed in this section. Furthermore, the TSI was also considered for the entire sample. These results show that the biggest needs of the students are related to the requirements describing their access to administrative resources, the category of requirements with most of its features rated as significant according to the Fong test. This is understandable because these needs are permanent and independent of the chosen educational system (i.e., face-to-face, fully online, or hybrid) and of the field of study.
The analysis of the TSI for feature 7 (i.e., Blocking unauthorized recording/screenshots) demonstrates a clear classification as reverse (R) for the students from AC (layers $n_1$ and $n_3$) and for the entire sample. Even if the MPT students also classified this feature as reverse (R) according to the Kano evaluation, the TSI analysis shows a deviation compared with the TSI values obtained for the AC students. This suggests that the AC students, given their background in computer and information technology, have a better overview and understanding of the possible consequences of this feature. The change in the TSI of the MPT students between the university years 2020–2021 and 2021–2022 shows a strong relationship with their experience with the online educational system. The students represented in layer $n_4$ are completely dissatisfied with the existence of this feature compared with layer $n_2$: two years of online education made them realize that taking screenshots or short recordings of important explanations given during the course can be helpful for exams or in their professional career.
According to the calculated TSI, feature 8 (i.e., Course recording availability) is generally considered one-dimensional (O) for the entire sample. The TSI shows a tendency of the students from layer $n_4$ to consider this feature reverse (R), even if this was not directly expressed in their responses to the Kano questionnaire. The reason behind this is the longer periods of hybrid education experienced by these students compared with the other layers, which made them consider the feature useless because the courses were better understood in the face-to-face laboratory or seminar activities. In the university year 2020–2021, both categories of students (layers $n_1$ and $n_2$) had negative TSI values, showing the one-dimensional (O) characteristic. Respondents find their appearance in course recordings from online video conference sessions uncomfortable, and this explains why the absolute values of DI are greater than those of SI. One year later, this feeling changed after weighing the advantage of having permanent access to these recordings, which in many cases contain additional explanations compared with the written material received. The TSI obtained for layer $n_3$ describes a tendency for this feature to become attractive (A) for this category of students, even if they expressed it as one-dimensional (O) in the Kano questionnaire.
Features 14 (i.e., Access to the updated scholar situation), 15 (i.e., Online administrative documents/requests evidence and submission), and 16 (i.e., Online studies related payments possibility) are one-dimensional (O) also according to the TSI analysis. These features show that there is no correlation between the need for access to administrative resources and previous experience with the online educational system or the field of study. Small deviations are observable for feature 16 in the case of layers $n_1$ and $n_4$; for the latter, they are explainable through face-to-face interactions with the faculty secretariat, reflecting the need to get to know and integrate into the university environment during the periods when students had face-to-face activities.

5. Conclusions

This work extended the research by Pop et al. [12] by providing a comparison between the data collected from two engineering faculties, one related to computer science and information technology and the other oriented toward the management field. Furthermore, this study used as input the answers of 394 students, compared to the 57 responses collected and analyzed in the previous study by Pop et al. [12]. From this arises the need to choose the right sample size in order to obtain accurate results that better describe the population’s perceptions of a product or service.
The research context and related work were presented. Furthermore, the methodology applied in this research was presented in detail to facilitate the reproducibility of the experiment or the identification of possible improvements.
This research used online surveys as a method to collect data between November 2020 and April 2022 from the student community of the Politehnica University of Timisoara. The questionnaire design followed the Kano methodology and considered 17 possible features of a new online educational platform. The feature selection process used a literature review to identify features that have a direct impact on student engagement and motivation. The other features focused on basic technical or administrative needs, such as availability on multiple types of devices, the creation of a custom profile, the existence of messaging services, and access to personal documents handled by the university. The analysis results emphasize the need of students for online educational platforms even in the post-pandemic times.
This study performed a double statistical validation: one on the choice of sample size, considering its relevance relative to the population size, and the other on the significance of the results obtained after applying the Kano evaluation. The second validation is lacking in most studies that use the Kano methodology to assess the expectations of potential customers about the proposed features of a new product or service. This article presented both the positive and negative results obtained in this study and answered the formulated research questions based only on the validated results. The implementation of online educational platforms following the proposed methodology leads to sustainable platforms designed according to the customers’ (i.e., students’) needs and allows efficient management of the resources involved in the development of this type of software product.
Further work can address a discussion on the correlation between current results and the results of a self-stated importance questionnaire. Furthermore, a deeper analysis of the features marked as indifferent (I) and reverse (R) should be performed.

Author Contributions

Conceptualization, M.-D.P., S.A.P. and A.P.P.; methodology, M.-D.P., S.A.P. and A.P.P.; validation, A.P.P. and V.-I.C.; formal analysis, M.-D.P. and A.P.P.; investigation, M.-D.P. and V.-I.C.; data curation, M.-D.P.; writing—original draft preparation, M.-D.P.; writing—review and editing, S.A.P., A.P.P. and V.-I.C.; supervision, A.P.P. and V.-I.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and was approved by the Ethics Board of the Politehnica University of Timisoara (approval number 13276/03.06.2022).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are openly available in Zenodo at doi:10.5281/zenodo.6555817, version 1.0.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Questionnaire designed according to Kano methodology to identify the student’s perceptions about the functionalities of a new online educational platform (dataset) * [59].
Faculty:
Year of Study:
□ I acknowledge that no personal data will be collected by completing this form.
□ I acknowledge that the options expressed in this form will only be analyzed and used for scientific purposes.
Student engagement requirements
1. How do you feel if an online educational platform:
I like it that way | It must be that way | I am neutral | I can live with it that way | I dislike it that way
PROVIDES personalized suggestions for other educational resources (e.g., other online courses, articles in scientific journals, practical applications, etc.) based on previous enrollments in various courses?
DOES NOT PROVIDE personalized suggestions for other educational resources (e.g., other online courses, articles in scientific journals, practical applications, etc.) based on previous enrollments in various courses?
2. How do you feel if an online educational platform:
I like it that way | It must be that way | I am neutral | I can live with it that way | I dislike it that way
PROVIDES quizzes during courses (live/recorded courses) with a chance to win badges?
DOES NOT PROVIDE quizzes during courses (live/recorded courses) with a chance to win badges?
3. How do you feel if an online educational platform:
I like it that way | It must be that way | I am neutral | I can live with it that way | I dislike it that way
PROVIDES the possibility of using VR (Virtual Reality) technology in laboratory experiments?
DOES NOT PROVIDE the possibility of using VR (Virtual Reality) technology in laboratory experiments?
4. How do you feel if an online educational platform:
I like it that way | It must be that way | I am neutral | I can live with it that way | I dislike it that way
PROVIDES the possibility to evaluate each course (course rating)?
DOES NOT PROVIDE the possibility to evaluate each course (course rating)?
Technical and security requirements
5. How do you feel if an online educational platform:
I like it that way | It must be that way | I am neutral | I can live with it that way | I dislike it that way
PROVIDES automatic real-time subtitle generation during the course?
DOES NOT PROVIDE automatic real-time subtitle generation during the course?
6. How do you feel if an online educational platform:
I like it that way | It must be that way | I am neutral | I can live with it that way | I dislike it that way
PROVIDES the same functionalities on any device (desktop, tablet, mobile phone)?
DOES NOT PROVIDE the same functionalities on any device (desktop, tablet, mobile phone)?
7. How do you feel if an online educational platform:
I like it that way | It must be that way | I am neutral | I can live with it that way | I dislike it that way
BLOCKS screenshot and video desktop recording services while the online platform is open?
DOES NOT BLOCK screenshot and video desktop recording services while the online platform is open?
8. How do you feel if an online educational platform:
I like it that way | It must be that way | I am neutral | I can live with it that way | I dislike it that way
PROVIDES the availability on the platform of the course recording after it took place?
DOES NOT PROVIDE the availability on the platform of the course recording after it took place?
9. How do you feel if an online educational platform:
I like it that way | It must be that way | I am neutral | I can live with it that way | I dislike it that way
PROVIDES automatic registration of student attendance after the full viewing of the course in live format or later based on its registration available on the platform?
DOES NOT PROVIDE automatic registration of student attendance after the full viewing of the course in live format or later based on its registration available on the platform?
Communication
10. How do you feel if an online educational platform:
I like it that way | It must be that way | I am neutral | I can live with it that way | I dislike it that way
PROVIDES options for creating and customizing a user profile (the profile will contain fields such as: name, surname, profile picture, cover photo, interests, appreciations, badges obtained)?
DOES NOT PROVIDE options for creating and customizing a user profile (the profile will contain fields such as: name, surname, profile picture, cover photo, interests, appreciations, badges obtained)?
11. How do you feel if an online educational platform:
I like it that way | It must be that way | I am neutral | I can live with it that way | I dislike it that way
PROVIDES messaging service with other users (students, teachers, secretariat)?
DOES NOT PROVIDE messaging service with other users (students, teachers, secretariat)?
12. How do you feel if an online educational platform:
I like it that way | It must be that way | I am neutral | I can live with it that way | I dislike it that way
PROVIDES a discussion forum to which both students and teachers have access?
DOES NOT PROVIDE a discussion forum to which both students and teachers have access?
13. How do you feel if an online educational platform:
I like it that way | It must be that way | I am neutral | I can live with it that way | I dislike it that way
PROVIDES the possibility to create discussion groups between any categories of users (students, teachers, administrative—secretariat)?
DOES NOT PROVIDE the possibility to create discussion groups between any categories of users (students, teachers, administrative—secretariat)?
Requirements for access to administrative resources
14. How do you feel if an online educational platform:
I like it that way | It must be that way | I am neutral | I can live with it that way | I dislike it that way
PROVIDES access to the updated school situation (e.g., grades, credits obtained)?
DOES NOT PROVIDE access to the updated school situation (e.g., grades, credits obtained)?
15. How do you feel if an online educational platform:
I like it that way | It must be that way | I am neutral | I can live with it that way | I dislike it that way
PROVIDES digitized administrative communication (description: allows the upload of various documents—study contracts, applications for the issue of certificates, the certificates issued by the secretariat, etc.)?
DOES NOT PROVIDE digitized administrative communication (description: allows the upload of various documents—study contracts, applications for the issue of certificates, the certificates issued by the secretariat, etc.)?
16. How do you feel if an online educational platform:
I like it that way | It must be that way | I am neutral | I can live with it that way | I dislike it that way
PROVIDES the possibility to pay online the study fees (e.g., study fee for those enrolled in the form with fee, failed exams fees, fee for the 3rd/special presentation)?
DOES NOT PROVIDE the possibility to pay online the study fees (e.g., study fee for those enrolled in the form with fee, failed exams fees, fee for the 3rd/special presentation)?
17. How do you feel if an online educational platform:
I like it that way | It must be that way | I am neutral | I can live with it that way | I dislike it that way
PROVIDES the possibility to report administrative problems directly to the responsible person (e.g., electrician, IT technician, carpenter, cleaning manager if the transmission is made from the faculty or the education is in a hybrid system)?
DOES NOT PROVIDE the possibility to report administrative problems directly to the responsible person (e.g., electrician, IT technician, carpenter, cleaning manager if the transmission is made from the faculty or the education is in a hybrid system)?
* Importance-related questions are not considered for current research and were not specified in this table.
Table A2. Distribution of the students’ reactions to the defined features for a new online educational platform for the university year 2020–2021 (dataset) [59].
Feature No. | Layer $n_1$: A / M / I / O / R / Q | Layer $n_2$: A / M / I / O / R / Q
1.6514702842113101001
2.6268219131112101011
3.84753362115171200
4.452658476174121110
5.46599321084121100
6.37242795008731700
7.1150312710240290
8.242517117005542100
9.63137133217314920
10.426117171010415420
11.34255073018671400
12.391565640071111510
13.551069480151161300
14.133114125002312900
15.173127107103662000
16.282423107105462000
17.392076480033131411
Table A3. Distribution of the students’ reactions to the defined features for a new online educational platform for the university year 2021–2022 (dataset) [59].
Feature No. | Layer $n_3$: A / M / I / O / R / Q | Layer $n_4$: A / M / I / O / R / Q
1.137211001113101001
2.151211410112101011
3.24211123015171200
4.8518201074121110
5.17215162084121100
6.710629008731700
7.111203800240290
8.105433005542100
9.1231819007314920
10.20218102010415420
11.1251124008671400
12.15314190171111510
13.1732570051161300
14.512232102312900
15.1012722103662000
16.512629005462000
17.13817140033131411
Table A4. Kano evaluation results and satisfaction metrics overview for the university year 2020–2021 (dataset) [59].
Feature No. | Layer $n_1$: Kano Category / SI / DI / TSI | Layer $n_2$: Kano Category / SI / DI / TSI | $n_1 + n_2$: Kano Category / SI / DI / TSI
1.I0.53−0.240.29A0.56−0.270.29A0.54−0.250.29
2.I0.48−0.150.33I0.52−0.160.36I0.49−0.150.34
3.A0.67−0.240.43A0.65−0.330.32A0.66−0.270.39
4.I0.52−0.410.11I0.54−0.390.15I0.53−0.410.12
5.I0.43−0.200.23I0.57−0.320.25I0.49−0.250.24
6.O0.72−0.650.07O0.57−0.68−0.11O0.66−0.660.00
7.R0.07−0.070.00R0.14−0.080.06R0.10−0.080.02
8.O0.77−0.78−0.01O0.59−0.71−0.12O0.70−0.75−0.05
9.I0.53−0.260.27O0.62−0.450.17I0.57−0.330.24
10.I0.32−0.130.19I0.47−0.290.18I0.38−0.190.19
11.O0.59−0.540.05O0.54−0.56−0.02O0.57−0.550.02
12.I0.56−0.430.13I0.52−0.430.09I0.55−0.430.12
13.I0.57−0.320.25I0.56−0.390.17I0.56−0.350.21
14.O0.75−0.85−0.10O0.66−0.75−0.09O0.72−0.81−0.09
15.O0.68−0.76−0.08O0.67−0.73−0.06O0.68−0.75−0.07
16.O0.74−0.720.02O0.65−0.74−0.09O0.71−0.73−0.02
17.I0.48−0.370.11I0.54−0.450.09I0.50−0.400.10
Table A5. Kano evaluation results and satisfaction metrics overview for the university year 2021–2022 (dataset) [59].
Feature No. | Layer n3 (Kano Category, SI, DI, TSI) | Layer n4 (Kano Category, SI, DI, TSI) | n3 + n4 (Kano Category, SI, DI, TSI)
1. | I, 0.45, −0.33, 0.12 | A, 0.62, −0.38, 0.24 | I, 0.52, −0.35, 0.17
2. | I, 0.57, −0.29, 0.28 | A, 0.64, −0.36, 0.28 | I, 0.60, −0.32, 0.28
3. | A, 0.73, −0.29, 0.44 | A, 0.77, −0.37, 0.40 | A, 0.75, −0.32, 0.43
4. | O, 0.55, −0.49, 0.06 | I, 0.53, −0.44, 0.09 | O, 0.54, −0.47, 0.07
5. | A, 0.66, −0.36, 0.30 | I, 0.54, −0.43, 0.11 | I, 0.61, −0.39, 0.22
6. | O, 0.69, −0.75, −0.06 | O, 0.71, −0.69, 0.02 | O, 0.70, −0.72, −0.02
7. | R, 0.07, −0.07, 0.00 | R, 0.00, −0.33, −0.33 | R, 0.05, −0.15, −0.10
8. | O, 0.83, −0.73, 0.10 | O, 0.74, −0.74, 0.00 | O, 0.79, −0.74, 0.05
9. | O, 0.60, −0.42, 0.18 | I, 0.48, −0.36, 0.12 | I, 0.55, −0.40, 0.15
10. | A, 0.60, −0.24, 0.36 | I, 0.42, −0.24, 0.18 | I, 0.53, −0.24, 0.29
11. | O, 0.69, −0.56, 0.13 | O, 0.63, −0.57, 0.06 | O, 0.67, −0.56, 0.11
12. | O, 0.67, −0.43, 0.24 | O, 0.65, −0.47, 0.18 | O, 0.66, −0.45, 0.21
13. | I, 0.46, −0.19, 0.27 | I, 0.51, −0.40, 0.11 | I, 0.48, −0.28, 0.20
14. | O, 0.73, −0.86, −0.13 | O, 0.89, −0.91, −0.02 | O, 0.79, −0.88, −0.09
15. | O, 0.63, −0.67, −0.04 | O, 0.66, −0.74, −0.08 | O, 0.64, −0.70, −0.06
16. | O, 0.65, −0.79, −0.14 | O, 0.71, −0.69, 0.02 | O, 0.68, −0.75, −0.07
17. | I, 0.52, −0.42, 0.10 | O, 0.52, −0.52, 0.00 | I, 0.52, −0.46, 0.06
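The SI, DI, and TSI columns in Tables A4 and A5 (and in Table 4) follow the satisfaction and dissatisfaction coefficients of Berger et al. [18], with TSI reported as their sum. The sketch below reproduces this computation from the A/M/I/O/R/Q counts; the most-frequent-response rule used here for the Kano category is a simplification of the evaluation process in Figure 2 and is given only as an illustration.

```python
def kano_metrics(a: int, m: int, i: int, o: int, r: int = 0, q: int = 0):
    """Satisfaction coefficients after Berger et al. [18]; R and Q answers are
    excluded from the denominator. TSI is reported as SI + DI, as in Tables A4, A5 and 4."""
    answered = a + m + i + o
    si = (a + o) / answered        # satisfaction index, between 0 and 1
    di = -(o + m) / answered       # dissatisfaction index, between -1 and 0
    tsi = si + di                  # total satisfaction index
    # Simplified category rule: the most frequent response wins (illustration only).
    counts = {"A": a, "M": m, "I": i, "O": o, "R": r, "Q": q}
    category = max(counts, key=counts.get)
    return category, round(si, 2), round(di, 2), round(tsi, 2)


# Example: feature 1, layer n1 (Table A2): A=65, M=14, I=70, O=28, R=4, Q=2.
print(kano_metrics(65, 14, 70, 28, 4, 2))  # ('I', 0.53, -0.24, 0.29), as in Table A4
```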
Table A6. Fong test results for the university year 2020–2021 (dataset) [59].
Feature No. | Layer n1 (|a − b|, threshold) | Layer n2 (|a − b|, threshold) | n1 + n2 (|a − b|, threshold), where threshold = $z_{\alpha}\sqrt{\frac{(a+b)(2n-a-b)}{2n}}$
1. | 5.00, 18.09 | 6.00, 14.69 | 1.00, 23.31
2. | 20.00, 18.32 | 9.00, 15.06 | 29.00, 23.72
3. | 31.00, 18.15 | 21.00, 14.57 | 52.00, 23.29
4. | 11.00, 16.96 | 7.00, 14.07 | 19.00, 22.00
5. | 53.00, 18.34 | 6.00, 14.61 | 59.00, 23.49
6. | 58.00, 18.01 | 10.00, 14.61 | 81.00, 22.85
7. | 77.00, 18.74 | 55.00, 15.41 | 132.00, 24.26
8. | 92.00, 18.27 | 26.00, 14.77 | 118.00, 23.51
9. | 8.00, 18.06 | 8.00, 14.84 | 22.00, 22.82
10. | 75.00, 18.59 | 21.00, 14.73 | 96.00, 23.80
11. | 23.00, 17.71 | 12.00, 13.89 | 35.00, 22.55
12. | 1.00, 17.91 | 6.00, 14.23 | 7.00, 22.90
13. | 14.00, 17.75 | 7.00, 14.57 | 28.00, 22.76
14. | 94.00, 18.54 | 35.00, 14.94 | 129.00, 23.85
15. | 76.00, 18.17 | 33.00, 14.88 | 109.00, 23.49
16. | 79.00, 18.09 | 40.00, 14.91 | 123.00, 23.36
17. | 28.00, 17.75 | 6.00, 13.77 | 30.00, 22.64
Table A7. Fong test results for the university year 2021–2022 (dataset) [59].
Feature No. | Layer n3 (|a − b|, threshold) | Layer n4 (|a − b|, threshold) | n3 + n4 (|a − b|, threshold), where threshold = $z_{\alpha}\sqrt{\frac{(a+b)(2n-a-b)}{2n}}$
1. | 8.00, 9.38 | 1.00, 7.51 | 7.00, 12.02
2. | 6.00, 9.51 | 1.00, 7.51 | 5.00, 12.13
3. | 12.00, 9.51 | 3.00, 7.98 | 15.00, 12.43
4. | 2.00, 9.63 | 1.00, 7.70 | 1.00, 12.34
5. | 1.00, 9.30 | 1.00, 7.70 | 2.00, 11.83
6. | 19.00, 9.68 | 9.00, 7.86 | 29.00, 12.43
7. | 26.00, 9.99 | 25.00, 8.19 | 51.00, 12.91
8. | 23.00, 9.84 | 16.00, 7.92 | 39.00, 12.65
9. | 1.00, 9.57 | 5.00, 7.70 | 4.00, 12.29
10. | 2.00, 9.63 | 5.00, 7.86 | 3.00, 12.43
11. | 12.00, 9.51 | 6.00, 7.61 | 18.00, 12.19
12. | 4.00, 9.38 | 4.00, 7.92 | 9.00, 12.24
13. | 8.00, 9.81 | 3.00, 8.08 | 19.00, 12.43
14. | 20.00, 9.88 | 26.00, 8.17 | 46.00, 12.82
15. | 10.00, 9.38 | 14.00, 7.92 | 24.00, 12.29
16. | 17.00, 9.77 | 14.00, 7.92 | 33.00, 12.51
17. | 3.00, 9.14 | 1.00, 7.98 | 2.00, 12.19
Table A8. Fong test results for the period 2020–2022 (dataset) [59].
Feature No. | n1 + n3 (|a − b|, threshold) | n2 + n4 (|a − b|, threshold) | n (|a − b|, threshold), where threshold = $z_{\alpha}\sqrt{\frac{(a+b)(2n-a-b)}{2n}}$
1. | 13.00, 20.39 | 7.00, 16.51 | 6.00, 26.25
2. | 26.00, 20.66 | 8.00, 16.88 | 34.00, 26.68
3. | 44.00, 20.47 | 27.00, 16.51 | 73.00, 26.27
4. | 9.00, 19.55 | 9.00, 16.00 | 18.00, 25.26
5. | 51.00, 20.59 | 10.00, 16.40 | 61.00, 26.35
6. | 80.00, 20.36 | 20.00, 16.55 | 110.00, 26.02
7. | 103.00, 21.23 | 80.00, 17.45 | 183.00, 27.48
8. | 116.00, 20.74 | 42.00, 16.76 | 162.00, 26.61
9. | 14.00, 20.25 | 3.00, 16.73 | 34.00, 25.71
10. | 73.00, 20.97 | 26.00, 16.69 | 99.00, 26.86
11. | 36.00, 20.07 | 19.00, 15.80 | 55.00, 25.58
12. | 4.00, 20.19 | 2.00, 16.31 | 2.00, 25.97
13. | 22.00, 20.31 | 10.00, 16.69 | 47.00, 25.94
14. | 114.00, 21.01 | 61.00, 17.07 | 175.00, 27.08
15. | 86.00, 20.47 | 47.00, 16.86 | 133.00, 26.52
16. | 100.00, 20.47 | 56.00, 16.82 | 156.00, 26.50
17. | 31.00, 19.98 | 1.00, 16.18 | 32.00, 25.71
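Tables A6–A8 report, for each feature, the absolute difference |a − b| between the counts of the two most frequent Kano categories (a and b) and the Fong [44] threshold $z_{\alpha}\sqrt{\frac{(a+b)(2n-a-b)}{2n}}$; a categorization is treated as statistically significant when the difference exceeds the threshold, which is how Table 5 is obtained. A minimal sketch follows; the two-sided 95% quantile z_alpha = 1.96 is an assumption that reproduces the reported thresholds for the rows that can be cross-checked (e.g., feature 1, layer n1).

```python
import math


def fong_threshold(a: int, b: int, n: int, z_alpha: float = 1.96) -> float:
    """Fong [44] threshold: z_alpha * sqrt((a + b) * (2n - a - b) / (2n)),
    where a and b are the counts of the two most frequent Kano categories
    and n is the number of respondents in the layer."""
    return z_alpha * math.sqrt((a + b) * (2 * n - a - b) / (2 * n))


def fong_significant(a: int, b: int, n: int, z_alpha: float = 1.96) -> bool:
    """The category assignment is significant when |a - b| exceeds the threshold."""
    return abs(a - b) > fong_threshold(a, b, n, z_alpha)


# Example: feature 1, layer n1 (Tables A2 and A6): I = 70, A = 65, n = 183.
print(round(fong_threshold(70, 65, 183), 2))  # 18.09, as reported in Table A6
print(fong_significant(70, 65, 183))          # False -> "no" in Table 5
```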
Figure A1. Total satisfaction index (TSI)—expectations overview of the students from Automation and Computing (AC).
Figure A2. Total satisfaction index (TSI)—expectations overview of the students from Management in Production and Transportation (MPT).
Figure A3. Total satisfaction index (TSI)—general expectations overview for university year 2020–2021.
Figure A4. Total satisfaction index (TSI)—general expectations overview for university year 2021–2022.

References

  1. Liu, Z.; Han, Z. Exploring Trends of Potential User Experience of Online Classroom on Virtual Platform for Higher Education during COVID-19 Epidemic: A Case in China. In Proceedings of the 2020 IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE), IEEE, Takamatsu, Japan, 8–11 December 2020; pp. 742–747. [Google Scholar]
  2. Barbu, A.; Popescu, M.A.M.; Moiceanu, G. Perspective of Teachers and Students towards the Education Process during COVID-19 in Romanian Universities. Int. J. Environ. Res. Public Health 2022, 19, 3409. [Google Scholar] [CrossRef] [PubMed]
  3. Edelhauser, E.; Lupu-Dima, L. Is Romania Prepared for ELearning during the COVID-19 Pandemic? Sustainability 2020, 12, 5438. [Google Scholar] [CrossRef]
  4. Iosif, L.; Ţâncu, A.M.C.; Didilescu, A.C.; Imre, M.; Gălbinașu, B.M.; Ilinca, R. Self-Perceived Impact of COVID-19 Pandemic by Dental Students in Bucharest. Int. J. Environ. Res. Public Health 2021, 18, 5249. [Google Scholar] [CrossRef] [PubMed]
  5. Potra, S.; Pugna, A.; Pop, M.-D.; Negrea, R.; Dungan, L. Facing COVID-19 Challenges: 1st-Year Students’ Experience with the Romanian Hybrid Higher Educational System. Int. J. Environ. Res. Public Health 2021, 18, 3058. [Google Scholar] [CrossRef]
  6. Guncaga, J.; Lopuchova, J.; Ferdianova, V.; Zacek, M.; Ashimov, Y. Survey on Online Learning at Universities of Slovakia, Czech Republic and Kazakhstan during the COVID-19 Pandemic. Educ. Sci. 2022, 12, 458. [Google Scholar] [CrossRef]
  7. Shevchenko, V.; Malysh, N.; Tkachuk-Miroshnychenko, O. Distance Learning in Ukraine in COVID-19 Emergency. Open Learn. J. Open Distance E-Learn. 2021, 1–16. [Google Scholar] [CrossRef]
  8. Baksa, T.; Luić, L. From Face-to-Face to Remote Learning in Times of COVID-19 Crisis in Croatia. In Proceedings of the 13th Annual International Conference of Education, Research and Innovation, Online Conference, 9–10 November 2020; pp. 9318–9326. [Google Scholar] [CrossRef]
  9. Molnar, G.; Namesztovszki, Z.; Glusac, D.; Karuovic, D.; Major, L. Solutions, Experiences in Online Education in Hungary and Serbia Related to the Situation Caused by COVID-19. In Proceedings of the 2020 11th IEEE International Conference on Cognitive Infocommunications (CogInfoCom), IEEE, Mariehamn, Finland, 23–25 September 2020; pp. 000601–000606. [Google Scholar] [CrossRef]
  10. Mitescu-Manea, M.; Safta-Zecheria, L.; Neumann, E.; Bodrug-Lungu, V.; Milenkova, V.; Lendzhova, V. Inequities in First Education Policy Responses to the COVID-19 Crisis: A Comparative Analysis in Four Central and East European Countries. Eur. Educ. Res. J. 2021, 20, 543–563. [Google Scholar] [CrossRef]
  11. Eurydice (European Education and Culture Executive Agency); Horváth, A.; Motiejūnaitė-Schulmeister, A.; Noorani, S.; Monseur, C. Teaching and Learning in Schools in Europe during the COVID-19 Pandemic: 2020/2021; Publications Office of the European Union: Luxembourg, France, 2022; ISBN 9789294881038. [Google Scholar] [CrossRef]
  12. Pop, M.-D.; Potra, S.A.; Pugna, A.P. Romanian Students Expectations from Educational Online Platforms in the Age of COVID-19. In Proceedings of the 15th International Technology, Education and Development Conference (INTED 2021), Online Conference, 8–10 March 2021; pp. 10372–10381. [Google Scholar]
  13. Hotărâre a Consiliului de Administrație al Universității Politehnica Timișoara Nr. 23/11.03.2020. 2020. Politehnica University of Timisoara. Available online: http://www.upt.ro/img/files/hca/2020/HCA_23_din_11.03.2020.pdf (accessed on 29 July 2022).
  14. Hotărâre a Senatului Universității Politehnica Timișoara Nr. 185/14.09.2020. Politehnica University of Timisoara. 2020. Available online: http://www.upt.ro/img/files/hs/2020/14.09.2020/HS_185_14.09.2020_aprobare_modalitati%20incepere_an_univ_2020-2021.pdf (accessed on 29 July 2022).
  15. Kano, N.; Seraku, N.; Takahashi, F.; Shin-ichi, T. Attractive Quality and Must-Be Quality. J. Jpn. Soc. Qual. Control 1984, 14, 147–156. [Google Scholar] [CrossRef]
  16. Wu, H.-Y.; Lin, H.-Y. A Hybrid Approach to Develop an Analytical Model for Enhancing the Service Quality of E-Learning. Comput. Educ. 2012, 58, 1318–1338. [Google Scholar] [CrossRef]
  17. Fujs, D.; Vrhovec, S.; Žvanut, B.; Vavpotič, D. Improving the Efficiency of Remote Conference Tool Use for Distance Learning in Higher Education: A Kano Based Approach. Comput. Educ. 2022, 181, 104448. [Google Scholar] [CrossRef]
  18. Berger, C.; Blauth, R.; Boger, D. Kano’s methods for understanding customer-defined quality. Cent. Qual. Manag. J. 1993, 2, 3–36. [Google Scholar]
  19. Bhardwaj, J.; Yadav, A.; Chauhan, M.S.; Chauhan, A.S. Kano Model Analysis for Enhancing Customer Satisfaction of an Automotive Product for Indian Market. Mater. Today Proceed. 2021, 46, 10996–11001. [Google Scholar] [CrossRef]
  20. Kermanshachi, S.; Nipa, T.J.; Nadiri, H. Service Quality Assessment and Enhancement Using Kano Model. PLoS ONE 2022, 17, e0264423. [Google Scholar] [CrossRef] [PubMed]
  21. Kohli, A.; Singh, R. An Assessment of Customers’ Satisfaction for Emerging Technologies in Passenger Cars Using Kano Model. XJM 2021, 18, 76–88. [Google Scholar] [CrossRef]
  22. Rampal, A.; Mehra, A.; Singh, R.; Yadav, A.; Nath, K.; Chauhan, A.S. Kano and QFD Analyses for Autonomous Electric Car: Design for Enhancing Customer Contentment. Mater. Today Proceed. 2022, 62, 1481–1488. [Google Scholar] [CrossRef]
  23. Hen-Yi, J.; Miriam, A.; Sibrian, B. Investigating the Service Quality Attributes of International Backpackers Using the Kano Model. In Proceedings of the 11th Asia Pacific Industrial Engineering and Management Systems Conference, December 2010. Available online: https://www.researchgate.net/publication/266496344 (accessed on 29 July 2022).
  24. Sireli, Y.; Kauffmann, P.; Ozan, E. Integration of Kano’s Model Into QFD for Multiple Product Design. IEEE Trans. Eng. Manag. 2007, 54, 380–390. [Google Scholar] [CrossRef]
  25. Ginting, R.; Hidayati, J.; Zulfin, M. Kano Questionnaire for the Assessment of Product Attributes of Alternative Power Plants in Kuala Sub-District. IOP Conf. Ser. Mater. Sci. Eng. 2019, 505, 012069. [Google Scholar] [CrossRef]
  26. Choudhury, S.; Pattnaik, S. Emerging Themes in E-Learning: A Review from the Stakeholders’ Perspective. Comput. Educ. 2020, 144, 103657. [Google Scholar] [CrossRef]
  27. Wang, A.I.; Tahir, R. The Effect of Using Kahoot! For Learning—A Literature Review. Comput. Educ. 2020, 149, 103818. [Google Scholar] [CrossRef]
  28. Ahn, S.J.; Nowak, K.L.; Bailenson, J.N. Unintended Consequences of Spatial Presence on Learning in Virtual Reality. Comput. Educ. 2022, 186, 104532. [Google Scholar] [CrossRef]
  29. Ibáñez, M.B.; Portillo, A.U.; Cabada, R.Z.; Barrón, M.L. Impact of Augmented Reality Technology on Academic Achievement and Motivation of Students from Public and Private Mexican Schools. A Case Study in a Middle-School Geometry Course. Comput. Educ. 2020, 145, 103734. [Google Scholar] [CrossRef]
  30. Khan, T.; Johnston, K.; Ophoff, J. The Impact of an Augmented Reality Application on Learning Motivation of Students. Adv. Hum. Comput. Interact. 2019, 2019, 1–14. [Google Scholar] [CrossRef] [Green Version]
  31. Borg, M.E.; Butterfield, K.M.; Wood, E.; Zhang, H.H.; Pinto, S. Investigating the Impacts of Personality on the Use and Perceptions of Online Collaborative Platforms in Higher Education. SN Soc. Sci. 2021, 1, 40. [Google Scholar] [CrossRef] [PubMed]
  32. Emanuel, F.; Ricchiardi, P.; Sanseverino, D.; Ghislieri, C. Make Soft Skills Stronger? An Online Enhancement Platform for Higher Education. Int. J. Educ. Res. Open 2021, 2, 100096. [Google Scholar] [CrossRef]
  33. Sridharan, S.; Bondy, M.; Nakaima, A.; Heller, R.F. The Potential of an Online Educational Platform to Contribute to Achieving Sustainable Development Goals: A Mixed-Methods Evaluation of the Peoples-Uni Online Platform. Health Res. Policy Syst. 2018, 16, 106. [Google Scholar] [CrossRef]
  34. Veeraiyan, D.N.; Varghese, S.S.; Rajasekar, A.; Karobari, M.I.; Thangavelu, L.; Marya, A.; Messina, P.; Scardina, G.A. Comparison of Interactive Teaching in Online and Offline Platforms among Dental Undergraduates. Int. J. Environ. Res. Public Health 2022, 19, 3170. [Google Scholar] [CrossRef]
  35. Wieser, D.; Seeler, J.M.; Sixl-Daniell, K.; Zehrer, A. Online Students’ Expectations Differ: The Advantage of Assessing Students’ Expectations in Online Education. In Proceedings of the 3rd International Conference on Higher Education Advances, Universitat Politècnica València, Valencia, Spain, 21–23 June 2017. [Google Scholar]
  36. Liu, B.; Chen, H.; Junmei, H. Research on the Platform of Online Education Platform Based on Cloud Computing. In Proceedings of the 2019 14th International Conference on Computer Science & Education (ICCSE), IEEE, Toronto, ON, Canada, 19–21 August 2019; pp. 22–25. [Google Scholar]
  37. Zhang, W.; Chen, H.; Huang, X.; Liu, H.; Shu, J. Online Teaching Platform—Hstar and Its Application in Higher Education. In Proceedings of the 2017 International Symposium on Educational Technology (ISET), Hong Kong, China, 27–29 June 2017; pp. 95–98. [Google Scholar]
  38. Wang, Y.; Fang, X.; Luo, Y. Design Research on Customized Online Education Platform Catering to Business Demands. In HCI International 2018—Posters’ Extended Abstracts; Stephanidis, C., Ed.; Springer International Publishing: Cham, Switzerland, 2018; Volume 852, pp. 124–130. ISBN 9783319922843. [Google Scholar]
  39. Yuan, J. Implementation of the Online Teaching Platform Design Based on the Web. In Proceedings of the 2021 2nd International Conference on Education, Knowledge and Information Management (ICEKIM), IEEE, Xiamen, China, 29–31 January 2021; pp. 243–247. [Google Scholar]
  40. Chen, X.; Lu, J.; Gong, M.; Guo, B.-J.; Xu, Y. Design And Implementation Of Decentralized Online Education Platform. In Proceedings of the 2020 5th International Conference on Mechanical, Control and Computer Engineering (ICMCCE), IEEE, Harbin, China, 25–27 December 2020; pp. 970–974. [Google Scholar]
  41. Abuhassna, H.; Al-Rahmi, W.M.; Yahya, N.; Zakaria, M.A.Z.M.; Kosnin, A.B.M.; Darwish, M. Development of a New Model on Utilizing Online Learning Platforms to Improve Students’ Academic Achievements and Satisfaction. Int. J. Educ. Technol. High Educ. 2020, 17, 38. [Google Scholar] [CrossRef]
  42. Zhou, L.; Xue, S.; Li, R. Extending the Technology Acceptance Model to Explore Students’ Intention to Use an Online Education Platform at a University in China. SAGE Open 2022, 12, 215824402210852. [Google Scholar] [CrossRef]
  43. Teo, T.; Khazaie, S.; Derakhshan, A. Exploring Teacher Immediacy-(Non)Dependency in the Tutored Augmented Reality Game-Assisted Flipped Classrooms of English for Medical Purposes Comprehension among the Asian Students. Comput. Educ. 2022, 179, 104406. [Google Scholar] [CrossRef]
  44. Fong, D. Using the self-stated importance questionnaire to interpret Kano questionnaire results. Cent. Qual. Manag. J. 1996, 5, 21–24. [Google Scholar]
  45. Martin, S.; Ruiz-Rube, I.; López-Martín, E.; Calvo, J.L.; Lopez, R. Design and Evaluation of a Collaborative Educational Game: BECO Games. Sustainability 2020, 12, 8471. [Google Scholar] [CrossRef]
  46. Sáiz-Manzanares, M.C.; Martin, C.F.; Alonso-Martínez, L.; Almeida, L.S. Usefulness of Digital Game-Based Learning in Nursing and Occupational Therapy Degrees: A Comparative Study at the University of Burgos. Int. J. Environ. Res. Public Health 2021, 18, 1757. [Google Scholar] [CrossRef] [PubMed]
  47. Göksün, D.O.; Gürsoy, G. Comparing Success and Engagement in Gamified Learning Experiences via Kahoot and Quizizz. Comput. Educ. 2019, 135, 15–29. [Google Scholar] [CrossRef]
  48. Ross, B.; Chase, A.-M.; Robbie, D.; Oates, G.; Absalom, Y. Adaptive Quizzes to Increase Motivation, Engagement and Learning Outcomes in a First Year Accounting Unit. Int. J. Educ. Technol. High Educ. 2018, 15, 30. [Google Scholar] [CrossRef] [Green Version]
  49. Lee, H.; Shvetsova, O. The Impact of VR Application on Student’s Competency Development: A Comparative Study of Regular and VR Engineering Classes with Similar Competency Scopes. Sustainability 2019, 11, 2221. [Google Scholar] [CrossRef] [Green Version]
  50. MacKay, J.R.D. Show and ‘Tool’: How Lecture Recording Transforms Staff and Student Perspectives on Lectures in Higher Education. Comput. Educ. 2019, 140, 103593. [Google Scholar] [CrossRef]
  51. Schmitz, S.M.; Schipper, S.; Lemos, M.; Alizai, P.H.; Kokott, E.; Brozat, J.F.; Neumann, U.P.; Ulmer, T.F. Development of a Tailor-made Surgical Online Learning Platform, Ensuring Surgical Education in Times of the COVID-19 Pandemic. BMC Surg. 2021, 21, 196. [Google Scholar] [CrossRef]
  52. Morris, N.P.; Swinnerton, B.; Coop, T. Lecture Recordings to Support Learning: A Contested Space between Students and Teachers. Comput. Educ. 2019, 140, 103604. [Google Scholar] [CrossRef]
  53. Potra, S.A.; Izvercian, M.; Pugna, A.P.; Dahlgaard, J.J. The HWWP, a Refined IVA-Kano Model for Designing New Delightful Products or Services. Total Qual. Manag. Bus. Excell. 2017, 28, 104–117. [Google Scholar] [CrossRef]
  54. Witell, L.; Löfgren, M.; Dahlgaard, J.J. Theory of Attractive Quality and the Kano Methodology—the Past, the Present, and the Future. Total Qual. Manag. Bus. Excell. 2013, 24, 1241–1252. [Google Scholar] [CrossRef] [Green Version]
  55. Potra, S.A.; Alptekin, H.D.; Pugna, A.; Kucun, N.T.; Ozkara, B.Y.; Pop, M.-D. Challenges in Testing the Kano Model’s Validity through Computer-Assisted Human Behaviour Analysis. In Proceedings of the 2022 IEEE Technology and Engineering Management Conference (TEMSCON EUROPE), IEEE, Izmir, Turkey, 25–29 April 2022; pp. 87–93. [Google Scholar]
  56. Matzler, K.; Hinterhuber, H.H.; Bailom, F.; Sauerwein, E. How to Delight Your Customers. J. Prod. Brand Manag. 1996, 5, 6–18. [Google Scholar] [CrossRef]
  57. Heung-Yeop, J.; Haegeun, S.; Young-Taek, P. Determining the Importance Values of Quality Attributes Using ASC. J. Korean Soc. Qual. Manag. 2012, 40, 589–598. [Google Scholar] [CrossRef]
  58. Isaic-Maniu, A.; Mitrut, C.; Voineagu, V. Statistics; Editura Economica: Bucuresti, Romania, 2004. [Google Scholar]
  59. Pop, M.-D.; Potra, S.A. Using a Hybrid Kano-importance Questionnaire in the Acquisition of Data Related to Students’ Expectations from Online Educational Platforms. Zenodo. 2022. Available online: https://doi.org/10.5281/zenodo.6555817 (accessed on 30 September 2022).
Figure 1. Research methodology.
Figure 2. The Kano evaluation process (generalized after [55]).
Figure 3. Total satisfaction index (TSI)—general expectations overview.
Table 1. Definition of the features for a new proposed online educational platform—entirely retrieved from Pop et al. [12] (pp. 10373–10374).
Feature No. | Features | Definition | Category
1. | New educational resources suggestion | personalized suggestions for other educational resources (e.g., other online courses, articles in scientific journals, practical applications, etc.) based on previous enrollments in various courses | Students engagement requirements
2. | Interactive quizzes (intrinsic motivation) | quizzes during courses (live/recorded courses) with a chance to win badges
3. | VR enabling laboratory experiments | the possibility of using VR technology in laboratory experiments
4. | Course evaluation | the possibility of evaluating each course (course rating)
5. | Automatic real-time subtitles generation | automatic real-time subtitle generation during the course | Technical requirements
6. | Multiple devices availability (desktop, tablet, mobile phone) | the same functionalities on any device (desktop, tablet, mobile phone)
7. | Blocking unauthorized recording/screenshots | blocks screenshot and video desktop recording services while the online platform is open
8. | Course recording availability | the availability on the platform of the course recording after it took place
9. | Automatic presence after full visualisation of live/recorded course | automatic registration of student attendance after full viewing of the course in live format or later based on its recording available on the platform
10. | Customized user profile | provides options for creating and customizing a user profile (the profile will contain fields such as name, surname, profile picture, cover photo, interests, appreciations, badges obtained) | Communication tools
11. | Messaging service | provides a messaging service with other users (students, teachers, secretariat)
12. | Discussions forum | discussion forum to which both students and teachers have access
13. | Groups creation service | the possibility to create discussion groups between any categories of users (students, teachers, administrative—secretariat)
14. | Access to the updated scholar situation | access to the updated school situation (e.g., grades, credits obtained) | Requirements for access to administrative resources
15. | Online administrative documents/requests evidence and submission | digitized administrative communication (allows the upload of various documents—study contracts, applications for the issue of certificates, the certificates issued by the secretariat, etc.)
16. | Online studies related payments possibility | the possibility to pay online the study fees (e.g., study fee for those enrolled in the form with a fee, failed exams fees, the fee for the 3rd/special presentation)
17. | Administrative responsible-based feedback | the possibility to report administrative problems directly to the responsible person (e.g., electrician, IT technician, carpenter, cleaning manager if the transmission is made from the faculty or the education is in a hybrid system)
Table 2. Distribution of questionnaire respondents (dataset) [59].
Layers | Description | Number of Respondents | Percentage (%)
n1 | Automation and Computing (AC)—university year 2020–2021 | 183 | 46.45
n2 | Management in Production and Transportation (MPT)—university year 2020–2021 | 124 | 31.47
n3 | Automation and Computing (AC)—university year 2021–2022 | 52 | 13.20
n4 | Management in Production and Transportation (MPT)—university year 2021–2022 | 35 | 8.88
Total | | 394 | 100
Table 3. Distribution of the student’s reactions to the defined features for a new online educational platform for the period 2020–2022 (dataset) [59].
Feature No. | Features | A | M | I | O | R | Q
1.New educational resources suggestion6514702842
2.Interactive quizzes (intrinsic motivation)6268219131
3.VR enabling laboratory experiments847533621
4.Course evaluation4526584761
5.Automatic real-time subtitles generation465993210
6.Multiple devices availability (desktop, tablet, mobile phone)3724279500
7.Blocking unauthorized recording/screenshots115031271
8.Course recording availability24251711700
9.Automatic presence after full visualisation of live/recorded course6313713321
10.Customized user profile4261171710
11.Messaging service3425507301
12.Discussions forum3915656400
13.Groups creation service5510694801
14.Access to the updated scholar situation13311412500
15.Online administrative documents/requests evidence and submission17312710710
16.Online studies related payments possibility28242310710
17.Administrative responsible-based feedback3920764800
Table 4. Kano evaluation results and satisfaction metrics overview for the period 2020–2022 (dataset) [59].
Feature No. | n1 + n3 (Kano Category, SI, DI, TSI) | n2 + n4 (Kano Category, SI, DI, TSI) | n (Kano Category, SI, DI, TSI)
1. | I, 0.51, −0.26, 0.25 | A, 0.57, −0.30, 0.27 | I, 0.53, −0.27, 0.26
2. | I, 0.50, −0.18, 0.32 | I, 0.54, −0.21, 0.33 | I, 0.52, −0.19, 0.33
3. | A, 0.68, −0.25, 0.43 | A, 0.68, −0.34, 0.34 | A, 0.68, −0.28, 0.40
4. | I, 0.53, −0.43, 0.10 | I, 0.54, −0.40, 0.14 | I, 0.53, −0.42, 0.11
5. | I, 0.48, −0.24, 0.24 | I, 0.56, −0.34, 0.22 | I, 0.51, −0.28, 0.23
6. | O, 0.71, −0.67, 0.04 | O, 0.60, −0.68, −0.08 | O, 0.67, −0.68, −0.01
7. | R, 0.07, −0.07, 0.00 | R, 0.12, −0.12, 0.00 | R, 0.09, −0.09, 0.00
8. | O, 0.78, −0.77, 0.01 | O, 0.62, −0.72, −0.10 | O, 0.72, −0.75, −0.03
9. | I, 0.55, −0.29, 0.26 | O, 0.59, −0.43, 0.16 | I, 0.56, −0.35, 0.21
10. | I, 0.38, −0.15, 0.23 | I, 0.46, −0.28, 0.18 | I, 0.41, −0.20, 0.21
11. | O, 0.61, −0.54, 0.07 | O, 0.56, −0.56, 0.00 | O, 0.59, −0.55, 0.04
12. | O, 0.59, −0.43, 0.16 | I, 0.55, −0.44, 0.11 | O, 0.57, −0.43, 0.14
13. | I, 0.54, −0.29, 0.25 | I, 0.55, −0.39, 0.16 | I, 0.54, −0.33, 0.21
14. | O, 0.75, −0.85, −0.10 | O, 0.71, −0.79, −0.08 | O, 0.73, −0.83, −0.10
15. | O, 0.67, −0.74, −0.07 | O, 0.67, −0.74, −0.07 | O, 0.67, −0.74, −0.07
16. | O, 0.72, −0.74, −0.02 | O, 0.67, −0.73, −0.06 | O, 0.70, −0.73, −0.03
17. | I, 0.49, −0.38, 0.25 | I, 0.53, −0.46, 0.07 | I, 0.50, −0.41, 0.09
Table 5. Significance * of the evaluation of features based on the Fong test. Not significant is represented as “no” and significant as “yes”.
Feature No. | Features | Layer n1 | Layer n2 | Layer n3 | Layer n4 | Sample n
1. | New educational resources suggestion | no | no | no | no | no
2. | Interactive quizzes (intrinsic motivation) | yes | no | no | no | yes
3. | VR enabling laboratory experiments | yes | yes | yes | no | yes
4. | Course evaluation | no | no | no | no | no
5. | Automatic real-time subtitles generation | yes | no | no | no | yes
6. | Multiple devices availability (desktop, tablet, mobile phone) | yes | no | yes | yes | yes
7. | Blocking unauthorized recording/screenshots | yes | yes | yes | yes | yes
8. | Course recording availability | yes | yes | yes | yes | yes
9. | Automatic presence after full visualisation of live/recorded course | no | no | no | no | yes
10. | Customized user profile | yes | yes | no | no | yes
11. | Messaging service | yes | no | yes | no | yes
12. | Discussions forum | no | no | no | no | no
13. | Groups creation service | no | no | no | no | yes
14. | Access to the updated scholar situation | yes | yes | yes | yes | yes
15. | Online administrative documents/requests evidence and submission | yes | yes | yes | yes | yes
16. | Online studies related payments possibility | yes | yes | yes | yes | yes
17. | Administrative responsible-based feedback | yes | no | no | no | yes
* The complete analysis results are available in Table A6, Table A7 and Table A8.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
