Article

Attractiveness of Collaborative Platforms for Sustainable E-Learning in Business Studies

by Simona Sternad Zabukovšek 1,*, Zdenko Deželak 1, Silvia Parusheva 2 and Samo Bobek 1

1 Faculty of Economics and Business, University of Maribor, 2000 Maribor, Slovenia
2 Digital and Distance Learning Center, University of Economics, 9002 Varna, Bulgaria
* Author to whom correspondence should be addressed.
Sustainability 2022, 14(14), 8257; https://doi.org/10.3390/su14148257
Submission received: 17 June 2022 / Revised: 4 July 2022 / Accepted: 5 July 2022 / Published: 6 July 2022

Abstract: E-learning platforms have become increasingly complex. The functionality of learning management systems is extended with collaborative platforms, which allow better communication, group collaboration, and face-to-face lectures. Universities face the challenge of using these platforms at an advanced level to fulfil sustainable learning goals. Usability and attractiveness have become essential to successful e-learning platforms, especially because of the more intensive interactivity expected from students. In this study, we researched the user experience of students who have used Moodle, Microsoft Teams, and Google Meet. User experience is, in most cases, connected with a person's perception of, feelings about, and satisfaction with the platform used. Data were collected using the standard UEQ questionnaire. We examined whether the user experience factors perceived efficiency, perceived perspicuity, perceived dependability, perceived stimulation, and perceived novelty affect perceived attractiveness, which is an important factor in the sustainability of e-learning tools. The collected data were processed using SmartPLS. The research showed that all studied factors have a statistically significant impact on perceived attractiveness. Perceived stimulation has the strongest statistically significant impact on the perceived attractiveness of e-learning platforms, followed by perceived efficiency, perceived perspicuity, perceived novelty, and perceived dependability.

1. Introduction

Educational software aims to improve and support learning. It should consider educational processes and especially students' learning activities. Good usability of educational software is crucial [1,2]. Therefore, students' interaction with educational software should be as spontaneous as possible. Educational software should be embedded in teaching and learning processes so that a synergy between students' interaction with the software and learning processes is achieved. Usability should be related to efficient interaction between students and software and should also be considered in the sense of appropriateness for learning tasks [3]. When such synergy happens, we can say that the educational software is embedded in education processes [4]. This means that a working relationship develops between the actual use of educational software and the learning process. Many research studies have not adequately examined the importance and effects of the usability of educational software on reaching learning goals [3,5]. A so-called instructional interface is needed so that students can concentrate on learning content instead of on how to use the software to retrieve that content. Research has repeatedly highlighted the need to focus on students' goals instead of educational tasks [6]. The requirement for usability is also commonly known in the website design literature as a key measure of quality in determining website or platform user satisfaction [7,8,9,10,11]. Therefore, the usability of educational software can considerably influence learning [12].
Educational software has become more complex in the last decade regarding the educational content, technology resources, and interaction options [13] offered by learning management system (LMS) platforms such as Moodle, Blackboard, etc. The user interface quality of these platforms and the ways in which tasks are performed are becoming increasingly important [14,15]. Learning management systems differ from general online platforms because they serve a different purpose: users learn new knowledge using such systems. Very often, such platforms demand that students first learn to use them before they can complete the learning tasks and fulfil the learning objectives. A problem which emerges very often is that more attention is paid to the educational content combined with platform functionalities than to the user interface between students and the platform where such content is presented [16,17].
With the rapid development of information technology, e-learning is becoming an increasingly important part of the innovative higher education system. E-learning has been implemented in the majority of universities today. As part of this progress, universities and colleges have progressively implemented a new generation of so-called Virtual Learning Environments (VLE) that allow teachers and students to connect effectively while focusing on learning purposes [18,19,20]. However, several research papers have confirmed that weak usability can significantly affect the effectiveness of communication between teachers and students and the learning of students [21].
The most important task for Human-Computer Interaction (HCI) researchers and designers is to create computer interaction that involves beginner users (students) and supports distance learning [22,23]. Educational software should consider the individual ways in which students learn and, thus, enable as many intuitive (natural) interactions with students as possible. This calls for a review of traditional interaction with e-learning platforms to support new opportunities and the flexibility appropriate for e-learning platforms, as they allow students to interact during the learning process [24].
Usability and User Experience (UX) have become important factors in developing successful e-learning platforms [25,26,27]. If an e-learning platform is not useful enough and is difficult to use, students need a lot of time to learn to use it; thus, the platform hinders them from performing their assignments effectively [28]. The International Standards Organization (ISO) describes usability as "the level to which specified users can use a product or service to accomplish particular objectives with effectiveness, efficiency and satisfaction in a specified context" [29]. Today, researchers evaluate usability using various methods, such as analytical approaches, specialist heuristic evaluations, surveys, and experimental approaches [30,31], through which specific characteristics of e-learning platform usability can be identified. For example, Giannakos [32] evaluated an online e-learning platform in terms of pedagogical applicability. Usability and user experience (UX) should not be equated: user experience is described by the ISO as "observations and reactions of the person resulting from the use and/or expected use of the product, system or service" [33].
This definition is usually supplemented in research articles and books with phrases such as user's feelings, user's perception, or user's satisfaction. Roto et al. [34] and Gordillo et al. [28] pointed out that in addition to these phrases, the words "dynamic" due to changing circumstances, "subjectivity" due to individual performance, and "content-dependent" are also used to understand the concept of user experience (UX).
Usability and user experience (UX) are important for sustainable learning [25,35]. Students must feel good and self-confident about the expertise, abilities, opinions, and beliefs they have acquired. Sustainable education and learning are, thus, becoming crucial to the future lives of people on an increasingly transforming planet [36]. The replacement of technologies and approaches from the past with newer sustainable ones enables people to enjoy satisfying lifestyles without negatively impacting the environment [37]. Sustainability as a concept emerges in many industries, e.g., sustainable business models in the automotive industry [38], and in many other areas [39,40,41]. New sustainable organisational identity frameworks are emerging [42], which will also foster changes in higher education institutions as they become sustainable organisations. The Sustainable Learning Environment (SLE) essentially includes approaches and skills that support teachers in effective substance recovery, reuse, inquiry, etc. [43,44]. In addition, it enables teachers to cope with difficult and complex requirements concerning learning and relearning [45]. Ben-Eliyahu [46] added that a self-regulated learning framework consisting of four characteristics can be used to understand SLE: (1) renovating and relearning; (2) individual, independent, as well as collaborative learning; (3) dynamic learning; and (4) transferability. Due to a gap in the literature regarding the user experience of e-learning platforms, which are an important part of SLE, the aim of our research is to evaluate VLE platforms used in a higher education setting from a user experience viewpoint to gain insight into the attractiveness perceived by the students using them. Our research model is based on adapted factors from the User Experience (UX) Model of Hassenzahl [47], Laugwitz et al. [48], Schrepp [49], and Schrepp et al. [50,51,52]: perceived efficiency (EF), perceived perspicuity (PE), perceived dependability (DE), perceived stimulation (ST), perceived novelty (NO), and perceived attractiveness (AT). From a sustainability point of view, perceived attractiveness can be seen as one prerequisite for the long-term use of an e-learning tool. Therefore, the objective of the research is to test the hypotheses that all other user experience (UX) factors, namely perceived efficiency (EF), perceived perspicuity (PE), perceived dependability (DE), perceived stimulation (ST), and perceived novelty (NO), have a positive impact on perceived attractiveness (AT).
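The hypothesised structure — five UX factors jointly predicting perceived attractiveness — can be illustrated with a small numerical sketch. The study itself estimates the model with PLS-SEM in SmartPLS; the snippet below is only a simplified stand-in on synthetic data, using ordinary least squares in place of PLS, and all path weights are assumed values chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 200 respondents, 5 standardized UX predictors
# (columns: EF, PE, DE, ST, NO). The "true" path weights below are
# assumed for this toy example only and do not come from the study.
n = 200
X = rng.normal(size=(n, 5))
true_paths = np.array([0.25, 0.20, 0.10, 0.35, 0.15])  # assumed weights
at = X @ true_paths + rng.normal(scale=0.3, size=n)    # simulated AT scores

# Estimate path coefficients with ordinary least squares
# (a crude stand-in for the PLS-SEM estimation done in SmartPLS).
coef, *_ = np.linalg.lstsq(X, at, rcond=None)
for name, b in zip(["EF", "PE", "DE", "ST", "NO"], coef):
    print(f"{name} -> AT: {b:+.2f}")
```

With enough respondents the recovered coefficients approach the simulated weights, which is the same shape of result the study reports: one estimated path from each UX factor into perceived attractiveness.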
The article is structured as follows. Section 2 presents the theoretical background, with a focus on e-learning collaboration platforms; discusses the attractiveness of e-learning collaboration platforms as perceived by students; explains the concepts of user experience in more depth; and concludes with a description of the research model. Section 3 describes the research methodology. Section 4 describes the research study and presents the research results. Section 5 discusses the research findings, explains their implications, and presents limitations and suggestions for future work.

2. Theoretical Background

2.1. E-Learning Collaboration Platforms

The usual practice of universities involves using a Learning Management System (LMS), a practice with a long tradition. Researchers point out that in an open society, with increasing distribution of and access to information, it is difficult to use an LMS in the traditional way [53,54]. This determines the paramount role that e-learning platforms play in higher education. Through them, students gain access to the digital educational content of courses, tests for self-preparation and assessment, etc. Researchers point out that using e-learning platforms as a tool for self-study makes a substantial difference in the use of resources and results in increased learning efficiency [55]. LMS significantly change the teaching experience of both teachers and students and are used very intensively [56]. The COVID-19 pandemic led to the closure of all educational institutions, including universities, the shift to fully online learning, and a series of changes in learning and teaching processes which required technology [57]. Implementing effective learning in a remote, electronic-based environment has become a key issue in education [58,59,60,61]. As a solution, the use of e-learning platforms, LMS platforms, video conferencing systems, and online collaboration tools (for example, Microsoft Teams, Zoom, Google Meet, Webex, etc.) has intensified. For the purposes of our study, the use of Moodle, Google Meet, and Microsoft Teams is considered, and special emphasis is therefore placed on them.
E-learning platforms are web-based platforms for providing digital educational content and managing the learning process. As mentioned earlier, universities integrate into their e-learning platforms a range of different systems and tools. Among them are world-renowned LMS platforms such as Blackboard, Canvas, Moodle, Google Classroom, and more. Of these systems, Moodle and Canvas are free platforms; moreover, Moodle is an open-source system, while Blackboard is a paid system. Statistics on the use of e-learning platforms show that of more than 1600 institutions surveyed in Europe, 65% use Moodle and 12% Blackboard, followed by other platforms (ILIAS, APG Learning /Sakal/, etc.) with shares between 4% and 1% [62]. These figures prove the leading position of Moodle among the e-learning platforms used. In addition to LMS platforms, a new generation of e-learning platforms has emerged, oriented not only toward content sharing but also toward facilitating e-learning collaboration. In most cases, universities' use of Google Meet, Zoom, and Microsoft Teams as collaboration tools has intensified.
Somova and Gachkova point out that Moodle is one of the world's most widely used platforms, present in over 200 countries, with more than 170,000 installations, where 250 million users use more than 30 million courses [63]. Moodle is coded in the PHP programming language and is issued under the GNU General Public License. The platform is developed in compliance with pedagogical principles and is used for blended learning, distance learning, flipped classrooms, and other areas of e-learning in schools and universities, for maintaining corporate qualifications, and more. Some authors emphasise the important features that determine its widespread use. Moodle is argued to be the optimal Virtual Learning Environment (VLE) platform in terms of the tools at its disposal and its technical aspects [64].
Other authors [65] cite the leading features of Moodle: the excellent organisation of online sources, their accessibility and convenience, good structuring, and the effectiveness of communication tools. Evgenievich et al. [66] summarise the opinion of researchers and practitioners about the benefits of Moodle LMS in several directions: (1) many opportunities to increase the individuality and responsibility of students; (2) the possibility of using a variety of platform resources that enrich and at the same time develop skills for working with material; (3) the opportunity to progress along individual educational paths; and (4) the creation of an environment for acquiring new knowledge, exchanging experience, and consulting. Some authors also emphasise the possibility of synchronous and asynchronous access and group work [67].
On the other hand, authors highlight that communication in Moodle is not so widespread, with greater interest in the use of social media, specifically social networks, as well as mobile applications [61,68]. For this reason, instant communication applications, such as Messenger, WhatsApp, Viber, and others, are especially popular. The Moodle forum module can be used for asynchronous group communication and collaboration, while for synchronous communication, Google Meet is preferred, as well as Microsoft Teams, Zoom, Webex, etc.
Google Meet is defined by the authors [69] as a "synchronous learning tool for distant online programs". Google Meet and Microsoft Teams Rooms and their videoconferencing features provide a good basis for active interaction between lecturers and learners. Through them, effective synchronous work with students and their fuller engagement in the learning process is achieved. Research by many authors indicates the main ways these applications are used. Educators create links to web-based classrooms and integrate them into learning courses in Moodle. This combines the strengths of these tools: Moodle LMS offers much more comprehensive and complete support to students and faculty through a variety of activities and resources, but primarily asynchronous ones, while Google Meet and Microsoft Teams offer synchronous communication for lectures and seminars. Thanks to videoconferencing functionality, live lectures or seminar sessions are held with students [70].
Gartner recognised the Microsoft Teams application as a leader for unified communications as a service (UCaaS) and meeting solutions [71]. Microsoft Teams was placed highest of all solution providers for its ability to execute. Microsoft Teams is an application that supports internal and external team members in connecting and collaborating synchronously [72]. People can have one-on-one or many-to-many meetings or calls with fully integrated voice and video, chat informally, co-author a text, or participate in other applications and services. Microsoft Teams offers a shared workspace where people can quickly resume or iterate on a project, work with team files, and participate in shared results. Every new team creates a new group and an online SharePoint site complete with a document library, a OneNote notebook, and Exchange Online (shared calendar and mailbox), and is highly integrated with other Microsoft 365 and Office 365 applications (e.g., Power BI, Planner, Forms, etc.) [72]. Microsoft is rapidly developing new capabilities in the Microsoft Teams application—there is an update every month, each adding dozens of new features. Microsoft also supports the UserVoice initiative, through which everyone is able to suggest new features and users vote on them—the most requested features are added in upcoming updates. Recently added features include [72]:
  • “Together mode”, which offers a simulation of actual meeting members being in the same room;
  • Customisable meetings that support setting up breakout rooms for meeting in smaller groups;
  • Ability to record meetings on the go, accompanied by meeting notes and transcripts.
One of the latest investments by Microsoft is Microsoft Teams Rooms, which provides a shared experience similar to a standard office meeting room. This way, users can experience hybrid work scenarios that support simplified meeting starts and effortless sharing of content with complete audio and video collaboration [72].
Florjancic and Wiechetek [73] compare the Moodle and Microsoft Teams platforms and state that Moodle is a feature-rich but, at the same time, complicated platform. On the other hand, they define Microsoft Teams as a relatively new and simple tool with a modern design that is easy to use. Particular emphasis is placed on its simplicity, real-time communication, and opportunities for integration with Microsoft Office 365.
It can be summarised that Moodle LMS, Microsoft Teams, and Google Meet as e-learning tools support educators and students in different but complementary ways. Moodle has significantly richer capabilities thanks to the many activities and resources built into the platform, but mainly for asynchronous support and collaboration. On the other hand, the potential of both Microsoft Teams and Google Meet for synchronous online communication and online lecture and seminar sessions is irreplaceable, especially in the difficult times of COVID-19 restrictions and the closure of universities.

2.2. Attractiveness of E-Learning Collaboration Platforms for Students

Many researchers have studied e-learning as an approach to providing educational content within study programs, and many publications have appeared. Much of the published research has focused on technology issues of online teaching and online learning. Studies emphasise technological issues related to using online platforms and creating online content, such as audio-visual and interactive content, in higher education. Some studies are related to the hosting of and access to knowledge bases. Other studies focus on online communication between the teacher and students and among the students within study programs.
However, teaching and learning are not all about the technology used in such a virtual environment; they also have social and psychological dimensions. For sustainable learning, the teacher's and students' needs, desires, motivations, interests, and perspectives must also be studied and taken into account. Only such a holistic approach can lead to sustainable higher education, improving the efficiency and effectiveness of online learning.
Research studies show that online courses can bring high-quality education to more students, as online courses are easily accessible to all students no matter where they are geographically located [74]. Czerniewicz et al. [75] point out that there are studies investigating the provision of academic content through e-learning in general and massive open online courses (MOOCs), their problems, and their expected benefits. Mulder and Jansen [76] add that MOOCs can improve access to education but that questions related to Internet accessibility, digital literacy, and the medium of delivery need to be answered first.
Much effort in the e-learning environment relates to the assumption that when content is offered online, students can access it to meet the course requirements and achieve a certain level of knowledge of the topic. However, students participating in online courses also face several psychological and socio-cultural problems. These problems can lead to a high rate of non-completion of online courses. In the early 1990s, researchers pointed out that computer-mediated communication is impersonal and antisocial [77,78], which is still considered one of the main problems today. Wegerif [79] pointed out that the lack of social presence causes a low level of commitment. Moreover, it can also lead to withdrawal from the online environment. Penstein Rosé et al. [80] found that social factors make an important contribution to the use/non-use of online courses.
Numerous studies have shown that students acquire more knowledge and prefer to learn through direct communication (face-to-face), making it easier also for the teacher to gain a common understanding and conceptual knowledge. Therefore, we need to monitor and ensure the teacher’s cognitive presence if students are educated online [81,82]. The teacher’s cognitive and social presence is very important for more intensive student involvement in the online courses. This often leads to better meeting students’ needs and increases their motivation for online course participation [83]. Barnett [84] added that teacher cooperation with students in online courses has a positive effect on the intention of students to attend the online course.
Blended learning is the teaching mode in which online access to course content is integrated with personal (face-to-face) teaching and personal communication [85]. The blended teaching model seems to outperform both purely online and purely personal teaching [86]. Therefore, blended learning in higher education is recognised as one of the biggest trends in training and education [87]. Most blended learning studies have shown that it positively affects student achievement and satisfaction, as the inclusion of online learning resources and activities improves learning outcomes in higher education [88,89].
Some other researchers found similar learning outcomes for blended and personal (face-to-face) ways of learning [90,91]. In their study, Rossiou and Sifaleras [92] studied the factors influencing student participation in using e-tools and e-content. They pointed out that the main reasons for students not engaging in online courses are lack of internet access, other technical issues, and lack of time, awareness, and engagement. Anderson et al. [93] pointed out in their research that social patterns of student involvement are important, as not all students enrolled in online courses participate in the same way and are not equally engaged. Five categories of students (observer, spectator, collector, versatile, and rescuer) were identified, differing in the style of collaboration and time of interaction.
Research collaboration between teacher and student is a central element of the academic relationship. It should be emphasised that students expect expertise, support, and a balance between creativity and criticism from the teacher [94]. Modern e-learning platforms, especially their collaborative work functionalities, enable new dimensions of blended learning. At the same time, personal communication in the classroom can be replaced by face-to-face distance communication via a collaborative platform providing videoconferencing. However, to use collaborative platforms at an advanced level and ensure more active remote face-to-face communication among participants, students' acceptance of these platforms must also be at an advanced level.
In recent years, the phrase "digital collaboration" has been frequently used in education as well. The extent and forms of collaboration are usually organisation-specific, influencing the portfolio of tools used to be efficient and successful [95]. Collaboration is inevitable in education; put simply, collaboration occurs every time two or more persons cooperate towards a common goal—in the case of education, the goal is to achieve the expected learning output. Organisations must consider the reasons for digital collaboration to maximise the results of the tools used [95]. Collaboration can occur on different levels, depending on the learning process requirements. The challenge lies in knowing the learning process well enough to support it to the right extent with collaboration tools. Many advantages come with digital collaboration that impact sustainability issues in learning, including the following:
  • Saved time—Being able to finish a process or a task faster automatically means saving time and, as such, means fewer costs;
  • Strengthened team relationships—One can see students enrolled in a course as closely connected groups, so it is very valuable to be able to maintain sound and effective relationships within groups of participants. With modern collaboration tools, this can be supported and elevated so that every student has a better understanding of teamwork and of the mutual goals of the course he or she attends;
  • Better organisation of teaching work—Collaboration tools are facilitators to improve teaching, especially active learning.
Digital collaboration within learning in higher education can be described as a combination of tools and processes that enable teachers, students, and other participants to communicate and interact on different levels. This is achieved using selected platforms and tools [96]. Digital collaboration tools for business are an effective way for organisations to support different types of employees (working on location, working from home, working from anywhere) and still maintain effective business process flow. The situation is similar in higher education, where participants can use collaboration platforms and face-to-face communication tools. Lately, we see innovation in digital collaboration, which drives efficiency and productivity by combining traditional tools, social media, and other modern tools that create unique, sustainable working environments [97]. In higher education, this is seen in the integration of collaboration platforms and tools with traditional e-learning platforms for access to the digitalised content of courses.
Digital collaboration supports remote employees in connecting to others seamlessly, completing their tasks, and communicating. In higher education, digital collaboration enables students and teachers to be connected seamlessly. This positively impacts efficiency and enables better group dynamics and relationships within courses conducted in blended mode using collaboration platforms and tools. Digital collaboration is in some ways progressive but is becoming increasingly common. For innovative organisations, the challenge lies in developing ever newer, more advanced platforms and tools to differentiate themselves and be leaders in their field.
For sustainable learning enabled by the blended learning mode, the e-learning platforms and collaboration platforms used to foster digitalised face-to-face collaboration must be accepted by students accordingly. Their user experience with e-learning and collaboration platforms must be as positive as possible.

2.3. User Experience

Different aspects can be used to evaluate software applications from the user's viewpoint. Some are quantitative, but it often depends on the user's subjective opinion whether he or she finds an application good or not. Recently, the concept of user experience (UX) has been used for such evaluations. User Experience (UX) is defined as the actual end-user experience with the application. The International Standards Organization (ISO) describes user experience as "a consequence of the presentation, system performance, functionality, interactive behaviour, and assistive competencies of an interactive system, both software and hardware. It is also a consequence of the user's prior experiences, skills, attitudes, personality, and habits" [33]. It relates to software products, services, and systems, together with everything along the user's journey that creates a user experience even before the software product, service, or system is used [98].
A great user experience provides better work motivation and performance and can also impact the welfare of users [99,100]. Therefore, we can assume that a good user experience will lead to a higher level of satisfaction, which can ensure better use of the product, service, or system. User experience is most frequently connected with software products and applications, concentrating on ensuring end users have a clear and useful experience with the solution interface. The concept of user experience is more complex than the user interface (UI) concept: software cannot be viewed in just one way, as users gain experience not only while using the software itself. Users' perception is also influenced by the accompanying service and the entire system around a product [98]. Lallemand et al. [101] point out that usability is often seen as necessary for a good user experience. As mentioned in the introduction, one of the widely used definitions of usability, defined by the International Standards Organization (ISO; standard ISO/IEC 9241), is: "the degree to which particular users can use a system, service or product to achieve particular goals with efficiency, effectiveness, and satisfaction in a particular context of use" [33]. There are significant distinctions between user experience and usability. Hassenzahl [99] revealed that user experience has five unique features that distinguish it from usability. First, it is subjective, as it relies heavily on human perception. Second, it is holistic, including both hedonic and pragmatic use characteristics. Third, it is dynamic, because it changes over time. Fourth, it is context-sensitive, as it always occurs in some context. Fifth, it covers the positive and essential consequences of use. Therefore, user experience is concerned with a user's complete experience while using a system and is more inclined toward their emotional views.
In contrast, usability evaluates the quality of system use based on efficiency, effectiveness, and satisfaction measures. To sum up, user experience is evaluated with respect to reaching hedonic aims, for example, emotion and human behaviour, as well as the response generated by the interactive experience, while usability is assessed with respect to performance objectives such as accessibility, safety, learnability, and similar [102].
Regarding positive user experience, designers' goal is to create software applications that provide interest, enjoyment, and gratification.
From the user's point of view, a product's qualities are perceived, evaluated, and experienced in the context of use, preferably leading to interest, enjoyment, and gratification. Still, this can only be accomplished through a certain degree of hedonic and pragmatic qualities. A product's substance and usefulness should be sufficient and beneficial, with interactions that are easy to understand and effortless. The presentation should be attractive, pleasant, and compatible with the character of the brand [98].
Kashfi et al. [103] note that agreed-upon lists of principles and practices which identify good user experience are still unavailable. Scattered examples of user experience principles and practices can be found in user experience studies. Table 1 summarises user experience principles, practices, tools, and methods and shows some examples [103].
User experience evaluation is the topic of many studies, and many assessment techniques have been created and used within them. Researchers note that user experience is difficult to assess and measure. People who use a particular software product can have implicit experience with it through anticipations based on their current understanding of associated technologies, brands, demonstrations, advertisements, presentations, etc. Roto et al. [34] argue that the implicit experience expands after usage, for example, through the memory of earlier use or changes in how people assess their use. Another important aspect of user experience is the timeframe. User experience changes over time (prior to use, during use, after use, and past use); we can focus only on what someone experienced while using the software product over a short time, or we can concentrate on the aggregated understanding developed across a series of use encounters and phases of non-use that might stretch over extended periods. User experience can consequently refer to a specific impression during use (momentary user experience), a judgment of a particular episode of usage (episodic user experience), or opinions about a complete system after using it for a longer time (cumulative user experience). Anticipated user experience corresponds to the time prior to first use. A summary of the different types of user experience is presented in Table 2 [34].
Various methods have been created specifically to assess and research UX-related concepts. For this research study, we used the standard, freely available UEQ questionnaire [49,50] to measure the UX of interactive products and services, as explained in the following section.

2.4. Research Model

Our research model is based on adapted factors from the user experience model [47,48,49,50,51,52]. The adapted factors are perceived efficiency (EF), perceived perspicuity (PE), perceived dependability (DE), perceived stimulation (SI), perceived novelty (NO), and perceived attractiveness (AT). Descriptions of the factors and their items are presented in Table 3. From a sustainability point of view, perceived attractiveness (AT) could be one of the features that make an e-learning tool suitable for long-term use. Therefore, we can hypothesise that all other adapted factors from the user experience model, namely perceived efficiency (EF), perceived perspicuity (PE), perceived dependability (DE), perceived stimulation (SI), and perceived novelty (NO), have a positive impact on perceived attractiveness (AT). The relationships between factors can be seen in the research model in Figure 1.
On the basis of these relationships, we state the following hypotheses:
Hypothesis 1 (H1).
Factor Perceived efficiency (EF) directly impacts the factor of Perceived attractiveness (AT).
Hypothesis 2 (H2).
Factor Perceived perspicuity (PE) directly impacts the factor of Perceived attractiveness (AT).
Hypothesis 3 (H3).
Factor Perceived dependability (DE) directly impacts the factor of Perceived attractiveness (AT).
Hypothesis 4 (H4).
Factor Perceived stimulation (SI) directly impacts the factor of Perceived attractiveness (AT).
Hypothesis 5 (H5).
Factor Perceived novelty (NO) directly impacts the factor of Perceived attractiveness (AT).
Table 3 presents the adjusted measurement variables (indicators) used to measure the degree of agreement with each statement.

3. Methodology

3.1. Participants and Procedure

We prepared two online versions of the questionnaire (in Slovenian and Bulgarian) and sent them via e-mail, together with an invitation to participate, to all students of the Faculty of Economics and Business of the University of Maribor (FEB; Slovenian sample) and all students of the Faculty of Computer Science of the University of Economics—Varna (FCS; Bulgarian sample). Students were free to choose whether to participate in the survey. At the beginning of the online questionnaire, students were presented with the study itself, its purpose, the planned research publication, the data gathering process, and how respondents would be managed. No personal data that could reveal the identity of students were collected. The FEB uses two e-learning platforms, Microsoft Teams and Moodle. The FCS also uses two e-learning platforms, Google Meet and Moodle. Both groups of students were asked to complete a questionnaire for all the e-learning platforms they use.

3.2. Measurement Instrument

The questionnaire is based on Laugwitz et al. [48] and Schrepp [49], where we adjusted the measurement scale of the factors to fit the statements (items) to the context of the e-learning platform. All items were measured on a 7-point Likert scale ranging from one (1), "strongly disagree", to seven (7), "strongly agree". The questionnaire was translated into Slovenian and Bulgarian. Before surveying students, four Slovenian e-business lecturers checked the clarity of the questionnaire. Based on their comments, we changed the word order in some items, but the meaning remained the same. A similar procedure was performed in Bulgaria, where three computer science lecturers checked the clarity of the items.

3.3. Data

We received 247 complete questionnaires for the e-learning platform Microsoft Teams from Slovenian students, 236 for Moodle from Slovenian students, 130 for Google Meet from Bulgarian students, and 99 for Moodle from Bulgarian students. We further examined whether there were statistical differences between the individual groups. We used multi-group analysis (MGA) to test whether predefined data groups differ significantly in group-specific parameter estimates [104]. We used the PLS-MGA method [105], which is part of the SmartPLS tool [106] used in the further data processing. No statistically significant differences between groups were found (at the 1% significance level); therefore, we could proceed with the analysis. The results of the PLS-MGA processing can be obtained from the authors.
The recommended sample size is equal to or greater than 10 times the maximum number of structural paths targeting a particular construct in the structural model [107,108]. Our research model (see Figure 1) includes five structural paths, so at least 50 responses are needed. Our sample includes 712 responses and is representative. The average age of respondents is 18.48 years; most were between 19 and 21 years old. In total, 210 (32%) males and 448 (63%) females answered the questionnaire. Most respondents, 83% (594), stated that they were studying at the first level of study, 9% (64) at the second level, and 8% (54) did not answer this question. Most students have used e-learning tools for one to three years, with an average of 2.27 years. At the end of the survey, we asked them about their overall impression of working with the e-learning tool on a scale from 1 (strongly dislike) to 7 (strongly like); the average value is 5.503. Table 4 shows the means and standard deviations of all factor items from our research model.

4. Results

Structural Equation Modelling (SEM) is one of the most advanced statistical analysis techniques used in the social sciences in the last decade. It involves multivariate techniques that combine aspects of factor analysis and regression to allow researchers to simultaneously study the links between measurement variables and latent variables (measurement theory evaluation) and among latent variables (structural theory evaluation) [109,110]. Although there are many ways to implement SEM, the most commonly used are covariance-based SEM (CB-SEM), introduced by Karl Jöreskog in 1973, and partial least squares SEM (PLS-SEM), presented by Herman Wold in 1960 [111]. PLS-SEM has some benefits over CB-SEM, especially in social sciences research, where samples are smaller and models are complex, with many indicators (items) and relationships [109,112]. Hair et al. [109] also note that CB-SEM is used to confirm (or reject) theories, while PLS-SEM is primarily used to develop them. For these reasons, we used the PLS-SEM method, also called the PLS path model, which consists of two parts. The first part is the assessment of the measurement model, also called the outer model, which shows the relationships between constructs and indicators. The second part is the assessment of the structural model, also called the inner model, which shows the relationships between the constructs of the research model.
For the further analysis, we used the SmartPLS [106] software. First, we evaluated the measurement model (Section 4.1), second the structural model (Section 4.2), and third the Importance–Performance Map Analysis (IPMA; Section 4.3), following the recommendations of Garson [110] and Hair [109,113]. The final version of the model is presented in the next sections; items DE1, NO3, and NO4 (from Table 3) were removed from the initial model because they did not meet the criteria of the measurement model.

4.1. Reflective Measurement Model Assessment

The measurement model of reflective indicators is checked using three measures: (1) internal consistency reliability with measures Cronbach’s alpha and composite reliability, (2) convergent validity with indicator reliability and average variance extracted (AVE), and (3) discriminant validity [109,110].
Internal consistency reliability is usually checked with Cronbach's alpha, which estimates reliability based on the intercorrelations of the observed item values. Due to the limitations of Cronbach's alpha, it is more suitable to also use composite reliability, which accounts for the different outer loadings of the items [109,113]. Both measures vary between 0 and 1, where greater values indicate a greater level of reliability, with a threshold value of 0.60 [114]. From Table 5, we can see that all Cronbach's alpha and composite reliability values exceed 0.60, except the Cronbach's alpha of perceived novelty (NO), which is 0.579; however, its composite reliability value is 0.826, which, together with all other values, can be considered good for exploratory research [115].
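The two reliability measures can be illustrated with a short sketch (Python with NumPy is assumed here purely for illustration; SmartPLS computes these values internally from the estimated model):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) answer matrix:
    k/(k-1) * (1 - sum of item variances / variance of the sum score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def composite_reliability(loadings: np.ndarray) -> float:
    """Composite reliability (rho_c) from standardised outer loadings:
    (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    sq_sum = loadings.sum() ** 2
    error = (1 - loadings ** 2).sum()
    return sq_sum / (sq_sum + error)
```

For example, three items that all load at 0.8 on their construct yield a composite reliability of about 0.84, comfortably above the 0.60 threshold mentioned above.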
Convergent validity was checked through indicator reliability and average variance extracted (AVE). The square of a standardised indicator's outer loading represents its indicator reliability, with a threshold value of 0.5. All indicator reliability values in Table 4 exceed 0.5. The AVE reflects the average communality of each latent construct in a reflective model, with a threshold value of 0.5 [114], and can be used to test convergent and divergent validity. From Table 5, we can see that all AVE values exceed 0.5.
The square root of the AVE can also be used to establish discriminant validity via the Fornell–Larcker criterion [116]. From Table 6, we can see that each square root of the AVE appearing in the diagonal cells of the table is higher than the values below it, which confirms discriminant validity. The second measure of discriminant validity is the cross-loadings criterion, where indicator loadings (see the indicator loadings column in Table 4) should exceed 0.70 [110]. The third measure is the heterotrait–monotrait (HTMT) ratio, where the confidence interval of each HTMT value should not include 1 (see the HTMT confidence interval columns, 2.5% to 97.5%, in Table 4).
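As a minimal sketch of the Fornell–Larcker check (the correlation matrix and AVE values below are hypothetical, not the study's data):

```python
import numpy as np

def fornell_larcker_ok(corr: np.ndarray, ave: np.ndarray) -> bool:
    """Fornell-Larcker criterion: the square root of each construct's AVE
    must exceed its correlations with every other construct."""
    sqrt_ave = np.sqrt(ave)
    n = corr.shape[0]
    for i in range(n):
        for j in range(n):
            if i != j and sqrt_ave[i] <= abs(corr[i, j]):
                return False  # discriminant validity violated
    return True
```

With an inter-construct correlation of 0.5, AVE values of 0.6 and 0.7 pass (square roots ≈ 0.77 and 0.84), while an AVE of 0.2 would fail.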
The first part, the evaluation of the measurement model, meets the requirements, so we can continue with the second part, i.e., the evaluation of the structural model.

4.2. Structural Model Assessment

Hair et al. [109] highlighted that the most important evaluation metrics for the structural model are: explained variance (R2 value), effect size (f2 value), predictive relevance (Q2 value), and statistical significance and size of the structural path coefficients.
Explained variance (R2 value) "measures difference, which is described in every one of the endogenous constructs and is, therefore, a measure of the model's explanatory power" [113,117], where higher values suggest larger explanatory power. The explained variance (R2) of our research model is 0.805, which represents a substantial effect: 80.5% of the variance in the construct perceived attractiveness (AT) is explained by the research model (Table 7).
The effect size (f2) value represents the change in R2 when a causal construct is deleted from the model. From Table 7, we can see that all constructs except perceived stimulation (SI) have a small impact, while the omission of perceived stimulation (SI) would have a medium impact.
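The f2 computation behind this comparison is straightforward; the sketch below uses Cohen's conventional thresholds (0.02 small, 0.15 medium, 0.35 large) and a hypothetical pair of R2 values for illustration:

```python
def effect_size_f2(r2_included: float, r2_excluded: float) -> float:
    """Cohen's f2: relative change in R2 when one predictor construct
    is omitted from the structural model."""
    return (r2_included - r2_excluded) / (1 - r2_included)

def f2_label(f2: float) -> str:
    """Conventional interpretation thresholds (Cohen)."""
    if f2 >= 0.35:
        return "large"
    if f2 >= 0.15:
        return "medium"
    if f2 >= 0.02:
        return "small"
    return "negligible"
```

For instance, if removing a construct dropped R2 from 0.805 to a hypothetical 0.750, the resulting f2 of about 0.28 would count as a medium effect, matching the interpretation of SI's f2 = 0.259 above.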
The predictive relevance (Q2) value merges characteristics of out-of-sample prediction with within-sample explanatory power [118]. The blindfolding procedure omits data points for all variables and predicts them from the remaining data. Small differences between the predicted and the original values translate into a higher Q2 value and indicate better predictive accuracy. The Q2 value of our research model is 0.516 for the construct perceived attractiveness (AT), which shows the large predictive relevance of the structural model for that construct (Table 7).
The significance of the structural path coefficients (β) was calculated via a nonparametric bootstrapping technique, which randomly draws subsamples from the original data and then uses each subsample to estimate the PLS path model. We used 5000 subsamples, as suggested by Hair et al. [109]. Structural path coefficient (β) values vary from −1 to 1, where values closest to an absolute value of 1 indicate the strongest paths. The "t-statistic value has to be greater than value 1.96 at 5% significance" [109,119]. The strength of significance is measured by the p-values. For all relationships in our research model, the p-value is 0.000 (see Table 8, column p-values), which means that all relationships have strongly significant structural path coefficients. Perceived stimulation (SI; β = 0.346) has the strongest impact on perceived attractiveness (AT), followed by perceived efficiency (EF; β = 0.254), perceived perspicuity (PE; β = 0.226), perceived dependability (DE; β = 0.130), and perceived novelty (NO; β = 0.110) (Table 8). Figure 2 also shows the path coefficients (β) and their t-statistics (in parentheses) for our research model.
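A simplified sketch of the bootstrapping idea for a single standardised path follows. The real procedure re-estimates the full PLS model on each subsample; here, a bivariate correlation stands in for the path coefficient, purely for illustration:

```python
import numpy as np

def bootstrap_path_t(x: np.ndarray, y: np.ndarray,
                     n_boot: int = 5000, seed: int = 0) -> float:
    """t-statistic of a simple standardised path coefficient, estimated
    by nonparametric bootstrapping (resampling cases with replacement),
    analogous to PLS-SEM significance testing."""
    rng = np.random.default_rng(seed)
    n = len(x)
    beta_hat = np.corrcoef(x, y)[0, 1]  # standardised coefficient
    boot = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, n)     # draw a subsample of size n
        boot[b] = np.corrcoef(x[idx], y[idx])[0, 1]
    # t = estimate / bootstrap standard error; |t| > 1.96 -> p < 0.05
    return beta_hat / boot.std(ddof=1)
```

A strongly related pair of variables yields a t-statistic far above the 1.96 threshold quoted above, while unrelated noise would not.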

4.3. The Importance—Performance Map Analysis (IPMA)

IPMA extends the standard PLS-SEM estimation of path coefficients by adding an additional dimension. An IPMA relies on the total effects on a specific target construct (in our case, perceived attractiveness (AT)). "The total effect represents the precursor constructs' importance, while their average latent variable scores represent their performance" [110]. The aim is to find precursor constructs with relatively high importance for the target construct and relatively low performance [120]. The aspects on which these constructs are based represent potential areas for improvement that could be given more attention. In our research model, the construct perceived stimulation (SI) has the highest importance (0.346) for the construct perceived attractiveness (AT), yet its performance is in fourth place, with a value of 62.164; it is followed by perceived efficiency (EF), with an importance of 0.254 and performance in third place, with a value of 73.847 (Table 9, Figure 3).
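The mechanics of an IPMA data point can be sketched as follows (the effect and score values are hypothetical; SmartPLS rescales latent-variable scores to 0–100 internally, here assumed from a 1–7 answer scale):

```python
def ipma(total_effects: dict, lv_scores: dict, scale=(1, 7)) -> dict:
    """Importance-performance map data for one target construct:
    importance = total effect of each precursor construct,
    performance = its mean latent-variable score rescaled to 0-100."""
    lo, hi = scale
    return {
        name: (effect,
               (sum(lv_scores[name]) / len(lv_scores[name]) - lo)
               / (hi - lo) * 100.0)
        for name, effect in total_effects.items()
    }
```

A construct with a mean score of 5 on a 1–7 scale would land at a performance of about 66.7, so a high-importance construct sitting there would be a natural candidate for improvement.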

5. Discussion

Traditional classroom education is understood to be structured and methodologically well developed, with little opportunity for collaborative, spontaneous, or experiential learning (see [121,122]). The rigidity of traditional education contrasts with online learning, which involves digital media, such as devices (computer, tablet, or mobile phone) and video [123,124,125]. Teachers have traditionally used technology to support limited and, in most cases, one-way interaction with students, without full integration into courses [126]. Abbas et al. [57] pointed out that the centre of educational innovation projects today is game-based curriculum development, e-learning platforms, and distance and hybrid learning. They added that online synchronous and asynchronous teaching practices have become a widespread alternative to traditional courses, achieved through the modernisation and redesign of education systems that incorporate emerging technologies. E-learning enhances spontaneity, interactivity, and experiential learning. E-learning usually fosters the following skills: problem-solving, teamwork, interdisciplinary thinking, and holistic thinking. These skills, encompassed by problem-based pedagogy, offer students chances to learn how to think instead of what to think [127], and can also be incorporated into sustainable pedagogy [43,44,46]. When identifying measures and characteristics for assessing e-learning platforms, the peculiarities of e-learning have to be considered. The primary goal of an e-learning platform is to enable students to understand the didactic content, so that as little effort as possible is required to interact with the platform. Interaction between teacher and student is important for better teaching outcomes [94].
Research on usability and user experience is often limited to the mean values of user experience (UX) factors and their items, while the strength of the influence of individual factors is not tested. With this research, we examined whether the user experience (UX) factors (perceived efficiency (EF), perceived perspicuity (PE), perceived dependability (DE), perceived stimulation (SI), and perceived novelty (NO)) affect perceived attractiveness (AT), which is an important factor in the sustainability of e-learning tools. We adapted the factors from the UX model proposed by Hassenzahl [47], Laugwitz et al. [48], Schrepp [49], and Schrepp et al. [50,51,52]. The UX model, through the UEQ questionnaire (available at [128]), measures the means of items on a scale between −3 (terrible) and +3 (excellent); the averages of the items of each factor are then calculated.
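In the standard UEQ data analysis, a 7-point answer is mapped onto the −3 to +3 scale by subtracting 4, and a factor score is the mean of its rescaled items; a minimal sketch, assuming 1–7 coded answers:

```python
def to_ueq_scale(likert_value: int) -> int:
    """Map a 7-point Likert answer (1..7) onto the UEQ scale (-3..+3)."""
    if not 1 <= likert_value <= 7:
        raise ValueError("Likert value must be between 1 and 7")
    return likert_value - 4

def factor_mean(item_answers) -> float:
    """Average the rescaled item answers of one UX factor."""
    return sum(to_ueq_scale(v) for v in item_answers) / len(item_answers)
```

For example, answers of 4, 5, and 6 on a factor's three items rescale to 0, +1, and +2, giving a factor score of +1.0.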
To conduct our research, we measured the impact of the factors perceived efficiency (EF), perceived perspicuity (PE), perceived dependability (DE), perceived stimulation (SI), and perceived novelty (NO) on perceived attractiveness (AT). From the research, we can conclude that they explain a large share (80.5%) of the variance of the factor perceived attractiveness (AT). The results also show that all factors, namely perceived efficiency (EF), perceived perspicuity (PE), perceived dependability (DE), perceived stimulation (SI), and perceived novelty (NO), have a strong, statistically significant influence on the perceived attractiveness (AT) of the e-learning platforms we researched (see Table 8).
The SUE methodology defines four dimensions [129]. (1) Presentation covers the external properties of the interface, with an emphasis on the tools and possibilities offered by the didactic module or the e-learning platform. (2) Hypermediality focuses on characteristics related to communication through different channels and allows for a non-sequential structure, emphasising the personalisation of reading paths as well as analysis. (3) Application proactivity covers the instruments and modalities with which the application encourages user training and other e-learning activities. (4) User activity addresses the growing demands of the user, the tasks the user would like to perform, and how the software or application handles them [12].
The factor perceived stimulation (SI; t = 11.358) has the strongest statistically significant impact on the perceived attractiveness (AT) of e-learning platforms, followed by perceived efficiency (EF; t = 7.533), perceived perspicuity (PE; t = 6.279), perceived novelty (NO; t = 4.557), and perceived dependability (DE; t = 4.375). Therefore, we can confirm all hypotheses (H1–H5). Moreover, our model shows a large degree (51.6%) of predictive relevance (Q2 = 0.516; see Table 7).
As mentioned above, perceived stimulation (SI) has the most important impact on the factor perceived attractiveness (AT); dropping it from the model would have a medium impact (f2 = 0.259), whereas omitting any other factor would have a low impact (see Table 7). IPMA also shows that the most important factor is perceived stimulation (SI; importance = 0.346), while its performance is in fourth place, with a value of 62.164 (see Table 8 and Figure 3). This indicates that we have to put more effort into the excitement and motivation students perceive when using the e-learning platform. The next factor we need to pay attention to is perceived efficiency (EF), with an importance of 0.254 and performance in third place, with a value of 73.847. Perceived efficiency reflects how well users can solve tasks without unnecessary effort, for example, an e-learning platform that works fast, efficiently, practically, and in an organised way.
E-learning platform interfaces are especially valuable if they allow students to focus on the learning content rather than on accessing it. The need to focus on students' goals over tasks has long been emphasised. On the other hand, the need for usability has been recognised in the website design literature as an important quality factor in assessing user satisfaction with a website [7,8,9,10,11]. Since both concepts are important for e-learning platforms, it can be argued that the usability of an e-learning application has a significant impact on learning.
Moreover, the e-learning platform interface should also be pedagogically appropriate, modern looking, and engaging. This means providing functional modules and a range of interactions that support students in individual learning tasks, not just ensuring the use of (state-of-the-art) technology. These need to be planned on the basis of processes and activities proposed by well-confirmed educational methodologies and results. An example is multimedia, which needs to be carefully prepared, as this is the only way to avoid counterproductive sensory overload. The use of new technologies must not rule out successful traditional teaching strategies, such as problem-based learning and simulation systems; it must allow the integration of such strategies into the e-learning platform. E-learning platform interfaces should also have specific features, such as comprehensively supported content and system functionalities, easy and effective navigation, advanced personalisation of content, and clear completion of individual chapters/topics.
The new education models emphasise more teamwork, real-world examples, and critical thinking, for which face-to-face communication between student and teacher is needed [126]. This requires the use of modern learning tools, including collaborative platforms. Currently, constructivist theory is widely accepted among teachers and recognises learning as an active process. An approach where students learn through doing allows them to cognitively manipulate new learning material and to create meaningful connections between prior knowledge and new information.
For constructivist theory to be effective, tasks have to be embedded in a collaborative and authentic context. In this way, students will understand the motivation as well as the ultimate goal of the learning task itself. This means that students need to be supported both in developing concepts and in redefining some of their individual ideas. From the point of view of constructivist theory, students should be inspired to take responsibility for their own learning and, at the same time, guided to become more aware of their knowledge. This approach is evolving towards social perspectives on learning, especially situational learning, which suggests that the learning effects of using a new generation of e-learning platforms with extended collaboration functionality will depend on the context in which they are used. In doing so, both people and other artefacts (all learning environment components) will interact and contribute to the learning process. Ardito et al. [129] pointed out that combining constructivism principles with situational learning is often called "socioconstructivism".

5.1. Implication of the Study

The systematic literature review by Ar and Abbas [126] shows that gamification strategies in education are highly relevant and effective, especially e-gamification teaching strategies. To support modern education models, technology-based face-to-face communication is necessary. Such technology-enabled face-to-face communication between students and teachers is possible using collaborative platforms, which appeared recently and became widely used during the COVID-19 pandemic. Collaborative platforms have become very efficient and allow communication among a high number of participants. University education involves many students; therefore, collaborative platforms have proven to be very useful. To leverage all the benefits of collaborative platform use in higher education, such platforms must be used at an advanced level, which includes all the functionality they offer. To reach this higher, advanced level of use, such platforms need to be accepted by their users, who in this case are students on the one hand and teachers on the other. It is known that the advanced use of a technology is closely connected with how users perceive that technology: the more positive the perception, the more likely users are to use its advanced features and to use it more often. The findings of our study show the factors which impact perceived attractiveness and, moreover, show that all the factors are important. The factor perceived stimulation has the highest impact, so it is important for higher education organisations to embed stimulation in their education models.
The study also shows that the integration of collaborative platforms into virtual learning environments is needed. Higher education organisations need to consider which tools, solutions, and platforms they will use and integrate into their virtual learning environment, since the functionality of some tools, solutions, and platforms overlaps. This opens the question of whether collaborative platforms will become the prevailing technology in higher education. We have to stress that better perceived attractiveness leads to better acceptance of the technology used, which in turn leads to the more advanced use of technology that is needed for the better education expected from new sustainable education models.
Last but not least, the research is also important for the developers of e-learning platforms, who can take the researched factors into account in their development, as well as for all other organisations: if they want to make the most of e-learning platforms, they can use the results of this research both when choosing an e-learning platform and in achieving more advanced use of it.

5.2. Limitations and Suggestions for Future Work

Emerging technologies, such as Augmented Reality (AR) and Artificial Intelligence (AI), together with innovative teaching methods, enable the gamification and/or personalisation of learning and create a structure with encouraging prospects [57], which can lead to a better user experience and greater attractiveness of e-learning platforms.
The research has several limitations. The first limitation is the sample. We wanted to develop a research model with this research, so we combined all student responses (the Slovenian sample and the Bulgarian sample), having previously checked with PLS-MGA that there are no statistically significant differences between students and e-learning platforms. In this phase of the research, the goal was to examine which factors in the research model influence perceived attractiveness (AT), and to what extent. With the research model confirmed, we can continue with in-depth research of individual e-learning platforms (Moodle, Google Meet, and Microsoft Teams, as well as other platforms such as Blackboard, Canvas, Zoom, Webex, etc.) (e.g., [62]), different groups of students (e.g., economics, computer science, law, etc.) (e.g., [130,131]), differences in using the same e-learning platforms in different countries (e.g., [132,133]), gender of students (e.g., [134]), age, etc. Another limitation is that, over time, the importance of the studied factors of the research model changes [34], so it would be sensible to conduct longitudinal research in the future.
Future research will be needed to study the critical success factors for the implementation of collaborative platforms for educational purposes in higher education organisations. Research should also be extended in the direction of the acceptance of collaborative platforms by students and by teachers. We can speculate that the acceptance of collaborative platforms by students has different characteristics than their acceptance by teachers. For this reason, we are planning to apply the Technology Acceptance Model (TAM) proposed by Davis [135,136] to study the acceptance of students as users in learning processes. To study the acceptance of teachers, the UTAUT model proposed by Venkatesh et al. [137] could be applied, while several organisational characteristics should also be considered.
It will also be interesting to research how collaborative platforms should be integrated into virtual learning environments. Such research could also be extended and linked in the direction of the organisational sustainability identity of higher education organisations.

Author Contributions

Conceptualisation, S.S.Z., Z.D. and S.B.; methodology, S.S.Z. and Z.D.; software, S.S.Z.; validation, S.P., Z.D. and S.B.; formal analysis, S.S.Z. and Z.D.; investigation, Z.D. and S.P.; resources, S.B.; data curation, Z.D. and S.P.; writing—original draft preparation, S.S.Z. and S.B.; writing—review and editing, S.S.Z., Z.D., S.P. and S.B.; visualisation, S.S.Z. and Z.D.; supervision, S.B.; project administration, S.B.; funding acquisition, S.S.Z. and S.B. All authors have read and agreed to the published version of the manuscript.

Funding

The authors acknowledge the financial support of the Slovenian Research Agency (research core funding No. P5–0023, 'Entrepreneurship for Innovative Society') and the Erasmus+ programme (grant No. 2019-1-CZ01-KA203-061374, "Spationomy 2.0").

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Roselli, T.; Pragnelli, M.V.; Rossano, V. Studying usability of an educational software. In Proceedings of the IADIS International Conference WWW/Internet 2002, ICWI 2002, Lisbon, Portugal, 13–15 November 2002; pp. 810–814. Available online: https://www.researchgate.net/publication/220970004_Studying_Usability_of_an_Educational_Software (accessed on 30 June 2022).
  2. Cole, I.J. Usability of Online Virtual Learning Environments: Key Issues for Instructors and Learners. In Student Usability in Educational Software and Games: Improving Experiences; Gonzalez, C., Ed.; IGI Global: Hershey, PA, USA, 2013; pp. 41–58. [Google Scholar] [CrossRef]
  3. Gunesekera, A.I.; Bao, Y.; Kibelloh, M. The role of usability on e-learning user interactions and satisfaction: A literature review. J. Syst. Inf. Technol. 2019, 21, 368–394. [Google Scholar] [CrossRef]
  4. Studiyanti, A.; Saraswati, A. Usability Evaluation and Design of Student Information System Prototype to Increase Student’s Satisfaction (Case Study: X University). Ind. Eng. Manag. Syst. 2019, 18, 676–684. [Google Scholar] [CrossRef]
  5. Squires, D.; Preece, J. Usability and learning: Evaluating the potential of educational software. Comput. Educ. 1996, 27, 15–22. [Google Scholar] [CrossRef]
  6. Midwest Comprehensive Center. Student Goal Setting: An Evidence-Based Practice. 2018. Available online: http://files.eric.ed.gov/fulltext/ED589978.pdf (accessed on 30 June 2022).
  7. Ganiyu, A.A.; Mishra, A.; Elijah, J.; Gana, U.M.  The Importance of Usability of a Website. IUP J. Inf. Technol. 2017, 8, 27–35. Available online: https://www.researchgate.net/publication/331993738_The_Importance_of_Usability_of_a_Website (accessed on 30 June 2022).
  8. Nur Sukinah, A.; Noor Suhana, S.; Wan Nur Idayu Tun Mohd, H.; Nur Liyana, Z.; Azliza, Y. A Review of Website Measurement for Website Usability Evaluation. J. Phys. Conf. Ser. 2021, 1874, 2045. [Google Scholar] [CrossRef]
  9. Chauhan, S.; Akhtar, A.; Gupta, A. Customer experience in digital banking: A review and future research directions. Int. J. Qual. Serv. Sci. 2022, 14, 311–348. [Google Scholar] [CrossRef]
  10. Artyom, C.; Wunarsa, D.; Shanlong, H.; Zhupeng, L.; Balakrishnan, S. A Review of a E-Commerce Website Based on Its Usability Element. EasyChair Preprint No. 6665. 2021. Available online: https://easychair.org/publications/preprint/vNPc (accessed on 30 June 2022).
  11. Durdu, P.O.; Soydemir, Ö.N. A systematic review of web accessibility metrics. In App and Website Accessibility Developments and Compliance Strategies; Akgül, Y., Ed.; IGI Global: Hershey, PA, USA, 2022; pp. 77–108. [Google Scholar] [CrossRef]
  12. Costabile, M.F.; De Marsico, M.; Lanzilotti, R.; Plantamura, V.L.; Roselli, T. On the Usability Evaluation of E-Learning Applications. In Proceedings of the 38th Annual Hawaii International Conference on System Sciences, Big Island, HI, USA, 3–6 January 2005; p. 6b. [Google Scholar] [CrossRef]
  13. Duran, D. Learning-by-teaching. Evidence and implications as a pedagogical mechanism. Innov. Educ. Teach. Int. 2017, 54, 476–484. [Google Scholar] [CrossRef]
  14. Albert, B.; Tullis, T.; Tedesco, D. Beyond the Usability Lab: Conducting Large-Scale Online User Experience Study; Morgan Kaufmann: Burlington, MA, USA, 2010. [Google Scholar] [CrossRef]
  15. Litto, F.M.; Formiga, M. Educação a Distância: O Estado da Arte; Pearson Education: São Paulo, Brasil, 2009. [Google Scholar]
  16. Freire, L.; Arezes, P.; Campos, J. A literature review about usability evaluation methods for e-learning platforms. Work 2012, 41, 1038–1044. [Google Scholar] [CrossRef] [Green Version]
  17. Borrás Gené, O.; Martínez Núñez, M.; Fidalgo Blanco, Á. Gamification in MOOC: Challenges, opportunities and proposals for advancing MOOC model. In Proceedings of the Second International Conference on Technological Ecosystems for Enhancing Multiculturality (TEEM ’14), Salamanca, Spain, 1–3 October 2014; Association for Computing Machinery: New York, NY, USA, 2014; pp. 215–220. [Google Scholar] [CrossRef]
  18. Mueller, D.; Strohmeier, S. Design characteristics of virtual learning environments: State of research. Comput. Educ. 2011, 57, 2505–2516. [Google Scholar] [CrossRef]
  19. Papanastasiou, G.; Drigas, A.; Skianis, C.; Lytras, M.; Papanastasiou, E. Virtual and augmented reality effects on K-12, higher and tertiary education students’ twenty-first century skills. Virtual Real. 2019, 23, 425–436. [Google Scholar] [CrossRef]
  20. Potkonjak, V.; Gardner, M.; Callaghan, V.; Mattila, P.; Guetl, C.; Petrović, V.M.; Jovanović, K. Virtual laboratories for education in science, technology, and engineering: A review. Comput. Educ. 2016, 95, 309–327. [Google Scholar] [CrossRef] [Green Version]
  21. Abuhlfaia, K.; de Quincey, E. Evaluating the Usability of an E-Learning Platform within Higher Education from a Student Perspective. In Proceedings of the 2019 3rd International Conference on Education and E-Learning (ICEEL 2019), Barcelona, Spain, 5–7 November 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 1–7. [Google Scholar] [CrossRef]
  22. Cengizhan Dirican, A.; Göktürk, M. Psychophysiological measures of human cognitive states applied in human computer interaction. Procedia Comput. Sci. 2011, 3, 1361–1367. [Google Scholar] [CrossRef] [Green Version]
  23. MacKenzie, I.S. Human-Computer Interaction: An Empirical Research Perspective; Morgan Kaufmann: Waltham, MA, USA, 2013. [Google Scholar]
  24. Ardito, C.; Costabile, F.; De Marsico, M.; Lanzilotti, R.; Levialdi, S.; Roselli, T.; Rossano, V. An approach to usability evaluation of e-learning applications. Univers. Access Inf. Soc. 2006, 4, 270–283. [Google Scholar] [CrossRef]
  25. Salas, J.; Chang, A.; Montalvo, L.; Núñez, A.; Vilcapoma, M.; Moquillaza, A.; Murillo, B.; Paz, F. Guidelines to evaluate the usability and user experience of learning support platforms: A systematic review. In Human-Computer Interaction. HCI-COLLAB 2019. Communications in Computer and Information Science; Ruiz, P., Agredo-Delgado, V., Eds.; Springer: Cham, Switzerland, 2019. [Google Scholar] [CrossRef]
  26. Abuhlfaia, K.; de Quincey, E. The Usability of E-learning Platforms in Higher Education: A Systematic Mapping Study. In Proceedings of the 32nd International BCS Human Computer Interaction Conference (HCI), Belfast, UK, 4–6 July 2018; BCS Learning and Development Ltd.: Belfast, UK, 2018. [Google Scholar] [CrossRef] [Green Version]
  27. Handayani, K.; Juningsih, E.; Riana, D.; Hadianti, S.; Rifai, A.; Serli, R. Measuring the Quality of Website Services covid19.kalbarprov.go.id Using the Webqual 4.0 Method. J. Phys. Conf. Ser. 2020, 1641, 2049. [Google Scholar] [CrossRef]
  28. Gordillo, A.; Barra, E.; Aguirre, S.; Quemada, J. The usefulness of usability and user experience evaluation methods on an e-Learning platform development from a developer’s perspective: A case study. In Proceedings of the 2014 IEEE Frontiers in Education Conference (FIE) Proceedings, Madrid, Spain, 22–25 October 2014; pp. 1–8. [Google Scholar] [CrossRef] [Green Version]
  29. ISO 9241-11; Ergonomic Requirements for Office Work with Visual Display Terminals (VDTs): Part 11: Guidance on Usability. International Organization for Standardization (ISO): Geneva, Switzerland, 1998. Available online: https://www.iso.org/standard/16883.html (accessed on 7 May 2022).
  30. Hartson, H.R.; Andre, T.S.; Williges, R.C.; Tech, V. Criteria for evaluating usability evaluation methods. Int. J. Hum.-Comput. Interact. 2001, 13, 373–410. [Google Scholar] [CrossRef]
  31. Quintana, C.; Carra, A.; Krajcik, J.; Soloway, E. Learner-centered design: Reflections and new directions. In Human-Computer Interaction in the New Millennium, 1st ed.; Carrol, J.M., Ed.; Addison Wesley: New York, NY, USA, 2001; pp. 605–626. [Google Scholar]
  32. Giannakos, M.N. The evaluation of an e-Learning web-based platform. In Proceedings of the 2nd International Conference on Computer Supported Education, Valencia, Spain, 7–10 April 2010; pp. 433–438. [Google Scholar] [CrossRef] [Green Version]
  33. ISO 9241-210; Ergonomics of Human-System Interaction: Part 210: Human-Centred Design for Interactive Systems. International Organization for Standardization (ISO): Geneva, Switzerland, 2010. Available online: https://www.iso.org/standard/52075.html (accessed on 7 May 2022).
  34. Roto, V.; Law, E.; Vermeeren, A.; Hoonhout, J. (Eds.) User Experience White Paper. Outcome of the Dagstuhl Seminar on Demarcating User Experience, Germany, 2011. Available online: http://www.allaboutux.org/files/UX-WhitePaper.pdf (accessed on 1 June 2022).
  35. Liu, Y. A Scientometric Analysis of User Experience Research Related to Green and Digital Transformation. In Proceedings of the 2020 Management Science Informatization and Economic Innovation Development Conference (MSIEID), Guangzhou, China, 18–20 December 2020; pp. 377–380. [Google Scholar] [CrossRef]
  36. Graham, L.; Berman, J.; Bellert, A. Sustainable Learning: Inclusive Practices for 21st Century Classrooms; Cambridge University Press: Melbourne, Australia, 2015. [Google Scholar] [CrossRef]
  37. Medina-Molina, C.; Rey-Tienda, M.D.L.S. The transition towards the implementation of sustainable mobility. Looking for generalization of sustainable mobility in different territories by the application of QCA. Sustain. Technol. Entrep. 2022, 1, 100015. [Google Scholar] [CrossRef]
  38. Trapp, C.T.C.; Kanbach, D.K.; Kraus, S. Sector coupling and business models towards sustainability: The case of the hydrogen vehicle industry. Sustain. Technol. Entrep. 2022, 1, 100014. [Google Scholar] [CrossRef]
  39. Nedjah, N.; de Macedo Mourelle, L.; dos Santos, R.A.; dos Santos, L.T.B. Sustainable maintenance of power transformers using computational intelligence. Sustain. Technol. Entrep. 2022, 1, 100001. [Google Scholar] [CrossRef]
  40. Marinakis, Y.D.; White, R. Hyperinflation potential in commodity-currency trading systems: Implications for sustainable development. Sustain. Technol. Entrep. 2022, 1, 100003. [Google Scholar] [CrossRef]
  41. Franco, M.; Esteves, L. Inter-clustering as a network of knowledge and learning: Multiple case studies. J. Innov. Knowl. 2020, 5, 39–49. [Google Scholar] [CrossRef]
  42. Bouncken, R.B.; Lapidus, A.; Qui, Y. Organizational sustainability identity: ‘New Work’ of home offices and coworking spaces as facilitators. Sustain. Technol. Entrep. 2022, 1, 100011. [Google Scholar] [CrossRef]
  43. Geitz, G.; de Geus, J. Design-based education, sustainable teaching, and learning. Cogent Educ. 2019, 6, 7919. [Google Scholar] [CrossRef]
  44. Sinakou, E.; Donche, V.; Boeve-de Pauw, J.; Van Petegem, P. Designing Powerful Learning Environments in Education for Sustainable Development: A Conceptual Framework. Sustainability 2019, 11, 5994. [Google Scholar] [CrossRef] [Green Version]
  45. Sofiadin, A.B.M. Sustainable development, e-learning and Web 3.0: A descriptive literature review. J. Inf. Commun. Ethics Soc. 2014, 12, 157–176. [Google Scholar] [CrossRef]
  46. Ben-Eliyahu, A. Sustainable Learning in Education. Sustainability 2021, 13, 4250. [Google Scholar] [CrossRef]
  47. Hassenzahl, M. The Effect of Perceived Hedonic Quality on Product Appealingness. Int. J. Hum.-Comput. Interact. 2001, 13, 481–499. [Google Scholar] [CrossRef]
  48. Laugwitz, B.; Held, T.; Schrepp, M. Construction and Evaluation of a User Experience Questionnaire. In HCI and Usability for Education and Work; Holzinger, A., Ed.; Springer: Berlin/Heidelberg, Germany, 2008; pp. 63–76. [Google Scholar] [CrossRef]
  49. Schrepp, M. User Experience Questionnaire Handbook. 2019. Available online: https://www.ueq-online.org/Material/Handbook.pdf (accessed on 10 March 2022).
  50. Schrepp, M.; Hinderks, A.; Thomaschewski, J. Applying the User Experience Questionnaire (UEQ) in Different Evaluation Scenarios. In Design, User Experience, and Usability. Theories, Methods, and Tools for Designing the User Experience; Marcus, A., Ed.; Springer International Publishing: Geneva, Switzerland, 2014; pp. 383–392. [Google Scholar] [CrossRef]
  51. Schrepp, M.; Hinderks, A.; Thomaschewski, J. Construction of a benchmark for the User Experience Questionnaire (UEQ). Int. J. Interact. Multimed. Artif. Intell. 2017, 4, 40–44. [Google Scholar] [CrossRef] [Green Version]
  52. Schrepp, M.; Hinderks, A.; Thomaschewski, J. Design and Evaluation of a Short Version of the User Experience Questionnaire (UEQ-S). Int. J. Interact. Multimed. Artif. Intell. 2017, 4, 103–108. [Google Scholar] [CrossRef] [Green Version]
  53. Remali, A.M.; Ghazali, M.A.; Kamaruddin, M.K.; Kee, T.Y. Understanding Academic Performance Based on Demographic Factors, Motivation Factors and Learning Styles. Int. J. Asian Soc. Sci. 2013, 3, 1938–1951. [Google Scholar]
  54. Indreica, S.E. eLearning platform: Advantages and disadvantages on time management. In Proceedings of the 10th International Scientific Conference eLearning and Software for Education, Bucharest, Romania, 24–25 April 2014; pp. 236–243. [Google Scholar] [CrossRef]
  55. Virtič, M.P. The role of internet in education. In Proceedings of the DIVAI 2012, 9th International Scientific Conference on Distance Learning in Applied Informatics, Štúrovo, Slovakia, 2–4 May 2012; pp. 243–249. [Google Scholar]
  56. Alexe, C.M.; Alexe, C.G.; Dumitriu, D.; Mustață, I.C. The Analysis of the Users’ Perceptions Regarding the Learning Management Systems for Higher Education: Case Study Moodle. In Proceedings of the 17th International Scientific Conference on eLearning and Software for Education, eLSE 2021, Bucharest, Romania, 22–23 April 2021; pp. 284–293. [Google Scholar] [CrossRef]
  57. Abbas, A.; Hosseini, S.; Núñez, J.L.M.; Sastre-Merino, S. Emerging technologies in education for innovative pedagogies and competency development. Australas. J. Educ. Technol. 2021, 37, 1–5. [Google Scholar] [CrossRef]
  58. Alturki, U.; Aldraiweesh, A. Application of Learning Management System (LMS) during the COVID-19 Pandemic: A Sustainable Acceptance Model of the Expansion Technology Approach. Sustainability 2021, 13, 10991. [Google Scholar] [CrossRef]
  59. Sarnou, H.; Sarnou, D. Investigating the EFL Courses Shift into Moodle during the Pandemic of COVID-19: The Case of MA Language and Communication at Mostaganem University. Arab World Engl. J. 2021, 24, 354–363. [Google Scholar] [CrossRef]
  60. Bakhmat, L.; Babakina, O.; Belmaz, Y. Assessing online education during the COVID-19 pandemic: A survey of lecturers in Ukraine. J. Phys. Conf. Ser. 2021, 1840, 012050. [Google Scholar] [CrossRef]
  61. Polhun, K.; Kramarenko, T.; Maloivan, M.; Tomilina, A. Shift from blended learning to distance one during the lockdown period using Moodle: Test control of students’ academic achievement and analysis of its results. J. Phys. Conf. Ser. 2021, 1840, 012053. [Google Scholar] [CrossRef]
  62. Hill, P. New Release of European LMS Market Report. eLiterate 2016. Available online: https://eliterate.us/new-release-european-lms-market-report/ (accessed on 7 June 2022).
  63. Somova, E.; Gachkova, M. Strategy to Implement Gamification in LMS. In Next-Generation Applications and Implementations of Gamification Systems, 1st ed.; Portela, F., Queirós, R., Eds.; IGI Global: Hershey, PA, USA, 2022; pp. 51–72. [Google Scholar] [CrossRef]
  64. Al-Ajlan, A.; Zedan, H. Why Moodle. In Proceedings of the 12th IEEE International Workshop on Future Trends of Distributed Computing Systems, Kunming, China, 21–23 October 2008; pp. 58–64. [Google Scholar] [CrossRef]
  65. Kintu, M.J.; Zhu, C.; Kagambe, E. Blended learning effectiveness: The relationship between student characteristics, design features and outcomes. Int. J. Educ. Technol. High. Educ. 2017, 14, 7. [Google Scholar] [CrossRef] [Green Version]
  66. Evgenievich, E.; Petrovna, M.; Evgenievna, T.; Aleksandrovna, O.; Yevgenyevna, S. Moodle LMS: Positive and Negative Aspects of Using Distance Education in Higher Education Institutions. Propós. Represent. 2021, 9, e1104. [Google Scholar] [CrossRef]
  67. Oproiu, G.C. Study about Using E-learning Platform (Moodle) in University Teaching Process. Procedia Soc. Behav. Sci. 2015, 180, 426–432. [Google Scholar] [CrossRef] [Green Version]
  68. Parusheva, S.; Aleksandrova, Y.; Hadzhikolev, A. Use of Social Media in Higher Education Institutions—An Empirical Study Based on Bulgarian Learning Experience. TEM J. Technol. Educ. Manag. Inform. 2018, 7, 171–181. [Google Scholar] [CrossRef]
  69. Ironsi, C.S. Google Meet as a synchronous language learning tool for emergency online distant learning during the COVID-19 pandemic: Perceptions of language instructors and preservice teachers. J. Appl. Res. High. Educ. 2022, 14, 640–659. [Google Scholar] [CrossRef]
  70. Lewandowski, M. Creating virtual classrooms (using Google Hangouts) for improving language competency. Lang. Issues: ESOL J. 2015, 26, 37–42. [Google Scholar]
  71. Herskowitz, N. Gartner Recognises Microsoft as Leader in Unified Communications as a Service and Meetings Solutions, Microsoft 365. 2021. Available online: https://www.microsoft.com/en-us/microsoft-365/blog/2021/10/25/gartner-recognizes-microsoft-as-leader-in-unified-communications-as-a-service-and-meetings-solutions/ (accessed on 7 June 2022).
  72. Microsoft. Welcome to Microsoft Teams. 2021. Available online: https://docs.microsoft.com/en-us/microsoftteams/teams-overview (accessed on 30 April 2021).
  73. Florjancic, V.; Wiechetek, L. Using Moodle and MS Teams in higher education—A comparative study. Int. J. Innov. Learn. 2022, 31, 264–286. [Google Scholar] [CrossRef]
  74. Friedman, T. Come the Revolution. New York Times. 15 May 2012. Available online: http://www.nytimes.com/2012/05/16/opinion/friedman-come-the-revolution.html/ (accessed on 7 May 2022).
  75. Czerniewicz, L.; Deacon, A.; Small, J.; Walji, S. Developing world MOOCs: A curriculum view of the MOOC landscape. J. Glob. Lit. Technol. Emerg. Pedagog. 2014, 2, 122–139. [Google Scholar]
  76. Mulder, F.; Jansen, D. MOOCs for opening up education and the openupEd initiative. In MOOCs and Open Education around the World; Bonk, C.J., Lee, M.M., Reeves, T.C., Reynolds, T.H., Eds.; Routledge: New York, NY, USA, 2015. [Google Scholar] [CrossRef]
  77. Walther, J.B. Computer-mediated communication. Commun. Res. 1996, 23, 3–43. [Google Scholar] [CrossRef]
  78. Walther, J.B.; Anderson, J.F.; Park, D.W. Interpersonal Effects in Computer-Mediated Interaction: A meta-analysis of social and antisocial communication. Commun. Res. 1994, 21, 460–487. Available online: www.matt-koehler.com/OtherPages/Courses/CEP_909_FA01/Readings/CmC/Walther_1994b.pdf (accessed on 7 June 2022). [CrossRef] [Green Version]
  79. Wegerif, R. The social dimension of asynchronous learning networks. J. Asynchronous Learn. Netw. 1998, 2, 34–49. Available online: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.103.7298&rep=rep1&type=pdf (accessed on 7 May 2021). [CrossRef]
  80. Penstein Rosé, C.; Carlson, R.; Yang, D.; Wen, M.; Resnick, L.; Goldman, P.; Sherer, J. Social factors that contribute to attrition in MOOCs. In Proceedings of the First ACM Conference on Learning @ Scale Conference (L@S’ 14), Atlanta, GA, USA, 4–5 March 2014; Association for Computing Machinery: New York, NY, USA, 2014; pp. 197–198. [Google Scholar] [CrossRef]
  81. Paechter, M.; Maier, B. Online or face-to-face? Students’ experiences and preferences in e-learning. Internet High. Educ. 2010, 13, 292–297. [Google Scholar] [CrossRef]
  82. Garrison, D.R.; Anderson, T.; Archer, W. Critical thinking, cognitive presence, and computer conferencing in distance education. Am. J. Distance Educ. 2001, 15, 7–23. [Google Scholar] [CrossRef] [Green Version]
  83. Chen, K.C.; Jang, S.J. Motivation in online learning: Testing a model of self-determination theory. Comput. Hum. Behav. 2010, 26, 741–752. [Google Scholar] [CrossRef]
  84. Barnett, E.A. Validation experiences and persistence among community college students. Rev. High. Educ. 2011, 34, 193–230. [Google Scholar] [CrossRef]
  85. Graham, C.R. Emerging practice and research in blended learning. In Handbook of Distance Education, 3rd ed.; Moore, M.G., Ed.; Routledge: New York, NY, USA, 2013; pp. 333–350. [Google Scholar]
  86. Ross, B.; Gage, K. Global perspectives on blending learning: Insight from WebCT and our customers in higher education. In Handbook of Blended Learning: Global Perspectives, Local Designs; Bonk, C.J., Graham, C.R., Eds.; Pfeiffer Publishing: San Francisco, CA, USA, 2006; pp. 155–168. [Google Scholar]
  87. Drossos, L.; Vassiliadis, B.; Stefani, A.; Xenos, M.; Sakkopoulos, E.; Tsakalidis, A. Introducing ICT in traditional higher education environment: Background, design and evaluation of a blended approach. Int. J. Inf. Commun. Technol. Educ. 2006, 2, 65–78. [Google Scholar] [CrossRef]
  88. Dziuban, C.; Moskal, P. A course is a course is a course: Factor invariance in student evaluation of online, blended and face-to-face learning environments. Internet High. Educ. 2011, 14, 236–241. [Google Scholar] [CrossRef]
  89. Means, B.; Toyama, Y.; Murphy, R.; Baki, M. The effectiveness of online and blended learning: A meta-analysis of the empirical literature. Teach. Coll. Rec. 2013, 115, 1–47. [Google Scholar] [CrossRef]
  90. Griffiths, R.; Chingos, M.; Mulhern, C.; Spies, R. Interactive Online Learning on Campus: Testing MOOCs and Other Platforms in Hybrid Formats in the University System of Maryland; Ithaka S+R: New York, NY, USA, 2014; Volume 10, pp. 1–81. [Google Scholar]
  91. Joseph, A.M.; Nath, B.A. Integration of Massive Open Online Education (MOOC) System with in-Classroom Interaction and Assessment and Accreditation: An Extensive Report from a Pilot Study. WORLDCOMP ‘13. Available online: http://worldcomp-proceedings.com/proc/p2013/EEE3547.pdf (accessed on 11 April 2022).
  92. Rossiou, E.; Sifalaras, A. Blended Methods to Enhance Learning: An Empirical Study of Factors Affecting Student Participation in the Use of E-Tools to Complement F2F Teaching of Algorithms. In Proceedings of the 6th European Conference on e-Learning (ECEL 2007), Dublin, Ireland, 4–5 October 2007; pp. 519–528. Available online: https://www.researchgate.net/publication/248392307_Blended_Methods_to_Enhance_Learning_An_Empirical_Study_of_Factors_Affecting_Student_Participation_in_the_use_of_e-Tools_to_Complement_F2F_Teaching_of_Algorithms (accessed on 10 April 2022).
  93. Anderson, A.; Huttenlocher, D.; Kleinberg, J.; Leskovec, J. Engaging with massive online courses. In Proceedings of the 23rd International Conference on World Wide Web, Seoul, Korea, 7–11 April 2014; pp. 687–698. [Google Scholar] [CrossRef] [Green Version]
  94. Abbas, A.; Arrona-Palacios, A.; Haruna, H.; Alvarez-Sosa, D. Elements of students’ expectation towards teacher-student research collaboration in higher education. In Proceedings of the 2020 IEEE Frontiers in Education Conference (FIE), Uppsala, Sweden, 21–24 October 2020; pp. 1–5. [Google Scholar] [CrossRef]
  95. Eisenhauer, T. Grow Your Business with Collaboration Tools. Axero Solutions, 2021. Available online: https://info.axerosolutions.com/grow-your-business-with-collaboration-tools (accessed on 5 May 2021).
  96. Deloitte. Remote Collaboration Facing the Challenges of COVID-19. 2021. Available online: https://www2.deloitte.com/content/dam/Deloitte/de/Documents/human-capital/Remote-Collaboration-COVID-19.pdf (accessed on 30 April 2021).
  97. Madlberger, M.; Raztocki, N. Digital Cross-Organizational Collaboration: Towards a Preliminary Framework. In Proceedings of the Fifteenth Americas Conference on Information Systems, San Francisco, CA, USA, 6–9 August 2009; Available online: https://ssrn.com/abstract=1477527 (accessed on 30 April 2021).
  98. Van de Sand, F.; Frison, A.K.; Zotz, P.; Riener, A.; Holl, K. The intersection of User Experience (UX), Customer Experience (CX), and Brand Experience (BX). In User Experience Is Brand Experience. Management for Professionals; Springer Nature: Cham, Switzerland, 2020. [Google Scholar] [CrossRef]
  99. Hassenzahl, M. Experience Design: Technology for All the Right Reasons. Synth. Lect. Hum.-Cent. Inform. 2010, 3, 1–95. [Google Scholar] [CrossRef]
  100. Nass, C.; Adam, S.; Doerr, J.; Trapp, M. Balancing user and business goals in software development to generate positive user experience. In Human-Computer Interaction: The Agency Perspective; Springer: Berlin/Heidelberg, Germany, 2012. [Google Scholar] [CrossRef]
  101. Lallemand, C.; Gronier, G.; Koenig, V. User experience: A concept without consensus? Exploring practitioners’ perspectives through an international survey. Comput. Hum. Behav. 2015, 43, 35–48. [Google Scholar] [CrossRef]
  102. Zaki, T.; Nazrul Islam, M. Neurological and physiological measures to evaluate the usability and user-experience (UX) of information systems: A systematic literature review. Comput. Sci. Rev. 2021, 40, 375. [Google Scholar] [CrossRef]
  103. Kashfi, P.; Feldt, R.; Nilsson, A. Integrating UX principles and practices into software development organisations: A case study of influencing events. J. Syst. Softw. 2019, 154, 37–58. [Google Scholar] [CrossRef]
  104. Sarstedt, M.; Henseler, J.; Ringle, C.M. Multi-Group Analysis in Partial Least Squares (PLS) Path Modeling: Alternative Methods and Empirical Results. Adv. Int. Mark. 2011, 22, 195–218. [Google Scholar] [CrossRef]
  105. Henseler, J.; Ringle, C.M.; Sinkovics, R.R. The Use of Partial Least Squares Path Modeling in International Marketing. Adv. Int. Mark. 2009, 20, 277–320. [Google Scholar] [CrossRef] [Green Version]
  106. Ringle, C.M.; Wende, S.; Becker, J.-M. SmartPLS 3. Boenningstedt: SmartPLS. 2015. Available online: https://www.smartpls.com (accessed on 10 June 2021).
  107. Chin, W.W.; Newsted, P.R. Structural equation modeling analysis with small samples using partial least squares. In Statistical Strategies for Small Sample Research; Hoyle, R.H., Ed.; Sage Publications: Thousand Oaks, CA, USA, 1999; pp. 307–341. [Google Scholar]
  108. Chin, W.W. Commentary: Issues and Opinion on Structural Equation Modeling. MIS Q. 1998, 22, vii–xvi. Available online: http://www.jstor.org/stable/249674 (accessed on 30 June 2022).
  109. Hair, F.J.; Hult, G.T.M.; Ringle, C.M.; Sarstedt, M. A Primer on Partial Least Squares Structural Equation Modeling (PLS-SEM), 2nd ed.; Sage: Los Angeles, CA, USA, 2017. [Google Scholar]
  110. Garson, G.D. Partial Least Squares: Regression and Structural Equation Models; Statistical Associates Publishers: Asheboro, NC, USA, 2016. [Google Scholar]
  111. Jöreskog, K.G.; Wold, H.O.A. The ML and PLS Techniques for modeling with latent variables: Historical and comparative aspects. In Systems under Indirect Observation, Part I, 1st ed.; Wold, H., Jöreskog, K.G., Eds.; Elsevier: Amsterdam, The Netherlands, 1982; pp. 263–270. [Google Scholar]
  112. Cassel, C.; Hackl, P.; Westlund, A.H. Robustness of partial least squares method for estimating latent variable quality structures. J. Appl. Stat. 1999, 26, 435–446. [Google Scholar] [CrossRef]
  113. Hair, J.F.; Risher, J.J.; Sarstedt, M.; Ringle, C.M. When to use and how to report the results of PLS-SEM. Eur. Bus. Rev. 2019, 31, 2–24. [Google Scholar] [CrossRef]
  114. Chin, W.W. The partial least squares approach for structural equation modeling. In Modern Methods for Business Research; Macoulides, G.A., Ed.; Lawrence Erlbaum Associates: Mahwah, NJ, USA, 1998; pp. 295–336. [Google Scholar]
  115. Daskalakis, S.; Mantas, J. Evaluating the impact of a service-oriented framework for healthcare interoperability. In eHealth Beyond the Horizon—Get IT There: Proceedings of MIE2008; Anderson, S.K., Klein, G.O., Schulz, S., Aarts, J., Mazzoleni, M.C., Eds.; IOS Press: Amsterdam, The Netherlands, 2008; pp. 285–290. [Google Scholar]
  116. Fornell, C.G.; Larcker, D.F. Evaluating structural equation models with unobservable variables and measurement error. J. Mark. Res. 1981, 18, 39–50. [Google Scholar] [CrossRef]
  117. Shmueli, G.; Koppius, O.R. Predictive analytics in information systems research. MIS Q. 2011, 35, 553–572. [Google Scholar] [CrossRef] [Green Version]
  118. Sarstedt, M.; Ringle, C.M.; Hair, J.F. Partial least squares structural equation modeling. In Handbook of Market Research; Homburg, C., Klarmann, M., Vomberg, A., Eds.; Springer: Cham, Switzerland, 2017. [Google Scholar] [CrossRef]
  119. Hair, J.F.; Black, W.C.; Babin, B.J.; Anderson, R.E. Multivariate Data Analysis, 7th ed.; Pearson Prentice Hall: Upper Saddle River, NJ, USA, 2009. [Google Scholar]
  120. Höck, C.; Ringle, C.M.; Sarstedt, M. Management of multi-purpose stadiums: Importance and performance measurement of service interfaces. Int. J. Serv. Technol. Manag. 2010, 14, 188–207. [Google Scholar] [CrossRef]
  121. UN. Resolutions Adopted by the Conference. A/CONF.151/26/Rev.l. In Report of the United Nations Conference on Environment and Development, Rio de Janeiro, Brazil, 3–14 June 1992; United Nations: New York, NY, USA, 1993; Volume I, Available online: https://www.un.org/esa/dsd/agenda21/Agenda%2021.pdf (accessed on 25 March 2021).
  122. Gibson, R.B. Sustainability assessment: Basic components of a practical approach. Impact Assess. Proj. Apprais. 2006, 24, 170–182. [Google Scholar] [CrossRef]
  123. Hannay, M.; Newvine, T. Perceptions of distance learning: A comparison of online and traditional learning. Merlot J. Online Learn. Teach. 2006, 2, 1–11. [Google Scholar]
  124. Nazarenko, A.L. Blended Learning vs. Traditional Learning: What Works? (A Case Study Research). Procedia-Soc. Behav. Sci. 2015, 200, 77–82. [Google Scholar] [CrossRef] [Green Version]
125. Engum, S.A.; Jeffries, P.; Fisher, L. Intravenous catheter training system: Computer-based education versus traditional learning methods. Am. J. Surg. 2003, 186, 67–74.
126. Ar, A.Y.; Abbas, A. Role of gamification in Engineering Education: A systematic literature review. In Proceedings of the 2021 IEEE Global Engineering Education Conference (EDUCON), Vienna, Austria, 21–23 April 2021; pp. 210–213.
127. Thomas, I. Critical Thinking, Transformative Learning, Sustainable Education, and Problem-Based Learning in Universities. J. Transform. Educ. 2009, 7, 245–264.
128. UEQ. User Experience Questionnaire. Available online: https://www.ueq-online.org/ (accessed on 10 March 2022).
129. Ardito, C.; De Marsico, M.; Lanzilotti, R.; Levialdi, S.; Roselli, T.; Rossano, V.; Tersigni, M. Usability of E-learning tools. In Proceedings of the Working Conference on Advanced Visual Interfaces (AVI '04), Gallipoli, Italy, 25–28 May 2004; Association for Computing Machinery: New York, NY, USA, 2004; pp. 80–84.
130. Beranič, T.; Heričko, M. The Impact of Serious Games in Economic and Business Education: A Case of ERP Business Simulation. Sustainability 2022, 14, 683.
131. Beranič, T.; Heričko, M. Introducing ERP Concepts to IT Students Using an Experiential Learning Approach with an Emphasis on Reflection. Sustainability 2019, 11, 4992.
132. Sternad Zabukovšek, S.; Shah Bharadwaj, S.; Bobek, S.; Štrukelj, T. Technology acceptance model-based research on differences of enterprise resources planning systems use in India and the European Union. Inž. Ekon. 2019, 30, 326–338.
133. Sternad Zabukovšek, S.; Bobek, S.; Štrukelj, T. Employees' attitude toward ERP system's use in Europe and India: Comparing two TAM-based studies. In Cross-Cultural Exposure and Connections: Intercultural Learning for Global Citizenship; Birdie, A.K., Ed.; Apple Academic Press: New York, NY, USA, 2020; pp. 29–69.
134. Rožman, M.; Sternad Zabukovšek, S.; Bobek, S.; Tominc, P. Gender differences in work satisfaction, work engagement and work efficiency of employees during the COVID-19 pandemic: The case in Slovenia. Sustainability 2021, 13, 8791.
135. Davis, F.D. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q. 1989, 13, 319–340.
136. Davis, F.D.; Bagozzi, R.P.; Warshaw, P.R. User acceptance of computer technology: A comparison of two theoretical models. Manag. Sci. 1989, 35, 982–1003.
137. Venkatesh, V.; Morris, M.G.; Davis, G.B.; Davis, F.D. User Acceptance of Information Technology: Toward a Unified View. MIS Q. 2003, 27, 425–478.
Figure 1. Research model.
Figure 2. Structural model results.
Figure 3. Importance–performance map.
Table 1. The definitions of user experience concepts.

Term | Description with Examples
UX principles | Critical factors and basic concepts that underpin the understanding of user experience and that professionals must consider in their work. Example: the hedonic and pragmatic aspects of software application development play an essential role in user experience design, as user experience is temporary.
UX practices | Actions that practitioners need to perform to comply with the user experience principles. Examples: recognising users' personal goals and desires, preparing prototypes, including users in the design process, and assessing software from a hedonic and pragmatic perspective.
UX software | Computer-aided software that developers or designers use to perform a variety of UX practices; it is usually designed to support specific methods and thus allows for more systematic software development. Examples: eye-tracking software, persona preparation, visual design, and prototyping software.
UX techniques | They allow practitioners to select and follow a structure based on best practices, making their work more systematic and therefore more likely to succeed. Examples: questionnaires and surveys, mind maps, cognitive mapping, field research, and design studios.
Table 2. User experience dimensions.

When | What | How
Prior to use | Anticipated user experience | Expectations about the experience.
During use | Momentary user experience | Encountering the experience.
After use | Episodic user experience | Reflecting on the experience.
Past use | Cumulative user experience | Recalling multiple periods of use.
Table 3. Factors and items definition.

Perceived efficiency (EF): the degree to which users can solve tasks without unnecessary effort.
EF1: The e-learning platform works fast.
EF2: The e-learning platform is efficient to use.
EF3: The e-learning platform is practical to use.
EF4: The e-learning platform is organised.

Perceived perspicuity (PE): the degree of ease in getting acquainted with the e-learning platform and learning to use it.
PE1: The e-learning platform is understandable to use.
PE2: The e-learning platform is easy to learn.
PE3: The e-learning platform is easy to use.
PE4: The e-learning platform is transparent.

Perceived dependability (DE): the degree to which the user feels in control of the interaction with the e-learning platform.
DE1: The e-learning platform is predictable.
DE2: The e-learning platform is supportive.
DE3: The e-learning platform is secure.
DE4: The e-learning platform meets expectations.

Perceived stimulation (SI): the degree of excitement and motivation the user perceives when using the e-learning platform.
SI1: The e-learning platform is valuable.
SI2: The e-learning platform is exciting.
SI3: The e-learning platform is interesting.
SI4: The e-learning platform is motivating.

Perceived novelty (NO): the degree to which the e-learning platform is innovative and creative, attracting users' interest.
NO1: The e-learning platform encourages creativity.
NO2: The e-learning platform is inventive.
NO3: The e-learning platform is leading edge.
NO4: The e-learning platform is innovative.

Perceived attractiveness (AT): the degree of the user's general impression of the e-learning platform.
AT1: The e-learning platform is enjoyable to use.
AT2: The e-learning platform is good to use.
AT3: The e-learning platform is pleasing to use.
AT4: The e-learning platform is pleasant to use.
AT5: The e-learning platform is attractive to use.
AT6: The e-learning platform is friendly to use.

Source: Adapted and supplemented according to Laugwitz et al. [48] and Schrepp [49].
Table 4. Descriptive statistics, convergent validity, and discriminant validity.

Construct | Item | Mean | Standard Deviation | Indicator Loading | Indicator Reliability | HTMT CI 2.5% | HTMT CI 97.5%
Perceived efficiency (EF) | EF1 | 5.258 | 1.655 | 0.723 | 0.523 | 0.264 | 0.315
 | EF2 | 5.534 | 1.489 | 0.793 | 0.629 | 0.327 | 0.380
 | EF3 | 5.355 | 1.595 | 0.763 | 0.582 | 0.289 | 0.338
 | EF4 | 5.524 | 1.604 | 0.766 | 0.587 | 0.325 | 0.388
Perceived perspicuity (PE) | PE1 | 5.817 | 1.488 | 0.817 | 0.667 | 0.315 | 0.363
 | PE2 | 5.399 | 1.787 | 0.753 | 0.567 | 0.260 | 0.319
 | PE3 | 5.412 | 1.655 | 0.764 | 0.584 | 0.255 | 0.306
 | PE4 | 5.386 | 1.744 | 0.795 | 0.632 | 0.337 | 0.396
Perceived dependability (DE) | DE2 | 5.483 | 1.613 | 0.751 | 0.564 | 0.435 | 0.534
 | DE3 | 5.551 | 1.628 | 0.747 | 0.558 | 0.325 | 0.400
 | DE4 | 5.395 | 1.573 | 0.820 | 0.672 | 0.413 | 0.483
Perceived stimulation (SI) | SI1 | 4.851 | 1.592 | 0.778 | 0.605 | 0.296 | 0.342
 | SI2 | 4.448 | 1.553 | 0.818 | 0.669 | 0.268 | 0.304
 | SI3 | 4.959 | 1.621 | 0.874 | 0.764 | 0.333 | 0.370
 | SI4 | 4.596 | 1.738 | 0.752 | 0.566 | 0.261 | 0.306
Perceived novelty (NO) | NO1 | 4.642 | 1.786 | 0.857 | 0.734 | 0.579 | 0.676
 | NO2 | 4.647 | 1.794 | 0.820 | 0.672 | 0.517 | 0.614
Perceived attractiveness (AT) | AT1 | 5.546 | 1.516 | 0.758 | 0.575 | 0.185 | 0.208
 | AT2 | 5.615 | 1.583 | 0.789 | 0.623 | 0.199 | 0.226
 | AT3 | 5.143 | 1.522 | 0.799 | 0.638 | 0.184 | 0.210
 | AT4 | 5.452 | 1.448 | 0.821 | 0.674 | 0.200 | 0.221
 | AT5 | 5.006 | 1.594 | 0.825 | 0.681 | 0.203 | 0.227
 | AT6 | 5.223 | 1.546 | 0.830 | 0.689 | 0.199 | 0.226
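In PLS-SEM, the indicator reliability reported in Table 4 is simply the squared outer loading of each item. As a minimal sketch (not the authors' SmartPLS workflow), this can be verified for a few entries taken from the table:

```python
# Indicator reliability = squared outer loading; loadings reproduced
# from a few rows of Table 4.
loadings = {"EF1": 0.723, "PE1": 0.817, "SI3": 0.874, "AT6": 0.830}

# Square each loading and round to three decimals, as in the table.
reliability = {item: round(l ** 2, 3) for item, l in loadings.items()}
print(reliability)  # each value matches the Indicator Reliability column
```

For example, EF1 has a loading of 0.723, so its reliability is 0.723² ≈ 0.523, exactly the value shown in Table 4.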
Table 5. Internal consistency reliability and convergent validity.

Construct | Cronbach's Alpha (0.60–0.95) | Composite Reliability (0.60–0.95) | AVE (>0.50)
Perceived efficiency (EF) | 0.759 | 0.847 | 0.580
Perceived perspicuity (PE) | 0.790 | 0.863 | 0.613
Perceived dependability (DE) | 0.667 | 0.817 | 0.598
Perceived stimulation (SI) | 0.820 | 0.881 | 0.651
Perceived novelty (NO) | 0.579 | 0.826 | 0.703
Perceived attractiveness (AT) | 0.891 | 0.916 | 0.647
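The AVE and composite reliability values in Table 5 follow from the outer loadings in Table 4 via the standard formulas: AVE is the mean squared loading, and composite reliability is (Σλ)² / ((Σλ)² + Σ(1 − λ²)). A sketch for Perceived efficiency (EF), using the loadings from Table 4 rather than the authors' own computation script:

```python
# Outer loadings of EF1-EF4 from Table 4.
ef_loadings = [0.723, 0.793, 0.763, 0.766]

# Average variance extracted: mean of the squared loadings.
ave = sum(l ** 2 for l in ef_loadings) / len(ef_loadings)

# Composite reliability: (sum of loadings)^2 over itself plus the
# summed error variances (1 - loading^2).
cr = sum(ef_loadings) ** 2 / (
    sum(ef_loadings) ** 2 + sum(1 - l ** 2 for l in ef_loadings)
)
print(round(ave, 3), round(cr, 3))  # reproduces the EF row of Table 5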
Table 6. Fornell–Larcker values.

 | EF | PE | DE | SI | NO | AT
Perceived efficiency (EF) | 0.762 | | | | |
Perceived perspicuity (PE) | 0.752 | 0.783 | | | |
Perceived dependability (DE) | 0.739 | 0.676 | 0.774 | | |
Perceived stimulation (SI) | 0.633 | 0.579 | 0.584 | 0.807 | |
Perceived novelty (NO) | 0.491 | 0.431 | 0.447 | 0.659 | 0.839 |
Perceived attractiveness (AT) | 0.744 | 0.753 | 0.722 | 0.786 | 0.619 | 0.804
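The Fornell–Larcker criterion requires that the square root of each construct's AVE (the diagonal of Table 6) exceed the construct's correlations with all other constructs. A minimal check, using the AVE values from Table 5 and the off-diagonal correlations from Table 6:

```python
import math

# AVE per construct (Table 5) and inter-construct correlations (Table 6).
ave = {"EF": 0.580, "PE": 0.613, "DE": 0.598,
       "SI": 0.651, "NO": 0.703, "AT": 0.647}
correlations = {
    ("PE", "EF"): 0.752, ("DE", "EF"): 0.739, ("DE", "PE"): 0.676,
    ("SI", "EF"): 0.633, ("SI", "PE"): 0.579, ("SI", "DE"): 0.584,
    ("NO", "EF"): 0.491, ("NO", "PE"): 0.431, ("NO", "DE"): 0.447,
    ("NO", "SI"): 0.659, ("AT", "EF"): 0.744, ("AT", "PE"): 0.753,
    ("AT", "DE"): 0.722, ("AT", "SI"): 0.786, ("AT", "NO"): 0.619,
}

# The diagonal of Table 6 is sqrt(AVE); discriminant validity holds when
# every correlation is below both constructs' diagonal entries.
sqrt_ave = {c: math.sqrt(v) for c, v in ave.items()}
ok = all(r < min(sqrt_ave[a], sqrt_ave[b])
         for (a, b), r in correlations.items())
print(ok)  # True: the criterion is met for all construct pairs
```

The tightest comparison is PE–EF (0.752 against √0.580 ≈ 0.762), which still satisfies the criterion.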
Table 7. Explained variance, effect size, and predictive relevance values.

Constructs | Explained Variance, R² (weak ≥0.25; moderate ≥0.50; substantial ≥0.75) | Effect Size, f² (small ≥0.02; medium ≥0.15; high ≥0.35) | Predictive Relevance, Q² (small >0.00; medium ≥0.25; large ≥0.50)
Perceived efficiency (EF) | | 0.104 |
Perceived perspicuity (PE) | | 0.104 |
Perceived dependability (DE) | | 0.036 |
Perceived stimulation (SI) | | 0.259 |
Perceived novelty (NO) | | 0.035 |
Perceived attractiveness (AT) | 0.805 | | 0.516
Table 8. Path coefficient analysis.

Constructs' Impact on Perceived Attractiveness (AT) | Path Coefficient (β, ≥0.10) | t-Statistic (≥1.96) | p-Value | 95% Confidence Interval [2.5%, 97.5%] | Significant (p < 0.05), Yes/No
Perceived efficiency (EF) | 0.254 | 7.533 | 0.000 | [0.187, 0.321] | Yes
Perceived perspicuity (PE) | 0.226 | 6.279 | 0.000 | [0.156, 0.299] | Yes
Perceived dependability (DE) | 0.130 | 4.375 | 0.000 | [0.071, 0.191] | Yes
Perceived stimulation (SI) | 0.346 | 11.358 | 0.000 | [0.285, 0.405] | Yes
Perceived novelty (NO) | 0.110 | 4.557 | 0.000 | [0.065, 0.158] | Yes
Effect strength by p-value: p < 0.001 strong; p < 0.01 moderate; p < 0.05 weak; p ≥ 0.05 no effect.
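The significance decision behind Table 8 can be restated mechanically: a path is supported when its t-statistic reaches the 1.96 threshold and its 95% bootstrap confidence interval excludes zero. A sketch over the reported values (not the authors' bootstrapping procedure itself):

```python
# (beta, t-statistic, 95% CI) per path, reproduced from Table 8.
paths = {
    "EF": (0.254, 7.533, (0.187, 0.321)),
    "PE": (0.226, 6.279, (0.156, 0.299)),
    "DE": (0.130, 4.375, (0.071, 0.191)),
    "SI": (0.346, 11.358, (0.285, 0.405)),
    "NO": (0.110, 4.557, (0.065, 0.158)),
}

# Significant when |t| >= 1.96 and the CI does not contain zero.
results = {
    name: abs(t) >= 1.96 and not (lo <= 0.0 <= hi)
    for name, (beta, t, (lo, hi)) in paths.items()
}
print(results)  # all five paths come out significant, as in Table 8
```

All five hypothesised paths pass both checks, matching the "Yes" column of Table 8.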
Table 9. Importance–performance analysis for construct Perceived attractiveness (AT).

Construct | Importance | Performance
Perceived efficiency (EF) | 0.254 | 73.847
Perceived perspicuity (PE) | 0.226 | 75.372
Perceived dependability (DE) | 0.130 | 74.509
Perceived stimulation (SI) | 0.346 | 62.164
Perceived novelty (NO) | 0.110 | 60.742
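In an importance–performance map, the constructs that matter most for management are those with above-average importance but below-average performance. Applying that simple rule to the values in Table 9 (as a sketch of the IPMA reading, not the authors' SmartPLS output):

```python
# (importance, performance) per construct, reproduced from Table 9.
ipma = {
    "EF": (0.254, 73.847), "PE": (0.226, 75.372), "DE": (0.130, 74.509),
    "SI": (0.346, 62.164), "NO": (0.110, 60.742),
}

mean_imp = sum(i for i, _ in ipma.values()) / len(ipma)
mean_perf = sum(p for _, p in ipma.values()) / len(ipma)

# Improvement priority: high importance, low performance quadrant.
priorities = [c for c, (i, p) in ipma.items()
              if i > mean_imp and p < mean_perf]
print(priorities)  # ['SI']
```

Only Perceived stimulation (SI) falls into that quadrant, which is consistent with the paper's conclusion that stimulation deserves the most attention when improving e-learning platforms.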
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Sternad Zabukovšek, S.; Deželak, Z.; Parusheva, S.; Bobek, S. Attractiveness of Collaborative Platforms for Sustainable E-Learning in Business Studies. Sustainability 2022, 14, 8257. https://doi.org/10.3390/su14148257