Article

Enhancing Sustainability of E-Learning with Adoption of M-Learning in Business Studies

by Silvia Parusheva 1,*, Irena Sisovska Klancnik 2, Samo Bobek 2 and Simona Sternad Zabukovsek 2

1 Department of Computer Science, University of Economics—Varna, 9002 Varna, Bulgaria
2 Department of E-Business, Faculty of Economics and Business, University of Maribor, 2000 Maribor, Slovenia
* Author to whom correspondence should be addressed.
Sustainability 2025, 17(8), 3487; https://doi.org/10.3390/su17083487
Submission received: 5 March 2025 / Revised: 27 March 2025 / Accepted: 3 April 2025 / Published: 14 April 2025
(This article belongs to the Special Issue Sustainable Higher Education: From E-learning to Smart Education)

Abstract
The rapid development of information technologies has significantly transformed education, making digital tools and e-learning platforms essential for modern learning processes. This study examines mobile learning (m-learning) adoption among higher education students, emphasising the importance of user acceptance for implementing mobile technologies in education. Using a research model based on the UTAUT framework and analysed via PLS–SEM, the study investigates business students’ behavioural intention to adopt m-learning platforms (in our case, Microsoft Teams), offering valuable insights into their efficacy and long-term viability, with an explicit focus on the use of mobile devices and the mobile version of the Microsoft Teams application for educational purposes. The findings indicate that all four examined factors significantly influence students’ behavioural intention to adopt m-learning applications, with “performance expectancy” having the most substantial impact. The survey highlights the sustained importance of m-learning even after the pandemic. These results reinforce that students’ expectations of improved performance play a crucial role in their adoption of mobile apps for m-learning. The findings also suggest that increasing adoption rates requires improving system usability, minimising reliance on external support through intuitive design and training, and leveraging peer influence to enhance engagement in m-learning environments.

1. Introduction

The changes in business and education brought about by information technologies in recent years have been intense, rapid, and irreversible. Education is undoubtedly one of the areas most affected by technological change. Learning and training processes are currently impossible without the full support of digital methods and tools. Pupils and students from generations with a strong propensity to use digital devices across their entire spectrum (laptops, mobile phones, tablets, wearables, etc.) are particularly receptive to the increased use of technology in learning. While traditional learning from paper textbooks remains fundamental for many learners, digital materials have increased dramatically in recent years and often complement traditional methods [1,2,3,4].
One firmly established trend in recent years is the support of students’ learning and training with e-learning platforms. Their importance and use are increasing, and the pandemic, during which normal in-person learning was suspended and moved entirely to a remote electronic environment, became an additional serious stimulus for these processes. Information technologies and e-learning platforms are irreversibly transforming education, offering further opportunities in multiple respects, both of a pedagogical nature and from the point of view of improving access to quality education [5].
E-learning platforms give the two participating parties, learners and trainers, a way to deliver and share knowledge in an online environment quickly and efficiently, as the parties in electronic learning can be geographically distant. These platforms are already established, universal and effective environments for managing educational activities and resources.
The success and functionality of e-learning platforms depend on the technology used and on its adoption. Mobile technologies’ functionalities look promising when applied to e-learning platforms, and their success within e-learning depends, to a great extent, on their adoption. A significant issue in adoption is the acceptance of m-learning by the stakeholders involved, learners and teachers. Because of the many people involved in m-learning and mobile technologies, acceptance of m-learning can be a significant issue. An even bigger issue is the question of whether m-learning platforms can improve learning sustainability, given the additional goals achievable through m-learning. This study aims to explore the adoption of m-learning among higher education students based on Unified Theory of Acceptance and Use of Technology (UTAUT) constructs.
In this study, mobile learning (m-learning) refers to the use of mobile devices such as smartphones or tablets to access educational content and to collaborate through mobile applications. This includes using the Microsoft Teams mobile application for tasks such as communication, resource sharing, group work, and attending online lectures. This mobile form of e-learning enables students to engage in learning activities anytime and anywhere, enhancing flexibility and accessibility. To examine students’ acceptance of m-learning applications, this study adopts the Unified Theory of Acceptance and Use of Technology (UTAUT) developed by Venkatesh et al. (2003) [6]. The UTAUT model synthesises elements from eight earlier technology acceptance models and provides a comprehensive framework to explain user intentions and subsequent technology use. It has been widely applied in studies on educational technologies, including mobile learning systems, due to its strong predictive power and conceptual clarity [6,7,8]. Given the focus of our research on adopting the Microsoft Teams mobile application in higher education, the UTAUT model is particularly suitable for identifying key factors influencing students’ behavioural intention to use m-learning platforms.
Although the advantages of m-learning are widely acknowledged, effective adoption depends largely on users’ acceptance and sustained engagement. Previous research has primarily focused on the technical features and benefits of m-learning platforms, while relatively few studies have investigated the behavioural drivers behind students’ adoption of mobile learning, especially within the context of business education [7,8]. There remains a gap in the literature concerning how mobile learning platforms, such as Microsoft Teams, are accepted and used in higher education beyond the pandemic, and what factors most strongly influence students’ intention to adopt them over time. To address this gap, the present study applies the UTAUT model to explore the key determinants of m-learning adoption among business students using the Microsoft Teams mobile application.
The rest of the article presents a research study conducted based on the UTAUT framework, in which Partial Least Squares–Structural Equation Modelling (PLS–SEM) was used to analyse the field study and its results. The article ends with an explanation of the findings and a discussion.
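The structural relationship examined in this study, four UTAUT predictor constructs driving behavioural intention, can be sketched with a small simulation. The following is an illustrative, hypothetical example only: it uses simulated composite scores and a plain least-squares regression as a stand-in, not the study’s actual PLS–SEM estimation or data, and the coefficients, noise level, and sample size are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 300  # hypothetical sample size

# Simulated standardised composite scores for the four UTAUT predictors
PE = rng.normal(size=n)  # performance expectancy
EE = rng.normal(size=n)  # effort expectancy
SI = rng.normal(size=n)  # social influence
FC = rng.normal(size=n)  # facilitating conditions

# Behavioural intention generated with PE carrying the largest weight,
# mirroring the study's headline finding; weights are purely illustrative
BI = 0.45 * PE + 0.20 * EE + 0.15 * SI + 0.10 * FC + rng.normal(scale=0.5, size=n)

# Estimate path-like coefficients with ordinary least squares
X = np.column_stack([PE, EE, SI, FC])
beta, *_ = np.linalg.lstsq(X, BI, rcond=None)

for name, b in zip(["PE", "EE", "SI", "FC"], beta):
    print(f"{name}: {b:+.2f}")
```

In a real PLS–SEM analysis, the composite scores would themselves be estimated iteratively from the survey indicators rather than simulated, but the final structural step, relating predictor composites to behavioural intention, follows the same regression logic shown here.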

2. Theoretical Background

2.1. E-Learning Platforms and Sustainability

New information and communication technologies (ICTs) are transforming educational practices and enabling new ways of teaching and learning. However, research shows that technology alone does not improve learning outcomes unless integrated with appropriate pedagogical strategies [9,10]. Researchers point out that the possibility of remote online learning can be defined as a cornerstone of modern educational processes [11]. Implementing online learning with support through e-learning platforms ensures the sustainability of educational methods in an unstable social environment, including under the negative impact of a pandemic, political or economic instability, and even a military situation.
The global COVID-19 pandemic can be seen as a seminal, universal test of the role of e-learning platforms in education. Specifically, in higher education institutions, after the announcement of the termination of face-to-face classes at universities, the learning process continued in an online environment within a week.
Some authors have examined the contribution of e-learning platforms to learning and its sustainability, including in terms of the use of specific platforms. Regarding Moodle, one of the most widely used platforms worldwide, Bentaa et al. point out that it provides trainers, learners and administrators with a single robust, secure and integrated system [12]. Researchers have also directed their studies towards analysing free e-learning platforms in order to evaluate their quality [13].

2.1.1. E-Learning Platforms

In recent years, Higher Education Institutions (HEIs) have increasingly integrated e-learning platforms into their learning processes to meet the dynamic needs of students and adapt to changing trends in the educational field. Authors define these tools as vital for higher education [14]. In traditional education, e-learning platforms are complementary and, at the same time, an indispensable and essential part of educational processes. In distance learning, they can already be regarded as the primary tool for teaching and learning, since this form of training relies mainly, and often exclusively, on electronic methods. In both traditional and distance learning, e-learning platforms are the primary route via which educators deliver electronic learning materials, activities, and resources to students. Learners can use them for self-study, allowing them to learn at their own convenience and pace [14,15].
The changes brought about by e-learning and related platforms can be examined with regard to the following questions: where, for whom, when, and how e-learning occurs. Building on the research done by Wassermann and Fisher [16], questions and answers regarding the impact of e-learning on educational environments are presented in Figure 1.
Where—independence from location: through online access to e-learning platforms, restrictions related to learners’ physical location are removed. Learners can take part in e-learning at home, on the go, at university, in co-working spaces, while doing other activities, etc.
Who—accessibility for a broader range of learners: e-learning enables extended inclusion of learners who would otherwise be limited by the conditions of the traditional classroom. This concerns learners with hearing, visual or motor impairments. Built-in accessibility features (large text, text-to-voice, etc.) on mobile phones and tablets are also helpful for learners with such needs.
When—independence in time: training through the e-learning platform can be conducted according to the learner’s capabilities, tasks and available schedule, continuously, twenty-four hours a day and regardless of the time zone, an essential effect of the asynchronous learning available on the platform. Authors also highlight the accessibility of microlearning [16,17], as the e-learning platform can be used effectively even for short periods.
How—enhanced technological and design capabilities: the technologies applied on the e-learning platforms, as well as the capabilities of the devices used by the learners (smartphones, tablets, laptops, etc.), enrich the learning process by supplementing it with new methodological approaches and new delivery of educational content using advanced digital tools. Online courses on e-platforms include rich digital content with various multimedia formats, interactive activities, and resources, including interactive games, quizzes, videos, audio clips, virtual reality and augmented reality content, etc. Learning through e-learning platforms is based on the intensive application of IT, which refers to technology-enhanced learning or technology-enabled and enhanced training [18,19].
The answers to the above questions point to the reasons e-learning platforms have become a mandatory and permanent element of the educational environment of every HEI worldwide.

2.1.2. Sustainability Issues and E-Learning Platforms

A crucial question about e-learning platforms is how much they help sustain the learning process over time. The 24/7 availability of e-learning platforms from any place with an Internet connection makes it easier for students to learn not only anytime but anywhere, whenever it is convenient for them and wherever they have access to learning resources [20].
Flexibility is another essential quality of e-learning platforms that significantly contributes to learning sustainability [21,22,23]. This quality is related, on the one hand, to their accessibility in terms of place and time, as students can access electronic learning materials and complete assignments anytime, anywhere. On the other hand, e-learning platforms can adapt to students’ learning pace—in their environment, students can learn at their own pace [24]. Flexibility is also shown in the types of learning styles supported: different learning options are offered, such as reading, watching videos, listening to audio, following links uploaded to the platform by the trainers, and gaining practical experience by solving tasks provided by the trainers. Another manifestation of flexibility is the possibility of using different devices to access e-learning platforms. Students can access the platforms from various devices, such as computers, tablets, and smartphones, including via mobile devices, i.e., m-learning [25]. This important feature will be considered in a broader context in the present study.
Learning sustainability has been associated with the forms and interactivity tools offered by e-learning platforms in recent years. The sustainability of learning relies on interactivity, as this enables effective e-learning that engages learners and encourages their active participation. On the one hand, e-learning platforms offer various synchronous and asynchronous communication tools, including discussion forums (providing asynchronous text-based interaction between learners and teachers or between peers); real-time chats (allowing synchronous and text-based interaction for instant communication); links to different video conferencing software, such as Zoom, Microsoft Teams, Google Meet, etc.; obtaining feedback from learners by creating a survey for participants, etc. [26].
On the other hand, e-learning platforms have built-in collaboration tools, which refer to the sharing and exchange of electronic learning materials, including files in different formats, presentations, assignments and other resources; the possibility of assigning group tasks (with their implementation, students work together on projects and tasks, thus promoting teamwork and coordination), etc. [26,27].
In addition, the assessment and self-assessment tools available on e-learning platforms should be noted. These tools include online tests, for which lecturers can provide feedback and return online grades with comments on student work. Through the options for polls and surveys included on e-learning platforms, students can give feedback and evaluate the educational process in order to improve it. In addition, students can be given access to perform self-training and self-assessment tests, using this approach to allow them to receive information about their performance and, based on their results, to set goals for improvement.
Integrating game elements into learning can make education more interactive and sustainable. Students experience a more engaging and enjoyable learning environment through the incorporation of games and simulations, which helps boost their motivation [28]. Gamification tools and components play a key role in this approach, encouraging active participation and long-term interest. Another gamification approach involves awarding points and badges [29], recognising learners’ accomplishments, and promoting a sense of competition while upholding fairness and integrity. In the same context, various types of leaderboards are also implemented, through which learners’ performance is compared with that of their peers, stimulating their efforts to perform better.
Another contribution of e-learning platforms relates to their personalisation capabilities, which can manifest in several directions. On the one hand, it is possible to achieve adaptive learning, as the platform uses data on learners’ performance and, based on these data, personalises the learning materials and tasks [30]. Another feature is the platform’s flexibility regarding students’ learning styles—the platform can adjust to students’ learning style preferences, e.g., visual, auditory, or kinaesthetic [30,31]. Personalisation can also be provided with regard to learners’ interests: based on interests established from their searches or views, courses that match their profile and would interest them can be offered. It is also possible through the platform to create student-centred activities (also referred to as active learning), via which learners’ attention is kept and they gain an entirely new perception of the educational content offered to them [32].
Lastly, e-learning platforms strongly support learning sustainability by maintaining effectiveness [33]. They contribute to more efficient learning of materials by allowing students to review the materials repeatedly, practice, perform tests to verify and self-check acquired knowledge, and receive teacher feedback [34].

2.2. Sustainable Mobility in E-Learning Platforms

When considering sustainability in teaching and learning via e-learning, a direct connection can be sought with mobile access and the mobile features of e-learning platforms. On the one hand, the importance of mobile technologies among the available access channels should be emphasised. On the other hand, the mobile features of e-learning platforms are a subject of research interest.
Researchers emphasise the importance of mobile learning for the digital transformation of learning and teaching, noting that it has become particularly popular in recent years due to the affinity for and propensity of young people towards mobile technologies and the ubiquity of mobile devices [35].
Using a mobile channel to access e-learning platforms and mobile learning generally reinforces and highlights the strengths of implementing e-learning using online platforms. This is true for accessibility—it allows learners to access the platform and learn from any location relying on their mobile device [36]. In addition, an increase in learner motivation and engagement is reported due to the interactive and personalised elements of mobile e-learning applications, gamification elements and possible immediate feedback mechanisms [37]. A significant emphasis should be placed on the convenience of learning on the go and using free time intervals [38].

2.2.1. Mobile Technologies in Channels

The development of ICT in recent years has been marked by the dramatically rapid deployment of mobile technologies and their substantial impact on e-learning.
Available e-learning channels include a wide range of options, presented in Figure 2. They can consist of the standard broadband Internet connection, Wi-Fi and e-mail communications (including sending study materials and notifications, receiving assignments and returning completed assignments and feedback, and correspondence with teachers and classmates), the mobile channel, various types of social media, including social networking sites and especially chat and messaging platforms, as well as forums and discussion platforms, and other channels.
Mobile applications (Apps) are currently an established and very popular channel for accessing and using educational services and content through e-learning platforms. Researchers point out that mobile technologies in learning and teaching can foster a positive atmosphere in higher educational institutions [39]. E-learning contributes by using mobile technologies to increase student engagement, allowing students to learn conveniently and combine learning with other duties [40]. Mobile channel-based e-learning offers personalised features through which learning materials can be adapted according to learners’ needs and preferences [41].
Mobile devices facilitate collaboration and communication between learners through various social and communication platforms. A study by Gikas and Grant found that mobile technology supports interaction between students and their teachers by facilitating the sharing of ideas and resources [42]. Other authors also emphasise that one of the main benefits of the m-learning concept is its support for collaborative learning [43].
At the same time, scholars point out challenges regarding mobile access to e-learning platforms and applications. First, there are some technical limitations, mainly related to the smaller screens and lower performance of mobile devices compared to computers, which can affect the effectiveness of learning [44], as well as the heterogeneity of devices [45]. Research shows that the small screens and possibly limited resolution of mobile devices can make it difficult to read long texts and view complex graphs and charts; the clarity of information organisation, reading time, and the user’s ability to remember the information are also affected [46]. The potential for distraction from other activities instead of learning when using mobile devices is another aspect seen as a challenge by researchers [45]. There are also potential security and safety concerns and privacy issues when using mobile devices for learning [47].
Additionally, there are the up-to-date and actively used modern channels for young people, such as social networking sites (e.g., Facebook groups for discussion and resource sharing, LinkedIn for creating professional networking and training, and Twitter for short messages and news) [48]. Attention should also be paid to other social media representatives distributing educational content for learners—blogs, content communities (such as YouTube), multimedia platforms, etc. [38,49,50].
A special emphasis is placed on using chat and messaging platforms and applications, which are particularly popular among learners. Authors have explored how social media, including, e.g., Telegram, influence students’ academic engagement and performance and highlight that social media use is associated with increased engagement and improved student performance in the learning process [51]. The use of Telegram as a learning resource channel and the Slack app as a collaboration and communication tool for coordinating team tasks in the educational process has become a preferred strategy for learning and communication [52,53,54].
The creation of virtual classrooms and the realisation of video conferencing meetings with the application of video conferencing software is also a channel of marked importance in the context of synchronous communication between trainers and learners, especially in the context of distance learning. Applications such as Zoom, Microsoft Teams, and Google Meet are indispensable communication and collaboration tools [55,56]. In the pandemic, learning in the virtual classroom environment was the only possible solution for implementing teaching and learning processes while respecting individuals’ isolation requirements.
Forums and discussion platforms are also identified as leading asynchronous channels for facilitating interaction between students and teachers, stimulating critical thinking and fostering in learners a sense of belonging to a community. On the one hand, these channels create an environment for active interaction between the two parties in the process, trainees and trainers, and, on the other hand, between trainees themselves. Through these platforms, students can ask questions, receive answers, discuss and give opinions on different issues. This leads them to a deeper understanding of the essence of the learning material and avoids a formal attitude in the learning process. Conversely, forums can provide the necessary time for students to think and formulate the right questions or answers. Teachers can gather impressions from learners and intervene to support learning activities [57]. In addition, in examining the impact of various factors on learners’ success in e-learning, researchers have found that success is influenced by interactions with instructors and fellow students [58,59].
Examining the presented channels, it can be summarised that some channels have fallen behind and are currently used more as an exception than as regular practice, e.g., the e-mail channel. On the other hand, the mobile channel takes precedence and plays a leading role in users’ access to electronic educational services and the systems, platforms, and applications that support them.
Based on the mobile technologies learners use through their various mobile devices (tablets, smartphones, phablets, PDAs, e-book readers), the mobile channel can mediate the other channels for access to learning resources and activities. The channels above for access to e-learning, such as social media, and specifically social networking sites, chat and messaging platforms, forums and discussion platforms, are often used by learners based on mobile technologies and the mobile channel. Therefore, we emphasise its leading role and popularity.
Despite the numerous advantages of mobile access to e-learning platforms, several studies have identified essential limitations and challenges. These include the small screen size, which can complicate reading long texts or interpreting complex graphs and charts [44], and limited processing power and resolution, which can reduce the effectiveness of interaction with learning materials [46]. Moreover, the potential for distraction is a significant concern, as learners may switch to unrelated content or activities while using mobile devices [47]. These negative aspects should be considered when designing mobile learning environments to ensure effectiveness and user engagement.

2.2.2. Mobile Features of E-Learning Platforms

Mobile communications influence the psychology of learners and create expectations that the educational process should be as accessible and flexible as all other routine activities of everyday life. Young people want things to happen quickly, “here and now”, and the possibilities of mobile technologies and mobile devices seriously fuel this typical impatience. The mastery of fast communications makes it necessary to offer educational functions and activities in a quick, accessible, and flexible way, which justifies the need to establish and expand the mobile functions of e-platforms so that they find wider acceptance and use.
By optimising the learning experience for smartphones, tablets, phablets and other mobile devices, e-learning platforms can enable users to learn anytime, anywhere. Specific mobile features that are transforming the e-learning environment include the following.
Accessibility and offline functionality—a fundamental principle supported by m-learning is accessibility. Through this, learners can seamlessly access course materials, lectures, and assessments on their mobile devices. This opportunity is valuable for students who are working at the same time or those who have limited internet connectivity. Mobile apps allow learners to download learning materials and access them later offline, thus ensuring continuous learning, including while commuting or travelling [60].
Interactive Functionalities and Gamification—based on the capabilities of mobile devices, interactive features can be included in m-learning applications so that students can turn from passive to active learners. This can be realised through various interactive elements, including tests, quizzes, and gamification tools, like badges, points, leaderboards, etc. [15,61].
Students’ collaboration and social learning—thanks to the capabilities of mobile devices, features like discussion forums, chat rooms, and group projects can be integrated [62,63]. These features enable learners to stay connected with their colleagues, share knowledge and experience, discuss problematic assignments, and enrich their experience.
Learner engagement through push notifications—e-learning based on the mobile channel can be significantly more effective and proceed according to plan thanks to push notifications, which help keep learners engaged [64]. Learners are informed in time via their mobile devices by receiving reminders of approaching deadlines for assignments or tests, new course materials or essential announcements, so they can react promptly to important tasks, events, or meetings they must not miss. It is also possible to send personalised messages based on learner preferences and behaviour. Receiving customised advice to improve performance based on previous results, along with suggestions for additional resources or exercises to help students better understand the material, positively affects their final performance on the course.
Microlearning—m-learning excels in implementing microlearning, which provides information in succinct and easily assimilable segments [65,66]. Bite-sized content caters to shorter attention spans and allows learners to take advantage of learning opportunities during breaks or commutes. In addition, mobile applications can augment this process by incorporating features such as gamification and spaced repetition, thereby enhancing user engagement and facilitating knowledge retention.
Integrating these mobile features allows e-learning platforms to make learning convenient, engaging, and tailored to individual needs. This enables learners to take control of their education and maximise their potential, no matter where or how busy their schedules may be. The following is a more detailed description of the MS Teams app as an example of an m-learning platform.

2.3. Microsoft Teams

As mentioned earlier, the use of different mobile devices and applications is increasing across all sectors of today’s digital society. This trend is also evident in educational institutions and the educational process due to the prevalence of online teaching and learning, which has become widespread in almost all universities worldwide due to the COVID-19 pandemic. Online education is a form of distance learning without attending a physical institution, where students and teachers communicate via the Internet [67]. In the literature, various terms are used interchangeably with online education, such as computer-assisted training, web-based training, internet-based training, distance learning, e-learning (electronic learning), m-learning (mobile learning), computer-supported distance education, etc. [68]. E-learning uses information and communication technology (ICT) to enhance and support the teaching and learning process [69].
At the University of Maribor, Faculty of Economics and Business, e-learning took place remotely with the onset of the COVID-19 pandemic, specifically from March 2020 to March 2022. During this period, the education process transitioned entirely to remote (online) learning and later adopted a hybrid approach, combining remote and in-person teaching at the faculty. The tool used by professors and students at that time for learning was MS Teams. For this reason, we have included this app in the research.
MS Teams is an essential and transformative tool for remote work and virtual team management. It offers a wide range of functionalities that enhance collaboration and communication. Its integration with Microsoft 365 ensures seamless functionality. The platform adapts to the evolving needs of modern workplaces [70]. The following is a brief description of the tool and its key functionalities.

2.3.1. Definition

MS Teams is a tool for collaboration that, in one place, brings together different functionalities, such as chat, meetings, calls, content, files, other Office 365 apps, third-party apps, and custom apps. According to Microsoft, the definition of a team, which is the basis of Teams, is “a collection of people, content, and tools that bring together different projects and results within an organisation” [71].
Teams is a tool that offers extensive capabilities: a communication platform that lets team members communicate through voice and video calls, content-sharing features, and the integration of applications that teams and their members need to succeed in their work [72]. The Teams application is available as a desktop version that can be installed on a computer, an online version that can be accessed from anywhere through an Internet connection, and a mobile version adapted for smartphones. The mobile version of Teams is available for iOS and Android; a Windows Phone version previously existed but was discontinued in July 2018 [73].

2.3.2. Features

Teams is a cloud-based digital application hub that combines conversations, meetings, files, and applications in a single Learning Management System [74]. Although the term app (short for application) can refer to software for any hardware platform, it most commonly describes applications for mobile devices, such as smartphones and tablets [75]. The key functionalities or components of the Teams mobile app are as follows [76]:
  • Teams and channels. Teams comprise channels that form a place for communication between the users.
  • Conversations within channels and teams. All team members can see and add different posts in the general channel as a form of conversation. They can also use the @ (Mention) function to invite other members to discussions.
  • Chat. The primary function of chat in most applications is collaboration or conversation, which can occur between teams, groups, and individuals.
  • SharePoint file storage. Each team using Teams has a site created in SharePoint Online that contains a default document library folder. All files shared across all conversations are automatically saved in this folder. Security with permissions options can also be customised for confidential data.
  • Online video calls and screen sharing. Setting up video calls with users inside or outside the organisation is possible. Good video call performance is the basis for effective collaboration. In addition, fast sharing or desktop sharing is possible when multiple users collaborate in real time.
  • Online meetings. This feature helps to improve communication, extend meetings, and even train and teach through online meetings, which can involve up to 10,000 users. Online meetings can involve users from outside or inside the organisation. Additional functionalities of online meetings include assistance with scheduling and timing, a note-taking application, uploading files, and chatting with participants during the meeting.
  • Audio conference. With audio conferencing, all members can join an online meeting via phone. Using a dial-in number, even users on the move can participate without needing the Internet. It should be noted that this functionality is only available with a specific type of license.
  • Microsoft 365 Business Voice. Microsoft 365 Business Voice can completely replace a company’s or organisation’s telephone system. Again, it should be noted that the functionality is only available with a specific type of license.
From all the above, we can conclude that the key functionalities of Teams are as follows:
  • Seamless communication—the MS Teams platform supports audio and video calls, live chat, and real-time conversations, facilitating effective and efficient communication among team members regardless of their physical location [77];
  • Document collaboration—users can share documents and collaborate on them in real time while leveraging built-in integrations with Microsoft Office applications such as Word and PowerPoint, which significantly enhances the collaborative experience [78];
  • Meeting flexibility—the MS Teams application supports various meeting types, from spontaneous gatherings to scheduled appointments, enabling both formal and informal discussions and accommodating diverse collaboration styles to fit team needs [79].
Remote participation in various educational and organisational processes has become a defining trend in today’s digital environment. While it offers notable benefits, such as increased flexibility and productivity, it also presents challenges related to communication, team cohesion, and participant well-being. Addressing these challenges requires institutions to adopt new strategies for managing virtual collaboration effectively.
Microsoft Teams plays a pivotal role in facilitating remote work and learning in organisations, including universities. Its integrated features—such as chat, video conferencing, shared workspaces, and collaborative tools—enable students and faculty to maintain consistent interaction and manage group work efficiently, regardless of physical location. These capabilities help overcome many of the common barriers associated with remote learning environments and contribute to more inclusive, accessible, and effective digital education practices. The benefits of using Teams for study and work include the following:
  • Enhanced productivity—the platform’s integrated features are designed to streamline and optimise workflows, enabling teams to operate efficiently while maintaining the organisational structure needed for successful collaboration in a remote work environment [77];
  • Accessibility—MS Teams enables seamless connectivity, allowing individuals to collaborate from any location. This is especially useful for teams spread across different regions or countries, as the platform helps overcome traditional barriers to communication and teamwork [79];
  • Educational applications—in academic environments, MS Teams supports structured learning experiences by providing organised materials and assignments, contributing to the effectiveness and overall quality of online education in an era when digital learning has become increasingly prevalent [78].
Despite the many benefits offered by Microsoft Teams, several challenges have been identified in both professional and educational settings. One of the most commonly reported issues is related to information overload—students and staff can become overwhelmed by the constant flow of messages, notifications, and tasks distributed across multiple channels [80]. Another challenge relates to the cognitive load and fatigue associated with prolonged use of video conferencing tools, sometimes referred to as “Zoom fatigue” or “video conferencing fatigue” [81]. These issues can affect student attention, motivation, and well-being during extended remote sessions. Additionally, there can be technical and usability barriers, especially for less digitally literate users. In some cases, students report confusion due to the complex interface, overlapping features, and inconsistent organisation of Teams’ functionalities [7,46]. These limitations suggest the need for clear training, simplified user interfaces, and balanced workloads to support the effective and sustainable use of MS Teams in educational environments.

2.3.3. Usage

According to data from Statista [82], the highest number of Microsoft Teams mobile app downloads was recorded in Q2 2020, with around 70.83 million. This surge can be attributed to the global emergence of the coronavirus pandemic, which disrupted traditional face-to-face interactions and compelled organisations to adopt communication solutions suited to the newly established remote working and studying environment. In comparison, downloads were much lower before that, with about 27 million in Q1 2020 and only 6 million in Q4 2019. After Q2 2020, the number of downloads steadily decreased and later stabilised at between 25 and 35 million per quarter.
During the last quarter of 2023, Microsoft Teams was the most popular mobile app published under the Microsoft 365 platform, with approximately 27.31 million downloads globally. The Microsoft Authenticator app followed with 23.6 million downloads, and the Microsoft Word app recorded 17 million downloads. Other frequently downloaded Microsoft applications included Outlook, Microsoft 365 (Office), Microsoft Edge: AI browser, Excel, PowerPoint, and OneDrive [83].

2.3.4. Copilot in Microsoft Teams

As indicated, Microsoft Teams is essential for collaboration and communication in many organisations. Recent developments include the integration of artificial intelligence (AI) into the app. With Copilot, Teams has improved productivity and efficiency in meetings. Copilot is an AI assistant that helps capture key points, action items, and outcomes, making it easier for users to stay organised and informed.
One of the primary features of Copilot in Teams is its ability to summarise key discussion points during meetings. This feature is particularly useful for participants who join a meeting late or need to catch up on what was discussed. Copilot can summarise the meeting, including who said what and where people are aligned or disagree. This real-time summarisation helps ensure everyone is on the same page and can contribute to the discussion effectively. In addition to summarising meetings, Copilot can suggest action items and follow-up questions. This feature helps participants stay focused and ensures that essential tasks are not missed [84].
Copilot is available in both meetings and Teams chat. It summarises key points, action items, and decisions without the need for scrolling through long threads. This is useful for busy professionals managing multiple conversations. Copilot keeps users informed by highlighting updates from the past day, week, or month [85].
Administrators can easily manage Copilot settings in the Teams admin centre. They can turn it on or off for meetings and events to match organisational policies. This flexibility allows organisations to customise Copilot’s features as needed [86].

3. Research Model and Research Methodology

3.1. Research Model

Numerous models exist to explain technology adoption, and for our investigation into the acceptance and usage of the mobile application MS Teams, we utilised one of the newer frameworks: the Unified Theory of Acceptance and Use of Technology (UTAUT). Developed by Venkatesh et al. in 2003 [6], this model synthesises elements from eight foundational IT/IS adoption models: the Theory of Reasoned Action (TRA) [87], the Technology Acceptance Model (TAM) [88], the Motivational Model (MM), the Theory of Planned Behaviour (TPB) [89], a hybrid model that combines aspects of TAM and TPB [90], the Personal Computer Usage Model (MPCU) [91], Innovation Diffusion Theory (IDT) [92], and Social Cognitive Theory (SCT) [93]. A review of the literature revealed that authors have previously used the UTAUT model to examine IT adoption among students [94,95,96], the acceptance of mobile banking [97,98,99], mobile learning and teaching [7,8,100], and e-business adoption [101,102].
Based on this theoretical framework, the study is guided by the following research question:
What are the key factors influencing students’ behavioural intention to adopt mobile learning through the Microsoft Teams mobile application in higher education business studies?
Venkatesh et al. [6], based on a study comparing the above models and theories, concluded that four factors have a statistically significant direct impact on user adoption and behaviour. These factors are performance expectancy, effort expectancy, social influence and facilitating conditions.
Performance expectancy (PE) is the degree to which an individual believes that using IT/IS will help them improve their work performance. Five constructs from different models underlie performance expectancy: perceived usefulness (TAM/TAM 2 model), extrinsic motivation (MM model), job-fit (MPCU model), relative advantage (IDT model) and outcome expectations (SCT model).
Effort expectancy (EE) is the degree of ease associated with using IT/IS. Three constructs from different models are consistent with this concept: perceived ease of use (TAM/TAM2 model), complexity (MPCU model) and ease of use (IDT model). The variables of gender, age, and experience moderate effort expectancy.
Social influence (SI) is defined as the degree to which an individual’s perception of the importance of using or not using IT/IS is influenced by the beliefs of others. It appears as a subjective norms (SNs) variable in the TRA, TAM 2, TPB/DTPB models, as a social factor in the MPCU model and as an image factor in the IDT model. If the use of IT/IS is voluntary, then this factor is not significant, but it becomes essential if the use of IT/IS is mandatory.
Facilitating Conditions (FCs) are defined by the degree to which an individual believes that the organisational and technological infrastructure exists to support the use of IT/IS. It includes three factors: perceived behavioural control (TPB/DTPB model), supportive circumstances (MPCU model) and compatibility (IDT model).
These external factors influence behavioural intentions (BI), which refer to an individual’s subjective likelihood of engaging in a particular behaviour [6]. This concept shows how motivated an individual is to perform a specific action. It is a key factor in determining technology use, as it reveals the user’s willingness and tendency to accept and use the system.
The UTAUT model has been widely applied in educational contexts to examine students’ acceptance of learning technologies, including e-learning systems, mobile applications, and digital platforms. Prior research has confirmed its robustness in predicting behavioural intention among students and educators [7,8,94]. These studies support the model’s relevance for exploring technology adoption in learning environments, particularly in mobile learning scenarios. Consequently, we selected UTAUT as a suitable theoretical framework to investigate students’ intention to use the Microsoft Teams mobile application in higher education.
Based on the dependencies inherent in the relationships in the research model shown in Figure 3 below, the following hypotheses were postulated:
H1: 
Performance Expectancy has a significant statistical impact on Behavioural Intention.
H2: 
Effort Expectancy has a significant statistical impact on Behavioural Intention.
H3: 
Social Influence has a significant statistical impact on Behavioural Intention.
H4: 
Facilitating Conditions have a significant statistical impact on Behavioural Intention.
The measurement items used in our study are based on the original UTAUT model developed by Venkatesh et al. [6], with adaptations to reflect the context of mobile learning through Microsoft Teams. The constructs were operationalised in line with prior research (e.g., refs. [7,8]), and their item wording was tailored to the specific learning environment to ensure conceptual consistency and contextual relevance.
Each construct was measured using multiple indicators on a five-point Likert scale (1 = strongly disagree; 5 = strongly agree), as follows:
  • Performance Expectancy (PE):
    • PE1: The Microsoft Teams mobile application enables me to complete tasks quickly.
    • PE2: The Microsoft Teams mobile application restricts my task execution.
    • PE3: The Microsoft Teams mobile application enhances my productivity.
    • PE4: The Microsoft Teams mobile application reduces my efficiency in the classroom (group).
    • PE5: Using the Microsoft Teams mobile application facilitates the completion of my academic obligations (assignments, duties, projects, seminar papers, etc.).
    • PE6: Using the Microsoft Teams mobile application decreases the quality of my academic work (assignments, duties, projects, seminar papers, etc.).
    • PE7: Using the Microsoft Teams mobile application contributes to my classmates and colleagues perceiving me as competent.
    • PE8: Using the Teams mobile app increases teachers’/professors’ respect for me.
    • PE9: Using the Teams mobile app reduces my chances of being promoted to a higher year.
    • PE10: The Teams mobile app is helpful for teaching and learning.
  • Effort Expectancy (EE):
    • EE1: Learning to use Teams is easy for me.
    • EE2: The Microsoft Teams application is straightforward for completing my academic obligations.
    • EE3: My interaction with the Microsoft Teams application is clear and understandable.
    • EE4: The Microsoft Teams mobile application is flexible for interaction.
    • EE5: I can quickly learn to use the Microsoft Teams application.
    • EE6: The Microsoft Teams mobile application is easy to use.
    • EE7: Using the Microsoft Teams application takes too much time when completing my regular academic tasks.
    • EE8: Working with the Microsoft Teams application is complex and challenging.
  • Social Influence (SI):
    • SI1: People who influence me believe I should use the Microsoft Teams application.
    • SI2: My parents and friends think I should use the Microsoft Teams application.
    • SI3: Teachers/professors at my university provide support in using the Microsoft Teams application.
    • SI4: Teachers/professors at my university strongly support using the Microsoft Teams application in their courses.
    • SI5: Overall, the university supports using the Microsoft Teams application.
    • SI6: Installing the Microsoft Teams application is considered a status symbol at my university.
  • Facilitating Conditions (FC):
    • FC1: I have the necessary resources (e.g., phone, computer, internet) to use the Microsoft Teams application.
    • FC2: I have the knowledge to use the Microsoft Teams application.
    • FC3: The Microsoft Teams application is incompatible with other mobile applications and my operating system.
    • FC4: A helpdesk is available at the university to assist with issues related to the Microsoft Teams application.
    • FC5: Using the Microsoft Teams application aligns with my student lifestyle.
  • Behavioural Intention (BI):
    • BI1: I complete my academic obligations through the Microsoft Teams application whenever possible.
    • BI2: I perceive using the Microsoft Teams application as involuntary.
    • BI3: I intend to use the Microsoft Teams application.
    • BI4: I would like to use the Microsoft Teams application as much as possible for various tasks.
    • BI5: I would like to use the Microsoft Teams application as much as possible to meet my academic obligations.
Some items were negatively worded (e.g., PE2, PE4, EE7) to reduce response bias; these items were reverse-coded before statistical analysis. The full questionnaire is not included as an appendix for reasons of language adaptation and to avoid redundancy. However, all constructs and items are explicitly described in this section, allowing full transparency of the instrument design.
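The reverse-coding step described above can be sketched in a few lines. This is a minimal illustration, not the authors' actual preprocessing script; the set of reversed items is inferred from the item wording listed in this section, and the response values are invented for the example.

```python
# Reverse-code negatively worded items on a 5-point Likert scale: reversed = 6 - raw.
# The set of negative items is inferred from the item wording in the text (illustrative).
NEGATIVE_ITEMS = {"PE2", "PE4", "PE6", "PE9", "EE7", "EE8", "FC3", "BI2"}

def reverse_code(responses: dict[str, int], scale_max: int = 5) -> dict[str, int]:
    """Return one respondent's answers with negatively worded items reversed."""
    return {
        item: (scale_max + 1 - value) if item in NEGATIVE_ITEMS else value
        for item, value in responses.items()
    }

respondent = {"PE1": 4, "PE2": 2, "EE7": 1, "BI3": 5}  # hypothetical answers
print(reverse_code(respondent))  # PE2 -> 4, EE7 -> 5; positive items unchanged
```

After this transformation, high scores consistently indicate a favourable attitude towards the construct, which is a precondition for the reliability statistics reported in Section 4.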

3.2. Research Methodology

The questionnaire was developed based on the UTAUT model proposed by Venkatesh et al. [6], adapting the item wording to reflect the specific context of mobile learning through the Microsoft Teams application. The adaptation followed practices commonly found in prior research using UTAUT in educational contexts (e.g., [7,8]); no formal permission was required, as the model is widely published and openly used in academic studies. The questionnaire was translated into Slovenian. Four Slovenian e-business lecturers reviewed it for clarity before the survey was administered to students. Based on their feedback, we adjusted the word order in some items while ensuring that their meaning remained unchanged.
Upon enrolment, all students at the University of Maribor are provided with a Microsoft 365 account, which includes Microsoft Teams. This access is free to students as part of the University’s institutional agreement with Microsoft. As such, there were no financial barriers to using the MS Teams mobile application, which ensures equal access among participants and eliminates fee-related bias in the study. As mentioned earlier, this platform can be accessed in three ways: the desktop version, the web version via a web browser, and the mobile app. On mobile devices, Teams can be used either through the web version in a mobile browser or through the mobile app installed on the user’s phone.
A web questionnaire was developed and distributed anonymously to all undergraduate and master’s students at the University of Maribor, Faculty of Economics and Business (FEB). All students were allowed to contribute to the survey voluntarily. The Faculty of Economics and Business at the University of Maribor was selected for this study because the MS Teams platform was officially adopted and actively used as the primary learning and collaboration tool during and after the COVID-19 pandemic. This institutional support ensured a consistent digital learning environment for students, making it an appropriate setting for longitudinal research on mobile learning adoption. Before the start of the survey, students were informed of the study summary, its purpose, planned research publication, data collection procedure, and respondent management. Data were collected without any personally identifiable information that could reveal the students’ identity.
As our focus is on the use of mobile apps, respondents were asked whether they used the Microsoft Teams mobile application. Only those who responded positively were included in the final analysis, regardless of whether they used other versions (desktop or web). The survey was conducted during the 2021/2022 academic year, which coincided with the global pandemic. A total of 149 students (33.48%) responded. The survey was repeated two years after the pandemic during the 2023/2024 academic year (296 responses; 66.52%).
While these are independent samples, we merged them into a single dataset for model estimation due to the conceptual similarity of the target group (undergraduate and master’s students at the same institution) and the consistent research instrument used across both periods.
The merged dataset provides a broader and more representative view of students’ mobile learning behaviour in both pandemic and post-pandemic conditions while also enabling subgroup comparisons via multigroup analysis (MGA), which is conducted later in the paper. The final sample consists of 445 students (157 male and 288 female), with the majority (97.08%) aged between 19 and 25. Descriptive statistics are presented in Table 1.
A question was posed to the respondents regarding the frequency of their use of the MS Teams app. The respondents were given the following options: three or more times per day (83 responses; 27 in 2022 and 56 in 2024), once or twice per day (226 responses; 63 in 2022 and 163 in 2024), less than once per day (68 responses; 25 in 2022 and 43 in 2024) and once a week or less (68 responses; 34 in 2022 and 34 in 2024).
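The reported frequencies can be cross-checked against the two sample sizes; a small tally, using only the numbers stated in the text:

```python
# Usage-frequency counts reported in the text, split by survey wave.
counts = {
    "3+ times per day":    {"2022": 27, "2024": 56},
    "1-2 times per day":   {"2022": 63, "2024": 163},
    "less than once/day":  {"2022": 25, "2024": 43},
    "once a week or less": {"2022": 34, "2024": 34},
}

# Per-wave totals should match the reported wave sizes (149 and 296),
# and their sum should match the merged sample of 445 students.
wave_totals = {
    wave: sum(row[wave] for row in counts.values()) for wave in ("2022", "2024")
}
print(wave_totals)                # {'2022': 149, '2024': 296}
print(sum(wave_totals.values()))  # 445
```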
The number of students occasionally using Microsoft Teams has decreased, indicating that more users are integrating the app into their regular academic routines. There is a notable shift towards more frequent use, with a significant increase in students who report using Microsoft Teams at least once per day. Although Microsoft Teams was the primary platform for distance education during the pandemic, the lower frequency of reported use in the 2021/2022 sample may be attributed to several contextual factors. Class attendance during the pandemic was not mandatory, and some students chose not to attend online sessions regularly. Additionally, infrastructure challenges—such as limited internet access or lack of appropriate mobile devices—may have discouraged consistent use of the mobile version of Microsoft Teams. In contrast, in the post-pandemic period, the platform became more embedded in students’ daily academic workflows, being used not only for lectures but also for communication, group work, and consultations, which likely contributed to the increase in usage frequency.

4. Results

Structural Equation Modelling (SEM) is a statistical technique for analysing complex relationships between observed and latent variables. Among the various methodologies associated with SEM, Partial Least Squares Structural Equation Modelling (PLS–SEM) has attained significant recognition, especially within the context of exploratory research and in the analysis of intricate models with small to moderate sample sizes. Unlike Covariance-Based SEM (CB-SEM), which focuses on confirming theoretical models, PLS–SEM is variance-based and aims to maximise the explained variance of dependent constructs, making it suitable for predictive modelling [103,104].
Since the aim of our research is the development of theory, PLS–SEM was chosen for quantitative analysis. One of the key advantages of PLS–SEM is its robustness and its ability to handle data that violate the assumptions of traditional SEM methods. It is well suited to non-normal data distributions, small sample sizes, and variables with non-linear relationships, making it more effective than other techniques in such contexts [105,106,107]. Furthermore, it facilitates the incorporation of both formative and reflective measurement models. This feature is particularly advantageous in the social sciences and analogous domains, where latent constructs may possess multiple indicators with divergent levels of influence. The PLS–SEM methodology is characterised by an iterative process, which enables researchers to refine their models based on the results obtained [103,104,108]. As a result, PLS–SEM has become an accepted technique for model analysis, beginning with estimating the measurement model and then evaluating the structural model, where relationships between variables are examined [103,104,108].
This study selected PLS–SEM due to its ability to handle non-normal data distributions. The analysis was conducted using SmartPLS version 4.1.0.4 [109], following the methodological guidelines proposed by Sarstedt et al. [104,110], Garson [103], Henseler et al. [108], and Hair et al. [111]. The empirical data analysis was carried out in two stages using the PLS–SEM technique.
  • The first phase of the research evaluated the psychometric properties of all measurement scales to ensure their reliability and discriminant validity.
  • The subsequent phase of the study focused on hypothesis testing and model analysis to examine structural relationships, explained variance (R2), effect sizes (f2), and predictive relevance (Q2).
To further enhance the analysis, we extended the study by incorporating two advanced PLS–SEM techniques: Importance–Performance Map Analysis (IPMA) and Multigroup Analysis (MGA).
  • IPMA provides deeper insights by assessing the relative importance and performance of predictor constructs, allowing for the identification of key areas for strategic improvement [111].
  • MGA was applied to assess whether significant differences exist between subgroups, enabling the exploration of potential moderating effects and examining how the relationships in the model vary across different user groups [110].
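SmartPLS derives the significance of path estimates via bootstrapping. The resampling idea behind those p-values can be sketched in a few lines; this is a didactic illustration only, in which synthetic data and a plain correlation stand in for the full PLS algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for two latent construct scores (e.g., PE and BI);
# a true standardised effect of about 0.5 plus noise.
n = 200
x = rng.normal(size=n)
y = 0.5 * x + rng.normal(scale=0.8, size=n)

def path_coefficient(a, b):
    """Correlation as the simplest analogue of a standardised path coefficient."""
    return float(np.corrcoef(a, b)[0, 1])

# Bootstrap: resample respondents with replacement and re-estimate the coefficient.
boot = np.array([
    path_coefficient(x[idx], y[idx])
    for idx in (rng.integers(0, n, n) for _ in range(2000))
])

estimate = path_coefficient(x, y)
se = boot.std(ddof=1)
t_value = estimate / se  # |t| > 1.96 indicates significance at the 5% level
print(f"beta = {estimate:.3f}, SE = {se:.3f}, t = {t_value:.2f}")
```

The same resampling logic underlies MGA as well: the bootstrap distributions of a path coefficient in two subgroups are compared to test whether the difference between groups is significant.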

4.1. Measurement Model Assessment

The PLS–SEM measurement model estimates the reliability and validity of latent constructs before analysing structural relationships. It ensures that the observed indicators accurately represent the underlying theoretical constructs. The final version is presented below, where indicators that did not meet the conditions of the measurement model are excluded.
The evaluation process consists of four key steps. The first step involves assessing indicator reliability by examining factor loadings, where values above 0.70 indicate strong reliability. Table 2 presents the descriptive statistics and factor loadings of the final factor items in our research model, nearly all of which meet the PLS–SEM criteria. However, one factor loading (FC1 = 0.685) falls slightly below the recommended threshold. We retained this indicator because, in exploratory research or complex models, factor loadings above 0.60 may still be retained if they contribute to the overall validity and reliability of the construct [111].
The second step involves assessing internal consistency reliability, which evaluates how well a set of indicators (items) consistently measure a latent construct. It ensures that the indicators within a construct are highly correlated and provide stable measurements. In PLS–SEM, the evaluation of internal consistency reliability is typically conducted through three main measures: Cronbach’s Alpha (α), Composite Reliability (CR), and Dijkstra–Henseler’s Rho (ρ_A). It is generally accepted that values exceeding 0.70 indicate an acceptable level of internal consistency [112,113].
Henseler et al. [108] emphasise the need to report all three reliability measures, as together they provide a comprehensive perspective. Although Cronbach’s alpha (α) has been widely used historically, it tends to underestimate reliability due to its strict assumptions. In PLS–SEM, the preferred measure is composite reliability (CR), which considers the indicators’ different loadings. Meanwhile, Dijkstra–Henseler’s Rho (ρ_A) is regarded as a more accurate reliability estimate because it does not rely solely on loadings but calculates construct reliability from the exact correlations in the model [112].
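Both formula-based measures can be reproduced directly from the data; a minimal sketch with illustrative numbers (not the study's estimates), assuming standardised reflective indicators:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the item sum).
    items: respondents x indicators matrix for one construct."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

def composite_reliability(loadings) -> float:
    """Joreskog's CR from standardised outer loadings:
    (sum lambda)^2 / ((sum lambda)^2 + sum(1 - lambda^2))."""
    lam = np.asarray(loadings, dtype=float)
    num = lam.sum() ** 2
    return num / (num + (1 - lam ** 2).sum())

# Illustrative loadings for a four-indicator construct.
print(round(composite_reliability([0.82, 0.78, 0.75, 0.70]), 3))  # 0.848
```

Because CR weights each indicator by its own loading while alpha implicitly assumes equal loadings, CR is usually the higher and less biased of the two, which is why PLS–SEM guidelines prefer it.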
As shown in Table 3, the values for Cronbach’s Alpha and Composite Reliability (CR) exceed the commonly accepted threshold of 0.70 for most constructs, indicating acceptable to good internal consistency [108,111]. One exception is the construct Facilitating Conditions, which has a Cronbach’s Alpha of 0.621. While this value may be considered borderline or “questionable” according to some interpretations [113], it is still acceptable in exploratory PLS–SEM research when supported by other reliability indicators [103,104]. To strengthen our assessment, we additionally calculated Dijkstra–Henseler’s Rho (ρ_A), where all values exceeded 0.70, further confirming the reliability of the constructs.
For most constructs, ρ_A aligns well with CR and Cronbach’s Alpha. However, in the case of Social Influence, ρ_A is 0.923, slightly above the commonly recommended upper limit of 0.90. Although such high values may suggest potential redundancy among indicators, in this case, both Cronbach’s Alpha and Composite Reliability remain within the acceptable range (α = 0.825), indicating internal consistency without excessive overlap. Therefore, we retained all indicators, as they contribute meaningfully to the construct.
Reporting all three measures ensures robustness and transparency in construct validation [114].
The third step involves assessing convergent validity, which evaluates whether the indicators of a construct share a high proportion of variance. In PLS–SEM, this is assessed using Average Variance Extracted (AVE), where values ≥ 0.50 indicate that the construct explains at least 50% of the variance of its indicators [115]. A higher AVE suggests that the indicators are strongly related to their respective latent construct, confirming that the construct adequately captures the variance in the observed variables.
Table 3 shows that all AVE values exceed the 0.50 threshold, confirming satisfactory convergent validity for all constructs. The highest AVE value is observed for Social Influence at 0.851, indicating a strong ability of its indicators to explain the construct. The lowest AVE value is for Facilitating Conditions at 0.542, which is still above the recommended threshold, meaning that the construct retains acceptable convergent validity. As all AVE values are greater than or equal to 0.50, the subsequent analysis stage may be initiated [104,111].
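To make the CR and AVE computations concrete, the following sketch derives both from a set of standardised outer loadings. The loadings below are hypothetical, chosen only for illustration; they are not the study's values.

```python
# Illustrative computation of Composite Reliability (CR) and Average
# Variance Extracted (AVE) from standardized outer loadings.
# The loadings are hypothetical, not taken from the study.

def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances),
    where the error variance of a standardized indicator is 1 - loading^2."""
    s = sum(loadings)
    error = sum(1 - l ** 2 for l in loadings)
    return s ** 2 / (s ** 2 + error)

def average_variance_extracted(loadings):
    """AVE = mean of the squared standardized loadings."""
    return sum(l ** 2 for l in loadings) / len(loadings)

loadings = [0.82, 0.78, 0.85, 0.80]  # hypothetical four-item construct
print(round(composite_reliability(loadings), 3))       # 0.886
print(round(average_variance_extracted(loadings), 3))  # 0.661
```

With these loadings, AVE exceeds the 0.50 threshold and CR exceeds 0.70, so the hypothetical construct would pass both checks described above.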
The final step in evaluating the measurement model is to assess discriminant validity, which checks that each construct is empirically distinct from the others in the model. This demonstrates that the latent variables capture unique phenomena and are not highly correlated with each other [108,111,115]. In PLS–SEM, discriminant validity is commonly assessed using two key methods: the Fornell–Larcker Criterion and the Heterotrait–Monotrait (HTMT) Ratio. The Fornell–Larcker Criterion compares the square root of the AVE for each construct with the correlations amongst constructs. A construct's AVE square root should exceed its highest correlation with any other construct, confirming discriminant validity. Table 4 shows that each construct's diagonal values (square root of AVE) are greater than the off-diagonal correlation values, confirming discriminant validity [115].
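The Fornell–Larcker check can be expressed as a simple comparison. The sketch below uses hypothetical AVE values and construct correlations (loosely labelled with the study's construct abbreviations, but the numbers are illustrative):

```python
import math

# Hypothetical AVE values and construct correlations (not from the study).
ave = {"PE": 0.70, "EE": 0.65, "SI": 0.85, "FC": 0.54}
corr = {
    ("PE", "EE"): 0.45, ("PE", "SI"): 0.40, ("PE", "FC"): 0.50,
    ("EE", "SI"): 0.35, ("EE", "FC"): 0.55, ("SI", "FC"): 0.30,
}

def fornell_larcker_ok(ave, corr):
    """Each construct's sqrt(AVE) must exceed its highest correlation
    with any other construct in the model."""
    for construct, a in ave.items():
        max_r = max(r for pair, r in corr.items() if construct in pair)
        if math.sqrt(a) <= max_r:
            return False
    return True

print(fornell_larcker_ok(ave, corr))  # True for these illustrative values
```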
The HTMT Ratio is a more modern and stricter criterion that assesses discriminant validity by measuring the ratio of heterotrait correlations to monotrait correlations [108]. HTMT values should ideally be below 0.85 (strict criterion) or 0.90 (more lenient threshold). As shown in Table 5, all HTMT values are below 0.90, confirming that the constructs are sufficiently distinct. Since both the Fornell–Larcker and HTMT criteria confirm satisfactory discriminant validity, we can proceed with the structural model assessment [108,111].
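For two constructs, the HTMT statistic is the mean of the between-construct (heterotrait) item correlations divided by the geometric mean of the average within-construct (monotrait) item correlations. A minimal sketch with hypothetical item correlations (item names and values are illustrative, not the study's):

```python
import itertools
import math

def htmt(R, items_i, items_j):
    """HTMT: mean heterotrait item correlation divided by the geometric
    mean of the average monotrait item correlations. R maps frozenset
    item pairs to absolute correlations."""
    hetero = [R[frozenset((a, b))] for a in items_i for b in items_j]
    mono_i = [R[frozenset(p)] for p in itertools.combinations(items_i, 2)]
    mono_j = [R[frozenset(p)] for p in itertools.combinations(items_j, 2)]
    mean = lambda xs: sum(xs) / len(xs)
    return mean(hetero) / math.sqrt(mean(mono_i) * mean(mono_j))

R = {
    frozenset(("pe1", "pe2")): 0.70,  # monotrait, first construct
    frozenset(("ee1", "ee2")): 0.60,  # monotrait, second construct
    frozenset(("pe1", "ee1")): 0.30,  # heterotrait correlations
    frozenset(("pe1", "ee2")): 0.35,
    frozenset(("pe2", "ee1")): 0.25,
    frozenset(("pe2", "ee2")): 0.30,
}
print(round(htmt(R, ["pe1", "pe2"], ["ee1", "ee2"]), 3))  # 0.463
```

A value of 0.463 would sit comfortably below even the strict 0.85 threshold.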

4.2. Structural Model Assessment

Following the affirmation of the measurement model’s reliability and validity, the subsequent phase entails the examination of the structural model, which evaluates the interrelations among latent constructs and serves to test research hypotheses empirically. The structural model assessment examines the path coefficients, their statistical significance, and the model’s explanatory power. To evaluate the quality of the structural model, the following key metrics are analysed: path coefficients (β) and hypothesis testing, coefficient of determination (R2), effect sizes (f2), predictive relevance (Q2) and model fit indices (SRMR, NFI) [104,111]. We assess how accurately the proposed model predicts the relationships between latent variables by examining structural relationships.
The first step in the structural model evaluation involves assessing path coefficients (β) and testing hypotheses, which establishes the strength and significance of the relationships between constructs [116]. Path coefficients represent the direct effects between independent and dependent variables, providing insight into how strongly one construct influences another. To assess the statistical significance of these relationships, bootstrapping procedures were applied using 10,000 resamples in SmartPLS [104,114]. The significance of path coefficients is evaluated using t-statistics and p-values, where t-values ≥ 1.96 (for a significance level of 0.05) indicate a significant relationship, and p-values < 0.05 confirm that the relationship is statistically significant.
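The bootstrapping logic can be illustrated with a minimal sketch: resample cases with replacement, re-estimate the coefficient on each resample, and divide the original estimate by the bootstrap standard error to obtain a t-value. A simple OLS slope stands in for a PLS path estimate here, and the data are synthetic:

```python
import random
import statistics

# Synthetic data: y depends on x with a true slope of 0.4.
random.seed(42)
x = [random.gauss(0, 1) for _ in range(200)]
y = [0.4 * xi + random.gauss(0, 1) for xi in x]

def slope(xs, ys):
    """OLS slope, standing in for a PLS path coefficient."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    den = sum((a - mx) ** 2 for a in xs)
    return num / den

# Resample cases with replacement and re-estimate the coefficient.
boot = []
for _ in range(1000):  # the study used 10,000 resamples
    idx = [random.randrange(len(x)) for _ in range(len(x))]
    boot.append(slope([x[i] for i in idx], [y[i] for i in idx]))

beta = slope(x, y)
t_value = beta / statistics.stdev(boot)  # t = coefficient / bootstrap SE
print(t_value > 1.96)  # significant at the 5% level
```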
The results of the structural model assessment presented in Table 6 show that all proposed relationships are statistically significant.
Performance Expectancy has the most potent positive effect on Behavioural Intention (β = 0.380, t = 8.117, p < 0.001), confirming H1. This suggests that individuals who perceive the system as useful are more likely to intend to use it. Effort Expectancy significantly influences Behavioural Intention (β = 0.118, t = 2.448, p = 0.014), supporting H2. This finding suggests that perceived ease of use exerts a positive yet comparatively less pronounced effect on behavioural intention than other factors. Social Influence has been demonstrated to impact behavioural intention significantly (β = 0.175, t = 4.241, p < 0.001), thereby confirming Hypothesis 3. This finding suggests that encouragement from peers and teachers plays a substantial role in influencing students’ decisions to adopt the system. Lastly, the positive impact of Facilitating Conditions on Behavioural Intention (β = 0.186, t = 3.462, p = 0.001) is consistent with H4, suggesting that essential resources and support play a substantial role in fostering users’ intention to adopt the system.
Figure 4 presents the structural model, which includes the path coefficients (β) and their corresponding t-values (in parentheses). The model also displays the relationships between the latent variables and their respective measurement items, demonstrating the proposed framework’s overall structure and statistical significance.
The coefficient of determination (R2) measures the explained variance of the endogenous constructs, indicating how well the independent variables account for the variance in the dependent variable [111]. In PLS–SEM, R2 is a key measure of predictive accuracy, where higher values indicate stronger explanatory power. For our model, the R2 value for Behavioural Intention is 0.451, meaning that 45.1% of the variance in BI is explained by the independent variables: Effort Expectancy, Performance Expectancy, Social Influence, and Facilitating Conditions. This indicates that the model has moderate explanatory power in predicting Behavioural Intention. Whilst 45.1% of the observed variance in BI can be accounted for, the residual 54.9% is influenced by additional factors not incorporated within the model. Additional contributory factors may include personal attitudes, experiential knowledge, levels of trust, and external factors [117]. The results confirm that the selected predictors—particularly Performance Expectancy, which has the most substantial effect—play a significant role in explaining Behavioural Intention, supporting the validity of the theoretical framework.
Effect size (f2) measures the relative impact of each predictor variable on the dependent variable beyond what is already explained by other constructs in the model [108]. R2 measures the overall explained variance, while f2 measures the specific contribution of each independent variable, thereby helping to assess its importance in the model. Table 3 shows the f2 values for each predictor of behavioural intention:
  • Performance Expectancy (f2 = 0.152) demonstrates a moderate effect size, confirming its substantial contribution to explaining Behavioural Intention.
  • Social Influence (f2 = 0.045) and Facilitating Conditions (f2 = 0.040) demonstrate small but significant effects, indicating that while peer influence and resource availability influence Behavioural Intention, their impact is more constrained than Performance Expectancy.
  • Effort Expectancy (f2 = 0.016) exhibits the smallest effect size, suggesting that although the ease of use is significant, it exerts a relatively minor influence on Behavioural Intention compared to other factors.
These results imply that, while all four factors significantly influence Behavioural Intention, Performance Expectancy has the most substantial relative impact, reinforcing its importance in predicting technology adoption.
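The f2 statistic compares the model's R2 with and without the predictor of interest; by Cohen's conventional benchmarks, values around 0.02, 0.15, and 0.35 indicate small, moderate, and large effects, respectively. A minimal sketch (the excluded-model R2 below is hypothetical, chosen only to illustrate the arithmetic):

```python
def effect_size_f2(r2_included, r2_excluded):
    """Cohen's f^2: the incremental variance explained by a predictor,
    relative to the variance left unexplained by the full model."""
    return (r2_included - r2_excluded) / (1 - r2_included)

# Hypothetical illustration: R^2 drops from 0.451 to 0.368 when a
# predictor is removed from the model.
print(round(effect_size_f2(0.451, 0.368), 3))  # 0.151 -> moderate effect
```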
The next step in assessing the model is to evaluate its predictive relevance (Q2), which measures out-of-sample predictive power and indicates how well the exogenous variables predict the values of the endogenous variable beyond the variance explained in the model [104]. This is done using the blindfolding procedure, where Q2 values above 0 confirm the model’s predictive relevance [111]. According to Henseler et al. [108], predictive power can be further assessed by comparing the Root Mean Squared Error (RMSE) values obtained from PLS–SEM and a Linear Model (LM). If the PLS–SEM RMSE values are lower than the LM RMSE values, it can be deduced that the model demonstrates higher predictive accuracy than a basic linear regression approach. As illustrated in Table 7, all the Q2predict values for the behavioural intention indicators are positive, thus confirming that the model has predictive relevance. The Q2predict values range from 0.259 to 0.349, showing that the model has a moderate level of predictive power [111]. The results indicate that, while the model is statistically significant and explanatory, its out-of-sample predictive capability is moderate.
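In its usual form (standard PLS-predict notation, not reproduced from the study), the Q2predict statistic compares the squared prediction errors against a naive benchmark that predicts the training-sample mean:

```latex
Q^2_{\text{predict}} = 1 - \frac{\sum_{i}\left(y_i - \hat{y}_i\right)^2}{\sum_{i}\left(y_i - \bar{y}_{\text{train}}\right)^2}
```

Here, y_i are the observed holdout values, ŷ_i are the model's out-of-sample predictions, and ȳ_train is the mean of the training sample; any value above 0 means the model predicts better than the naive mean benchmark.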
The final step in the structural model assessment involves evaluating model fit indices. In PLS–SEM, model fit is typically assessed using the Standardized Root Mean Square Residual (SRMR) and the Normed Fit Index (NFI). SRMR measures the average difference between observed and model-implied correlations, indicating the overall model fit. A value below 0.08 indicates a good model fit [108]. In our model, the SRMR value is 0.076, which falls below the commonly accepted threshold of 0.08 for acceptable model fit, as recommended by Hu and Bentler [118]. This indicates that the model has a satisfactory fit with the observed data. NFI compares the proposed model to a null (baseline) model, where values closer to 1 indicate better model fit. In PLS–SEM, NFI values above 0.90 are generally considered good, while values above 0.70 still indicate an acceptable, though less optimal, fit [111]. Our model’s NFI value of 0.772 suggests a moderate model fit, which, while not ideal, remains acceptable in exploratory research. Given that PLS–SEM prioritises predictive accuracy over absolute model fit, the current results are considered satisfactory for exploratory research [108,111].
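For completeness, the SRMR can be written as the root mean square of the standardised residual correlations (a standard textbook form, not taken from the study):

```latex
\mathrm{SRMR} = \sqrt{\frac{2}{p(p+1)} \sum_{i=1}^{p} \sum_{j \le i} \left(r_{ij} - \hat{r}_{ij}\right)^2}
```

where p is the number of observed variables, r_ij are the observed correlations, and r̂_ij are the model-implied correlations; smaller average residuals therefore yield a smaller SRMR.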

4.3. Importance–Performance Map Analysis (IPMA)

The Importance–Performance Map Analysis (IPMA) is an advanced PLS–SEM technique that extends traditional path modelling by considering both the importance (total effect) and performance (average latent variable scores) of predictor variables [111]. This approach identifies the constructs that exert the greatest influence on the target variable, thereby determining where performance improvements would be most beneficial [119].
IPMA is a framework that converts statistical findings into actionable insights by identifying high-impact but low-performance areas. This allows organisations and decision-makers to prioritise interventions accordingly. In this study, IPMA was applied to behavioural intention to determine the relative importance and performance levels of its key predictors: performance expectancy, effort expectancy, social influence, and facilitating conditions.
Table 8 presents each construct’s importance and performance values in predicting Behavioural Intention. Performance Expectancy is the most critical predictor (importance = 0.380) yet shows a comparatively low performance level (75.671). This indicates that, while PE exerts a significant influence on behavioural intention, its performance could be enhanced to promote overall adoption. Effort Expectancy and Facilitating Conditions demonstrate relatively high performance levels (83.890 and 81.907, respectively), yet their importance is lower than that of PE. Social Influence has the lowest performance score (40.962) despite substantial importance (0.175). This suggests that efforts to improve social support and encouragement from peers and instructors could enhance Behavioural Intention. The Importance–Performance Map (Figure 5) visually represents these relationships, highlighting areas where strategic improvements could maximise adoption outcomes.
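In IPMA, the performance values on the 0–100 axis are typically obtained by linearly rescaling the mean latent variable scores from the original measurement scale (this rescaling convention is a standard SmartPLS assumption; the example score below is hypothetical):

```python
# Sketch of the IPMA performance rescaling: latent variable scores are
# mapped onto a 0-100 scale so constructs become directly comparable.
# Importance is the construct's total effect on the target variable.

def rescale_0_100(score, scale_min=1, scale_max=7):
    """Linearly map a latent score from its Likert range to 0-100."""
    return (score - scale_min) / (scale_max - scale_min) * 100

# e.g. a hypothetical mean latent score of 5.5 on a 1-7 scale:
print(round(rescale_0_100(5.5), 2))  # 75.0
```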

4.4. Multigroup Analysis

MGA is an advanced PLS–SEM technique to determine whether statistically significant differences occur among subgroups within a dataset [110]. This methodology enables researchers to ascertain whether the structural relationships between latent constructs vary across groups, such as demographic segments, user categories, or contextual conditions.
MGA is of particular value in comparative research, as it facilitates the examination of moderation effects beyond the scope of traditional interaction analysis [111]. By checking whether path coefficients significantly vary between groups, MGA helps identify group-specific differences, offering deeper insights into heterogeneous user behaviours [120,121].
In PLS–SEM, MGA is conducted using several techniques, including Henseler’s nonparametric MGA, the parametric approach, and permutation tests; the nonparametric options are well-suited for non-normal data distributions and small sample sizes [104]. By applying MGA, we can assess whether key adoption factors differ between subgroups, informing tailored strategies to improve technology acceptance in diverse populations.
We conducted MGA on the validated research model to measure whether significant differences occurred between the two student groups (2022 and 2024). The analysis followed the MICOM procedure [121], including three steps:
  • Configurational Invariance checks that both groups share the same conceptual model structure.
  • Compositional Invariance proves that construct scores are calculated equivalently across groups.
  • Equality of Means and Variances assesses whether significant differences exist in means and variances across groups.
The first step tests configurational invariance, ensuring that both groups follow the same conceptual structure. Since SmartPLS does not proceed without configurational invariance being established, we confirmed that this requirement was met, allowing us to continue with the next steps.
The second step tests compositional invariance, meaning construct scores are computed equivalently across groups. To ensure stability, we set the number of permutations to 10,000. The results of Step 2 include original correlations, permutation mean correlations, confidence intervals, and p-values for each construct. Since all p-values exceed 0.05, compositional invariance is confirmed, meaning constructs are measured consistently in both groups (Table 9).
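The permutation logic underlying MICOM Steps 2 and 3 can be sketched as follows: group labels are repeatedly shuffled, the statistic of interest is recomputed under each shuffle, and the p-value is the share of permuted statistics at least as extreme as the observed one. The sketch below applies this to a difference in means on synthetic data (the data and group sizes are illustrative, not the study's):

```python
import random
import statistics

# Synthetic groups with clearly different means, to illustrate detection.
random.seed(1)
group_a = [random.gauss(5.0, 1.0) for _ in range(100)]
group_b = [random.gauss(6.0, 1.0) for _ in range(100)]

def perm_p_value(a, b, n_perm=2000):
    """Two-sided permutation p-value for a difference in group means."""
    observed = abs(statistics.mean(a) - statistics.mean(b))
    pooled = a + b
    count = 0
    for _ in range(n_perm):
        random.shuffle(pooled)  # break the group structure
        diff = abs(statistics.mean(pooled[:len(a)])
                   - statistics.mean(pooled[len(a):]))
        if diff >= observed:
            count += 1
    return count / n_perm

p = perm_p_value(group_a, group_b)
print(p < 0.05)  # True: these groups genuinely differ
```

In the MICOM context the interpretation is inverted: a p-value above 0.05 means no significant difference, which is what confirms invariance across groups.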
Since configurational and compositional invariance were confirmed, we proceeded to Step 3, which examines whether the means and variances of constructs significantly differ between groups. Table 10 presents the results of this analysis, showing permutation p-values for means and variances. A p-value below 0.05 indicates a statistically significant difference between the two groups.
Since Step 3 is not fully satisfied, full measurement invariance is not established. Instead, we confirm partial measurement invariance, allowing for further MGA comparisons (Table 11). The results of structural path comparisons across groups indicate whether relationships differ significantly.
To gain a more granular understanding of the differences between the two student groups (2022 and 2024), we conducted a more detailed analysis of Step 3 of the MICOM procedure. While the previous analysis established partial measurement invariance, this additional examination compares path coefficients, their standard deviations, t-values, and statistical significance across both groups. This allows for a better interpretation of how the relationships between predictors and Behavioural Intention evolved. Table 12 presents the original path coefficients for both groups and their means, standard deviations, t-values, and p-values. Including t-values and p-values helps determine whether the strength of relationships between constructs differs significantly between the two groups. A p-value below 0.05 implies a statistically significant difference, meaning that the relationship between a predictor and BI changed over time.
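A common way to test whether a path coefficient differs between two groups is a Welch-type parametric comparison: the difference in coefficients divided by the pooled standard error. The sketch below uses hypothetical coefficients and standard errors, not the study's estimates:

```python
import math

def path_difference_t(beta1, se1, n1, beta2, se2, n2):
    """Approximate parametric test for the difference between two
    group-specific path coefficients (Welch-type pooling of SEs)."""
    t = (beta1 - beta2) / math.sqrt(se1 ** 2 + se2 ** 2)
    df = n1 + n2 - 4  # degrees of freedom for two two-parameter models
    return t, df

# Hypothetical group estimates (beta, SE, n) for one structural path:
t, df = path_difference_t(0.10, 0.04, 300, 0.25, 0.05, 350)
print(round(t, 2))  # -2.34 -> |t| > 1.96, a significant group difference
```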
The results confirm that Performance Expectancy remains the most influential predictor of BI across both years, with high path coefficients (0.367 in 2022 and 0.385 in 2024) and strong statistical significance (p = 0.000 in both years). This shows that students consistently perceive the system’s usefulness as the fundamental driver of adoption, and its role in affecting BI remains stable over time.
A significant difference was observed in Effort Expectancy between the 2022 and 2024 cohorts. In 2022, the EE → BI pathway was found to be weak and statistically non-significant (β = 0.071, p = 0.424), indicating that ease of use was not a primary concern for students then. Conversely, in 2024, the path coefficient increased to β = 0.145, with a significant p-value (p = 0.013), indicating that students in 2024 place significantly more importance on ease of use. This implies a change in user expectations, which may be attributable to changes in technology experience, improvements in interface design, or improved digital literacy among students in 2024.
The research also shows that Facilitating Conditions remained a statistically significant predictor in both years (p = 0.005 in 2022 and p = 0.014 in 2024), while its impact on BI decreased slightly over time (β = 0.251 in 2022 to β = 0.157 in 2024). We can speculate that this could reflect students’ increasing expertise in technology and a corresponding decrease in their dependence on external support structures.
Lastly, social influence remains a constant predictor throughout both years, showing small but statistically significant effects on BI. The slightly higher path coefficient in 2024 (β = 0.183) compared to 2022 (β = 0.155) suggests that peer and instructor encouragement may have gained more relevance in influencing system adoption over time.

5. Discussion

E-learning development in higher education has led to the use of many innovative technologies in teaching and learning, such as m-learning [122]. Several researchers have studied different aspects of this new field. In most cases, key findings are similar and show great potential for applying m-learning in higher education and its impact on student-centred e-learning, emphasising the self-regulatory learning process [123]. It is also argued that before designing and implementing a mobile learning system, it is essential to consider future users’ perception of m-learning since their perception will influence their behavioural intention to use m-learning [124]. Several studies reveal the importance of students’ perception of new technologies used in e-learning and mobile technologies [125].
The findings of this study align with existing technology acceptance research and extend prior studies by demonstrating the relative influence of Performance Expectancy, Effort Expectancy, Social Influence, and Facilitating Conditions in shaping user adoption behaviour. The results of this study provide empirical support for the proposed structural model, confirming the significance of all four hypothesised relationships in predicting behavioural intention when using the system.

5.1. Key Determinants of Behavioural Intention

Performance Expectancy emerged as the strongest predictor of Behavioural Intention (β = 0.380, p < 0.001), confirming H1. This result is consistent with prior research (e.g., [6,109]), emphasising that individuals are more likely to adopt a system if they perceive it as beneficial and performance-enhancing. This reinforces the notion that perceived usefulness remains the key determinant of adoption in technology acceptance models, highlighting the need for institutions to effectively communicate the practical benefits of the system to potential users.
Effort Expectancy (β = 0.118, p = 0.014, H2 confirmed) was found to have a significant but relatively weaker influence on BI. While ease of use contributes to adoption, its effect is less pronounced than perceived usefulness. This finding suggests that users may tolerate some complexity if they perceive high functional benefits. This finding aligns with the results documented in previous studies (e.g., [7,126]), which indicate that usability factors play a significant role in adoption, although they may not be the only drivers.
Social influence (β = 0.175, p < 0.001, H3 confirmed) significantly affects BI, indicating that peer and teacher encouragement positively influences an individual’s propensity to use the system. This finding is consistent with the research (e.g., [108]) that emphasises the role of subjective norms and external pressures in technology adoption. Therefore, educational institutions and other organisations are advised to leverage social influence mechanisms such as peer mentoring, teacher/tutor participation, and institutional incentives to encourage wider use.
The results of this study support the hypothesised role of Facilitating Conditions in the adoption process (β = 0.186, p = 0.001), confirming H4 and highlighting the importance of technical and organisational support.
This finding is consistent with the conclusions of several studies (e.g., [96,127]), which suggest that users are more likely to adopt a system when they have access to adequate resources, infrastructure, and training. A study conducted in New Zealand also reports similar findings, despite it being undertaken using TAM and not the UTAUT framework [128]. Six of the seven factors influencing behavioural intention, i.e., self-efficacy, perceived usefulness, subjective norm, attitude, perceived ease of use, and perceived financial resources, were reported to be accepted [128]. The results of another study were also broadly in line with our findings, although the survey by Fabito [129] showed that the different dimensions distinctively predict the learners’ behavioural intention to use m-learning.

5.2. Explanatory Power and Effect Sizes

The model explains 45.1% of the variance (R2 = 0.451) in behavioural intention, indicating moderate explanatory power [111]. While this accounts for a considerable portion of user adoption behaviour, additional external factors, such as personal attitudes, prior experience, or trust in technology, may also influence Behavioural Intention (e.g., [105,106,117]). Future research could be directed at exploring other factors to increase the explanatory power of the model.
Effect size analysis (f2) provides further insight into the relative influence of each predictor. Performance Expectancy (PE) (f2 = 0.152) shows a moderate effect size, confirming its significant impact on adoption decisions. In contrast, Social Influence (f2 = 0.045) and Facilitating Conditions (f2 = 0.040) demonstrate smaller but significant effects, suggesting that, although external encouragement and resource availability play a part in adoption, their joint impact is less substantial. The effect of Effort Expectancy (f2 = 0.016) was the weakest, confirming that ease of use is still a relevant factor but does not significantly dominate the prediction of Behavioural Intention in this context [7,8].

5.3. Predictive Relevance and Model Fit

The predictive relevance (Q2) and model fit indices provide additional insight into the robustness of the proposed research model. The Q2predict values for the Behavioural Intention indicators range from 0.259 to 0.349, indicating a moderate level of predictive power [111]. In addition, the PLS–SEM RMSE values are lower than the LM RMSE values for most indicators, supporting the model’s predictive accuracy [104]. The model’s goodness of fit (GoF) indices confirm an acceptable fit. The SRMR value of 0.076 is within the recommended threshold (≤0.08), indicating a satisfactory model fit [121]. However, the NFI value of 0.772, while acceptable for exploratory research, suggests that further model refinements may improve overall model fit [111]. It is noteworthy that PLS–SEM places a higher value on predictive accuracy than on absolute model fit; as such, it can be concluded that the model is appropriate for exploratory analysis, with the possibility of enhanced performance and robustness through additional refinements.

5.4. IPMA and Multigroup Analysis Insights

The Importance–Performance Map Analysis (IPMA) systematically identifies areas where enhancements can facilitate greater user adoption. The results confirm that Performance Expectancy (PE) is the most critical predictor (importance = 0.380) but shows a comparatively low performance level (75.671). This suggests that improving the system’s practical benefits, functionality, and perceived value could significantly increase adoption rates. In contrast, Effort Expectancy and Facilitating Conditions perform well (83.890 and 81.907, respectively), indicating that these factors are already well supported. However, Social Influence has the lowest performance score (40.962), suggesting that increasing peer, instructor, and organisational encouragement could enhance user adoption behaviour.
The Multigroup Analysis provides insights into how the predictors’ relationships with BI evolved. The results confirm that Performance Expectancy remains the most potent predictor across both years, aligning with previous technology adoption research [6,111]. However, Effort Expectancy became significantly more influential in the 2024 study than in the 2022 study, indicating that students in 2024 valued usability more highly. We can speculate that this could be due to increased digital literacy or changing user expectations. These findings suggest that educational institutions would do well to prioritise developing system functionality while maintaining usability and accessibility for new users.
Similarly, the Facilitating Conditions variable showed a modest decrease in perceived importance throughout the study (β = 0.251 in 2022 to β = 0.157 in 2024). This observation suggests that reliance on external support systems decreases as technology proficiency increases. This finding is coherent with earlier research demonstrating increased technological self-efficacy over time, reducing dependency on external support structures [104].
Social influence remained stable but increased slightly in 2024 (β = 0.183 vs. β = 0.155 in 2022). This indicates that peer and instructor encouragement became more critical. This suggests that institutions and organisations could use social and collaborative learning strategies, like peer mentoring and instructor encouragement, to increase system adoption. A study conducted by Diemer et al. [130] reported that students who characterised themselves as comfortable with e-learning modes reported significantly greater levels of perception of learning and engagement. Personal attitude can affect the student’s perception of different aspects of e-learning, including the perceived utility of the mobile tools [131]. Expectations of e-learning users are related to previous experiences, expertise level, and the context of e-learning [132]. Recent studies also report that teachers should embrace new technologies and include them in their strategies [133].
While our study focused on the positive factors influencing mobile learning adoption, it is essential to acknowledge the potential drawbacks associated with m-learning. Prior research highlights technical limitations, such as small screen dimensions and reduced information clarity [44], lower device capabilities compared to desktops [46], and user distraction risks due to multitasking or social media usage [47]. Although these aspects were not measured empirically in our current study, they represent relevant barriers that may impact long-term adoption and effectiveness. We recommend that future studies incorporate such factors into extended models of mobile learning acceptance.
It is worth noting that, although Microsoft Teams was a key platform for distance learning during the 2021/2022 academic year, the reported frequency of its use was lower compared to the 2023/2024 cohort. This counterintuitive result may be explained by contextual differences: during the pandemic, online class attendance was not compulsory, and some students did not participate regularly in virtual sessions. Additionally, technical limitations, such as insufficient internet access or the lack of adequate mobile devices, may have restricted app usage. In contrast, in the post-pandemic period, MS Teams became embedded in the regular academic routine and was increasingly used for asynchronous communication, collaborative group work, and hybrid teaching, contributing to higher engagement levels—particularly through the mobile application.
In summary, the findings provide a comprehensive answer to the research question by identifying performance expectancy, effort expectancy, social influence, and facilitating conditions as the key factors influencing students’ behavioural intention to adopt m-learning through the Microsoft Teams mobile application.

5.5. Contribution to Sustainability of E-Learning

The findings of this study provide several important implications for enhancing the sustainability of e-learning in higher education. First, the strong influence of performance expectancy on behavioural intention suggests that students are more likely to adopt and continue using mobile learning applications when they perceive them as effective and beneficial. This reinforces prior research showing that clearly perceived benefits are essential for the long-term adoption of e-learning platforms [126].
Second, the significance of effort expectancy and facilitating conditions indicates that usability and accessibility are critical for sustainable adoption. Prior studies have emphasised that intuitive system design, technical support, and equitable access to infrastructure are prerequisites for sustaining digital learning environments over time [10,126].
Third, the role of social influence highlights the importance of supportive learning communities and peer encouragement in promoting sustained use of m-learning platforms. These social and institutional drivers align with findings from Bervell and Arkorful [134], who demonstrated that subjective norms and institutional support foster long-term system engagement.
Overall, the model suggests that addressing these key factors can significantly improve the long-term viability of mobile learning solutions and contribute to more resilient, inclusive, and sustainable digital education systems.

6. Conclusions

This study examined the adoption of the MS Teams mobile application among university students, comparing responses from two cohorts: 2021/2022 (during the COVID-19 pandemic) and 2023/2024 (two years post-pandemic). The study explains how key adoption factors evolved by analysing Behavioural Intention to use MS Teams. This study contributes to the theoretical advancement of technology adoption research by reinforcing the validity of the proposed model [6,88,89]. The findings indicate that Performance Expectancy remains the strongest determinant, whilst Social Influence and Facilitating Conditions also play significant roles. Furthermore, while Effort Expectancy is statistically significant, it has a comparatively lower impact, suggesting that students prioritise functionality over ease of use when adopting the MS Teams mobile application [33,72].
From a practical perspective, the findings yield actionable insights for educational institutions, technology designers and organisations seeking to enhance mobile application adoption. To raise adoption rates, approaches should focus on the following:
  • Enhancing system usability to align with changing user expectations and improving user experience [35].
  • Reducing reliance on external technical support by implementing intuitive system designs, self-service resources, and user training programs [41].
  • Leveraging social and peer influence by fostering collaborative learning environments, peer mentoring, and instructor engagement strategies [135].
The longitudinal comparison between the 2021/2022 and 2023/2024 cohorts highlights the evolution of adoption factors over time. While Performance Expectancy remained the dominant predictor, the increasing influence of Effort Expectancy in 2024 suggests that usability concerns are becoming more relevant as students adapt to evolving digital environments [19,104,111]. Conversely, the decreasing impact of Facilitating Conditions indicates that as digital literacy and familiarity with technology increase, reliance on institutional support structures may decline [42,43].
Notwithstanding the contributions of this study, several limitations must be acknowledged. First, the sample was drawn from a specific user group (university students using the MS Teams mobile application), which may limit the generalisability of the findings. Future research should extend the study to diverse populations, industries, or institutional contexts to validate the model further [110,135].
Second, the present study employed a cross-sectional design, which captures user perceptions at a single point in time. A longitudinal design, however, could provide deeper insights into how behavioural intention evolves, particularly as students become more familiar with the technology or as system upgrades are introduced [79,108].
Third, while the present study examined key factors influencing behavioural intention, the role of other moderating variables, such as prior digital experience, institutional policies, cultural differences, and technology resistance, in explaining variations in adoption behaviour remains to be explored. Future research should examine these additional variables to gain a more complete understanding of the factors influencing mobile learning technology adoption [33,111].
Our study did not incorporate potential negative aspects of mobile learning, such as cognitive overload, distraction, or screen-related fatigue. Prior studies (e.g., [44,46,47]) have discussed these challenges, which may influence students’ acceptance and sustained use of mobile learning platforms. Since our research model, based on UTAUT, was not designed to capture such effects, we could not address them empirically. We acknowledge this limitation and suggest that future studies extend the model with variables representing these barriers to provide a more balanced understanding of mobile learning adoption.
Additionally, as the study was limited to students in business-related programs at a single institution, the generalisability of findings to other academic disciplines is limited. Future research should include students from various fields, such as IT or environmental studies, to explore potential disciplinary differences in m-learning adoption.
It is recommended that future research employ experimental or qualitative approaches to complement quantitative findings. This would allow for a more in-depth exploration of user motivations and barriers to mobile technology adoption. Furthermore, investigating disparities in mobile application adoption across academic disciplines or professional environments can yield sector-specific insights valuable for developing targeted implementation strategies [35,41].
The present study further supports the notion that Performance Expectancy, Social Influence, Effort Expectancy, and Facilitating Conditions significantly influence Behavioural Intention to adopt the MS Teams mobile application [6]. Considering the evolution of adoption factors, future research should explore how shifts in digital literacy, changing educational technologies, and institutional policies influence long-term mobile application acceptance patterns [33]. Building upon these findings by incorporating additional variables and exploring mobile adoption behaviours across diverse user groups, industries, and learning environments will further enhance our understanding of technology acceptance and digital transformation [19,35]. The findings give researchers and practitioners valuable insights, emphasising the importance of perceived usefulness, social influence, and available resources in driving technology adoption.
This study answered the guiding research question by confirming that performance expectancy plays the most significant role in students’ intention to adopt mobile learning in business studies, followed by facilitating conditions, social influence, and effort expectancy. These insights contribute to a clearer understanding of the behavioural dynamics behind m-learning adoption in higher education.

Author Contributions

Conceptualization, S.P., I.S.K. and S.B.; methodology, S.S.Z.; software, I.S.K. and S.S.Z.; validation, S.P. and S.B.; formal analysis, S.S.Z. and S.B.; investigation, I.S.K.; resources, S.P., I.S.K., S.S.Z. and S.B.; data curation, I.S.K. and S.S.Z.; writing—original draft preparation, S.P., I.S.K. and S.S.Z.; writing—review and editing, S.P., I.S.K., S.S.Z. and S.B.; visualisation, I.S.K. and S.S.Z.; supervision, S.P., S.S.Z. and S.B.; project administration, I.S.K.; funding acquisition, S.P., S.S.Z. and S.B. All authors have read and agreed to the published version of the manuscript.

Funding

The authors acknowledge the financial support from the Slovenian Research Agency (research core funding No. P5–0023, Entrepreneurship for Innovative Society).

Institutional Review Board Statement

The Senate of the Faculty of Economics and Business at the University of Maribor approved Decision 2025/9 of the Commission for Research Ethics, in which the Commission for Research Ethics at the Faculty of Economics and Business, University of Maribor, established that the research poses no risk to participants and issued a positive opinion regarding the ethical aspects of the research.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Dataset available on request from the authors.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Baron, N.S. Words Onscreen: The Fate of Reading in a Digital World; Oxford University Press: New York, NY, USA, 2015. [Google Scholar]
  2. Ross, B.; Pechenkina, E.; Aeschliman, C.; Chase, A.-M. Print Versus Digital Texts: Understanding the Experimental Research and Challenging the Dichotomies. Res. Learn. Technol. 2017, 25, 1976. [Google Scholar] [CrossRef]
  3. Delgado, P.; Vargas, C.; Ackerman, R.; Salmerón, L. Don’t Throw Away Your Printed Books: A Meta-Analysis on the Effects of Reading Media on Reading Comprehension. Educ. Res. Rev. 2018, 25, 23–38. [Google Scholar] [CrossRef]
  4. Singer, L.M.; Alexander, P.A. Reading on Paper and Digitally: What the Past Decades of Empirical Research Reveal. Rev. Educ. Res. 2017, 87, 1007–1041. [Google Scholar] [CrossRef]
  5. Zabasta, A.; Kazymyr, V.; Drozd, O.; Verslype, S.; Espeel, L.; Bruzgiene, R. Development of Shared Modeling and Simulation Environment for Sustainable E-Learning in the STEM Field. Sustainability 2024, 16, 2197. [Google Scholar] [CrossRef]
  6. Venkatesh, V.; Morris, M.G.; Davis, G.B.; Davis, F.D. User Acceptance of Information Technology: Toward a Unified View. MIS Q. 2003, 27, 425–478. [Google Scholar] [CrossRef]
  7. Chao, C.M. Factors Determining the Behavioral Intention to Use Mobile Learning: An Application and Extension of the UTAUT Model. Front. Psychol. 2019, 10, 1652. [Google Scholar] [CrossRef] [PubMed]
  8. Almaiah, M.A.; Alamri, M.M.; Al-Rahmi, W. Applying the UTAUT Model to Explain the Students’ Acceptance of Mobile Learning System in Higher Education. IEEE Access 2019, 7, 174673–174686. [Google Scholar] [CrossRef]
  9. Hew, K.F.; Brush, T. Integrating Technology into K-12 Teaching and Learning: Current Knowledge Gaps and Recommendations for Future Research. Educ. Technol. Res. Dev. 2007, 55, 223–252. [Google Scholar] [CrossRef]
  10. Zawacki-Richter, O. The Current State and Impact of COVID-19 on Digital Higher Education in Germany. Hum. Behav. Emerg. Technol. 2021, 3, 218–226. [Google Scholar] [CrossRef]
  11. Laskova, K. 21st Century Teaching and Learning with Technology: A Critical Commentary. Acad. Lett. 2021, 2, 2090. [Google Scholar] [CrossRef]
  12. Bentaa, D.; Bologaa, A.; Dzitaca, I. E-Learning Platforms in Higher Education. Case Study. Procedia Comput. Sci. 2014, 31, 1170–1176. [Google Scholar] [CrossRef]
  13. Ouadoud, M.; Rida, N.; Chafiq, T. Overview of E-Learning Platforms for Teaching and Learning. Int. J. Recent Contrib. Eng. Sci. IT 2021, 9, 50–70. [Google Scholar] [CrossRef]
  14. Al-Adwan, A.S.; Nofal, M.; Akram, H.; Albelbisi, N.A.; Al-Okaily, M. Towards a Sustainable Adoption of E-Learning Systems: The Role of Self-Directed Learning. J. Inf. Technol. Educ. Res. 2022, 21, 245–267. [Google Scholar] [CrossRef]
  15. Sizova, D.A.; Sizova, T.V.; Adulova, E.S. Advances in Social Science, Education and Humanities Research. In Proceedings of the International Scientific Conference “Digitalization of Education: History, Trends and Prospects” (DETP 2020), Yekaterinburg, Russia, 23–24 April 2020; Volume 437, pp. 328–334. [Google Scholar]
  16. Wasserman, M.E.; Fisher, S.L. E-Learning. In Encyclopedia of Electronic HRM; Bondarouk, T., Fisher, S., Eds.; De Gruyter Oldenbourg: Boston, MA, USA, 2020; pp. 188–194. [Google Scholar] [CrossRef]
  17. Javorcik, T.; Polasek, R. The Basis for Choosing MicroLearning within the Terms of E-Learning in the Context of Student Preferences. In Proceedings of the 16th International Conference on Emerging eLearning Technologies and Applications (ICETA), Stary Smokovec, Slovakia, 15–16 November 2018; pp. 237–244. [Google Scholar] [CrossRef]
  18. Mohana, M.; Valliammal, N.; Suvetha, V.; Krishnaveni, M. A Study on Technology-Enhanced Multimedia Learning for Enhancing Learner’s Experience in E-Learning. In Proceedings of the 2023 International Conference on Network, Multimedia and Information Technology (NMITCON), Bengaluru, India, 1–2 September 2023. [Google Scholar] [CrossRef]
  19. Kimura, R.; Matsunaga, M.; Barroga, E.; Hayashi, N. Asynchronous E-Learning with Technology-enabled and Enhanced Training for Continuing Education of Nurses: A Scoping Review. BMC Med. Educ. 2023, 23, 505. [Google Scholar] [CrossRef]
  20. Timbi-Sisalima, C.; Sánchez-Gordón, M.; Hilera-Gonzalez, J.R.; Otón-Tortosa, S. Quality Assurance in E-Learning: A Proposal from Accessibility to Sustainability. Sustainability 2022, 14, 3052. [Google Scholar] [CrossRef]
  21. Cassidy, A.; Fu, G.; Valley, W.; Lomas, C.; Jovel, E.; Riseman, A. Flexible Learning Strategies in the First through Fourth-Year Courses. Coll. Essays Learn. Teach. 2016, 9, 83–94. [Google Scholar] [CrossRef]
  22. Santiago, C.S., Jr.; Ulanday, M.L.P.; Centeno, Z.J.R.; Bayla, M.C.D.; Callanta, J.S. Flexible Learning Adaptabilities in the New Normal: E-Learning Resources, Digital Meeting Platforms, Online Learning Systems and Learning Engagement. Asian J. Distance Educ. 2021, 6, 38–56. [Google Scholar] [CrossRef]
  23. Furqon, M.; Sinaga, P.; Liliasari, L.; Riza, L.S. The Impact of Learning Management System (LMS) Usage on Students. TEM J. 2023, 12, 1082–1089. [Google Scholar] [CrossRef]
  24. Kirange, S.; Sawai, D. A Comparative Study of E-Learning Platforms and Associated Online Activities. Online J. Distance Educ. E Learn. 2021, 9, 192–199. [Google Scholar]
  25. Nedungadi, P.; Raman, R. A New Approach to Personalization: Integrating E-Learning and M-Learning. Educ. Technol. Res. Dev. 2012, 60, 659–678. [Google Scholar] [CrossRef]
  26. Kasabova, G.; Parusheva, S.; Bankov, B. Learning Management Systems as a Tool for Learning in Higher Education. Izvestia J. Union Sci. Varna Econ. Sci. Ser. 2023, 12, 224–233. [Google Scholar] [CrossRef]
  27. Al-Abri, A.; Jamoussi, Y.; Kraiem, N.; Al-Khanjari, Z. Comprehensive Classification of Collaboration Approaches in E-Learning. Telemat. Inform. 2017, 34, 878–893. [Google Scholar] [CrossRef]
  28. Chou, Y. Actionable Gamification: Beyond Points, Badges, and Leaderboards; Createspace Independent Publishing Platform: Scotts Valley, CA, USA, 2015. [Google Scholar]
  29. Rajšp, A.; Beranič, T.; Heričko, M.; Horng-Jyh, P.W. Students’ Perception of Gamification in Higher Education Courses. In Proceedings of the Central European Conference on Information and Intelligent Systems, Varazdin, Croatia, 27–29 September 2017; pp. 69–75. [Google Scholar]
  30. El-Sabagh, H.A. Adaptive E-Learning Environment Based on Learning Styles and Its Impact on Development Students’ Engagement. Int. J. Educ. Technol. High. Educ. 2021, 18, 53. [Google Scholar] [CrossRef]
  31. Nuankaew, P.; Nuankaew, W.; Phanniphong, K.; Imwut, S.; Bussaman, S. Students Model in Different Learning Styles of Academic Achievement at the University of Phayao, Thailand. Int. J. Emerg. Technol. Learn. 2019, 14, 133. [Google Scholar] [CrossRef]
  32. Nouri, J. The Flipped Classroom: For Active, Effective and Increased Learning—Especially for Low Achievers. Int. J. Educ. Technol. High. Educ. 2016, 13, 33. [Google Scholar] [CrossRef]
  33. Al-Maroof, R.S.; Alhumaid, K.; Akour, I.; Salloum, S. Factors That Affect E-Learning Platforms after the Spread of COVID-19: Post Acceptance Study. Data 2021, 6, 49. [Google Scholar] [CrossRef]
  34. Encarnacion, R.F.E.; Galang, A.A.D.; Hallar, B.J.A. The Impact and Effectiveness of E-Learning on Teaching and Learning. Int. J. Comput. Sci. Res. 2020, 5, 383–397. [Google Scholar] [CrossRef]
  35. Naveed, Q.N.; Choudhary, H.; Ahmad, N.; Alqahtani, J.; Qahmash, A.I. Mobile Learning in Higher Education: A Systematic Literature Review. Sustainability 2023, 15, 13566. [Google Scholar] [CrossRef]
  36. Martono Kurniawan, T.; Nurhayati Oky, D. Implementation of Android-Based Mobile Learning Application as a Flexible Learning Media. Int. J. Comput. Sci. Issues 2014, 11, 168–174. [Google Scholar]
  37. Snezhko, Z.; Babaskin, D.; Vanina, E.; Rogulin, R.; Egorova, Z. Motivation for Mobile Learning: Teacher Engagement and Built-In Mechanisms. Int. J. Interact. Mob. Technol. 2022, 16, 78–93. [Google Scholar] [CrossRef]
  38. Ayuningtyas, P. Whatsapp: Learning on the Go. Metathesis J. Engl. Lang. Lit. Teach. 2018, 2, 159–170. [Google Scholar] [CrossRef]
  39. Salhab, R.; Daher, W. University Students’ Engagement in Mobile Learning. Eur. J. Investig. Health Psychol. Educ. 2023, 13, 202–216. [Google Scholar] [CrossRef]
  40. Anuyahong, B.; Pucharoen, N. Exploring the Effectiveness of Mobile Learning Technologies in Enhancing Student Engagement and Learning Outcomes. Int. J. Emerg. Technol. Learn. 2023, 18, 50–63. [Google Scholar] [CrossRef]
  41. Al-Razgan, M.; Alotaibi, H. Personalized Mobile Learning System to Enhance Language Learning Outcomes. Indian J. Sci. Technol. 2019, 12, 1–9. [Google Scholar] [CrossRef]
  42. Gikas, J.; Grant, M.M. Mobile Computing Devices in Higher Education: Student Perspectives on Learning with Cellphones, Smartphones & Social Media. Internet High. Educ. 2013, 19, 18–26. [Google Scholar] [CrossRef]
  43. Alrasheedi, M.; Capretz, L.F. Determination of Critical Success Factors Affecting Mobile Learning: A Meta-Analysis Approach. Turk. Online J. Educ. Technol. 2015, 14, 41–51. [Google Scholar]
  44. Alasmari, T. The Effect of Screen Size on Students’ Cognitive Load in Mobile Learning. J. Educ. Teach. Learn. 2020, 5, 280–295. [Google Scholar] [CrossRef]
  45. Lakshminarayanan, R.; Ramalingam, R.; Shaik, S.K. Challenges in Transforming, Engaging and Improving M-Learning in Higher Educational Institutions: Oman Perspective. In Proceedings of the Third International Conference of Educational Technology, Seeb, Oman, 24–26 March 2015. [Google Scholar]
  46. Ghamdi, E.; Yunus, F.; Da’ar, O.; El-Metwally, A.; Khalifa, M.; Aldossari, B.; Househ, M. The Effect of Screen Size on Mobile Phone User Comprehension of Health Information and Application Structure: An Experimental Approach. J. Med. Syst. 2016, 40, 11. [Google Scholar] [CrossRef]
  47. Shonola, S.A.; Joy, M. Mobile Learning Security Issues from Lecturers’ Perspectives (Nigerian Universities Case Study). In Proceedings of the EDULEARN14 Conference, Barcelona, Spain, 7–9 July 2014; pp. 7081–7088. [Google Scholar]
  48. Parusheva, S.; Aleksandrova, Y.; Hadzhikolev, H. Use of Social Media in Higher Education Institutions—An Empirical Study Based on Bulgarian Learning Experience. TEM J. 2018, 7, 171–181. [Google Scholar] [CrossRef]
  49. Merelo, J.J.; Castillo, P.A.; Mora, A.M. Chatbots and Messaging Platforms in the Classroom: An Analysis from the Teacher’s Perspective. Educ. Inf. Technol. 2024, 29, 1903–1938. [Google Scholar] [CrossRef]
  50. Ghafar, Z.N. Media Platforms in Any Fields, Academic, Non-Academic, All Over the World in Digital Era: A Critical Review. J. Digit. Learn. Distance Educ. 2024, 2, 707–721. [Google Scholar] [CrossRef]
  51. Mahdiuon, R.; Salimi, G.; Raeisy, L. Effect of Social Media on Academic Engagement and Performance: Perspective of Graduate Students. Educ. Inf. Technol. 2020, 25, 2427–2446. [Google Scholar] [CrossRef]
  52. Sevnarayan, K. Telegram as a Teaching Tool in Distance Education. In Proceedings of the Writing Research Across Borders, Trondheim, Norway, 18–22 February 2023. [Google Scholar] [CrossRef]
  53. Dollah, M.H.M.; Nair, S.M.; Wider, W. The Effects of Utilizing Telegram App to Enhance Students’ ESL Writing Skills. Int. J. Educ. Stud. 2021, 4, 10–16. [Google Scholar] [CrossRef]
  54. Deng, L.; Shen, Y.W.; Chan, J.W.W. Supporting Cross-Cultural Pedagogy with Online Tools: Pedagogical Design and Student Perceptions. TechTrends 2021, 65, 760–770. [Google Scholar] [CrossRef]
  55. Sternad Zabukovšek, S.; Deželak, Z.; Parusheva, S.; Bobek, S. Attractiveness of Collaborative Platforms for Sustainable E-Learning in Business Studies. Sustainability 2022, 14, 8257. [Google Scholar] [CrossRef]
  56. Ironsi, C.S. Google Meet as a Synchronous Language Learning Tool for Emergency Online Distant Learning During the COVID-19 Pandemic: Perceptions of Language Instructors and Preservice Teachers. J. Appl. Res. High. Educ. 2022, 14, 640–659. [Google Scholar] [CrossRef]
  57. de Lima, D.P.R.; Gerosa, M.A.; Conte, T.U. What to Expect, and How to Improve Online Discussion Forums: The Instructors’ Perspective. J. Internet Serv. Appl. 2019, 10, 22. [Google Scholar] [CrossRef]
  58. Martin, F.; Wang, C.; Sadaf, A. Student Perception of Helpfulness of Facilitation Strategies That Enhance Instructor Presence, Connectedness, Engagement and Learning in Online Courses. Internet High. Educ. 2018, 37, 52–65. [Google Scholar] [CrossRef]
  59. Guo, C.; Shea, P.; Chen, X.D. Investigation on Graduate Students’ Social Presence and Social Knowledge Construction in Two Online Discussion Settings. Educ. Inf. Technol. 2022, 27, 2751–2769. [Google Scholar] [CrossRef]
  60. Munoto, M.; Sumbawati, M.S.; Sari, S.F. The Use of Mobile Technology in Learning with Online and Offline Systems. Int. J. Inf. Commun. Technol. Educ. 2021, 17, 54–67. [Google Scholar] [CrossRef]
  61. Gure, S.G. “M-Learning”: Implications and Challenges. Int. J. Sci. Res. 2016, 5, 2087–2093. [Google Scholar]
  62. Razzaque, A. M-Learning Improves Knowledge Sharing over e-Learning Platforms to Build Higher Education Students’ Social Capital. SAGE Open 2020, 10, 1–9. [Google Scholar] [CrossRef]
  63. Filippo, D.; Barreto, C.G.; Fuks, H.; Lucena, C.J.P. Collaboration in Learning with Mobile Devices: Tools for Forum Coordination. In Proceedings of the 22nd ICDE—World Conference on Distance Education, Promoting Quality in Online, Flexible and Distance Education, Ed. ABED, Rio de Janeiro, Brazil, 3–6 September 2006; pp. 1–10. [Google Scholar]
  64. Kljun, M.; Krulec, R.; Pucihar, K.C.; Solina, F. Persuasive Technologies in m-Learning for Training Professionals: How to Keep Learners Engaged with Adaptive Triggering. IEEE Trans. Learn. Technol. 2019, 12, 370–383. [Google Scholar] [CrossRef]
  65. Kartika, D.A.I.; Nurkhamidah, N.; Santosa, I. Construction of EFL Learning Object Materials for Senior High School. J. Ilmu Sos. Dan Pendidik. 2022, 6, 2332–2343. Available online: http://ejournal.mandalanursa.org/index.php/JISIP/index (accessed on 20 January 2025).
  66. Balasundaram, S.; Mathew, J.; Nair, S. Microlearning and Learning Performance in Higher Education: A Post-Test Control. J. Learn. Dev. 2024, 11, 1–14. [Google Scholar]
  67. Farmer, L.S. Collective Intelligence in Online Education. Handb. Res. Pedagog. Models Next Gen. Teach. Learn. 2018, 285–305. [Google Scholar] [CrossRef]
  68. Saranya, A.K. A Critical Study on the Efficiency of Microsoft Teams in Online Education. In Efficacy of Microsoft Teams During COVID-19—A Survey; Bonfring Publication: Tamilnadu, India, 2020; pp. 310–323. [Google Scholar]
  69. Yusuf, M.O. Information and Communication Technology and Education: Analyzing the Nigerian National Policy for Information Technology. Int. Educ. J. 2005, 6, 316–321. [Google Scholar]
  70. Yaroslav, Y. Using Microsoft Teams in Online Learning of Students: Methodical Aspect. Inf. Technol. Learn. Tools 2024, 100, 72–91. [Google Scholar] [CrossRef]
  71. Ilag, B.N. Microsoft Teams Overview. In Understanding Microsoft Teams Administration; Apress: Berkeley, CA, USA, 2020. [Google Scholar] [CrossRef]
  72. Pal, D.; Vanijja, V. Perceived Usability Evaluation of Microsoft Teams as an Online Learning Platform During COVID-19 Using System Usability Scale and Technology Acceptance Model in India. Child. Youth Serv. Rev. 2020, 119, 105535. [Google Scholar] [CrossRef]
  73. Amaxra. Microsoft Teams Mobile App Overview; Amaxra: Redmond, WA, USA, 2020; Available online: https://www.amaxra.com/microsoft-teams-mobile-app-overview#:~:text=Overview%3A%20Microsoft%20Teams%20Mobile%20App,-Originally%20only%20released&text=(The%20app%20was%20previously%20also,mobile%20device%2C%20anywhere%20and%20anytime (accessed on 17 February 2025).
  74. Microsoft. Welcome to Microsoft Teams; Microsoft: Redmond, WA, USA, 2018; Available online: https://docs.microsoft.com/en-us/microsoftteams/teams-overview (accessed on 17 February 2025).
  75. Techterms. App Definition; Techterms: 2019. Available online: https://techterms.com/definition/app (accessed on 13 February 2025).
  76. Compete 366. What Is Microsoft Teams, and Who Should Be Using It? Compete 366: London, UK, 2019; Available online: https://www.compete366.com/blog-posts/microsoft-teams-what-is-it-and-should-we-be-using-it/ (accessed on 15 February 2025).
  77. Ilag, B.N.; Tripathy, D.; Ireddy, V. Organization Readiness for Microsoft Teams. In Understanding Microsoft Teams Administration: Configure, Customize and Manage the Teams Experience; Apress: Berkeley, CA, USA, 2023; pp. 285–334. [Google Scholar] [CrossRef]
  78. Narayn, H. Teams and Power Virtual Agents. In Building the Modern Workplace with SharePoint Online: Solutions with SPFx, JSON Formatting, Power Automate, Power Apps, Teams, and PVA; Apress: Berkeley, CA, USA, 2023; pp. 351–386. [Google Scholar]
  79. Ilag, B.N.; Sabale, A.M. Microsoft Teams Overview. In Troubleshooting Microsoft Teams: Enlisting the Right Approach and Tools in Teams for Mapping and Troubleshooting Issues; Apress: Berkeley, CA, USA, 2022; pp. 17–74. [Google Scholar] [CrossRef]
  80. Waizenegger, L.; McKenna, B.; Cai, W.; Bendz, T. An Affordance Perspective of Team Collaboration and Enforced Working from Home during COVID-19. Eur. J. Inf. Syst. 2020, 29, 429–442. [Google Scholar] [CrossRef]
  81. Bailenson, J.N. Nonverbal Overload: A Theoretical Argument for the Causes of Zoom Fatigue. Technol. Mind Behav. 2021, 2, 1–6. [Google Scholar] [CrossRef]
  82. Statista. Downloads of Microsoft Teams Mobile App Worldwide from 3rd Quarter 2019 to 4th Quarter 2023, by Region; Statista: Hamburg, Germany, 2024; Available online: https://www.statista.com/statistics/1240026/microsoft-teams-global-downloads-app-by-region/ (accessed on 17 January 2025).
  83. Statista. Most Popular Microsoft Apps Worldwide in 4th Quarter 2023, by Downloads (In Millions); Statista: Hamburg, Germany, 2024; Available online: https://www.statista.com/statistics/1268166/most-downloaded-microsoft-apps-worldwide/ (accessed on 17 January 2025).
  84. Microsoft. Get Started with Copilot in Microsoft Teams Meetings; Microsoft Support: Redmond, WA, USA, 2024; Available online: https://support.microsoft.com/en-us/office/get-started-with-copilot-in-microsoft-teams-meetings-0bf9dd3c-96f7-44e2-8bb8-790bedf066b1 (accessed on 11 February 2025).
  85. Microsoft. Use Copilot in Microsoft Teams Chat and Channels; Microsoft Support: Redmond, WA, USA, 2024; Available online: https://support.microsoft.com/en-us/office/use-copilot-in-microsoft-teams-chat-and-channels-cccccca2-9dc8-49a9-ab76-b1a8ee21486c (accessed on 11 February 2025).
  86. Microsoft. Manage Copilot for Microsoft Teams Meetings and Events; Microsoft Learn: Redmond, WA, USA, 2024; Available online: https://learn.microsoft.com/en-us/microsoftteams/copilot-teams-transcription (accessed on 11 February 2025).
  87. Fishbein, M.; Ajzen, I. Belief, Attitude, Intention, and Behavior: An Introduction to Theory and Research. Philos. Rhetor. 1977, 10, 130–132. [Google Scholar]
  88. Davis, F.D. Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information Technology. MIS Q. 1989, 13, 319–340. [Google Scholar] [CrossRef]
  89. Ajzen, I. The Theory of Planned Behavior. Organ. Behav. Hum. Decis. Process. 1991, 50, 179–211. [Google Scholar] [CrossRef]
  90. Taylor, S.; Todd, P.A. Understanding Information Technology Usage: A Test of Competing Models. Inf. Syst. Res. 1995, 6, 144–176. [Google Scholar] [CrossRef]
  91. Thompson, R.L.; Higgins, C.A.; Howell, J.M. Personal Computing: Toward a Conceptual Model of Utilization. MIS Q. 1991, 15, 125–143. [Google Scholar] [CrossRef]
  92. Rogers, E.M. Diffusion of Innovations, 4th ed.; Free Press: New York, NY, USA, 1995. [Google Scholar] [CrossRef]
  93. Bandura, A. Social Foundations of Thought and Action: A Social Cognitive Theory; Prentice Hall: Englewood Cliffs, NJ, USA, 1986. [Google Scholar]
  94. Attuquayefio, S.N.; Addo, H. Using the UTAUT Model to Analyze Students’ ICT Adoption. Int. J. Educ. Dev. Using Inf. Commun. Technol. 2014, 10, 75–86. [Google Scholar] [CrossRef]
  95. Abbad, M.M. Using the UTAUT Model to Understand Students’ Usage of E-Learning Systems in Developing Countries. Educ. Inf. Technol. 2021, 26, 7205–7224. [Google Scholar] [CrossRef]
  96. Garavand, A.; Samadbeik, M.; Nadri, H.; Rahimi, B.; Asadi, H. Effective Factors in Adoption of Mobile Health Applications Between Medical Sciences Students Using the UTAUT Model. Methods Inf. Med. 2019, 58, 131–139. [Google Scholar] [CrossRef]
  97. Yu, C.S. Factors Affecting Individuals to Adopt Mobile Banking: Empirical Evidence from the UTAUT Model. J. Electron. Commer. Res. 2012, 13, 104–121. [Google Scholar]
  98. Bhatiasevi, V. An Extended UTAUT Model to Explain the Adoption of Mobile Banking. Inf. Dev. 2016, 32, 799–814. [Google Scholar] [CrossRef]
  99. Oliveira, T.; Faria, M.; Thomas, M.A.; Popovič, A. Extending the Understanding of Mobile Banking Adoption: When UTAUT Meets TTF and ITM. Int. J. Inf. Manag. 2014, 34, 689–703. [Google Scholar] [CrossRef]
  100. Mosunmola, A.; Mayowa, A.; Okuboyejo, S.; Adeniji, C. Adoption and Use of Mobile Learning in Higher Education: The UTAUT Model. In Proceedings of the 9th International Conference on E-Education, E-Business, E-Management and E-Learning, San Diego, CA, USA, 11–13 January 2018; ACM: New York, NY, USA, 2018; pp. 20–25. [Google Scholar] [CrossRef]
  101. Uzoka, F.M.E. Organisational Influences on E-Commerce Adoption in a Developing Country Context Using UTAUT. Int. J. Bus. Inf. Syst. 2008, 3, 300–316. [Google Scholar] [CrossRef]
  102. Cody-Allen, E.; Kishore, R. An Extension of the UTAUT Model with E-Quality, Trust, and Satisfaction Constructs. In Proceedings of the 2006 ACM SIGMIS CPR Conference on Computer Personnel Research: Forty-Four Years of Computer Personnel Research: Achievements, Challenges & the Future, Claremont, CA, USA, 13–15 April 2006; ACM: New York, NY, USA, 2006; pp. 82–89. [Google Scholar] [CrossRef]
  103. Garson, G.D. Partial Least Squares: Regression and Structural Equation Models; Statistical Associates Publishers: Asheboro, NC, USA, 2016. [Google Scholar] [CrossRef]
  104. Sarstedt, M.; Ringle, C.M.; Hair, J.F. Partial Least Squares Structural Equation Modeling. In Handbook of Market Research; Homburg, C., Klarmann, M., Vomberg, A.E., Eds.; Springer: Cham, Switzerland, 2021; pp. 587–632. [Google Scholar] [CrossRef]
  105. Sternad Zabukovšek, S.; Kalinic, Z.; Bobek, S.; Tominc, P. SEM–ANN Based Research of Factors’ Impact on Extended Use of ERP Systems. Cent. Eur. J. Oper. Res. 2019, 27, 703–735. [Google Scholar] [CrossRef]
  106. Sternad Zabukovšek, S.; Bobek, S.; Zabukovšek, U.; Kalinić, Z.; Tominc, P. Enhancing PLS-SEM-Enabled Research with ANN and IPMA: Research Study of Enterprise Resource Planning (ERP) Systems’ Acceptance Based on the Technology Acceptance Model (TAM). Mathematics 2022, 10, 1379. [Google Scholar] [CrossRef]
  107. Zabukovšek, U.; Tominc, P.; Bobek, S. Business IT Alignment Impact on Corporate Sustainability. Sustainability 2023, 15, 12519. [Google Scholar] [CrossRef]
Figure 1. Impact of E-learning on educational environments.
Figure 2. E-learning channels.
Figure 3. Research model.
Figure 4. Structural Model with Path Coefficients (β) and t-values.
Figure 5. Importance–Performance Map (IPMA) for Behavioural Intention.
Table 1. Descriptive statistics.

| Variables | 2021/2022 Frequency | 2021/2022 % | 2023/2024 Frequency | 2023/2024 % | Total Frequency | Total % |
|---|---|---|---|---|---|---|
| Gender | | | | | | |
| Male | 59 | 39.60% | 98 | 33.11% | 157 | 35.28% |
| Female | 90 | 60.40% | 198 | 66.89% | 288 | 64.72% |
| Age | | | | | | |
| <19 years | 0 | 0.00% | 2 | 0.68% | 2 | 0.45% |
| 19–25 years | 139 | 93.29% | 293 | 98.99% | 432 | 97.08% |
| >25 years | 10 | 6.71% | 1 | 0.34% | 11 | 2.47% |
| Study Program | | | | | | |
| Higher Education | 96 | 64.43% | 83 | 28.04% | 179 | 40.22% |
| University Program | 32 | 21.48% | 187 | 63.18% | 219 | 49.21% |
| Master’s Program | 21 | 14.09% | 26 | 8.78% | 47 | 10.56% |
| Year of Study | | | | | | |
| First Year | 105 | 70.47% | 163 | 55.07% | 268 | 60.22% |
| Second Year | 22 | 14.77% | 38 | 12.84% | 60 | 13.48% |
| Third Year | 20 | 13.42% | 88 | 29.73% | 108 | 24.27% |
| Graduate Year | 2 | 1.34% | 7 | 2.36% | 9 | 2.02% |
| Frequency of Using MS Teams | | | | | | |
| Three or more times per day | 27 | 18.12% | 56 | 18.92% | 83 | 18.65% |
| Once or twice per day | 63 | 42.28% | 163 | 55.07% | 226 | 50.79% |
| Less than once per day | 25 | 16.78% | 43 | 14.53% | 68 | 15.28% |
| Once a week or less | 34 | 22.81% | 34 | 11.49% | 68 | 15.28% |
Table 2. Descriptive Statistics for Indicators and Factor Loadings.

| Factor | Indicator | Mean | Median | Observed Min | Observed Max | Standard Deviation | Factor Loading |
|---|---|---|---|---|---|---|---|
| Behavioural Intention | BI1 | 3.948 | 4.000 | 1.000 | 5.000 | 1.008 | 0.727 |
| | BI3 | 4.002 | 4.000 | 1.000 | 5.000 | 0.844 | 0.844 |
| | BI4 | 3.665 | 4.000 | 1.000 | 5.000 | 0.952 | 0.899 |
| | BI5 | 3.863 | 4.000 | 1.000 | 5.000 | 0.944 | 0.908 |
| Effort Expectancy | EE1 | 4.447 | 5.000 | 1.000 | 5.000 | 0.643 | 0.768 |
| | EE2 | 4.321 | 4.000 | 1.000 | 5.000 | 0.685 | 0.816 |
| | EE3 | 4.364 | 4.000 | 1.000 | 5.000 | 0.659 | 0.768 |
| | EE4 | 4.162 | 4.000 | 1.000 | 5.000 | 0.707 | 0.765 |
| | EE5 | 4.438 | 4.000 | 1.000 | 5.000 | 0.603 | 0.815 |
| | EE6 | 4.413 | 4.000 | 1.000 | 5.000 | 0.639 | 0.828 |
| Facilitating Conditions | FC1 | 4.620 | 5.000 | 1.000 | 5.000 | 0.648 | 0.685 |
| | FC2 | 4.533 | 5.000 | 2.000 | 5.000 | 0.554 | 0.701 |
| | FC5 | 3.953 | 4.000 | 1.000 | 5.000 | 0.869 | 0.816 |
| Performance Expectancy | PE1 | 3.921 | 4.000 | 1.000 | 5.000 | 0.882 | 0.732 |
| | PE10 | 4.245 | 4.000 | 1.000 | 5.000 | 0.741 | 0.731 |
| | PE3 | 3.622 | 4.000 | 1.000 | 5.000 | 0.895 | 0.764 |
| | PE5 | 4.178 | 4.000 | 1.000 | 5.000 | 0.757 | 0.795 |
| Social Influence | SI1 | 2.782 | 3.000 | 1.000 | 5.000 | 1.087 | 0.923 |
| | SI2 | 2.501 | 3.000 | 1.000 | 5.000 | 1.029 | 0.922 |
Table 3. Construct reliability and validity.

| Factors | Cronbach’s Alpha | Composite Reliability (CR) | Dijkstra–Henseler’s Rho (ρ_A) | Average Variance Extracted (AVE) | f² |
|---|---|---|---|---|---|
| Behavioural Intention | 0.866 | 0.872 | 0.845 | 0.718 | — |
| Effort Expectancy | 0.883 | 0.888 | 0.793 | 0.630 | 0.016 |
| Facilitating Conditions | 0.621 | 0.669 | 0.734 | 0.542 | 0.040 |
| Performance Expectancy | 0.750 | 0.753 | 0.756 | 0.571 | 0.152 |
| Social Influence | 0.825 | 0.825 | 0.923 | 0.851 | 0.045 |
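The AVE column in Table 3 can be cross-checked directly from the standardised loadings in Table 2, since AVE is the mean of the squared indicator loadings per construct. A minimal sketch in Python, using the Table 2 loadings (an illustration of the formula, not the authors' SmartPLS computation):

```python
# Cross-check AVE from standardised indicator loadings (Table 2).
# AVE = mean of squared loadings per construct.
loadings = {
    "BI": [0.727, 0.844, 0.899, 0.908],
    "EE": [0.768, 0.816, 0.768, 0.765, 0.815, 0.828],
    "FC": [0.685, 0.701, 0.816],
    "PE": [0.732, 0.731, 0.764, 0.795],
    "SI": [0.923, 0.922],
}

def ave(lams):
    """Average variance extracted: mean of the squared loadings."""
    return sum(lam * lam for lam in lams) / len(lams)

for construct, lams in loadings.items():
    print(construct, round(ave(lams), 3))
```

Rounded to three decimals, these values reproduce the AVE column of Table 3 (e.g. SI: (0.923² + 0.922²)/2 ≈ 0.851); the square roots of these AVEs form the diagonal of the Fornell–Larcker matrix in Table 4.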
Table 4. Discriminant Validity—Fornell–Larcker Criterion.

| Factors | BI | EE | FC | PE | SI |
|---|---|---|---|---|---|
| Behavioural Intention | 0.847 | | | | |
| Effort Expectancy | 0.457 | 0.794 | | | |
| Facilitating Conditions | 0.495 | 0.544 | 0.736 | | |
| Performance Expectancy | 0.611 | 0.532 | 0.504 | 0.756 | |
| Social Influence | 0.415 | 0.199 | 0.298 | 0.422 | 0.923 |
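The Fornell–Larcker criterion holds when each construct's square root of AVE (the diagonal) exceeds its correlations with every other construct. A sketch that checks this against the Table 4 values (values transcribed from the table; this is a verification aid, not part of the authors' analysis):

```python
# Lower-triangular construct correlations from Table 4;
# diagonal entries are the square roots of the AVEs.
order = ["BI", "EE", "FC", "PE", "SI"]
table4 = {
    ("BI", "BI"): 0.847,
    ("EE", "BI"): 0.457, ("EE", "EE"): 0.794,
    ("FC", "BI"): 0.495, ("FC", "EE"): 0.544, ("FC", "FC"): 0.736,
    ("PE", "BI"): 0.611, ("PE", "EE"): 0.532, ("PE", "FC"): 0.504,
    ("PE", "PE"): 0.756,
    ("SI", "BI"): 0.415, ("SI", "EE"): 0.199, ("SI", "FC"): 0.298,
    ("SI", "PE"): 0.422, ("SI", "SI"): 0.923,
}

def corr(a, b):
    """Symmetric lookup into the lower-triangular matrix."""
    return table4.get((a, b)) or table4.get((b, a))

def fornell_larcker_holds():
    """True when every diagonal exceeds all correlations in its row/column."""
    for c in order:
        diag = corr(c, c)
        if any(corr(c, other) >= diag for other in order if other != c):
            return False
    return True

print(fornell_larcker_holds())
```

For these data the criterion is satisfied for all five constructs, confirming the discriminant validity reported in the text.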
Table 5. Discriminant Validity—Heterotrait–Monotrait Ratio Matrix.

| Factors | BI | EE | FC | PE | SI |
|---|---|---|---|---|---|
| Behavioural Intention | | | | | |
| Effort Expectancy | 0.518 | | | | |
| Facilitating Conditions | 0.611 | 0.746 | | | |
| Performance Expectancy | 0.752 | 0.644 | 0.657 | | |
| Social Influence | 0.491 | 0.227 | 0.354 | 0.537 | |
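The HTMT criterion is commonly judged against the conservative 0.85 cut-off (0.90 for conceptually similar constructs). A quick check of the Table 5 ratios (values transcribed from the table):

```python
# HTMT ratios from Table 5; discriminant validity is commonly accepted
# when every ratio stays below the conservative 0.85 threshold.
htmt = {
    ("EE", "BI"): 0.518,
    ("FC", "BI"): 0.611, ("FC", "EE"): 0.746,
    ("PE", "BI"): 0.752, ("PE", "EE"): 0.644, ("PE", "FC"): 0.657,
    ("SI", "BI"): 0.491, ("SI", "EE"): 0.227, ("SI", "FC"): 0.354,
    ("SI", "PE"): 0.537,
}

worst_pair = max(htmt, key=htmt.get)  # pair closest to the threshold
print(worst_pair, htmt[worst_pair], all(v < 0.85 for v in htmt.values()))
```

The largest ratio (PE–BI, 0.752) remains comfortably below 0.85, so the HTMT test corroborates the Fornell–Larcker result.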
Table 6. Path Coefficients and Hypothesis Testing Results.

| Relationship | Original Sample (β) | Sample Mean (M) | Standard Deviation (STDEV) | t-Statistic | p-Value | Hypothesis |
|---|---|---|---|---|---|---|
| PE → BI | 0.380 | 0.380 | 0.047 | 8.117 | 0.000 | H1 confirmed |
| EE → BI | 0.118 | 0.118 | 0.048 | 2.448 | 0.014 | H2 confirmed |
| SI → BI | 0.175 | 0.174 | 0.041 | 4.241 | 0.000 | H3 confirmed |
| FC → BI | 0.186 | 0.189 | 0.054 | 3.462 | 0.001 | H4 confirmed |
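The t-statistics in Table 6 follow from the bootstrap output as t = β / STDEV, with the two-tailed p-value taken from the standard normal for large samples. A sketch reproducing them from the tabled values (small discrepancies from the reported t-values arise because the table rounds β and STDEV to three decimals):

```python
import math

# Bootstrap results from Table 6: beta and bootstrap standard error (STDEV).
paths = {
    "PE -> BI": (0.380, 0.047),
    "EE -> BI": (0.118, 0.048),
    "SI -> BI": (0.175, 0.041),
    "FC -> BI": (0.186, 0.054),
}

def t_and_p(beta, se):
    """t-statistic and two-tailed normal p-value for a path coefficient."""
    t = beta / se
    p = math.erfc(t / math.sqrt(2))  # = 2 * (1 - Phi(t))
    return t, p

for name, (beta, se) in paths.items():
    t, p = t_and_p(beta, se)
    print(f"{name}: t = {t:.3f}, p = {p:.3f}")
```

For example, EE → BI gives t ≈ 2.46 and p ≈ 0.014, matching the reported significance of H2.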
Table 7. Predictive Relevance (Q²) and RMSE Comparison.

| Indicator | Q²predict | PLS-SEM RMSE | LM RMSE | Difference |
|---|---|---|---|---|
| BI1 | 0.259 | 0.870 | 0.877 | −0.007 |
| BI3 | 0.317 | 0.699 | 0.700 | −0.001 |
| BI4 | 0.312 | 0.791 | 0.796 | −0.005 |
| BI5 | 0.349 | 0.763 | 0.763 | 0.000 |
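In the PLSpredict logic behind Table 7, predictive relevance is supported when each indicator's Q²predict is positive and the PLS-SEM RMSE does not exceed the RMSE of the naive linear-model (LM) benchmark. A sketch of that check on the tabled values:

```python
# PLSpredict comparison from Table 7: indicator -> (Q2predict, PLS RMSE, LM RMSE).
rows = {
    "BI1": (0.259, 0.870, 0.877),
    "BI3": (0.317, 0.699, 0.700),
    "BI4": (0.312, 0.791, 0.796),
    "BI5": (0.349, 0.763, 0.763),
}

# Difference column: negative (or zero) means the PLS model predicts
# at least as well as the linear benchmark.
diffs = {k: round(pls - lm, 3) for k, (q2, pls, lm) in rows.items()}
all_positive_q2 = all(q2 > 0 for q2, _, _ in rows.values())
print(diffs, all_positive_q2)
```

All four differences are ≤ 0 and all Q²predict values are positive, which is the pattern supporting the model's out-of-sample predictive power for the BI block.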
Table 8. Importance–Performance Map Analysis (IPMA) Results for Behavioural Intention.

| Construct | Importance | Performance |
|---|---|---|
| EE | 0.118 | 83.890 |
| FC | 0.186 | 81.907 |
| PE | 0.380 | 75.671 |
| SI | 0.175 | 40.962 |
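An IPMA is read by looking for constructs with high importance (total effect on BI) but comparatively low performance, since these promise the largest payoff from managerial action. A sketch of that readout on the Table 8 values:

```python
# IPMA results from Table 8: construct -> (importance, performance).
ipma = {
    "EE": (0.118, 83.890),
    "FC": (0.186, 81.907),
    "PE": (0.380, 75.671),
    "SI": (0.175, 40.962),
}

most_important = max(ipma, key=lambda c: ipma[c][0])
lowest_performing = min(ipma, key=lambda c: ipma[c][1])
print(most_important, lowest_performing)
```

Here PE carries the greatest importance while SI shows by far the weakest performance (40.96), which is why social influence stands out as an improvement lever despite its moderate importance.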
Table 9. Step 2 of the MICOM Procedure—Compositional Invariance Results.

| Construct | Original Correlation | Correlation Permutation Mean | 5.0% | Permutation p-Value |
|---|---|---|---|---|
| BI | 0.999 | 0.999 | 0.998 | 0.120 |
| EE | 0.998 | 0.998 | 0.995 | 0.487 |
| FC | 0.987 | 0.986 | 0.953 | 0.357 |
| PE | 0.999 | 0.997 | 0.993 | 0.686 |
| SI | 1.000 | 0.999 | 0.997 | 0.971 |
Table 10. Step 3 of the MICOM Procedure—Mean and Variance Differences.

Mean Values

| Construct | Original Difference | Permutation Mean Difference | 5.0% | 95.0% | Permutation p-Value |
|---|---|---|---|---|---|
| BI | −0.160 | −0.002 | −0.170 | 0.164 | 0.060 |
| EE | −0.282 | −0.001 | −0.168 | 0.162 | 0.003 |
| FC | −0.306 | −0.001 | −0.169 | 0.163 | 0.002 |
| PE | −0.348 | −0.001 | −0.168 | 0.163 | 0.000 |
| SI | 0.067 | −0.003 | −0.171 | 0.164 | 0.243 |

Variance Differences

| Construct | Original Difference | Permutation Mean Difference | 5.0% | 95.0% | Permutation p-Value |
|---|---|---|---|---|---|
| BI | −0.001 | −0.006 | −0.276 | 0.254 | 0.510 |
| EE | −0.405 | −0.009 | −0.303 | 0.323 | 0.007 |
| FC | −0.119 | −0.007 | −0.290 | 0.277 | 0.261 |
| PE | 0.090 | −0.006 | −0.255 | 0.243 | 0.268 |
| SI | −0.120 | −0.005 | −0.212 | 0.200 | 0.180 |
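The MICOM comparisons in Tables 9 and 10 rest on permutation tests: group labels are repeatedly shuffled, the statistic is recomputed, and the observed difference is located within the resulting permutation distribution. A toy sketch of that logic for a difference in means (synthetic data, not the study's scores):

```python
import random

def permutation_p(group_a, group_b, n_perm=999, seed=42):
    """Two-sided permutation p-value for a difference in group means."""
    rng = random.Random(seed)
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = list(group_a) + list(group_b)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # random reassignment of group labels
        a, b = pooled[: len(group_a)], pooled[len(group_a):]
        if abs(sum(a) / len(a) - sum(b) / len(b)) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # add-one correction

# Synthetic example: two clearly shifted groups yield a small p-value.
gen = random.Random(0)
g1 = [gen.gauss(4.0, 0.5) for _ in range(30)]
g2 = [gen.gauss(3.0, 0.5) for _ in range(30)]
print(permutation_p(g1, g2))
```

In Table 10, mean differences for EE, FC, and PE fall outside the permutation interval (p < 0.05), signalling mean non-invariance between the two cohorts, while the variance differences (except EE) do not.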
Table 11. Partial Measurement Invariance—Path Comparisons Across Groups.

| Path | Original (2022) | p-Value (2022) | Original (2024) | p-Value (2024) | Invariant |
|---|---|---|---|---|---|
| EE → BI | 0.071 | 0.424 | 0.145 | 0.013 | No |
| FC → BI | 0.251 | 0.005 | 0.157 | 0.014 | Yes |
| PE → BI | 0.367 | 0.000 | 0.385 | 0.000 | Yes |
| SI → BI | 0.155 | 0.033 | 0.183 | 0.000 | Yes |
Table 12. Multigroup Analysis—Path Differences, Significance, and Effect Sizes Across Student Groups (2022 vs. 2024).

| Path | Original (2022) | Original (2024) | Mean (2022) | Mean (2024) | STDEV (2022) | STDEV (2024) | t-Value (2022) | t-Value (2024) | p-Value (2022) | p-Value (2024) |
|---|---|---|---|---|---|---|---|---|---|---|
| EE → BI | 0.071 | 0.145 | 0.075 | 0.144 | 0.089 | 0.058 | 0.799 | 2.487 | 0.424 | 0.013 |
| FC → BI | 0.251 | 0.157 | 0.250 | 0.160 | 0.090 | 0.064 | 2.807 | 2.471 | 0.005 | 0.014 |
| PE → BI | 0.367 | 0.385 | 0.371 | 0.386 | 0.096 | 0.055 | 3.805 | 7.051 | 0.000 | 0.000 |
| SI → BI | 0.155 | 0.183 | 0.153 | 0.182 | 0.073 | 0.051 | 2.128 | 3.601 | 0.033 | 0.000 |
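One quick way to gauge whether two group-specific path coefficients differ is a pooled-standard-error z-test on the bootstrap estimates; this is a common approximation, not the SmartPLS permutation-based MGA the study itself uses. Applied to the Table 12 figures:

```python
import math

# Group estimates from Table 12: path -> (beta_2022, se_2022, beta_2024, se_2024).
groups = {
    "EE -> BI": (0.071, 0.089, 0.145, 0.058),
    "FC -> BI": (0.251, 0.090, 0.157, 0.064),
    "PE -> BI": (0.367, 0.096, 0.385, 0.055),
    "SI -> BI": (0.155, 0.073, 0.183, 0.051),
}

def group_diff_z(b1, se1, b2, se2):
    """z-statistic for the difference between two independent path estimates."""
    return (b1 - b2) / math.sqrt(se1 ** 2 + se2 ** 2)

for path, (b1, se1, b2, se2) in groups.items():
    z = group_diff_z(b1, se1, b2, se2)
    print(f"{path}: z = {z:.2f}")  # |z| < 1.96 -> no significant cohort difference
```

Under this approximation no path difference reaches |z| > 1.96, consistent with the structural relationships holding up across the 2022 and 2024 cohorts.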
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Parusheva, S.; Klancnik, I.S.; Bobek, S.; Sternad Zabukovsek, S. Enhancing Sustainability of E-Learning with Adoption of M-Learning in Business Studies. Sustainability 2025, 17, 3487. https://doi.org/10.3390/su17083487