Article

Measuring Students’ Use of Digital Technology to Support Their Studies

Faculty of Economics and Business, University of Maribor, Razlagova 14, 2000 Maribor, Slovenia
*
Author to whom correspondence should be addressed.
Educ. Sci. 2025, 15(7), 842; https://doi.org/10.3390/educsci15070842
Submission received: 15 May 2025 / Revised: 14 June 2025 / Accepted: 26 June 2025 / Published: 2 July 2025

Abstract

To provide a more holistic understanding of how digital tools shape the educational environment, this paper includes a comprehensive analysis that explores several dimensions of technology use in higher education: use of artificial intelligence in education, online collaboration, use of an E-Board for learning, and excessive use of technology. To measure students’ use of digital technology to support their studies, this research develops the measurement process, builds a multi-criteria model, and applies it to a real-life example: determining the degree of students’ use of digital technology in relation to the demonstrated quality of academic performance. The analysis is based on a survey conducted among students at the University of Maribor’s Faculty of Economics and Business. Using factor analysis and multi-criteria evaluation, the findings reveal that students who demonstrate very-high-quality achievements also report the highest level of technology use to support their studies. They are followed by students with outstanding achievements, who excel in using an E-Board for learning and in demonstrating responsibility regarding excessive technology use. Students who achieve acceptable-quality results with certain defects stand out in online collaboration and the use of AI in the study process. The lowest level of technology use was reported by students demonstrating moderate-quality achievements. Theoretically, this research contributes to a better understanding of the multidimensional use of digital technology in higher education, while, practically, it provides useful guidelines for optimizing digital learning tools and enhancing the overall quality of the academic process.

1. Introduction

Over the past decade, online learning has become an integral part of academic life and a crucial element in the long-term strategy of educational institutions. The rapid advancement of digital technology, combined with globalization and the increasing demand for accessible education, has transformed traditional learning environments (Chen et al., 2020). As universities seek to adapt to the evolving needs of students, digital education has gained prominence, offering new opportunities for learning beyond geographical and temporal constraints. The shift from conventional classroom-based instruction to blended and fully online learning models has not only expanded access to education but also redefined pedagogical strategies, assessment methods, and student engagement (Correia & Good, 2023).
One of the driving forces behind this transformation has been the trend toward openness and information sharing, facilitated by digital platforms and cloud-based learning management systems (McGrath et al., 2023). These innovations have enabled the widespread adoption of e-learning tools, virtual classrooms, and remote collaboration software, fostering a more dynamic and interconnected academic landscape (Correia & Good, 2023). The proliferation of digital resources, such as open educational materials, online courses, and interactive simulations, has empowered students to take a more self-directed approach to their learning (Chen et al., 2020). Moreover, technological advancements have led to the development of adaptive learning systems that personalize educational content based on individual progress and learning styles, making education more inclusive and tailored to diverse student needs (Ateş & Köroğlu, 2024).
With the continuous evolution of digital tools, students increasingly engage with diverse technological resources to enhance their learning experiences. From artificial intelligence (AI)-powered tutoring systems and automated feedback mechanisms to immersive virtual reality environments, digital technology is revolutionizing the way students acquire knowledge, collaborate, and interact with course materials (Tumaini et al., 2021). Integrating digital platforms, artificial intelligence, and collaborative tools has reshaped the traditional academic landscape, enabling more flexible and interactive forms of education (Kamalov et al., 2023). These developments have made learning more engaging and introduced new teaching methodologies, such as flipped classrooms and gamified learning experiences, which have been shown to improve student motivation and knowledge retention (Kovari, 2025).
In recent years, artificial intelligence has emerged as a key driver of educational transformation, enabling more adaptive and personalized learning experiences. AI-powered tools, such as intelligent tutoring systems, automated grading software, and predictive analytics, have the potential to enhance academic performance, optimize instructional methods, and improve student engagement (Farhan et al., 2024). These innovations allow educators to gain deeper insights into student learning patterns, identify struggling students early, and provide targeted interventions to support their academic progress (Chu et al., 2022). At the same time, the increasing reliance on digital technology raises concerns about its excessive use and its potential negative impact on students’ well-being and learning outcomes. The convenience of online learning and the abundance of digital resources have contributed to a shift in study habits, with students spending more time on screens and engaging in multitasking behaviors that may hinder deep learning and cognitive processing (Aivaz & Teodorescu, 2022). Thus, the balance between leveraging technology for academic success and mitigating its potential drawbacks remains a critical issue in higher education research (Chu et al., 2022; Aivaz & Teodorescu, 2022).
Despite numerous studies on digital learning, there is still a need for a comprehensive analysis that explores multiple dimensions of technology use in higher education. This study seeks to bridge this gap by analyzing four key constructs: excessive use of technology, the use of E-Boards for learning, online collaboration, and AI integration in education. By investigating these dimensions collectively, this research aims to provide a more holistic understanding of how digital tools shape the educational environment. To achieve this, a survey was conducted among undergraduate and postgraduate students at the University of Maribor, Faculty of Economics and Business, with the aim of examining their engagement with various digital learning tools and assessing their perceptions of technology’s impact on their studies. The survey also sought to explore students’ attitudes toward online learning environments, AI-assisted educational technologies, and collaborative platforms.
This research, therefore, aims to measure the students’ use of digital technology to support their studies. The objectives include developing the measurement process, building the multi-criteria model, and applying it to a real-life example of determining the degree of students’ use of digital technology in relation to the demonstrated quality of academic performance.
In the remainder of the paper, a literature review provides a comprehensive examination of the four examined constructs of digital technology use in studying. However, there is a gap in the comparative analysis of digital technology usage levels, both within individual dimensions and across multiple dimensions, particularly in relation to the demonstrated quality of students’ academic performance. To fill this gap, the research questions are as follows:
  • What are the levels of students’ use of digital technology in studying concerning the demonstrated quality of academic performance?
  • In which dimensions of digital technology use in studying do students excel based on the demonstrated quality of their academic performance?
We then introduce the methodology used in the study, covering data collection, the sample, the measurement instrument, statistical analysis, and multi-criteria measurement. This is followed by a detailed presentation of the results, whose significance and theoretical and practical implications are discussed in the conclusion, along with identified limitations and possibilities for future research.

2. Literature Review

2.1. The Integration of Artificial Intelligence in Higher Education

With the rapid advancement of technology, AI has become an essential component of modern life, particularly in higher education. AI-driven technologies are transforming various aspects of academia, including instruction, student engagement, assessment, curriculum management, and institutional data analysis (McGrath et al., 2023). By integrating AI into educational processes, universities aim to enhance learning experiences, optimize instructional methods, and improve administrative efficiency (Kohnke et al., 2023). One of AI’s most significant contributions to education is enhancing learning experiences by adapting content and improving engagement. Machine learning algorithms analyze students’ progress and tailor curricula to individual needs, leading to greater student motivation, deeper understanding of material, and improved knowledge retention. Personalization of learning content enhances student engagement and reduces dropout rates by allowing students to learn at their own pace (Chen et al., 2020). Moreover, AI enables the analysis of student behavioral patterns, which can assist in the early detection of learning difficulties and provide timely support. Advanced data analytics allow educational institutions to develop effective support strategies, ultimately improving the overall quality of education (Chu et al., 2022). AI-driven virtual assistants and chatbots provide 24/7 support for students by answering queries related to coursework, deadlines, and institutional policies. These systems enhance student engagement by offering personalized study recommendations and resources. Some AI assistants function as learning companions, helping students with assignments, language learning, and problem-solving (Crompton & Burke, 2023). The integration of AI technologies into educational environments has significantly enhanced the quality of learning experiences, instructional methods, and administrative decision-making (McGrath et al., 2023; Farhan et al., 2024). 
The study by Farhan et al. (2024), conducted at the University of Baghdad with 379 pharmacy students, explored the impact of AI on educational service quality. The findings showed a strong positive correlation between AI use and perceived improvements. AI-assisted curricula enhanced quality by 52.4% through personalized learning and automated assessments. Decision-making supported by AI improved outcomes by 60.4%, enabling better resource allocation and performance tracking. AI-powered e-learning platforms contributed 53.9% by increasing engagement and accessibility, while AI-based training programs improved learning effectiveness by 50.1% through real-time feedback and virtual simulations.

2.2. Online Collaboration Among Students: Benefits and Challenges

Modern science education is increasingly adapting to the digital era by developing essential competencies and knowledge that enable students to understand and address complex scientific and technological challenges (Ateş & Köroğlu, 2024). One of the key trends in higher education is the growing integration of online collaboration tools, which facilitate more efficient knowledge sharing and connectivity among students (Huang et al., 2020). The use of collaborative platforms, such as shared documents, virtual classrooms, and communication tools, not only enhances active engagement in the learning process but also strengthens teamwork, critical thinking, and problem-solving skills in a digital environment (Kalogiannakis et al., 2021). This approach to collaboration improves access to educational content, provides greater flexibility in studying, and contributes to a deeper understanding of the subject matter through interactive discussions and group projects (Rannastu-Avalos & Siiman, 2020; Salta et al., 2020). As web-based applications, such tools facilitate various collaborative activities including messaging, file sharing, and assessment, fostering dynamic interaction. Not only do they enhance communication and cooperation among students, but they also boost student engagement and motivation in the learning process (Ateş & Köroğlu, 2024; Abdillah et al., 2021). Additionally, these tools provide students with opportunities to apply their acquired knowledge to real-world scenarios, offering numerous benefits from promoting active participation to facilitating the transfer of theoretical knowledge into practice (Alismaiel et al., 2022). Real-time interactive applications enable students to connect and collaborate both with their peers and educators, creating a wide range of learning opportunities. 
The integration of these tools into science education requires an approach based on reciprocal interaction between the learner and the learning environment (Alismaiel et al., 2022; Feyzi Behnagh & Yasrebi, 2020). By providing students with a digital environment that fosters authentic and meaningful learning experiences, online collaboration tools bridge the gap between education and real-world problem-solving (Salta et al., 2020). Online collaboration tools play a crucial role in enhancing students’ learning outcomes, motivation, and engagement. As digital platforms, they facilitate effective communication, resource sharing, and group task execution, contributing to a dynamic and interactive learning experience (Abdillah et al., 2021). The use of these technologies fosters active student participation, strengthens critical thinking, and enables the connection between theoretical knowledge and practical application. Students who engage in online collaboration tend to achieve better academic results and exhibit higher motivation as they have the opportunity to interact, collaborate, and solve real-world problems in a digital environment (Ateş & Köroğlu, 2024). Additionally, these platforms offer flexibility in learning and promote the development of teamwork skills, which are essential for modern education and career advancement. Despite their numerous advantages, studies highlight certain challenges, such as students’ digital literacy, access to stable internet connections, and the need to adapt teaching methodologies for the effective integration of these tools into educational processes (Ateş & Köroğlu, 2024; Rannastu-Avalos & Siiman, 2020).

2.3. Digital Learning Environments: The Use of E-Boards in Education

The increasing demand for digital tools in education has led to the development of electronic blackboard systems (E-Boards), which provide an alternative to traditional chalkboards and whiteboards. These digital learning tools aim to enhance teaching efficiency, reduce manual workload, and create a more interactive learning experience. Digital boards contribute to a more interactive learning environment, supporting multimedia content such as images and equations, with potential future developments for video integration. Unlike traditional blackboards, which require continuous writing and erasing, E-Boards provide a seamless transition between content display and lecture flow, improving teaching effectiveness (Mishra et al., 2019). The integration of E-Boards into educational environments has significantly transformed the way students engage in learning, particularly in the context of cooperative learning and digital collaboration (Mhouti et al., 2017). According to Bodnenko et al. (2020), the use of E-Boards facilitates student interaction, enhances teamwork, and improves the overall organization of educational activities, especially in distance learning settings, making them an indispensable tool for modern education. As universities continue adopting technology-enhanced learning environments, E-Boards will play an increasing role in supporting interactive, cooperative, and flexible education models. Mhouti et al. (2017) emphasized that E-Boards combined with cloud-based solutions facilitate real-time student collaboration, enabling shared content access and interactive discussions, even in remote learning environments. The findings suggest that the adoption of E-Boards and cloud-based solutions can significantly enhance digital education, providing flexibility, efficiency, and improved learning outcomes. 
However, challenges such as internet connectivity issues, security concerns, and the need for digital literacy among educators must be addressed to fully leverage the potential of these technologies. As educational institutions continue to transition toward technology-enhanced learning, the role of E-Boards and cloud computing is expected to expand, offering new opportunities for personalized and interactive teaching methodologies.

2.4. Excessive Use of Technology and Its Impact on Academic Performance

The use of digital technology to enable learning anytime and anywhere, deliver content, and facilitate student engagement has seen significant growth in recent years. Several studies highlight how students leverage digital technologies to optimize their learning experiences, including information retrieval, collaborative engagement, and interaction with faculty and peers (Cohen et al., 2022). These technologies have contributed to more flexible learning environments, improved student-centered education, and broader access to knowledge. Furthermore, digital tools support a more inclusive and equitable higher education landscape by bridging geographical barriers and providing learners with diverse resources tailored to their needs (Pechenkina & Aeschliman, 2017). In modern higher education, increasing attention is being given to the role of educational technology and its impact on teaching methods and students’ academic achievements (Mishra et al., 2019). The study by Correia and Good (2023), conducted at an American university, examined students’ perceptions regarding the use of technology in the learning process. They analyzed 34,480 survey responses to determine how technology influences teaching, learning progress, and overall course satisfaction. The results showed a strong correlation between the use of technology and the effectiveness of teaching methods. Instructors who utilized digital tools received higher ratings for clarity of explanation, connecting theory with practice, and encouraging students to engage in independent research. Technology-enhanced instruction had a positive impact on achieving learning objectives, with students highlighting the following key benefits: (1) improvement in analytical thinking and problem-solving, (2) enhancement of competencies relevant to professional careers, and (3) development of critical thinking and independent research skills (Correia & Good, 2023). 
The study by Pechenkina and Aeschliman (Pechenkina & Aeschliman, 2017), conducted at Swinburne University of Technology in Australia, examined students’ preferences in the use of educational technology. The research analyzed 66 survey responses from randomly selected students. The findings revealed that 78% of students believe that a combination of online and face-to-face learning helps them better understand the content. Most students preferred a blended learning approach over exclusively online or traditional face-to-face education. Additionally, 49% of students used social media platforms (Facebook and Twitter) to communicate with their peers about their studies. The results also indicated that social media plays an important role in student communication (Pechenkina & Aeschliman, 2017).

3. Materials and Methods

3.1. Data and Sample

From 3 January 2024 to 20 February 2024, a total of 287 undergraduate and postgraduate students from the Faculty of Economics and Business at the University of Maribor in Slovenia participated in an online survey. Of the respondents, 54.4% were undergraduate students, while 45.6% were enrolled in postgraduate programs. The sample comprised 43% male and 57% female students. Regarding their field of study, the distribution was as follows: 3% were from strategic and project management, 3% from international business economics, 7% from economics, 6% from accounting, auditing, and taxation, 9% from entrepreneurship and innovation, 8% from management informatics and e-business, 16% from finance and banking, 26% from marketing, and 22% from management, organization, and human resources. Among students who recorded their average grades, 25% demonstrated acceptable-quality achievements with defects, 50% moderate-quality achievements, 23% very-high-quality achievements, and 2% demonstrated outstanding achievements.

3.2. Measurement Instrument

A closed-ended online questionnaire was employed as the research instrument. The students were asked to rate their agreement with the provided statements on a 5-point Likert-type scale ranging from 1, which corresponds to ‘strongly disagree’, to 5, which corresponds to ‘completely agree’. The survey was administered online, which guaranteed complete anonymity; no personal data were collected.
The questionnaire was reviewed prior to distribution by experts specializing in educational technology and quantitative research methods to ensure content validity and appropriateness of the items in the context of higher education. Reliability of the constructs was tested using Cronbach’s alpha, with all constructs exceeding the recommended value of 0.8. The items were adopted from previously validated studies conducted in university settings; although these studies originate from various countries, they share a comparable research context in terms of population (university students) and focus (use of digital tools in learning). Participation in the survey was entirely voluntary. An introductory note informed students that the data would be used for research purposes in an anonymized form.
The questionnaire focused on students’ self-reported use of digital tools across their study experiences. It did not evaluate tool use in a specific course or setting but rather examined general learning behaviors. Examples of tools referenced include AI-based educational tools (such as ChatGPT or other AI writing assistants), collaborative platforms (e.g., Microsoft Teams and Google Docs), and digital whiteboards or E-Boards used during lectures or study sessions. E-Board use was included as a separate construct because digital whiteboards are widely and consistently used across lectures in the participating institution, particularly in subjects involving quantitative and analytical content. Their role is pedagogically significant as they support structured content delivery, visualization, and active problem-solving during teaching sessions. Students also reported on their use of apps or functions for managing screen time and minimizing distractions. As this study is based on perceived usage rather than direct observation or experimental implementation, it reflects students’ organic engagement with digital tools in their academic routines.
Students were asked to indicate their average academic grade based on their most recent semester. This grade typically reflects a weighted average of exam performance, written assignments, and continuous assessment, depending on the course structure.
Items for the construct excessive use of technology were adopted from Abid et al. (2022) and Navarro-Martínez and Peña-Acuña (2022), items for the construct using an E-Board for learning were adopted from Mishra et al. (2019), items for the construct online collaboration were adopted from Venter (2024) and Forman and Miller (2023), and items for the construct use of artificial intelligence in education were adopted from Chen et al. (2020) and Wang et al. (2024).

3.3. Data Analysis

The frame procedure for multi-criteria decision-making (Čančer & Mulej, 2010), based on the analytic hierarchy process (AHP) (Saaty, 1986), was used to measure students’ use of digital technology to support their studies. Groups of students, formed according to their grade point averages based on demonstrated achievements, are defined as alternatives: outstanding, very high quality, moderate quality, and acceptable quality with defects. Based on the above-mentioned survey, factor analysis (Costello & Osborne, 2005) was used to form the criteria structure.
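As an illustration of how AHP derives criteria weights from pairwise comparisons, the following minimal sketch uses the row geometric mean approximation of Saaty’s principal eigenvector method. The comparison matrix below is hypothetical and not taken from the teachers’ actual judgments in this study.

```python
import numpy as np

def ahp_weights(pairwise: np.ndarray) -> np.ndarray:
    """Derive normalized criteria weights from a pairwise comparison matrix
    using row geometric means, a standard approximation of the principal
    eigenvector method (Saaty, 1986)."""
    gm = np.prod(pairwise, axis=1) ** (1.0 / pairwise.shape[0])
    return gm / gm.sum()

# Hypothetical judgments of one evaluator for the four first-level criteria:
# use of AI in education, using an E-Board, excessive use of technology,
# and online collaboration (Saaty's 1-9 scale; values are illustrative only).
A = np.array([
    [1.0, 2.0, 3.0, 4.0],
    [1/2, 1.0, 2.0, 3.0],
    [1/3, 1/2, 1.0, 2.0],
    [1/4, 1/3, 1/2, 1.0],
])
w = ahp_weights(A)
print(w.round(3))  # normalized weights summing to 1
```

Each entry A[i][j] expresses how many times more important criterion i is than criterion j; reciprocal symmetry (A[j][i] = 1/A[i][j]) is assumed.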
In this paper, the expression ‘achievement in the data’ refers to the students’ self-reported academic performance, measured by their average study grades. Based on these grades, students were categorized into four achievement groups: (1) students with acceptable-quality achievements with defects, (2) students with moderate-quality achievements, (3) students with very-high-quality achievements, and (4) students with outstanding achievements. These categories served as alternatives in the multi-criteria evaluation process, enabling us to compare digital technology use across different levels of academic performance.
The means of agreement with the statements representing the items in the factor analysis, computed from the descriptive statistical analysis for the student groups formed according to academic performance, served as inputs for measuring the alternatives’ values with value functions. Pairwise comparisons (Saaty, 1986; Saaty & Sodenkamp, 2010) were used to collect the individual judgments of teachers dealing with the use of digital technology by students at the 1st and 2nd levels of study. Individual weights were aggregated by geometric means and normalized to obtain group weights expressing the importance of each criterion. The aggregate values of alternatives obtained with the distributive mode of synthesis (Saaty & Sodenkamp, 2010; Saaty, 2008) indicate the level of the use of digital technology in the study for a particular alternative. Based on the obtained aggregated values according to different levels of criteria and according to all criteria, the alternatives were ranked. Gradient and dynamic sensitivity analyses were performed to determine the stability of the results obtained, and performance sensitivity analysis was performed to identify the key strengths and weaknesses of each student group when using the technology in the study process.
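The aggregation and synthesis steps described above can be sketched as follows. The individual weights and local values below are hypothetical placeholders, not data from the study; the sketch shows only the mechanics of geometric mean aggregation, normalization, and the distributive mode of synthesis.

```python
import numpy as np

# Rows: individual evaluators (teachers); columns: first-level criteria.
# These individual weight vectors are hypothetical placeholders.
individual = np.array([
    [0.40, 0.30, 0.20, 0.10],
    [0.35, 0.25, 0.25, 0.15],
    [0.45, 0.20, 0.20, 0.15],
])

# Aggregate across evaluators by geometric means, then normalize
# so the group weights sum to 1.
gm = np.prod(individual, axis=0) ** (1.0 / individual.shape[0])
group_w = gm / gm.sum()

# Rows: alternatives (student groups); columns: local values per criterion
# obtained from value functions (illustrative numbers).
local_values = np.array([
    [0.9, 0.6, 0.7, 0.5],
    [0.7, 0.8, 0.6, 0.6],
])

# Distributive mode of synthesis: weighted sum of local values per alternative.
aggregate = local_values @ group_w
ranking = np.argsort(-aggregate)  # index of the best-ranked alternative first
```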

4. Results

The Kaiser–Meyer–Olkin (KMO) values of sampling adequacy (KMO > 0.5) and the results of Bartlett’s test of sphericity (p < 0.001) indicated that the use of factor analysis for each construct was justified. Due to the low proportions of the total variance of certain variables explained by the selected factors (communality < 0.4), we excluded the items ‘I find AI-based tools like language translators or grammar checkers beneficial in my studies’, ‘I believe AI can help me manage my study schedule effectively by providing reminders and updates’, and ‘I trust AI’s ability to provide accurate and unbiased information’ from the construct ‘use of AI in education’. Similarly, the items ‘I find online collaboration tools easy to use’, ‘I would prefer in-person collaborations over online collaborations’, and ‘I feel comfortable expressing disagreement or offering constructive criticism in an online collaborative setting’ were excluded from the construct ‘online collaboration’. Furthermore, the items ‘I am more likely to skim through the material on an E-Board rather than reading it thoroughly’, ‘navigating through digital content on an E-Board is easier than flipping through pages of a book’, ‘I experience more eye strain when using E-Boards, which affects my study sessions’, and ‘I feel that lack of physically highlighting or annotating affects my ability to memorize material on E-Boards’ were excluded from the construct ‘using an E-Board for learning’. Finally, the item ‘I often use technology (like study apps or online resources) positively to assist in my studying’ was excluded from the construct ‘excessive use of technology’.
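The communality-based exclusion rule described above can be illustrated with a minimal sketch. It uses a principal-component extraction from simulated Likert-type data as a simple stand-in for the exploratory factor analysis actually performed in the study; all data, item counts, and the two-factor choice here are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated 5-point Likert responses: 200 respondents x 6 items (synthetic).
X = rng.integers(1, 6, size=(200, 6)).astype(float)
R = np.corrcoef(X, rowvar=False)      # item correlation matrix

# Principal-component extraction as a simple proxy for EFA:
eigvals, eigvecs = np.linalg.eigh(R)  # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]     # reorder: largest eigenvalue first
k = 2                                 # number of retained factors
loadings = eigvecs[:, order[:k]] * np.sqrt(eigvals[order[:k]])

# Communality: share of each item's variance explained by retained factors.
communalities = (loadings ** 2).sum(axis=1)

# Items below the threshold used in the study would be excluded.
keep = communalities >= 0.4
```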
The resulting factor structures, based on the factor loading values presented in Table 1, Table 2, Table 3 and Table 4, indicate the hierarchy of criteria (Figure 1, Table 5): in the multi-criteria model, constructs represent first-level criteria, factors serve as second-level criteria, and the items from the questionnaire, renamed as criteria, form the third level of the criteria hierarchy. The only exception is the second-level criterion ‘AI and human interaction concerns’, which originates from the item ‘I am concerned that AI use in education could reduce the need for human interaction’ (Table 1). To avoid unnecessary expansion of the model, this item is incorporated directly as a second-level criterion. Structuring the hierarchy of criteria using factor analysis is also conceptually meaningful. The factor loadings presented in Table 1, Table 2, Table 3 and Table 4 represent the strength of the relationship between each questionnaire item and the underlying latent constructs (factors) identified through exploratory factor analysis. A factor can be thought of as a hidden dimension that explains the common variance among several related items. Loadings range between −1 and 1, with values closer to ±1 indicating a stronger association between the item and the factor. In these tables, two factors were extracted for each digital tool usage dimension, labeled Factor 1 and Factor 2. These factors group together items that reflect similar patterns of responses. For example, in the dimension of AI usage in education, Factor 1 may capture items related to the benefits of AI in education, while Factor 2 may reflect perceived human interaction concerns due to AI. The loadings help interpret how each item contributes to the overall structure of that dimension.
The scheme of the criteria hierarchy for measuring students’ use of digital technology in their studies is shown in Figure 1, with descriptions of the meaning of symbols in Table 5.
Table 5 shows the group weights of all criteria included in the model, relative to the higher-level, i.e., the upper criterion. The group weights shown in Table 5 indicate how important each criterion (i.e., the dimension of digital technology use) is in evaluating the overall performance of student groups. In the context of multi-criteria decision-making, each criterion contributes differently to the final outcome, depending on its weight. A higher group weight means that the corresponding digital tool usage dimension had a greater influence in distinguishing between student groups. These weights were calculated using the aggregation by geometric mean from individual weights of teachers dealing with the use of digital technology by students (Table A2).
The obtained group weights in Table 5 show that, concerning the students’ use of digital technology supportive of their studies, the most important first-level criterion is ‘use of AI in education’, followed by ‘using an E-Board for learning’, ‘excessive use of technology’, and ‘online collaboration’. Regarding the second-level criteria, ‘benefits of AI in education’ are more important than ‘AI and human interaction concerns’ in the use of AI in education. Similarly, ‘positive engagement in online collaboration’ outweighs ‘collaboration challenges’. Concerning using an E-Board for learning, however, ‘E-Board learning challenges’ are more important than ‘traditional learning advantages’, and concerning excessive use of technology, ‘technology-induced study distractions’ outweigh ‘technology-use regulation effects’. On the third level, the most important benefit of AI in education is its capacity to make learning more engaging and interactive, while its capacity to personalize learning to suit students’ needs and pace was judged to be the least important. Developing skills that are relevant for students’ future careers through online collaboration is the most important, and enrichment of students’ learning by the diverse perspectives shared during online collaborations is the least important for positive engagement in online collaboration. Managing students’ time effectively during online collaborative tasks is a more important collaboration challenge than establishing personal connections with peers in an online environment. The most important E-Board learning challenge is the ease with which switching between different materials or resources on an E-Board can disrupt students’ focused study time, while the effect of the lack of tactile experience in E-Boards, as compared to physical books, on students’ ability to learn was judged the least important E-Board learning challenge. 
Better memorization of the material when studying from physical books, the weaker promotion of in-depth reading and understanding by E-Boards compared to physical books, and the greater contribution of traditional book-based learning to retaining and recalling information, despite the benefits of technology, are judged to be the most important traditional learning advantages. Online content (such as videos, articles, or social media posts) diverting students’ attention from studying is the most important technology-induced study distraction, and frequent use of smartphones or computers for non-study purposes during study time is the least important. The regular use of apps or features to limit students’ screen time during study hours is judged to be the most important technology-use regulation effect (Table 5).
Alternatives’ values for each lowest-level criterion, obtained with the value functions, are presented in Table 6. For each lowest-level criterion, the lower bound equals the minimum mean and the upper bound equals the maximum mean (Table A1 in Appendix A), ensuring greater differentiation between alternatives. To measure how strongly the use of digital technology supports studying, we applied increasing linear value functions to all lowest-level criteria of ‘use of AI in education’, and also to those of ‘positive engagement in online collaboration’, a second-level criterion of ‘online collaboration’; by contrast, decreasing linear value functions were used for both lowest-level criteria of the other second-level criterion, ‘collaboration challenges’. Regarding ‘using an E-Board for learning’, an increasing linear function was used for ‘greater difficulty in remembering information learned from physical books compared to an E-Board’, while decreasing linear value functions were applied to all other lowest-level criteria. Decreasing functions also prevailed among the lowest-level criteria of ‘excessive use of technology’, except for ‘regular use of apps or features to limit students’ screen time during study hours’. Although the outstanding-quality (OQ) group includes only five students, it was retained as a separate cluster due to its distinct digital tool usage profile and maximum academic achievement level (grades 9.5–10). The statistically significant differences observed in comparison with the other groups, especially the very-high-quality (VHQ) group, suggest that OQ students may engage with digital learning tools in a qualitatively different and more strategic way (Table 6 and Table 7).
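The linear value functions just described can be sketched as follows. This is an illustrative Python fragment under the stated convention (bounds equal to the minimum and maximum group means of the item in Table A1); the example item and its four means are taken from Table A1.

```python
def linear_value(mean, lower, upper, increasing=True):
    """Map a group mean from [lower, upper] onto [0, 1] with a linear value
    function; increasing=False gives the decreasing variant used for
    challenge- and distraction-type items."""
    v = (mean - lower) / (upper - lower)
    if not increasing:
        v = 1.0 - v
    return min(max(v, 0.0), 1.0)

# Item 'I regularly use AI-based educational tools ...' (Table A1):
# group means are 2.97 (AOQ), 2.93 (MQ), 2.72 (VHQ), 3.33 (OQ),
# so lower = 2.72, upper = 3.33, and the function is increasing.
values = {group: linear_value(mean, 2.72, 3.33)
          for group, mean in {"AOQ": 2.97, "MQ": 2.93,
                              "VHQ": 2.72, "OQ": 3.33}.items()}
```

Because the bounds are the observed minimum and maximum means, the worst-performing group on each item always maps to 0 and the best to 1, which is what produces the greater differentiation between alternatives mentioned above.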
While the small sample size is noted as a limitation, the preservation of this group allows for a more nuanced understanding of digital behavior patterns among the most academically successful students.
Table 7 includes the aggregate values of the alternatives with respect to the global goal, as well as to the first- and second-level criteria. The ranks in Table 7 show the final position of each student group after aggregating its values across all evaluated dimensions of digital technology use and across the first- and second-level criteria. These ranks reflect the relative standing of each group in terms of how effectively and responsibly they reported using digital tools to support their learning. The ranking is based on the aggregated alternatives’ values derived through a multi-criteria evaluation approach, in which each digital tool dimension was weighted (as shown in Table 5) and each student group’s performance on the lowest-level criteria was assessed with local values (Table 6). A lower numerical rank (e.g., 1st) indicates a stronger overall digital usage profile in relation to academic success, while a higher rank (e.g., 4th) suggests less effective or less regulated use of digital technologies.
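The weighted aggregation and ranking logic can be sketched as follows. This is a minimal Python illustration with hypothetical local values and weights, not the figures from Tables 5–7; it only shows the mechanics of turning local values and criteria weights into the aggregate values and ranks reported in Table 7.

```python
import numpy as np

def rank_alternatives(local_values, weights):
    """Aggregate alternatives' local values (rows) with criteria weights
    (columns) by a weighted sum, then rank them: rank 1 = highest total."""
    totals = local_values @ np.asarray(weights)
    order = np.argsort(-totals)              # indices sorted by descending total
    ranks = np.empty(len(totals), dtype=int)
    ranks[order] = np.arange(1, len(totals) + 1)
    return totals, ranks

# Hypothetical local values for four student groups over four criteria
V = np.array([
    [0.6, 0.7, 0.4, 0.3],   # group A
    [0.2, 0.3, 0.3, 0.2],   # group B
    [0.5, 0.4, 0.7, 0.8],   # group C
    [0.7, 0.2, 0.6, 0.7],   # group D
])
w = [0.32, 0.18, 0.26, 0.24]  # hypothetical weights summing to 1

totals, ranks = rank_alternatives(V, w)
```

In the actual model the same weighted-sum step is applied level by level, so each group also receives aggregate values at the first and second levels of the criteria tree, as reported in Table 7.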
The results in Table 7 show that students who demonstrate very-high-quality achievements report the highest level of digital technology use to support their studies, although they do not excel on any of the first-level criteria. Considering all criteria for measuring students’ use of digital technology to support their studies, they are followed by students who demonstrate outstanding achievements, who excel in using an E-Board for learning and in responsibility regarding excessive technology use, and then by students who demonstrate acceptable-quality achievements with defects, who excel in the use of AI in education and in online collaboration. The lowest level of digital technology use to support their studies was reported by students demonstrating moderate-quality achievements. Gradient and dynamic sensitivity analyses show that the first rank is sensitive to changes in the first-level criteria weights: if the group weight of ‘use of AI in education’ decreases by 0.066, or that of ‘excessive use of technology’ increases by 0.060 (i.e., by less than 0.1), students who demonstrate outstanding achievements replace students who demonstrate very-high-quality achievements in first place. Changes of up to 0.1 in the second- and third-level criteria weights do not change the order of the first- and second-ranked alternatives.
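The weight-perturbation idea behind this sensitivity check can be sketched as follows. This is a hedged Python illustration with hypothetical aggregate values: `delta` plays the same role as the 0.066 and 0.060 shifts reported above (the numbers here are invented, so the flip point differs), and the remaining weights are rescaled proportionally so they still sum to one.

```python
import numpy as np

def perturb_weight(weights, k, delta):
    """Shift criterion k's weight by delta and rescale the remaining weights
    proportionally so that the weight vector still sums to 1."""
    w = np.array(weights, dtype=float)
    new_k = w[k] + delta
    rest = [i for i in range(len(w)) if i != k]
    w[rest] *= (1.0 - new_k) / w[rest].sum()
    w[k] = new_k
    return w

# Hypothetical aggregate values of the two leading groups on four criteria
V = np.array([
    [0.8, 0.5, 0.6, 0.4],   # currently first-ranked group (strong on criterion 0)
    [0.4, 0.6, 0.7, 0.7],   # runner-up
])
w = np.array([0.32, 0.18, 0.26, 0.24])

before = V @ w                            # leader wins under the original weights
after = V @ perturb_weight(w, 0, -0.15)   # shrinking criterion 0 flips the order
```

Repeating this for a grid of `delta` values per criterion yields exactly the kind of rank-reversal thresholds (here, the reported 0.066 and 0.060) that gradient and dynamic sensitivity analyses produce.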

5. Discussion

The results obtained allow us to answer the first research question. The highest level of digital technology use in studying (0.279) is achieved by students with very-high-quality academic performance, followed by those with outstanding-quality academic performance (0.263), those who demonstrate acceptable-quality achievements with defects (0.241), and students with moderate-quality achievements (0.218). Together with the assigned criteria weights, the obtained levels and ranks are further explained by answering the second research question. We would like to point out that students who demonstrate very-high-quality achievements, ranked first with respect to the global goal, do not achieve the highest aggregate value on any of the first-level criteria, but they demonstrate the highest level of responsible technology use in their studies according to the following second-level criteria: ‘AI and human interaction concerns’, ‘collaboration challenges’, ‘E-Board learning challenges’, and ‘technology-use regulation effects’; these students achieve the lowest aggregate level on the second-level criterion ‘traditional learning advantages’. Students who demonstrate outstanding achievements, ranked second with respect to the global goal, achieved the highest aggregate value on two first-level criteria, i.e., ‘using an E-Board for learning’ and ‘excessive use of technology’, and on the second-level criteria ‘traditional learning advantages’ and ‘technology-induced study distractions’. On the other hand, these students are ranked last on the first-level criteria ‘use of AI in education’ and ‘online collaboration’ and on all of their second-level criteria, as well as on the second-level criterion ‘technology-use regulation effects’.
By contrast, students who demonstrate acceptable-quality achievements with defects, ranked third with respect to the global goal, achieved the highest values on the first-level criteria ‘use of AI in education’ and ‘online collaboration’ and on the second-level criterion ‘positive engagement in online collaboration’, but they are ranked last on the first-level criterion ‘excessive use of technology’, its second-level criterion ‘technology-induced study distractions’, and also on the second-level criterion ‘E-Board learning challenges’. The group ranked last with respect to the global goal, i.e., students who demonstrate moderate-quality achievements, has the highest aggregate value on only one second-level criterion, ‘benefits of AI in education’, and the lowest aggregate value on the first-level criterion ‘using an E-Board for learning’. The findings of this study align with prior research emphasizing that digital technology integration enhances students’ academic performance, particularly when used within structured learning environments. Our results indicate that students who utilize digital tools in a strategic and goal-oriented manner achieve better academic outcomes, which is consistent with the conclusions of Al-Abdullatif and Gameil (2021). Their research found that the perceived ease of use and perceived usefulness of digital technology positively influence students’ attitudes toward learning and their engagement, ultimately leading to better academic performance. The findings of this study highlight the diverse ways students engage with digital technology in their academic environment and how these interactions influence their learning outcomes. The data reveal that students with lower academic performance tend to rely more on AI tools and online collaboration platforms, while those with higher academic achievements demonstrate a more selective and responsible approach to digital technology use.
These results align with previous research suggesting that unstructured and excessive technology use may negatively impact academic performance, whereas strategic and goal-oriented usage can enhance learning outcomes (Navarro-Martínez & Peña-Acuña, 2022). Furthermore, our study confirms that the manner in which students engage with digital tools significantly affects their learning experience. While increased use of digital platforms does not necessarily translate to better academic performance, structured and goal-oriented use—such as critical engagement with e-learning tools and responsible technology consumption—seems to foster higher achievement levels. This observation is in line with the systematic review findings, which emphasize that the effectiveness of ICT in education depends on the quality of its integration into pedagogical strategies rather than its mere availability (Valverde-Berrocoso et al., 2022). Romi (2024) found that a combination of common digital communication skills, educational problem-solving abilities, and advanced digital competencies is necessary to maximize students’ academic success. While general digital literacy, such as online communication and transaction skills, enhances academic efficiency and satisfaction, more advanced digital skills, including database management, application development, and data processing, show a stronger correlation with academic effectiveness. This suggests that merely providing students with digital tools is insufficient. Institutions must also foster strategic digital engagement that encourages problem-solving, information literacy, and critical evaluation of digital content.
A recent systematic review by Sun and Zhou (2024) highlights the dual role of AI tools in education, emphasizing their potential to enhance student engagement and provide personalized support when used in guided and structured settings. However, they also caution that without adequate digital literacy and pedagogical framing, students may misuse such tools as shortcut generators, leading to superficial learning and a reduction in metacognitive awareness. This dual perspective aligns with our findings, where students with lower academic performance reported more frequent use of AI tools, potentially indicating a compensatory, yet ultimately counterproductive, pattern of engagement. Recent research into digital technology use among university students has highlighted the importance of behavioral and motivational factors in shaping academic outcomes. A study by Saleem et al. (2024) showed that frequent use of digital tools does not automatically lead to improved academic performance. What matters more is how students engage with these tools. If digital resources are used with clear educational goals in mind, they can support learning. However, when they are used primarily as a means of distraction, the outcomes are often negative. The study also pointed out that multitasking and unregulated technology use tend to reduce concentration and mental stamina, which in turn impairs learning effectiveness. Moreover, students with lower levels of intrinsic motivation were more inclined to use technology as a way to avoid academic challenges, which led to more superficial learning. These findings underline that digital literacy involves more than just technical skills. It also requires critical thinking, the ability to self-regulate, and a deliberate approach to using digital tools. Our study builds on these insights by applying a structured multi-criteria framework to explore how various forms of technology use relate to different levels of academic success. 
Findings from a longitudinal study by Al-Abdullatif and Gameil (2021) further support our results by confirming that academic benefits from digital tool use depend on consistency, structure, and clear learning intentions. Their research showed that students who engaged with learning technologies regularly and strategically performed significantly better than their peers who used digital tools irregularly or without a defined learning goal. These patterns align closely with our analysis, which revealed that students with the highest academic achievement made purposeful use of AI tools and digital platforms such as E-Boards. Conversely, students with lower academic performance tended to use digital tools in an unstructured and compensatory way, reflecting a lack of intentionality similar to that described in the longitudinal profiles identified in their study. Additional insights are offered by a recent study on university students’ perceptions of AI-powered learning (Rosaura et al., 2024), which revealed that while most students believe AI tools assist their learning, many use them passively, often without critically assessing the output or understanding how the tools function. This superficial engagement aligns with our findings among students with lower academic performance who use AI tools frequently but without demonstrable academic benefit. Moreover, the study reported that students often rely on AI to save time or generate quick summaries, rather than as a complement to deeper learning processes. This reflects a broader pattern of compensatory behavior, as also observed in our data. These results further confirm the need to distinguish between the frequency of AI tool usage and meaningful, critical engagement with digital technology in an educational context.
The findings from this study provide valuable insights into both the advantages and limitations of digital education, shedding light on key factors that contribute to students’ academic success in a technology-driven learning environment. Understanding students’ experiences and perspectives is essential for identifying best practices in digital education and for addressing areas that require improvement. The paper fills a gap in the comparative analysis of technology usage levels, considering both individual dimensions and multiple dimensions jointly, particularly in relation to the demonstrated quality of students’ academic performance. Furthermore, the study’s findings can serve as a valuable resource for educators, policymakers, and institutional leaders, helping them refine curriculum design, implement supportive digital policies, and invest in technological innovations that align with students’ educational needs and learning preferences. As higher education continues to evolve in response to technological advancements, it is imperative to strike a balance between harnessing the benefits of digital learning and addressing its inherent challenges. By offering a comprehensive analysis of students’ engagement with digital tools, this study aims to inform the development of more effective strategies for integrating technology into academic settings. Ultimately, the goal is to ensure that digital advancements act as enablers of student success, fostering deeper engagement, personalized learning experiences, and improved academic performance, rather than creating barriers to meaningful education.
These results have several practical implications for higher education institutions. For example, students who achieved the highest academic performance showed more regulated use of digital tools, especially in managing distractions and selecting learning technologies strategically. This suggests that fostering digital self-regulation and critical digital literacy may be more effective than merely increasing access to technology. Educators could integrate digital awareness modules into their courses, helping students to reflect on their technology habits and align tool use with academic goals. Moreover, the strong reliance on AI and online collaboration tools among students with lower achievement levels may indicate a need for structured guidance on how to use these technologies meaningfully, beyond convenience or surface engagement. This aligns with prior research emphasizing that the quality of technology use is more critical than the quantity.
While this study provides valuable insights into students’ use of digital technology in higher education, it is important to acknowledge certain limitations that may influence the interpretation and generalizability of the findings. These limitations primarily concern the scope of the sample, the methodology, and the contextual factors that may shape students’ digital learning behaviors. Addressing these constraints in future research will help refine our understanding of how technology supports academic performance and identify opportunities for optimizing digital learning environments. The research was conducted exclusively among students at the Faculty of Economics and Business, University of Maribor. Future studies should consider expanding the sample to include students from diverse fields, such as natural sciences, engineering, and social sciences, to provide a more comprehensive understanding of digital technology use in higher education. This research represents a cross-sectional analysis, capturing students’ use of digital tools at a specific moment in time. Consequently, it does not provide insights into longitudinal trends in technology adoption throughout different stages of academic life. A longitudinal study would allow researchers to track changes in digital learning habits across semesters or years, helping to understand how students adapt their use of technology over time. Also, it should be noted that the research design does not allow for the establishment of a causal relationship between the use of digital tools and academic performance. Rather, the study identifies associations and patterns among student groups with different achievement levels. Future studies using longitudinal or experimental designs would be needed to explore causality more directly. One important limitation of this study is the absence of a control group. 
The research was designed to compare different levels of digital tool usage among student groups with varying academic achievement levels, rather than to test the effects of a specific intervention. Future research could benefit from experimental or longitudinal designs incorporating control groups to better assess the impact of targeted digital learning strategies. Moreover, the study does not account for external factors that may influence students’ digital learning behavior, such as learning styles, prior digital literacy, socioeconomic background, or the role of instructors in guiding technology use. Future studies could incorporate these variables to explore their impact on students’ engagement with digital tools and academic performance.

6. Conclusions

This study offers valuable insights into how university students use digital technologies to support their academic work. By applying a multi-criteria evaluation framework, we were able to differentiate between student groups based on their academic performance and analyze how each group engages with various aspects of digital learning—ranging from artificial intelligence and online collaboration to E-Board usage and the regulation of technology overuse.
The findings suggest that students who demonstrate very-high-quality academic performance tend to use digital tools in a more structured and responsible manner, while students with lower performance levels show stronger reliance on specific technologies, such as AI and online collaboration. These insights contribute to a more nuanced understanding of digital technology use in higher education and highlight the importance of encouraging critical and goal-oriented engagement with digital tools.
The study also provides a foundation for educational institutions to reflect on how digital technologies are integrated into teaching practices and how students are guided in their use. Overall, the study reinforces the importance of aligning digital learning strategies with students’ needs and competencies, ensuring that technology serves as an enabler of academic success rather than a source of distraction or disengagement.

Author Contributions

Conceptualization, V.Č., M.R. and P.T.; methodology, V.Č., M.R. and P.T.; validation, V.Č., M.R. and P.T.; formal analysis, V.Č. and M.R.; investigation, V.Č., M.R. and P.T.; resources, V.Č., M.R. and P.T.; data curation, V.Č., M.R. and P.T.; writing—original draft preparation, V.Č., M.R. and P.T.; writing—review and editing, V.Č., M.R. and P.T.; visualization, V.Č., M.R. and P.T. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Slovenian Research Agency (research core funding No. P5-0023, ‘Entrepreneurship for Innovative Society’).

Institutional Review Board Statement

Ethical review and approval were waived for this study, as the research posed no risk to participants.

Informed Consent Statement

Informed consent for participation was obtained from all subjects involved in the study.

Data Availability Statement

Data are contained within the article.

Acknowledgments

This work was supported by the Slovenian Research Agency (Entrepreneurship for Innovative Society) under Project P5-0023.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

Table A1. Means of agreement with the items describing the use of technology in the study.
Item | AOQ | MQ | VHQ | OQ
I regularly use AI-based educational tools to assist in my learning (for example, ChatGPT, etc.). | 2.97 | 2.93 | 2.72 | 3.33
AI-driven tutoring systems have been effective in supplementing my learning. | 3.12 | 3.05 | 3.02 | 3.17
I believe AI technologies can personalize learning to suit my needs and pace. | 3.35 | 3.31 | 3.20 | 3.33
I often rely on AI-based tools for grading or feedback on my assignments. | 3.01 | 2.89 | 2.59 | 2.67
AI tools that provide instant feedback help me to learn and rectify mistakes quickly. | 3.13 | 3.21 | 2.98 | 2.83
I feel comfortable interacting with AI-based educational systems. | 3.13 | 3.13 | 3.00 | 2.83
I believe that AI in education can make learning more engaging and interactive. | 3.12 | 3.29 | 3.32 | 3.00
AI has the potential to make complex topics easier to understand through visualizations and simulations. | 3.63 | 3.59 | 3.52 | 3.67
The use of AI in education will be beneficial for my future career considering the increasing digitization in all sectors. | 3.34 | 3.53 | 3.57 | 3.17
I am concerned that AI use in education could reduce the need for human interaction. | 3.63 | 3.41 | 3.70 | 3.17
I feel comfortable collaborating with my peers online. | 3.74 | 3.53 | 3.74 | 3.50
I believe that online collaboration enhances my learning experience. | 3.66 | 3.65 | 3.56 | 3.17
I find it easy to communicate and share ideas with peers in an online environment. | 3.59 | 3.78 | 3.80 | 3.83
My learning is enriched by the diverse perspectives shared during online collaborations. | 3.60 | 3.53 | 3.45 | 3.67
I feel my contributions are valued in online collaborative tasks. | 3.46 | 3.44 | 3.43 | 3.67
The feedback I receive from my peers during online collaboration improves my understanding of the subject. | 3.46 | 3.58 | 3.20 | 2.67
Online collaboration motivates me to stay engaged with the course material. | 3.49 | 3.24 | 3.11 | 2.83
Online collaborations help me develop skills that are relevant for my future career. | 3.61 | 3.53 | 3.52 | 2.67
I find it difficult to manage my time effectively during online collaborative tasks. | 3.03 | 3.06 | 2.89 | 3.17
I find it challenging to establish a personal connection with my peers in an online environment. | 2.90 | 3.05 | 2.59 | 2.83
I often get distracted when studying from an E-Board, impacting my attention to the material. | 2.87 | 2.80 | 2.64 | 3.17
I struggle to focus on the material when using an E-Board for a prolonged period. | 3.12 | 3.16 | 3.08 | 3.33
The lack of tactile experience in E-Boards as compared to physical books affects my ability to learn. | 3.04 | 3.09 | 2.89 | 2.50
I feel the interactive elements in E-Boards can sometimes distract from the main learning material. | 3.24 | 3.39 | 3.13 | 3.00
The convenience of switching between different materials or resources on an E-Board can disrupt my focused study time. | 3.28 | 3.18 | 3.08 | 3.17
I find it more difficult to remember information learned from physical books compared to an E-Board. | 2.37 | 2.14 | 1.78 | 1.67
I feel that studying from physical books results in better memorization of the material. | 3.65 | 3.88 | 3.98 | 3.17
My comprehension of the subject matter is better when studying from physical books than E-Boards. | 3.43 | 3.71 | 3.67 | 3.17
I feel that E-Boards do not promote as deep a level of reading and understanding as physical books do. | 3.38 | 3.42 | 3.80 | 3.00
Despite the benefits of technology, I believe traditional book-based learning is more conducive to retaining and recalling information. | 3.47 | 3.72 | 3.95 | 3.00
I often find myself checking my smartphone during study sessions. | 3.84 | 3.59 | 3.66 | 3.00
Social media notifications frequently distract me when I am studying. | 3.69 | 3.35 | 3.12 | 2.83
I believe my academic performance could improve if I reduced my screen time. | 4.02 | 3.76 | 3.46 | 3.00
I frequently use my smartphone or computer for non-study purposes during study time. | 3.64 | 3.45 | 3.11 | 2.50
My use of technology often extends beyond the time I had allocated for it, cutting into my study time. | 3.50 | 3.53 | 3.10 | 2.67
I sometimes lose track of time when using social media, leading to shorter or rushed study sessions. | 3.66 | 3.44 | 3.18 | 3.00
Online content (like videos, articles, or social media posts) often diverts my attention from studying. | 3.56 | 3.48 | 3.38 | 2.83
I find it challenging to concentrate on studying after using my smartphone or other devices. | 3.59 | 3.35 | 3.10 | 2.83
The use of digital devices for studying often leads me to multitask, reducing my study effectiveness. | 3.35 | 3.35 | 3.39 | 3.00
I regularly use apps or features to limit my screen time during study hours. | 2.88 | 2.50 | 2.46 | 1.83
My sleep pattern is disturbed due to excessive use of technology, affecting my study schedule. | 2.76 | 2.48 | 2.21 | 2.50
I feel anxious if I am unable to check my phone or social media for a period of time, including study time. | 2.51 | 2.36 | 2.18 | 1.83
AI—artificial intelligence, AOQ—students that demonstrate acceptable-quality achievements with defects, MQ—students that demonstrate moderate-quality achievements, VHQ—students that demonstrate very-high-quality achievements, OQ—students that demonstrate outstanding achievements.
Table A2. Individual criteria weights.
Global Goal. Measuring Students’ Use of Technology to Support Their Studies
Level | Criterion | Y1 | Y2 | X
1 | AI. Use of AI in education | 0.146766 | 0.57245 | 0.154061
2 | AI1. Benefits of AI in education | 0.125 | 0.833333 | 0.833333
3 | AI1.1. Regular use of AI-based educational tools to assist students’ learning | 0.032336 | 0.321854 | 0.093006
3 | AI1.2. Effectiveness of AI-driven tutoring systems in supplementing students’ learning | 0.095598 | 0.109499 | 0.021313
3 | AI1.3. The capacity of AI technologies to personalize learning to suit students’ needs and pace | 0.018456 | 0.059867 | 0.093006
3 | AI1.4. Reliance on AI-based tools for grading or feedback on students’ assignments | 0.032336 | 0.109499 | 0.093006
3 | AI1.5. Help with AI tools that provide instant feedback to learn and rectify mistakes quickly | 0.068981 | 0.109499 | 0.264897
3 | AI1.6. Feeling comfortable interacting with AI-based educational systems | 0.157649 | 0.109499 | 0.033387
3 | AI1.7. Capacity of AI in education to make learning more engaging and interactive | 0.157649 | 0.109499 | 0.264897
3 | AI1.8. The potential of AI to make complex topics easier to understand through visualizations and simulations | 0.157649 | 0.035393 | 0.043483
3 | AI1.9. Benefits of using AI in education for students’ future careers considering the increasing digitization in all sectors | 0.279347 | 0.035393 | 0.093006
2 | AI2. AI and human interaction concerns | 0.875 | 0.166667 | 0.166667
1 | OC. Online collaboration | 0.05609 | 0.10927 | 0.403121
2 | OC1. Positive engagement in online collaboration | 0.833333 | 0.833333 | 0.833333
3 | OC1.1. Feeling comfortable collaborating with peers online | 0.073695 | 0.391624 | 0.044924
3 | OC1.2. Capacity of online collaboration to enhance students’ learning experience | 0.027145 | 0.076455 | 0.131625
3 | OC1.3. Ease of communication and sharing ideas with peers in an online environment | 0.202733 | 0.076455 | 0.044924
3 | OC1.4. Enrichment of students’ learning by the diverse perspectives shared during online collaborations | 0.01947 | 0.076455 | 0.131625
3 | OC1.5. Appreciation of students’ contributions in online collaborative tasks | 0.073695 | 0.149647 | 0.131625
3 | OC1.6. Improving understanding of the subject by the feedback received from peers during online collaboration | 0.326834 | 0.076455 | 0.131625
3 | OC1.7. Staying engaged with the course material motivated by online collaboration | 0.073695 | 0.076455 | 0.131625
3 | OC1.8. Developing skills that are relevant for students’ future careers through online collaboration | 0.202733 | 0.076455 | 0.252026
2 | OC2. Collaboration challenges | 0.166667 | 0.166667 | 0.166667
3 | OC2.1. The difficulty of managing students’ time effectively during online collaborative tasks | 0.5 | 0.75 | 0.75
3 | OC2.2. The challenge of establishing personal connections with peers in an online environment | 0.5 | 0.25 | 0.25
1 | EB. Using an E-Board for learning | 0.146766 | 0.10927 | 0.403121
2 | EB1. E-Board learning challenges | 0.833333 | 0.833333 | 0.166667
3 | EB1.1. Getting distracted when studying from an E-Board, impacting students’ attention on the material | 0.132821 | 0.451441 | 0.111111
3 | EB1.2. Struggling to focus on the material when using an E-Board for a prolonged period | 0.359633 | 0.155872 | 0.111111
3 | EB1.3. Effects of the lack of tactile experience in E-Boards as compared to physical books on the students’ ability to learn | 0.03265 | 0.080943 | 0.333333
3 | EB1.4. The capacity of the interactive elements in E-Boards to sometimes distract from the main learning material | 0.331137 | 0.155872 | 0.111111
3 | EB1.5. The challenge of the convenience of switching between different materials or resources on an E-Board to disrupt students’ focused study time | 0.14376 | 0.155872 | 0.333333
2 | EB2. Traditional learning advantages | 0.166667 | 0.166667 | 0.833333
3 | EB2.1. Greater difficulty in remembering information learned from physical books compared to an E-Board | 0.090909 | 0.555556 | 0.047619
3 | EB2.2. Better memorization of the material because of studying from physical books | 0.272727 | 0.111111 | 0.238095
3 | EB2.3. Better comprehension of the subject matter when studying from physical books than E-Boards | 0.090909 | 0.111111 | 0.238095
3 | EB2.4. Less promotion of in-depth reading and understanding by E-Boards compared to physical books | 0.272727 | 0.111111 | 0.238095
3 | EB2.5. Greater contribution of traditional book-based learning to retain and recall information, despite the benefits of technology | 0.272727 | 0.111111 | 0.238095
1 | T. Excessive use of technology | 0.650379 | 0.209009 | 0.039698
2 | T1. Technology-induced study distractions | 0.833333 | 0.833333 | 0.25
3 | T1.1. Checking smartphones during study sessions | 0.04718 | 0.050054 | 0.039564
3 | T1.2. Frequent distractions caused by social media notifications when studying | 0.04718 | 0.018913 | 0.116207
3 | T1.3. The capacity to reduce students’ screen time to improve their academic performance | 0.136079 | 0.145149 | 0.223628
3 | T1.4. Frequent use of smartphones or computers for non-study purposes during study time | 0.04718 | 0.018913 | 0.039564
3 | T1.5. Often exceeding the time that students allocate for the use of technology, thereby cutting into their study time | 0.136079 | 0.018913 | 0.116207
3 | T1.6. Occasional loss of track of time when using social media, leading to shorter or rushed study sessions | 0.245971 | 0.145149 | 0.116207
3 | T1.7. Diverting students’ attention from studying by online content (like videos, articles, or social media posts) | 0.245971 | 0.22888 | 0.116207
3 | T1.8. Challenge to concentrate on studying after using smartphones or other devices | 0.04718 | 0.145149 | 0.116207
3 | T1.9. Multitasking, reducing study effectiveness, caused by using digital devices for studying | 0.04718 | 0.22888 | 0.116207
2 | T2. Technology-use regulation effects | 0.166667 | 0.166667 | 0.75
3 | T2.1. Regular use of apps or features to limit students’ screen time during study hours | 0.178178 | 0.648329 | 0.777778
3 | T2.2. Disturbed sleep pattern due to excessive use of technology, affecting students’ study schedule | 0.751405 | 0.229651 | 0.111111
3 | T2.3. Feeling anxious if being unable to check phones or social media for a while, including study time | 0.070418 | 0.12202 | 0.111111
AI—artificial intelligence, Y1, Y2, and X—evaluators of criteria’s importance.

Figure 1. Scheme of criteria hierarchy.
Table 1. Factor analysis results for ‘use of AI in education’.
Item | Communality | Factor 1 Loading | Factor 2 Loading | Cronbach’s Alpha
I regularly use AI-based educational tools to assist in my learning (e.g., ChatGPT). | 0.744 | 0.862 | 0.015 | 0.902
AI-driven tutoring systems have been effective in supplementing my learning. | 0.766 | 0.875 | −0.006
I believe AI technologies can personalize learning to suit my needs and pace. | 0.761 | 0.869 | 0.075
I often rely on AI-based tools for grading or feedback on my assignments. | 0.732 | 0.852 | −0.077
AI tools that provide instant feedback help me to learn and rectify mistakes quickly. | 0.765 | 0.873 | −0.058
I feel comfortable interacting with AI-based educational systems. | 0.714 | 0.840 | −0.094
I believe that AI in education can make learning more engaging and interactive. | 0.692 | 0.827 | 0.092
I am concerned that AI use in education could reduce the need for human interaction. | 0.795 | −0.137 | 0.881
AI has the potential to make complex topics easier to understand through visualizations and simulations. | 0.599 | 0.591 | 0.470
The use of AI in education will be beneficial for my future career considering the increasing digitization in all sectors. | 0.582 | 0.719 | 0.256
Kaiser–Meyer–Olkin Measure of Sampling Adequacy: 0.932. Bartlett’s Test of Sphericity: Approx. Chi-Square: 1946.881; df: 45; Sig. < 0.001. Total Variance Explained. Factor 1: 60.458%, Factor 2: 11.046%, Cumulative: 71.504%.
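The reported Cronbach’s alpha (0.902 for this scale) can be verified from raw item responses with a few lines of code. A minimal sketch of the standard formula, using made-up Likert responses rather than the study’s dataset:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) response matrix."""
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses (4 respondents x 3 items),
# NOT the survey data from this study.
responses = np.array([
    [5, 4, 5],
    [4, 4, 4],
    [2, 3, 2],
    [1, 2, 1],
])
print(round(cronbach_alpha(responses), 3))  # → 0.956
```

Values near or above 0.9, as in the tables here, are conventionally read as high internal consistency of the scale.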
Table 2. Factor analysis results for ‘online collaboration’.
Item | Communality | Factor 1 Loading | Factor 2 Loading | Cronbach’s Alpha
I feel comfortable collaborating with my peers online. | 0.563 | 0.751 | 0.012 | 0.835
I believe that online collaboration enhances my learning experience. | 0.647 | 0.803 | 0.044
I find it easy to communicate and share ideas with peers in an online environment. | 0.663 | 0.806 | −0.113
My learning is enriched by the diverse perspectives shared during online collaborations. | 0.666 | 0.813 | −0.078
I feel my contributions are valued in online collaborative tasks. | 0.511 | 0.693 | 0.175
I find it difficult to manage my time effectively during online collaborative tasks. | 0.675 | 0.062 | 0.819
I find it challenging to establish a personal connection with my peers in an online environment. | 0.689 | 0.004 | 0.830
The feedback I receive from my peers during online collaboration improves my understanding of the subject. | 0.543 | 0.727 | 0.121
Online collaboration motivates me to stay engaged with the course material. | 0.517 | 0.684 | 0.223
Online collaborations help me develop skills that are relevant for my future career. | 0.466 | 0.682 | −0.039
Kaiser–Meyer–Olkin Measure of Sampling Adequacy: 0.874. Bartlett’s Test of Sphericity: Approx. Chi-Square: 1083.609; df: 45; Sig. < 0.001. Total Variance Explained. Factor 1: 45.932%, Factor 2: 15.827%, Cumulative: 61.759%.
Table 3. Factor analysis results for ‘using an E-Board for learning’.
Item | Communality | Factor 1 Loading | Factor 2 Loading | Cronbach’s Alpha
I find it more difficult to remember information learned from physical books compared to an E-Board. | 0.638 | 0.328 | −0.728 | 0.843
I often get distracted when studying from an E-Board, impacting my attention to the material. | 0.593 | 0.770 | 0.002
I feel that studying from physical books results in better memorization of the material. | 0.709 | 0.441 | 0.717
I struggle to focus on the material when using an E-Board for a prolonged period. | 0.621 | 0.744 | 0.258
The lack of tactile experience in E-Boards as compared to physical books affects my ability to learn. | 0.495 | 0.687 | 0.154
I feel the interactive elements in E-Boards can sometimes distract from the main learning material. | 0.477 | 0.616 | 0.312
My comprehension of the subject matter is better when studying from physical books than E-Boards. | 0.720 | 0.440 | 0.726
I feel that E-Boards do not promote as deep a level of reading and understanding as physical books do. | 0.628 | 0.503 | 0.613
The convenience of switching between different materials or resources on an E-Board can disrupt my focused study time. | 0.504 | 0.640 | 0.307
Despite the benefits of technology, I believe traditional book-based learning is more conducive to retaining and recalling information. | 0.710 | 0.421 | 0.730
Kaiser–Meyer–Olkin Measure of Sampling Adequacy: 0.892. Bartlett’s Test of Sphericity: Approx. Chi-Square: 1202.372; df: 45; Sig. < 0.001. Total Variance Explained. Factor 1: 47.713%, Factor 2: 13.231%, Cumulative: 60.944%.
Table 4. Factor analysis results for ‘excessive use of technology’.
Item | Communality | Factor 1 Loading | Factor 2 Loading | Cronbach’s Alpha
I often find myself checking my smartphone during study sessions. | 0.653 | 0.803 | 0.088 | 0.867
Social media notifications frequently distract me when I am studying. | 0.671 | 0.779 | 0.253
I believe my academic performance could improve if I reduced my screen time. | 0.611 | 0.749 | 0.223
I frequently use my smartphone or computer for non-study purposes during study time. | 0.585 | 0.747 | 0.164
My use of technology often extends beyond the time I had allocated for it, cutting into my study time. | 0.677 | 0.796 | 0.208
I sometimes lose track of time when using social media, leading to shorter or rushed study sessions. | 0.631 | 0.791 | 0.070
Online content (like videos, articles, or social media posts) often diverts my attention from studying. | 0.618 | 0.785 | 0.042
I find it challenging to concentrate on studying after using my smartphone or other devices. | 0.517 | 0.635 | 0.338
I regularly use apps or features to limit my screen time during study hours. | 0.527 | −0.042 | 0.725
My sleep pattern is disturbed due to excessive use of technology, affecting my study schedule. | 0.497 | 0.234 | 0.666
I feel anxious if I am unable to check my phone or social media for a period of time, including study time. | 0.607 | 0.273 | 0.730
The use of digital devices for studying often leads me to multitask, reducing my study effectiveness. | 0.479 | 0.576 | 0.383
Kaiser–Meyer–Olkin Measure of Sampling Adequacy: 0.908. Bartlett’s Test of Sphericity: Approx. Chi-Square: 1552.294; df: 66; Sig. < 0.001. Total Variance Explained. Factor 1: 49.103%, Factor 2: 12.726%, Cumulative: 61.829%.
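Each factor table also reports Bartlett’s test of sphericity, which checks whether the item correlation matrix differs from an identity matrix (a precondition for meaningful factoring). A sketch of the standard statistic, not tied to the survey data; note that for the 12 items of this scale, df = 12 × 11 / 2 = 66, matching the value reported above:

```python
import numpy as np
from scipy import stats

def bartlett_sphericity(R: np.ndarray, n: int):
    """Bartlett's test that a (p x p) correlation matrix R is an identity,
    i.e., that items are uncorrelated and factor analysis is pointless.
    n is the number of respondents."""
    p = R.shape[0]
    chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) // 2
    p_value = stats.chi2.sf(chi2, df)
    return chi2, df, p_value

# Identity matrix: no correlation at all, so chi-square is 0 and p = 1.
# (n=300 is a hypothetical sample size, not the study's.)
chi2, df, p_value = bartlett_sphericity(np.eye(12), n=300)
print(df)  # → 66
```

In the tables above the test is highly significant (Sig. < 0.001), i.e., the correlation matrices are far from identity, so factoring was warranted.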
Table 5. Weights of criteria.
Level | Criterion | Geometric Mean | Group Weight
1 | AI. Use of AI in education | 0.235 | 0.321
2 | AI1. Benefits of AI in education | 0.443 | 0.605
3 | AI1.1. Regular use of AI-based educational tools to assist students’ learning | 0.099 | 0.122
3 | AI1.2. Effectiveness of AI-driven tutoring systems in supplementing students’ learning | 0.061 | 0.075
3 | AI1.3. The capacity of AI technologies to personalize learning to suit students’ needs and pace | 0.047 | 0.058
3 | AI1.4. Reliance on AI-based tools for grading or feedback on students’ assignments | 0.069 | 0.085
3 | AI1.5. Help from AI tools that provide instant feedback to learn and rectify mistakes quickly | 0.126 | 0.156
3 | AI1.6. Feeling comfortable interacting with AI-based educational systems | 0.083 | 0.103
3 | AI1.7. Capacity of AI in education to make learning more engaging and interactive | 0.166 | 0.205
3 | AI1.8. The potential of AI to make complex topics easier to understand through visualizations and simulations | 0.062 | 0.077
3 | AI1.9. Benefits of using AI in education for students’ future careers considering the increasing digitization in all sectors | 0.097 | 0.120
2 | AI2. AI and human interaction concerns | 0.290 | 0.395
1 | OC. Online collaboration | 0.135 | 0.185
2 | OC1. Positive engagement in online collaboration | 0.833 | 0.833
3 | OC1.1. Feeling comfortable collaborating with peers online | 0.109 | 0.131
3 | OC1.2. Capacity of online collaboration to enhance students’ learning experience | 0.065 | 0.078
3 | OC1.3. Ease of communicating and sharing ideas with peers in an online environment | 0.089 | 0.107
3 | OC1.4. Enrichment of students’ learning by the diverse perspectives shared during online collaborations | 0.058 | 0.070
3 | OC1.5. Appreciation of students’ contributions in online collaborative tasks | 0.113 | 0.136
3 | OC1.6. Improving understanding of the subject by the feedback received from peers during online collaboration | 0.149 | 0.179
3 | OC1.7. Staying engaged with the course material motivated by online collaboration | 0.091 | 0.109
3 | OC1.8. Developing skills that are relevant for students’ future careers through online collaboration | 0.157 | 0.190
2 | OC2. Collaboration challenges | 0.167 | 0.167
3 | OC2.1. The difficulty of managing students’ time effectively during online collaborative tasks | 0.655 | 0.675
3 | OC2.2. The challenge of establishing personal connections with peers in an online environment | 0.315 | 0.325
1 | EB. Using an E-Board for learning | 0.186 | 0.255
2 | EB1. E-Board learning challenges | 0.487 | 0.631
3 | EB1.1. Getting distracted when studying from an E-Board, impacting students’ attention on the material | 0.188 | 0.223
3 | EB1.2. Struggling to focus on the material when using an E-Board for a prolonged period | 0.184 | 0.218
3 | EB1.3. Effects of the lack of tactile experience in E-Boards as compared to physical books on the students’ ability to learn | 0.096 | 0.114
3 | EB1.4. The capacity of the interactive elements in E-Boards to sometimes distract from the main learning material | 0.179 | 0.212
3 | EB1.5. The potential for the convenience of switching between different materials or resources on an E-Board to disrupt students’ focused study time | 0.195 | 0.232
2 | EB2. Traditional learning advantages | 0.285 | 0.369
3 | EB2.1. Greater difficulty in remembering information learned from physical books compared to an E-Board | 0.134 | 0.158
3 | EB2.2. Better memorization of the material because of studying from physical books | 0.193 | 0.228
3 | EB2.3. Better comprehension of the subject matter when studying from physical books than E-Boards | 0.134 | 0.158
3 | EB2.4. Less promotion of in-depth reading and understanding by E-Boards compared to physical books | 0.193 | 0.228
3 | EB2.5. Greater contribution of traditional book-based learning to retaining and recalling information, despite the benefits of technology | 0.193 | 0.228
1 | T. Excessive use of technology | 0.175 | 0.240
2 | T1. Technology-induced study distractions | 0.558 | 0.670
3 | T1.1. Checking smartphones during study sessions | 0.045 | 0.050
3 | T1.2. Frequent distractions caused by social media notifications when studying | 0.047 | 0.052
3 | T1.3. The potential for reduced screen time to improve students’ academic performance | 0.164 | 0.181
3 | T1.4. Frequent use of smartphones or computers for non-study purposes during study time | 0.033 | 0.036
3 | T1.5. Often exceeding the time that students allocate for the use of technology, thereby cutting into their study time | 0.067 | 0.074
3 | T1.6. Occasional loss of track of time when using social media, leading to shorter or rushed study sessions | 0.161 | 0.178
3 | T1.7. Diverting students’ attention from studying by online content (like videos, articles, or social media posts) | 0.187 | 0.207
3 | T1.8. Difficulty concentrating on studying after using smartphones or other devices | 0.093 | 0.102
3 | T1.9. Multitasking caused by using digital devices for studying, reducing study effectiveness | 0.108 | 0.119
2 | T2. Technology-use regulation effects | 0.275 | 0.330
3 | T2.1. Regular use of apps or features to limit students’ screen time during study hours | 0.448 | 0.550
3 | T2.2. Disturbed sleep pattern due to excessive use of technology, affecting students’ study schedule | 0.268 | 0.329
3 | T2.3. Feeling anxious when unable to check phones or social media for a while, including study time | 0.098 | 0.121
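The ‘Geometric Mean’ and ‘Group Weight’ columns in Table 5 appear to follow the usual AHP group-aggregation steps: the geometric mean combines the three evaluators’ weights (Y1, Y2, and X), and the group weight renormalizes the geometric means so that sibling criteria sum to one. A sketch reproducing two rows from the published (rounded) figures, so small discrepancies are possible:

```python
import math

def geometric_mean(ws):
    """Geometric mean of individual evaluators' weights (AHP group aggregation)."""
    return math.prod(ws) ** (1 / len(ws))

def normalize(ws):
    """Rescale sibling weights so they sum to 1 (the 'Group Weight' column)."""
    s = sum(ws)
    return [w / s for w in ws]

# EB1.1: evaluators' weights Y1, Y2, X.
gm = geometric_mean([0.132821, 0.451441, 0.111111])
print(round(gm, 3))  # → 0.188, matching Table 5

# Level-1 geometric means (AI, OC, EB, T) -> group weights.
print([round(w, 2) for w in normalize([0.235, 0.135, 0.186, 0.175])])
# → [0.32, 0.18, 0.25, 0.24], close to Table 5's 0.321 / 0.185 / 0.255 / 0.240
```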
Table 6. Values of alternatives with respect to lowest-level criteria.
Third-Level Criterion | AOQ | MQ | VHQ | OQ
AI1.1. Regular use of AI-based educational tools to assist students’ learning | 0.234 | 0.196 | 0 | 0.570
AI1.2. Effectiveness of AI-driven tutoring systems in supplementing students’ learning | 0.357 | 0.107 | 0 | 0.536
AI1.3. The capacity of AI technologies to personalize learning to suit students’ needs and pace | 0.385 | 0.282 | 0 | 0.333
AI1.4. Reliance on AI-based tools for grading or feedback on students’ assignments | 0.525 | 0.375 | 0 | 0.100
AI1.5. Help from AI tools that provide instant feedback to learn and rectify mistakes quickly | 0.361 | 0.458 | 0.181 | 0
AI1.6. Feeling comfortable interacting with AI-based educational systems | 0.390 | 0.390 | 0.221 | 0
AI1.7. Capacity of AI in education to make learning more engaging and interactive | 0.164 | 0.397 | 0.438 | 0
AI1.8. The potential of AI to make complex topics easier to understand through visualizations and simulations | 0.333 | 0.212 | 0 | 0.455
AI1.9. Benefits of using AI in education for students’ future careers considering the increasing digitization in all sectors | 0.183 | 0.387 | 0.430 | 0
AI2. AI and human interaction concerns | 0.374 | 0.195 | 0.431 | 0
OC1.1. Feeling comfortable collaborating with peers online | 0.471 | 0.059 | 0.471 | 0
OC1.2. Capacity of online collaboration to enhance students’ learning experience | 0.360 | 0.353 | 0.287 | 0
OC1.3. Ease of communicating and sharing ideas with peers in an online environment | 0 | 0.297 | 0.328 | 0.375
OC1.4. Enrichment of students’ learning by the diverse perspectives shared during online collaborations | 0.333 | 0.178 | 0 | 0.489
OC1.5. Appreciation of students’ contributions in online collaborative tasks | 0.107 | 0.036 | 0 | 0.857
OC1.6. Improving understanding of the subject by the feedback received from peers during online collaboration | 0.354 | 0.408 | 0.238 | 0
OC1.7. Staying engaged with the course material motivated by online collaboration | 0.489 | 0.304 | 0.207 | 0
OC1.8. Developing skills that are relevant for students’ future careers through online collaboration | 0.355 | 0.325 | 0.321 | 0
OC2.1. The difficulty of managing students’ time effectively during online collaborative tasks | 0.264 | 0.208 | 0.528 | 0
OC2.2. The challenge of establishing personal connections with peers in an online environment | 0.181 | 0 | 0.554 | 0.265
EB1.1. Getting distracted when studying from an E-Board, impacting students’ attention on the material | 0.250 | 0.308 | 0.442 | 0
EB1.2. Struggling to focus on the material when using an E-Board for a prolonged period | 0.333 | 0.270 | 0.397 | 0
EB1.3. Effects of the lack of tactile experience in E-Boards as compared to physical books on the students’ ability to learn | 0.060 | 0 | 0.238 | 0.702
EB1.4. The capacity of the interactive elements in E-Boards to sometimes distract from the main learning material | 0.188 | 0 | 0.325 | 0.488
EB1.5. The potential for the convenience of switching between different materials or resources on an E-Board to disrupt students’ focused study time | 0 | 0.244 | 0.488 | 0.268
EB2.1. Greater difficulty in remembering information learned from physical books compared to an E-Board | 0.547 | 0.367 | 0.086 | 0
EB2.2. Better memorization of the material because of studying from physical books | 0.267 | 0.081 | 0 | 0.653
EB2.3. Better comprehension of the subject matter when studying from physical books than E-Boards | 0.326 | 0 | 0.047 | 0.628
EB2.4. Less promotion of in-depth reading and understanding by E-Boards compared to physical books | 0.263 | 0.238 | 0 | 0.500
EB2.5. Greater contribution of traditional book-based learning to retaining and recalling information, despite the benefits of technology | 0.289 | 0.139 | 0 | 0.572
T1.1. Checking smartphones during study sessions | 0 | 0.197 | 0.142 | 0.661
T1.2. Frequent distractions caused by social media notifications when studying | 0 | 0.192 | 0.322 | 0.486
T1.3. The potential for reduced screen time to improve students’ academic performance | 0 | 0.141 | 0.304 | 0.554
T1.4. Frequent use of smartphones or computers for non-study purposes during study time | 0 | 0.102 | 0.285 | 0.613
T1.5. Often exceeding the time that students allocate for the use of technology, thereby cutting into their study time | 0.023 | 0 | 0.326 | 0.652
T1.6. Occasional loss of track of time when using social media, leading to shorter or rushed study sessions | 0 | 0.162 | 0.353 | 0.485
T1.7. Diverting students’ attention from studying by online content (like videos, articles, or social media posts) | 0 | 0.081 | 0.182 | 0.737
T1.8. Difficulty concentrating on studying after using smartphones or other devices | 0 | 0.161 | 0.329 | 0.510
T1.9. Multitasking caused by using digital devices for studying, reducing study effectiveness | 0.085 | 0.085 | 0 | 0.830
T2.1. Regular use of apps or features to limit students’ screen time during study hours | 0.447 | 0.285 | 0.268 | 0
T2.2. Disturbed sleep pattern due to excessive use of technology, affecting students’ study schedule | 0 | 0.257 | 0.505 | 0.239
T2.3. Feeling anxious when unable to check phones or social media for a while, including study time | 0 | 0.129 | 0.284 | 0.586
AI—artificial intelligence, AOQ—students that demonstrate acceptable-quality achievements with defects, MQ—students that demonstrate moderate-quality achievements, VHQ—students that demonstrate very-high-quality achievements, OQ—students that demonstrate outstanding achievements.
Table 7. Aggregate alternatives’ values.
Level | Criterion | AOQ Value (Rank) | MQ Value (Rank) | VHQ Value (Rank) | OQ Value (Rank)
0 | Students’ use of digital technology to support their studies | 0.241 (3) | 0.218 (4) | 0.279 (1) | 0.263 (2)
1 | Use of AI in education | 0.329 (1) | 0.280 (3) | 0.287 (2) | 0.104 (4)
2 | Benefits of AI in education | 0.300 (2) | 0.336 (1) | 0.192 (3) | 0.172 (4)
2 | AI and human interaction concerns | 0.374 (2) | 0.195 (3) | 0.431 (1) | 0 (4)
1 | Online collaboration | 0.299 (1) | 0.233 (3) | 0.294 (2) | 0.174 (4)
2 | Positive engagement in online collaboration | 0.312 (1) | 0.252 (2) | 0.245 (3) | 0.191 (4)
2 | Collaboration challenges | 0.237 (2) | 0.140 (3) | 0.537 (1) | 0.086 (4)
1 | Using an E-Board for learning | 0.230 (3) | 0.176 (4) | 0.257 (2) | 0.337 (1)
2 | E-Board learning challenges | 0.175 (4) | 0.184 (3) | 0.395 (1) | 0.246 (2)
2 | Traditional learning advantages | 0.324 (2) | 0.162 (3) | 0.021 (4) | 0.493 (1)
1 | Excessive use of technology | 0.089 (4) | 0.166 (3) | 0.281 (2) | 0.464 (1)
2 | Technology-induced study distractions | 0.012 (4) | 0.121 (3) | 0.247 (2) | 0.619 (1)
2 | Technology-use regulation effects | 0.246 (3) | 0.257 (2) | 0.348 (1) | 0.149 (4)
AI—artificial intelligence, AOQ—students that demonstrate acceptable-quality achievements with defects, MQ—students that demonstrate moderate-quality achievements, VHQ—students that demonstrate very-high-quality achievements, OQ—students that demonstrate outstanding achievements.
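The overall values in Table 7 are consistent with an additive multi-criteria value model: each alternative’s aggregate value is the weighted sum of its level-1 values, using the group weights from Table 5. A sketch reproducing the level-0 row from the published (rounded) figures:

```python
# Level-1 group weights from Table 5 (AI, OC, EB, T).
weights = {"AI": 0.321, "OC": 0.185, "EB": 0.255, "T": 0.240}

# Level-1 values per alternative, read from Table 7.
values = {
    "AOQ": {"AI": 0.329, "OC": 0.299, "EB": 0.230, "T": 0.089},
    "MQ":  {"AI": 0.280, "OC": 0.233, "EB": 0.176, "T": 0.166},
    "VHQ": {"AI": 0.287, "OC": 0.294, "EB": 0.257, "T": 0.281},
    "OQ":  {"AI": 0.104, "OC": 0.174, "EB": 0.337, "T": 0.464},
}

def aggregate(vals: dict) -> float:
    """Additive value model: weighted sum over the level-1 criteria."""
    return sum(weights[c] * vals[c] for c in weights)

overall = {alt: round(aggregate(v), 3) for alt, v in values.items()}
print(overall)  # VHQ highest, then OQ, AOQ, MQ -- matching Table 7's ranks
```

Because the inputs are rounded to three decimals, the recomputed aggregates can differ from the published ones in the last digit, but the ranking of the four alternatives is unchanged.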

Čančer, V.; Tominc, P.; Rožman, M. Measuring Students’ Use of Digital Technology to Support Their Studies. Educ. Sci. 2025, 15, 842. https://doi.org/10.3390/educsci15070842

