Article

The Impact of Digital Learning Competence on the Academic Achievement of Undergraduate Students

School of Education, Tianjin University, Tianjin 300350, China
* Authors to whom correspondence should be addressed.
Behav. Sci. 2025, 15(7), 840; https://doi.org/10.3390/bs15070840
Submission received: 15 May 2025 / Revised: 4 June 2025 / Accepted: 20 June 2025 / Published: 22 June 2025
(This article belongs to the Special Issue Artificial Intelligence and Educational Psychology)

Abstract

Digital learning competence has gradually become one of the core qualities essential for undergraduate students. To effectively enhance undergraduates’ digital learning abilities and their positive impact on academic performance, this study developed a validated survey on digital learning competence and academic achievement. A total of 312 valid questionnaires were collected from undergraduate students. Descriptive statistical analysis revealed that the overall academic achievement of the sample students was at an upper-middle level, with course achievements and practical achievements being higher than scholarly achievements. Differential analysis showed that male students scored higher than female students in scholarly achievements, practical achievements, and overall academic performance. Additionally, senior students generally outperformed junior students in course achievement, scholarly achievement, and overall academic performance, while undergraduates from key universities generally achieved higher academic results than those from ordinary undergraduate institutions. Correlation and regression analyses indicated that digital learning evaluation competence, a sub-competence of digital learning competence, has significant positive predictive effects on undergraduates’ academic achievement. When other factors remained constant, each unit increase in digital learning evaluation competence was associated with a 0.480-unit increase in academic achievement. Therefore, universities can improve existing student development processes through measures such as enriching carriers, optimizing methods, and creating supportive environments to foster undergraduates’ digital learning competence, thereby enhancing their academic achievement.

1. Introduction

Computer information technology has seen widespread application in education, profoundly transforming educational models, teaching methods, and learning approaches. Digital technology addresses time and space limitations, giving rise to remote education (Oliveira et al., 2021), online education (Turnbull et al., 2021), and blended learning (Galindo-Dominguez, 2021; Wu et al., 2023). Digital tools such as ChatGPT4.0 and Bing Chat (Liu et al., 2024), virtual reality (Holly et al., 2021; Lampropoulos & Kinshuk, 2024), and augmented reality (Hidayat & Wardat, 2024) have expanded students’ learning channels. Traditional teacher-centered classroom instruction is gradually being replaced by student-centered interactive, participatory teaching (Mosqueira-Rey et al., 2023; Yan, 2022). Artificial intelligence applications such as intelligent tutoring systems, educational robots, and personalized learning recommendation systems greatly enhance student autonomy and initiative (X. Chen et al., 2022). AI tools, including online intelligent systems, collaborative robots, and chatbots, also improve teaching efficiency and quality (L. Chen et al., 2020).
The COVID-19 pandemic forced universities to shift away from traditional face-to-face teaching models (Rof et al., 2022), accelerating the popularization of online education. Educational digitalization is gradually becoming normalized. The European Union officially issued the Digital Education Action Plan (2021–2027) in 2020, proposing that high-quality, accessible digital education be widely recognized throughout Europe and supporting member states’ education and training systems to adapt to the digital age (European Commission, 2020). In 2021, the Organisation for Economic Co-operation and Development (OECD), using Hungary’s higher education digital transformation as a pilot, published Supporting the Digital Transformation of Higher Education in Hungary, elevating higher education digitalization to a national strategy and constructing policy frameworks and action pathways (Organisation for Economic Co-operation and Development, 2021a). The United Nations Education Transformation Summit in September 2022 released the Action Initiative to Ensure and Improve the Quality of Universal Public Digital Learning, emphasizing that the international community needs to recognize current trends in higher education digital transformation, and calling for countries to establish new international rules, leverage digital technology advantages to empower teaching and learning, and create digital learning platforms (United Nations Educational, Scientific and Cultural Organization, 2022).
As the primary subjects of educational activities, students’ digital competence has gradually gained widespread attention from governments, society, and academia. The EU’s DigComp framework is currently recognized as a digital competence framework in academia, and is widely used in relevant research (Nguyen et al., 2024; Chaw & Tang, 2024). The latest DigComp 2.2 defines digital competence as “the confident, critical, and responsible use of and engagement with digital technologies for learning, working, and participating in society”. It includes five dimensions—information and data literacy, communication and collaboration, digital content creation, safety, and problem-solving. Digital competence is one of the core competencies for 21st-century citizens and a key interdisciplinary skill for lifelong learning, employment, and active social participation (Vuorikari et al., 2022). Numerous studies demonstrate that digital competence positively impacts student development, for example, by increasing the acceptance of digital learning (Scheel et al., 2022), promoting learning behaviors (Pan et al., 2024), enhancing digital informal learning and academic performance (Mehrvarz et al., 2021), improving English learning outcomes (Niu et al., 2022), increasing online learning satisfaction and thereby improving academic achievement (Younas et al., 2022), enhancing physical education performance (Li et al., 2022), and improving undergraduates’ research capabilities (Mieg et al., 2024).
However, existing research primarily focuses on evaluating students’ overall digital competence (Guitert et al., 2021; X. Wang et al., 2021; Tzafilkou et al., 2022; Martzoukou et al., 2021). Limited attention has been paid to students’ Digital Learning Competence (DLC) in specific learning contexts. Thus, the mechanism by which digital learning competence influences academic achievement remains insufficiently explored. Undergraduates are often involved in digital competence research (Zhao et al., 2021). Therefore, this paper focused on undergraduate students and developed a valid and reliable digital learning competence assessment scale based on a comprehensive literature review. Through questionnaire surveys and statistical analysis, this study aims to address the following questions:
  • What is the structural connotation of undergraduates’ digital learning competence?
  • Which dimensions of digital learning competence have a significant impact on undergraduates’ academic achievement?
  • How can we improve undergraduates’ digital learning competence to enhance their learning outcomes and academic performance, thereby improving talent cultivation quality?

2. Literature Review

2.1. Digital Learning Competence

The term “E-learning” gained popularity in the late 1990s. In 2000, the CEO Forum on Educational Technology (ET-CEO Forum) defined digital learning as an approach that integrates digital technology with curriculum content, emphasizing the construction of digital learning environments, resources, and methodologies that match the talent requirements of the 21st century (CEO Forum on Education and Technology, 2000). The OECD’s PISA 2025 framework highlights that “learning in the digital world” not only requires students to have the ability to use digital tools for knowledge construction and problem-solving, but also emphasizes their self-regulation capabilities in metacognition, behavioral regulation, and emotional management (White et al., 2023). In research on digital learning competence, we need to clearly distinguish related concepts such as “digital literacy”, “digital competence”, and “digital maturity”. Digital literacy originated from the needs of the information society. It refers to the ability to access, evaluate, generate, and communicate information using digital technologies in effective and efficient ways (Hargittai, 2008). It encompasses various abilities, including computer literacy, information and communication technology literacy, information literacy, and media literacy (Liang et al., 2021). Digital literacy represents a fundamental cognitive ability. Digital maturity emphasizes the overall development level of organizations or individuals in digital technology application processes. It involves deep-level aspects such as strategic planning, cultural adaptation, and risk mitigation (Pinto et al., 2023; Koch et al., 2024). In contrast, digital competence is a more comprehensive concept that covers individuals’ abilities to complete various tasks in digital environments, and includes five dimensions—information and data literacy, communication and collaboration, digital content creation, safety, and problem-solving (Vuorikari et al., 2022).
Digital learning competence focuses on specific learning contexts. Research has defined digital learning competence as learners’ ability to conduct effective learning using digital tools in digital environments. It represents a combination of knowledge, skills, and attitudes (Yang et al., 2021; Kallas & Pedaste, 2022). Since digital learning competence is often developed by integrating virtual learning environments, digital learning tools, and methods (Bojórquez-Roque et al., 2024), this study defined digital learning competence as the ability to conduct learning activities in digital learning environments, such as utilizing digital learning resources and digital learning tools for knowledge acquisition, construction, management, and evaluation, thereby achieving improvements in cognition, knowledge, skills, attitudes, and other aspects.

2.2. Academic Achievement

Academic achievement is a crucial indicator for measuring students’ learning performance and educational outcomes. In a narrow sense, academic achievement often equates with students’ learning grades (Cutumisu et al., 2020; Zhang & Zeng, 2024). In a broader sense, the definition encompasses students’ comprehensive performance throughout the educational process, including interpersonal relationships (Luo et al., 2023) and career planning abilities (X. W. Wang et al., 2022) among the measurement indicators. With societal development and the urgent demand for high-quality talent, higher education goals are becoming increasingly diverse and comprehensive. The OECD emphasizes that future education should focus on cultivating individuals with key competencies such as collaboration, creativity, self-regulation, and value judgment (Organisation for Economic Co-operation and Development, 2021b). UNESCO points out that higher education should be committed to promoting students’ comprehensive development, covering multiple dimensions including academic ability, social responsibility, sustainable development awareness, and lifelong learning ability (United Nations Educational, Scientific and Cultural Organization, 2021).
Therefore, this paper defined academic achievement as the comprehensive outcome students obtain through learning activities during a certain period. It encompasses learning results, learning behaviors, and learning attitudes, and covers three aspects—course achievement, scholarly achievement, and practical achievement.

2.3. Relationship Between Digital Learning Competence and Academic Achievement

Academic achievement is often a key element in research. School climate, family background, teacher support, learning motivation, and self-regulation abilities (Demirtas-Zorbaz et al., 2021; Hardaway et al., 2020; Tao et al., 2022; Lavrijsen et al., 2022; C. Chen et al., 2024; J. C. Hong et al., 2022) are common variables in current research. With the development of educational informatization, scholars have gradually focused on the impact of emerging media and technologies on academic achievement. Research indicates that online communication in online learning systems (Suad et al., 2024), and environmental compatibility and personal innovation (W. T. Wang & Lin, 2021), significantly influence students’ academic performance. More frequent use of digital textbooks in classrooms can improve students’ academic performance, academic interest, and learning skills (Lee et al., 2023). Compared to students who use personal computers individually at home, students without personal computers can significantly enhance both their current and their long-term academic performance by frequently using information and communication technology (ICT) with teachers in classrooms (J. Hong et al., 2024). In higher physical education, integrating gamified digital game-based learning (DGBL) with the ARCS model can effectively improve student motivation, academic performance, and practical knowledge (Camacho-Sánchez et al., 2023). Ibrahim and Aldawsari (2023) found through empirical research that digital capabilities not only directly affect students’ academic performance, but also indirectly influence it by enhancing students’ learning self-efficacy. Tran-Duong (2023) discovered that undergraduate students’ media literacy positively impacts perceived learning outcomes, with critical production having the most significant effect on learning outcomes. Pramila-Savukoski et al. (2023) found that health science students’ digital learning competence not only enhances their comprehensive abilities, but also broadens their understanding of related skills. Örnek et al. (2024) analyzed PISA 2018 data, and found that students’ interaction behaviors with ICT significantly correlate with science academic achievement. Positive ICT attitudes and frequent educational ICT use have positive effects on improving science performance, while excessive recreational ICT use may produce negative impacts.
However, the current research on the relationship between digital learning competence and academic achievement still has limitations. Existing studies have not yet established universally accepted evaluation standards for digital learning competence and lack further exploration of its specific impact on academic achievement. Therefore, this study focused on undergraduate students by investigating the influence of digital learning competence on academic achievement, aiming to provide new perspectives for research in this field.

3. Methodology

3.1. Research Design

Questionnaire surveys collect data on specific populations’ behaviors, attitudes, opinions, knowledge, or characteristics, enabling the rapid, systematic collection of large amounts of data for quantitative analysis. This study employed the questionnaire survey method, designing validated questionnaires to collect data from Chinese undergraduate students in order to analyze the relationship between digital learning competence and academic achievement. Specifically, this study reviewed relevant research, summarized existing measurement indicators, and independently developed valid and reliable questionnaires. It investigated undergraduates’ current digital learning competence and academic achievement, explored the impact of digital learning competence on academic achievement, drew research conclusions, and proposed corresponding countermeasures. The questionnaire targeted undergraduate students and consisted of three parts—demographic information, a digital learning competence scale, and an academic achievement survey. The survey was divided into preliminary and formal phases. In the preliminary phase, questionnaires were distributed on a small scale, item analysis and reliability–validity analysis were conducted, and the final formal questionnaire was formed after eliminating invalid items. The formal questionnaire survey was then conducted on a large scale. SPSS 18 was used for descriptive statistical analysis, differential analysis, correlation analysis, and regression analysis of the questionnaire data. These analyses describe undergraduates’ current digital learning competence and academic achievement, explore the correlation between the two, and provide valid evidence for the research conclusions.

3.2. Development of Measurement Tools

3.2.1. Development of Digital Learning Competence Scale

Currently, in the relevant research, there is no unified standard for the dimensional classification of digital learning competence. The UK’s Jisc Digital Capability Framework, based on the EU digital competence framework, particularly emphasizes “learning-type” capability and “lifelong learning” concepts. It proposes six elements of digital capability for higher education and vocational education teachers and students, as follows: ICT/digital proficiency; information, data and media literacy (critical use); digital creation, problem-solving and innovation (creative production); digital communication, collaboration and participation (participation); digital learning and development (development); and digital identity and wellbeing (self-realization) (Jisc, 2021). The International Society for Technology in Education (ISTE) digital competence strategy emphasizes students’ identity recognition in digital learning. It uses various standards to measure students’ digital competence, depending on whether they can do the following: recognize responsibilities and opportunities to contribute to digital communities; use digital tools to rigorously manage various resources to build knowledge, produce creative works, and create meaningful learning experiences for themselves and others; use various technologies in the design process to identify and solve problems through creating new, useful, or imaginative solutions; leverage technological approaches to develop and test solutions, thereby formulating and applying strategies for understanding and solving problems; use platforms, tools, styles, formats, and digital media appropriate to their goals to communicate clearly and express themselves creatively for various purposes; use digital tools to broaden perspectives and enrich their learning by collaborating with others and working effectively in local and global teams (International Society for Technology in Education, 2016). Yang et al. (2021) developed a digital learning competence framework that was effective in assessing secondary school students’ digital learning competence, including six dimensions—technology use, cognitive processing, digital reading skills, time management, peer management, and will management. Pedaste et al. (2023) developed and validated Digitest, a digital learning competence assessment scale specifically for primary and junior high school students, comprising nine dimensions—perceptual control, behavioral attitudes, behavioral intentions, creating digital materials or content, programming digital content, communication in the digital world, operating with digital tools, protecting oneself and others in the digital world, and legal behavior in the digital world. Tan et al. (2024) assessed the digital learning competence of secondary vocational school students via five dimensions—cognitive processing and reading, technology use, thinking ability, activity management, and will management.
Although the aforementioned frameworks provide multi-dimensional perspectives on the composition of digital learning competence, several limitations remain. First, some current models focus primarily on basic education stages, which are insufficient for undergraduate stages wherein learning environments and tasks are relatively complex. There is still a lack of digital learning competence frameworks suitable for undergraduate stages that encompass the higher-order abilities of university students in academic research, knowledge construction, and digital expression. Second, the dimensional divisions in some frameworks are unclear. For example, Yang et al. separately treated three closely related management dimensions—time management, peer management, and will management. Pedaste et al. introduced non-core learning dimensions such as “protecting oneself and others in the digital world” and “legal behavior in the digital world” into their framework, which may dilute the focus of digital learning competence assessment. Finally, the theoretical foundations and dynamic nature of current models require strengthening. Some models, such as the Jisc digital competence framework and ISTE Digital Competence strategies, are built on policy advocacy or technical standards, while lacking the integration of core learning science theories, such as Self-Regulated Learning Theory and Cognitive Load Theory, making it difficult to reveal the mechanisms between different competence dimensions in digital environments. Meanwhile, the emergence of advanced technologies, such as generative AI and large language models, also poses challenges to existing digital learning competence models.
Self-Regulated Learning (SRL) is a core conceptual framework for understanding cognitive, motivational, and emotional aspects of learning. It has become one of the most important research areas in educational psychology (Panadero, 2017). Therefore, this study critically integrated the above digital learning competence frameworks by combining them with Zimmerman’s (2000) Self-Regulated Learning Theory (SRL). We proposed a digital learning competence framework containing five dimensions—awareness, technical skills, behaviors, management, and evaluation (Table 1). Zimmerman’s (2000) SRL model divides the learning process into three phases: Forethought, Performance, and Self-reflection. In the Forethought phase, students analyze learning tasks, clarify learning goals, and formulate corresponding learning plans. Students’ motivational beliefs (such as confidence and task value) play key roles in this phase. They not only provide the psychological driving force for learning activities, but they also influence the selection and activation of subsequent learning strategies. In the Performance phase, students begin specific task execution while monitoring learning progress and adopting various self-control strategies to maintain cognitive engagement and task motivation. In the Self-reflection phase, students evaluate their task completion and attribute causes of success or failure. Positive attribution can have positive effects on future learning attitudes and behaviors, while negative attribution can have negative effects on future learning attitudes and behaviors.
In the five-dimensional framework proposed by this study, digital learning awareness corresponds to the Forethought phase in SRL. It reflects individuals’ cognition, understanding, and attention to digital learning, as well as attitudes toward digital learning tools and resources. It also directly determines whether individuals are willing and able to actively participate in digital learning, serving as the prerequisite for stimulating subsequent behaviors. Digital learning technical skills, digital learning engagement behaviors, and digital learning self-management collectively correspond to the Performance phase. Technical skills represent individuals’ skills and abilities in using digital technologies and tools, serving as the foundation for achieving digital learning. Engagement behaviors represent the series of behaviors and habits individuals display during digital learning processes, including collaboration, communication, and connection abilities, which can reflect individuals’ digital learning states. Self-management represents individuals’ abilities to manage learning resources, learning tools, learning time, and learning emotions during digital learning processes, helping individuals control the quality of digital learning and promote efficient learning. Digital learning evaluation competence highly aligns with the Self-reflection phase. It refers to individuals’ abilities to reasonably and accurately evaluate digital learning resources, tools, environments, processes, and outcomes, helping individuals reflect on and optimize their learning methods and processes to achieve continuous improvement. These five dimensions are interrelated and completely cover the “Forethought–Performance–Self-reflection” cycle of SRL theory. They represent a systematic expression of learners’ self-regulation abilities in digital learning contexts, suitable for measuring and enhancing university students’ digital learning competence in higher education.
The undergraduate digital learning competence scale is shown in Table 1. The scale contains 31 items using a 7-point Likert scale. Strongly agree scores 7 points, somewhat agree scores 6 points, slightly agree scores 5 points, uncertain scores 4 points, slightly disagree scores 3 points, somewhat disagree scores 2 points, and strongly disagree scores 1 point. Based on the initial questionnaire development, first, the critical ratio method was adopted. The total scores of scale items were calculated and ranked. The top 27% of scores were designated as the high-score group, and the bottom 27% as the low-score group. An independent samples t-test was then used to compare differences between high- and low-scoring groups. If p < 0.05, the high- and low-scoring groups showed significant differences. If differences were not significant, the item could not discriminate respondents’ response levels, indicating the item should be modified or deleted. The results indicate that all items except B4 showed significance (p < 0.05). Therefore, item B4 was deleted while other items were retained. Second, item–total correlation analysis was conducted using Pearson correlation coefficients to measure correlations between each item and total scores. High correlations indicate high consistency between items and the overall scale. Items with Pearson correlation coefficients below 0.3 can be adjusted or deleted. All items in this scale’s item–total correlation analysis had correlation coefficients exceeding 0.3. Therefore, all items were retained. Third, validity analysis was performed. The Kaiser–Meyer–Olkin (KMO) value of the digital learning competence scale was 0.711, indicating scale items were suitable for factor analysis. Principal component analysis and varimax rotation were used for exploratory factor analysis. After removing six items (A7, C1, C5, C6, D5, E6) that did not meet dimensional expectations, the scale was divided into five factors. Factor one includes A1, A2, A3, A4, A5 and A6; factor two includes B1, B2, B3, B5, B6 and B7; factor three includes C2, C3 and C4; factor four includes D1, D2, D3 and D4; factor five includes E1, E2, E3, E4 and E5, totaling 24 items. Finally, reliability analysis was conducted. The overall reliability and dimensional reliability test results of the final digital learning competence scale show the following: digital learning competence Cronbach’s Alpha = 0.890, factor one digital learning awareness Cronbach’s Alpha = 0.864, factor two digital learning technical skills Cronbach’s Alpha = 0.817, factor three digital learning engagement behaviors Cronbach’s Alpha = 0.730, factor four digital learning self-management Cronbach’s Alpha = 0.788, and factor five digital learning evaluation competence Cronbach’s Alpha = 0.778. Overall, the scale demonstrates good reliability and can be used to measure undergraduate students’ digital learning competence.
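To make this item-analysis workflow concrete, the sketch below reproduces the three main steps described above (the critical ratio method, item–total correlation, and Cronbach’s alpha) in Python with pandas and SciPy rather than SPSS; the DataFrame `items` and its column names (A1–A6, B1–B7, and so on) are illustrative assumptions, not the study’s data.

```python
import pandas as pd
from scipy import stats

def critical_ratio_flags(items: pd.DataFrame, tail: float = 0.27, alpha: float = 0.05):
    """Flag items whose top-27% and bottom-27% scorers do not differ significantly."""
    total = items.sum(axis=1)
    high = items[total >= total.quantile(1 - tail)]
    low = items[total <= total.quantile(tail)]
    flagged = []
    for col in items.columns:
        _, p = stats.ttest_ind(high[col], low[col])
        if p >= alpha:  # non-significant difference -> poor discrimination, revise or delete
            flagged.append(col)
    return flagged

def item_total_correlations(items: pd.DataFrame) -> pd.Series:
    """Pearson correlation of each item with the total scale score (retain if r >= 0.3)."""
    total = items.sum(axis=1)
    return pd.Series({col: stats.pearsonr(items[col], total)[0] for col in items.columns})

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha from item variances and total-score variance."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)
```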

3.2.2. Development of the Academic Achievement Questionnaire

This study defines academic achievement as the comprehensive outcomes, encompassing learning results, learning behaviors, and learning attitudes, that students obtain through learning activities within a certain period. It covers three aspects: course achievement, scholarly achievement, and practical achievement. Course achievement refers to outcomes students achieve in professional course learning, reflecting the mastery of disciplinary foundational knowledge and skills. It emphasizes systematicity and the standardization of the learning process. The measurement indicators include professional ranking and College English Test Band 4 and Band 6 scores. Scholarly achievement refers to outcomes from students’ participation in research activities, academic exchanges, and disciplinary competitions. It focuses on higher-order cognitive abilities and innovative practice, reflecting academic literacy and problem-solving capabilities. Measurement indicators include participation in research projects, academic conferences, and disciplinary competitions with awards. Practical achievement refers to outcomes formed through students’ social practice, volunteer service, and professional experience activities. It emphasizes knowledge application and social responsibility, reflecting extensions of learning behaviors and the externalization of attitudes. Measurement indicators include participation in social practice, volunteer service, holding positions, participating in off-campus internships (not organized by schools), participating in sports competitions with awards, and participating in arts competitions with awards. The questionnaire adopts the single-choice format with 12 items in total.

3.3. Data Collection

The formal questionnaire consists of three parts: personal basic information, the undergraduate digital learning competence scale, and the undergraduate academic achievement questionnaire (Appendix A). The personal basic information section contains 4 demographic items: gender, grade, disciplinary category of major, and school level. The undergraduate digital learning competence scale is divided into 5 dimensions: digital learning awareness, digital learning technical skills, digital learning engagement behaviors, digital learning self-management, and digital learning evaluation competence, totaling 24 items. The undergraduate academic achievement questionnaire is divided into three parts: course achievement, scholarly achievement, and practical achievement, totaling 12 items.
The formal questionnaire distribution primarily employed online student self-assessment. The digital learning competence scale comprised a 7-point Likert scale; strongly agree scores 7 points, somewhat agree scores 6 points, slightly agree scores 5 points, uncertain scores 4 points, slightly disagree scores 3 points, somewhat disagree scores 2 points, and strongly disagree scores 1 point. Students’ final digital learning competence scores are the sum of scores from 24 items. Scores for digital learning awareness, digital learning technical skills, digital learning engagement behaviors, digital learning self-management, and digital learning evaluation competence are respectively the sum of scores from items in each dimension. The academic achievement questionnaire adopts the single-choice format. Each item is scored from low to high according to the selected answers (for example, professional ranking below 80% scores 1 point, 60–80% scores 2 points, 40–60% scores 3 points, 20–40% scores 4 points, top 20% scores 5 points). Course achievement scores are the sum of the 3 items in that section, scholarly achievement scores are the sum of the 3 items in that section, and practical achievement scores are the sum of the 6 items in that section.
Undergraduate education strives for comprehensive development in moral, intellectual, physical, aesthetic, and labor aspects. Using the case institution’s undergraduate comprehensive quality assessment plan as an example, moral education accounts for 8%, intellectual education accounts for 80%, and physical education, aesthetic education, and labor education each account for 4%. Academic achievement is operationally defined as: Total academic achievement score = Total course achievement score × 50% + Total Scholarly Achievement score × 30% + Total practical achievement score × 20%. This calculation method highlights the core position of course foundations while emphasizing the coordinated development of academic innovation and practical application. It aligns with the OECD’s educational goal of “cultivating collaboration, innovation, and self-regulation abilities” (Organisation for Economic Co-operation and Development, 2021b), and is consistent with UNESCO’s concept of “promoting students’ comprehensive development” (United Nations Educational, Scientific and Cultural Organization, 2021).
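As a worked example of this weighting rule, the short sketch below computes one student’s composite score from hypothetical section scores; the item values are invented for illustration only, and the 50/30/20 weights follow the operational definition above.

```python
# Hypothetical item scores for one student (each item scored 1-5 as described above).
course_items    = [5, 4, 4]               # e.g., professional ranking, CET-4, CET-6
scholarly_items = [3, 2, 1]               # research projects, conferences, competitions
practical_items = [4, 5, 3, 2, 1, 2]      # social practice, volunteering, positions, etc.

course_total    = sum(course_items)       # 13
scholarly_total = sum(scholarly_items)    # 6
practical_total = sum(practical_items)    # 17

# Total academic achievement = course x 50% + scholarly x 30% + practical x 20%.
academic_achievement = 0.50 * course_total + 0.30 * scholarly_total + 0.20 * practical_total
print(academic_achievement)               # 6.5 + 1.8 + 3.4 = 11.7
```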
The formal distribution of the questionnaire was primarily conducted online, with a total of 350 responses collected. Of these, 312 were valid. Regarding the gender composition of the sample, there were 133 males, accounting for 43% of the total sample, and 179 females, accounting for 57%, with a difference of 46 individuals between the two groups. In terms of academic year, the largest group was from the fourth year, comprising 59% of the total sample. The second and third-year students accounted for 14% and 15%, respectively, while fifth-year students comprised the smallest group, at 3%. In terms of academic discipline, the largest group was from the field of management, comprising 22% of the sample, followed by education students at 17%. Students from the fields of science and engineering were represented in similar numbers, while those from the military and the agricultural sciences were the smallest groups. Finally, regarding the institutional level of the schools represented, the distribution was relatively even, with students from Double First-Class universities and other universities comprising nearly equal proportions of the sample.

4. Results

Statistical analysis is a process of collecting, processing, analyzing, and interpreting data using statistical principles and methods. It can scientifically process large amounts of data and extract valuable information from them. This article mainly uses SPSS to conduct descriptive statistical analysis, difference analysis, correlation analysis, and regression analysis on the questionnaire survey data to investigate the current status of undergraduate students’ digital learning ability and academic achievement, and explore the correlation between undergraduate students’ digital learning ability and academic achievement.

4.1. Descriptive Statistical Analysis

Table 2 presents a descriptive statistical analysis of undergraduate digital learning competence and academic achievement across various dimensions. The overall mean score for digital learning competence was 5.54 (SD = 0.47), indicating that undergraduates demonstrate relatively high levels of digital learning competence with minimal internal sample variation and an overall concentration of scores. Specifically, digital learning engagement behaviors (M = 5.85, SD = 0.66) and digital learning awareness (M = 5.80, SD = 0.66) received the highest scores with smaller standard deviations, indicating students performed best and most consistently in goal setting, process monitoring, and technology acceptance. Digital learning technical competence had the lowest mean and highest standard deviation (M = 5.06, SD = 0.76), suggesting students performed worst in technology utilization with significant individual differences. Additionally, digital learning self-management (M = 5.57, SD = 0.65) and evaluation competence (M = 5.59, SD = 0.71) both scored above the overall average, indicating that students performed at upper-middle levels in digital learning resource integration and outcome reflection.
The overall mean for academic achievement was 8.22 (SD = 2.06), indicating that undergraduate academic achievement was at an upper-middle level, though development imbalances existed across different dimensions. Practical achievement had the highest mean and greatest dispersion (M = 15.78, SD = 4.44), demonstrating significant student accomplishments in practical activities but with notable individual differences. Course achievement had a mean of 9.32 (SD = 2.62), indicating good student performance in coursework. Scholarly achievement had the lowest mean of only 5.49 (SD = 2.39), revealing substantial room for improvement in research output and academic competitions.
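For readers who wish to reproduce this kind of summary outside SPSS, a minimal sketch is shown below; it assumes a DataFrame `df` whose columns hold the dimension and achievement scores under illustrative names.

```python
# Means and standard deviations per dimension, rounded as in Table 2.
columns = ["awareness", "technical", "behaviors", "management", "evaluation",
           "course", "scholarly", "practical", "academic_achievement"]
summary = df[columns].agg(["mean", "std"]).T.round(2)
print(summary)
```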

4.2. Differential Analysis

To explore the mechanism by which digital learning competence affects academic achievement, this study conducted differential analyses across multiple dimensions of the survey data. The specific results are as follows (Table 3).
First, regarding gender differences, independent sample t-tests revealed no significant difference in course achievement between genders (p > 0.05). However, significant differences were found in scholarly achievement, practical achievement, and overall academic achievement (p < 0.05). These differences may stem from gender-based variations in intelligence types, learning motivation, social expectations, and educational approaches.
Second, regarding grade-level differences, one-way ANOVA (Analysis of Variance) revealed no significant difference in practical achievement across grade levels (p > 0.05). However, significant differences were found in course achievement, scholarly achievement, and overall academic achievement (p < 0.05). This may be attributed to senior students accumulating more scientific knowledge, learning methods, learning techniques, and learning resources during their extended digital learning experiences, which contributes to improved academic achievement levels.
Third, regarding disciplinary differences, one-way ANOVA revealed no significant differences in course achievement, scholarly achievement, practical achievement, or overall academic achievement across different disciplines (p > 0.05). This indicates that the influence of digital learning competence on academic achievement has cross-disciplinary universality.
Finally, regarding university tier differences, one-way ANOVA revealed significant differences in course achievement, scholarly achievement, practical achievement, and overall academic achievement across different university tiers (p < 0.05). This may be due to key universities having advantages in terms of digital teaching resources, faculty strength, and learning environments. Additionally, differences in student quality and learning potential exist among different universities.
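The two tests used in this section can be sketched as follows, assuming the same illustrative DataFrame `df` with `gender`, `grade`, and `university_tier` columns alongside the achievement scores; this mirrors the analysis in spirit but is not the authors’ SPSS output.

```python
from scipy import stats

# Gender difference in overall academic achievement: independent-samples t-test.
male = df.loc[df["gender"] == "male", "academic_achievement"]
female = df.loc[df["gender"] == "female", "academic_achievement"]
t_stat, p_gender = stats.ttest_ind(male, female)

# Grade-level difference: one-way ANOVA across grade groups.
grade_groups = [group["academic_achievement"].values for _, group in df.groupby("grade")]
f_stat, p_grade = stats.f_oneway(*grade_groups)

print(f"gender: t = {t_stat:.2f}, p = {p_gender:.3f}")
print(f"grade:  F = {f_stat:.2f}, p = {p_grade:.3f}")
```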

4.3. Correlation and Regression Analysis

4.3.1. Correlation Analysis

Correlation analysis can quantify the strength and direction of linear relationships between variables. Through correlation analysis, this study found the following results (Table 4).
Course achievement showed a significant positive correlation with digital learning evaluation competence, demonstrating significance at the 0.05 level. However, it showed no significant correlation with digital learning awareness, technical competence, behavioral competence, management competence, or overall digital learning competence.
Scholarly achievement demonstrated significant positive correlations with digital learning technical competence, management competence, evaluation competence, and overall digital learning competence. The correlations with digital learning technical competence, management competence, and evaluation competence all showed significance at the 0.01 level, while the correlation with overall digital learning competence showed significance at the 0.05 level. Scholarly achievement showed no significant correlation with digital learning awareness or behavioral competence.
Practical achievement exhibited significant positive correlations with digital learning technical competence, evaluation competence, and overall digital learning competence, all with significance at the 0.01 level. No significant correlations were found between practical achievement and digital learning awareness, behavioral competence, or management competence.
Overall academic achievement demonstrated significant positive correlations with digital learning technical competence, management competence, evaluation competence, and overall digital learning competence, all with significance at the 0.01 level. No significant correlations were found between overall academic achievement and digital learning awareness or behavioral competence.
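A hedged sketch of this kind of correlation matrix, again assuming the illustrative DataFrame `df` and column names used above:

```python
from scipy import stats

dlc_cols = ["awareness", "technical", "behaviors", "management", "evaluation", "dlc_total"]
achievement_cols = ["course", "scholarly", "practical", "academic_achievement"]

# Pearson r with two-tailed p-values; flag significance at the 0.05 and 0.01 levels.
for ach in achievement_cols:
    for dlc in dlc_cols:
        r, p = stats.pearsonr(df[ach], df[dlc])
        flag = "**" if p < 0.01 else ("*" if p < 0.05 else "")
        print(f"{ach} ~ {dlc}: r = {r:.2f}{flag} (p = {p:.3f})")
```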

4.3.2. Regression Analysis

As shown in Table 4, overall academic achievement demonstrated significant positive correlations with three dimensions of digital learning competence: technical competence, management competence, and evaluation competence. To investigate the quantitative relationships between academic achievement and these three dimensions, this study used academic achievement as the dependent variable and digital learning technical competence, management competence, and evaluation competence as independent variables. Gender, grade level, and university tier—demographic variables that showed significant differences in academic achievement—were used as control variables. Stepwise multiple linear regression analysis was employed, with the results shown in Table 5.
Table 5 indicates that the effects of digital learning technical competence and management competence on academic achievement were not significant; therefore, they were removed and did not enter the regression equation. The model passed the F-test (F = 17.462, p < 0.001). After controlling for the confounding effects of gender, grade level, and university tier, and based on the multiple regression model Y = β0 + β1X1 + β2X2 + … + βkXk + βk+1Z1 + βk+2Z2 + … + βk+mZm + ε, substituting the results from the table yields
Y = 4.673 + 0.480X1 + 1.458Z1 − 0.638Z2 + 0.510Z3
where Y represents academic achievement, X1 represents digital learning evaluation competence, Z1 represents “Double First-Class” universities, Z2 represents lower grade levels, and Z3 represents male gender. The regression equation indicates that digital learning evaluation competence has a positive impact on undergraduate academic achievement. When other factors remain constant, each one-unit increase in digital learning evaluation competence corresponds to a 0.480-unit increase in academic achievement.
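To illustrate how such a model can be fitted (the final model with controls, not the stepwise selection procedure itself), the sketch below uses statsmodels OLS; the column names and dummy codings for Z1–Z3 are assumptions for demonstration and do not reproduce the study’s exact coding.

```python
import pandas as pd
import statsmodels.api as sm

# Dummy-code the control variables that showed significant group differences.
X = pd.DataFrame({
    "evaluation": df["evaluation"],                                                      # X1
    "double_first_class": (df["university_tier"] == "double_first_class").astype(int),   # Z1
    "lower_grade": df["grade"].isin([1, 2]).astype(int),                                 # Z2 (assumed coding)
    "male": (df["gender"] == "male").astype(int),                                        # Z3
})
X = sm.add_constant(X)

# Ordinary least squares with overall academic achievement as the dependent variable.
model = sm.OLS(df["academic_achievement"], X).fit()
print(model.summary())   # unstandardized coefficients analogous to the B values in Table 5
```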

5. Discussion

This study focused on undergraduate students, examining the multidimensional construction of digital learning competence and its impact on academic achievement. By systematically integrating core elements, including information technology skills, learning strategies, and self-regulation, it innovatively proposes a five-dimensional digital learning competence framework encompassing awareness, technology, behavior, management, and evaluation. This framework not only enriches structural model research on digital learning competence, but also provides a scientific and feasible tool for assessing digital learning competence in higher education. It fills a research gap in this field at the level of higher education, offering significant theoretical contributions and practical application value.
Regarding theoretical contributions, firstly, this research has expanded the theoretical boundaries of digital learning competence. Traditional digital learning competence research primarily focuses on primary and secondary school students. This study extended the research subjects to undergraduates in higher education for the first time. This shift not only demonstrates the continuity and developmental nature of educational research, but also reveals potential differences in digital learning competence across educational stages. By constructing a five-dimensional framework, this study has deepened the understanding of the essential characteristics of digital learning competence. It has clarified the interaction mechanisms among awareness, technology, behavior, management, and evaluation dimensions, providing a solid theoretical foundation for subsequent research.
Second, the difference analysis results for gender and institutional level provide new research perspectives. This study found that male students scored higher than female students in scholarly achievement, practical achievement, and academic achievement. Laaber et al. (2023) proposed that differences in agreeableness, conscientiousness, and negative emotions affect individual digital maturity development. Compared to male students, female students experience more negative emotional arousal (Lin & Tsai, 2018) and consider themselves to have lower general computer self-efficacy (Sobieraj & Krämer, 2020). They show lower enthusiasm for using digital technologies (VR) in teaching and assessment (Mahling et al., 2023). Traditional social stereotypes that boys are more suited to learning in science and technology courses also adversely affect female participation enthusiasm in STEM (Moss-Racusin et al., 2018). In the future, we need more research exploring how to maintain and enhance female students’ interest in digital learning, promote female students’ academic and practical participation in artificial intelligence projects, and improve their digital learning competence and academic achievement.
Meanwhile, in the institutional difference dimension, students from “Double First-Class” universities generally reached higher academic achievement than students from non-Double First-Class institutions. This indicates that digital learning competence development not only depends on individual literacy, but is also deeply influenced by external conditions such as institutional resource support, technological environments, and organizational culture. This reflects the important role of institutional-level digital maturity in student development. Abdullah (2023) proposed that university digital maturity includes seven dimensions: leadership, planning and management, quality assurance, scientific research, digital teaching and learning, community service, equipment and technological infrastructure, and technological culture. Compared to non-Double First-Class institutions, “Double First-Class” universities typically possess more advanced digital learning platforms, richer online resource libraries, policies and cultures that encourage teachers to apply digital technologies, and more comprehensive digital skills training support. These factors collectively create more favorable institutional environments for students’ digital learning competence development and academic achievement improvement. This perspective provides important insights for understanding the roles that universities at different levels play in enhancing students’ digital learning competence. It also provides theoretical foundations for future research to further explore associations between institutional support and learning effectiveness.
Additionally, this study validates the positive correlation between digital learning competence and academic achievement. A significant positive correlation was found between digital learning competence and student academic achievement; in particular, digital learning evaluation competence has a significant positive predictive effect on academic achievement. This finding is consistent with Tran-Duong’s (2023) finding that critical production in undergraduate media literacy has the most significant impact on learning outcomes. It not only validates core concepts of Self-Regulated Learning theory, namely the central role of critical resource assessment and metacognitive reflection in academic success (Ateş, 2024; S. L. Wang et al., 2024; Huang & Lin, 2024), but also provides important theoretical evidence for educational practitioners: enhancing students’ digital learning competence, especially evaluation competence, is an effective approach to improving academic achievement and helps students avoid overdependence on technology. Finally, the study reveals the importance of critical resource assessment and metacognitive reflection, emphasizing their central position in digital learning competence. In the era of the information explosion, how students effectively screen, evaluate, and utilize digital resources, and how they optimize learning processes through metacognitive reflection, become key factors determining learning effectiveness. This finding enriches the understanding of digital learning competence and provides theoretical support for cultivating new-era talents with information literacy and autonomous learning abilities.
Notably, although digital learning awareness and digital learning engagement behaviors are important components of digital learning competence, these two dimensions did not show significant correlations with academic achievement. In this survey, undergraduates scored relatively high in digital learning awareness (M = 5.80), indicating that undergraduates generally recognize the value of digital learning. This may be related to the normalization of online learning after the COVID-19 pandemic. When group awareness levels are generally high, their differential impact on academic achievement may be diluted, resulting in non-significant correlations. According to Self-Regulated Learning Theory (Zimmerman, 2000), while “motivation” at the awareness level may stimulate learners’ initial participation willingness, without subsequent technical support, strategic planning, and metacognitive control, this “awareness” often fails to continuously transform into high-quality learning behaviors and outcomes, leading to non-significant correlations between digital learning awareness and academic achievement. Digital learning engagement behaviors such as online communication, resource sharing, or platform usage frequency can reflect students’ participation status in digital environments. However, frequent online participation does not equal deep cognitive processing. According to Cognitive Load Theory (CLT) (Sweller et al., 2011), ineffective behavioral participation may increase extraneous cognitive load. Students with high behavioral competence scores in this study may have spent energy on shallow online interactions while neglecting deep knowledge processing, resulting in non-significant correlations between digital learning engagement behaviors and academic achievement. In the context of global higher education digital transformation, there is a need to enhance undergraduates’ digital learning competence to improve their learning effectiveness and performance, thereby improving talent cultivation quality. Therefore, enhancing students’ digital learning competence, especially digital learning evaluation competence, should become an important goal of university general education. Infrastructure and equipment, resources and support, ICT policies, and teacher training are crucial for higher education institutions seeking to implement digital strategies, meet students’ growing digital learning requirements, and maintain competitiveness in the global environment (Esteve-Mon et al., 2023). Therefore, based on the research findings, this study advocates for the “Education 4.0” concept, calling for integrating cutting-edge technologies and reconstructing educational ecosystems to achieve more personalized, flexible, and lifelong learning experiences. This addresses the contradiction between standardized teaching and personalized needs in traditional educational systems, and helps education adapt to future societal requirements for talent. Specifically, suggestions for the optimization of university student cultivation processes are proposed in three dimensions: enriching carriers, optimizing methods, and creating environments.
First, in enriching carriers, universities should regard learners as central actors, pay attention to individual differences, formulate tailored learning paths through data analysis, and cultivate interdisciplinary and innovative capabilities. Universities should integrate digital technology content, such as programming and data analysis, into curriculum systems, enhance students’ technical application abilities through specialized training, and use generative artificial intelligence chatbots and large language models (LLM) to enhance students’ digital learning and boost academic achievement (Shahzad et al., 2025). Establishing digital learning evaluation courses by systematically integrating capabilities, such as resource assessment, information retrieval, and critical thinking, into professional teaching can strengthen students’ self-evaluation abilities regarding learning processes and outcomes. Moreover, in optimizing methods, we emphasize the combination of personalized and practical teaching. Universities should leverage technologies such as artificial intelligence (AI), big data, virtual reality (VR/AR), blockchain, and the Internet of Things (IoT) to optimize teaching processes. Teachers can formulate differentiated teaching plans based on students’ grade levels and ability levels, strengthen basic guidance for lower-grade students, and promote self-regulation ability development in higher-grade students. We can also introduce actual case teaching into classrooms, and promote cross-grade collaborative communication through research competitions, academic salons, and club activities, helping students use digital tools to solve real problems. Last but not least, in creating environments, Jeong et al. (2024) found through an analysis of PISA 2018 data that digital capital accumulation at the school, teacher, and student levels can influence students’ academic achievement. Particularly, ICT software infrastructure—referring to commonly adopted educational platforms and digital tools rather than a specific version—has important positive correlations with student performance. Universities should build intelligent and immersive learning environments, and improve network resources, teaching platforms, and hardware facilities. Universities should also construct supportive cultural ecosystems, provide specialized training and role-model inspiration for female students to enhance their willingness to participate in digital learning, and promote inter-institutional resource sharing to alleviate educational imbalances.
The limitations of this study include the following. First, the questionnaire was primarily distributed online, resulting in certain sample selection constraints. Demographic indicators did not include respondents’ socioeconomic backgrounds, geographic distribution, or other information. This may thus not fully represent the digital learning competence and academic achievement situations of all undergraduate student populations. Second, although the study considered control variables such as gender, grade level, and school type, other unidentified or uncontrolled variables may exist. These could include learning motivation, socioeconomic status, family background, etc. Such variables might affect the accuracy of the research results.
Regarding future research prospects, first, with the rapid development of digital technology, the connotations of undergraduate digital learning competence continue to evolve. Future research should deeply analyze the structural connotations of digital learning competence, clarify its core elements, and develop more accurate, comprehensive, and easily operable measurement tools. These would help effectively assess undergraduates’ digital learning competence and provide personalized learning feedback and guidance recommendations. Second, sample diversity and representativeness are crucial for the universality of research results. Future research should seek broader sample populations containing undergraduates from different regions, cultural backgrounds, and educational levels. This would help to explore differences in digital learning competence among different groups and to propose more targeted suggestions to improve undergraduate digital learning competence and academic achievement levels, enhancing the universality of research findings. Third, academic achievement is influenced by multiple factors. To more accurately assess the impact of undergraduate digital learning competence on academic achievement, future research should identify and control as many variables as possible that may affect academic achievement (these include learning motivation, socioeconomic status, family background, etc.) to improve the accuracy of research results. Fourth, this study did not directly measure the digital maturity of higher education institutions. Future research can consider the digital maturity of higher education institutions in the context of digital transformation as an important research direction. Researchers could develop and validate university digital maturity assessment models to examine the moderating or mediating effects of university digital maturity in the relationship between students’ digital learning competence and academic achievement. Finally, quantitative and qualitative research methods each have their advantages and disadvantages. Future research could attempt to combine both approaches. While collecting extensive data through questionnaire surveys, researchers could also try to obtain more comprehensive research perspectives through interviews, observations, or case studies. This would provide richer insights into and suggestions for educational practice.

6. Conclusions

Although digital learning competence has gradually become one of the core competencies essential for students, its specific mechanisms of influence on academic achievement have not been fully explored. Taking undergraduate students as the research subjects, this study independently developed and validated measurement tools for digital learning competence and academic achievement with good reliability and validity, collected 312 valid questionnaires, and conducted a systematic empirical analysis.
The results indicate that the surveyed undergraduates demonstrated relatively high overall digital learning competence, with digital learning awareness and engagement behaviors being the strongest and technical competence relatively weaker. Academic achievement was generally at the upper-middle level, with course achievement and practical achievement outperforming scholarly achievement. Significant gender differences were observed in scholarly achievement, practical achievement, and overall academic achievement. Senior students outperformed junior students in course, scholarly, and overall academic achievement, and students from “Double First-Class” and other key universities achieved better academic results than those from regular undergraduate institutions. A significant positive correlation exists between digital learning competence and academic achievement; among the sub-competences, digital learning evaluation competence is particularly prominent, showing a significant positive predictive effect on academic achievement. This study therefore suggests multidimensional strategies, including optimizing curriculum systems, personalizing teaching methods, and constructing intelligent learning environments, to help university students apply digital technologies, especially AI technologies, scientifically, thereby further enhancing undergraduates’ digital learning competence and academic achievement.

Author Contributions

Conceptualization, Y.S. and S.L.; methodology, Y.S.; software, S.L.; validation, Y.S., S.L. and M.W.; formal analysis, M.W. and Z.W.; investigation, S.L., M.W. and Z.W.; resources, Y.S. and W.D.; data curation, S.L., Y.S. and W.D.; writing—original draft preparation, S.L., Y.S. and M.W.; writing—review and editing, Z.W., Y.S. and M.W.; visualization, W.D., Y.S. and S.L.; supervision, W.D. and Y.S.; project administration, Y.S. and W.D.; funding acquisition, W.D. and Y.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the Chinese National Social Science General Project “Research on the Formation Mechanism and Guidance Strategies of Public Health Information Avoidance Behavior Based on Multimodality” (funding number: 23BTQ069).

Institutional Review Board Statement

This study was conducted in accordance with the Declaration of Helsinki. Ethical review and approval were waived for this study according to the Measures for Ethical Review of Life Science and Medical Research Involving Humans (Article 32, Chapter 3).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors on request.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

Survey on Digital Learning Capabilities and Academic Achievement of Undergraduates

Dear Students:
Thank you for taking time from your busy schedule to participate in this survey. We are the Digital Learning in Higher Education Research Group from Tianjin University. This questionnaire aims to investigate the impact of undergraduate students’ digital learning capabilities on academic achievement. This survey is for thesis research purposes and uses anonymous responses. We guarantee that your responses will not affect you in any way, and your information will never be disclosed. We sincerely thank you for your help and support. We wish you a pleasant life and successful studies!
  • Personal Information (Single Choice Questions)
    • Your gender:
      ○ Male ○ Female
    • Your grade level:
      ○ Freshman ○ Sophomore ○ Junior ○ Senior ○ Fifth year (5-year program)
    • Your major’s discipline:
      ○ Philosophy ○ Economics ○ Law ○ Education ○ Literature ○ History ○ Science
      ○ Engineering ○ Agriculture ○ Medicine ○ Military Science ○ Management ○ Arts
    • Your university category:
      ○ “985 Project” institution ○ “211 Project” institution
      ○ Other “Double First-Class” institutions (excluding 985 and 211) ○ Others
  • Undergraduate Digital Learning Capability Scale (Matrix Scale Questions)
    A. Digital Learning Awareness
    (Strongly Disagree / Somewhat Disagree / Slightly Disagree / Uncertain / Slightly Agree / Somewhat Agree / Strongly Agree)
    I can distinguish digital learning from other learning methods
    I am willing to use mobile phones, computers, tablets, and other digital devices for learning
    When encountering learning problems, I am willing to use digital methods to solve them
    I am willing to actively explore new functions of digital learning platforms and tools
    I believe digital learning can improve my learning efficiency
    I believe digital learning can enhance my learning outcomes
    B. Digital Learning Technical Skills
    (Strongly Disagree / Somewhat Disagree / Slightly Disagree / Uncertain / Slightly Agree / Somewhat Agree / Strongly Agree)
    When encountering problems, I can quickly determine which digital tool to use
    I can obtain needed learning resources through the Internet
    I can adapt to the operational requirements of different digital learning platforms and tools
    I can skillfully use application software related to my major for learning and research, such as SPSS, CAD, PS, Python
    I can use multimedia tools and software, such as image editors and video editors, to create and edit digital learning content
    I can independently solve technical problems that may arise in digital learning, including software failures and network issues
    C. Digital Learning Engagement Behaviors
    (Strongly Disagree / Somewhat Disagree / Slightly Disagree / Uncertain / Slightly Agree / Somewhat Agree / Strongly Agree)
    I often actively participate in discussions on social media platforms such as Zhihu, Weibo, and Douban
    I often communicate and collaborate with teachers through information tools such as WeChat, QQ, and email
    I often communicate and collaborate with classmates through information tools such as WeChat, QQ, and email
    D. Digital Learning Self-Management
    (Strongly Disagree / Somewhat Disagree / Slightly Disagree / Uncertain / Slightly Agree / Somewhat Agree / Strongly Agree)
    I can categorize and manage learning resources obtained from the Internet
    I can categorize and manage digital learning tools
    I can establish clear learning plans and goals, and control learning progress
    I can allocate learning time reasonably to prevent procrastination and time waste
    E. Digital Learning Evaluation Competence
    (Strongly Disagree / Somewhat Disagree / Slightly Disagree / Uncertain / Slightly Agree / Somewhat Agree / Strongly Agree)
    I can evaluate the credibility and effectiveness of obtained digital learning resources
    I can evaluate the advantages and disadvantages of digital learning platforms and tools I use
    I can evaluate the advantages and disadvantages of digital learning environments
    I can evaluate my own learning outcomes
    I evaluate my learning outcomes through course grades
  • Current Status of Undergraduate Academic Achievement (Single Choice Questions)
    A. Academic Performance
    • My major ranking is:
      ○ Top 20% ○ 20–40% ○ 40–60% ○ 60–80% ○ Bottom 20%
    • My CET-4 score is:
      ○ 425–500 ○ 501–550 ○ 551–600 ○ 601–650 ○ Above 650 ○ Failed ○ Not taken
    • My CET-6 score is:
      ○ 425–500 ○ 501–550 ○ 551–600 ○ 601–650 ○ Above 650 ○ Failed ○ Not taken
    B. Academic Achievement
    • Number of research projects participated in:
      ○ 0 ○ 1 ○ 2 ○ 3 ○ More than 3
    • Number of academic conferences attended:
      ○ 0 ○ 1 ○ 2 ○ 3 ○ More than 3
    • Number of disciplinary competitions with awards:
      ○ 0 ○ 1 ○ 2 ○ 3 ○ More than 3
    C. Practical Activities
    • Number of social practice activities participated in:
      ○ 0 ○ 1 ○ 2 ○ 3 ○ More than 3
    • Number of volunteer services participated in:
      ○ 0 ○ 1 ○ 2 ○ 3 ○ More than 3
    • Number of positions held in university/college student organizations, student clubs, practical education bases, party branches, or class committees:
      ○ 0 ○ 1 ○ 2 ○ 3 ○ More than 3
    • Number of off-campus internships (not organized by school):
      ○ 0 ○ 1 ○ 2 ○ 3 ○ More than 3
    • Number of sports competitions with awards:
      ○ 0 ○ 1 ○ 2 ○ 3 ○ More than 3
    • Number of arts competitions with awards:
      ○ 0 ○ 1 ○ 2 ○ 3 ○ More than 3

References

  1. Abdullah, M. A. (2023). Digital maturity of the Egyptian universities: Goal-oriented project planning model. Studies in Higher Education, 49(8), 1463–1485. [Google Scholar] [CrossRef]
  2. Ateş, H. (2024). Designing a self-regulated flipped learning approach to promote students’ science learning performance. Educational Technology & Society, 27(1), 65–83. [Google Scholar] [CrossRef]
  3. Bojórquez-Roque, M. S., Garcia-Cabot, A., Garcia-Lopez, E., & Oliva-Córdova, L. M. (2024). Digital competence learning ecosystem in higher education: A mapping and systematic review of the literature. IEEE Access, 12, 87596–87614. [Google Scholar] [CrossRef]
  4. Camacho-Sánchez, R., Serna Bardavío, J., Rillo-Albert, A., & Lavega-Burgués, P. (2023). Enhancing motivation and academic performance through gamified digital game-based learning methodology using the ARCS model. Interactive Learning Environments, 32(10), 6868–6885. [Google Scholar] [CrossRef]
  5. CEO Forum on Education and Technology. (2000). The power of digital learning: Integrating digital content (The CEO forum school technology and readiness report, year three). Available online: https://files.eric.ed.gov/fulltext/ED447781.pdf (accessed on 2 January 2025).
  6. Chaw, L. Y., & Tang, C. M. (2024). Exploring the relationship between digital competence proficiency and student learning performance. European Journal of Education, 59(1), e12593. [Google Scholar] [CrossRef]
  7. Chen, C., Jamiat, N., Abdul Rabu, S. N., & Mao, Y. (2024). Effects of a self-regulated-based gamified interactive e-books on primary students’ learning performance and affection in a flipped mathematics classroom. Education and Information Technologies, 29(1), 24143–24180. [Google Scholar] [CrossRef]
  8. Chen, L., Chen, P., & Lin, Z. (2020). Artificial intelligence in education: A review. IEEE Access, 8, 75264–75278. [Google Scholar] [CrossRef]
  9. Chen, X., Zou, D., Xie, H., Cheng, G., & Liu, C. (2022). Two decades of artificial intelligence in education. Educational Technology & Society, 25(1), 28–47. [Google Scholar]
  10. Cutumisu, M., Schwartz, D. L., & Lou, N. M. (2020). The relation between academic achievement and the spontaneous use of design-thinking strategies. Computers & Education, 149, 103806. [Google Scholar] [CrossRef]
  11. Demirtas-Zorbaz, S., Akin-Arikan, C., & Terzi, R. (2021). Does school climate that includes students’ views deliver academic achievement? A multilevel meta-analysis. School Effectiveness and School Improvement, 32(4), 543–563. [Google Scholar] [CrossRef]
  12. Esteve-Mon, F. M., Postigo-Fuentes, A. Y., & Castañeda, L. (2023). A strategic approach of the crucial elements for the implementation of digital tools and processes in higher education. Higher Education Quarterly, 77(3), 558–573. [Google Scholar] [CrossRef]
  13. European Commission. (2020). Digital education action plan 2021–2027: Resetting education and training for the digital age. Publications Office of the European Union. Available online: https://ec.europa.eu/education/education-in-the-eu/digital-education-action-plan_en (accessed on 2 January 2025).
  14. Galindo-Dominguez, H. (2021). Flipped classroom in the educational system. Educational Technology & Society, 24(3), 44–60. [Google Scholar]
  15. Guitert, M., Romeu, T., & Baztán, P. (2021). The digital competence framework for primary and secondary schools in Europe. European Journal of Education, 56(1), 133–149. [Google Scholar] [CrossRef]
  16. Hardaway, C. R., Sterrett-Hong, E. M., De Genna, N. M., & Cornelius, M. D. (2020). The role of cognitive stimulation in the home and maternal responses to low grades in low-income african american adolescents’ academic achievement. Journal of Youth and Adolescence, 49, 1043–1056. [Google Scholar] [CrossRef]
  17. Hargittai, E. (2008). An update on survey measures of web-oriented digital literacy. Social Science Computer Review, 27(1), 130–137. [Google Scholar] [CrossRef]
  18. Hidayat, R., & Wardat, Y. (2024). A systematic review of augmented reality in science, technology, engineering and mathematics education. Education and Information Technologies, 29(8), 9257–9282. [Google Scholar] [CrossRef]
  19. Holly, M., Pirker, J., Resch, S., Brettschuh, S., & Gütl, C. (2021). Designing VR experiences–expectations for teaching and learning in VR. Educational Technology & Society, 24(2), 107–119. [Google Scholar]
  20. Hong, J., Liu, W., & Zhang, Q. (2024). Closing the digital divide: The impact of teachers’ ICT use on student achievement in China. Journal of Comparative Economics, 52(3), 697–713. [Google Scholar] [CrossRef]
  21. Hong, J. C., Liu, X., Cao, W., Tai, K. H., & Zhao, L. (2022). Effects of self-efficacy and online learning mind states on learning ineffectiveness during the COVID-19 lockdown. Educational Technology & Society, 25(1), 142–154. [Google Scholar]
  22. Huang, H., & Lin, H. C. (2024). ChatGPT as a life coach for professional identity formation in medical education. Educational Technology & Society, 27(3), 374–389. [Google Scholar] [CrossRef]
  23. Ibrahim, R. K., & Aldawsari, A. N. (2023). Relationship between digital capabilities and academic performance: The mediating effect of self-efficacy. BMC Nursing, 22, 434. [Google Scholar] [CrossRef]
  24. International Society for Technology in Education. (2016). ISTE standards for students. Available online: https://iste.org/standards/students (accessed on 17 January 2025).
  25. Jeong, D. W., Moon, H., Jeong, S. M., & Moon, C. J. (2024). Digital capital accumulation in schools, teachers, and students and academic achievement: Cross-country evidence from the PISA 2018. International Journal of Educational Development, 107, 103024. [Google Scholar] [CrossRef]
  26. Jisc. (2021). Building digital capabilities: The six elements of digital capability. Available online: https://repository.jisc.ac.uk/8846/1/2022_Jisc_BDC_Individual_Framework.pdf (accessed on 4 January 2025).
  27. Kallas, K., & Pedaste, M. (2022). How to improve the digital competence for e-learning? Applied Sciences, 12(13), 6582. [Google Scholar] [CrossRef]
  28. Koch, T., Laaber, F., & Florack, A. (2024). Socioeconomic status and young people’s digital maturity: The role of parental mediation. Computers in Human Behavior, 154, 108157. [Google Scholar] [CrossRef]
  29. Laaber, F., Florack, A., Koch, T., & Hubert, M. (2023). Digital maturity: Development and validation of the Digital Maturity Inventory (DIMI). Computers in Human Behavior, 143, 107709. [Google Scholar] [CrossRef]
  30. Lampropoulos, G., & Kinshuk. (2024). Virtual reality and gamification in education: A systematic review. Educational Technology Research and Development, 72(3), 1691–1785. [Google Scholar] [CrossRef]
  31. Lavrijsen, J., Vansteenkiste, M., Boncquet, M., & Verschueren, K. (2022). Does motivation predict changes in academic achievement beyond intelligence and personality? A multitheoretical perspective. Journal of Educational Psychology, 114(4), 772–790. [Google Scholar] [CrossRef]
  32. Lee, S., Lee, J.-H., & Jeong, Y. (2023). The effects of digital textbooks on students’ academic performance, academic interest, and learning skills. Journal of Marketing Research, 60(4), 792–811. [Google Scholar] [CrossRef]
  33. Li, Z., Slavkova, O., & Gao, Y. (2022). Role of digitalization, digital competence, and parental support on performance of sports education in low-income college students. Frontiers in Psychology, 13, 979318. [Google Scholar] [CrossRef]
  34. Liang, Q., de la Torre, J., & Law, N. (2021). Do background characteristics matter in children’s mastery of digital literacy? A cognitive diagnosis model analysis. Computers in Human Behavior, 122, 106850. [Google Scholar] [CrossRef]
  35. Lin, T. J., & Tsai, C. C. (2018). Differentiating the sources of taiwanese high school students’ multidimensional science learning self-efficacy: An examination of gender differences. Research in Science Education, 48, 575–596. [Google Scholar] [CrossRef]
  36. Liu, G. L., Darvin, R., & Ma, C. (2024). Exploring AI-mediated informal digital learning of English (AI-IDLE): A mixed-method investigation of Chinese EFL learners’ AI adoption and experiences. Computer Assisted Language Learning, 1–29. [Google Scholar] [CrossRef]
  37. Luo, Q., Chen, L., Yu, D., & Zhang, K. (2023). The mediating role of learning engagement between self-efficacy and academic achievement among Chinese college students. Psychology Research and Behavior Management, 16, 1533–1543. [Google Scholar] [CrossRef] [PubMed]
  38. Mahling, M., Wunderlich, R., Steiner, D., Gorgati, E., Festl-Wietek, T., & Herrmann-Werner, A. (2023). Virtual reality for emergency medicine training in medical school: Prospective, large-cohort implementation study. Journal of Medical Internet Research, 25, e43649. [Google Scholar] [CrossRef] [PubMed]
  39. Martzoukou, K., Kostagiolas, P., Lavranos, C., Lauterbach, T., & Fulton, C. (2021). A study of university law students’ self-perceived digital competences. Journal of Librarianship and Information Science, 54(4), 751–769. [Google Scholar] [CrossRef]
  40. Mehrvarz, M., Heidari, E., Farrokhnia, M., & Noroozi, O. (2021). The mediating role of digital informal learning in the relationship between students’ digital competence and their academic performance. Computers & Education, 167, 104184. [Google Scholar] [CrossRef]
  41. Mieg, H. A., Klieme, K. E., Barker, E., Bryan, J., Gibson, C., Haberstroh, S., Odebiyi, F., Rismondo, F. P., Römmer-Nossek, B., Thiem, J., & Unterpertinger, E. (2024). Short digital-competence test based on DigComp 2.1: Does digital competence support research competence in undergraduate students? Education and Information Technologies, 29(1), 139–160. [Google Scholar] [CrossRef]
  42. Mosqueira-Rey, E., Hernández-Pereira, E., Alonso-Ríos, D., Bobes-Bascarán, J., & Fernández-Leal, Á. (2023). Human-in-the-loop machine learning: A state of the art. Artificial Intelligence Review, 56(4), 3005–3054. [Google Scholar] [CrossRef]
  43. Moss-Racusin, C. A., Sanzari, C., Caluori, N., & Rabasco, H. (2018). Gender bias produces gender gaps in STEM engagement. Sex Roles, 79, 651–670. [Google Scholar] [CrossRef]
  44. Nguyen, T. Q., Ngoc, P. T. A., Phuong, H. A., Duy, D. P. T., Hiep, P. C., McClelland, R., & Noroozi, O. (2024). Digital competence of vietnamese citizens: An application of digcomp framework and the role of individual factors. Education and Information Technologies, 29, 19267–19298. [Google Scholar] [CrossRef]
  45. Niu, L., Wang, X., Wallace, M. P., Pang, H., & Xu, Y. (2022). Digital learning of English as a foreign language among university students: How are approaches to learning linked to digital competence and technostress? Journal of Computer Assisted Learning, 38(5), 1332–1346. [Google Scholar] [CrossRef]
  46. Oliveira, G., Grenha Teixeira, J., Torres, A., & Morais, C. (2021). An exploratory study on the emergency remote education experience of higher education students and teachers during the COVID-19 pandemic. British Journal of Educational Technology, 52(4), 1357–1376. [Google Scholar] [CrossRef] [PubMed]
  47. Organisation for Economic Co-operation and Development. (2021a). Supporting the digital transformation of higher education in Hungary. OECD Publishing. [Google Scholar] [CrossRef]
  48. Organisation for Economic Co-operation and Development. (2021b). The future of education and skills 2030: Learning compass 2030. Available online: https://www.oecd.org/education/2030-project/ (accessed on 2 January 2025).
  49. Örnek, F., Afari, E., & Alaam, S. A. (2024). Relationship between students’ ICT interactions and science achievement in PISA 2018: The case of Türkiye. Education and Information Technologies, 29, 13413–13435. [Google Scholar] [CrossRef]
  50. Pan, L., Haq, S. U., Shi, X., & Nadeem, M. (2024). The impact of digital competence and personal innovativeness on the learning behavior of students: Exploring the moderating role of digitalization in higher education quality. Sage Open, 14(3), 21582440241265919. [Google Scholar] [CrossRef]
  51. Panadero, E. (2017). A review of self-regulated learning: Six models and four directions for research. Frontiers in Psychology, 8, 422. [Google Scholar] [CrossRef] [PubMed]
  52. Pedaste, M., Kallas, K., & Baucal, A. (2023). Digital competence test for learning in schools: Development of items and scales. Computers & Education, 203, 104830. [Google Scholar] [CrossRef]
  53. Pinto, M. R., Salume, P. K., Barbosa, M. W., & de Sousa, P. R. (2023). The path to digital maturity: A cluster analysis of the retail industry in an emerging economy. Technology in Society, 72, 102191. [Google Scholar] [CrossRef]
  54. Pramila-Savukoski, S., Kärnä, R., Kuivila, H. M., Juntunen, J., Koskenranta, M., Oikarainen, A., & Mikkonen, K. (2023). The influence of digital learning on health sciences students’ competence development–A qualitative study. Nurse Education Today, 120, 105635. [Google Scholar] [CrossRef]
  55. Rof, A., Bikfalvi, A., & Marquès, P. (2022). Pandemic-accelerated digital transformation of a born digital higher education institution: Towards a customized multimode learning strategy. Educational Technology & Society, 25(1), 124–141. [Google Scholar]
  56. Scheel, L., Vladova, G., & Ullrich, A. (2022). The influence of digital competences, self-organization, and independent learning abilities on students’ acceptance of digital learning. International Journal of Educational Technology in Higher Education, 19(1), 44. [Google Scholar] [CrossRef]
  57. Shahzad, M. F., Xu, S., & Zahid, H. (2025). Exploring the impact of generative AI-based technologies on learning performance through self-efficacy, fairness & ethics, creativity, and trust in higher education. Education and Information Technologies, 30, 3691–3716. [Google Scholar] [CrossRef]
  58. Sobieraj, S., & Krämer, N. C. (2020). Similarities and differences between genders in the usage of computer with different levels of technological complexity. Computers in Human Behavior, 104, 106145. [Google Scholar] [CrossRef]
  59. Suad, A., Tapalova, O., Berestova, A., & Vlasova, S. (2024). Moodle’s communicative instruments: The impact on students academic performance. Innovations in Education and Teaching International, 1–15. [Google Scholar] [CrossRef]
  60. Sweller, J., Ayres, P., & Kalyuga, S. (2011). Cognitive load theory. Springer. [Google Scholar]
  61. Tan, X., Lin, X., & Zhuang, R. (2024). Development and validation of a secondary vocational school students’ digital learning competence scale. Smart Learning Environments, 11(1), 37. [Google Scholar] [CrossRef]
  62. Tao, Y., Meng, Y., Gao, Z., & Yang, X. (2022). Perceived teacher support, student engagement, and academic achievement: A meta-analysis. Educational Psychology, 42(4), 401–420. [Google Scholar] [CrossRef]
  63. Tran-Duong, Q. H. (2023). The effect of media literacy on effective learning outcomes in online learning. Education and Information Technologies, 28(3), 3605–3624. [Google Scholar] [CrossRef]
  64. Turnbull, D., Chugh, R., & Luck, J. (2021). Transitioning to E-Learning during the COVID-19 pandemic: How have Higher Education Institutions responded to the challenge? Education and Information Technologies, 26(5), 6401–6419. [Google Scholar] [CrossRef]
  65. Tzafilkou, K., Perifanou, M., & Economides, A. A. (2022). Development and validation of students’ digital competence scale (SDiCoS). International Journal of Educational Technology in Higher Education, 19(1), 30. [Google Scholar] [CrossRef]
  66. United Nations Educational, Scientific and Cultural Organization. (2021). Reimagining our futures together: A new social contract for education. Available online: https://unesdoc.unesco.org/ark:/48223/pf0000379707 (accessed on 6 January 2025).
  67. United Nations Educational, Scientific and Cultural Organization. (2022). Gateways to public digital learning: A multi-partner initiative to create and strengthen inclusive digital learning platforms and content. Available online: https://www.un.org/sites/un2.un.org/files/2022/09/gateways_to_public_digital_learning_long.pdf (accessed on 2 January 2025).
  68. Vuorikari, R., Kluzer, S., & Punie, Y. (2022). DigComp 2.2: The digital competence framework for citizens—With new examples of knowledge, skills and attitudes. Available online: https://vibe.cityoflearning.eu/storage/content/u3h4e3jb5t4lpvm44vs5vmrn3om3kgq1.pdf (accessed on 2 January 2025).
  69. Wang, S. L., Lin, J. J., & Su, P. C. (2024). The roles of self-efficacy and cognitive styles in reading behaviors and reasoning performance. Educational Technology & Society, 27(3), 165–184. [Google Scholar] [CrossRef]
  70. Wang, W. T., & Lin, Y. L. (2021). The relationships among students’ personal innovativeness, compatibility, and learning performance. Educational Technology & Society, 24(2), 14–27. [Google Scholar]
  71. Wang, X., Wang, Z., Wang, Q., Chen, W., & Pi, Z. (2021). Supporting digitally enhanced learning through measurement in higher education: Development and validation of a university students’ digital competence scale. Journal of Computer Assisted Learning, 37(4), 1063–1076. [Google Scholar] [CrossRef]
  72. Wang, X. W., Zhu, Y. J., & Zhang, Y. C. (2022). An empirical study of college students’ reading engagement on academic achievement. Frontiers in Psychology, 13, 1025754. [Google Scholar] [CrossRef]
  73. White, P. J., Ardoin, N. M., Eames, C., & Monroe, M. C. (2023). Agency in the Anthropocene: Supporting document to the PISA 2025 science framework (Vol. 297, pp. 1–43). OECD Education Working Papers. Available online: https://www.oecd.org/content/dam/oecd/en/publications/reports/2023/06/agency-in-the-anthropocene_b5843618/8d3b6cfa-en.pdf (accessed on 6 January 2025).
  74. Wu, T. T., Lee, H. Y., Li, P. H., Huang, C. N., & Huang, Y. M. (2023). Promoting self-regulation progress and knowledge construction in blended learning via ChatGPT-based learning aid. Journal of Educational Computing Research, 61(8), 1539–1567. [Google Scholar] [CrossRef]
  75. Yan, Q. (2022). Human-computer interactive English learning from the perspective of social cognition in the age of intelligence. Frontiers in Psychology, 13, 888543. [Google Scholar] [CrossRef]
  76. Yang, J., Tlili, A., Huang, R., Zhuang, R., & Bhagat, K. K. (2021). Development and validation of a digital learning competence scale: A comprehensive review. Sustainability, 13(10), 5593. [Google Scholar] [CrossRef]
  77. Younas, M., Noor, U., Zhou, X., Menhas, R., & Qingyu, X. (2022). COVID-19, students satisfaction about e-learning and academic achievement: Mediating analysis of online influencing factors. Frontiers in Psychology, 13, 948061. [Google Scholar] [CrossRef]
  78. Zhang, J., & Zeng, Y. (2024). Effect of college students’ smartphone addiction on academic achievement: The mediating role of academic anxiety and moderating role of sense of academic control. Psychology Research and Behavior Management, 17, 933–944. [Google Scholar] [CrossRef]
  79. Zhao, Y., Llorente, A. M. P., & Gómez, M. C. S. (2021). Digital competence in higher education research: A systematic literature review. Computers & Education, 168, 104212. [Google Scholar] [CrossRef]
  80. Zimmerman, B. J. (2000). Attaining self-regulation: A social cognitive perspective. In M. Boekaerts, P. R. Pintrich, & M. Zeidner (Eds.), Handbook of self-regulation (pp. 13–39). Academic Press. [Google Scholar] [CrossRef]
Table 1. Dimensions of digital learning competence.

| Competence | Component Elements | Description |
| --- | --- | --- |
| Digital Learning Competence | Digital Learning Awareness | An individual’s cognition, understanding, and value placed on digital learning, as well as their attitude toward digital learning tools and resources. |
| | Digital Learning Technical Competence | An individual’s skills and abilities in using digital technologies and tools. |
| | Digital Learning Engagement Behaviors | A series of behaviors and habits exhibited by individuals during the digital learning process, including collaboration, communication, and connection abilities. |
| | Digital Learning Self-Management | An individual’s ability to manage learning resources, learning tools, learning time, and learning emotions. |
| | Digital Learning Evaluation Competence | An individual’s ability to reasonably and accurately evaluate digital learning resources, tools, environments, processes, and outcomes. |
Table 2. Descriptive statistical analysis of undergraduate digital learning competence and academic achievement.

| Comparative Dimension | Secondary Dimension | N | Minimum | Maximum | Mean | Standard Deviation |
| --- | --- | --- | --- | --- | --- | --- |
| Digital Learning Competence | Digital Learning Awareness | 312 | 2.33 | 7.00 | 5.80 | 0.66 |
| | Digital Learning Technical Competence | 312 | 3.33 | 6.50 | 5.06 | 0.76 |
| | Digital Learning Engagement Behaviors | 312 | 3.67 | 7.00 | 5.85 | 0.66 |
| | Digital Learning Self-Management | 312 | 4.00 | 7.00 | 5.57 | 0.65 |
| | Digital Learning Evaluation Competence | 312 | 3.20 | 7.00 | 5.59 | 0.71 |
| | Overall | 312 | 4.25 | 6.67 | 5.54 | 0.47 |
| Academic Achievement | Course Achievement | 312 | 1 | 14 | 6.84 | 2.62 |
| | Scholarly Achievement | 312 | 3 | 14 | 5.49 | 2.39 |
| | Practical Achievement | 312 | 6 | 30 | 15.78 | 4.44 |
| | Overall | 312 | 3.7 | 14.3 | 8.22 | 2.06 |
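As a minimal illustration (not the authors’ analysis code), descriptive statistics of the kind reported in Table 2 could be reproduced with pandas, assuming a data frame whose columns hold each respondent’s dimension scores on the 1–7 scale; the column names below are hypothetical.

```python
# Minimal sketch: Table 2-style descriptive statistics (N, minimum, maximum,
# mean, sample standard deviation) per dimension. Column names are assumed,
# not taken from the authors' dataset.
import pandas as pd

dimension_cols = [
    "awareness", "technical", "engagement",
    "self_management", "evaluation", "overall",
]

def describe_dimensions(df: pd.DataFrame) -> pd.DataFrame:
    """Return N, Minimum, Maximum, Mean, and Standard Deviation per dimension."""
    stats = df[dimension_cols].agg(["count", "min", "max", "mean", "std"]).T
    stats.columns = ["N", "Minimum", "Maximum", "Mean", "Standard Deviation"]
    return stats.round(2)

# Example usage with toy data:
# df = pd.DataFrame({c: [5.0, 6.2, 4.8] for c in dimension_cols})
# print(describe_dimensions(df))
```

Pandas computes the sample standard deviation (ddof = 1) by default, which matches the convention used by common statistical packages for tables of this kind.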
Table 3. Differential analysis results of undergraduate academic achievement dimensions.

| Comparative Dimension | Group | Course Achievement | Scholarly Achievement | Practical Achievement | Overall Academic Achievement |
| --- | --- | --- | --- | --- | --- |
| Gender | Male | 7.08 ± 2.75 | 6.03 ± 2.46 | 16.44 ± 4.77 | 8.64 ± 2.01 |
| | Female | 6.66 ± 2.51 | 5.08 ± 2.26 | 15.28 ± 4.12 | 7.91 ± 2.05 |
| | F | 1.389 | 3.522 | 2.297 | 3.108 |
| | P | 0.166 | 0.000 | 0.022 | 0.002 |
| Grade Level | First year | 4.54 ± 2.27 | 3.62 ± 1.20 | 15.15 ± 4.85 | 6.38 ± 1.73 |
| | Second year | 7.40 ± 2.44 | 5.87 ± 2.56 | 15.69 ± 4.13 | 8.60 ± 2.37 |
| | Third year | 7.65 ± 2.56 | 6.35 ± 2.64 | 16.35 ± 4.90 | 9.00 ± 2.08 |
| | Fourth year | 6.80 ± 2.56 | 5.30 ± 2.13 | 15.63 ± 4.36 | 8.12 ± 1.82 |
| | Fifth year | 7.00 ± 2.62 | 7.90 ± 3.28 | 17.80 ± 3.71 | 9.43 ± 2.52 |
| | F | 7.218 | 9.644 | 0.905 | 9.073 |
| | P | 0.000 | 0.000 | 0.461 | 0.000 |
| Discipline | Philosophy | 7.20 ± 2.59 | 8.00 ± 2.00 | 20.40 ± 3.78 | 10.08 ± 2.42 |
| | Economics | 6.94 ± 2.21 | 5.78 ± 2.69 | 16.39 ± 4.70 | 8.48 ± 2.43 |
| | Law | 7.00 ± 2.70 | 5.75 ± 2.60 | 15.08 ± 3.87 | 8.24 ± 2.31 |
| | Education | 6.60 ± 3.01 | 5.56 ± 2.20 | 17.31 ± 5.13 | 8.43 ± 1.80 |
| | Literature | 6.61 ± 2.38 | 4.89 ± 2.23 | 15.14 ± 4.70 | 7.80 ± 1.91 |
| | History | 8.20 ± 1.64 | 5.20 ± 1.30 | 15.20 ± 3.11 | 8.70 ± 0.84 |
| | Science | 7.00 ± 3.08 | 5.85 ± 2.46 | 15.28 ± 3.98 | 8.31 ± 2.46 |
| | Engineering | 7.57 ± 2.77 | 5.13 ± 2.31 | 15.63 ± 4.78 | 8.45 ± 2.12 |
| | Agriculture | 9.67 ± 0.58 | 6.00 ± 0.00 | 15.33 ± 2.89 | 9.70 ± 0.87 |
| | Medicine | 6.00 ± 2.18 | 6.29 ± 3.43 | 17.29 ± 3.36 | 8.34 ± 2.37 |
| | Military Science | 10.00 ± 0.00 | 7.50 ± 2.12 | 13.50 ± 0.71 | 9.95 ± 0.49 |
| | Management | 6.36 ± 2.17 | 5.33 ± 2.31 | 14.89 ± 3.84 | 7.75 ± 1.81 |
| | Arts | 6.00 ± 0.00 | 3.00 ± 0.00 | 15.00 ± 6.25 | 6.90 ± 1.25 |
| | F | 1.393 | 1.447 | 1.602 | 1.272 |
| | P | 0.168 | 0.144 | 0.090 | 0.234 |
| University Tier | Project 985 | 7.65 ± 3.10 | 5.51 ± 2.20 | 16.15 ± 5.36 | 8.71 ± 1.99 |
| | Project 211 | 8.52 ± 2.60 | 6.57 ± 2.88 | 17.59 ± 4.27 | 9.75 ± 2.26 |
| | “Double First-Class” (excluding 985 & 211) | 6.91 ± 2.12 | 6.32 ± 2.67 | 16.02 ± 3.85 | 8.55 ± 1.86 |
| | Other universities | 5.83 ± 1.98 | 4.89 ± 2.04 | 14.94 ± 3.89 | 7.37 ± 1.69 |
| | F | 18.355 | 8.321 | 4.543 | 21.689 |
| | P | 0.000 | 0.000 | 0.004 | 0.000 |
Table 4. Correlation analysis results between digital learning competence and academic achievement of undergraduates.

| Variable | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1. Digital Learning Awareness | 1 | | | | | | | | | |
| 2. Digital Learning Technical Competence | 0.357 ** | 1 | | | | | | | | |
| 3. Digital Learning Engagement Behaviors | 0.288 ** | 0.127 * | 1 | | | | | | | |
| 4. Digital Learning Self-Management | 0.278 ** | 0.306 ** | 0.118 * | 1 | | | | | | |
| 5. Digital Learning Evaluation Competence | 0.313 ** | 0.429 ** | 0.218 ** | 0.377 ** | 1 | | | | | |
| 6. Digital Learning Competence | 0.711 ** | 0.760 ** | 0.425 ** | 0.593 ** | 0.726 ** | 1 | | | | |
| 7. Course Achievement | 0.041 | 0.058 | −0.053 | 0.097 | 0.138 * | 0.095 | 1 | | | |
| 8. Scholarly Achievement | 0.01 | 0.152 ** | −0.063 | 0.157 ** | 0.166 ** | 0.143 * | 0.310 ** | 1 | | |
| 9. Practical Achievement | 0.043 | 0.218 ** | 0.081 | 0.068 | 0.199 ** | 0.196 ** | 0.086 | 0.354 ** | 1 | |
| 10. Overall Academic Achievement | 0.048 | 0.184 ** | −0.021 | 0.146 ** | 0.231 ** | 0.194 ** | 0.780 ** | 0.697 ** | 0.608 ** | 1 |

* p < 0.05; ** p < 0.01.
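For illustration only (again, not the authors’ code), a starred correlation matrix of this kind could be computed as follows, assuming a data frame `df` with one column per variable listed in Table 4; the function name and column layout are assumptions.

```python
# Minimal sketch: pairwise Pearson correlations with the significance stars
# used in Table 4 (* p < 0.05, ** p < 0.01). The lower triangle is filled,
# the upper triangle is left blank, mirroring the table layout.
import pandas as pd
from scipy import stats

def starred_correlations(df: pd.DataFrame) -> pd.DataFrame:
    cols = list(df.columns)
    out = pd.DataFrame("", index=cols, columns=cols)  # empty string placeholders
    for i, a in enumerate(cols):
        out.loc[a, a] = "1"          # diagonal
        for b in cols[:i]:           # lower triangle only
            r, p = stats.pearsonr(df[a], df[b])
            star = "**" if p < 0.01 else "*" if p < 0.05 else ""
            out.loc[a, b] = f"{r:.3f} {star}".rstrip()
    return out
```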
Table 5. Regression analysis results of digital learning competence and academic achievement of undergraduates.

| Model | B | Std. Error | t | Sig. | Tolerance | VIF |
| --- | --- | --- | --- | --- | --- | --- |
| (Constant) | 4.673 | 0.840 | 5.562 | 0.000 | | |
| Independent variable: Digital Learning Evaluation Competence | 0.480 | 0.149 | 3.215 | 0.001 | 0.969 | 1.032 |
| University Tier: Double First-Class Universities | 1.458 | 0.214 | 6.828 | 0.000 | 0.965 | 1.036 |
| University Tier: Other Universities (reference) | 0 | | | | | |
| Grade Level: Lower Grades | −0.638 | 0.252 | −2.533 | 0.012 | 0.981 | 1.019 |
| Grade Level: Higher Grades (reference) | 0 | | | | | |
| Gender: Male | 0.510 | 0.214 | 2.385 | 0.018 | 0.976 | 1.024 |
| Gender: Female (reference) | 0 | | | | | |
| R² | 0.208 | | | | | |
| Adjusted R² | 0.198 | | | | | |
| F | 17.462 | | | | | |
| P | 0.000 | | | | | |

Note: B and Std. Error are unstandardized coefficients; Tolerance and VIF are collinearity statistics. University tier, grade level, and gender are dummy-coded control variables. Dependent variable: Academic Achievement.
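Reading the unstandardized coefficients in Table 5 as a single prediction equation, with Other Universities, Higher Grades, and Female as the zero-coded reference categories, the fitted model can be written approximately as:

```latex
\[
\widehat{\text{Academic Achievement}} = 4.673
  + 0.480\,\text{Evaluation Competence}
  + 1.458\,\text{Double First-Class}
  - 0.638\,\text{Lower Grade}
  + 0.510\,\text{Male}
\]
```

Holding the dummy-coded controls fixed, each one-unit increase in digital learning evaluation competence therefore corresponds to a 0.480-unit increase in predicted academic achievement.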
