Abstract
While blended learning facilitates digital literacy development, the specific design models and student factors contributing to this process remain underexplored. This study examined the relationship between various blended learning design models and digital literacy skill acquisition among 106 upper-secondary Vocational Education and Training (VET) students. Relationships among student activities, digital competencies, and prior blended learning experience were analyzed. Engagement in collaborative, task-based instructional designs—specifically collaborative projects and regular quizzing supported by digital tools—was positively associated with digital competence. Conversely, passive participation in live sessions or viewing pre-recorded videos exhibited a comparatively weaker association with competence development. While the use of virtual/augmented reality and interactive video correlated positively with digital tool usage, it did not significantly predict perceptions of online safety or content creation skills. Students with prior blended learning experience reported higher proficiency in developmental competencies, such as content creation and research, compared to their inexperienced peers. Cluster analysis identified three distinct student profiles based on technical specialization and blended learning experience. Overall, these findings suggest that blended learning implementation should prioritize structured collaboration and formative assessment.
1. Introduction
As Vocational Education and Training (VET) institutions increasingly adopt blended learning to enhance digital skills, there is a critical need to identify which specific activities most effectively foster students’ digital competence. Researchers have defined blended learning (BL) in various ways, but the most common definition focuses on mixing online and offline education. Many describe it as a combination that brings together the best of traditional face-to-face classes—like live instruction and classroom interaction—with online learning, which uses technology to connect students, teachers, and digital resources. Early definitions from scholars like Garrison and Kanuka [1] and Graham [2] described this model as a blend of teacher-led classroom sessions and internet-based content that students can use at their own pace. However, the post-pandemic landscape has changed blended learning, particularly within VET. Song and Lai [3] observe that the pandemic necessitated an increased reliance on technology. Technological integration was expanded within blended models to foster greater flexibility and accessibility. This evolution combined face-to-face instruction with robust online components to enhance skill acquisition and align with industry standards. The pandemic underscored the necessity for flexibility, compelling institutions to adopt blended approaches that support digital literacy, personalized learning, and infrastructure development. These shifts were further driven by student and industry demand for practical, job-ready skills. This transition reflects not merely a technological adaptation, but a pedagogical paradigm shift designed to meet learners’ evolving needs in hybrid environments.
1.1. Key Digital Competences Enhanced by Blended Learning
The efficacy of blended learning in developing students’ digital competencies depends on several key determinants identified in prior research. Paramount among these is the quality of technology integration, which predicts student competence and engagement more accurately than mere frequency of use [4]. The Technology Acceptance Model (TAM) provides valuable insights here, demonstrating that perceived usefulness and perceived ease of use of digital tools strongly correlate with their actual adoption and effective deployment in learning contexts [5].
Key competences include information and data literacy, communication and collaboration, digital content creation, and problem-solving, all of which are crucial for improving student learning performance in higher education [6]. The integration of technologies such as augmented reality (AR) further develops students’ digital skills by enabling them to become prosumers of virtual content, a role traditionally reserved for IT developers [7]. Additionally, Artificial Intelligence (AI) technologies in blended settings have been shown to enhance learning outcomes, provided that instructors possess the requisite digital competencies to effectively incorporate AI into their pedagogy [8]. Analysis of award-winning VET courses reveals specific strategies that effectively foster digital competence [9]. These courses employ more open-ended questioning techniques and create greater opportunities for student expression compared to routine courses. Effective VET instructors consistently relate content to real-world work contexts and provide feedback that combines cognitive guidance with emotional encouragement [9].
Post-pandemic higher education research (e.g., [10,11]) consistently shows that technological access and sophisticated platforms, while necessary, are insufficient for digital competence development. Transformative pedagogical design and digital pedagogy emerge as primary drivers instead.
Competence development is supported by frameworks such as DigComp 2.2, which guide institutions in designing activities that foster digital proficiency [6]. Moreover, the widespread adoption of technology in educational settings has been linked to increased student engagement, communication, and access to learning resources, thereby promoting academic success across diverse student populations [12]. The comprehensive digital competence scale developed by Tzafilkou et al. [13] underscores the importance of skills in online learning, collaboration, and data protection, which are essential for navigating the digital landscape of modern education [13].
1.2. Prior Blended Learning Experience and Digital Competence Development
Prior experience with blended learning in secondary education significantly influences the acquisition of digital competencies by providing a structured environment that integrates online and offline modalities. This integration fosters competence by encouraging meaningful engagement with technology, thereby enhancing digital literacy and problem-solving capabilities. Crucially, the quality of technology integration—rather than mere frequency of use—drives competence development. This is evidenced by the Technology Integration Quality Scale (TIQS), which highlights the necessity of learning support, classroom management, and cognitive activation in technology use [4].
While blended learning environments enriched by pandemic-era distance instruction show potential for enhancing skills, the extent of this enrichment varies based on the efficacy of the instructor’s digital pedagogy [14]. Furthermore, the concept of ‘digital instinct’—students’ intuitive and familiar use of digital technologies—emerges as a critical factor in developing digital literacy, suggesting that students’ prior experiences with technology can serve as a foundation for more reflective and advanced digital practices [15]. In vocational settings, the acceptance and effectiveness of blended learning are further modulated by social influence, facilitating conditions, and self-efficacy, all of which collectively enhance a student’s willingness to engage [16]. Additionally, frameworks for secondary vocational students that include cognitive processing and activity management indicate that structured blended experiences significantly contribute to competence development [17].
Demographic variables such as age, study mode, and nationality also affect the perceived utility of remote learning, although employment status appears to have minimal impact [5]. These factors help explain variations in competence development across diverse populations and highlight the need for differentiated instructional approaches in VET environments.
1.3. Linking Intrinsic Motivation to Digital Competence Growth
Intrinsic motivation is a critical determinant of digital competence, influencing both engagement and learning processes in virtual environments. Self-Determination Theory posits that intrinsic motivation stems from the satisfaction of basic psychological needs—competence, autonomy, and relatedness—which are essential for effective engagement with digital tools [18,19]. In the context of digital learning, intrinsic motivation enhances self-regulated learning by fostering students’ acceptance of technology, which in turn improves their digital competence [20]. This is supported by findings that intrinsic motivation mediates the relationship between technology acceptance and self-regulated learning, suggesting that motivated students are more likely to engage deeply with digital content and develop their digital skills [20].
Concurrently, recent scholarship on online adult education emphasizes that the quality of pedagogical relationships—characterized by respect, empathy, and dialogic interaction—is vital for learner well-being and cannot be assumed to transfer automatically from face-to-face settings [21]. These studies suggest that digital environments risk weakening relationships and amplifying exclusion when treated as a purely technical substitute, but can instead enhance participation and learner autonomy when designed according to constructivist, person-centred principles that align cognitive demands with relational support [21].
Building on this broader evidence, the dynamics between motivation and digital competence can be further illuminated by findings from a recent comprehensive study in wood science and technology education [22]. Examining academic motivation and digital competencies, the study revealed that academic motivation explained between 22% and 29% of the variance in students’ self-perceived digital sustainability competencies [22]. Importantly, the study found that intrinsic and extrinsic motivation did not function as distinct dimensions but formed a unidimensional construct, suggesting that both internal interests and external incentives jointly support digital competence development. This challenges traditional conceptualizations that position intrinsic and extrinsic motivation as separate constructs and suggests a more integrated approach to fostering motivation in VET contexts.
Intrinsic motivation is also linked to sustained contributions to online knowledge-sharing platforms, indicating that individuals who are intrinsically motivated are more likely to continuously engage with digital tools and platforms, thereby enhancing their digital competence over time [23]. Educational interventions, such as gamification and business simulation games, have been shown to boost intrinsic motivation, which in turn supports the development of digital competence by making learning more engaging and meaningful [24]. These interventions leverage elements of challenge, autonomy, and social interaction to enhance motivation, which is crucial for the effective acquisition of digital skills [25]. Fostering intrinsic motivation via targeted strategies and supportive digital environments increases digital competence by endorsing engagement, continuous learning, and effective technology use [26].
1.4. Research Focus and Objectives
This study investigates three primary domains. First, it examines the correlations between student participation in specific blended learning activities and targeted digital competencies. Whereas prior research has focused on general trends, emerging findings indicate that varied pedagogical approaches in blended environments foster distinct competency dimensions. Second, the study explores whether prior blended learning experience moderates self-reported competence growth, specifically investigating if prior exposure heightens perceptions of skill acquisition. Third, it analyzes the link between intrinsic motivation and perceived competence growth, building on evidence that motivation shapes digital literacy.
The following research questions guide this investigation:
1. How do specific blended learning activities correlate with the development of specific digital competences?
2. To what extent does self-reported digital competence development differ between students with and without prior blended learning experience?
3. Is there a positive association between students’ intrinsic motivation and their perceived development of digital competencies?
4. Can distinct student digital competence clusters be identified, and how do they associate with demographics and teaching approaches?
2. Materials and Methods
2.1. Participants
We employed a convenience sampling strategy, wherein instructors distributed online survey links to students in upper-secondary technical programs. Participation was voluntary, yielding a final sample of 106 students. The cohort comprised 98 males (92.5%) and 7 females (6.6%), with one student declining to disclose gender. The average age of participants was 17.1 years. Given the demographics typical of VET programs such as Computer Technician, Auto Technician, Electrical Engineering Technician, and Mechanical Engineering Technician, the male majority was anticipated.
Participants were enrolled in either 4-year or 5-year curriculum structures. At the time of data collection, the distribution across academic years was as follows: 56 students in the third year, 36 in the second year, 11 in the first year, and 3 in the fourth year.
2.2. Measures
The survey consisted of two primary sections: (1) demographic information (e.g., gender, age, program, year of study) and (2) blended learning experiences, including teaching approaches, competence development, and motivation.
2.2.1. Teaching Approaches
We utilized a 10-item scale to measure the frequency of instructional methods. Participants rated how often specific methods were employed in their classes on a 5-point scale ranging from 0 (“Never”) to 4 (“Almost Always”).
As indicated in Table 1, live online sessions and pre-recorded videos were the most frequently reported approaches. The scale demonstrated acceptable internal consistency (Cronbach’s α = 0.789).
Table 1. Descriptive statistics for the scale of teaching approaches.
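For transparency, the internal consistency reported above can be reproduced directly from the item-level response matrix. The following minimal Python sketch uses randomly generated placeholder ratings (the study itself was analyzed in SPSS, and the real responses are not reproduced here) to show the standard Cronbach’s alpha computation for a 10-item, 0–4 frequency scale.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) response matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                              # number of items
    item_variances = items.var(axis=0, ddof=1)      # per-item variance
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of scale totals
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical ratings for 106 students on a 10-item, 0-4 frequency scale;
# the real, correlated responses are what produced the reported alpha of 0.789.
rng = np.random.default_rng(0)
demo_ratings = rng.integers(0, 5, size=(106, 10))
print(f"Cronbach's alpha = {cronbach_alpha(demo_ratings):.3f}")
```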
2.2.2. Digital Competence Areas
Digital competence was assessed using a checklist adapted from the Students’ Digital Competence Scale (SDiCoS) [13], which evaluates students’ digital skills across multiple domains. To align with the DigComp threshold-based philosophy, items were presented in a dichotomous format (Yes/No), generating a composite index (range 0–9) representing the breadth of self-reported attainment [27]. While this approach reduces granularity regarding proficiency levels, it effectively captures cohort-level acquisition patterns. Participants were permitted to select multiple categories to indicate the skills they had developed. Responses were coded as 0 (Not Selected) or 1 (Selected).
Table 2 shows that students most frequently reported developing practical digital skills in association with their blended learning experiences, especially in using digital tools and software (71.3%), online communication (67.8%), and collaboration tools (65.5%). Fewer reported progress in creating content (41.4%) or understanding digital ethics (40.2%), suggesting stronger gains in operational than in critical or ethical competences.
Table 2. Digital Competence Areas Associated with Blended Learning Participation.
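To make the scoring of the checklist concrete, the sketch below shows how the dichotomous responses translate into the 0–9 breadth index described above. The data and column names are illustrative stand-ins for the nine competence areas, not the instrument’s exact item labels.

```python
import pandas as pd

# Illustrative checklist responses: 1 = area selected, 0 = not selected.
areas = ["digital_tools", "online_communication", "collaboration_tools",
         "online_safety", "content_creation", "tech_for_learning",
         "troubleshooting", "information_search", "digital_ethics"]

responses = pd.DataFrame(
    [[1, 1, 1, 0, 0, 1, 0, 1, 0],
     [1, 0, 1, 1, 0, 0, 1, 1, 1]],
    columns=areas)

# Breadth index: number of areas a student reports having developed (0-9).
responses["competence_index"] = responses[areas].sum(axis=1)
print(responses["competence_index"].tolist())   # prints [5, 6]
```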
2.2.3. Prior Blended-Learning Experience
Prior experience was assessed via a single dichotomous question: “Have you ever been involved in blended learning before?”. Responses were coded as 0 (No) and 1 (Yes). Approximately 45.3% of the sample reported prior blended learning experience, while 54.7% indicated no previous exposure (Table 3).
Table 3. Prior Experience with Blended Learning.
According to Table 3, just over half of the students (54.7%) reported having no prior experience with blended learning, whereas 45.3% indicated previous participation in blended learning environments.
2.2.4. Student Motivation
Student motivation was assessed using four of the seven subscales of the Intrinsic Motivation Inventory (IMI), an instrument validated in numerous prior studies, with slight adjustments to item wording. The “Interest/Enjoyment” (IMI-I) subscale contained four self-report items (e.g., “I would call BL very interesting”) and measured overall interest in the material. The “Effort/Importance” (IMI-E) subscale comprised four items (e.g., “I put a lot of time into my BL course”) and measured how much effort and importance students attached to the activity. The “Perceived Competence” (IMI-C) subscale consisted of four self-report items (e.g., “After doing BL for a little while, I felt pretty confident”) and captured students’ sense of competence, a correlate of intrinsic motivation. The “Value/Usefulness” (IMI-V) subscale contained four items (e.g., “I believe the things I did in BL could have some worth”) and measured the degree to which the activity is seen as valuable or relevant. Each item was rated on a five-point Likert-type scale from 1 (“strongly disagree”) to 5 (“strongly agree”).
Table 4 summarizes students’ ratings on each IMI subscale. All four motivation subscales demonstrated good to excellent internal consistency, with Cronbach’s α values ranging from 0.80 to 0.88. The ratings indicated that the blended learning activities were motivating and enhanced students’ confidence in performing the tasks. The mean for Effort/Importance was slightly lower than those of the other three subscales but still above the scale midpoint, indicating that students generally experienced the courses positively.
Table 4. Descriptive statistics for the student motivation scales (N = 94).
2.2.5. Instrument Validation
All instruments were validated using established procedures before data collection. The digital competence checklist was based on the Students’ Digital Competence Scale (SDiCoS) framework [13], with content validation by three educational technology experts. The intrinsic motivation scale applied Ryan and Deci’s [28] IMI subscales, which showed good reliability (α > 0.78) and convergent validity with self-efficacy (r = 0.52, p < 0.001). The teaching approaches items were developed from the blended learning literature and pilot-tested with five instructors for relevance and clarity. All scales showed acceptable internal consistency in our sample (α = 0.79–0.88).
2.3. Data Analysis
To address RQ1, we calculated Pearson correlation coefficients to assess the relationship between the frequency of online learning activities (5-point scale, 0–4) and the development of specific digital competencies (coded 0/1); with a dichotomous variable, Pearson’s r is equivalent to the point-biserial correlation. Standard procedures for bivariate correlations and significance testing were performed using SPSS version 21.
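For readers who wish to reproduce this type of analysis outside SPSS, the minimal Python/SciPy sketch below correlates a simulated 0–4 activity-frequency variable with a simulated 0/1 competence indicator; the variable names and data are placeholders, not the study’s dataset.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 106
# Hypothetical data: frequency of group projects (0-4) and a 0/1 flag for
# reporting the "collaboration tools" competence.
group_projects = rng.integers(0, 5, size=n)
collaboration_tools = (group_projects + rng.normal(0, 1.5, size=n) > 2).astype(int)

# With a dichotomous variable, Pearson's r equals the point-biserial correlation.
r, p = stats.pearsonr(group_projects, collaboration_tools)
print(f"r = {r:.2f}, p = {p:.4f}")
```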
For RQ2, Chi-square tests of independence were conducted to investigate differences in digital competence development between students with and without prior blended learning experience. Each of the nine competencies was analyzed as a separate binary outcome. Fisher’s Exact Test was utilized when expected cell frequencies fell below five. Effect sizes were computed using Cramér’s V.
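A compact sketch of the RQ2 procedure is given below, again with invented counts rather than the observed contingency tables: it runs the chi-square test, falls back to Fisher’s exact test when expected frequencies are small, and computes Cramér’s V as the effect size.

```python
import numpy as np
from scipy import stats

# Hypothetical 2x2 table: rows = prior BL experience (yes/no),
# columns = competence area reported (yes/no). Counts are illustrative only.
observed = np.array([[28, 20],
                     [25, 33]])

chi2, p, dof, expected = stats.chi2_contingency(observed, correction=False)
if (expected < 5).any():
    # Small expected counts: use Fisher's exact test for the p-value instead.
    _, p = stats.fisher_exact(observed)

n = observed.sum()
cramers_v = np.sqrt(chi2 / (n * (min(observed.shape) - 1)))
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}, Cramér's V = {cramers_v:.2f}")
```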
To address RQ3, we employed descriptive and inferential statistics to evaluate the relationship between intrinsic motivation and digital competence. Subscale scores were calculated by averaging corresponding items (after reverse-coding negative items). A Total Digital Competence Index was created by summing the selected competencies. We assessed bivariate relationships using Pearson correlation and modeled the index as a function of the IMI subscales.
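The RQ3 scoring and modeling steps can be sketched as follows; the reverse coding, subscale averaging, and regression of the index on the four IMI subscales mirror the description above, but the data are simulated and the use of statsmodels (rather than SPSS) is purely illustrative.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 94
# Hypothetical mean IMI subscale scores (1-5); in practice, reverse-coded items
# are first recoded as 6 - raw before averaging items into subscale scores.
imi_subscales = np.column_stack([rng.uniform(1, 5, n) for _ in range(4)])

# Hypothetical 0-9 Total Digital Competence Index.
competence_index = np.clip(
    np.round(imi_subscales.mean(axis=1) + rng.normal(0, 1.5, n)), 0, 9)

X = sm.add_constant(imi_subscales)         # intercept + four IMI subscales
model = sm.OLS(competence_index, X).fit()  # index modeled on the subscales
print(model.params, model.rsquared)
```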
Finally, for RQ4, we conducted a hierarchical cluster analysis to classify students into distinct groups based on their digital skill profiles. We utilized the average linkage (between-groups) method and squared Euclidean distances, a suitable approach for binary data. The optimal number of clusters was determined by inspecting the dendrogram and the agglomeration schedule for marked increases in the fusion coefficients.
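A minimal sketch of this clustering step is shown below; it applies average linkage to squared Euclidean distances on simulated binary profiles and cuts the tree into three clusters, mirroring the reported solution without using the study’s data.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(3)
# Hypothetical binary competence profiles for 87 students across 9 areas.
profiles = rng.integers(0, 2, size=(87, 9))

# Squared Euclidean distances with average (between-groups) linkage.
distances = pdist(profiles, metric="sqeuclidean")
tree = linkage(distances, method="average")

# The dendrogram / agglomeration schedule (tree[:, 2]) guides the cut;
# here the tree is cut into three clusters to mirror the reported solution.
labels = fcluster(tree, t=3, criterion="maxclust")
print(np.bincount(labels)[1:])   # cluster sizes
```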
Missing data must also be addressed. Of the 106 initial participants, 87 (82%) provided complete data for the cluster analysis; the remaining analyses used available cases (N = 90–96). Missingness (11–19 cases) was primarily due to skipped items (e.g., the VR/AR activities) rather than abandoned surveys. Crucially, participants with complete and incomplete data did not differ significantly on the main study variables. This absence of systematic differences supports the assumption that data were missing completely at random (MCAR), minimizing the risk of bias [29].
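The complete-versus-incomplete comparison referred to above amounts to simple group tests on the main variables; the sketch below illustrates the idea with simulated scores and a Welch t-test (the actual variables and results are those reported in the text, not reproduced here).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Hypothetical main study variable (e.g., a motivation score) for 106 students,
# with roughly 18% of cases flagged as having skipped some items.
scores = rng.normal(3.5, 0.8, size=106)
is_complete = rng.random(106) > 0.18

t, p = stats.ttest_ind(scores[is_complete], scores[~is_complete], equal_var=False)
# A non-significant difference between complete and incomplete cases is
# consistent with (though does not prove) the MCAR assumption.
print(f"t = {t:.2f}, p = {p:.3f}")
```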
3. Results
This section details the findings relative to the four research questions: (1) correlations between learning activities and competence; (2) comparisons based on prior blended learning experience; (3) the relationship between intrinsic motivation and competence; and (4) the identification of distinct digital competence profiles.
3.1. Correlations Between Blended Learning Activities and Digital Competence Development (RQ1)
Table 5 displays the Pearson correlations between frequency of online learning activities and students’ self-reported digital competence indicators. The teaching-approach variables were measured on 5-point frequency scales, which are inherently non-normal. As expected, Shapiro–Wilk tests confirmed deviations from normality (all W < 0.80, p < 0.001). Nonetheless, Pearson correlations were retained because: (1) the 5-point scale meets recommendations for parametric use when distributions are not severely skewed [30]; (2) scatterplots showed approximately linear relationships; and (3) Spearman’s ρ values closely mirrored Pearson’s r (e.g., r = 0.36, ρ = 0.34 for Frequent quizzes × Using Digital Tools), supporting result stability. Thus, Pearson coefficients are reported for interpretability and consistency with prior literature.
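The robustness check described above is straightforward to reproduce; the sketch below runs a Shapiro–Wilk test and compares Pearson and Spearman coefficients on simulated placeholder data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Hypothetical quiz-frequency ratings (0-4) and a 0/1 "using digital tools" flag.
quizzes = rng.integers(0, 5, size=106)
digital_tools = (quizzes + rng.normal(0, 2, size=106) > 2).astype(int)

w, p_norm = stats.shapiro(quizzes)                # normality check
r, _ = stats.pearsonr(quizzes, digital_tools)     # parametric estimate
rho, _ = stats.spearmanr(quizzes, digital_tools)  # rank-based robustness check
print(f"W = {w:.2f} (p = {p_norm:.3g}); r = {r:.2f}, rho = {rho:.2f}")
```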
Table 5. Pearson Correlations Between Online Learning Activities and Digital Competence Indicators.
The findings in Table 5 suggest that interactive and collaborative learning formats show the strongest positive associations with digital competence development. Specifically, group projects correlated strongly with several competencies, including collaboration tools (r = 0.59, p < 0.001), online communication (r = 0.61, p < 0.001), and using digital tools (r = 0.31, p < 0.001). Likewise, frequent quizzes were positively correlated with using digital tools (r = 0.36, p < 0.001), collaboration tools (r = 0.28, p < 0.01), and technology for learning (r = 0.25, p < 0.01), indicating that structured engagement and feedback enhance students’ ability to apply their digital skills. The remaining two formats (VR/AR and interactive video) showed mixed associations with digital competence. While both were positively correlated with using digital tools (r = 0.31, p < 0.001; r = 0.28, p < 0.01, respectively), their correlations with online safety (r = −0.18, p < 0.05; r = −0.12, ns) and content creation (r = −0.21, p < 0.01; r = −0.20, p < 0.05) were small and negative, indicating that novelty alone is not sufficient to produce competence. Traditional or passive learning formats (live sessions and pre-recorded videos) displayed low or non-significant associations with all three areas of digital competence. Overall, the results indicate that active and task-based learning formats, particularly those offering opportunities for collaboration (e.g., group projects) and assessment (e.g., quizzes), show the strongest associations with students’ overall digital competence across a variety of areas.
3.2. Differences in Competence Development by Prior Blended Learning Experience (RQ2)
Table 6 presents the percentage of students reporting each digital competence area, comparing those with and without prior blended learning experience, in order to assess whether prior exposure to blended learning is associated with students’ self-assessed confidence across the different digital skills.
Table 6. Percentage of Students Reporting Digital Competence by Prior Blended Learning Experience.
The results show consistent and often significant differences between the two groups. Students without prior blended learning experience reported higher confidence in several basic and tool-related competences, such as using digital tools and software (82.7% vs. 54.3%, χ2 = 6.91, p < 0.01), understanding online safety and privacy, using online collaboration tools, and troubleshooting computer problems. Conversely, students with blended learning experience scored higher on more advanced, creative, or learning-oriented competences—particularly creating and publishing digital content (57.1% vs. 30.8%) and using technology for learning and research (70.3% vs. 45.3%).
Although several of the χ2 values indicate statistically significant differences, the overall pattern suggests that previous blended learning may encourage deeper, learner-centered uses of technology, while those without such experience appear more confident in routine technical operations.
3.3. Relationship Between Intrinsic Motivation and Perceived Digital Competence Growth (RQ3)
To examine the relationship between students’ overall digital competence and their intrinsic motivation components, Pearson correlation analyses were conducted between the total Digital Competence Index and each intrinsic motivation subscale (Table 7). Shapiro–Wilk tests indicated slight deviations from normality for all motivation subscales (W = 0.948–0.973, p = 0.001–0.045), consistent with the bounded nature of Likert-scale data. However, Pearson correlations remained appropriate because skewness values were within acceptable limits (<1.1 for all subscales) and Spearman’s rank correlations showed near-identical patterns (mean |r − ρ| = 0.03), supporting the robustness of the results [30].
Table 7. Pearson Correlations Between Total Digital Competence Index and Intrinsic Motivation Subscales.
Results in Table 7 indicate strong positive associations between students’ digital competence and several motivational components. Value/Usefulness had the highest correlation with overall competence (r = 0.76, p < 0.001), indicating that students who see digital skills as useful or valuable tend to report higher levels of competence. Perceived Competence showed a moderate-to-strong positive association with the competence index (r = 0.59, p < 0.001), pointing to a relationship between students’ self-efficacy and their reported digital competence. Moderate but statistically significant associations were also found for Effort/Importance (r = 0.53, p < 0.001) and Interest/Enjoyment (r = 0.30, p < 0.001), suggesting that motivation to invest time and attention in digital activities is associated with higher competence. The motivation subscales were highly inter-correlated (notably Perceived Competence and Interest/Enjoyment, r = 0.70, p < 0.001), suggesting that students who feel confident in their ability to learn digitally also enjoy the learning process. Collectively, the results point to a mutually reinforcing relationship between competence and motivation: students who enjoy, value, and believe they can use digital skills are considerably more likely to demonstrate greater digital proficiency.
3.4. Student Clusters Based on Digital Competence Profiles and Related Course Factors (RQ4)
Hierarchical cluster analysis identified three student categories that differed in their demographic, educational, and experiential attributes. Cross-tabulation tests were used to explore associations between cluster membership and other categorical variables: gender, technical subject area, study level (year), and previous blended learning experience. The distribution of students across these variables is shown in Table 8.
Table 8. Distribution of Students Across Clusters by Key Variables (N = 87).
No significant association was identified between gender and cluster membership, χ2(2, N = 87) = 0.09, p = 0.954, indicating that the gender distribution was consistent across clusters; gender was therefore excluded as an interpretive label for cluster membership. However, a significant relationship emerged between cluster membership and students’ technical discipline, χ2(4, N = 87) = 19.18, p < 0.001, Cramér’s V = 0.33: students in mechanical and mechatronics disciplines were over-represented in Cluster 3 (90%), while students in electrical and computer disciplines were more strongly represented in Clusters 1 and 2. Study year showed no significant association with cluster membership, χ2(2, N = 87) = 1.12, p = 0.572, suggesting that students from different years were distributed relatively evenly across clusters. By contrast, prior blended learning experience differed significantly by cluster membership, χ2(2, N = 87) = 11.65, p = 0.003, Cramér’s V = 0.37, with 90% of Cluster 3 students reporting prior blended learning experience compared with approximately one-third of students in Clusters 1 and 2.
Cluster names were determined through a synthesis of students’ academic progress, technical area of study, and prior blended learning engagement, which were interpreted as indicators of blended learning competence. Cluster 1, termed Advancing Practitioners, consisted primarily of upper-level students with mid-range technical specialization and limited prior blended learning experience, indicating developing competence. The largest and most diverse cluster, labeled Emerging Practitioners, consisted primarily of first-year students with minimal exposure to blended learning, indicating foundational competence. The final cluster, comprising predominantly upper-level mechanical/mechatronics students with extensive prior blended learning experience, was labeled Experienced (High Competence) Practitioners.
Collectively, these clusters suggest a continuum of development in blended learning competence, from emerging (Cluster 2) to advancing (Cluster 1) to experienced (Cluster 3).
The differences observed across clusters reflect educational maturity, technical specialization, and prior blended learning participation rather than demographic characteristics.
To gain insight into how the different participant groups experienced individual instructional techniques within the blended learning courses, the reported frequencies of the teaching approaches were compared across the three competence-based clusters using one-way ANOVA (see Table 9).
Table 9. Descriptive Statistics and ANOVA Results for Teaching Approaches by Competence-Based Cluster (N = 87).
As shown in Table 9, the three competence-based clusters reported broadly similar exposure to most teaching approaches. Most mean scores fall between 1.4 and 2.6 on the 0–4 scale, indicating that the strategies were used only occasionally. For most items, including video lectures and breakout room interactions, the differences between clusters were not statistically significant, suggesting that the clusters experienced these approaches at similar frequencies.
Greater variability was observed for a subset of approaches: significant effects were found for live sessions (F = 2.91, p < 0.05), expert talks (F = 3.15, p < 0.05), and group or online projects (F = 4.13, p < 0.01). These results indicate that students classified as experienced, high-competence practitioners reported more frequent exposure to interactive and collaborative forms of instruction than the other two groups. Although the differences are small, this may point to a shift from teacher-centered delivery toward more participatory instruction as students move up the competence continuum.
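As with the earlier analyses, the one-way ANOVA underlying Table 9 can be sketched in a few lines; the group sizes and ratings below are simulated placeholders rather than the observed cluster data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
# Hypothetical 0-4 frequency ratings for "group or online projects",
# split across the three clusters (placeholder sizes summing to 87).
emerging = rng.integers(0, 5, size=40)
advancing = rng.integers(0, 5, size=30)
experienced = rng.integers(1, 5, size=17)   # slightly higher ratings

f_stat, p = stats.f_oneway(emerging, advancing, experienced)
print(f"F = {f_stat:.2f}, p = {p:.3f}")
```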
4. Discussion
The current study assessed how specific blended learning activities (e.g., quizzes, group projects, VR/AR, interactive video), prior blended learning experience, and intrinsic motivation relate to students’ digital competence. In doing so, it addresses a significant gap in the literature by linking specific pedagogical activities to multivariate indices of digital competence within a single cohort, rather than treating “technology use” as a generic variable. Additionally, by synthesizing correlational data, group comparisons, and cluster analysis, this study provides a detailed explication of how participatory design and motivational factors interact to influence competence development.
4.1. Participatory Activities and Competence Development
Participatory activities generally align with overall competence development. Designs that promote collaborative and task-based interaction (particularly group projects and frequent low-stakes quizzes) show the strongest positive correlations with multiple competence indicators, including collaboration tools, communication, and technology for learning. These findings are supported by meta-analytic evidence showing that active and collaborative learning are associated with better performance and transferable skills in higher education [31,32] and by research on the “testing effect,” which demonstrates that frequent practice and feedback enhance applied knowledge and strategic tool usage [33]. These studies collectively suggest that digital competences develop most effectively when students co-create products, coordinate tasks, and iteratively access and use information, conditions inherent to well-designed group work and formative assessment.
4.2. Associations Between Immersive Technology Use and Digital Competence
Immersive technologies, such as VR, AR, and interactive video, show positive relationships with digital tool use but do not consistently benefit all aspects of learning. Neither online safety nor content creation exhibited significant positive associations, and prior research warns that highly immersive media may increase cognitive load or divert focus from metacognitive regulation and transfer. Research has found that VR often causes an increase in cognitive load that hinders learning, particularly among novices, and that the technology’s effectiveness depends critically on how well it manages cognitive demand [34,35]. However, educational applications of augmented reality have been found to significantly enhance learners’ motivation, confidence, attention, creativity, and satisfaction by offering immersive and interactive experiences, which not only increase engagement but also support the development of digital competences and self-directed learning [36]. These results suggest that while novel or interactive media can rapidly boost tool-related fluency, explicit scaffolding (such as guidance for responsible use and structured authoring tasks) is essential to translate technological novelty into meaningful competence development.
4.3. Effect of Prior Blended Learning Experience
Prior blended learning experience distinguishes “routine” from “developmental” competences. Students without prior blended learning experience reported significantly higher confidence in routine, tool-based operational competences (e.g., basic software use, troubleshooting), whereas students with prior blended experience reported higher levels of advanced, learning-based competences (e.g., content creation, using technology for learning/research). The distribution of digital competences between students with and without prior blended learning experience reveals a complex developmental pattern that warrants multifaceted interpretation. Our finding that inexperienced students report higher confidence in routine, tool-based competences while experienced students excel in advanced, learning-oriented competences could reflect several interrelated phenomena. First, this pattern may partially stem from metacognitive limitations in self-assessment among less experienced learners, where limited exposure to authentic digital challenges leads to overestimation of basic skills, a manifestation of the Dunning–Kruger effect in which individuals with lower competence lack the metacognitive awareness to accurately evaluate their own abilities. Research consistently shows that low performers tend to overestimate their skills in self-assessments, particularly in technical domains where objective benchmarks are not immediately apparent [37]. Second, students with prior blended learning experience may have consolidated their fundamental operational skills through repeated authentic application, allowing cognitive resources to shift toward strategic and creative dimensions of digital competence. When grounded in the theory of technology as a cognitive partner, authentic technology integration demonstrably enhances understanding [38]. This theoretical principle is supported empirically by research such as the meta-analysis by Zheng et al. [39], which found that the positive effects on student learning were strongest in one-to-one laptop programs where technology was deeply integrated into daily instructional practices, not used peripherally. Third, research consistently supports the idea that digital competence starts with learning the tools and ends with applying them to real work. The UNESCO [40] ICT Competency Framework illustrates this by plotting a path from “Knowledge Acquisition,” where users simply learn to operate technology, to “Knowledge Deepening,” where they use those skills to master specific subjects. This aligns with Martin and Grudziecki’s [41] findings, which separate basic “Digital Competence” from “Digital Usage”, the stage where skills are properly applied within a professional context. This suggests that prior blended experience functions not merely as exposure but as a catalyst for reconceptualizing technology’s role in learning.
4.4. Interdependence of Motivation and Competence
Motivation and competence appear to be interdependent. The Digital Competence Index was found to correlate most strongly with perceived value/usefulness and perceived competence, followed by effort/importance and interest/enjoyment. According to Self-Determination Theory, perceived value (identified regulation) and efficacy beliefs interact to predict long-term commitment to engagement and performance [28]. Meta-analytic research has shown that intrinsic motivation and identified motivation have been consistently linked to performance and persistence in cognitively demanding tasks [42]. Collectively, the present pattern suggests a mutual reinforcement of competence and motivation; as students perceive the utility of digital skills and perceive themselves as competent in applying them, they invest more time and derive more interest, further strengthening their competence. Instructionally, this suggests that incorporating value-affirming activities with direct relevance and structured success experiences is important.
4.5. Competence Clusters and Interactive Teaching Approaches
Competence clusters correlate to a moderate degree with more interactive teaching approaches. The three competence clusters (emerging, advancing, and experienced) differed most clearly in terms of prior blended learning experience and disciplinary background. Specifically, experienced practitioners reported using interactive and collaborative approaches, such as group projects, slightly more often. Although these effects were small, they align with research linking stronger self-regulation and technology-supported collaboration to higher engagement in blended learning environments [43]. Collectively, these findings support a developmental continuum: as competence and prior experience grow, learners benefit more from participatory designs.
5. Limitations
Several limitations of this study warrant consideration. First, the cross-sectional design and absence of baseline competence measures preclude conclusions about development or improvement over time; findings reflect associations between current self-perceived competence and blended learning experiences, not causal effects. Second, reliance on retrospective self-assessment introduces potential biases, including social desirability [44] and metacognitive limitations in self-evaluation [45]. Third, convenience sampling from technical schools in a single region limits generalizability to other educational contexts or national systems. Finally, the strong male majority in the sample (92%) reflects local enrollment patterns but constrains applicability to more gender-balanced populations, especially given documented gender differences in digital self-efficacy and tool preferences [46].
6. Future Research Perspectives
While this study advances our understanding of how blended learning activities, experience, and motivation shape digital competence, further investigation is required. Future research should prioritize longitudinal designs with pre- and post-assessment measures to establish causality and track competence development over time. To address the potential limitations of self-reporting—specifically the Dunning–Kruger effect among inexperienced learners—objective performance-based assessments should be employed to triangulate self-report data. Additionally, cross-disciplinary investigations are necessary to determine whether these findings generalize beyond technical domains. Finally, the feedback loop between motivation and competence requires longitudinal investigation, potentially using experience sampling or real-time engagement data to identify critical intervention points.
7. Conclusions
This research contributes to the literature on blended learning in VET by highlighting four key insights. First, participatory and feedback-rich activities demonstrate the strongest association with digital competence development. Second, prior experience with blended learning is linked to the acquisition of advanced, learning-focused skills rather than merely basic operational proficiency. Third, intrinsic motivation and digital competence are deeply intertwined, suggesting that value and self-efficacy are critical drivers of skill acquisition. Finally, students with higher competence are more likely to engage with collaborative teaching approaches. These findings provide actionable evidence for instructional designers, suggesting that connecting active learning strategies with motivational support can accelerate the transition from basic digital literacy to creative, discipline-specific proficiency.
Author Contributions
Conceptualization, D.M.R. and M.R.; methodology, D.M.R. and M.R.; validation, D.M.R. and M.R.; formal analysis, M.R.; investigation, D.M.R. and M.R.; resources, D.M.R. and M.R.; data curation, M.R.; writing—original draft preparation, D.M.R.; writing—review and editing, D.M.R. and M.R.; visualization, D.M.R. and M.R.; supervision, D.M.R.; project administration, D.M.R. and M.R.; funding acquisition, D.M.R. All authors have read and agreed to the published version of the manuscript.
Funding
This research was funded by EEA Grants, grant number ATP213.
Institutional Review Board Statement
Ethical approval was not required, as educational opinion surveys in Slovenia do not fall under ethics committee review. Nonetheless, we upheld the highest ethical standards: data were collected anonymously, participation was voluntary, and responses remained confidential.
Informed Consent Statement
Informed consent was obtained from all subjects involved in the study.
Data Availability Statement
The anonymized datasets generated during and/or analyzed during the study are available from the corresponding author upon reasonable request.
Conflicts of Interest
The authors declare no conflicts of interest.
References
- Garrison, D.R.; Kanuka, H. Blended Learning: Uncovering Its Transformative Potential in Higher Education. Internet High. Educ. 2004, 7, 95–105. [Google Scholar] [CrossRef]
- Graham, C.R. Blended learning systems: Definition, current trends and future directions. In The Handbook of Blended Learning: Global Perspectives, Local Designs; Bonk, C.J., Graham, C.R., Eds.; Pfeiffer Publishing: San Francisco, CA, USA, 2006; pp. 3–21. [Google Scholar]
- Song, S.; Lai, Y.C. Blended learning in vocational education: Benefits, challenges, and student engagement. Cogent Educ. 2025, 12, 2548348. [Google Scholar] [CrossRef]
- Consoli, T.; Schmitz, M.-L.; Antonietti, C.; Gonon, P.; Cattaneo, A.; Petko, D. Quality of technology integration matters: Positive associations with students’ behavioral engagement and digital competencies for learning. Educ. Inf. Technol. 2025, 30, 7719–7752. [Google Scholar] [CrossRef]
- Szopiński, T. E-learning Acceptance Model in the Post-Pandemic World. J. Perspect. Econ. Polit. Soc. Integr. 2025, 30, 109–133. [Google Scholar] [CrossRef]
- Chaw, L.Y.; Tang, C.M. Exploring the relationship between digital competence proficiency and student learning performance. Eur. J. Educ. 2024, 59, e12593. [Google Scholar] [CrossRef]
- Vidal, I.M.G.; López, B.C.; Otero, L.C. Nuevas competencias digitales en estudiantes potenciadas con el uso de Realidad Aumentada. Estudio piloto. RIED-Rev. Iberoam. Educ. Distancia 2021, 24, 137–157. [Google Scholar] [CrossRef]
- Ng, D.T.K.; Leung, J.K.L.; Su, J.; Ng, R.C.W.; Chu, S.K.W. Teachers’ AI digital competencies and twenty-first century skills in the post-pandemic world. Educ. Technol. Res. Dev. 2023, 71, 137–161. [Google Scholar] [CrossRef]
- Cui, Y.; Li, M.; Luo, Y. Strategies for Conducting Blended Learning in VET: A Comparison of Award-Winning Courses and Daily Courses. Behav. Sci. 2025, 15, 787. [Google Scholar] [CrossRef]
- Hervás-Torres, M.; Bellido-González, M.; Soto-Solier, P.M. Digital competences of university students after face-to-face and remote teaching: Video-animations digital create content. Heliyon 2024, 10, e32589. [Google Scholar] [CrossRef]
- Myyry, L.; Kallunki, V.; Katajavuori, N.; Repo, S.; Tuononen, T.; Anttila, H.; Kinnunen, P.; Haarala-Muhonen, A.; Pyörälä, E. COVID-19 Accelerating Academic Teachers’ Digital Competence in Distance Teaching. Front. Educ. 2022, 7, 770094. [Google Scholar] [CrossRef]
- Gaddis, M.L. Faculty and Student Technology Use to Enhance Student Learning. Int. Rev. Res. Open Distrib. Learn. 2020, 21, 39–60. [Google Scholar] [CrossRef]
- Tzafilkou, K.; Perifanou, M.; Economides, A.A. Development and validation of students’ digital competence scale (SDiCoS). Int. J. Educ. Technol. High. Educ. 2022, 19, 30. [Google Scholar] [CrossRef] [PubMed]
- Simonova, I.; Faltynkova, L.; Kostolanyova, K. New Blended Learning Enriched after the COVID-19 Experience? Students’ Opinions. Sustainability 2023, 15, 5093. [Google Scholar] [CrossRef]
- Boie, M.A.K.; Dalsgaard, C.; Caviglia, F. Digital instinct—A keyword for making sense of students’ digital practice and digital literacy. Br. J. Educ. Technol. 2024, 55, 668–686. [Google Scholar] [CrossRef]
- Li, W.; Xue, Z.; Li, J.; Wang, H. The interior environment design for entrepreneurship education under the virtual reality and artificial intelligence-based learning environment. Front. Psychol. 2022, 13, 944060. [Google Scholar] [CrossRef] [PubMed]
- Tan, X.; Lin, X.; Zhuang, R. Development and validation of a secondary vocational school students’ digital learning competence scale. Smart Learn. Environ. 2024, 11, 37. [Google Scholar] [CrossRef]
- Li, J.; Zhang, J.; Chai, C.S.; Lee, V.W.; Zhai, X.; Wang, X.; King, R.B. Analyzing the network structure of students’ motivation to learn AI: A self-determination theory perspective. Npj Sci. Learn. 2025, 10, 48. [Google Scholar] [CrossRef] [PubMed]
- Cotter, L.M.; Shah, D.; Brown, K.; Mares, M.-L.; Landucci, G.; Saunders, S.; Johnston, D.C.; Pe-Romashko, K.; Gustafson, D.; Maus, A.; et al. Decoding the Influence of eHealth on Autonomy, Competence, and Relatedness in Older Adults: Qualitative Analysis of Self-Determination Through the Motivational Technology Model. JMIR Aging 2024, 7, e56923. [Google Scholar] [CrossRef]
- An, F.; Xi, L.; Yu, J. The relationship between technology acceptance and self-regulated learning: The mediation roles of intrinsic motivation and learning engagement. Educ. Inf. Technol. 2024, 29, 2605–2623. [Google Scholar] [CrossRef] [PubMed]
- Schmidt-Hertha, B.; Bernhardt, M. Pedagogical Relationships in Digitised Adult Education. Stud. Adult Educ. Learn. 2022, 28, 11–24. [Google Scholar] [CrossRef]
- Goropečnik, L.; Kropivšek, J.; Kristl, N.; Radovan, D.M. The effect of students’ academic motivation on their self-perceived digital and sustainability competencies in wood science and technology education. BioResources 2025, 21, 267–287. [Google Scholar] [CrossRef]
- Zhang, L.; Han, Y.; Zhou, J.-L.; Liu, Y.-S.; Wu, Y. Influence of intrinsic motivations on the continuity of scientific knowledge contribution to online knowledge-sharing platforms. Public Underst. Sci. 2021, 30, 369–383. [Google Scholar] [CrossRef]
- Huang, J.; Zhou, L. Gamification, intrinsic motivation and work engagement in app-work: Moderating effects of algorithmic control. Internet Res. 2025. [Google Scholar] [CrossRef]
- Swiatczak, M.D. Towards a neo-configurational theory of intrinsic motivation. Motiv. Emot. 2021, 45, 769–789. [Google Scholar] [CrossRef]
- Alonso, R.K.; Vélez, A.; Martínez-Monteagudo, M.C. Interventions for the Development of Intrinsic Motivation in University Online Education: Systematic Review—Enhancing the 4th Sustainable Development Goal. Sustainability 2023, 15, 9862. [Google Scholar] [CrossRef]
- Vuorikari, R.; Kluzer, S.; Punie, Y. DigComp, The Digital Competence Framework for Citizens: With New Examples of Knowledge, Skills and Attitudes; Publications Office of the European Union: Luxembourg, 2022; Available online: https://data.europa.eu/doi/10.2760/115376 (accessed on 7 December 2025).
- Ryan, R.M.; Deci, E.L. Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being. Am. Psychol. 2000, 55, 68–78. [Google Scholar] [CrossRef] [PubMed]
- Enders, C.K. Applied Missing Data Analysis, 2nd ed.; Methodology in the Social Sciences; The Guilford Press: New York, NY, USA, 2022. [Google Scholar]
- Norman, G. Likert scales, levels of measurement and the ‘laws’ of statistics. Adv. Health Sci. Educ. 2010, 15, 625–632. [Google Scholar] [CrossRef]
- Freeman, S.; Eddy, S.L.; McDonough, M.; Smith, M.K.; Okoroafor, N.; Jordt, H.; Wenderoth, M.P. Active learning increases student performance in science, engineering, and mathematics. Proc. Natl. Acad. Sci. USA 2014, 111, 8410–8415. [Google Scholar] [CrossRef] [PubMed]
- Springer, L.; Stanne, M.E.; Donovan, S.S. Effects of small-group learning on undergraduates in science, mathematics, engineering, and technology: A meta-analysis. Rev. Educ. Res. 1999, 69, 21–51. [Google Scholar] [CrossRef]
- Roediger, H.L.; Karpicke, J.D. Test-Enhanced Learning: Taking Memory Tests Improves Long-Term Retention. Psychol. Sci. 2006, 17, 249–255. [Google Scholar] [CrossRef]
- Makransky, G.; Lilleholt, L. A structural equation modeling investigation of the emotional value of immersive virtual reality in education. Educ. Technol. Res. Dev. 2018, 66, 1141–1164. [Google Scholar] [CrossRef]
- Radianti, J.; Majchrzak, T.A.; Fromm, J.; Wohlgenannt, I. A systematic review of immersive virtual reality applications for higher education: Design elements, lessons learned, and research agenda. Comput. Educ. 2020, 147, 103778. [Google Scholar] [CrossRef]
- Iqbal, M.Z.; Mangina, E.; Campbell, A.G. Current Challenges and Future Research Directions in Augmented Reality for Education. Multimodal Technol. Interact. 2022, 6, 75. [Google Scholar] [CrossRef]
- Mahmood, K. Do People Overestimate Their Information Literacy Skills? A Systematic Review of Empirical Evidence on the Dunning-Kruger Effect. Commun. Inf. Lit. 2016, 10, 3. [Google Scholar] [CrossRef]
- Jonassen, D.H. Computers as Mindtools for Schools: Engaging Critical Thinking; Prentice Hall: Upper Saddle River, NJ, USA, 2000. [Google Scholar]
- Zheng, B.; Warschauer, M.; Lin, C.-H.; Chang, C. Learning in One-to-One Laptop Environments. A Meta-Analysis and Research Synthesis. Rev. Educ. Res. 2016, 86, 1052–1084. [Google Scholar] [CrossRef]
- UNESCO. UNESCO ICT Competency Framework for Teachers: Version 3; UNESCO: Paris, France, 2018; Available online: https://unesdoc.unesco.org/ark:/48223/pf0000265721 (accessed on 15 October 2025).
- Martin, A.; Grudziecki, J. DigEuLit: Concepts and Tools for Digital Literacy Development. Innov. Teach. Learn. Inf. Comput. Sci. 2006, 5, 249–267. [Google Scholar] [CrossRef]
- Cerasoli, C.P.; Nicklin, J.M.; Ford, M.T. Intrinsic motivation and extrinsic incentives jointly predict performance: A 40-year meta-analysis. Psychol. Bull. 2014, 140, 980–1008. [Google Scholar] [CrossRef] [PubMed]
- Broadbent, J.; Poon, W.L. Self-regulated learning strategies & academic achievement in online higher education learning environments: A systematic review. Internet High. Educ. 2015, 27, 1–13. [Google Scholar] [CrossRef]
- Podsakoff, P.M.; MacKenzie, S.B.; Lee, J.-Y.; Podsakoff, N.P. Common method biases in behavioral research: A critical review of the literature and recommended remedies. J. Appl. Psychol. 2003, 88, 879–903. [Google Scholar] [CrossRef] [PubMed]
- Kruger, J.; Dunning, D. Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments. J. Pers. Soc. Psychol. 1999, 77, 1121–1134. [Google Scholar] [CrossRef]
- van Deursen, A.J.; van Dijk, J.A. The first-level digital divide shifts from inequalities in physical access to inequalities in material access. New Media Soc. 2019, 21, 354–375. [Google Scholar] [CrossRef] [PubMed]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).