Article

Promoting Sustainable Medical Education Through Digital Competence: A Cross-Sectional Pilot Study

Faculty of Health Sciences, University of Primorska, Polje 42, 6310 Izola, Slovenia
*
Author to whom correspondence should be addressed.
Sustainability 2025, 17(19), 8699; https://doi.org/10.3390/su17198699
Submission received: 12 August 2025 / Revised: 18 September 2025 / Accepted: 26 September 2025 / Published: 27 September 2025

Abstract

The increasing digitalisation of medical education requires teachers to have a broad range of competences that go beyond basic technical knowledge. This pilot cross-sectional study assessed the self-perceived digital competence of medical faculty members and examined differences by professional role, experience and gender. Of 298 eligible staff, 48 participated (response rate 16.1%), including 19 women (39.6%) and 29 men (60.4%). Data were collected via an online survey using the validated Digital Competence Scale for University Teachers, which comprises four subscales: digital literacy, digital skills, digital interaction and technology integration. The overall median score indicated a generally high level of self-perceived digital competence, with 95% bootstrap confidence intervals confirming this pattern. Assistant professors achieved higher scores on all subscales than associate and full professors. Self-perceived digital competence was positively correlated with participation in structured training and with greater interest in and more frequent use of digital tools, while age and teaching experience were negatively correlated with it. The findings suggest unequal levels of self-perceived digital competence across academic ranks and highlight the positive association of self-perceived digital competence with participation in targeted, practical and inclusive training programmes.

1. Introduction

The accelerating digital transformation of society has not only changed the competences required for effective teaching at all levels of education, but has also become a central element in achieving long-term sustainability goals, especially in the context of inclusive and equitable quality education [1,2,3]. In this context, digital competence has emerged as a key qualification for teachers that goes beyond mere mastery of technology to encompass pedagogical innovation, ethical awareness and the critical use of digital tools [4]. Frameworks such as the European DigCompEdu model emphasise that digital competence encompasses professional engagement, the sourcing and creation of digital resources, the harmonisation of digital teaching and learning, the use of technology for assessment, the empowerment of learners and the promotion of students’ own digital skills, thus contributing to more sustainable, inclusive and resilient education systems [2,4].
The digital transformation of higher education is not just a technological trend, but a crucial part of the sustainability vision emphasised in the global Sustainable Development Goals (SDG 4: Quality Education) and the European strategies for digital education [5]. The development of digital competences among higher education teachers contributes to more inclusive, accessible and adaptable learning environments that are resilient to disruptions such as pandemics or rapidly changing labour market demands [2,3,6]. In medical education, the aim is not only to improve pedagogical practices, but also to strengthen the digital readiness of future healthcare professionals, thereby supporting sustainable and high-quality healthcare in the long term [7,8].
In higher education, and particularly in the medical sciences, digital competences are critical to preparing students for increasingly technology-intensive clinical environments [8]. Research emphasises that effective digital teaching not only increases pedagogical flexibility and inclusivity, but also promotes deeper learning, critical thinking and student engagement [9]. However, studies consistently show that while many teachers have basic technical skills, there are still significant gaps in the pedagogical integration of digital technologies [10,11]. With the increasing integration of artificial intelligence (AI) tools such as ChatGPT into education, there is also a new demand for AI skills alongside broader digital competence [12,13].
In medical education, the situation is particularly complex. Medical teachers must not only master content knowledge and clinical expertise, but also integrate innovative digital methods such as telemedicine, simulation-based learning and AI-supported decision support systems into their pedagogical practice [8,14]. Despite the recognised importance of digital competence for medical teachers, existing evidence points to significant differences in competence levels, with many teachers expressing a need for structured training and institutional support [15,16]. Furthermore, it is becoming increasingly clear that digital competence is not static, but evolves with technological advances, including the ethical challenges and opportunities presented by AI, big data and immersive technologies [9,15].
Given these challenges and opportunities, it is critical to systematically assess the current state of digital competence of medical teachers to identify gaps and develop targeted strategies for professional development. While several international studies have examined digital competence in healthcare and education separately, there is little data that focuses specifically on the competences of medical teachers in a European context.

Aim of the Study

To address this gap, the present study was designed as a pilot cross-sectional study, aiming to explore the feasibility and relevance of assessing digital competences specifically among medical teachers. Conducted in Slovenia, the pilot focused exclusively on self-perceived competences to inform the design of a larger-scale research effort. The aim of the study was to investigate the strengths and weaknesses in the most important areas of digital competence and to identify priority areas for faculty development initiatives. The guiding research question was “What is the current level of self-perceived digital competence among medical teachers, and which areas require the most support for future professional development initiatives?”

2. Materials and Methods

A questionnaire-based cross-sectional design was used in this study. A quantitative research approach was chosen to systematically measure the self-perceived digital competence of medical teachers. The cross-sectional design was chosen because it allows for the efficient collection of standardised data from a larger sample at a single point in time, facilitating the objective identification of competency levels and priority areas for professional development, while providing a practical and cost-effective method for the initial exploration of this educational phenomenon [17]. The study is reported in accordance with the STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) guidelines for observational research.

2.1. Sample and Setting

The study was conducted among the academic staff of two medical faculties in Slovenia. At the time of data collection, a total of 298 university teachers were employed. A convenience sample of 48 teachers participated in the survey, representing 16.1% of the target population. The sample comprised 26 assistant professors (54.2%), 18 associate professors (37.5%) and 4 full professors (8.3%). Of the respondents, 39.6% (n = 19) were female and 60.4% (n = 29) male. The average age was 50.56 years (SD = 9.66), with participants aged between 35 and 73 years. Participants reported an average of 16.73 years of teaching experience (SD = 8.29) and an average of 8.10 years (SD = 5.44) of experience with e-learning.

2.2. Instrument

The Digital Competence Scale for University Teachers (DCS-UT) was used to assess digital competence in this pilot study. This scale was specifically developed as part of the ongoing project ‘Development of a digital education standard in higher education for ensuring equity and accessibility in digital education’, which aims to create a digital education standard in higher education [6]. The DCS-UT is tailored to measure the digital competences of higher education teachers, which makes it highly relevant for this study. The scale ranges from 35 to 175 points and comprises 35 items rated on a 5-point Likert scale, with higher scores indicating better digital competence. It comprises four subscales: Digital literacy (17–85 points), Digital skills (6–30 points), Digital interaction (6–30 points) and Technology integration (6–30 points). In the original validation study [6], the scale showed excellent internal consistency (Cronbach’s α = 0.974), with subscale alphas of 0.966 for Digital literacy, 0.929 for Digital skills, 0.879 for Digital interaction and 0.791 for Technology integration, which provides the reliability evidence on which the present pilot study is based. Internal consistency was evaluated on the pilot sample as well (n = 48). The results indicated excellent reliability for the total DCS-UT scale (35 items; α = 0.980; ω = 0.981) and for all subscales: Digital literacy (17 items; Cronbach’s α = 0.958; McDonald’s ω = 0.959), Digital skills (6 items; α = 0.948; ω = 0.950), Digital interaction (6 items; α = 0.911; ω = 0.929), and Technology integration (6 items; α = 0.877; ω = 0.889).
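For readers wishing to reproduce the reliability figures reported above, Cronbach’s α can be computed directly from an item-by-respondent score matrix using the standard formula α = k/(k − 1) · (1 − Σ item variances / variance of total scores). The following standard-library Python sketch is illustrative only and is not part of the study’s analysis workflow; the function name is our own.

```python
from statistics import variance

def cronbach_alpha(rows):
    """Cronbach's alpha for a list of per-respondent item-score lists.

    rows[i][j] is respondent i's score on item j; sample variances (ddof = 1)
    are used throughout, matching the usual definition.
    """
    k = len(rows[0])                                   # number of items
    item_vars = [variance(col) for col in zip(*rows)]  # variance of each item
    total_var = variance([sum(r) for r in rows])       # variance of total scores
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Perfectly consistent responses (every item identical) give alpha close to 1.0
print(cronbach_alpha([[1, 1, 1], [2, 2, 2], [3, 3, 3], [4, 4, 4]]))
```

With real Likert data the function returns values below 1; the high α values reported for the DCS-UT (≥0.87 on all subscales) indicate strong item interrelatedness.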
All DCS-UT scores represent self-perceived (self-reported) digital competence, as participants rated their own practices and abilities; therefore, all findings in this study refer to self-perceived rather than objectively tested digital competence.

2.3. Data Collection

The data was collected via an online survey conducted between April and July 2024. Medical faculties were initially asked to support the study by distributing the survey to their academic staff. Following their consent, participants were invited via faculty-administered email invitations that included information about the purpose of the study, the voluntary nature of participation and assurances of confidentiality. The survey was conducted via the online platform 1KA (https://www.1ka.si/d/en accessed on 5 February 2022), and participants gave informed consent before proceeding with the questionnaire. The survey took approximately 10 min to complete.

2.4. Data Analysis

For data analysis, the data set was processed with IBM SPSS Statistics Version 29.0 (IBM Corp., Armonk, NY, USA) and the jamovi statistical platform (version 2.4). Descriptive statistics (medians, interquartile ranges, observed minimum and maximum values, and 95% bootstrap confidence intervals) were calculated for the DCS-UT total score and each subscale. Spearman’s correlation was used to examine relationships between participant age, the total DCS-UT score, and subscale scores. Group differences were examined using Mann–Whitney U tests for two-group comparisons (with r reported as an effect size) and Kruskal–Wallis tests for comparisons involving three or more groups (with η2 reported as an effect size). Where a Kruskal–Wallis test indicated a significant effect, pairwise post hoc comparisons were performed using the Dwass–Steel–Critchlow–Fligner (DSCF) procedure, which provides p-values already adjusted for multiple pairwise comparisons. All tests were two-tailed with p < 0.05 as the significance threshold.
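The effect-size estimators and the percentile-bootstrap confidence interval used above can be sketched in a few lines of standard-library Python. The formulas shown (r = |Z|/√N for Mann–Whitney; η² = (H − k + 1)/(n − k) for Kruskal–Wallis, where k is the number of groups) are common textbook estimators and are assumed here, not taken from the study’s scripts; function and variable names are illustrative.

```python
import math
import random
from statistics import median

def mann_whitney_r(z: float, n: int) -> float:
    """Effect size r for a Mann-Whitney U test: r = |Z| / sqrt(N)."""
    return abs(z) / math.sqrt(n)

def kruskal_eta_squared(h: float, k: int, n: int) -> float:
    """Eta-squared estimate for a Kruskal-Wallis test: (H - k + 1) / (n - k)."""
    return (h - k + 1) / (n - k)

def bootstrap_median_ci(x, n_boot=5000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for the median."""
    rng = random.Random(seed)
    meds = sorted(median(rng.choices(x, k=len(x))) for _ in range(n_boot))
    return meds[int(n_boot * alpha / 2)], meds[int(n_boot * (1 - alpha / 2)) - 1]

# Example: the reported H(2) = 11.94 for academic rank with n = 48 and k = 3
# groups gives eta-squared = (11.94 - 3 + 1) / (48 - 3), about 0.22.
print(round(kruskal_eta_squared(11.94, 3, 48), 2))  # 0.22
```

The 0.22 obtained here falls within the 0.20–0.28 range reported in the Results, which suggests this is the estimator underlying the reported η2 values.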

2.5. Ethical Considerations

Ethical approval for this study was obtained from the Commission of the University of Primorska for Ethics in Human Subjects Research (Approval No: 4264-16-3/2022). All participants provided written informed consent to participate.

3. Results

The mean age of the participants was 50.56 years (SD = 9.66), with a range of 35 to 73 years. The distribution of academic status among the participants was as follows: 54.2% assistant professors, 37.5% associate professors, and 8.3% full professors. Most university teachers reported a positive attitude towards digital technologies in teaching, with 58.4% stating that they used them either confidently or skilfully. The majority used digital tools frequently or always (60.4%), while only a minority reported rarely using them (10.4%). In addition, interest in learning about digital technologies was high, with 62.5% stating that they were actively or strongly motivated to improve their digital teaching methods.
Table 1 presents the medians, interquartile ranges (IQR), observed minimum and maximum values, and 95% bootstrap confidence intervals for the DCS-UT total score and each subscale in the pilot sample (n = 48).
Results show that median scores were highest for Digital literacy (Mdn = 57.0, 95% CI [48.0, 64.0]) and lowest for Technology integration (Mdn = 20.0, 95% CI [19.0, 24.0]). The total DCS-UT scale had a median of 113 (95% CI [100.0, 139.0]). Interquartile ranges (IQRs) indicated moderate variability, from 8.0 for Digital skills to 52.0 for the total scale. Observed scores ranged from 61 to 175, which is close to the full theoretical range (35–175), indicating reasonable dispersion and no serious floor or ceiling effects.
To investigate differences in self-perceived digital competence, participants were asked whether they had previously participated in a structured training or workshop aimed at improving the implementation of e-learning (excluding the general use of platforms such as Zoom). A Mann–Whitney U-test showed that participants with previous training had significantly higher scores on the digital literacy (U = 179.00, Z = −2.17, p = 0.030, r = 0.369) and the technology integration subscale (U = 175.00, Z = −2.27, p = 0.023, r = 0.383). However, no statistically significant differences were found for the digital skills subscale (U = 203.00, Z = −1.69, p = 0.091), digital interaction (U = 225.50, Z = −1.21, p = 0.226) or the DCS-UT (U = 204.00, Z = −1.66, p = 0.098).
To further investigate how self-perceived integration, frequency of use and interest in digital technologies relate to the level of self-perceived digital competence, the participants’ responses to three single-item questions were compared with the four subscales and the total DCS-UT score using the Kruskal–Wallis H-test (Table 2).
Statistically significant differences were observed in all subscales: digital literacy, H(4) = 13.45, p = 0.009; digital skills, H(4) = 16.52, p = 0.002; digital interaction, H(4) = 16.20, p = 0.003; and technology integration, H(4) = 10.26, p = 0.036. The DCS-UT total score also showed a significant difference between the groups, H(4) = 9.61, p = 0.048. The effect sizes for these Kruskal–Wallis tests ranged from η2 = 0.22 to 0.36, indicating moderate to large effects by conventional benchmarks. Post hoc comparisons with the Dwass–Steel–Critchlow–Fligner test further clarified these differences. Participants who reported ‘skilful integration’ of digital technologies scored significantly higher on digital literacy than ‘frustrated learners’ (W = 4.68, p = 0.008). In the digital skills subscale, a significant difference was found between ‘skilful integrators’ and ‘frustrated learners’ (W = 5.24, p = 0.002). In digital interaction, the ‘skilful integrators’ scored significantly higher than the ‘gain confidence’ group (W = 4.24, p = 0.022), and similar differences were observed in technology integration (W = 3.90, p = 0.046). On the DCS-UT total scale, the ‘skilful integrators’ also differed significantly from the ‘gain confidence’ group (W = 4.26, p = 0.021).
Significant group differences were also found across all levels of reported frequency of use of digital technologies in teaching (p < 0.05), except for technology integration (H(4) = 9.164, p = 0.057). For the frequency of use of digital technology in teaching, the Kruskal–Wallis tests yielded η2 values between 0.20 and 0.40, indicating moderate to large effects by conventional benchmarks. Post hoc comparisons showed that participants who reported always using digital technologies scored significantly higher than less frequent users. In terms of digital literacy, significant differences were found between the ‘always’ and ‘rarely’ (W = 4.17, p = 0.026) and ‘occasionally’ (W = 4.93, p = 0.004) groups. In terms of digital skills, the ‘always’ users performed significantly better than the ‘rarely’ (W = 4.42, p = 0.015), ‘occasionally’ (W = 4.62, p = 0.010) and ‘frequently’ users (W = 4.29, p = 0.020). Significant differences in digital interaction were found between ‘always’ and ‘occasionally’ (W = 4.29, p = 0.020) and ‘always’ and ‘rarely’ (W = 3.99, p = 0.038). A significant difference was found on the DCS-UT scale between ‘always’ and ‘occasionally’ (W = 4.82, p = 0.006).
The differences in self-perceived digital competence were also statistically significant between the levels of interest in learning and use of digital technologies for teaching: digital literacy, H(3) = 18.59, p < 0.001; digital skills, H(3) = 16.47, p < 0.001; digital interaction, H(3) = 14.18, p = 0.003; technology integration, H(3) = 13.43, p = 0.004; and DCS-UT score, H(3) = 14.15, p = 0.003. The Kruskal–Wallis analyses for this set of comparisons yielded η2 values between 0.29 and 0.40, reflecting moderate to large effect sizes. A post hoc analysis revealed that participants who were ‘highly interested and motivated’ scored significantly higher on digital literacy than those who were ‘slightly interested’ (W = 3.83, p = 0.034), ‘moderately interested’ (W = 4.09, p = 0.020) and ‘fairly interested’ (W = 4.12, p = 0.019). A significant difference was found between the ‘highly interested’ and ‘moderately interested’ groups on the digital skills subscale (W = 4.83, p = 0.004). On the DCS-UT scale, the ‘highly interested’ group scored significantly higher than the ‘fairly interested’ group (W = 5.28, p = 0.001). Further analyses were conducted to better understand the observed differences in self-perceived digital competence across academic ranks (Table 3).
For the total DCS-UT scale, the median score for assistant professors was 139.00 (IQR = 56.0), compared to 99.00 (IQR = 26.0) for associate professors and 98.50 (IQR = 12.0) for full professors. A Kruskal–Wallis H-test confirmed a statistically significant difference between the three groups, H(2) = 11.94, p = 0.003. Significant differences were also found in all four subscales (p < 0.05). The effect sizes for these Kruskal–Wallis tests ranged from η2 = 0.20 to 0.28, indicating moderate to large effects by conventional benchmarks. A post hoc test revealed that assistant professors scored significantly higher than associate professors in all four subscales and on the total score: digital literacy (W = 4.04, p = 0.012), digital skills (W = 4.62, p = 0.003), digital interaction (W = 3.90, p = 0.016), technology integration (W = 3.98, p = 0.013) and the DCS-UT (W = 4.19, p = 0.009). Significant differences were also found between assistant professors and full professors in digital literacy (W = 3.72, p = 0.023), digital interaction (W = 4.14, p = 0.010) and the DCS-UT (W = 3.46, p = 0.038). No significant differences were found between associate and full professors in any domain (p > 0.05).
The Mann–Whitney U-test showed no significant difference in self-perceived digital competence scores between male (n = 29) and female (n = 19) participants (U = 204.500, p = 0.134).
Table 4 presents the correlation coefficients between age, years of teaching experience, and years of e-learning experience with the four subscales as well as with the total DCS-UT score.
The Spearman correlation analysis showed that age was negatively associated with all dimensions of digital competence, with older teachers consistently reporting lower scores (p < 0.001). A similar pattern emerged for years of teaching experience, which was associated with lower self-perceived digital competence scores (all p < 0.001). On the other hand, experience with e-learning was positively related to the integration of technology into teaching (p = 0.05), which could suggest that hands-on engagement with digital tools may help teachers to incorporate them more effectively into their work.

4. Discussion

This pilot study provides important preliminary insights into the current state of self-perceived digital competence of medical teachers in Slovenia.
Despite the generally positive attitude of the participants and the relatively frequent use of digital tools in teaching, the study revealed key areas where skills are still underdeveloped. In particular, lower self-assessment in the creative use of digital technologies shows a gap in skills related to digital content creation, including ethical issues such as licensing. Boté-Vericad et al. [10] pointed out that while many teachers can operate digital platforms, their ability to design, adapt and evaluate digital learning materials in a pedagogically meaningful and legally compliant way is often limited. The analysis also revealed that teachers who regularly used digital technologies in their teaching achieved higher scores in almost all competence dimensions. Those who reported always using digital tools scored significantly higher in the areas of self-perceived digital competence, interaction and technical skills. These differences indicate that regular and deliberate use of technology in everyday classroom practice was positively associated with higher confidence and more advanced pedagogical application [18]. These observations are consistent with the idea that digital literacy is not static but develops through habitual engagement and critical reflection [2].
A further pattern emerged in relation to teachers’ interest in digital technologies. Participants with a strong motivation to learn and use digital methods scored higher on the competence scale than their peers. High levels of personal engagement were positively associated with better developed digital skills, supporting the interpretation that motivation is positively related to teachers’ professional development [19]. Similar correlations have been found in previous studies, suggesting that teachers with a proactive and self-directed approach are more likely to explore and apply innovative digital strategies [20]. Participants who considered themselves capable of skilfully integrating digital tools into the classroom also scored significantly higher in all areas. This internal coherence within the self-report instrument supports the view that professional self-perception can serve as a meaningful indicator of digital readiness. The ability to plan, implement and reflect on the use of digital tools appears to be associated with more advanced pedagogical digital practices [8]. Comparable European data confirm this association. Ersoy et al. [21] reported that faculty members with higher personal engagement and frequent use of digital tools achieved significantly higher scores across digital competence domains, underscoring the cross-national relevance of motivation as a key driver of digital readiness.
Teachers with more extensive experience in e-learning environments tended to score higher in the area of technology integration, indicating that hands-on use of digital tools was positively associated with higher self-perceived applied competence. Rather than acquiring digital skills in abstract or isolated contexts, participants seemed to benefit most from practical application that allowed experimentation and customisation in real classroom situations. This is in line with the findings of Ramírez-Montoya et al. [22], who showed that participation in targeted online training programmes such as Massive Open Online Courses (MOOCs) can not only promote general digital competencies, but also increase teachers’ willingness to create and use open digital resources in teaching. Similarly, the DigCompEdu framework [4] emphasises that digital competence develops through practice, reflection and context-specific application.
Differences in self-perceived digital competence by academic status were evident, with assistant professors outperforming their senior colleagues. Similar patterns were reported by Hautz et al. [23], who found that institutional barriers often hinder the systematic integration of digital competencies into curricula and faculty development. This is consistent with the findings of previous studies showing that younger or less experienced teachers tend to have greater adaptability and familiarity with digital tools [12,24], a trend likely driven by generational shifts in professional development, changes in educational technology use, and more frequent engagement with technology in early academic roles [6]. As the digital transformation of healthcare and medical education accelerates, these results emphasise the need for vertically integrated faculty development strategies that engage all academic ranks in continuing education [7]. Similarly, a recent international consensus on digital health education highlights that sustained competence growth depends on coordinated institutional and national strategies, a conclusion echoed by our findings on the pivotal role of structured faculty development [25]. In addition, a positive correlation was found between participation in structured digital courses and higher levels of competence, particularly in digital literacy and technology integration. Although the effects were not uniform across all subdomains, this supports previous evidence that targeted, practice-orientated training is positively associated with higher self-perceived digital skills [26] and enhances teachers’ ability to effectively navigate and use digital tools in educational contexts [2,22].
This study found no significant gender-specific differences in self-perceived digital competences. This is in contrast to previous findings, such as those of O’Doherty et al. [24], where female teachers reported lower levels of confidence and competence in certain digital areas, particularly in the use of mobile technology. The absence of such gender differences in our sample may reflect a gradual shift towards greater gender parity in digital use in the Slovenian academic context. European-level evidence also suggests that gender gaps in educators’ digital competence are narrowing, though not uniformly across contexts, supporting our observation of reduced gender effects in Slovenia [27]. It is plausible that systemic efforts, such as institutional digital literacy strategies and professional development programmes, have contributed to narrowing the previously observed differences [28]. Recent studies support this trend and suggest that the gender digital divide decreases in higher education environments where equal access to technology and structured education is promoted [29,30]. In addition, EU-level initiatives such as the Digital Education Action Plan 2021–2027 [31] have emphasised the promotion of inclusive digital education with the aim of reducing inequalities in digital skills acquisition across gender and socio-demographic boundaries [1]. In contexts with mature institutional and policy frameworks that support digital capacity building, gender may no longer be the most important factor in digital literacy.
The observed negative correlation between age and self-perceived digital competence, particularly in areas such as digital literacy, technology integration and interaction, mirrors similar patterns found in other studies. For example, O’Doherty et al. [24] reported that senior medical teachers scored significantly lower in the creative and mobile subdomains of digital competence, highlighting the need for targeted interventions that recognise age-related differences in digital readiness and support teachers in updating their pedagogical approaches. This pattern resonates with a recent European mapping study showing that older health professionals across EU Member States systematically display lower digital skills, reinforcing the need for age-sensitive faculty development at a European scale [27].
The results of this pilot study have several practical implications for teacher development and curriculum design. First, institutions need to recognise that digital competence is not static but dynamic and is shaped by rapid technological development, including artificial intelligence, telemedicine and immersive learning tools. Therefore, teacher development programmes should go beyond the general teaching of digital skills and include pedagogically relevant, domain-specific applications [3]. Second, the differences in competence according to academic status suggest that training should be differentiated and personalised. Senior faculty members and those in higher positions may benefit from mentoring models or peer learning programmes that promote intergenerational knowledge exchange [23].
Strengthening individual and collective digital competences enables sustainable medical education that goes beyond the mere acquisition of technical skills. Higher levels of digital literacy support flexible and inclusive forms of teaching that maintain continuity of learning during disruptions such as pandemics and provide equitable access for diverse student groups. At the institutional level, digitally competent educators co-create curricula and assessment systems that are resilient, evidence-based and aligned with sustainability principles. These priorities are consistent with the European DigCompEdu framework and the Digital Education Action Plan 2021–2027, which explicitly promote equity, resilience and systemic readiness, core objectives of Sustainable Development Goal 4 (SDG 4): Quality Education [4,31].

4.1. Practical Implications

In line with the principles of sustainable higher education, the results show that long-term pedagogical change depends on coherent and well-supported institutional structures. Sustainable digital teaching capacity requires more than individual motivation. It is strengthened by consistent access to educational technologists, protected time for continuous professional development and formal recognition of achievements in digital pedagogy. Embedding teacher education in a broader institutional and policy framework such as DigCompEdu [2,4] ensures alignment with European policies and contributes to the resilience of education systems in the face of technological and societal change.
Teacher development programmes should be structured in such a way that they offer modular, needs-based and work-integrated learning opportunities. These should address areas such as digital assessment, the ethical and legal use of digital content and the integration of artificial intelligence into medical curricula [32]. For senior faculty members, long-term mentoring and peer learning can help to balance generational differences and strengthen institutional knowledge. Students should be involved as active contributors through peer-teaching programmes and participatory curriculum design, drawing on their experiences as digital natives [23]. In addition, digital competence training should also include legal and ethical knowledge, particularly in relation to the licensing and use of online content, where many teachers reported uncertainty [33]. These recommendations are intended to be integrated flexibly into institutional development plans so that universities can adapt them to their specific contexts and ensure lasting growth of digital competence.
Progress in digital competence should be regularly monitored using validated frameworks such as DigCompEdu [2,4]. This allows institutions to measure development, recognise skills gaps and adapt training to achieve sustainable improvements. A cyclical process of planning, implementation and review helps to ensure that digital skills development is effective and environmentally, socially and educationally sustainable [34,35].

4.2. Study Limitations

This study has several limitations that need to be considered. Although non-parametric tests with a conventional significance threshold (p < 0.05) were used, the large number of comparisons means that occasional false-positive findings cannot be fully excluded. We addressed this by reporting effect sizes and by using the Dwass–Steel–Critchlow–Fligner procedure, which yields p-values already adjusted for multiple pairwise comparisons, but the results should still be interpreted with appropriate caution. The use of self-assessments introduces potential biases, including social desirability and self-assessment inaccuracies, especially in areas such as digital creativity and inclusion, which may be over- or underestimated depending on the respondent’s confidence and interpretation of the scale [36]. Because all variables were derived from the same self-report survey, the risk of common-method bias cannot be completely ruled out. In addition, convenience sampling from two Slovenian medical faculties may not capture the full diversity of experiences, institutional contexts and levels of digital support available to medical educators nationally or internationally. Furthermore, although both Slovenian medical faculties were included, the response rate was modest, resulting in some very small subgroups by academic rank; for these subgroups, dispersion estimates (IQR) are less stable and should be interpreted with caution. Despite these limitations, this pilot study provides important preliminary findings and a valuable basis for designing a more extensive and representative investigation.

5. Conclusions

This study contributes to the growing literature on teachers’ digital competence by providing detailed insight into the strengths and ongoing challenges faced by medical teachers in a European context. Whilst overall motivation and engagement with digital technologies were high, significant differences in competence emerged between academic ranks, with assistant professors outperforming their senior colleagues. These patterns, together with the observed correlations between competence, structured training and regular use of digital tools, underline the importance of targeted, practice-oriented and sustainable professional development. Embedding such initiatives in institutional and policy frameworks, such as the European DigCompEdu model and the EU Digital Education Action Plan, will be critical to building resilient, inclusive and sustainable digital teaching capacity in medical education.

Author Contributions

Conceptualization, S.L. and M.P.; methodology, S.L.; software, S.L.; validation, S.L. and M.P.; formal analysis, S.L.; investigation, S.L. and M.P.; resources, M.P.; data curation, S.L.; writing—original draft preparation, S.L. and M.P.; writing—review and editing, S.L. and M.P.; visualization, S.L.; supervision, S.L. and M.P.; project administration, S.L.; funding acquisition, S.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by ARIS, the Slovenian Research and Innovation Agency, within the project no. J5-4572.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Commission of the University of Primorska for Ethics in Human Subjects Research (Approval No: 4264-16-3/2022). Permission was also granted by the University of Primorska Faculty of Health Sciences for the dissemination of the survey via student emails. The survey complied with the EU General Data Protection Regulation (GDPR) on information privacy. Only the principal investigator had access to the raw data. Participants also had the right to skip any questions they did not wish to answer and could withdraw from the study at any time without any consequences.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data are available from the corresponding author upon reasonable request as the participants were assured that it would remain confidential.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. European Education Area. Higher Education Initiatives. Available online: https://education.ec.europa.eu/sl/node/1717 (accessed on 5 February 2022).
  2. Vuorikari, R.; Kluzer, S.; Punie, Y. DigComp 2.2: The Digital Competence Framework for Citizens—With New Examples of Knowledge, Skills and Attitudes; Publications Office of the European Union: Luxembourg, 2022.
  3. Haleem, A.; Javaid, M.; Qadri, M.A.; Suman, R. Understanding the Role of Digital Technologies in Education: A Review. Sustain. Oper. Comput. 2022, 3, 275–285.
  4. Redecker, C.; Punie, Y. European Framework for the Digital Competence of Educators: DigCompEdu; Publications Office of the European Union: Luxembourg, 2017; ISBN 978-92-79-73494-6.
  5. Fura, B.; Karasek, A.; Hysa, B. Statistical Assessment of Digital Transformation in European Union Countries under Sustainable Development Goal 9. Qual. Quant. 2025, 59, 937–972.
  6. Ličen, S.; Prosen, M. Strengthening Sustainable Higher Education with Digital Technologies: Development and Validation of a Digital Competence Scale for University Teachers (DCS-UT). Sustainability 2024, 16, 9937.
  7. Kraus, S.; Schiavone, F.; Pluzhnikova, A.; Invernizzi, A.C. Digital Transformation in Healthcare: Analyzing the Current State-of-Research. J. Bus. Res. 2021, 123, 557–567.
  8. Saaiq, M.; Khan, R.A.; Yasmeen, R. Digital Teaching: Developing a Structured Digital Teaching Competency Framework for Medical Teachers. Med. Teach. 2024, 46, 1362–1368.
  9. Ogundiya, O.; Rahman, T.J.; Valnarov-Boulter, I.; Young, T.M. Looking Back on Digital Medical Education Over the Last 25 Years and Looking to the Future: Narrative Review. J. Med. Internet Res. 2024, 26, e60312.
  10. Boté-Vericad, J.-J.; Palacios-Rodríguez, A.; Gorchs-Molist, M.; Cejudo-Llorente, C. Comparison of the Teaching of Digital Competences between Health Science Faculties in Andalusia and Catalonia. Educ. Médica 2023, 24, 100791.
  11. Çebi, A.; Reisoğlu, İ. Defining “Digitally Competent Teacher”: An Examination of Pre-Service Teachers’ Metaphor. J. Digit. Learn. Teach. Educ. 2022, 38, 185–198.
  12. Asal, M.G.R.; Alsenany, S.A.; Elzohairy, N.W.; El-Sayed, A.A.I. The Impact of Digital Competence on Pedagogical Innovation among Nurse Educators: The Moderating Role of Artificial Intelligence Readiness. Nurse Educ. Pract. 2025, 85, 104367.
  13. Maaß, L.; Grab-Kroll, C.; Koerner, J.; Öchsner, W.; Schön, M.; Messerer, D.; Böckers, T.M.; Böckers, A. Artificial Intelligence and ChatGPT in Medical Education: A Cross-Sectional Questionnaire on Students’ Competence. J. CME 2025, 14, 2437293.
  14. Pramila-Savukoski, S.; Kärnä, R.; Kuivila, H.-M.; Juntunen, J.; Koskenranta, M.; Oikarainen, A.; Mikkonen, K. The Influence of Digital Learning on Health Sciences Students’ Competence Development—A Qualitative Study. Nurse Educ. Today 2023, 120, 105635.
  15. Potter, A.; Munsch, C.; Watson, E.; Hopkins, E.; Kitromili, S.; O’Neill, I.C.; Larbie, J.; Niittymaki, E.; Ramsay, C.; Burke, J.; et al. Identifying Research Priorities in Digital Education for Health Care: Umbrella Review and Modified Delphi Method Study. J. Med. Internet Res. 2025, 27, e66157.
  16. Ylönen, M.; Forsman, P.; Karvo, T.; Jarva, E.; Antikainen, T.; Kulmala, P.; Mikkonen, K.; Kärkkäinen, T.; Hämäläinen, R. Social Services and Healthcare Personnel’s Digital Competence Profiles: A Finnish Cross-Sectional Study. Int. J. Med. Inf. 2025, 193, 105658.
  17. Wang, X.; Cheng, Z. Cross-Sectional Studies: Strengths, Weaknesses, and Recommendations. Chest 2020, 158, S65–S71.
  18. Hanaysha, J.R.; Shriedeh, F.B.; In’airat, M. Impact of Classroom Environment, Teacher Competency, Information and Communication Technology Resources, and University Facilities on Student Engagement and Academic Performance. Int. J. Inf. Manag. Data Insights 2023, 3, 100188.
  19. Stumbrienė, D.; Jevsikova, T.; Kontvainė, V. Key Factors Influencing Teachers’ Motivation to Transfer Technology-Enabled Educational Innovation. Educ. Inf. Technol. 2024, 29, 1697–1731.
  20. Garzón Artacho, E.; Martínez, T.S.; Ortega Martín, J.L.; Marín Marín, J.A.; Gómez García, G. Teacher Training in Lifelong Learning—The Importance of Digital Competence in the Encouragement of Teaching Innovation. Sustainability 2020, 12, 2852.
  21. Ersoy, H.; Baskici, C.; Aytar, A.; Strods, R.; Jansone Ratinika, N.; Manuel Lopes Fernandes, A.; Neves, H.; Blaževičienė, A.; Vaškelytė, A.; Wikström-Grotell, C.; et al. Digital Competence of Faculty Members in Health Sciences Measured via Self-Reflection: Current Status and Contextual Aspects. PeerJ 2024, 12, e18456.
  22. Ramírez-Montoya, M.-S.; Mena, J.; Rodríguez-Arroyo, J.A. In-Service Teachers’ Self-Perceptions of Digital Competence and OER Use as Determined by a xMOOC Training Course. Comput. Hum. Behav. 2017, 77, 356–364.
  23. Hautz, S.C.; Hoffmann, M.; Exadaktylos, A.K.; Hautz, W.E.; Sauter, T.C. Digital Competencies in Medical Education in Switzerland: An Overview of the Current Situation. GMS J. Med. Educ. 2020, 37, Doc62.
  24. O’Doherty, D.; Lougheed, J.; Hannigan, A.; Last, J.; Dromey, M.; O’Tuathaigh, C.; McGrath, D. Internet Skills of Medical Faculty and Students: Is There a Difference? BMC Med. Educ. 2019, 19, 39.
  25. Car, J.; Ong, Q.C.; Erlikh Fox, T.; Leightley, D.; Kemp, S.J.; Švab, I.; Tsoi, K.K.F.; Sam, A.H.; Kent, F.M.; Hertelendy, A.J.; et al. The Digital Health Competencies in Medical Education Framework: An International Consensus Statement Based on a Delphi Study. JAMA Netw. Open 2025, 8, e2453131.
  26. Cattaneo, A.A.P.; De Jong, F.; Ramos, J.L.; Laitinen-Väänänen, S.; Pedaste, M.; Leijen, Ä.; Evi-Colombo, A.; Monginho, R.; Bent, M.; Velasquez-Godinez, E.; et al. Video-Based Collaborative Learning: A Pedagogical Model and Instructional Design Tool Emerging from an International Multiple Case Study. Eur. J. Teach. Educ. 2022, 47, 466–490.
  27. Kaihlanen, A.-M.; Hietapakka, L.; Heponiemi, T. Increasing Cultural Awareness: Qualitative Study of Nurses’ Perceptions about Cultural Competence Training. BMC Nurs. 2019, 18, 38.
  28. Campos, D.G.; Scherer, R. Digital Gender Gaps in Students’ Knowledge, Attitudes and Skills: An Integrative Data Analysis across 32 Countries. Educ. Inf. Technol. 2024, 29, 655–693.
  29. Grande-de-Prado, M.; Cañón, R.; García-Martín, S.; Cantón, I. Digital Competence and Gender: Teachers in Training. A Case Study. Future Internet 2020, 12, 204.
  30. Norhagen, S.L.; Krumsvik, R.J.; Røkenes, F.M. Developing Professional Digital Competence in Norwegian Teacher Education: A Scoping Review. Front. Educ. 2024, 9.
  31. European Education Area. Digital Education Action Plan (2021–2027). Available online: https://education.ec.europa.eu/node/1518 (accessed on 25 November 2022).
  32. Rincón, E.H.H.; Jimenez, D.; Aguilar, L.A.C.; Flórez, J.M.P.; Tapia, Á.E.R.; Peñuela, C.L.J. Mapping the Use of Artificial Intelligence in Medical Education: A Scoping Review. BMC Med. Educ. 2025, 25, 526.
  33. Guillén-Gámez, F.D.; Colomo-Magaña, E.; Cívico-Ariza, A.; Linde-Valenzuela, T. Which Is the Digital Competence of Each Member of Educational Community to Use the Computer? Which Predictors Have a Greater Influence? Technol. Knowl. Learn. 2024, 29, 1–20.
  34. Martín Párraga, L.; Llorente Cejudo, C.; Barroso Osuna, J. Validation of the DigCompEdu Check-in Questionnaire through Structural Equations: A Study at a University in Peru. Educ. Sci. 2022, 12, 574.
  35. Markauskaite, L.; Carvalho, L.; Fawns, T. The Role of Teachers in a Sustainable University: From Digital Competencies to Postdigital Capabilities. Educ. Technol. Res. Dev. 2023, 71, 181–198.
  36. Malmqvist, J.; Hellberg, K.; Möllås, G.; Rose, R.; Shevlin, M. Conducting the Pilot Study: A Neglected Part of the Research Process? Methodological Findings Supporting the Importance of Piloting in Qualitative Research Studies. Int. J. Qual. Methods 2019, 18, 1–11.
Table 1. Medians with 95% bootstrap confidence intervals for the DCS-UT total scale and its subscales (n = 48).

| Variable | k | Mdn | IQR | Min/Max | 95% CI Lower | 95% CI Upper |
|---|---|---|---|---|---|---|
| Digital literacy | 17 | 57.0 | 25.0 | 27/85 | 48.0 | 64.0 |
| Digital skills | 6 | 21.0 | 8.0 | 11/30 | 18.0 | 24.0 |
| Digital interaction | 6 | 22.0 | 9.0 | 11/30 | 19.0 | 24.0 |
| Technology integration | 6 | 20.0 | 9.0 | 10/30 | 19.0 | 24.0 |
| DCS-UT scale | 35 | 113.0 | 52.0 | 61/175 | 100.0 | 139.0 |

Note. k—number of items; Mdn—Median; IQR—Interquartile Range; Min/Max—observed minimum and maximum; 95% CI—95% bootstrap confidence interval based on 1000 resamples.
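As a concrete illustration of the bootstrap procedure described in the note to Table 1, the sketch below computes a percentile bootstrap confidence interval for a median using 1000 resamples. The `scores` array is invented for illustration only and is not the study’s data; the function name and seed are likewise assumptions.

```python
import numpy as np

def bootstrap_median_ci(scores, n_resamples=1000, alpha=0.05, seed=42):
    """Percentile bootstrap CI for a median (sketch of the Table 1 procedure)."""
    rng = np.random.default_rng(seed)
    n = len(scores)
    # Resample with replacement and record the median of each resample
    medians = [np.median(rng.choice(scores, size=n, replace=True))
               for _ in range(n_resamples)]
    lower = np.percentile(medians, 100 * alpha / 2)
    upper = np.percentile(medians, 100 * (1 - alpha / 2))
    return np.median(scores), (lower, upper)

# Hypothetical digital-literacy item totals (illustrative only)
scores = np.array([57, 48, 64, 27, 85, 53, 60, 41, 70, 55])
mdn, (lo, hi) = bootstrap_median_ci(scores)
```

With small samples, as in the study’s rank subgroups, the resulting interval is wide, which is one reason the note urges caution when interpreting dispersion for groups with n < 5.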
Table 2. Differences in the assessment of self-perceived digital competence according to self-reported integration, frequency of use and interest in digital technologies for teaching: Kruskal–Wallis test.

| Variable | n | DL Mdn [IQR] | DS Mdn [IQR] | DI Mdn [IQR] | TI Mdn [IQR] | DCS-UT Mdn [IQR] |
|---|---|---|---|---|---|---|
| Integration of digital technology in teaching | | | | | | |
| Frustrated learner | 4 | 51.0 [17.0] | 21.0 [8.0] | 17.0 [7.0] | 19.0 [5.0] | 107.0 [69.0] |
| Begins to understand | 5 | 50.0 [14.0] | 18.0 [9.0] | 18.0 [8.0] | 18.0 [2.0] | 99.0 [40.0] |
| Gain confidence | 14 | 45.0 [16.0] | 16.0 [7.0] | 19.0 [8.0] | 19.0 [7.0] | 99.0 [30.0] |
| Recognise effectiveness | 11 | 63.0 [28.0] | 21.0 [6.0] | 24.0 [6.0] | 19.0 [8.0] | 115.0 [38.0] |
| Skilful integration | 14 | 72.0 [18.0] | 25.0 [4.0] | 28.0 [6.0] | 25.5 [6.0] | 149.0 [38.0] |
| H(df) | | 13.452 (4) | 16.518 (4) | 16.201 (4) | 10.257 (4) | 9.608 (4) |
| p | | 0.009 | 0.002 | 0.003 | 0.036 | 0.048 |
| η² | | 0.289 | 0.359 | 0.345 | 0.220 | 0.218 |
| Frequency of use of digital technology in teaching | | | | | | |
| Rarely | 5 | 37.0 [17.0] | 15.0 [8.0] | 15.0 [4.0] | 20.0 [10.0] | 99.0 [38.0] |
| Occasionally | 10 | 45.0 [18.0] | 17.0 [9.0] | 18.5 [9.0] | 18.0 [5.0] | 103.0 [17.0] |
| Frequently | 19 | 57.0 [25.0] | 21.0 [8.0] | 24.0 [6.0] | 21.0 [5.0] | 100.0 [50.0] |
| Almost always | 4 | 68.0 [41.0] | 24.5 [11.0] | 26.0 [13.0] | 23.5 [12.0] | 160.0 [65.0] |
| Always | 10 | 70.5 [17.0] | 25.0 [5.0] | 27.0 [6.0] | 25.5 [10.0] | 148.0 [43.0] |
| H(df) | | 16.136 (4) | 18.568 (4) | 17.933 (4) | 9.164 (4) | 9.756 (4) |
| p | | 0.003 | <0.001 | 0.001 | 0.057 | 0.045 |
| η² | | 0.343 | 0.395 | 0.382 | 0.195 | 0.208 |
| Interest in learning and using digital technologies for teaching purposes | | | | | | |
| Slightly interested and not a priority | 3 | 27.0 [0.0] | 11.0 [0.0] | 11.0 [0.0] | 12.0 [0.0] | 107.0 [0.0] |
| Moderately interested and willing to learn | 15 | 46.0 [14.0] | 18.0 [8.0] | 19.0 [5.0] | 18.0 [7.0] | 107.0 [62.0] |
| Fairly interested and actively seeking opportunities | 13 | 50.0 [25.0] | 16.0 [8.0] | 22.0 [7.0] | 19.0 [6.0] | 99.0 [36.0] |
| Highly interested and motivated | 17 | 68.0 [18.0] | 24.0 [4.0] | 24.0 [6.0] | 24.0 [10.0] | 140.0 [46.0] |
| H(df) | | 18.594 (3) | 16.471 (3) | 14.182 (3) | 13.425 (3) | 14.148 (3) |
| p | | <0.001 | <0.001 | 0.003 | 0.004 | 0.003 |
| η² | | 0.396 | 0.350 | 0.302 | 0.286 | 0.301 |

Note. Mdn—Median; IQR—Interquartile Range; H—Kruskal–Wallis test statistic; df—Degrees of Freedom; p—Statistical Significance; η²—eta squared; DL—Digital literacy; DS—Digital skills; DI—Digital interaction; TI—Technology integration; DCS-UT—The Digital Competence Scale for University Teachers. Cells with n < 5 and/or IQR = 0.0 indicate very small groups for which the IQR should be interpreted with caution. Integration of digital technology in teaching: Knowing but not using—knows about digital technology but does not yet use it for teaching, possibly even avoids it (only one respondent chose this option, so it was excluded from further analysis). Frustrated learner—often feels frustrated or unsure when trying to learn how to use digital technology for teaching. Begins to understand—begins to understand how to use digital tools for specific teaching tasks. Gain confidence—becomes more confident in using digital technology for specific teaching tasks. Recognise effectiveness—sees digital technology as an effective tool that supports teaching, not just a device. Skilful integration—uses digital knowledge and different tools safely and effectively for teaching purposes.
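Kruskal–Wallis statistics of the kind reported in Tables 2 and 3 can be sketched as below. The eta-squared effect size is computed with the common convention η² = (H − k + 1)/(n − k) for k groups and n total observations, which is assumed here to match the paper’s reporting; the group scores are hypothetical, not the study’s data.

```python
from scipy import stats

def kruskal_with_eta2(*groups):
    """Kruskal-Wallis H test plus eta-squared effect size,
    eta2 = (H - k + 1) / (n - k), for k groups and n observations."""
    h, p = stats.kruskal(*groups)
    k = len(groups)
    n = sum(len(g) for g in groups)
    eta2 = (h - k + 1) / (n - k)
    return h, p, eta2

# Hypothetical subscale scores for three frequency-of-use groups
rarely     = [37, 35, 40, 33, 38]
frequently = [57, 52, 60, 55, 49, 58]
always     = [70, 68, 72, 66, 71]
h, p, eta2 = kruskal_with_eta2(rarely, frequently, always)
```

Because the hypothetical groups are fully separated in rank, H approaches its maximum and η² is large; the study’s observed effect sizes (roughly 0.2 to 0.4) correspond to more overlapping distributions.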
Table 3. Median values of the digital competence subscales and the DCS-UT total score in relation to academic status: Kruskal–Wallis test.

| Academic status | n | DL Mdn [IQR] | DS Mdn [IQR] | DI Mdn [IQR] | TI Mdn [IQR] | DCS-UT Mdn [IQR] |
|---|---|---|---|---|---|---|
| Assistant professor | 26 | 66.5 [26.0] | 24.0 [6.0] | 24.0 [9.0] | 24.0 [9.0] | 139.0 [56.0] |
| Associate professor | 18 | 48.0 [15.0] | 17.5 [7.0] | 18.5 [11.0] | 18.5 [5.0] | 99.0 [26.0] |
| Full professor | 4 | 39.5 [11.0] | 17.0 [7.0] | 15.5 [4.0] | 16.5 [11.0] | 98.5 [12.0] |
| H(df) | | 12.925 (2) | 11.900 (2) | 12.677 (2) | 9.503 (2) | 11.941 (2) |
| p | | 0.002 | 0.003 | 0.002 | 0.009 | 0.003 |
| η² | | 0.275 | 0.253 | 0.270 | 0.202 | 0.254 |

Note. Mdn—Median; IQR—Interquartile Range; H—Kruskal–Wallis test statistic; df—Degrees of Freedom; p—Statistical Significance; η²—eta squared; DL—Digital literacy; DS—Digital skills; DI—Digital interaction; TI—Technology integration; DCS-UT—The Digital Competence Scale for University Teachers.
Table 4. Spearman’s correlations between demographic variables and dimensions of Digital Competence (DCS-UT).

| Variable | DL | DS | DI | TI | DCS-UT |
|---|---|---|---|---|---|
| Age | –0.787 ** | –0.807 ** | –0.616 ** | –0.587 ** | –0.814 ** |
| Years of teaching experience | –0.422 ** | –0.456 ** | –0.375 ** | –0.347 * | –0.419 ** |
| Years of e-learning experience | 0.062 | 0.099 | 0.172 | 0.239 * | 0.041 |

Note. ** Correlation is significant at the 0.01 level; * Correlation is significant at the 0.05 level; DL—Digital literacy; DS—Digital skills; DI—Digital interaction; TI—Technology integration; DCS-UT—The Digital Competence Scale for University Teachers.
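Rank correlations of the kind reported in Table 4 can be computed directly with `scipy.stats.spearmanr`. The paired vectors below are hypothetical, constructed only to mirror the negative monotone association between age and total DCS-UT score; they are not the study’s data.

```python
from scipy import stats

# Hypothetical paired observations: age vs. total DCS-UT score (illustrative only)
age    = [32, 38, 41, 45, 50, 55, 58, 62]
dcs_ut = [160, 152, 140, 135, 120, 110, 99, 95]

# Spearman's rho is Pearson's correlation applied to the ranks,
# so it captures any monotone (not just linear) association
rho, p = stats.spearmanr(age, dcs_ut)
```

A strongly negative rho here mirrors the direction of the Age row in Table 4; because rank correlation ignores the spacing of values, it is robust to the skewed, ordinal-like scores a Likert-based instrument produces.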
Citation: Ličen, S.; Prosen, M. Promoting Sustainable Medical Education Through Digital Competence: A Cross-Sectional Pilot Study. Sustainability 2025, 17, 8699. https://doi.org/10.3390/su17198699