Article

Towards Digital Transformation in University Teaching: Diagnosis of the Level and Profile of Digital Competence Based on the DigCompEdu and OpenEdu Frameworks Among University Lecturers in Chile

by Irma Riquelme-Plaza 1,* and Jesús Marolla-Gajardo 2

1 Interdisciplinary Center for Educational Innovation (CIED), Universidad Santo Tomás, Santiago de Chile 7550611, Chile
2 Center for Historical Studies and Humanities, Faculty of Human Sciences, Universidad Bernardo O’Higgins, Santiago de Chile 870006, Chile
* Author to whom correspondence should be addressed.
Educ. Sci. 2026, 16(2), 174; https://doi.org/10.3390/educsci16020174
Submission received: 28 December 2025 / Revised: 15 January 2026 / Accepted: 16 January 2026 / Published: 23 January 2026
(This article belongs to the Section Teacher Education)

Abstract

This study diagnoses the level and profile of university lecturers’ digital competence at a Chilean higher education institution, drawing on the DigCompEdu and OpenEdu frameworks. A non-experimental correlational design was used, based on a self-perception questionnaire adapted from the DigCompEdu Check-In tool and administered to 569 lecturers through the Qualtrics platform. The instrument underwent external expert validation and demonstrated excellent internal consistency (Cronbach’s α = 0.96). Results indicate that 44% of lecturers position themselves at the “Integrator” level, 22% at the “Explorer” level, and 19% at the “Expert” level, with three clearly differentiated competence profiles. These findings informed the development of a structured training programme centred on three components: the pedagogical use of digital technologies, the incorporation of open educational practices aligned with OpenEdu, and the strengthening of students’ digital competence. The programme includes modular workshops, mentoring led by high-competence lecturers, and the creation of open educational resources. Overall, the study provides empirical evidence to guide institutional policies and to foster a reflective, ethical, and pedagogically grounded integration of digital technologies in university teaching.

1. Introduction

1.1. Conceptual and Analytical Framework

In a global scenario characterised by accelerated digital transformation, the development of digital competences has become a central requirement for twenty-first-century citizenship. Beyond the ability to access information, individuals must critically interpret, ethically manage, and collaboratively produce knowledge within increasingly complex digital ecosystems (Howard & Tondeur, 2023; Vuorikari et al., 2022). Higher education institutions therefore face the challenge of redefining their formative role by cultivating digitally literate, reflexive, and socially responsible citizens.
Within this process, universities have progressively reconfigured organisational structures, institutional cultures, and pedagogical practices, moving towards models grounded in the meaningful integration of digital technologies rather than the formerly dominant ICT-centred paradigm. Evidence consistently shows that teachers’ attitudes towards technology are among the strongest predictors of its effective pedagogical appropriation, influencing both their willingness to innovate and their capacity for instructional redesign (Teo, 2019; Huang et al., 2019). It is important to emphasise, however, that a positive disposition towards digital technologies does not in itself guarantee effective teaching or meaningful learning. In this study, digital technologies are conceptualised not as ends in themselves, but as pedagogical tools whose educational value depends on their alignment with learning objectives, instructional design, and students’ learning processes.
The digital competence of university lecturers thus emerges as a key enabling condition for institutional digital transformation. Strengthening lecturers’ digital competence is essential for the development of educational environments that integrate technological, pedagogical, and organisational components in a coherent and sustainable manner (UNESCO, 2023; Falloon, 2020; Bernsteiner et al., 2025). Mere access to technological resources is insufficient; successful transformation requires robust teaching practices supported by reflective and ethically informed digital decision-making.
This situation underscores the importance of systematic professional development, as students’ operational familiarity with digital tools does not necessarily translate into pedagogically meaningful academic use when teaching practices are not supported by adequate levels of teachers’ digital competence. The COVID-19 pandemic made this disparity particularly visible, revealing heterogeneous levels of digital competence among university teaching staff and highlighting the need for more structured and context-sensitive training opportunities (Selwyn, 2020).
University lecturers are widely recognised as occupying a strategic position in the development of reflective, inclusive, and innovative pedagogical practices, particularly within student-centred educational models. The DigCompEdu framework offers an analytical lens for understanding this role across six interrelated areas—from professional engagement to the facilitation of learners’ digital competence—emphasising that digital competence is a multidimensional construct encompassing pedagogical, technical, and ethical dimensions (Redecker, 2017; Salinas-Ibáñez et al., 2022).
Internationally, the European DigCompEdu Framework and the Opening Up Education Framework (OpenEdu) have guided policy design and institutional planning, establishing shared criteria for the development of digital and open educational practices (Inamorato dos Santos & Punie, 2016). Complementarily, UNESCO’s ICT Competency Framework for Teachers provides a global benchmark for professional standards, advocating a comprehensive and equity-driven approach to digital competence (UNESCO, 2023). However, in Latin America—and particularly in Chile—regulatory progress in higher education has been slower and more fragmented. Although standards exist for initial teacher education, universities still lack a unified framework for diagnosing and strengthening lecturers’ digital competence (Inamorato dos Santos & Punie, 2016).
In addition to DigCompEdu, several international frameworks have contributed to defining teacher digital competence, including UNESCO’s policy-oriented approaches to digital education (Claro & Castro-Grau, 2023); the Technological Pedagogical Content Knowledge (TPACK) framework, which conceptualises the integration of technological, pedagogical, and content knowledge in teaching practice (Mishra & Koehler, 2006); and the ISTE Standards for Educators, which provide performance-oriented guidelines for educators’ professional use of digital technologies (International Society for Technology in Education, 2017).
In this context, this study adopts the European Framework for the Digital Competence of Educators (DigCompEdu) as its primary analytical reference, given its structured domain-based model and progressive competence levels, which enable diagnostic and comparative analysis in higher education settings (Redecker, 2017; Cabero-Almenara & Palacios-Rodríguez, 2019). In addition, this paper incorporates the OpenEdu Framework as a complementary perspective, as it explicitly addresses dimensions of openness—such as access, reuse, and the creation of digital resources—that have been identified in the literature as increasingly relevant for universities undergoing digital transformation (Inamorato dos Santos, 2019; Santos-Hermosa et al., 2020). On this basis, this study posits that the combined use of DigCompEdu and OpenEdu provides a coherent and contextually appropriate conceptual foundation for the analysis developed in this article.
At the same time, it is important to acknowledge that the principles underpinning OpenEdu—such as open access, open-source software, open educational resources, and the reuse of digital assets—often encounter significant institutional, cultural, and economic resistance. In many national contexts, including those outside the European Union, universities and school systems remain structurally dependent on commercial platforms, proprietary software, and closed content ecosystems. While open approaches are widely recognised as beneficial, inclusive, and effective in reducing the financial burden borne by institutions, families, and students, their implementation is neither straightforward nor universally supported. This tension constitutes an additional challenge that must be explicitly recognised when adopting OpenEdu as an analytical and aspirational framework.
Despite the relevance of these frameworks, DigCompEdu was selected as the primary analytical reference because it offers a more granular and structured model for diagnostic purposes in higher education. Its domain-based architecture and progressive competence levels allow for institutional profiling, comparative analysis, and the identification of differentiated professional development needs. The incorporation of OpenEdu as a complementary dimension further extends this analytical scope by addressing openness and resource-sharing practices that are increasingly relevant in contemporary universities. Taken together, this combination reflects a methodological choice based on analytical suitability and pedagogical coherence, rather than on geopolitical preference.
As a result, institutional practices tend to remain predominantly instrumental, with limited impact on learning when not supported by coherent pedagogical planning and contextualised training (Cabero-Almenara et al., 2020). Furthermore, digital competence development is shaped by structural, attitudinal, and contextual variables—including age, gender, teaching experience, discipline, and institutional conditions—which contribute to heterogeneous levels of digital maturity among academic staff (Marín-Díaz et al., 2021; Lucas et al., 2021; Saikkonen & Kaarakainen, 2021).

1.2. Aim of the Study

Considering this complex landscape, the present study aims to systematically diagnose university lecturers’ digital competence at a private Chilean university, using the DigCompEdu framework as the primary analytical reference and incorporating the OpenEdu dimension to account for open educational practices. The study adopts a self-perception approach in order to examine competence levels, identify differentiated lecturer profiles, and explore their relationship with selected sociodemographic and professional variables.
However, it is important to acknowledge that the adoption of the OpenEdu Framework also presents challenges, particularly in non-European contexts where policies related to open access, open source, free software, and the use of digital assets such as 3D printing often encounter institutional, cultural, or economic resistance (Inamorato dos Santos, 2019; Santos-Hermosa et al., 2020). Although these principles promote inclusion, reduce financial pressure on educational institutions and families, and enhance opportunities for innovation, their implementation is far from uniform across countries (Selwyn, 2020; Claro & Castro-Grau, 2023). Many higher education systems still face obstacles related to technological infrastructure, regulatory frameworks, and entrenched commercial dependencies. Recognising these difficulties is essential to contextualise the incorporation of OpenEdu within the Chilean higher education landscape and to avoid assuming that the benefits associated with openness can be achieved without addressing the broader structural constraints that shape local educational ecosystems. Based on this general aim, the study is guided by the following specific objectives, which are explicitly aligned with the DigCompEdu competence areas and the complementary OpenEdu dimension:
(a) to analyse the relationship between digital competence and sociodemographic and professional variables; (b) to establish lecturer profiles based on their level of digital competence; and (c) to characterise lecturers’ competence levels across the DigCompEdu and OpenEdu dimensions in order to inform institutionally relevant training strategies.
Accordingly, the study addresses the following research questions:
  • (RQ1) How are university lecturers’ levels of digital competence related to sociodemographic and professional variables?
  • (RQ2) What digital competence profiles can be identified among university lecturers based on the DigCompEdu framework and the OpenEdu dimension?
  • (RQ3) How do lecturers’ self-perceived competence levels vary across the different DigCompEdu competence areas?
Although the DigCompEdu Framework was originally developed within the European Union, its adoption in this study should not be interpreted as a form of geopolitical alignment or normative transfer. We explicitly acknowledge that the use of European policy frameworks in non-EU contexts may raise geopolitical and epistemic sensitivities. However, the selection of DigCompEdu in the Chilean context is grounded in methodological and analytical criteria rather than in regional affiliation. Specifically, DigCompEdu offers a conceptually robust, pedagogically grounded, and internationally comparable structure, with clearly articulated competence areas and progression levels that support diagnostic and institutional analysis in higher education. Its growing uptake beyond Europe reflects its capacity to function as a shared analytical language rather than as a prescriptive policy model tied to a specific geopolitical space.
It is also necessary to acknowledge that “Europe” is not a monolithic entity and that significant differences exist across its educational systems, including in relation to inclusion policies, public investment, teacher training policies, and levels of digital maturity (Howard & Tondeur, 2023; Salinas-Ibáñez et al., 2022). DigCompEdu does not reflect the specific circumstances of any single European country—whether Italy, Spain, or the Scandinavian nations—but rather an aggregated set of principles developed by the Joint Research Centre to provide a common reference point across heterogeneous systems (Redecker, 2017). For this reason, its use in the present study does not assume European uniformity nor replicate the priorities of any particular national model. Instead, it draws on DigCompEdu as a technical and pedagogical framework whose structure facilitates comparison and internal evaluation while remaining open to contextual adaptations that respond to the specific needs, constraints, and institutional realities of Chilean higher education (Cabero-Almenara & Palacios-Rodríguez, 2019).

2. Materials and Methods

2.1. Study Design

This study employed a non-experimental, cross-sectional, and correlational design. This approach is appropriate for investigating the association between lecturers’ self-perceived digital competence and a set of sociodemographic and professional variables without manipulating conditions or assigning participants to experimental groups. The design is coherent with the study’s objectives, which seek to diagnose existing competence levels, explore their relationships with contextual factors, and identify naturally occurring teaching profiles.
A correlational design is widely used in research on digital competence when the aim is to analyse patterns and interactions within complex educational ecosystems (Liu et al., 2020). The cross-sectional nature of the study also aligns with institutional diagnostic processes, allowing the collection of data from a large and heterogeneous academic population at a single point in time. Furthermore, the use of a self-perception questionnaire is consistent with established diagnostic approaches in the field, particularly within frameworks such as DigCompEdu, which conceptualise digital competence as a situated, context-dependent construct (Redecker, 2017).
Finally, the combination of descriptive, correlational, and cluster analyses is methodologically suited to identifying differentiated groups of lecturers based on their digital competence profiles, a procedure frequently used in higher education research to inform targeted professional development strategies (Marín-Díaz et al., 2021; Saikkonen & Kaarakainen, 2021).

2.2. Procedure

The methodology was developed in two stages. The first stage involved a correlational analysis between sociodemographic variables and levels of digital competence (Cabero-Almenara et al., 2020). In the second stage, a two-step cluster analysis was applied to identify three distinct groups according to their competence levels (Ñaupas et al., 2018). From an epistemological perspective, the study adopts a quantitative empirical approach oriented towards the systematic analysis of observable patterns in teachers’ digital competence, without assuming full objectivity or exhaustiveness in the interpretation of social phenomena.

2.3. Participants

The study involved a non-probabilistic convenience sample of 569 lecturers from a private Chilean university, corresponding to 15.9% of the total academic staff (N = 3566). Participation was voluntary, and lecturers from all faculties and regions of the institution were included, ensuring diversity across disciplinary areas, employment conditions, and teaching responsibilities.
According to the institutional diagnostic process, the sample is predominantly female (57%), with most lecturers belonging to the 30–49 age range (62%). In disciplinary terms, the largest proportion of participants work in Health Sciences (45%), followed by Social and Legal Sciences (24%), Sciences (16%), and Engineering and Architecture (4%). These distributions reflect the composition of the university’s academic workforce.
Although the full institutional diagnostic report provides a detailed analysis of the academic staff’s digital competence, the present manuscript includes the essential descriptive characteristics required to contextualise the sample: gender distribution, disciplinary affiliation, and the general age ranges of participants. Since age is one of the variables analysed in the study, the corresponding frequencies and percentages are summarised for transparency (Table 1).
The type of contract is also considered in the analysis; the corresponding data are therefore presented in Table 2.
These characteristics provide an adequate descriptive foundation for the inferential and multivariate analyses carried out in the study, including the exploration of associations between digital competence and sociodemographic variables.

2.4. Instruments

The instrument used in this study was a structured questionnaire based on the DigCompEdu and OpenEdu frameworks, adapted from the DigCompEdu Check-In model originally developed by Redecker (2017). The adaptation process was carried out by the research team and consisted of two stages. First, the wording of several items was adjusted to reflect terminology commonly used in Chilean higher education institutions, with the aim of ensuring linguistic clarity and contextual relevance for local lecturers. Second, an additional dimension related to Open Education was incorporated, following the conceptual guidelines of the OpenEdu Framework, in order to capture practices associated with openness, including the use, creation, and sharing of open educational resources.
With regard to validity, the adaptation was guided by criteria of content validity and conceptual coherence. All items were reviewed to ensure their alignment with the theoretical constructs underlying the DigCompEdu and OpenEdu frameworks, as well as their relevance to the institutional and educational context under study. This process sought to preserve the original structure and meaning of the instrument while enhancing its suitability for the Chilean higher education setting.
Following these adaptations, the instrument underwent a validation process to assess its coherence and suitability for the national context. A panel of academic specialists with experience in digital competence and university teaching reviewed the modified items, examining their clarity, relevance, and alignment with the DigCompEdu descriptors. Feedback obtained during this expert-judgement procedure led to minor adjustments in wording but did not involve substantive changes to the conceptual structure of the questionnaire.
The final version of the instrument consisted of 25 items assessing the six areas of DigCompEdu—Professional Engagement, Digital Resources, Teaching and Learning, Assessment and Feedback, Empowering Learners, and Facilitating Learners’ Digital Competence—plus one additional area for Open Education. Thirteen sociodemographic questions were included to support the analysis of competence-related differences across participant groups.
The questionnaire was administered digitally via the Qualtrics platform.
The structure of the instrument comprised three main sections:
  • Section 1: A conceptual overview of the DigCompEdu framework to guide respondents’ understanding of competence levels.
  • Section 2: A set of 25 items assessing the DigCompEdu areas, together with 13 sociodemographic questions:
    - Area 1: Professional Engagement;
    - Area 2: Digital Resources;
    - Area 3: Teaching and Learning;
    - Area 4: Assessment and Feedback;
    - Area 5: Empowering Learners;
    - Area 6: Facilitating Learners’ Digital Competence;
    - Area 7: Open Education (based on the OpenEdu Framework).
  • Section 3: A set of 11 questions addressing three aspects related to respondents’ digital competence: perceptions of technology use, institutional conditions, and methodological support.
This structure enabled a comprehensive analysis by linking the level of digital competence with lecturers’ characteristics. The reliability of the questionnaire was evaluated using Cronbach’s alpha coefficient, and the results are presented in Table 3.
The obtained alpha value of 0.96 indicates excellent internal consistency for the instrument, supporting its reliability as a measurement tool (Marín-Díaz et al., 2020). Because Cronbach’s alpha assumes tau-equivalence and is sensitive to test length, McDonald’s omega (ω = 0.91) was also calculated, as it provides a more theoretically robust reliability estimate for multidimensional instruments. The omega value further confirms the internal consistency of the instrument and strengthens the psychometric rigour of the measures used in this study.
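As a point of reference, the short Python sketch below shows one common way of estimating both coefficients from item-level responses. It is a minimal illustration under stated assumptions (the analyses reported in this article were run in SAS v. 9.4), and the DataFrame name, the simulated data, and the single-factor model used for omega are hypothetical choices made for this example.

```python
# Illustrative reliability check in Python; the study itself used SAS v. 9.4.
# Assumes `items` is a pandas DataFrame with one column per questionnaire item
# (Likert-type responses) and one row per respondent, with no missing values.
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis


def cronbach_alpha(items: pd.DataFrame) -> float:
    """Classical alpha: k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)


def mcdonald_omega(items: pd.DataFrame) -> float:
    """Omega total from a single-factor model:
    (sum of loadings)^2 / ((sum of loadings)^2 + sum of unique variances)."""
    z = (items - items.mean()) / items.std(ddof=1)      # standardise items
    fa = FactorAnalysis(n_components=1).fit(z)
    loadings = fa.components_[0]
    return loadings.sum() ** 2 / (loadings.sum() ** 2 + fa.noise_variance_.sum())


# Example with simulated Likert-type data (569 respondents x 25 items).
rng = np.random.default_rng(0)
latent = rng.normal(size=(569, 1))
items = pd.DataFrame(latent + rng.normal(scale=0.8, size=(569, 25))).round()
print(f"alpha = {cronbach_alpha(items):.2f}, omega = {mcdonald_omega(items):.2f}")
```

The single-factor specification mirrors the use of omega here as an overall reliability estimate; subscale-level coefficients for the seven areas would require a confirmatory model of the instrument’s structure.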

2.5. Statistical Analysis

The statistical analysis was conducted in three stages: descriptive, correlational, and inferential/multivariate. First, the sample was characterised by sociodemographic variables and levels of digital competence, identifying general trends (Baptista et al., 2014; Cabero-Almenara & Palacios-Rodríguez, 2019). Next, correlation analyses were performed to explore significant relationships among key variables (Gisbert-Cervera & Esteve-Mon, 2022; Liu et al., 2020). Finally, hypothesis testing and a two-step cluster analysis were conducted, allowing for the identification of homogeneous profiles of teachers’ digital competence and the specification of training needs (Ñaupas et al., 2018; Marín-Díaz et al., 2020). The data were processed using SAS v. 9.4 software, ensuring analytical precision and robustness (Hair et al., 2019). In line with current research in this field, the number of participants can be considered adequate for an institutional diagnostic study and for the identification of competence-based profiles within higher education.

3. Results

Correlations, non-parametric tests, and cluster analyses were applied, enabling the study to progress from a general description to a deeper interpretation, with implications for the design of contextually grounded training policies. In order to avoid confusion regarding the structure of the assessment, it is important to clarify that the instrument used in this study builds upon the six areas established by the DigCompEdu framework and incorporates a seventh dimension related to Open Education. This additional area was included because the Chilean adaptation of the instrument sought to capture practices that extend beyond digital pedagogical competence and address the institutional emphasis on openness, resource sharing, and the creation and adaptation of educational materials. The inclusion of this dimension does not imply an alteration of the DigCompEdu framework itself; rather, it functions as a complementary analytical component aligned with the principles of the OpenEdu Framework, providing a broader perspective on lecturers’ digital practices within the local context.
The seventh dimension, therefore, should not be interpreted as a comparative framework or as an evaluative structure parallel to DigCompEdu. Its purpose is to deepen the analysis by incorporating aspects of Open Education that are particularly relevant to Chilean higher education, where initiatives promoting the use, modification, and sharing of digital and open educational resources have gained increasing institutional relevance. By integrating this additional area, the study aims to offer a more comprehensive overview of lecturers’ digital competence, acknowledging that open practices constitute an important component of digital innovation and pedagogical transformation. This clarification ensures that the interpretation of the results remains consistent with the methodological adaptations described in Section 2.4.

3.1. Digital Competence Across DigCompEdu and OpenEdu Areas

In line with Objective 3, this section examines lecturers’ self-perceived levels of digital competence across the DigCompEdu areas and the OpenEdu dimension, focusing on the internal relationships among these competence domains. To examine the relationship between the different areas of teachers’ digital competence and open education, a correlational analysis was conducted among the seven areas evaluated in the survey, using Spearman’s rank correlation coefficient. This methodological decision was based on the Shapiro–Wilk normality test, which indicated that the data did not follow a normal distribution.
The null hypotheses for this analysis were as follows:
H0: 
There is no association between the areas of digital competence and open education.
H1: 
There is an association between the areas of digital competence and open education.
An association was considered statistically significant when the p-value was less than 0.05. The results are summarised in Table 4.
As shown, all correlations were positive and statistically significant. The strongest associations were between Area 3 (Teaching and Learning) and Area 5 (Empowering Learners), as well as between Area 3 and Area 4 (Assessment and Feedback). The weakest correlation was between Area 1 (Professional Engagement) and Area 7 (Open Education).
These results demonstrate the internal coherence of the DigCompEdu model, in which the different dimensions of digital competence tend to develop together. In particular, the strong association between Areas 3 and 5 suggests that those who integrate technologies into their teaching also tend to actively empower their students.
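For readers interested in reproducing this kind of analysis, the following sketch shows how the normality check and the Spearman correlation matrix described in this subsection could be computed in Python. It is a hedged illustration only: the published analysis was carried out in SAS v. 9.4, and the file name and area column names used here are hypothetical.

```python
# Minimal sketch of the Shapiro-Wilk check and Spearman correlations (cf. Table 4).
# File and column names are hypothetical; the study's analysis was run in SAS v. 9.4.
import pandas as pd
from scipy import stats

areas = ["area1", "area2", "area3", "area4", "area5", "area6", "area7_open_education"]
scores = pd.read_csv("digcompedu_area_scores.csv")[areas]   # one row per lecturer

# 1. Shapiro-Wilk per area: small p-values support the non-parametric approach.
for col in areas:
    w, p = stats.shapiro(scores[col])
    print(f"{col}: W = {w:.3f}, p = {p:.4f}")

# 2. Spearman rank correlations between every pair of areas (7 x 7 matrices).
rho, pvals = stats.spearmanr(scores)
rho = pd.DataFrame(rho, index=areas, columns=areas)
significant = pd.DataFrame(pvals < 0.05, index=areas, columns=areas)
print(rho.round(2))
print(significant)
```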

3.2. Digital Competence and Sociodemographic Variables

In accordance with Objective 1, this section analyses the relationship between lecturers’ digital competence and selected sociodemographic and professional variables, including gender, age, type of contract, and frequency of digital technology use in teaching.

3.2.1. Gender

An analysis was conducted to determine whether statistically significant differences existed in the level of development across areas of teachers’ digital competence according to sociodemographic variables. This section presents the results related to the gender variable. Because the data did not follow a normal distribution and the sample was non-probabilistic, non-parametric tests were applied: the Mann–Whitney U test for dichotomous variables and the Kruskal–Wallis test when a third category (“Prefer not to say”) was included (Table 5).
Test Hypotheses:
H0: 
The medians of the responses between groups are equal.
H1: 
At least one median differs between groups.
Statistically significant differences were found between men and women in two areas: Area 2 (Digital Resources) and Area 7 (Open Education) (Table 6).
It was confirmed that there are statistically significant differences between men and women regarding digital resources and open education.
In addition to the gender-based analysis, possible differences in teachers’ digital competence levels were explored according to other sociodemographic and professional variables: age, type of contract, and years of teaching experience. For this analysis, non-parametric tests were applied: the Mann–Whitney U test for dichotomous variables and the Kruskal–Wallis test for variables with more than two categories. In all cases, the null hypothesis assumed equality of medians among the compared groups (Table 7).
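The sketch below illustrates, in Python, the kind of non-parametric comparisons applied here (Mann–Whitney U for two groups, Kruskal–Wallis for three or more). It is an example under assumptions rather than the study’s own script: the file name, column names, and category labels are hypothetical, and the reported analyses were run in SAS v. 9.4.

```python
# Sketch of the group comparisons; all file, column, and category names are hypothetical.
import pandas as pd
from scipy import stats

df = pd.read_csv("lecturer_responses.csv")   # area scores plus sociodemographic variables


def mann_whitney(df, group_col, score_col, groups):
    """Two-sided Mann-Whitney U test between two groups (e.g., gender, contract type)."""
    a = df.loc[df[group_col] == groups[0], score_col]
    b = df.loc[df[group_col] == groups[1], score_col]
    return stats.mannwhitneyu(a, b, alternative="two-sided")


def kruskal_wallis(df, group_col, score_col):
    """Kruskal-Wallis test across all categories of a grouping variable (e.g., age group)."""
    samples = [g[score_col].values for _, g in df.groupby(group_col)]
    return stats.kruskal(*samples)


# Gender versus the two areas where significant differences were reported (Table 6).
for area in ["area2_digital_resources", "area7_open_education"]:
    u, p = mann_whitney(df, "gender", area, groups=("Female", "Male"))
    print(f"gender vs {area}: U = {u:.1f}, p = {p:.4f}")

# Age group (more than two categories) versus the overall competence score (Table 7).
h, p = kruskal_wallis(df, "age_group", "overall_score")
print(f"age_group vs overall_score: H = {h:.2f}, p = {p:.4f}")
```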

3.2.2. Age

Given the low number of cases in some categories, the original age ranges were regrouped: the category “Under 25 years” was merged with “25–29 years,” and the option “Prefer not to say” was incorporated into the modal group “30–39 years”.
Table 7. Kruskal–Wallis Test for Age.

Area             p-Value
Overall Score    0.7194
1                0.6743
2                0.8617
3                0.8696
4                0.4418
5                0.7441
6                0.6947
7                0.8623
The results show that there are no statistically significant differences in any of the areas assessed with respect to age. Therefore, the null hypothesis is retained, indicating that the level of teachers’ digital competence is homogeneous across the different age groups.

3.2.3. Type of Contract

The responses of academics with permanent contracts were compared with those of academics holding non-permanent contracts (Table 8).
Statistically significant differences were found in Areas 2 (Digital Resources), 3 (Teaching and Learning), and 5 (Empowering Learners), as well as in the overall score. In general, this indicates that the type of contract influences the development of certain dimensions of digital competence. Those with more stable employment relationships likely have greater opportunities for training and for using technologies in sustained teaching and learning contexts.

3.2.4. Years of Teaching Experience

The results indicate that there are no significant differences among the groups based on years of teaching experience. Although some p-values (such as those for Area 1, Professional Engagement, and Area 2, Digital Resources) approach the significance threshold, they do not allow for the rejection of the null hypothesis (Table 9 and Table 10).
This analysis allows us to conclude that gender and type of contract are variables significantly associated with the level of teachers’ digital competence, particularly in dimensions related to collaboration, student empowerment, and the integration of open practices. In contrast, age and teaching experience are not significantly related to the levels of development in any of the areas assessed.

3.2.5. Percentage of Technology Use in Teaching

An analysis was conducted to determine whether statistically significant differences existed across areas of teachers’ digital competence according to the percentage of classes in which technological tools were used during the past three months. To meet the minimum frequency requirements per group, the six original categories were regrouped into four ranges: the 0–10% group was merged with 11–25%, and the “Prefer not to say” option was combined with the modal group 76–100%.
The analysis was conducted using the non-parametric Kruskal–Wallis test (Table 11), as the data did not meet the assumptions of normality and homogeneity of variances. The null hypothesis stated that there were no differences between the medians of the groups.
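As an illustration of the regrouping and test just described, the following Python sketch collapses the six original usage bands into four groups and applies the Kruskal–Wallis test. The band labels and column names are assumptions made for this example, not the study’s actual coding scheme.

```python
# Sketch of the category regrouping and Kruskal-Wallis test for technology use.
# Band labels and column names are assumptions; the study's analysis used SAS v. 9.4.
import pandas as pd
from scipy import stats

df = pd.read_csv("lecturer_responses.csv")   # hypothetical file

# Collapse the six original usage bands into four analysis groups, merging the
# "Prefer not to say" option into the modal 76-100% band, as described above.
regroup = {
    "0-10%": "0-25%", "11-25%": "0-25%",
    "26-50%": "26-50%",
    "51-75%": "51-75%",
    "76-100%": "76-100%", "Prefer not to say": "76-100%",
}
df["usage_group"] = df["tech_use_pct"].map(regroup)

# Kruskal-Wallis on the overall competence score across the four usage groups.
samples = [g["overall_score"].values for _, g in df.groupby("usage_group")]
h, p = stats.kruskal(*samples)
print(f"H = {h:.2f}, p = {p:.4f}")
```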
The results clearly show that the recent level of digital technology use in classes is significantly associated with the development of all areas of teachers’ digital competence. This finding is particularly relevant, as it indicates that the frequency of technology use is broadly and consistently associated with lecturers’ perceived competence.

3.3. Digital Competence Profiles Among University Lecturers

Addressing Objective 2, this section presents the identification and characterisation of distinct digital competence profiles among university lecturers. Using a cluster-based analytical approach, lecturers were grouped according to their levels of digital competence across the DigCompEdu areas and the OpenEdu dimension. The cluster analysis used in this study enabled the classification of participating lecturers into three groups based on their level of development across the DigCompEdu areas. This multivariate approach combined hierarchical and non-hierarchical procedures in order to identify internally coherent and clearly differentiated groups. The final classification was established on the basis of the average scores obtained in each of the seven evaluated areas, allowing for the identification of three distinct teaching staff profiles. The distribution of average scores across competence areas for each profile is presented in Table 12.
As shown in the table, Cluster 1 (High Profile) includes lecturers who demonstrate the highest levels of digital competence across all dimensions. They are especially strong in the pedagogical use of digital technologies (Area 6), Teaching and Learning (Area 3), and Professional Engagement (Area 1). Cluster 2 (Medium Profile) holds an intermediate position, with moderate scores that are closer to the high group in Professional Engagement, Teaching and Learning, and the Development of Students’ Digital Competence (Areas 1, 2, and 3). This group may represent lecturers who are in transition or in the process of strengthening their digital competencies. Cluster 3 (Low Profile) shows the lowest levels across all areas, indicating limited integration of digital competences. Scores are particularly low in Open Education (Area 7) and in teaching, assessment, and feedback practices (Areas 3 and 4).
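To make the clustering step more concrete, the sketch below shows a comparable two-stage workflow in Python: a hierarchical (Ward) solution is used to fix the number of groups and seed the centroids, and k-means then refines the assignments. This approximates, but is not identical to, the procedure run in SAS v. 9.4, and the input file and variable layout are assumptions.

```python
# Approximate sketch of a combined hierarchical / non-hierarchical clustering workflow
# (not the exact SAS procedure used in the study). Assumes `area_scores` holds each
# lecturer's mean score in the seven competence areas, one row per lecturer.
import pandas as pd
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

area_scores = pd.read_csv("area_means.csv")              # hypothetical: 569 x 7
X = StandardScaler().fit_transform(area_scores)

# Step 1: Ward hierarchical clustering to obtain a three-group solution
# and provide initial centroids.
labels_hier = fcluster(linkage(X, method="ward"), t=3, criterion="maxclust")
init_centroids = pd.DataFrame(X).groupby(labels_hier).mean().values

# Step 2: k-means refinement of the three clusters from those centroids.
km = KMeans(n_clusters=3, init=init_centroids, n_init=1, random_state=0).fit(X)
area_scores["cluster"] = km.labels_

# Profile description: mean area scores per cluster (cf. Table 12).
print(area_scores.groupby("cluster").mean().round(2))
```

Seeding k-means from the hierarchical solution keeps the final grouping close to the exploratory structure while still allowing individual observations to be reassigned.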

3.3.1. Sociodemographic Characterisation by Profile

To better understand the teaching profiles within each cluster, a cross-analysis was conducted with the sociodemographic variables. The main characteristics are presented in Table 13 and Table 14.
This analysis identified three distinct profiles of academics based on their level of digital teaching competence. The High-Profile group is characterised by consistently higher scores across the analysed competence areas, while the Low-Profile group presents lower levels of development in both technical and pedagogical dimensions. The Medium Profile group occupies an intermediate position, suggesting a more heterogeneous level of competence development across areas. Rather than implying normative or prescriptive interpretations, these profiles are intended to provide an analytical framework for understanding patterns of digital competence within the academic staff under study. In addition, the profile analysis also considered lecturers’ perceptions of technology use in teaching. The results of this analysis are presented in Table 15 and are reported descriptively in order to complement the competence-based profiles without assuming causal relationships or intervention priorities.
The results indicate differences in lecturers’ self-reported perceptions of the pedagogical use of digital technologies across the identified profiles. In Profile 1 (High), a larger proportion of academics reported strong agreement with statements regarding the perceived educational benefits of ICT, such as enhanced learning and improved teaching outcomes. In Profile 2 (Medium), favourable perceptions were also observed, although with lower levels of strong agreement, suggesting more moderate or less consolidated views. In contrast, in Profile 3 (Low), responses tended to concentrate around neutral or disagreeing options, indicating lower levels of perceived usefulness and confidence in the pedagogical use of digital technologies.
It is important to note that these findings are based on self-reported data and therefore reflect lecturers’ perceptions and confidence in the use of ICT rather than objectively measured digital competence or observed teaching practices. Consequently, the results should be interpreted descriptively and do not imply a direct or causal relationship between competence levels and pedagogical effectiveness. Rather, they highlight perceived differences in attitudes towards technology use that coexist with the competence profiles identified in this study.
Rather than treating this finding as self-evident, the observed association between lecturers’ level of digital competence and their positive perception of the pedagogical use of digital technologies requires careful interpretation. In contemporary higher education contexts, it has become increasingly difficult for teachers to openly declare negative attitudes towards digital technologies, given their institutional normalisation and the normative expectations surrounding digital innovation. For this reason, such responses could be interpreted, at first glance, as socially desirable or declarative rather than analytically meaningful.
However, the pattern identified in this study cannot be reduced solely to a desirability effect. The differentiated distribution of responses across the three competence profiles provides an important analytical nuance. Lecturers with high levels of digital competence consistently express stronger and more confident endorsement of the pedagogical value of digital technologies, whereas those positioned in the low-competence profile show markedly more neutral or ambivalent responses. This variation suggests that attitudes are not merely rhetorical but are closely linked to lecturers’ perceived self-efficacy, accumulated experience, and pedagogical intentionality in the use of digital tools. In this sense, positive perceptions reflect not only alignment with institutional discourse but also a sense of professional agency grounded in actual competence development.
Exploring the relationship between this result and the item concerning the perceived development of digital competences over time provides additional interpretative depth. Lecturers who perceive themselves as having progressed in their digital competence are also more likely to express favourable views regarding the integration of digital technologies into teaching, indicating a mutually reinforcing relationship between competence acquisition, perceived usefulness, and pedagogical confidence. While numerical correlations in the dataset support this tendency, its explanatory value lies primarily in the discursive alignment between competence, experience, and professional identity. Affirmative attitudes towards digital technologies should therefore be understood not as merely normative statements, but as expressions of an evolving pedagogical self shaped by sustained engagement with digital practices (Table 16).
The results reveal differences in lecturers’ self-reported perceptions of institutional and personal conditions for professional development across the identified profiles. In Profile 1 (High), respondents reported higher levels of perceived autonomy in the use of digital technologies and more favourable views regarding institutional support and opportunities for continuous digital training. In Profile 2 (Medium), these perceptions remained generally positive, although with more moderate values, particularly when referring to technical and pedagogical support. In contrast, Profile 3 (Low) was characterised by lower levels of perceived autonomy and more constrained views of institutional conditions, including workload and access to professional development opportunities (Table 17).
It should be emphasised that these findings are based on self-reported data and therefore reflect lecturers’ perceptions and confidence regarding institutional conditions rather than objectively measured levels of support or competence. Accordingly, the results are interpreted descriptively and indicate perceived associations between competence profiles and contextual conditions, without implying causal relationships or normative conclusions.
The analysis indicates a consistent association between the level of digital competence and the perception of institutional support received. In Profile 1 (High), lecturers express a notably more favourable perception: 41.3% reported that they “strongly agree” their institution promotes the use of technologies, and 44.8% stated that collaborative work with digital tools is encouraged. These responses reflect an institutional environment perceived as conducive to digital development. Profile 2 (Medium) adopts intermediate positions: although some support is acknowledged, the intensity of agreement is lower, indicating that there is still room to strengthen institutional support. In contrast, Profile 3 (Low) presents the most critical assessments. Only 21.2% fully agree that the use of technologies is promoted, and just 17% recognise clarity in institutional policies on educational technology. The neutrality or disagreement expressed by about one-third of these lecturers suggests possible feelings of demotivation or institutional disconnection. In summary, the data suggest that higher levels of digital competence are associated with more positive perceptions of the institutional environment. This reinforces the need to make technological support policies more visible and robust, particularly for lecturers with lower competence development, to foster their effective integration into educational innovation processes.

3.3.2. Perception of Improvement in Digital Competence

In response to the question, “Do you consider that your digital teaching competence has improved over the past year?” the results show a direct relationship between the level of digital competence and the perception of improvement. In Profile 1 (High), 81.2% of academics reported improvement in their digital skills, with 40.6% stating that they “strongly agree,” reflecting a proactive attitude toward technological updating and consistency with their high competence levels. Profile 2 (Medium) shows intermediate results, with 52.7% acknowledging some improvement, but only 22.8% expressing the highest level of agreement. In contrast, Profile 3 (Low) shows a significantly lower perception: only 7.3% “strongly agree” that their competence has improved, while 31.3% remain neutral, which may be related to low self-confidence or limited access to training opportunities. Overall, the data suggest that higher levels of digital competence are associated not only with better performance but also with a more positive self-perception of progress, reinforcing the need for differentiated training strategies to support lecturers with lower competence development (Table 18).
The levels reported in Table 19 are derived from the DigCompEdu framework and correspond to the progressive stages of digital competence defined in the DigCompEdu Check-In instrument (Redecker, 2017; Cabero-Almenara & Palacios-Rodríguez, 2019). Specifically, the framework distinguishes six levels of development, ranging from A1 (Beginner) and A2 (Explorer), which indicate initial stages of competence acquisition, through B1 (Integrator) and B2 (Expert), reflecting more consolidated pedagogical use of digital technologies, to C1 (Leader) and C2 (Pioneer), which represent advanced and innovative levels of digital competence. The percentages presented reflect lecturers’ self-assessment of their competence level within this classification.
In response to the question “How would you rate your digital competence?”—which focuses on teachers’ self-assessment of their digital competence—the results confirm the profile classification established in this study. Profile 1 (High) includes more than 54% of academics at advanced levels (B2, C1, and C2), with 6.3% identifying as Pioneers. This self-perception corresponds with their high overall scores and frequent use of technologies in teaching practice. Profile 2 (Medium) shows an intermediate distribution, with a predominance at the Integrator level (53.9%) and a gradual increase toward the Expert level (27.5%); however, the Leader (C1) and Pioneer (C2) levels remain marginal. In contrast, Profile 3 (Low) is mainly concentrated at the initial and intermediate levels: 46.3% identify as Explorers (A2), 41.7% as Integrators (B1), and 7.7% still consider themselves Beginners (A1). This distribution indicates limited knowledge and a low level of systematic ICT integration in teaching. These results highlight the need to design differentiated training pathways tailored to the actual and perceived levels of each teaching group to promote a more complex, reflective, and transformative form of digital competence.

4. Discussion

The results show a significant association with employment conditions. In particular, lecturers with non-permanent contracts display higher levels in certain areas, which may be interpreted either as an adaptive strategy to enhance their employability or as greater openness to innovation in response to job insecurity. This trend has been documented in previous studies highlighting how contractual instability can encourage the development of differentiating competences, including digital ones (Fernández-Martínez et al., 2021). However, it could also reflect a lower level of institutional investment in the professional development of those with permanent contracts.
The findings reveal that lecturers’ digital competence is shaped by a combination of structural, professional, and attitudinal factors, rather than by sociodemographic variables alone. Also, the findings reveal clear heterogeneity in the development of teachers’ digital competence, consistent with previous research demonstrating its uneven distribution in higher education (Cabero-Almenara & Palacios-Rodríguez, 2019). The segmentation into three profiles—high, medium, and low—indicates that competence level depends not only on age or experience but also on formative, attitudinal, and institutional factors. Notably, the high-competence group masters digital tools and exhibits a more positive self-perception, confirming the relationship between effective use and digital confidence (Cabero-Almenara & Palacios-Rodríguez, 2019). Furthermore, their positioning at the B2, C1, and C2 levels of the DigCompEdu framework reflects advanced pedagogical appropriation, consistent with the notion of digital maturity proposed by Redecker (2017).
Conversely, a substantial proportion of academics remain at the initial levels (A1 and A2), particularly within the low-profile group. This situation presents urgent challenges for higher education institutions, which must go beyond simply providing technological infrastructure and actively commit to the digital professional development of their teaching staff. Educational transformation cannot be achieved through tools alone but requires pedagogical competencies that enable the critical and meaningful integration of ICT (Teo, 2019). The differences observed according to type of contract and recent use of technology reinforce the need to implement equitable, sustained, and adaptive training policies that help close the digital divide within the university system.
The profile analysis reinforces this idea by revealing distinct groups that cannot be explained solely by sociodemographic variables. The group with the highest level of digital competence stands out not only for its experience and frequent use of technologies but also for its proactive attitude toward continuous learning, self-training, and digital collaboration. This underscores the key role of the attitudinal component in technology appropriation, consistent with studies emphasising the importance of fostering an institutional culture oriented toward pedagogical innovation and collaborative work (Liu et al., 2020).
Finally, the findings reveal a deficit in the appropriation of open education principles, particularly among groups with medium and low levels of digital competence. At the same time, women showed higher levels in the areas of Digital Resources and Open Education, which could be related to their more active participation in pedagogical networks and collaborative actions aimed at inclusion (as also evidenced in other studies: Cabero-Almenara et al., 2020; Santos-Hermosa et al., 2020). Despite institutional discourse on digital transformation, gaps persist in adopting practices such as the use of Open Educational Resources (OER) and open-access publishing. This indicates limited advanced digital literacy and highlights the need to align training policies with international frameworks such as the OpenEdu Framework (Inamorato dos Santos, 2019), promoting a critical, ethical, and collaborative integration of technologies in higher education.
In summary, the study’s findings confirm that the development of digital competence among university lecturers is not uniform but is influenced by individual, institutional, and attitudinal factors. The segmentation achieved through profile analysis identified distinct groups that reflect not only varying levels of technological mastery but also different approaches to integrating these competences into everyday pedagogical practice. This heterogeneity supports the argument by Cabero-Almenara et al. (2020), who contend that teachers’ digital competence should be understood as a dynamic, situated, and context-dependent process.

5. Conclusions

Overall, the findings indicate significant correlations across all areas of the DigCompEdu framework. The strongest association was observed between teaching and learning and learner empowerment, suggesting that those who diversify their methodologies tend to foster greater student autonomy. Significant differences were also identified by gender, type of contract, and frequency of ICT use, confirming the existence of structural inequalities in the development of these competences (Inamorato dos Santos & Punie, 2016; Redecker, 2017). The results also revealed a direct relationship between ICT use in the classroom and the level of digital competence, reinforcing the link between pedagogical practice and technological integration.
Taken together, the cluster analysis identified three profiles characterised by high, medium, and low levels of digital competence. The high-competence profile was distinguished by higher levels of training and technological experience, as well as a more balanced gender distribution, which contrasts with previous studies reporting a greater male presence at more advanced levels of digital competence (Guillén-Gámez et al., 2019). These findings reinforce the view that the development of teachers’ digital competence is not homogeneous but rather shaped by a combination of individual, institutional, and attitudinal factors (Lucas et al., 2021; Howard & Tondeur, 2023).
Furthermore, the observed differences associated with gender, contract type, and frequency of ICT use are consistent with research emphasising the need for differentiated and context-sensitive professional development strategies in higher education (Fernandes et al., 2023; Salinas-Ibáñez et al., 2022). From a broader perspective, these results align with conceptualisations of digital competence as a pedagogical, reflective, and transformative construct in higher education, as articulated in frameworks such as DigCompEdu and OpenEdu (Redecker, 2017; Cabero-Almenara & Palacios-Rodríguez, 2019; Inamorato dos Santos, 2019).
The diagnosis of digital competence at a Chilean university shows that most lecturers are at the Integrator level (44%), indicating a regular but still limited use of digital technologies in teaching. To progress to higher levels such as Expert or Leader, it is necessary to promote pedagogical innovation, critical reflection, and student empowerment in digital environments (Cabero-Almenara & Palacios-Rodríguez, 2019; Cabero-Almenara et al., 2020). Persistent weaknesses in content creation, feedback, digital security, use of Open Educational Resources (OER), and inclusive practices highlight the urgency of a more comprehensive and holistic training approach (Reisoğlu & Çebi, 2020; Fernandes et al., 2023).
This study is subject to several limitations that should be acknowledged when interpreting the findings. The research design relied exclusively on quantitative self-reported data, which does not capture the full depth of lecturers’ experiences or the contextual meanings associated with their digital teaching practices. In addition, technical difficulties in accessing the instrument—managed externally by MetaRed—as well as time constraints and the absence of a hierarchical structure facilitating staff engagement, limited the possibility of conducting in-depth qualitative, longitudinal, or mixed-methods analyses. These limitations frame the scope of the conclusions drawn and point to important avenues for further research.
Despite these limitations, this study makes a significant and original contribution to the understanding of digital competence in higher education. By integrating the DigCompEdu and OpenEdu frameworks and applying a cluster-based analytical approach, the research provides a structured and context-sensitive profile of lecturers’ digital competence in a Latin American university setting. The identification of differentiated competence profiles, together with the analysis of associated perceptions and institutional conditions, offers a robust empirical basis for informing future research, professional development strategies, and policy discussions. In this sense, the study advances the field by moving beyond descriptive assessments towards a more nuanced understanding of how digital competence is experienced, perceived, and shaped within contemporary higher education institutions.
Building on the identified competence profiles and the exploratory nature of the findings, several implications can be outlined for different stakeholders in higher education. For researchers, the results highlight the value of combining quantitative profiling approaches with qualitative or longitudinal designs in order to gain deeper insight into lecturers’ professional trajectories, contextual constraints, and institutional cultures surrounding digital competence development.
For educators and learners, the differentiated profiles identified in this study suggest that digital competence development is not linear or uniform. Rather than assuming homogeneous levels of readiness or confidence, professional learning initiatives may benefit from recognising diverse starting points and supporting reflective engagement with digital technologies as part of ongoing teaching and learning practices. From this perspective, digital competence can be understood as a situated and evolving dimension of academic work rather than as a fixed set of skills.
For course and module designers, the findings underscore the importance of embedding digital competence development across curricula in ways that are pedagogically meaningful and context-sensitive. Aligning learning activities with frameworks such as DigCompEdu and OpenEdu may support coherence while allowing for flexibility in disciplinary and institutional contexts.
At the institutional level, the study points to the relevance of developing supportive conditions for digital competence development, including access to training opportunities, recognition of teaching innovation, and the fostering of communities of practice and mentoring networks among academic staff. Finally, at the policy level, the results may inform discussions on the design of inclusive and sustainable digital education strategies that move beyond instrumental approaches and attend to ethical, pedagogical, and organisational dimensions of digital transformation in higher education.

Author Contributions

Conceptualisation, I.R.-P.; methodology, I.R.-P. and J.M.-G.; validation, J.M.-G.; formal analysis, I.R.-P.; investigation, I.R.-P. and J.M.-G.; resources, J.M.-G.; data curation, I.R.-P.; writing—original draft preparation, I.R.-P. and J.M.-G.; writing—review and editing, I.R.-P.; supervision, I.R.-P.; project administration, I.R.-P. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors on request.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Baptista, P., Fernández, C., & Hernández, R. (2014). Metodología de la investigación (6th ed.). McGraw-Hill.
2. Bernsteiner, A., Haagen-Schützenhöfer, C., & Schubatzky, T. (2025). Teacher education in the age of digitality: Conclusions from a design-based research project. European Journal of Education, 60(1), e12904.
3. Cabero-Almenara, J., Barroso-Osuna, J., Palacios Rodríguez, A., & Llorente-Cejudo, C. (2020). Marcos de competencias digitales para docentes universitarios: Su evaluación a través del coeficiente competencia experta. Revista Electrónica Interuniversitaria de Formación del Profesorado, 23(3), 1–14.
4. Cabero-Almenara, J., & Palacios-Rodríguez, A. (2019). Marco europeo de competencia digital docente «DigCompEdu»: Traducción y adaptación del cuestionario «DigCompEdu Check-In». EDMETIC, Revista de Educación Mediática y TIC, 9(1), 213–234.
5. Claro, M., & Castro-Grau, C. (2023). El papel de las tecnologías digitales en los aprendizajes del siglo XXI. Oficina Regional de Educación para América Latina y el Caribe (OREALC/IIPE-UNESCO). Available online: https://unesdoc.unesco.org/ark:/48223/pf0000386981 (accessed on 10 January 2024).
6. Falloon, G. (2020). From digital literacy to digital competence: The teacher digital competency (TDC) framework. Educational Technology Research and Development, 68(5), 2449–2472.
7. Fernandes, S., Araújo, A. M., Miguel, I., & Abelha, M. (2023). Teacher professional development in higher education: The impact of pedagogical training perceived by teachers. Education Sciences, 13(3), 309.
8. Fernández-Martínez, A., Llorens-Largo, F., Céspedes-Lorente, J. J., & Rubio de las Alas-Pumariño, T. (2021). Modelo de universidad digital (mUd). Publicaciones de la Universidad de Alicante. Available online: https://rua.ua.es/server/api/core/bitstreams/28105b90-58b3-4554-bc33-2c8262c84878/content (accessed on 30 May 2024).
9. Gisbert-Cervera, M., & Esteve-Mon, F. M. (2022). La competencia digital docente y su relación con las prácticas de aula innovadoras. Revista de Educación, 397, 351–380.
10. Guillén-Gámez, F. D., Lugones, A., Mayorga-Fernández, M. J., & Wang, S. (2019). ICT use by pre-service foreign language teachers according to gender, age and motivation. Cogent Education, 6(1), 1574693.
11. Hair, J. F., Black, W. C., Babin, B. J., & Anderson, R. E. (2019). Multivariate data analysis (8th ed.). Cengage Learning.
12. Howard, S. K., & Tondeur, J. (2023). Digital competences for higher education teachers in a blended future. Educational Technology Research and Development, 71, 1–6.
13. Huang, F., Teo, T., Sánchez-Prieto, J., García-Peñalvo, F. J., & Olmos-Migueláñez, S. (2019). Cultural values and technology adoption: A model comparison with university teachers from China and Spain. Computers & Education, 133, 69–81.
14. Inamorato dos Santos, A. (2019). OpenEdu: Las diez dimensiones del Marco Europeo de Educación Abierta. Revista Mexicana de Bachillerato a Distancia, 22, 1–10.
15. Inamorato dos Santos, A., & Punie, Y. (2016). Opening up education: A support framework for higher education institutions (EUR 27938 EN). Publications Office of the European Union.
16. International Society for Technology in Education. (2017). ISTE standards for educators. Available online: https://www.iste.org/standards/iste-standards-for-teachers (accessed on 15 July 2025).
17. Liu, Q., Geertshuis, S., & Grainger, R. (2020). Understanding academics’ adoption of learning technologies: A systematic review. Computers & Education, 151, 103857.
18. Lucas, M., Bem-Haja, P., Siddiq, F., Moreira, A., & Redecker, C. (2021). The relation between in-service teachers’ digital competence and personal and contextual factors: What matters most? Computers & Education, 160, 104052.
19. Marín-Díaz, V., Reche, E., & Martín, J. (2021). University virtual learning in Covid times. Technology, Knowledge and Learning, 27(4), 1291–1309.
20. Marín-Díaz, V., Riquelme, I., & Cabero-Almenara, J. (2020). Uses of ICT tools from the perspective of Chilean university teachers. Sustainability, 12(15), 6134.
21. Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108, 1017–1054.
22. Ñaupas, H., Valdivia, M., Palacios, J., & Romero, H. (2018). Metodología de la investigación cuantitativa-cualitativa y redacción de la tesis (5th ed.). Ediciones de la U.
23. Redecker, C. (2017). European framework for the digital competence of educators: DigCompEdu. Publications Office of the European Union.
24. Reisoğlu, İ., & Çebi, A. (2020). How can the digital competences of pre-service teachers be developed? Examining a case study through the lens of DigComp and DigCompEdu. Computers & Education, 156, 103940.
25. Saikkonen, L., & Kaarakainen, M. T. (2021). Multivariate analysis of teachers’ digital information skills: The importance of available resources. Computers & Education, 168, 104206.
26. Salinas-Ibáñez, J., de Benito-Crosetti, B., Moreno-García, J., & Lizana Carrió, A. (2022). Nuevos diseños y formas organizativas flexibles en educación superior. Pixel-Bit. Revista de Medios y Educación, 63, 65–91.
27. Santos-Hermosa, G., Estupinyà, E., Nonó-Rius, B., París-Folch, L., & Prats-Prat, J. (2020). Recursos educativos abiertos (REA) en las universidades españolas. Profesional de la Información, 29(6), e290606.
28. Selwyn, N. (2020). Re-imagining the role of technology in education: Lessons from the COVID-19 pandemic. Postdigital Science and Education, 2(3), 685–691.
29. Teo, T. (2019). Students and teachers’ intention to use technology: A review. Computers & Education, 130, 121–138.
30. United Nations Educational, Scientific and Cultural Organization. (2023). Global education monitoring report 2023: Technology in education—A tool on whose terms? Available online: https://unesdoc.unesco.org/ark:/48223/pf0000385723 (accessed on 22 May 2025).
31. Vuorikari, R., Kluzer, S., & Punie, Y. (2022). DigComp 2.2: The Digital Competence Framework for Citizens—With new examples of knowledge, skills and attitudes. Publications Office of the European Union. Available online: https://op.europa.eu/en/publication-detail/-/publication/50c53c01-abeb-11ec-83e1-01aa75ed71a1 (accessed on 22 May 2025).
Table 1. Age distribution of participants.

Age | Frequency | Percentage (%)
Under 25 | 1 | 0
25–29 | 28 | 5
30–39 | 183 | 32
40–49 | 171 | 30
50–59 | 123 | 22
60 or older | 59 | 10
I prefer not to say | 4 | 1
Total | 569 | 100
Table 2. Type of contract of participants.

Type of Contract | Frequency | Percentage (%)
Permanent | 311 | 55
Not permanent | 258 | 45
Total | 569 | 100
Table 3. Reliability Statistics of the Questionnaire.

Cronbach’s Alpha (α) | Omega (ω)
0.96 | 0.91
Table 4. Spearman Correlation Matrix.

Area | 1 | 2 | 3 | 4 | 5 | 6 | 7
1 | 1 | 0.644 | 0.697 | 0.634 | 0.636 | 0.612 | 0.485
2 | – | 1 | 0.730 | 0.670 | 0.640 | 0.660 | 0.550
3 | – | – | 1 | 0.780 | 0.790 | 0.770 | 0.620
4 | – | – | – | 1 | 0.750 | 0.710 | 0.590
5 | – | – | – | – | 1 | 0.790 | 0.620
6 | – | – | – | – | – | 1 | 0.650
7 | – | – | – | – | – | – | 1
Note: All p-values associated with the correlations are <0.0001.
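For readers who wish to reproduce a comparable analysis, a minimal Python sketch is provided below. It is not the analysis code used in this study: the file name digcompedu_scores.csv and the column names area_1 to area_7 are hypothetical placeholders for per-lecturer scores in the seven competence areas.

```python
# Illustrative sketch only, not the study's analysis code.
# Assumes a hypothetical CSV "digcompedu_scores.csv" with one row per
# lecturer and columns area_1 ... area_7 holding the seven area scores.
import pandas as pd
from scipy.stats import spearmanr

scores = pd.read_csv("digcompedu_scores.csv")        # hypothetical file name
areas = [f"area_{i}" for i in range(1, 8)]           # assumed column names

rho, pval = spearmanr(scores[areas])                 # pairwise Spearman rho and p-values
rho_matrix = pd.DataFrame(rho, index=areas, columns=areas)
print(rho_matrix.round(3))
```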
Table 5. Results of the Kruskal–Wallis Test by Gender.

Area | p-Value | Significant Difference
Overall Score | 0.2231 | No
1 | 0.8534 | No
2 | 0.0127 | Yes
3 | 0.3870 | No
4 | 0.8424 | No
5 | 0.5965 | No
6 | 0.3858 | No
7 | 0.0008 | Yes
Table 6. Post Hoc Mann–Whitney Comparisons by Gender (Wilcoxon Z).

Area | Comparison | Pr > DSCF | Significant
2 | Male vs. Female | 0.0088 | Yes
7 | Male vs. Female | 0.0005 | Yes
 | Male vs. Prefer not to say | 0.9487 | No
 | Female vs. Prefer not to say | 0.7283 | No
Table 8. Mann–Whitney Test for Contract Type.

Area | p-Value | Significant Difference
Overall Score | 0.0236 | Yes
1 | 0.6462 | No
2 | 0.0040 | Yes
3 | 0.0082 | Yes
4 | 0.1720 | No
5 | 0.0278 | Yes
6 | 0.0539 | No (borderline)
7 | 0.0590 | No
Table 9. Kruskal–Wallis Test for Years of Teaching Experience.

Area | p-Value
Overall Score | 0.1363
1 | 0.0621
2 | 0.0961
3 | 0.2056
4 | 0.1879
5 | 0.1023
6 | 0.3573
7 | 0.2300
Table 10. Summary of Comparative Analysis.

Variable | Significant Differences by Area
Gender | 2 (Digital Resources), 7 (Open Education)
Age | None
Type of Contract | 2 (Digital Resources), 3 (Teaching and Learning), 5 (Empowering Learners), Overall Score
Years of Teaching Experience | None
Table 11. Kruskal–Wallis Test by Percentage of Technology Use in Teaching.

Area | p-Value | Significant Difference
Overall Score | <0.0001 | Yes
1 | <0.0001 | Yes
2 | <0.0001 | Yes
3 | <0.0001 | Yes
4 | <0.0001 | Yes
5 | <0.0001 | Yes
6 | <0.0001 | Yes
7 | <0.0001 | Yes
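A hedged sketch of the kind of Kruskal–Wallis comparison summarised in Table 11 is shown below; the grouping variable tech_use_band, the file name, and the column names are illustrative assumptions rather than the study's actual variables.

```python
# Illustrative sketch only, not the study's analysis code.
# Assumes the same hypothetical CSV plus a column "tech_use_band" encoding
# the self-reported percentage of technology use in teaching (e.g. "0-25%").
import pandas as pd
from scipy.stats import kruskal

df = pd.read_csv("digcompedu_scores.csv")            # hypothetical file name
areas = [f"area_{i}" for i in range(1, 8)]           # assumed column names

for area in areas:
    groups = [g[area].dropna().to_numpy()            # one score vector per usage band
              for _, g in df.groupby("tech_use_band")]
    stat, p = kruskal(*groups)                       # Kruskal-Wallis H test across bands
    print(f"Area {area[-1]}: H = {stat:.2f}, p = {p:.4f}")
```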
Table 12. Average Scores by Area in Each Cluster.

Area | High Profile (1) | Medium Profile (2) | Low Profile (3)
1 | 16.78 | 14.08 | 9.66
2 | 13.75 | 10.95 | 7.17
3 | 18.00 | 13.32 | 6.22
4 | 13.01 | 9.08 | 5.03
5 | 13.69 | 9.25 | 5.43
6 | 23.36 | 14.69 | 9.11
7 | 9.28 | 4.48 | 1.79
Note: The clusters were defined based on participants’ average scores across the seven DigCompEdu competence areas. The values reported correspond to mean scores obtained for each area within each cluster, and reflect relative differences in levels of digital competence among the identified profiles.
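As a purely illustrative complement to this note, the sketch below shows one plausible way of deriving three such profiles with k-means clustering on standardised area scores. It does not reproduce the study's procedure, and all file and column names are hypothetical.

```python
# Illustrative sketch only, not the study's clustering procedure.
# Shows one plausible way to derive three competence profiles from the
# seven area scores; file and column names are hypothetical.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("digcompedu_scores.csv")            # hypothetical file name
areas = [f"area_{i}" for i in range(1, 8)]           # assumed column names

X = StandardScaler().fit_transform(df[areas])        # standardise each area score
df["profile"] = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Mean raw score per area within each profile, analogous in spirit to Table 12
print(df.groupby("profile")[areas].mean().round(2))
```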
Table 13. Gender Distribution by Profile.

Profile | Men (%) | Women (%) | No Response (%)
High | 49.7 | 49 | 1.4
Medium | 37.1 | 60.5 | 2.4
Low | 39.4 | 59.1 | 1.5
Table 14. Type of Contract by Profile.

Contract | High (%) | Medium (%) | Low (%)
Permanent | 44.1 | 59.9 | 57.1
Non-permanent | 55.9 | 40.1 | 42.9
Table 15. Perceptions of Technology Use by Profile.

Question: How would you describe yourself and your personal use of technology?

Statement | Response | High Profile n (%) | Medium Profile n (%) | Low Profile n (%)
I find it easy to work with computers and other digital devices | Strongly Disagree | 0 (0.0%) | 2 (1.2%) | 3 (1.2%)
 | Disagree | 1 (0.7%) | 1 (0.6%) | 18 (6.9%)
 | Neither Agree nor Disagree | 2 (1.4%) | 7 (4.2%) | 40 (15.4%)
 | Agree | 39 (27.3%) | 67 (40.1%) | 123 (47.5%)
 | Strongly Agree | 101 (70.6%) | 90 (53.9%) | 75 (29.0%)
I use the Internet extensively and competently | Strongly Disagree | 0 (0.0%) | 2 (1.2%) | 3 (1.2%)
 | Disagree | 2 (1.4%) | 0 (0.0%) | 7 (2.7%)
 | Neither Agree nor Disagree | 2 (1.4%) | 8 (4.8%) | 34 (13.1%)
 | Agree | 34 (23.8%) | 57 (34.1%) | 131 (50.6%)
 | Strongly Agree | 105 (73.4%) | 100 (59.9%) | 84 (32.4%)
I am open to and curious about new applications, programmes, and digital resources | Strongly Disagree | 0 (0.0%) | 2 (1.2%) | 3 (1.2%)
 | Disagree | 0 (0.0%) | 0 (0.0%) | 12 (4.6%)
 | Neither Agree nor Disagree | 3 (2.1%) | 11 (6.6%) | 38 (14.7%)
 | Agree | 34 (23.8%) | 63 (37.7%) | 121 (46.7%)
 | Strongly Agree | 106 (74.1%) | 91 (54.5%) | 85 (32.8%)
I am a member of several social networks | Strongly Disagree | 1 (0.7%) | 3 (1.8%) | 18 (6.9%)
 | Disagree | 8 (5.6%) | 8 (4.8%) | 35 (13.5%)
 | Neither Agree nor Disagree | 21 (14.7%) | 41 (24.6%) | 64 (24.7%)
 | Agree | 50 (35.0%) | 67 (40.1%) | 101 (39.0%)
 | Strongly Agree | 63 (44.1%) | 48 (28.7%) | 41 (15.8%)
Table 16. Perception of Institutional Conditions and Development of Digital Competence by Profile.

Question: To what extent does your work environment meet the following criteria?

Statement | Response | High Profile n (%) | Medium Profile n (%) | Low Profile n (%)
The university promotes the integration of digital technologies in teaching | Strongly Disagree | 0 (0.0%) | 2 (1.2%) | 4 (1.5%)
 | Disagree | 1 (0.7%) | 4 (2.4%) | 14 (5.4%)
 | Neither Agree nor Disagree | 11 (7.7%) | 13 (7.8%) | 49 (18.9%)
 | Agree | 47 (32.9%) | 81 (48.5%) | 129 (49.8%)
 | Strongly Agree | 84 (58.7%) | 67 (40.1%) | 63 (24.3%)
The university invests in updating and improving its technical infrastructure | Strongly Disagree | 2 (1.4%) | 3 (1.8%) | 14 (5.4%)
 | Disagree | 10 (7.0%) | 21 (12.6%) | 41 (15.8%)
 | Neither Agree nor Disagree | 31 (21.7%) | 43 (25.7%) | 75 (29.0%)
 | Agree | 50 (35.0%) | 64 (38.3%) | 98 (37.8%)
 | Strongly Agree | 50 (35.0%) | 36 (21.6%) | 31 (12.0%)
The university provides the necessary technical support | Strongly Disagree | 2 (1.4%) | 3 (1.8%) | 12 (4.6%)
 | Disagree | 10 (7.0%) | 17 (10.2%) | 35 (13.5%)
 | Neither Agree nor Disagree | 22 (15.4%) | 31 (18.6%) | 60 (23.2%)
 | Agree | 54 (37.8%) | 82 (49.1%) | 120 (46.3%)
 | Strongly Agree | 55 (38.5%) | 34 (20.4%) | 32 (12.4%)
Students have access to digital devices | Strongly Disagree | 1 (0.7%) | 3 (1.8%) | 6 (2.3%)
 | Disagree | 6 (4.2%) | 4 (2.4%) | 10 (3.9%)
 | Neither Agree nor Disagree | 15 (10.5%) | 28 (16.8%) | 45 (17.4%)
 | Agree | 56 (39.2%) | 86 (51.5%) | 161 (62.2%)
 | Strongly Agree | 65 (45.5%) | 46 (27.5%) | 37 (14.3%)
The university’s internet connection is reliable and fast | Strongly Disagree | 10 (7.0%) | 14 (8.4%) | 25 (9.7%)
 | Disagree | 17 (11.9%) | 29 (17.4%) | 49 (18.9%)
 | Neither Agree nor Disagree | 33 (23.1%) | 41 (24.6%) | 82 (31.7%)
 | Agree | 49 (34.3%) | 54 (32.3%) | 86 (33.2%)
 | Strongly Agree | 34 (23.8%) | 29 (17.4%) | 17 (6.6%)
The university supports the development of my digital competence, for example, through continuous professional development activities | Strongly Disagree | 2 (1.4%) | 4 (2.4%) | 9 (3.5%)
 | Disagree | 2 (1.4%) | 11 (6.6%) | 29 (11.2%)
 | Neither Agree nor Disagree | 27 (18.9%) | 28 (16.8%) | 81 (31.3%)
 | Agree | 46 (32.2%) | 66 (39.5%) | 102 (39.4%)
 | Strongly Agree | 66 (46.2%) | 58 (34.7%) | 38 (14.7%)
Table 17. Institutional Methodological Support and Development of Competence by Profile.

Question: The COVID-19 pandemic has altered the usual activity within higher education institutions, and many dynamics have had to adapt. To what extent do you consider that your institution has worked on this adaptation process?

Statement | Response | High Profile n (%) | Medium Profile n (%) | Low Profile n (%)
The university has provided the necessary equipment to deliver my classes in different modalities: face-to-face, online, or hybrid (digitalised classrooms, audio and video devices, etc.) | Strongly Disagree | 1 (0.7%) | 9 (5.4%) | 14 (5.4%)
 | Disagree | 5 (3.5%) | 6 (3.6%) | 26 (10.0%)
 | Neither Agree nor Disagree | 23 (16.1%) | 36 (21.6%) | 43 (16.6%)
 | Agree | 55 (38.5%) | 72 (43.1%) | 121 (46.7%)
 | Strongly Agree | 59 (41.3%) | 44 (26.3%) | 55 (21.2%)
The university has provided the necessary tools for developing digital content, supplying resources such as guides, training courses, and other support for their production | Strongly Disagree | 1 (0.7%) | 3 (1.8%) | 10 (3.9%)
 | Disagree | 5 (3.5%) | 11 (6.6%) | 26 (10.0%)
 | Neither Agree nor Disagree | 17 (11.9%) | 26 (15.6%) | 59 (22.8%)
 | Agree | 62 (43.4%) | 84 (50.3%) | 120 (46.3%)
 | Strongly Agree | 58 (40.6%) | 43 (25.7%) | 44 (17.0%)
The university has supported the methodological shift (in areas like teaching and assessment) by providing a range of instructional support resources, including new materials, mentoring systems, and training courses | Strongly Disagree | 0 (0.0%) | 3 (1.8%) | 10 (3.9%)
 | Disagree | 3 (2.1%) | 12 (7.2%) | 28 (10.8%)
 | Neither Agree nor Disagree | 22 (15.4%) | 32 (19.2%) | 65 (25.1%)
 | Agree | 54 (37.8%) | 74 (44.3%) | 108 (41.7%)
 | Strongly Agree | 64 (44.8%) | 46 (27.5%) | 48 (18.5%)
Table 18. Perceptions of Improvement in Digital Competence by Profile.

Response Category | Profile 1 (High) | Profile 2 (Medium) | Profile 3 (Low)
Strongly Disagree | 2.8% | 1.8% | 3.1%
Disagree | 4.2% | 3.6% | 6.6%
Neither Agree nor Disagree | 11.9% | 19.2% | 31.3%
Agree | 40.6% | 52.7% | 51.7%
Strongly Agree | 40.6% | 22.8% | 7.3%
Table 19. Self-Assessment of Digital Competence by Profile.

Competence Level | Profile 1 (High) | Profile 2 (Medium) | Profile 3 (Low)
A1: Beginner | 1.4% | 0.6% | 7.7%
A2: Explorer | 4.9% | 11.4% | 46.3%
B1: Integrator | 39.2% | 53.9% | 41.7%
B2: Expert | 31.5% | 27.5% | 3.1%
C1: Leader | 16.8% | 6.6% | 1.2%
C2: Pioneer | 6.3% | 0.0% | 0.0%