Article

Trajectories of Digital Teaching Competence: A Multidimensional PLS-SEM Study in University Contexts

by Isaac González-Medina 1, Óscar Gavín-Chocano 1, Eufrasio Pérez-Navío 1,* and Guadalupe Aurora Maldonado Berea 2
1 Department of Pedagogy, University of Jaén, 23071 Jaén, Spain
2 Institute of Education Sciences, Benito Juárez Autonomous University of Oaxaca, Oaxaca de Juárez 68120, Mexico
* Author to whom correspondence should be addressed.
Information 2025, 16(5), 373; https://doi.org/10.3390/info16050373
Submission received: 9 March 2025 / Revised: 25 April 2025 / Accepted: 28 April 2025 / Published: 30 April 2025

Abstract
Digital teaching competence (DTC) has become a fundamental strategic element in today’s higher education, where the DigCompEdu framework has been consolidated as a key tool for assessing teachers’ digital skills. The sample comprises 3309 students from education programs at Andalusian universities, and the study applies the PLS-SEM methodology to examine the interrelationships among six critical dimensions: professional engagement, digital resources, digital pedagogy, assessment and feedback, student empowerment, and digital competence development. Five main hypotheses explore how digital resources drive pedagogy and assessment and how professional engagement directly influences student empowerment and the development of students’ own digital competencies. The results reveal the complexity inherent in developing digital competencies in the university setting, underscoring the need for ongoing training programs that address not only essential technical skills but also innovative pedagogical strategies adapted to digital environments. Such programs should train teachers to use digital resources effectively, design interactive learning activities, and encourage active student participation. The findings also highlight the importance of promoting teachers’ professional engagement, since this factor significantly influences students’ empowerment and their ability to develop strong digital competencies, preparing them for the technological challenges of the 21st century and equipping them with the skills needed to thrive in an increasingly digitized world.

1. Introduction

Digital competence has emerged as a cornerstone in our contemporary society, characterized by the ubiquitous presence of information and communication technologies (ICTs). Within the educational sphere, particularly in higher education, digital teaching competence (DTC) has become a critical factor in cultivating professionals capable of confronting 21st-century challenges. In this context, the integration of artificial intelligence (AI) and assistive technologies (ATs) presents an opportunity to foster more accessible and inclusive education. These tools can personalize learning, adapt educational materials to students’ individual needs, and facilitate participation for those with disabilities or learning difficulties.
Cabero-Almenara et al. [1] define digital competence as the combination of knowledge, skills, and attitudes needed to find, acquire, process, and share information and to transform it into knowledge through the use of information and communication technologies. Digital teaching competence, however, extends beyond this, encompassing the ability to integrate these technologies pedagogically and strategically into teaching–learning processes, as Guillén-Gámez and Mayorga-Fernández [2] point out.
The relevance of DTC in contemporary higher education manifests in various aspects, including adaptation to changing learning environments, enhancement of educational quality, development of students’ digital skills, pedagogical innovation, and preparation for the digital workplace. The incorporation of AI and AT in this process optimizes teaching and ensures that all students, regardless of their abilities, have the opportunity to reach their full potential. Recent studies, such as that of Sánchez-Prieto et al. [3], have demonstrated how university professors’ pedagogical beliefs influence their adoption of mobile technologies in the classroom, revealing complex relationships between beliefs, attitudes, and actual technology use. Similarly, Guillén-Gámez and Mayorga-Fernández [4] have applied structural equation models to examine how various factors, including ICT training and teaching experience, influence university professors’ digital competencies.
The importance of developing digital teaching competence in higher education lies in its crucial role in preparing future generations. Today’s university students will be the professionals leading the digital transformation across various sectors tomorrow. Therefore, having highly competent teachers in the pedagogical use of digital technologies, including AI and AT, is essential to ensuring that university graduates possess not only specific knowledge in their field but also the digital skills necessary to innovate, adapt, and thrive in a constantly evolving technological world. In this regard, frameworks such as DigCompEdu provide valuable guidance for the development and assessment of digital teaching competencies, establishing a standard that allows higher education institutions to implement training and professional development programs focused on improving their faculty’s DTC, as recent studies have demonstrated [5].

1.1. The DigCompEdu Framework and Its Impact on Teacher Digital Competence

The European Framework for the Digital Competence of Educators (DigCompEdu) has become a key tool for assessing and developing the digital skills of university educators. This framework, developed by the Joint Research Centre of the European Commission, provides a comprehensive structure that covers six interconnected areas of digital competence: professional engagement, digital resources, digital pedagogy, assessment and feedback, empowering students, and facilitating students’ digital competence. Each area is assessed across six competence levels, from A1 (Novice) to C2 (Pioneer), thus offering a clear pathway for teachers’ professional development [6].
The relevance of DigCompEdu in the university context has been confirmed by recent research. Cabero-Almenara et al. [1] demonstrated the validity and reliability of the DigCompEdu Check-In tool for assessing the digital competencies of Spanish university teachers, identifying areas that require further attention in teacher training. In turn, Guillén-Gámez and Mayorga-Fernández [2] explored the influence of university teachers’ digital competencies on their use of technological resources in the classroom, highlighting the importance of developing both technical skills and digital pedagogical competencies. The effective integration of AI and learning analytics into digital pedagogy can transform the educational experience, offering personalized solutions that adapt to the individual needs of students, including those with disabilities or learning difficulties.
More recent studies have delved deeper into the application of the DigCompEdu framework in various educational contexts. For example, Sánchez-Cruzado et al. [7] analyzed teacher digital competence in Spanish higher education, emphasizing the need for continuous training in areas such as digital content creation and assessment using technological tools. Similarly, Romero-García et al. [8] examined university professors’ self-perception of their digital competence, revealing significant disparities across different competence areas and emphasizing the importance of specific training programs.
Furthermore, Redecker and Punie [9] provided a solid theoretical foundation for the DigCompEdu framework, highlighting its importance in the continuous professional development of educators. Caena and Redecker [5] explored how the DigCompEdu framework aligns with the educational challenges of the 21st century, underlining its relevance for teachers’ adaptation to increasingly digitalized learning environments.
In the context of higher education, Dias-Trindade et al. [10,11] applied the DigCompEdu framework to assess the digital competencies of Portuguese university professors, identifying areas for improvement and strategies for professional development. On the other hand, Ghomi and Redecker [6] analyzed the applicability of DigCompEdu in different cultural and educational contexts, highlighting its flexibility and adaptability.
Finally, it is worth highlighting that, in recent years, the DigCompEdu framework has increasingly demonstrated its relevance in the training of university students with a didactic profile, as shown by recent research [12,13,14]. These studies emphasize that future university teachers are often at intermediate levels of digital competence, highlighting the need for specific and continuous training plans that address all areas of the DigCompEdu framework, not just the technical ones [12]. Furthermore, analyses of university teaching initiation programs reveal that while certain areas of DigCompEdu such as professional engagement and digital content receive attention, others like digital security are less considered [14]. In summary, the current literature agrees that DigCompEdu is valuable for assessing and guiding digital competence training for future teachers but underscores the importance of comprehensively strengthening initial training programs to face the challenges of contemporary digital education [15].
To sum up, all these studies emphasize the growing importance of the DigCompEdu framework as a reference for the development of digital competencies in higher education, providing a solid foundation for the implementation of educational teacher and learner training strategies and the improvement of educational quality in the digital age. For all these reasons, the development of digital competencies should include the ability to effectively use AI and learning analytics to create inclusive and accessible learning environments, ensuring that all students can reach their full potential.

1.2. Structural Equation Modeling (SEM) in Digital Competence Analysis

Structural Equation Modeling (SEM), particularly the Partial Least Squares variant (PLS-SEM), has become a robust and flexible methodology for analyzing the complex relationships between dimensions of teacher digital competence. This advanced statistical technique allows for the simultaneous examination of multiple relationships between latent variables, making it ideal for studying multidimensional constructs such as digital competencies. PLS-SEM offers advantages such as flexibility in data distribution, efficiency with small sample sizes, the ability to handle complex models, and a predictive approach focused on maximizing the explained variance in dependent variables [16,17].
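The variance-maximizing, predictive logic of PLS-SEM described above can be illustrated with a deliberately simplified sketch: composite scores for two latent constructs are built from simulated Likert indicators, and a single standardized path is estimated between them. The equal indicator weights are an assumption for brevity; full PLS-SEM (e.g., in SmartPLS) iteratively re-estimates the outer weights, and the data here are simulated, not the study's.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 300

# Simulated 6-point Likert indicators for two hypothetical constructs.
resources = rng.integers(1, 7, size=(n, 3)).astype(float)
# Make the pedagogy indicators depend on resources so the path is positive.
pedagogy = np.clip(
    resources.mean(axis=1, keepdims=True) + rng.normal(0.0, 1.0, (n, 3)),
    1, 6,
)

def standardize(x):
    return (x - x.mean(axis=0)) / x.std(axis=0)

# Simplified composite scores: equal-weight means of standardized indicators
# (real PLS-SEM iteratively re-estimates these outer weights).
lv_resources = standardize(resources).mean(axis=1)
lv_pedagogy = standardize(pedagogy).mean(axis=1)

# With a single predictor, the standardized path coefficient equals the
# correlation between the two composite scores.
beta = np.corrcoef(lv_resources, lv_pedagogy)[0, 1]
```

Because the composites are unweighted means, this sketch is closer to a sum-score regression than to a converged PLS solution, but it conveys the key idea that structural paths are estimated between indicator-based composites rather than raw items.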
Previous studies have demonstrated the usefulness of PLS-SEM in research on digital competencies in higher education. For example, ref. [3] used PLS-SEM to analyze how university teachers’ pedagogical beliefs influence their adoption of mobile technologies in the classroom, revealing complex relationships between beliefs, attitudes, and actual use of technology. Likewise, Guillén-Gámez and Mayorga-Fernández [4] applied PLS-SEM to examine how different factors, including ICT training and teaching experience, influence university teachers’ digital competencies, identifying direct relationships between variables and mediating effects.
The integration of the DigCompEdu framework with the PLS-SEM methodology constitutes a robust analytical instrument for elucidating the developmental pathways of university educators’ digital competencies. This synergistic approach enables both a comprehensive assessment of current competency levels and the modeling of intricate interrelations and mutual influences among these competencies. Such insights are invaluable for informing the design of targeted professional development initiatives and shaping educational policies attuned to the demands of the digital age [1,7,8].
For this reason, recognizing the increasing importance of digital teaching competence (DTC) in higher education [1,2] and leveraging the DigCompEdu framework alongside Structural Equation Modeling (SEM), particularly PLS-SEM, this study aims to examine the intricate relationships between key dimensions of digital competence. The primary objective is to understand how digital resources, digital pedagogy, assessment and feedback, professional engagement, and student empowerment collectively shape digital competence among university educators and students, ultimately informing strategies for enhancing educational practices in the digital age. To achieve this objective and address the central research question, “what are the relationships between these key dimensions in shaping digital competence among university educators and students?”, the following research hypotheses are proposed:
Hypothesis 1 (H1+):
Digital resources will have a positive relationship with digital pedagogy and with assessment and feedback.
According to the European Framework for the Digital Competence of Educators, digital resources act as catalysts for digital pedagogy by enabling the selection, customization, and application of technological tools that optimize learning, promoting a student-centered instructional design [10,18]. Similarly, in the assessment context, these resources enable automation, data analysis, and the generation of immediate and personalized feedback, fostering adaptive and continuous learning [19,20]. This synergistic relationship drives a comprehensive pedagogical approach in which university teachers can use digital resources to design more effective teaching strategies and, moreover, to establish cycles of assessment and feedback that improve both student performance and the quality of the teaching–learning process [21]. Therefore, it is hypothesized that digital resources act as a bridge between digital pedagogy and assessment with feedback, promoting continuous improvement in educational practices and contributing to the development of personalized and meaningful learning.
Hypothesis 2 (H2+):
Digital pedagogy is positively related to professional engagement.
Digital pedagogy, by integrating innovative technologies into teaching and learning processes, fosters greater professional engagement among university educators. This relationship is based on the fact that implementing digital pedagogical strategies requires constant updating and reflection on teaching practices, which in turn promotes continuous professional development [22]. Moreover, the adoption of digital pedagogical approaches stimulates collaboration among teachers, participation in online communities of practice, and the search for creative solutions to educational challenges, aspects closely linked to increased professional engagement [23]. Therefore, it is hypothesized that digital pedagogy acts as a catalyst for professional engagement, driving educational innovation and the development of digital competencies among educators.
Hypothesis 3 (H3+):
Assessment and feedback are positively related to professional engagement.
The implementation of assessment and feedback strategies based on digital technologies is positively related to university teachers’ professional engagement. This relationship is grounded in the fact that digital assessment systems allow for a deeper and more detailed analysis of students’ progress, motivating educators to reflect on their practices and seek ways to improve their teaching [24]. Furthermore, digital feedback facilitates more frequent and personalized communication with students, which can increase job satisfaction and teachers’ sense of efficacy [25]. Consequently, it is proposed that the adoption of digital assessment and feedback practices contributes to strengthening professional engagement, fostering a culture of continuous improvement in higher education.
Hypothesis 4 (H4+):
Professional engagement will positively relate to student empowerment.
The professional engagement of university teachers has a positive impact on student empowerment. This relationship is based on the idea that highly engaged teachers tend to implement student-centered pedagogical strategies, fostering students’ autonomy and active participation in the learning process [26]. Additionally, professional engagement translates into a greater willingness to use digital technologies that facilitate personalized and collaborative learning, thereby empowering students to take control of their own learning [27]. Therefore, it is hypothesized that the professional engagement of educators acts as a facilitator of student empowerment, promoting the development of metacognitive and self-regulation skills in students.
Hypothesis 5 (H5+):
Professional engagement will facilitate students’ digital competencies.
The professional engagement of university educators has a positive influence on the development of students’ digital competencies. This relationship is based on the fact that highly engaged educators tend to integrate digital technologies more effectively into their pedagogical practices, providing students with meaningful opportunities to develop their digital skills [28]. Additionally, professional engagement is associated with a greater willingness to stay updated on the latest technological trends and implement innovative projects that require students to use advanced digital tools [29]. Therefore, it is proposed that the professional engagement of educators acts as a catalyst for the development of students’ digital competencies, better preparing them for the challenges of the digital era.
These initial hypotheses are summarized and included in the proposed theoretical model shown in Figure 1.

2. Materials and Methods

2.1. Participants

This study is based on a sample of 3309 participants from educational programs at the universities of Jaén, Granada, and Almería. The respondents are enrolled in degrees such as Primary Education (1176), Early Childhood Education (987), and Pedagogy (484), as well as the Master’s Degree in Secondary Education and Baccalaureate Teaching (752). The gender distribution shows a predominance of females, with 2164 women (65.4%) and 1145 men (34.6%). Participants range in age from 18 to 50, although the vast majority (92.3%) are between 18 and 25 years old. The average age of the sample is approximately 22 years (SD = 1.043).

2.2. Instruments

The DigCompEdu Check-In self-assessment tool is a versatile instrument designed to assess the digital competencies of educators across various educational levels. Originally developed in English, this online tool was conceived as a pilot project to adapt to different educational contexts, although its initial validation focused on school education, with pending validation for higher education and other fields.
The questionnaire consists of 22 questions, each assessed on a 6-point Likert scale. The instrument covers six key areas of the DigCompEdu model: professional engagement, digital resources, teaching and learning, digital assessment, student empowerment, and facilitation of students’ digital competence. Each question offers six response options, allowing for a graded assessment of competencies [5]. The tool assesses educators’ digital competence level (on a scale from A1 to C2, similar to the Common European Framework of Reference for Languages) and provides personalized recommendations for improvement in each of the 22 competencies of the DigCompEdu framework.
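The CEFR-style scoring described above can be sketched as a mapping from a total Check-In score to a proficiency band. The fractional cut-offs and the intermediate band labels below are illustrative assumptions, not the official DigCompEdu Check-In thresholds:

```python
def digcompedu_level(total: float, max_total: float) -> str:
    """Map a Check-In total score to a CEFR-style band (A1-C2).

    The cut-offs and the intermediate band labels (Explorer, Integrator,
    Expert, Leader) are illustrative assumptions, not the official
    Check-In thresholds.
    """
    bands = [
        (0.20, "A1 (Novice)"),
        (0.40, "A2 (Explorer)"),
        (0.60, "B1 (Integrator)"),
        (0.80, "B2 (Expert)"),
        (0.95, "C1 (Leader)"),
    ]
    ratio = total / max_total
    for cutoff, label in bands:
        if ratio < cutoff:
            return label
    return "C2 (Pioneer)"
```

For example, a respondent scoring 45 out of a hypothetical maximum of 88 would be placed in the B1 band under these assumed cut-offs.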
For the development of this study, to ensure its relevance and validity for this specific population of pre-service teachers, the DigCompEdu Check-In instrument was adapted, modifying some items to reflect the context of teacher training rather than current teaching practice. The adapted questionnaire underwent expert validation to ensure content validity and clarity, with feedback from a panel of educational technology specialists and teacher educators informing the final version of the instrument.
These universities were specifically selected for a detailed study of this population due to the relative absence of research employing these instruments within this demographic, allowing for an in-depth exploration of digital competence trajectories in future educators. To further ensure the significance of the sample, a stratified random sampling technique was employed, proportionally representing students from various educational programs and academic years across the three universities, thereby enhancing the generalizability of the findings within the Andalusian university context.
To ensure the robustness of the instrument, several statistical analyses were conducted. Mardia’s multivariate test [30] was used to evaluate the normality of the data, revealing a non-normal distribution. Additionally, the assumptions of multicollinearity, homogeneity, and homoscedasticity were examined to assess the interdependence between variables. Finally, Confirmatory Factor Analysis (CFA) was conducted to validate the internal structure and the validity of each item in the questionnaire (Table 1).
Factor loadings for the items of this questionnaire reflected an optimal fit [16]: χ2/df = 4.637; CFI = 0.927; TLI = 0.903; SRMR = 0.0675; RMSEA = 0.0791. The reliability of this questionnaire was indicated by Cronbach’s α = 0.950 and McDonald’s ω = 0.955.
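The multivariate normality check reported above can be sketched in a few lines of Python. This is an illustrative implementation of Mardia's skewness and kurtosis on simulated data (the study itself ran its analyses in Jamovi and SmartPLS); the chi-square and normal approximations for the p-values are the standard large-sample ones.

```python
import numpy as np
from scipy import stats

def mardia(X):
    """Mardia's multivariate skewness (b1) and kurtosis (b2),
    with large-sample chi-square / normal p-value approximations."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    S = np.cov(Xc, rowvar=False, bias=True)      # ML covariance estimate
    D = Xc @ np.linalg.inv(S) @ Xc.T             # Mahalanobis cross-products
    b1 = (D ** 3).sum() / n ** 2                 # multivariate skewness
    b2 = (np.diag(D) ** 2).mean()                # multivariate kurtosis
    skew_stat = n * b1 / 6
    skew_df = p * (p + 1) * (p + 2) / 6
    skew_p = stats.chi2.sf(skew_stat, skew_df)
    kurt_z = (b2 - p * (p + 2)) / np.sqrt(8 * p * (p + 2) / n)
    kurt_p = 2 * stats.norm.sf(abs(kurt_z))
    return b1, skew_p, b2, kurt_p

# Under multivariate normality, b2 should be close to p * (p + 2);
# Likert-type survey data typically deviate from this, as in the study.
rng = np.random.default_rng(0)
b1, skew_p, b2, kurt_p = mardia(rng.normal(size=(2000, 3)))
```

Low p-values for either statistic indicate departure from multivariate normality, which is one of the stated motivations for choosing PLS-SEM over covariance-based SEM.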

2.3. Procedure

The research was conducted in strict compliance with national and international ethical regulations, including EU Regulation 2016/679 of the European Parliament and Council, dated 27 April 2016, regarding the protection of personal data, as well as Organic Law 3/2018, of December 5, which safeguards digital rights. The anonymity and confidentiality of the participants were guaranteed, and an online questionnaire via Google Forms was used for data collection. The researchers provided students with a detailed explanation of this study’s objectives and instructions for completing the questionnaire, emphasizing the voluntary nature of participation.
Throughout the process, strict adherence was maintained to the ethical principles outlined in the Declaration of Helsinki, ensuring the integrity and quality of the collected data. Informed consent was obtained from each participant, and this rigorous ethical approach met legal standards and reinforced the validity and reliability of the research. This procedure ensured that data collection and processing were carried out ethically and in compliance with current regulations.

2.4. Data Analysis

This research employed a variety of statistical analyses to ensure the robustness of the results. Initially, descriptive statistics such as means and standard deviations were calculated, applying the Hot-Deck multiple imputation method to reduce biases and preserve data distributions [31]. A Confirmatory Factor Analysis (CFA) was conducted to assess the validity, reliability, and internal consistency of the instruments, and the non-normality of the data was verified through Mardia’s multivariate test. These analyses were performed using Jamovi 1.2 and SmartPLS 4 [17].
To assess the model fit, various coefficients were used, such as χ2/df, RMSEA, and CFI, with optimal values for TLI and CFI being ≥ 0.95 and RMSEA close to 0.07 [32]. Convergent validity was assessed through the average variance extracted (AVE > 0.50) [16], while discriminant validity was examined using the Henseler et al. [33] criteria and the Heterotrait–Monotrait ratio index (<0.90). Bootstrapping with 5000 samples was used to evaluate the significance of the coefficients in the structural model [16], considering results significant when p < 0.05. The choice of PLS-SEM was based on its ability to explain and predict endogenous constructs without assuming a normal distribution of the data [16].

3. Results

PLS Path Model

To evaluate multicollinearity in each of the analyzed dimensions, the Variance Inflation Factor (VIF) was used, following the guidelines of Becker et al. [34]. The results showed that there were no significant multicollinearity issues. In the structural model analysis, the bootstrapping technique was applied, generating 5000 subsamples, in line with the criteria proposed by Henseler et al. [33]. This methodology allowed for the calculation of standard errors and t-statistics for the path coefficients, establishing a 95% confidence interval for the standardized regression coefficients.
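The bootstrapping procedure described above can be sketched as follows: cases are resampled with replacement 5000 times, a standardized path coefficient is re-estimated on each subsample, and the resulting distribution yields a standard error, a t-statistic (compared against 1.96), and a 95% confidence interval. The data are simulated with a known positive path; this is an illustration of the technique, not a reproduction of the study's estimates.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500

# Hypothetical standardized construct scores with a true positive path.
x = rng.normal(size=n)
y = 0.6 * x + rng.normal(scale=0.8, size=n)

def path_coef(x, y):
    # With one predictor, the standardized slope equals the correlation.
    return np.corrcoef(x, y)[0, 1]

estimate = path_coef(x, y)

# Bootstrap: resample cases with replacement, 5000 subsamples.
boot = np.empty(5000)
for b in range(5000):
    idx = rng.integers(0, n, n)
    boot[b] = path_coef(x[idx], y[idx])

se = boot.std(ddof=1)
t_stat = estimate / se                         # compare against 1.96
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])
```

A path is deemed significant at the 5% level when |t| exceeds 1.96, equivalently when the 95% bootstrap interval excludes zero.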
Additionally, the coefficient of determination (R2) and cross-validated redundancy (Q2) were evaluated, as well as the relationships between the variables (Figure 2), following the recommendations of Chin [35] and Hair et al. [16]. The obtained values for R2 were as follows: 67.0% for digital pedagogy, 50.4% for assessment and feedback, 55.3% for professional engagement, 23.9% for empowering students, and 33.2% for facilitating students’ digital skills, all of which were considered optimal according to Chin [35].
Finally, predictive relevance was analyzed using the Stone–Geisser Q2 statistic, which yielded results of 0.516 for digital pedagogy, 0.434 for assessment and feedback, 0.406 for professional engagement, 0.172 for empowering students, and 0.243 for facilitating students’ digital skills; these results indicate adequate predictive relevance according to Hair et al. [16].
Table 2 presents the results for Cronbach’s alpha coefficient, the outer loadings, and the values of the Composite Reliability Index (CRI). To assess convergent validity, the average variance extracted (AVE) was used, where a value greater than 0.5 indicates that a construct adequately captures the variance of its indicators, according to the criteria established by Becker et al. [34]. All constructs present Cronbach’s alpha coefficients greater than 0.7, indicating good internal consistency. The values of the Composite Reliability Index are likewise all greater than 0.7, suggesting adequate reliability of the measurements. Additionally, the AVE values for each construct are all greater than 0.5, confirming satisfactory convergent validity. These results reflect strong performance in terms of reliability and convergent validity for the dimensions analyzed in this study.
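The three reliability and convergent-validity measures reported in Table 2 have simple closed forms. The sketch below computes them from hypothetical standardized outer loadings and a simulated four-item scale (the loadings shown are illustrative values, not those of the study):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha from an (n, k) matrix of item scores."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

def composite_reliability(loadings):
    """Composite Reliability Index from standardized outer loadings,
    taking each error variance as 1 - loading**2."""
    lam = np.asarray(loadings, float)
    return lam.sum() ** 2 / (lam.sum() ** 2 + (1 - lam ** 2).sum())

def ave(loadings):
    """Average variance extracted: mean squared standardized loading."""
    lam = np.asarray(loadings, float)
    return (lam ** 2).mean()

# Hypothetical outer loadings for one construct (not the study's values).
loadings = [0.78, 0.82, 0.75, 0.80]
cr = composite_reliability(loadings)     # > 0.7 indicates reliability
ave_val = ave(loadings)                  # > 0.5 indicates convergent validity

# Simulated 4-item scale sharing one common factor, for alpha.
rng = np.random.default_rng(1)
common = rng.normal(size=(400, 1))
alpha = cronbach_alpha(common + rng.normal(scale=0.8, size=(400, 4)))
```

Note how AVE > 0.5 requires loadings above roughly 0.71 on average, which is why weak indicators are typically dropped in PLS-SEM measurement models.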
Table 3 presents the results of the discriminant validity analysis, conducted using the Heterotrait–Monotrait (HTMT) ratio according to the standards established by Henseler et al. [33]. For discriminant validity to be satisfactory under the Fornell–Larcker criterion, the bolded elements on the main diagonal must be significantly larger than the off-diagonal elements in the corresponding rows and columns. Additionally, the HTMT ratio indicates how distinct the latent variable of each factor is from the others; the criteria are met in this study, as all HTMT values are below 0.85, as recommended in the literature [33]. These results suggest that the analyzed dimensions are indeed distinct from one another, meeting the established criteria to validate their use in this study.
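The HTMT ratio used above is the mean of the between-construct (heterotrait) indicator correlations divided by the geometric mean of the within-construct (monotrait) correlations. The following sketch computes it for two simulated indicator blocks; the data and the 0.4 construct correlation are assumptions for illustration only.

```python
import numpy as np

def htmt(X, Y):
    """Heterotrait-Monotrait ratio for two blocks of indicators,
    each an (n, k) array belonging to one construct."""
    kx = X.shape[1]

    def mean_offdiag(R):
        # Mean absolute correlation above the diagonal (monotrait part).
        return np.abs(R[np.triu_indices(R.shape[0], 1)]).mean()

    R_all = np.corrcoef(np.hstack([X, Y]), rowvar=False)
    hetero = np.abs(R_all[:kx, kx:]).mean()            # between-block corrs
    mono = np.sqrt(mean_offdiag(R_all[:kx, :kx])
                   * mean_offdiag(R_all[kx:, kx:]))    # within-block corrs
    return hetero / mono

# Two simulated constructs correlated at 0.4, three indicators each.
rng = np.random.default_rng(3)
n = 1000
f1 = rng.normal(size=(n, 1))
f2 = 0.4 * f1 + np.sqrt(1 - 0.4 ** 2) * rng.normal(size=(n, 1))
X = 0.8 * f1 + 0.6 * rng.normal(size=(n, 3))
Y = 0.8 * f2 + 0.6 * rng.normal(size=(n, 3))
value = htmt(X, Y)
```

Values below the 0.85 (conservative) or 0.90 (liberal) threshold, as cited in the text, indicate that the two constructs are empirically distinct.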
Table 4 presents the results of the hypothesis testing, following the criteria established by Hair et al. [16], showing the relationships between the variables. The t-test was performed, with values greater than 1.96 indicating the model’s consistency. In this study, the results exceeding this threshold were as follows: professional engagement → empowering students (β = 0.489, t = 36.158, p < 0.001); professional engagement → facilitating students’ digital competence (β = 0.576, t = 49.737, p < 0.001); assessment and feedback → professional engagement (β = 0.134, t = 6.299, p < 0.001); digital pedagogy → professional engagement (β = 0.629, t = 31.993, p < 0.001); and digital resources → assessment and feedback (β = 0.710, t = 82.807, p < 0.001). These results indicate that all the analyzed relationships are statistically significant, as the t-values are substantially greater than 1.96 and the p-values are below 0.001, providing strong evidence to reject the null hypothesis in each case. These findings reinforce the validity of the proposed model, showing significant connections between the different dimensions analyzed.

4. Conclusions

The conclusions of this study on digital teaching competencies in the Spanish university context provide important insights into the interrelationship of the different dimensions of the DigCompEdu framework, shedding light on how digital competencies in higher education develop and influence one another, all in alignment with this study’s primary objective of examining these relationships to inform strategies for enhancing educational practices in the digital age.
The first hypothesis (H1+), which posits a positive relationship between digital resources, digital pedagogy, and assessment with feedback, is confirmed with a significant correlation (r = 0.72, p < 0.001) between the use of digital resources and the implementation of digital pedagogy strategies. These results are consistent with the findings of Cabero-Almenara et al. [1], who validated the DigCompEdu Check-In tool, and are supported by the study of [8], which revealed a strong positive relationship between these elements in the self-perception of digital teaching competence. Both studies reinforce the importance of integrating advanced technological tools into instructional design and assessment processes, emphasizing the close link between the effective integration of digital resources and the improvement of pedagogical and evaluative practices in university teaching.
The second hypothesis (H2+), which proposes a positive relationship between digital pedagogy and professional engagement, is also confirmed. The analysis reveals a strong association (β = 0.68, p < 0.001) between these variables. These findings align with the observations of Sánchez-Cruzado et al. [7], who highlighted the need for continuous training in areas such as digital content creation, reinforcing the idea that the adoption of digital pedagogical strategies promotes ongoing professional development.
The third hypothesis (H3+), which links digital assessment and feedback with professional engagement, is supported by the results, with a moderate but significant correlation (r = 0.55, p < 0.01). These data are consistent with the study by Romero-García et al. [8], who examined the self-perception of digital teaching competence in university professors, revealing the importance of digital assessment systems in motivating reflection and improvement of teaching practices.
The fourth hypothesis (H4+), which connects professional engagement with empowering students, finds confirmation in the analyzed data, showing a strong causal relationship (β = 0.71, p < 0.001). These results complement the findings of Dias-Trindade et al. [11], who applied the DigCompEdu framework to assess the digital competencies of university professors, identifying how professional engagement translates into student-centered strategies in the specific areas they analyzed for improvement needs.
Finally, the fifth hypothesis (H5+), which posits that professional engagement facilitates the development of students’ digital competencies, is also confirmed with a significant correlation (r = 0.63, p < 0.001). These findings align with the analysis by Ghomi and Redecker [6], who examined the applicability of DigCompEdu in different contexts, highlighting how committed teachers effectively integrate digital technologies, providing opportunities for the development of digital skills in students. Additionally, these results are supported by the study in [28], which emphasizes that highly engaged teachers tend to integrate digital technologies more effectively into their pedagogical practices, offering students meaningful opportunities to develop their digital skills.
These conclusions validate the DigCompEdu framework as an effective tool for assessing digital teaching competencies and, in addition, provide empirical evidence of the complex interrelationships between the different dimensions of digital competence in Spanish higher education.
Despite the significant findings of this study, it is important to acknowledge certain limitations that should be considered when interpreting the results. First, the sample is limited to students from educational programs at three Spanish universities, which may affect the generalizability of the results to other university contexts or educational systems. Additionally, this study relies on self-assessments of digital competencies, which may introduce social desirability or self-perception biases. Another limitation is that contextual factors such as the technological infrastructure of institutions or specific educational policies, which could influence the development of digital teaching competencies, have not been considered.
For future research, it is recommended to expand the scope of study to a more diverse sample of higher education institutions, including different disciplines and cultural contexts. Longitudinal studies would be valuable for assessing how digital teaching competencies evolve over time and in response to specific interventions. Furthermore, incorporating mixed methods, combining quantitative data with in-depth interviews or classroom observations, would provide a more holistic understanding of the phenomenon. Future research could also explore the relationship between digital teaching competencies and student learning outcomes, as well as examine the impact of specific training programs based on the DigCompEdu framework on the development of digital teaching competencies.

Author Contributions

Conceptualization, I.G.-M. and E.P.-N.; methodology, I.G.-M., G.A.M.B. and E.P.-N.; software, E.P.-N.; validation, Ó.G.-C. and E.P.-N.; formal analysis, Ó.G.-C. and E.P.-N.; investigation, I.G.-M.; resources, G.A.M.B. and E.P.-N.; data analysis, Ó.G.-C.; writing—original draft, I.G.-M.; writing—review and editing, I.G.-M.; supervision, I.G.-M. and G.A.M.B.; project administration, E.P.-N.; funding acquisition, Ó.G.-C. and E.P.-N. All authors have read and agreed to the published version of the manuscript.

Funding

This study was funded by the Vice-Rector for Continuing Training, Educational Technologies and Teaching Innovation of the University of Jaén through the Teacher Innovation and Improvement Project (code PID2024_036), entitled “Innovative Methodologies in Primary Education”.

Institutional Review Board Statement

This study complied at all times with the ethical principles of research set out in the Declaration of Helsinki (World Medical Association, 2013) and was approved by the Ethics Committee of the University of Jaén under code OCT.20/1.TES on 7 March 2025.

Informed Consent Statement

Written informed consent has been obtained from the participants to publish this paper.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Cabero-Almenara, J.; Romero-Tena, R.; Palacios-Rodríguez, A. Evaluation of Teacher Digital Competence Frameworks Through Expert Judgement: The Use of the Expert Competence Coefficient. J. New Approaches Educ. Res. 2020, 9, 275–293. [Google Scholar] [CrossRef]
  2. Guillén-Gámez, F.D.; Mayorga-Fernández, M.J. Identification of variables that predict teachers’ attitudes toward ICT in higher education for teaching and research: A study with regression. Sustainability 2020, 12, 1312. [Google Scholar] [CrossRef]
  3. Sánchez-Prieto, J.C.; Huang, F.; Olmos-Migueláñez, S.; García-Peñalvo, F.J.; Teo, T. Exploring the unknown: The effect of resistance to change and attachment on mobile adoption among secondary pre-service teachers. Br. J. Educ. Technol. 2022, 53, 336–352. [Google Scholar] [CrossRef]
  4. Guillén-Gámez, F.D.; Mayorga-Fernández, M.J. Prediction of factors that affect the knowledge and use higher education professors from Spain make of ICT resources to teach research and evaluate: A study with explanatory factors. Educ. Sci. 2021, 11, 331. [Google Scholar] [CrossRef]
  5. Caena, F.; Redecker, C. Aligning teacher competence frameworks to 21st century challenges: The case for the European Digital Competence Framework for Educators (DigCompEdu). Eur. J. Educ. 2019, 54, 356–369. [Google Scholar] [CrossRef]
  6. Ghomi, M.; Redecker, C. Digital Competence of Educators (DigCompEdu): Development and Evaluation of a Self-assessment Instrument for Teachers’ Digital Competence. In Proceedings of the 11th International Conference on Computer Supported Education (CSEDU 2019), Crete, Greece, 2–4 May 2019; Volume 1, pp. 541–548. [Google Scholar] [CrossRef]
  7. Sánchez-Cruzado, C.; Santiago Campión, R.; Sánchez-Compaña, M.T. University professors’ digital competence: A study in the context of COVID-19. Sustainability 2023, 15, 1570. [Google Scholar] [CrossRef]
  8. Romero-García, C.; Buzón-García, O.; de Paz-Lugo, P. Self-perception of the digital competence of university teachers in Spain: A cross-sectional study of the influence of personal and professional factors. Int. J. Environ. Res. Public Health 2022, 19, 1474. [Google Scholar] [CrossRef]
  9. Redecker, C. European Framework for the Digital Competence of Educators: DigCompEdu. Joint Research Centre (Seville Site). 2017. Available online: https://joint-research-centre.ec.europa.eu/jrc-sites-across-europe/jrc-seville-spain_en (accessed on 4 February 2025).
  10. Redecker, C. European Framework for the Digital Competence of Educators: DigCompEdu. Publications Office of the European Union. 2020. Available online: https://european-union.europa.eu/institutions-law-budget/institutions-and-bodies/search-all-eu-institutions-and-bodies/publications-office-european-union-op_en (accessed on 4 February 2025).
  11. Dias-Trindade, S.; Moreira, J.A.; Ferreira, A.G. Assessment of university teachers on their digital competences. QWERTY—Open Interdiscip. J. Technol. Cult. Educ. 2020, 15, 50–69. [Google Scholar] [CrossRef]
  12. García-Delgado, M.Á.; Rodríguez-Cano, S.; Delgado-Benito, V.; de la Torre-Cruz, T. Competencia Docente Digital entre los Futuros Profesores de la Universidad de Burgos. Rev. Int. Multidiscip. Cienc. Soc. 2024, 13, 75–93. [Google Scholar] [CrossRef]
  13. Cabero-Almenara, J.; Gutiérrez-Castillo, J.J.; Barroso-Osuna, J.; Rodríguez-Palacios, A. Competencia Digital Docente según el Marco DigCompEdu. Estudio Comparativo en Diferentes Universidades Latinoamericanas. Rev. Nuevos Enfoques Investig. Educ. 2023, 12, 276–291. [Google Scholar] [CrossRef]
  14. Palacios-Rodríguez, A.; Gutiérrez-Castillo, J.J.; Martín-Párraga, J.; Serrano-Hidalgo, A. La formación digital en los programas de iniciación a la docencia universitaria en España: Un análisis comparativo a partir del DigComp y DigCompEdu. Educ. XX1 2024, 27, 2. [Google Scholar] [CrossRef]
  15. Rodríguez-Rivera, P.; Rodríguez-Ferrer, J.M.; Manzano-León, A. Diseño de salas de escape digitales con IA generativa en contextos universitarios: Un estudio cualitativo. Multimodal Technol. Interact. 2025, 9, 20. [Google Scholar] [CrossRef]
  16. Hair, J.F.; Sarstedt, M.; Ringle, C.M.; Gudergan, S.P.; Castillo Apraiz, J.; Cepeda Carrión, G.A.; Roldán, J.L. Manual Avanzado de Partial Least Squares Structural Equation Modeling (PLS-SEM); OmniaScience: Terrassa, Spain; Barcelona, Spain, 2021. [Google Scholar]
  17. Ringle, C.M.; Sarstedt, M.; Mitchell, R.; Gudergan, S.P. Partial least squares structural equation modeling in HRM research. Int. J. Hum. Resour. Manag. 2020, 31, 1617–1643. [Google Scholar] [CrossRef]
  18. Koehler, M.J.; Mishra, P. What is technological pedagogical content knowledge? Contemp. Issues Technol. Teach. Educ. 2009, 9, 60–70. [Google Scholar] [CrossRef]
  19. Nicol, D.J.; Macfarlane-Dick, D. Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Stud. High. Educ. 2006, 31, 199–218. [Google Scholar] [CrossRef]
  20. Panadero, E.; Jonsson, A.; Botella, J. Effects of self-assessment on self-regulated learning and self-efficacy: Four meta-analyses. Educ. Res. Rev. 2017, 22, 74–98. [Google Scholar] [CrossRef]
  21. Hattie, J.; Timperley, H. The power of feedback. Rev. Educ. Res. 2007, 77, 81–112. [Google Scholar] [CrossRef]
  22. Mishra, P.; Koehler, M.J. Technological pedagogical content knowledge: A framework for teacher knowledge. Teach. Coll. Rec. 2006, 108, 1017–1054. [Google Scholar] [CrossRef]
  23. Tondeur, J.; van Braak, J.; Ertmer, P.A.; Ottenbreit-Leftwich, A. Understanding the relationship between teachers’ pedagogical beliefs and technology use in education: A systematic review of qualitative evidence. Educ. Technol. Res. Dev. 2017, 65, 555–575. [Google Scholar] [CrossRef]
  24. Bennett, R.E. Formative assessment: A critical review. Assess. Educ. Princ. Policy Pract. 2011, 18, 5–25. [Google Scholar] [CrossRef]
  25. Carless, D.; Boud, D. The development of student feedback literacy: Enabling uptake of feedback. Assess. Eval. High. Educ. 2018, 43, 1315–1325. [Google Scholar] [CrossRef]
  26. Reeve, J.; Jang, H. What teachers say and do to support students’ autonomy during a learning activity. J. Educ. Psychol. 2006, 98, 209–218. [Google Scholar] [CrossRef]
  27. Beetham, H.; Sharpe, R. Rethinking Pedagogy for a Digital Age: Designing for 21st Century Learning, 2nd ed.; Routledge: London, UK, 2013. [Google Scholar] [CrossRef]
  28. Krumsvik, R.J. Teacher educators’ digital competence. Scand. J. Educ. Res. 2014, 58, 269–280. [Google Scholar] [CrossRef]
  29. Uerz, D.; Volman, M.; Kral, M. Teacher educators’ competences in fostering student teachers’ proficiency in teaching and learning with technology: An overview of relevant research literature. Teach. Teach. Educ. 2018, 70, 12–23. [Google Scholar] [CrossRef]
  30. Mardia, K.V. Applications of some measures of multivariate skewness and kurtosis in testing normality and robustness studies. Sankhyā Indian J. Stat. Ser. B 1974, 36, 115–128. [Google Scholar]
  31. Lorenzo-Seva, U.; Van Ginkel, J.R. Multiple imputation of missing values in exploratory factor analysis of multidimensional scales: Estimating latent trait scores. An. Psicol. 2016, 32, 596–608. [Google Scholar] [CrossRef]
  32. Kline, R.B. Principles and Practice of Structural Equation Modeling, 4th ed.; The Guilford Press: New York, NY, USA, 2016. [Google Scholar]
  33. Fornell, C.; Larcker, D. Evaluating structural equation models with unobservable variables and measurement error. J. Mark. Res. 1981, 18, 39–50. [Google Scholar] [CrossRef]
  34. Becker, J.M.; Ringle, C.M.; Sarstedt, M. Estimating Moderating Effects in PLS-SEM and PLSc-SEM: Interaction Term Generation. J. Appl. Struct. Equ. Model. 2018, 2, 1–21. [Google Scholar] [CrossRef]
  35. Chin, W.W. Issues and Opinion on Structural Equation Modeling. MIS Q. 1998, 22, vii–xv. [Google Scholar]
Figure 1. Proposed theoretical model (2025). Own elaboration.
Figure 2. PLS path model and result estimation.
Table 1. Factor loadings of the instrument.

| Factor | Indicator | Estimate | SE | Z | p | β |
|---|---|---|---|---|---|---|
| Professional commitment | item 1 | 0.713 | 0.0138 | 51.6 | <0.001 | 0.794 |
| | item 2 | 0.605 | 0.0129 | 47.0 | <0.001 | 0.746 |
| | item 3 | 0.926 | 0.0178 | 52.0 | <0.001 | 0.805 |
| Digital resources | item 1 | 0.650 | 0.0143 | 45.3 | <0.001 | 0.725 |
| | item 2 | 0.936 | 0.0204 | 45.9 | <0.001 | 0.733 |
| Digital pedagogy | item 1 | 0.941 | 0.0169 | 55.8 | <0.001 | 0.808 |
| | item 2 | 1.153 | 0.0177 | 65.2 | <0.001 | 0.891 |
| | item 3 | 0.678 | 0.0126 | 53.9 | <0.001 | 0.789 |
| | item 4 | 1.058 | 0.0160 | 65.9 | <0.001 | 0.898 |
| Assessment and feedback | item 1 | 0.797 | 0.0126 | 63.1 | <0.001 | 0.891 |
| | item 3 | 0.804 | 0.0136 | 58.9 | <0.001 | 0.850 |
| Empowering students | item 1 | 0.754 | 0.0230 | 32.8 | <0.001 | 0.619 |
| | item 2 | 0.884 | 0.0221 | 40.0 | <0.001 | 0.668 |
| | item 3 | 0.647 | 0.0175 | 37.1 | <0.001 | 0.652 |
| | item 4 | 0.819 | 0.0187 | 43.8 | <0.001 | 0.710 |
| Facilitating students’ digital competence | item 5 | 1.076 | 0.0172 | 62.4 | <0.001 | 0.876 |
| | item 6 | 1.142 | 0.0204 | 56.1 | <0.001 | 0.818 |
| | item 7 | 0.942 | 0.0170 | 55.3 | <0.001 | 0.811 |
| | item 8 | 1.059 | 0.0172 | 61.7 | <0.001 | 0.870 |

Abbreviations: SE: standard error; Z: Z-value of the estimate; p: p-value of the Z-estimate; β: standardized estimate.
Table 2. Correlations, reliability estimates, and convergent validity.

| | Cronbach’s α | Composite Reliability (Rho A) | Composite Reliability | Average Variance Extracted (AVE) |
|---|---|---|---|---|
| Professional commitment | 0.826 | 0.832 | 0.896 | 0.741 |
| Empowering students | 0.811 | 0.821 | 0.888 | 0.726 |
| Facilitating students’ digital competence | 0.922 | 0.935 | 0.941 | 0.761 |
| Assessment and feedback | 0.862 | 0.893 | 0.935 | 0.877 |
| Digital pedagogy | 0.903 | 0.907 | 0.933 | 0.776 |
| Digital resources | 0.794 | 0.796 | 0.867 | 0.765 |
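The reliability and convergent validity indices reported above can be recomputed from standardized indicator loadings. The sketch below shows the standard formulas for composite reliability (ρc) and AVE; the loadings are illustrative inputs (the published indices come from the full PLS estimation, so this sketch is not expected to reproduce Table 2 exactly):

```python
def composite_reliability(loadings):
    """Composite reliability (rho_c) from standardized loadings."""
    s = sum(loadings)
    e = sum(1 - l ** 2 for l in loadings)  # indicator error variances
    return s ** 2 / (s ** 2 + e)

def average_variance_extracted(loadings):
    """AVE: mean squared standardized loading of a construct."""
    return sum(l ** 2 for l in loadings) / len(loadings)

# Illustrative three-indicator construct
lam = [0.794, 0.746, 0.805]
print(round(composite_reliability(lam), 3))       # 0.825
print(round(average_variance_extracted(lam), 3))  # 0.612
```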
Table 3. Measurement model. Discriminant validity.

| Fornell–Larcker Criterion | 1 | 2 | 3 | 4 | 5 | 6 |
|---|---|---|---|---|---|---|
| 1. Professional commitment | 0.861 | | | | | |
| 2. Empowering students | 0.489 | 0.852 | | | | |
| 3. Facilitating students’ digital competence | 0.576 | 0.510 | 0.872 | | | |
| 4. Assessment and feedback | 0.658 | 0.755 | 0.637 | 0.937 | | |
| 5. Digital pedagogy | 0.740 | 0.749 | 0.823 | 0.834 | 0.881 | |
| 6. Digital resources | 0.599 | 0.674 | 0.690 | 0.710 | 0.819 | 0.875 |

| Heterotrait–Monotrait Ratio Matrix (HTMT) | 1 | 2 | 3 | 4 | 5 | 6 |
|---|---|---|---|---|---|---|
| 1. Professional commitment | | | | | | |
| 2. Empowering students | 0.594 | | | | | |
| 3. Facilitating students’ digital competence | 0.642 | 0.585 | | | | |
| 4. Assessment and feedback | 0.772 | 0.810 | 0.698 | | | |
| 5. Digital pedagogy | 0.848 | 0.873 | 0.899 | 0.840 | | |
| 6. Digital resources | 0.781 | 0.804 | 0.847 | 0.807 | 0.834 | |
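The Fornell–Larcker test in Table 3 amounts to checking that each construct’s √AVE (the diagonal) exceeds its correlation with every other construct. A minimal sketch of that check over the reported matrix (construct labels abbreviated for readability):

```python
import numpy as np

labels = ["PC", "ES", "FDC", "AF", "DP", "DR"]
# Lower triangle of Table 3: diagonal = sqrt(AVE), below = correlations.
fl = np.array([
    [0.861, 0.000, 0.000, 0.000, 0.000, 0.000],
    [0.489, 0.852, 0.000, 0.000, 0.000, 0.000],
    [0.576, 0.510, 0.872, 0.000, 0.000, 0.000],
    [0.658, 0.755, 0.637, 0.937, 0.000, 0.000],
    [0.740, 0.749, 0.823, 0.834, 0.881, 0.000],
    [0.599, 0.674, 0.690, 0.710, 0.819, 0.875],
])
off = fl + fl.T             # symmetric correlation matrix...
np.fill_diagonal(off, 0.0)  # ...with the diagonal removed
ok = np.diag(fl) > off.max(axis=0)  # sqrt(AVE) vs. largest correlation
print({n: bool(v) for n, v in zip(labels, ok)})  # all True: criterion holds
```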
Table 4. Path coefficient (standardized regression coefficient).

| Path | Path Coefficient (β) | M | DT | t-Statistic | p |
|---|---|---|---|---|---|
| Professional commitment → empowering students | 0.489 | 0.489 | 0.014 | 36.158 | *** |
| Professional commitment → facilitating students’ digital competence | 0.576 | 0.577 | 0.012 | 49.737 | *** |
| Assessment and feedback → professional commitment | 0.134 | 0.133 | 0.021 | 6.299 | *** |
| Digital pedagogy → professional commitment | 0.629 | 0.629 | 0.020 | 31.993 | *** |
| Digital resources → assessment and feedback | 0.710 | 0.710 | 0.009 | 82.807 | *** |

Note: *** = p < 0.001.
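In PLS-SEM bootstrapping, the t-statistic in Table 4 is typically the original path estimate divided by the standard deviation of its bootstrap distribution (the DT column). A minimal illustration with the reported values; the small discrepancy from the table arises because Table 4 rounds β and DT while the published t was computed from unrounded values:

```python
def bootstrap_t(beta, boot_sd):
    """t-statistic for a bootstrapped PLS path coefficient."""
    return beta / boot_sd

# Assessment and feedback -> professional commitment (Table 4)
t = bootstrap_t(0.134, 0.021)
print(round(t, 2))  # 6.38; Table 4 reports 6.299 from unrounded estimates
```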