
Improving Future Teachers’ Digital Competence Using Active Methodologies

Carmen Romero-García, Olga Buzón-García and Patricia de Paz-Lugo
Department of Didactics of Mathematics and Experimental Sciences, Universidad Internacional de la Rioja, 28040 Madrid, Spain
Department of Didactics and Educational Organization, Universidad de Sevilla, 41013 Sevilla, Spain
Department of Didactics of Physical and Health Education, Universidad Internacional de la Rioja, 28040 Madrid, Spain
Author to whom correspondence should be addressed.
Sustainability 2020, 12(18), 7798;
Submission received: 18 August 2020 / Revised: 11 September 2020 / Accepted: 16 September 2020 / Published: 21 September 2020


Contemporary society demands a university education based on active and participatory educational models that enable the development of competences, with digital competence among those most in demand. This work presents the results of an educational innovation at the university level. It analyses whether the implementation of an active methodology supported by technological tools in a virtual classroom contributes to the development of students’ digital competence. A quantitative methodology with a pre-experimental pretest-posttest design was used. The sample comprised 30 students studying the Curriculum Design module on the Biology and Geology specialism of the Master’s in Teacher Training at the Universidad Internacional de la Rioja. The results show an improvement, with a large effect size, in the five areas of digital competence specified by the Common Framework for Teachers’ Digital Competence (MCCDD) established by Spain’s National Institute of Educational Technologies and Teacher Training (INTEF). It is concluded that the educational intervention increased the level of digital competence of future teachers.

1. Introduction

Nowadays, students have to learn to live in a globalised, digitised, intercultural, and changing society that produces vast quantities of information. Therefore, students’ learning needs require ways of teaching that are different from those used 20 years ago [1,2]. For some years, we have been experiencing a transition from an education model centred on teaching and content transmission towards a methodological model focused on the acquisition of competences. However, university education has traditionally been based on a lecturer-centred educational model that emphasises the transmission of knowledge and its reproduction by the students, the lecturer’s lesson, and individual work [3].
One of the strategic objectives of the European Commission in the field of education and training (“ET2020”) is to encourage innovation and creativity, promoting the acquisition of transversal competences, including digital competence, by all citizens [4].
Digital competence is one of the eight key competencies that every person should have developed upon completion of compulsory education to be able to adapt quickly to a rapidly changing world with multiple interconnections [5].
The DIGCOMP project—launched by the European Commission to foster better understanding and development of digital competence in Europe—identifies digital competence as a transversal competence; in other words, one that enables us to acquire other competences, and one related to the 21st-century skills that every member of society should acquire to ensure their active participation in society and the economy [6].
According to the European Commission, digital competence involves safe, responsible, and critical use of digital technologies for learning, working, and participating in society [7]. It involves not only basic technical mastery, but also the development of abilities to: (1) browse, evaluate, and manage information; (2) communicate and collaborate; (3) create digital contents; (4) preserve safety; and (5) solve problems, both in formal, non-formal, and informal learning contexts.
Similarly, Spain’s National Institute of Educational Technologies and Teacher Training (INTEF) states that digital competence involves creative, safe, and critical use of information and communication technology (ICT) to achieve objectives related to learning, employability, work, free time, inclusion, and participation in society [8].
Against this background, education policies have made evident efforts to introduce ICT in schools in the hope of achieving improvements in learning and digital literacy.
Correct development of digital competence in the educational system requires teachers to have sufficient training in this competence, as the introduction of ICT in classrooms does not guarantee improved educational quality unless teachers have suitable digital competencies [4,8,9,10,11,12,13,14,15].
Teachers’ digital competence has been defined as the set of capacities and skills that result in the adequate incorporation and use of ICT as a methodological resource, integrated into the teaching–learning process, thus transforming ICT into Learning and Knowledge Technology (LKT) with a clear educational application [16].
The basis of effective teaching with ICT arises from the interaction between content, pedagogical, and technological knowledge (T-PACK) [17,18]. This means that teacher digital competence is the teacher’s proficiency in using ICT in a professional context with good pedagogical-didactic judgement and an awareness of the implications for learning strategies and for students’ digital training [19].
Various studies have shown that, up to now, future teachers’ initial training programmes in digital competence have been quite poor [20,21,22,23,24,25,26]. This could be one of the main reasons for the failure to integrate ICT into the curriculum in education.
Teachers can use ICT to follow a traditional transmission–reception pedagogical model. ICT can also be used to respond to the challenges of contemporary society [27]. The type of teaching and learning model that future teachers experience during their training will determine which option they choose in their professional practice, hence the importance of an adequate initial training in the use and application of constructivist and collaborative models based on ICT for future teachers [28].
Successful integration of ICT into the curriculum in teachers’ future educational practice will only be achieved with good initial training in teachers’ digital competence [10,29,30,31].
Nonetheless, in most cases, teacher training in digital competence is solely limited to instrumental questions, leaving aside the implementation of innovative teaching practices involving these technologies [32].
The aim of this work is to analyse the improvement in digital competence of future secondary education teachers. For the purpose of the analysis, we have implemented a teaching design based on the collaborative learning methodology mediated by digital tools in an online setting.
For the purpose of this paper, we will use the definition of teacher digital competence contained in the Common Digital Competence Framework for Teachers (INTEF) [8].
In Spain, the organisation responsible for regulating teachers’ digital competence is the National Institute of Educational Technologies and Teacher Training (INTEF). INTEF’s Common Framework for Teachers’ Digital Competence (MCCDD) [8] for diagnosing and improving teachers’ digital competences is based on the competences described in the DigComp project for all citizens [6] and on the European Framework for the Digital Competence of Educators (DigCompEdu) [13]. The MCCDD comprises 21 competences in 5 areas (Table 1), while DigCompEdu distinguishes 6 areas in which the educator’s digital competence is expressed, with a total of 22 competences (Table 2). These frameworks are the basis of the instrument used in the present research for evaluating teachers’ digital competence, which was previously validated by Tourón et al. [16].
ICT does not in itself produce improvements in learning unless students are regarded as people who are capable of thinking [33]. Taking this into account, we have used a teaching design based on collaborative learning in which students play a key role and are the main protagonists of their learning process. It has been proven that collaborative learning offers numerous benefits for the students’ learning process [34,35,36,37,38,39,40,41].
However, we must be clear about what we mean by collaborative learning. The classic definitions of Johnson and Johnson (1987) [42] and Johnson, Johnson and Smith (1991) [43] place the emphasis on the interdependence between individual and group effort and learning: each member of the group is responsible both for their own learning and for that of the other members, and is motivated to help the others in order to achieve common goals. When these premises are met, collaborative methodologies can improve the learning process.
Among the main advantages of collaborative learning methodology through ICT are the following: (a) academic benefits, such as promoting metacognition and allowing students to exercise a sense of control over the task; (b) social benefits, by encouraging students to see situations from different perspectives and creating an environment where students can practice social and leadership skills, as well as facilitating the integration of students with learning difficulties; (c) psychological benefits, by providing a satisfactory learning experience, reducing student anxiety [34,36,40].

2. Materials and Methods

2.1. Sample

The experiment was developed in the group taught by the authors of the present paper. Therefore, the sampling used was non-probability convenience sampling. A total of 30 students participated in the research. These students were taking the Curriculum Design module in the Biology and Geology specialism of the Master’s in Secondary and Baccalaureate Teacher Training in the Faculty of Education at the Universidad Internacional de La Rioja (UNIR), a wholly online university, during the 2018–2019 academic year. Of these students, 61.7% were women and 38.3% men; 19.56% held doctorates and 80.43% were graduates, with a mean age of 32.3 years. As has been noted, the mean age of online students is greater than at face-to-face universities, where the mean age is 23.2 for male students and 22.9 for female students [43]. Regarding previous teaching experience, 62.4% of the students had none, 16.8% had less than 1 year, 16% had between 1 and 3 years, and 4.8% had over 5 years’ experience.

2.2. Research Design

To evaluate the results of the educational intervention programme implemented, we used a quantitative methodology with a pre-experimental single-group pretest–posttest design. This research design is appropriate when carrying out research within natural contexts, such as a classroom. In these situations, group equalising techniques do not offer full control of certain variables (characteristics of the subjects, previous experiences, etc.) and are therefore not the most adequate approach [44]. The programme is based on a collaborative learning methodology supported by various digital tools. The syllabus for the module, comprising 14 topics, was delivered in 15 virtual live sessions of 120 min each, which took place once a week, and 5 sessions of 60 min spread throughout the semester. The sessions were delivered synchronously in a virtual classroom using the Adobe Connect software, which enables the teacher to play video and audio, share the blackboard and material, exchange comments with students through an interactive chat function, and divide the class into independent breakout rooms that simulate the distribution into groups in a face-to-face class, where each group works independently.
Twenty working sessions were designed in which the students performed collaborative activities synchronously in the virtual classroom, putting theoretical content into practice and developing and/or fostering digital competence. These activities were supported by digital content creation, collaboration, and evaluation tools (Table 3).
The teaching design used in the virtual classroom was as follows.
Content was presented and students’ preconceptions were detected using videos recorded by the lecturer and enriched with questions on the Edpuzzle platform, or using documents shared with the students via the Perusall app. The lecturers could review students’ answers to the questions in the videos and their comments on the documents to establish, in advance of the virtual class, whether students were clear about the theoretical concepts needed to tackle the corresponding session. In both cases, the students’ answers and/or comments were shared, and the class started by resolving doubts about the content that the lecturer had identified as causing the greatest difficulty. In some sessions, the content was presented through an explanation by the lecturer, supported by a presentation, and students’ preconceptions were detected at the start of the class via a brainstorm written up on a notepad.
Once the content presentation and/or solving of doubts were complete, 10 min were spent explaining the activity to be performed and the digital tool to be used. The activities were done synchronously online. In them, the students first used the tools as learners and then learnt to use them from the point of view of the lecturer. To do this, the lecturer shared a document with the students setting out the objectives, the content to be covered, the procedure to follow to do the activity, and the evaluation. Next, working groups of 4–6 people were set up using the “create breakout rooms” function of the Adobe Connect platform.
During the activity, the lecturer moved around the groups to give students feedback on their work. Once the session had ended, the teacher reviewed the work and sent students corrections and comments on the completed activity using the forum function.

2.3. Instrument

The study variable after the intervention was the students’ digital competence.
To determine changes in the level of digital competence caused by the educational intervention, a questionnaire validated by Tourón et al. was used [16]. This comprises five dimensions based on the five areas established in the Common Framework for Teachers’ Digital Competence developed by INTEF [8]: Information and Data Literacy, Communication and Collaboration, Digital Content Creation, Safety, and Problem Solving. Each dimension comprises a variable number of items, which are evaluated using two Likert-type scales (1 Not at all—2 Very little—3 A little—4 Somewhat—5 A lot—6 Very much—7 Completely), one of which relates to knowledge of the item in question and the other to how the students use it. This questionnaire is considered to be suitable for measuring the level of competence of the future teacher and was applied at two different points—at the start of the module and after completing it—to determine whether levels of digital competence changed after carrying out the learning experience in comparison with the level established at the start.
The questionnaires were prepared using Google Forms and were shared with the students through the lecturer–student communication forum in the learning platform normally used.

2.4. Data Analysis

Firstly, to check whether the data on digital competence obtained followed a normal distribution, we used the Kolmogorov–Smirnov and Shapiro–Wilk tests. Secondly, we used the Wilcoxon signed-rank test to analyse levels of digital competence before and after the intervention and to verify whether any changes occurred. Finally, for all of the comparisons of groups, we calculated the effect sizes (Cohen’s r), where values of r = 0.10 are regarded as low, r = 0.3 medium, r = 0.5 large, and r = 0.7 very large [45]. We organised, codified, and analysed the data using the SPSS 26.0 statistics package.
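The pipeline described above (normality check, paired nonparametric comparison, effect size) can be sketched in a few lines. This is a minimal illustration with hypothetical Likert-type data, not the study’s real questionnaire scores; the effect size r is recovered from the normal approximation of the Wilcoxon W statistic as r = |z|/√N.

```python
# Sketch of the pretest-posttest analysis pipeline (hypothetical data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 30                                            # sample size, as in the study
pre = rng.integers(1, 8, size=n).astype(float)    # simulated 7-point Likert scores
post = np.clip(pre + rng.integers(0, 3, size=n), 1, 7)

# 1) Normality check (Shapiro-Wilk shown; the Lilliefors-corrected K-S test
#    would need statsmodels).
sw_stat, sw_p = stats.shapiro(pre)

# 2) Wilcoxon signed-rank test on the paired pre/post scores.
res = stats.wilcoxon(pre, post)

# 3) Effect size r = |z| / sqrt(N), with z recovered from the normal
#    approximation of W (no tie correction, for simplicity of the sketch).
d = post - pre
nz = np.count_nonzero(d)                          # pairs with nonzero difference
mu = nz * (nz + 1) / 4
sigma = np.sqrt(nz * (nz + 1) * (2 * nz + 1) / 24)
z = (res.statistic - mu) / sigma
r = abs(z) / np.sqrt(n)
```

Note that some authors compute r over the number of observations rather than the number of pairs; the interpretation thresholds (0.10 low, 0.30 medium, 0.50 large, 0.70 very large) are those used in the text.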

3. Results

The results of the Kolmogorov–Smirnov test with the Lilliefors correction and the Shapiro–Wilk test indicated that the pretest–posttest data did not have a normal distribution (0.918, p = 0.002 for the pretest and 0.907, p = 0.001 for the posttest). To establish whether there was an increase in students’ level of digital competence, we compared the results before the experience (pretest) and after it (posttest). As the variables do not have a normal distribution, we used nonparametric statistics, specifically the Wilcoxon signed-rank test.
Analysing each area globally, we found statistically significant differences in all of them. If we observe the effect size (Table 4 and Table 5), we find that there is a very large effect size in both of the scales analysed, both in area D3 Digital Content Creation (with r = 0.74 on the Knowledge scale and r = 0.73 on the Use scale) and in D5 Problem Solving (where r = 0.71 on the Knowledge scale and r = 0.81 on the Use scale). On the other hand, we find a large effect size both in area D1 Information and Data Literacy (where on the Knowledge scale r = 0.65 and on the Use scale r = 0.64) and in D2 Communication and Collaboration (where on the Knowledge scale r = 0.62 and on the Use scale r = 0.63). We only find a medium effect size in area D4 Safety (where on the Knowledge scale r = 0.48 and on the Use scale r = 0.49).
If we examine each of the areas in depth, we can see different results. In the first one, Information and Data Literacy, there are statistically significant differences between the pretest and the posttest (Table 6) in all of the items on the “Knowledge” scale. On the “Use” scale, there are also significant differences in all of the variables except IL3 (z = −1.066, p = 0.286) and IL6 (z = −1.904, p = 0.057). If we analyse the effect size, we find a large effect on both scales for all items, except for item IL7, where on the “Knowledge” scale the effect is very large (r = 0.71), and item IL4, where on the “Use” scale the effect is medium (r = 0.37).
In the Communication and Collaboration area, there are statistically significant differences between the pretest and the posttest (Table 7). On the “Knowledge” scale, these differences are present in all of the variables, while on the “Use” scale, the differences are present in all of the variables apart from CC1 (z = −1.872, p = 0.061). Analysing the effect size, we find that there is a large effect in both of the scales analysed in all items in the area, except for item CC3 where the effect size is medium (r = 0.37 on the “Knowledge” scale and r = 0.41 on the “Use” scale).
With regards to the Digital Content Creation area, there are again statistically significant differences between the pretest and the posttest (Table 8). On the “Knowledge” scale, differences are present in all variables except DC3 (z = −1.467, p = 0.142). Meanwhile, on the “Use” scale, differences are apparent in all of the variables except for DC3 (z = −1.264, p = 0.206), DC4 (z = −1.817, p = 0.069), and DC12 (z = −1.602, p = 0.109). If we analyse the effect size, we find that, on the “Knowledge” scale, there is a large effect for all variables apart from DC1 (r = 0.70), DC8 (r = 0.75), and DC16 (r = 0.72), where the effect is very large. In contrast, on the “Use” scale, the effect size is large for all variables apart from DC1 (r = 0.47) and DC2 (r = 0.42), where the effect size is medium.
In the “Safety” area, there are also statistically significant differences between the pretest and posttest (Table 9). In this case, on the “Knowledge” scale, there are differences in all of the variables apart from S2 (z = −1.772, p = 0.076), while on the “Use” scale, differences are present in all of the variables apart from S1 (z = −1.400, p = 0.162) and S3 (z = −1.331, p = 0.183). If we analyse the effect size, we find that on the “Knowledge” scale there is a medium effect for all variables apart from S7 (r = 0.64), where the effect is large. As for the “Use” scale, the effect size is medium for all variables apart from S4 (r = 0.52) and S7 (r = 0.62), where the effect size is large.
Finally, in the Problem Solving area there are again statistically significant differences between the pretest and the posttest (Table 10). On the “Knowledge” scale, differences are present in all variables except PS2 (z = −0.851, p = 0.395). On the other hand, on the “Use” scale, there are also differences in all of the variables except in PS2 (z = −1.012, p = 0.311) and PS3 (z = −1.008, p = 0.313). If we analyse the effect size, we find that, on the “Knowledge” scale, there is a very large effect for all variables apart from PS1 (r = 0.43) and PS3 (r = 0.39), where the effect is medium. The opposite is the case on the “Use” scale, where there is a medium effect for all variables apart from PS1 (r = 0.59), PS9 (r = 0.51), PS11 (r = 0.67), and PS12 (r = 0.64), where the effect size is large, and for variables PS8 (r = 0.75) and PS10 (r = 0.77), where the effect size is very large.

4. Discussion

Digital competence has become a transversal one that every member of society needs in order to ensure active participation in the 21st century. It is also a key competence for future teachers. The development of digital competence in the education system means that teachers are trained in it, something that involves making them capable of using ICT appropriately as a methodological resource integrated into the teaching and learning process [46]. This is why in this work we have presented a teaching design proposal based on an educational model that integrates knowledge of the subject being delivered, the most appropriate didactic methods for the subject and the students, and the most appropriate technological tools in order to teach specific content better. This model is based on one of the reference models, the T-PACK model proposed by Koehler, Mishra, and Cain [18], which enjoys considerable support for training teachers as it integrates technology into the classroom effectively, allowing training in digital competence [47,48,49].
The experience presented here has contributed to improving future teachers’ skills in the five digital competence areas established by INTEF [8]. The future teachers improved globally in the Information and Data Literacy area, developing strategies for searching for and managing information in different formats, and in criteria for critically evaluating the selected information. Their knowledge of tools for storing files and shared content such as Drive, Dropbox, and Office 365 and of channels for selecting educational videos improved, but their use of them did not. Although these resources are integrated into the proposed activities, their use in learning activities should perhaps be strengthened.
Focusing on the Communication and Collaboration area, there was an improvement in both knowledge and use of collaborative learning tools, with the exception of forums and chat programs. Forums and chat programs are commonly used in everyday life and students are very accustomed to using these tools as part of the virtual teaching carried out in this fully online university. We noted a greater improvement in the competence regarding rules for behaviour online in the educational context. Command of collaboration tools is key for future teachers. The studies by Carrió [36], García-Valcárcel et al. [39], Kolloffel, Eysink, and Jong [34], and Lee and Tsai [40] determined that collaboration between students improves learning. Designing ICT-based collaborative learning activities gives greater independence and motivation and options for adapting to students’ different levels.
Previous studies [50] focusing on the Digital Content Creation area have shown that university students have a low competency level. However, the present study shows that, after implementing the teaching design, there is an improvement in the knowledge and use of evaluation tools, and in some tools that facilitate learning such as mind maps and infographics, and applying gamification in the classroom. The future teachers discovered the potential of ICT for content creation. However, with tools relating to the creation of presentations or videos, there was an improvement in knowledge but not use of them, even though one of the activities proposed was to create a video. These results underline the importance of incorporating experiences in the classroom to improve this area of digital competence. The studies by Cabezas, Casillas, and Pinto [51], Cózar and Roblizo [21], Prendes, Castañeda, and Gutierrez [52], Romero, Hernández, and Ordoñez [53], Romero-Martín et al. [25], Garzón Artacho et al. [54], Pozo-Sánchez et al. [15] and Napal Freire et al. [26] found a low level of training of future teachers in use of digital educational resources. It is a very important area of digital competence for teachers, who need to know how to manage the use of ICT in the classroom and have skills for selecting, adapting, and creating teaching materials and for evaluation in digital settings [55].
In the Safety area, the results again underline an improvement in most of the competences associated with this area in both the Knowledge and Use scales. The improvement observed in competences relating to the responsible and healthy use of digital technologies is especially noteworthy. No changes were observed in the use of devices for the protection of virus threats and of document protection systems. However, the learning experiment carried out was more focused on improving the areas set out previously as we considered those to be more relevant when providing future teachers with the skills related to learning in the classroom.
Finally, we observe improvements in most of the competencies that comprise the Problem Solving area of digital competence, related to learning to solve problems through digital means, using technologies creatively to generate knowledge, and identifying areas for improvement in one’s own competence. We draw attention to a major improvement in basic skills for teachers, such as the use of tools for evaluating, tutoring, or monitoring students and in creative teaching activities to develop students’ digital competence, as well as in the use of spaces to continue training and updating digital competence. In recent years, other studies have also shown the effectiveness of technology for generating pedagogical or technological knowledge or knowledge related to the use of technology in teaching methodologies [12,17,27].
Some prior studies [11,15,23,26,33,56] have shown that command of digital tools is still a challenge in the training and professional development of teachers. Nonetheless, pedagogical use of these tools is vital for tackling the education of new generations in the digital age. In the study by Romero-Martín et al. [25], teachers in secondary education believed that digital competence was fundamental for improving teaching and learning processes. Nonetheless, in most cases, teacher training in digital competence is frequently limited to solely instrumental questions, neglecting the implementation of innovative teaching practices involving these technologies [57,58,59]. Digital competence cannot be developed using models based on mere knowledge transmission; it requires ICT to be integrated into learning activities [29,32,60]. In this sense, we underline the importance of the study presented, which integrates ICT into activities related to the planning and development of the teaching and evaluation of students’ learning, achieving holistic training in digital competence for future teachers. Training future teachers in this competence is key for integrating ICT into the curriculum in educational practice and for the training of secondary education students in a competence that is essential for the personal development and future professional development of our students [61,62].

5. Conclusions

We conclude that future teachers, after studying a module which implements a pedagogical proposal based on active methodologies supported by digital tools, have improved in all of the digital competence areas proposed by INTEF [8] (Information and Data Literacy, Communication and Collaboration, Digital Content Creation, Safety, and Problem Solving). Therefore, we suggest the use of this online learning methodology and propose continuing research into activity design in order to achieve a greater command of the competencies in which the implemented innovation has had the least impact. We also believe it is important to repeat the study with a larger sample of students from the Master’s in Teacher Training.
It would also be of interest to incorporate proposals of this type into other modules to contribute to better training in digital competence for future biology, geology, and secondary education teachers, as well as extending this experience to other specialties on the master’s degree in question. Another potential line of research focuses not only on perceptions but also on the design of instruments for real measurement of digital competence.

Author Contributions

Conceptualisation, P.d.P.-L.; methodology, C.R.-G.; formal analysis, O.B.-G.; research, C.R.-G., P.d.P.-L., and O.B.-G.; data curation, O.B.-G.; writing—preparation of first draft, C.R.-G., P.d.P.-L., and O.B.-G.; writing—revision and editing, C.R.-G., P.d.P.-L., and O.B.-G.; supervision, C.R.-G.; project administration, C.R.-G.; acquiring funding, C.R.-G. All authors have read and agreed to the published version of the manuscript.

Funding

This research project was funded by the Vice-Rectorate of Research of the Universidad Internacional de la Rioja in the 2019–2020 call.

Conflicts of Interest

The authors declare that there are no conflicts of interest.


  1. Cardona, A.J. Teacher Training and Professional Development in the Knowledge Society; Universitas: Madrid, Spain, 2008.
  2. Marín, V.; Reche, E.; Maldonado, G.A. Advantages and disadvantages of online training. Rev. Digit. Investig. Docencia Univ. 2013, 7, 33–43.
  3. López, E.; Vázquez-Cano, E.; Jaén, A. The group e-portfolio: A diachronic study at University Pablo de Olavide in Spain (2009–2015). Rev. Humanid. 2017, 31, 123–152.
  4. European Commission. Strategic Framework for Education and Training 2020 (ET2020); Official Journal of the European Union (2009/C 119/02); European Commission: Luxembourg, 2009.
  5. European Parliament and of the Council. Recommendation of the European Parliament and of the Council of December 18, 2006 on key competences for lifelong learning. Off. J. Eur. Union 2006, 30, 2006.
  6. Ferrari, A. DIGCOMP: A Framework for Developing and Understanding Digital Competence in Europe; European Commission: Sevilla, Spain, 2013.
  7. European Commission. Proposal for a Council Recommendation on Key Competences for Lifelong Learning; Official Journal of the European Union (2018/C 189/01); European Commission: Luxembourg, 2018.
  8. National Institute of Educational Technologies and Teacher Training. Common Digital Competence Framework for Teachers. October 2017; Ministry of Education, Science and Sports: Madrid, Spain, 2017.
  9. Correa, J.M.; de Pablos, J. New Technologies and Educational Innovation. Rev. Psicodidact. 2009, 14, 133–145.
  10. Gutiérrez, A.; Palacios, A.; Torrego, L. Teacher Training and ICT integration in Education: Anatomy of a Mismatch. Rev. Educ. 2010, 352, 267–293.
  11. López-Belmonte, J.; Pozo-Sánchez, S.; Fuentes-Cabrera, A.; Trujillo-Torres, J.M. Analytical Competences of Teachers in Big Data in the Era of Digitalized Learning. Educ. Sci. 2019, 9, 177.
  12. Gisbert, M.; González, J.; Esteve, F. Students’ and Teachers’ Digital Competence: An Overview on Research Status. Rev. Interuniv. Investig. Technol. Educ. 2016, 74–83.
  13. Redecker, C.; Punie, Y. European Framework for the Digital Competence of Educators: DigCompEdu; European Union: Luxembourg, 2017.
  14. United Nations Educational, Scientific and Cultural Organization (UNESCO). Education 2030. Incheon Declaration and Framework for Action for the implementation of Sustainable Development Goal 4: Ensure Inclusive and Equitable Quality Education and Promote Lifelong Learning Opportunities for All; UNESCO: Paris, France, 2015.
  15. Pozo-Sánchez, S.; López-Belmonte, J.; Rodríguez-García, A.M.; López-Núñez, J.A. Teachers’ digital competence in using and analytically managing information in flipped learning. Cult. Educ. 2020, 32, 213–241.
  16. Tourón, J.; Martín, D.; Navarro Asencio, E.; Pradas, S.; Íñigo, V. Construct validation of a questionnaire to measure teachers’ digital competence (TDC). Rev. Esp. Pedagog. 2018, 76, 25–54.
  17. Mishra, P.; Koehler, M.J. Technological Pedagogical Content Knowledge: A Framework for Teacher knowledge. Teach. Coll. Rec. 2006, 108, 1017–1054.
  18. Koehler, M.J.; Mishra, P.; Cain, W. What Is Technological Pedagogical Content Knowledge (TPACK)? J. Educ. 2013, 193, 13–19.
  19. Krumsvik, R.J. Situated learning and teachers’ digital competence. Educ. Inf. Technol. 2008, 13, 279–290. [Google Scholar] [CrossRef]
  20. Álvarez, J.F.; Gisbert, M. Information Literacy Grade of Secondary School Teachers in Spain. Beliefs and Self-perceptions. Comun. Rev. Cient. Iberoam. Comun. Educ. 2015, 45, 187–194. [Google Scholar]
  21. Cózar, R.; Roblizo, M.J. Digital skill in would-be teachers: Perceptions from the Teacher Training Degree students at the Faculty of Education in Albacete. Rev. Latinoam. Tecnol. Educ. 2014, 13, 119–133. [Google Scholar]
  22. Pérez, A.; Rodríguez, M.J. Evaluation of the self-perceived digital competences of the Primary School Teachers in Castilla and Leon (Spain). Rev. Investig. Educ. 2016, 34, 399–415. [Google Scholar] [CrossRef] [Green Version]
  23. Fernández, F.J.; Fernández, M.J.; Rodríguez, J.M. The Integration Process and Pedagogical Use of ICT in Madrid Schools. Educ. XX1 2018, 21, 395–416. [Google Scholar] [CrossRef]
  24. Passey, D.; Shonfeld, M.; Appleby, L.; Judge, M.; Saito, T.; Smits, A. Digital Agency: Empowering Equity in and through Education. Tech. Knowl. Learn. 2018, 23, 425–439. [Google Scholar] [CrossRef] [Green Version]
  25. Romero-Martín, M.R.; Castejón-Oliva, F.J.; López-Pastor, V.M.; Fraile-Aranda, A. Formative Assessment, Communication Skills and ICT in Initial Teacher Education. Comunicar 2017, 25, 73–82. [Google Scholar] [CrossRef] [Green Version]
  26. Napal Fraile, M.; Peñalva-Vélez, A.; Mendióroz Lacambra, A.M. Development of Digital Competence in Secondary Education Teachers’ Training. Educ. Sci. 2018, 8, 104. [Google Scholar] [CrossRef] [Green Version]
  27. Mirete, A.B.; Maquilón, J.J.; Mirete, L.; Rodríguez, R.A. Digital Competence and University Teachers’ Conceptions about Teaching. A Structural Causal Model. Sustainability 2020, 12, 4842. [Google Scholar] [CrossRef]
  28. Díez, E.J. Socio-constructivist and Collaborative Models in the Use of ICTs in Initial Teacher Education. Rev. Educ. 2012, 358, 175–196. [Google Scholar] [CrossRef]
  29. Escudero, J.M.; Martínez-Domínguez, B.; Nieto, J.M. ICT in Continuing Teacher Training in the Spanish Context. Rev. Educ. 2018, 382, 57–78. [Google Scholar] [CrossRef]
  30. Papanikolaou, K.; Makri, K.; Roussos, P. Learning Design as a Vehicle for Developing TPACK in Blended Teacher Training on Technology Enhanced Learning. Int. J. Educ. Technol. High Educ. 2017, 14, 34–41. [Google Scholar] [CrossRef] [Green Version]
  31. Castañeda, L.; Esteve, F.; Adell, J. Why Rethinking Teaching Competence for the Digital World? Rev. Educ. Distancia 2018, 6, 1–20. [Google Scholar] [CrossRef]
  32. Miralles-Martínez, P.; Gómez-Carrasco, C.J.; Arias, V.B.; Fontal-Merillas, O. Digital Resources and Didactic Methodology in the Initial Training of History Teachers. Comunicar 2019, 61, 45–56. [Google Scholar] [CrossRef]
  33. Cañada, M.D. Teaching and Learning Approaches and Computer-mediated Practices in Higher Education. Rev. Educ. 2012, 359, 388–412. [Google Scholar] [CrossRef]
  34. Kolloffel, B.; Eysink, T.; Jong, T. Comparing the Effects of Representational Tools in Collaborative and Individual Inquiry Learning. Int. J. Comput. Support. Collab. Learn. 2011, 6, 223–251. [Google Scholar] [CrossRef] [Green Version]
  35. Kozma, R.B.; Anderson, R.E. Qualitative Case Studies of Innovative Pedagogical Practices Using ICT. J. Comput. Assist. Learn. 2002, 18, 387–394. [Google Scholar] [CrossRef] [Green Version]
  36. Carrió, M.L. Advantages of Using Technology in Collaborative Learning. Rev. Investig. Educ. 2007, 41, 1–10. [Google Scholar]
  37. Plomp, T.; Voogt, J. Pedagogical Practices and ICT Use Around the World: Findings from the IEA International Comparative Study SITES 2006. Educ. Inf. Technol. 2009, 14, 285–292. [Google Scholar] [CrossRef] [Green Version]
  38. García-Valcárcel, A.; Tejedor, F.J. Evaluation of School Innovation Processes based on ICT Development in the Comunidad de Castilla y León. Rev. Educ. 2010, 352, 125–148. [Google Scholar]
  39. García-Valcárcel, A.; Basilotta, V.; López, C. ICT in Collaborative Learning in the Classrooms of Primary and Secondary Education. Comunicar 2014, 22, 65–74. [Google Scholar] [CrossRef] [Green Version]
  40. Lee, S.W.; Tsai, C.C. Technology-Supported Learning in Secondary and Undergraduate Biological Education: Observations from Literature Review. J. Sci. Educ. Technol. 2013, 22, 226–233. [Google Scholar] [CrossRef]
  41. Johnson, D.W.; Johnson, R.T. Learning Together and Alone: Cooperative, Competitive, and Individualistic Learning, 2nd ed.; Prentice-Hall, Inc.: Englewood Cliffs, NJ, USA, 1987. [Google Scholar]
  42. Johnson, D.W.; Johnson, R.; Smith, K. Active Learning: Cooperation in the College Classroom; Interaction Book Company: Edina, MN, USA, 1991. [Google Scholar]
  43. Ministerio de Educación y Formación Profesional. Estadísticas de Educación. Educabase. 2018. Available online: (accessed on 14 July 2020).
  44. Salas Blas, E. Diseños preexperimentales en psicología y educación: Una revisión conceptual. Liberabit 2013, 19, 133–141. Available online: (accessed on 14 July 2020).
  45. Cohen, J. Statistical Power Analysis for the Behavioral Sciences, 2nd ed.; Lawrence Erlbaum: Hillsdale, NJ, USA, 1988. [Google Scholar]
  46. Miguel-Revilla, D.; Martínez-Ferreira, J.M.; Sánchez-Agustí, M. Assessing the digital competence of educators in social studies: An analysis in initial teacher training using the TPACK-21 model. Australas. J. Educ. Technol. 2020, 36, 1–12. [Google Scholar] [CrossRef] [Green Version]
  47. Cabero, J.; Marín, V.; Castaño, C. Validación de la aplicación del modelo TPACK para la formación del profesorado en TIC. @tic Rev. Innov. Educ. 2015, 14, 13–22. [Google Scholar] [CrossRef] [Green Version]
  48. Colomer Rubio, J.C.; Sáiz Serrano, J.; Bel Martínez, J.C. Competencia digital en futuros docentes de Ciencias Sociales en Educación Primaria: Análisis desde el modelo TPACK. Educ. Siglo XXI 2018, 36, 107–128. [Google Scholar] [CrossRef]
  49. Tourón, J. TPACK: Uno Modelo Para los Profesores de Hoy. Available online: (accessed on 14 July 2020).
  50. López-Meneses, E.; Sirignano, F.M.; Vázquez-Cano, E.; Ramírez-Hurtado, J.M. University students’ digital competence in three areas of the DigCom 2.1 model: A comparative study at three European universities. Australas. J. Educ. Technol. 2020, 36, 69–88. [Google Scholar] [CrossRef] [Green Version]
  51. Cabezas González, M.; Casillas Martín, S.; Pinto Llorente, A.M. Percepción de los alumnos de Educación Primaria de la Universidad de Salamanca sobre su competencia digital. EDUTEC Rev. Electrón. Tecnol. Educ. 2014, 48. [Google Scholar] [CrossRef]
  52. Prendes, M.P.; Castañeda, L.; Gutiérrez, I. Competencias para el uso de TIC de los futuros maestros. Comunicar 2010, 35, 175–182. [Google Scholar] [CrossRef] [Green Version]
  53. Romero Martínez, S.J.; Hernández Lorenzo, C.J.; Ordóñez Camacho, X.G. La competencia digital en los docentes en educación primaria: Análisis cuantitativo de su competencia, actitud hacia las nuevas tecnologías en la práctica docente. TCyE Technol. Cienc. Educ. 2015, 4, 33–51. [Google Scholar]
  54. Garzón Artacho, E.; Sola Martínez, T.; Ortega Martín, J.L.; Marín, J.A.; Gómez García, G. Teacher Training in Lifelong Learning—The Importance of Digital Competence in the Encouragement of Teaching Innovation. Sustainability 2020, 12, 2852. [Google Scholar] [CrossRef] [Green Version]
  55. Area Moreira, M. La Enseñanza Universitaria Digital: Fundamentos Pedagógicos y Tendencias Actuales. Documento de Apoyo Para el Módulo “Docencia Digital” del Curso de Acreditación en Competencia Digital Docente de la Universidad de La Laguna. 2019. Available online: (accessed on 18 July 2020).
  56. Cabero, J.; Barroso, J. ICT teacher training: A view of the TPACK model/Formación del profesorado en TIC: Una visión del modelo TPACK. Cult. Educ. 2016, 28, 633–663. [Google Scholar] [CrossRef]
  57. Dias da Silva, M.; da Silva, D.; do Amaral, S.L. The Y generation myth: Evidences based on the causality relations among age, diffusion and adoption of technology of college students of São Paulo State. Future Stud. Res. J. Trends Strateg. 2014, 6, 32–52. [Google Scholar]
  58. Fajardo, I.; Villalta, E.; Salmerón, L. ¿Son realmente tan buenos los nativos digitales? Relación entre las habilidades digitales y la lectura digital. An. Psicol. 2015, 32, 89. [Google Scholar] [CrossRef]
  59. Acosta–Silva, D.A. Tras las competencias de los nativos digitales: Avances de una metasíntesis. Rev. Latinoam. Cienc. Soc. Niñez Juv. 2017, 15, 471–489. [Google Scholar]
  60. Gómez-Trigueros, I.M.; Ruiz-Bañuls, M.; Ortega-Sánchez, D. Digital Literacy of Teachers in Training: Moving from ICTs (Information and Communication Technologies) to LKTs (Learning and Knowledge Technologies). Educ. Sci. 2019, 9, 274. [Google Scholar] [CrossRef] [Green Version]
  61. Domingo-Coscolla, M.; Bosco, A.; Carrasco Segovia, S.; Sánchez Valero, J.A. Fomentando la competencia digital docente en la universidad: Percepción de estudiantes y docentes. Rev. Investig. Educ. 2020, 38, 167–782. [Google Scholar] [CrossRef]
  62. Aslaug Grov Almås, I.G.; Bjørkelo, B. Becoming a professional digital competent teacher. Prof. Dev. Educ. 2020, 46, 324–336. [Google Scholar] [CrossRef] [Green Version]
Table 1. Common Framework for Teachers’ Digital Competence [8].
Area 1. Information and Data Literacy
  1.1. Browsing, searching and filtering data, information and digital content
  1.2. Evaluating data, information and digital content
  1.3. Managing and retrieving data, information and digital content
Area 2. Communication and Collaboration
  2.1. Interacting using digital technologies
  2.2. Sharing information and digital content
  2.3. Citizen participation online
  2.4. Collaborating through digital technologies
  2.5. Netiquette
  2.6. Managing digital identity
Area 3. Digital Content Creation
  3.1. Developing digital content
  3.2. Integrating and adapting digital content
  3.3. Copyright and licences
  3.4. Programming
Area 4. Safety
  4.1. Protecting devices and digital content
  4.2. Protecting personal data and privacy
  4.3. Protecting health and well-being
  4.4. Protecting the environment
Area 5. Problem Solving
  5.1. Solving technical problems
  5.2. Identifying technological needs and responses
  5.3. Innovative and creative use of digital technologies
  5.4. Identifying gaps in digital competence
Table 2. DigCompEdu competences [13].
Area 1. Professional Engagement
  1.1. Organisational communication
  1.2. Professional collaboration
  1.3. Reflective practice
  1.4. Digital Continuous Professional Development (CPD)
Area 2. Digital Resources
  2.1. Selecting
  2.2. Creating and modifying
  2.3. Managing, protecting, sharing
Area 3. Teaching and Learning
  3.1. Teaching
  3.2. Guidance
  3.3. Collaborative learning
  3.4. Self-regulated learning
Area 4. Assessment
  4.1. Assessment strategies
  4.2. Analysing evidence
  4.3. Feedback and planning
Area 5. Empowering Learners
  5.1. Accessibility and inclusion
  5.2. Differentiation and personalisation
  5.3. Actively engaging learners
Area 6. Facilitating Learners’ Digital Competence
  6.1. Information and media literacy
  6.2. Communication
  6.3. Content creation
  6.4. Responsible use
  6.5. Problem solving
Table 3. Types of activity performed and digital tools used.
Designing a treasure hunt. The activity consists of designing a treasure hunt in the format of a web page using Google Sites. It starts with an introduction in which the study topic is presented and linked to reality in an attractive way. A series of questions are then set, which the students research using web pages selected by the teacher. Finally, the main question is set: the problem to be solved using the information gathered while answering the earlier questions.
Reading and discussing a document. The Perusall app is used to read a document on the key competences students should develop in secondary education and to add comments proposing activities for working on those competences. The document is shared in class and a discussion is held.
Simulating a departmental meeting. In groups, the future teachers simulate a departmental meeting to reach agreements on methodology and evaluation prior to drawing up a unit plan. These agreements are noted down on a collaborative digital wall.
Collaborative mind map. Drawing up a collaborative mind map with the Mindmeister tool, showing the sections a unit plan should have.
Preparing a unit plan. Designing a unit plan in a shared document.
Designing a video. Designing a motivational video to present the content of a unit plan.
Designing an escape room. Preparing an escape room in Google Sites. This is a form of gamification in which learning achievements are proposed as challenges to be solved in a team. A narrative or context frames the challenges the participants have to overcome, making the experience more engaging. Overcoming the challenges and receiving rewards guides students towards a final goal or the solution to a complex problem.
Creating evaluation questions. The Kahoot and Socrative tools are used for detecting preconceptions and for self-evaluation at the end of an activity.
Designing a rubric. Designing a rubric to evaluate the escape room using the Rubistar tool.
Table 4. Results of the Wilcoxon W test for the five digital competence areas in the Knowledge scale.
Area                                  N (Neg.)   Average Rank   Sum of Ranks      z        p       r
D1. Information and Data Literacy         3          12.00          36.00      −3.544    0.000   0.65
D2. Communication and Collaboration       6           7.00          42.00      −3.391    0.001   0.62
D3. Digital Content Creation              4           5.13          20.50      −4.050    0.000   0.74
D5. Problem Solving                       2           1.75           3.50      −3.902    0.000   0.71
Table 5. Results of the Wilcoxon W test for the five digital competence areas in the Use scale.
Area                                  N (Neg.)   Average Rank   Sum of Ranks      z        p       r
D1. Information and Data Literacy         4          10.75          43.00      −3.509    0.000   0.64
D2. Communication and Collaboration       6           7.67          46.00      −3.437    0.001   0.63
D3. Digital Content Creation              4           5.88          23.50      −3.978    0.000   0.73
D5. Problem Solving                       2           2.50           5.00      −4.421    0.000   0.81
Table 6. Results of the Wilcoxon W test for the indicators from the Information and Data Literacy area.
(For each indicator, z, asymptotic significance p (2-sided) and effect size r are given for the Knowledge and Use scales; r is omitted where the difference was not significant.)
IL1. Internet browsing strategies (e.g., searches, filters, specific commands, using search operators). Knowledge: z = −2.856, p = 0.004, r = 0.52; Use: z = −3.038, p = 0.002, r = 0.55.
IL2. Strategies for searching for information in different media or formats (text, video, etc.) to locate and select information. Knowledge: z = −2.653, p = 0.008, r = 0.48; Use: z = −2.693, p = 0.007, r = 0.49.
IL3. Specific channels for selecting educational videos. Knowledge: z = −2.380, p = 0.017, r = 0.43; Use: z = −1.066, p = 0.286.
IL4. Rules or criteria for critically evaluating the content of a website (updates, citations, sources). Knowledge: z = −2.600, p = 0.009, r = 0.47; Use: z = −2.016, p = 0.044, r = 0.37.
IL5. Criteria for evaluating the reliability of sources of information, data, digital content, etc. Knowledge: z = −3.407, p = 0.001, r = 0.62; Use: z = −3.203, p = 0.001, r = 0.58.
IL6. Tools for storing and managing shared files and content (e.g., Drive, Box, Dropbox, Office 365, etc.). Knowledge: z = −2.780, p = 0.005, r = 0.51; Use: z = −1.904, p = 0.057.
IL7. Information management strategies (using bookmarks, retrieving information, classification, etc.). Knowledge: z = −3.902, p = 0.000, r = 0.71; Use: z = −3.373, p = 0.001, r = 0.62.
Table 7. Results of the Wilcoxon W test for the indicators from the Communication and Collaboration area.
(For each indicator, z, asymptotic significance p (2-sided) and effect size r are given for the Knowledge and Use scales; r is omitted where the difference was not significant.)
CC1. Online communication tools: forums, instant messaging, chats, video conferencing, etc. Knowledge: z = −2.740, p = 0.006, r = 0.50; Use: z = −1.872, p = 0.061.
CC2. Spaces for sharing files, images, work, etc. Knowledge: z = −2.873, p = 0.004, r = 0.52; Use: z = −2.314, p = 0.021, r = 0.42.
CC3. Social networks, learning communities, etc. for sharing information and educational content (e.g., Facebook, Twitter, Google+ and others). Knowledge: z = −2.006, p = 0.045, r = 0.37; Use: z = −2.263, p = 0.024, r = 0.41.
CC4. Other people’s educational experience or research that can provide me with content or strategies. Knowledge: z = −2.521, p = 0.012, r = 0.46; Use: z = −2.040, p = 0.041, r = 0.37.
CC5. Tools for shared or collaborative learning (e.g., blogs, wikis, specific platforms such as Edmodo and others). Knowledge: z = −3.167, p = 0.002, r = 0.58; Use: z = −2.609, p = 0.009, r = 0.48.
CC6. Basic behaviour and etiquette rules for communication through the internet in an educational context. Knowledge: z = −3.518, p = 0.000, r = 0.64; Use: z = −3.562, p = 0.000, r = 0.65.
Table 8. Results of the Wilcoxon W test for the indicators from the Digital Content Creation area.
(For each indicator, z, asymptotic significance p (2-sided) and effect size r are given for the Knowledge and Use scales; r is omitted where the difference was not significant.)
DC1. Tools for preparing evaluation tests. Knowledge: z = −3.828, p = 0.000, r = 0.70; Use: z = −2.602, p = 0.009, r = 0.47.
DC2. Tools for preparing rubrics. Knowledge: z = −3.736, p = 0.000, r = 0.68; Use: z = −2.323, p = 0.020, r = 0.42.
DC3. Tools for creating presentations. Knowledge: z = −1.467, p = 0.142; Use: z = −1.264, p = 0.206.
DC4. Tools for creating educational videos. Knowledge: z = −3.508, p = 0.000, r = 0.64; Use: z = −1.817, p = 0.069.
DC5. Tools that facilitate learning such as infographics, interactive graphics, mind maps, timelines, etc. Knowledge: z = −3.654, p = 0.000, r = 0.67; Use: z = −3.133, p = 0.002, r = 0.57.
DC7. Tools for creating voice recordings (podcasts). Knowledge: z = −3.798, p = 0.000, r = 0.69; Use: z = −3.449, p = 0.001, r = 0.63.
DC8. Tools that help gamify learning. Knowledge: z = −4.110, p = 0.000, r = 0.75; Use: z = −3.694, p = 0.000, r = 0.67.
DC11. Open Educational Resources (OER). Knowledge: z = −3.327, p = 0.001, r = 0.61; Use: z = −2.941, p = 0.003, r = 0.54.
DC12. Tools for reworking or enriching content in different formats (e.g., texts, tables, audio, images, videos, etc.). Knowledge: z = −2.447, p = 0.014, r = 0.45; Use: z = −1.602, p = 0.109.
DC15. Basic modification and configuration of digital devices. Knowledge: z = −3.365, p = 0.001, r = 0.61; Use: z = −3.312, p = 0.001, r = 0.64.
DC16. The potential of information and communication technologies (ICT) for programming and creating new products. Knowledge: z = −3.946, p = 0.000, r = 0.72; Use: z = −3.413, p = 0.001, r = 0.62.
Table 9. Results of the Wilcoxon W test for the indicators from the “Safety” area.
(For each indicator, z, asymptotic significance p (2-sided) and effect size r are given for the Knowledge and Use scales; r is omitted where the difference was not significant.)
S1. Protecting devices from threats from viruses, malware, etc. Knowledge: z = −1.977, p = 0.048, r = 0.36; Use: z = −1.400, p = 0.162.
S2. Protecting information relating to people in your immediate setting (colleagues, students, etc.). Knowledge: z = −1.772, p = 0.076; Use: z = −2.394, p = 0.017, r = 0.44.
S3. Device and document protection systems (access control, privileges, passwords, etc.). Knowledge: z = −2.058, p = 0.040, r = 0.37; Use: z = −1.331, p = 0.183.
S4. Ways of eliminating data/information about yourself or third parties for which you are responsible. Knowledge: z = −2.427, p = 0.015, r = 0.44; Use: z = −2.850, p = 0.004, r = 0.52.
S7. Norms relating to responsible and healthy use of digital technologies. Knowledge: z = −3.489, p = 0.000, r = 0.64; Use: z = −3.378, p = 0.001, r = 0.62.
Table 10. Results of the Wilcoxon W test for the indicators from the Problem Solving area.
(For each indicator, z, asymptotic significance p (2-sided) and effect size r are given for the Knowledge and Use scales; r is omitted where the difference was not significant.)
PS1. Basic energy-saving measures. Knowledge: z = −2.352, p = 0.019, r = 0.43; Use: z = −3.244, p = 0.001, r = 0.59.
PS2. Basic computer maintenance tasks to avoid possible performance problems (e.g., updates, clearing cache and disk, etc.). Knowledge: z = −0.851, p = 0.395; Use: z = −1.012, p = 0.311.
PS3. Basic solutions to technical problems deriving from the use of digital devices. Knowledge: z = −2.137, p = 0.033, r = 0.39; Use: z = −1.008, p = 0.313.
PS4. Compatibility of peripherals (microphones, headphones, printers, etc.) and connectivity requirements. Knowledge: z = −3.036, p = 0.002, r = 0.55; Use: z = −2.556, p = 0.011, r = 0.47.
PS5. Solutions for “cloud” management and storage, sharing files, granting access privileges, etc. (e.g., Drive, OneDrive, Dropbox or others). Knowledge: z = −3.537, p = 0.000, r = 0.64; Use: z = −2.585, p = 0.010, r = 0.47.
PS6. Tools that help respond to diversity in the classroom. Knowledge: z = −4.051, p = 0.000, r = 0.74; Use: z = −2.034, p = 0.042, r = 0.37.
PS7. Ways of solving problems between peers. Knowledge: z = −4.386, p = 0.000, r = 0.80; Use: z = −2.624, p = 0.009, r = 0.48.
PS8. Options for combining digital and non-digital technology to find solutions. Knowledge: z = −4.370, p = 0.000, r = 0.80; Use: z = −4.099, p = 0.000, r = 0.75.
PS9. Tools for evaluating, tutoring, and monitoring students. Knowledge: z = −4.090, p = 0.000, r = 0.75; Use: z = −2.777, p = 0.005, r = 0.51.
PS10. Didactic activities created to develop students’ digital competence. Knowledge: z = −4.135, p = 0.000, r = 0.75; Use: z = −4.020, p = 0.000, r = 0.77.
PS11. Ways to keep myself up to date and include new devices, apps, and tools. Knowledge: z = −4.229, p = 0.000, r = 0.77; Use: z = −3.658, p = 0.000, r = 0.67.
PS12. Spaces for training and updating my digital competence. Knowledge: z = −4.106, p = 0.000, r = 0.75; Use: z = −3.515, p = 0.000, r = 0.64.
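The tables above report, for each pretest-posttest comparison, the z statistic from the Wilcoxon signed-rank test and the effect size r = |z|/√n. As an illustrative sketch only (this is not the authors' analysis code; the `wilcoxon_z_and_r` helper and the sample scores are hypothetical, and no tie or continuity correction is applied), the computation can be reproduced as follows:

```python
import math

def wilcoxon_z_and_r(pre, post):
    """Wilcoxon signed-rank test: z from the normal approximation
    (no continuity or tie correction) and effect size r = |z| / sqrt(n),
    where n counts the non-zero pretest-posttest differences."""
    diffs = [b - a for a, b in zip(pre, post) if b != a]  # drop zero differences
    n = len(diffs)
    # Rank the absolute differences, averaging ranks across tie groups
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        for k in range(i, j + 1):
            ranks[order[k]] = (i + j) / 2 + 1  # average rank for the tie group
        i = j + 1
    # W is the smaller of the positive-rank and negative-rank sums
    w = min(sum(r for r, d in zip(ranks, diffs) if d > 0),
            sum(r for r, d in zip(ranks, diffs) if d < 0))
    mu = n * (n + 1) / 4                                # mean of W under H0
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)   # sd of W under H0
    z = (w - mu) / sigma
    return z, abs(z) / math.sqrt(n)

# Hypothetical pretest/posttest scores for six students on one indicator
pre = [10, 12, 9, 11, 13, 8]
post = [14, 15, 9, 13, 12, 11]
z, r = wilcoxon_z_and_r(pre, post)  # z ≈ -1.753, r ≈ 0.784
```

Note that conventions differ on the denominator of r (number of non-tied pairs, as here, versus total observations), so published values may not match this sketch exactly; the interpretation thresholds (r ≈ 0.5 as a large effect) follow Cohen [45].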
