Article

Self-Perception of the Digital Competence of Educators during the COVID-19 Pandemic: A Cross-Analysis of Different Educational Stages

Department of Didactics and School Organization, Faculty of Education, University of the Basque Country (UPV/EHU), 48940 Leioa, Spain
*
Author to whom correspondence should be addressed.
Sustainability 2020, 12(23), 10128; https://doi.org/10.3390/su122310128
Received: 14 November 2020 / Revised: 1 December 2020 / Accepted: 3 December 2020 / Published: 4 December 2020
(This article belongs to the Special Issue ICT and Sustainable Education)

Abstract

The objective of this research is to measure the perception that teachers had about their own performance when they were forced to carry out Emergency Remote Teaching due to the COVID-19 pandemic. A questionnaire was provided to teachers of every educational stage in the Basque Country (Pre-school, Primary and Secondary Education, Professional Training, and Higher Education) obtaining a total of 4586 responses. The statistical analysis of the data shows that the greatest difficulties reported by educators are shortcomings in their training in digital skills, which has made them perceive a higher workload during the lockdown along with negative emotions. Another finding is the existing digital divide between teachers based on their gender, age, and type of school. A further worrying result is the lower technological competence at lower educational levels, which are the most vulnerable in remote teaching. These results invite us to reflect on the measures to be taken to improve equity, social justice, and the resilience of the educational system, which align with some of the Sustainable Development Goals.
Keywords: digital technology; inclusive education; b-learning; educational policy; educational organization

1. Introduction

The Sustainable Development Goals (SDGs) are a universal call to action to end poverty, protect the planet, and improve the lives and prospects of everyone, everywhere. The 17 Goals were adopted by all UN Member States in 2015 as part of the 2030 Agenda for Sustainable Development, which set out a 15-year plan to achieve the Goals. Although remarkable progress has been made, overall, action to meet the Goals is not yet advancing at the speed or scale required. The UN claimed a decade of action: “2020 needs to usher in a decade of ambitious action to deliver the Goals by 2030” [1]. Instead, the unexpected emergence of COVID-19, and the resulting global confinement since March 2020, has contributed to hindering its development [2].
COVID-19 has broadened existing inequalities because, for the most vulnerable communities, it has severely affected their health, economy, and education. The pandemic has made more visible how communities with low economic resources and fragile social protection nets suffer to a greater extent the consequences of the crisis. Social, political, educational, and economic inequalities have amplified the effects of the pandemic [2]. Among the most affected vulnerable communities are low-income families and women. Therefore, COVID-19 has directly influenced the delay in the progress made on Goal 5, “Gender Equality”, and Goal 4, “Quality Education”. At the same time, Quality Education during confinement has been intimately linked to the need for development of Goal 9, “Building resilient infrastructure, promoting sustainable industrialization, and encouraging innovation”.
Information and Communication Technologies have been at the forefront of the response to COVID-19. The crisis has accelerated the digitization of education, but it has also contributed to increasing the digital divide among students [3,4,5], which has been dragging on for years [6,7].
Thus, according to the United Nations, as of 2020, 3.6 billion people still lacked an internet connection and could not access online education.
Emergency Remote Teaching [8,9] was developed as a rapid response [10,11,12,13] to the situation. Its nature has made the proper acquisition of and access to the needed technology difficult [14]. The reduction of present and, above all, future inequalities demands that every student and teacher be suitably trained to acquire the digital competences they need in digital environments [15]. We must remember that Emergency Remote Teaching (ERT) is an alternative way of teaching due to the circumstances of crisis [16,17,18,19,20,21], while quality online or blended learning calls for careful instructional design and planning. Online learning has proven its effectiveness in numerous research studies [22,23,24] whenever a systematic model for design and development is adopted [25]. Therefore, in order to contribute to the improvement of the quality of education, it seems necessary to carry out an in-depth analysis of what has been done and what should be improved. This unexpected challenge has placed on the agendas of the major educational leaders the need to develop the digitalization of education (SDG 9) and hopefully reduce the digital divide, the social gap, and gender inequality in the population (SDG 10) as one of the axes to guarantee a quality education (SDG 4).
On this shift of education to digital, the European Union has published the Digital Education Action Plan (2021–2027), which sets out the criteria for high-quality, inclusive, and accessible digital education for all in Europe. The plan aims to make education and training systems fit for the digital age and presents two strategic priorities: fostering the development of a high-performing digital education ecosystem and enhancing digital skills and competences for the digital transformation [26]. In short, it requires work to be carried out on infrastructure, connectivity, and digital equipment, and also on the development of digital literacy, which will mean breaking down the inequalities among the population.
The “Digital Spain Plan 2025” [27] is aligned with the European Commission in promoting the digitization of education with a radical change in methods and contents, including the promotion of distance education and the implementation of digital vouchers to facilitate connectivity for students. Therefore, in addition to supplying technological resources to the classrooms, the development of the digital competence of students and teachers is prioritized in order to reduce the digital divide.
Digital Competence of Educators (DCE) is not a new concept, and it has been studied by researchers of educational technology over the last few decades [28,29,30,31,32,33,34,35]. In the European arena, the European Framework for the Digital Competence of Teachers (DigCompEdu) [36] is a scientifically sound framework describing what it means for educators at all stages to be digitally competent. DigCompEdu details 22 competences organized in six Areas and distinguishes six levels along which educators’ digital competence typically develops. For each of the 22 competences, level descriptors and proficiency statements are provided and allow educators to understand their level of competence and their specific development needs. The framework aims to detail how digital technologies can be used to enhance and innovate education and training.
The impact of the pandemic and, above all, the period of severe confinement experienced since March 2020 have forced the abrupt development of the DCE for active teachers. But they have also brought to the table the gaps that exist between teachers and the need to develop some key aspects of DigCompEdu. Prior to this situation, teachers had developed their DCE in their daily interaction with technology [37], but this new situation has pushed them to increase the use of digital resources abruptly in order to respond to the change in the reality in which the teaching-learning process is taking place. As pointed out by [38], teaching professionals have gained in fluency, mastery, and comfort, not only in the use of basic applications but also in the management of information, the creation of content, and the use of technology to keep their students connected.
This paper presents a study based on the analysis of the self-perception teachers have of their digital competence; that is, how their proficiency level of digital competence has influenced the development of quality Emergency Remote Teaching during the COVID-19 confinement. Therefore, it contributes to the measurement of the DCE at a time when DCE constrained teachers’ daily work at all educational levels and the resilient response of the educational system to a situation never experienced before.

2. Materials and Methods

The main objective of this research is to analyze whether the teaching community has perceived itself as digitally capable of dealing with Emergency Remote Teaching (ERT) caused by the COVID-19 pandemic. Having this objective in mind, the following research questions have been posed:
(1)
Have teachers perceived themselves as digitally competent to deliver Emergency Remote Teaching (ERT)?
(2)
Is Digital Competence of Educators (DCE) biased depending on gender, age, educational institution, or stage of education?
(3)
Did the training received in DCE have an impact on the perception of well-being of teachers during Emergency Remote Teaching?
To this end, the following variables have been considered: Digital Competence of Educators (DCE), DCE Training, workload of teachers (before and during the lockdown), and emotions (positive and negative) experienced during the confinement. The impact of the following secondary variables has also been taken into account: age, gender, type of educational institution (private or public), and educational stage.

2.1. Design of the Questionnaire

Four blocks of questions were defined in order to measure the Digital Competence of Educators (DCE), their workload before and during the lockdown, and the (positive and negative) emotions they experienced during confinement, and to collect socio-demographic data.
Regarding DCE, some of the existing questionnaires about the digital skills of teachers [39,40] were adapted to the context of Emergency Remote Teaching. The result was a questionnaire composed of seven multidimensional Likert-type scales, with response options from 1 (little) to 5 (a lot); two multiple-choice questions; two questions with a dichotomous answer (Yes/No); and one open question. The first five Likert scales were used to measure the perception of their digital skills. The DigCompEdu framework and the National Institute of Technology and Professional Development (INTEF) [41] proposal on the development of Digital Competences for teachers were taken into account. Two other scales registered the perception of the ability of students to deal with emergency remote learning and the quality of the training that teachers had received in digital skills. Finally, a question about whether some kind of ‘emergency’ training had been delivered during the confinement period was included.
In relation to the workload, the questionnaire developed by [42] and validated by [43] was adapted to the context of the Autonomous Community of the Basque Country (ACBC) and the pandemic situation. The questionnaire consisted of six multidimensional Likert-type scales, with response options from 1 (low) to 5 (high). The first three scales referred to the demands imposed on the person (mental, physical, and temporal demand) and the other three referred to the person’s interaction with the task (effort, performance, and frustration).
The block about emotional responses consisted of items to evaluate positive emotions (pride, satisfaction, enthusiasm, confidence, and relief) and items to measure negative emotions (insecurity, stress, concerns, anger, and frustration) [44]. The instrument used was generated by adapting the questionnaire for virtual learning environments (WebCT) proposed by [45].
The questionnaire, the dataset, the 4589 responses, and supplementary research data have all been publicly shared online [46].

2.2. Population

A non-probabilistic sampling method was selected given the limited access to the target population [47]. Specifically, a convenience sampling was carried out by sending the questionnaire by email to all educational centers of the Autonomous Community of the Basque Country (ACBC) in Basque and Spanish versions. The ‘snowball’ technique was also used for non-probabilistic sampling in conditions of difficult access [48], making use of social networks (Facebook and Twitter) and other teaching networks articulated through WhatsApp groups. Information was collected over three weeks in May 2020. Finally, 4589 responses to the questionnaire were obtained; that is, almost 10.5% of all the teaching staff of the Basque Country [49] participated voluntarily in the study. Regarding internet connectivity, 1.5 million persons (80.2% of the population) in the Basque Country have a broadband connection [50].
An ad hoc questionnaire with four questions (age, gender, educational stage of teaching, and type of educational center) was used to find out the socio-demographic profile. Out of the 4589 teachers in the sample for this study, 23.3% are male, 75.5% female, and 0.2% non-binary, with an average age of 54 (SD = 6.24). These professionals work at different educational stages (Pre-school Education, 10.8%; Primary Education, 31.6%; Secondary Education and Baccalaureate, 38.3%; Vocational Training, 5.3%; Higher Education, 8.6%; Others, 5.4%) and in different types of educational centers (public centers, 77.2%; private centers, 22.8%).

2.3. Validation of the Questionnaire

The psychometric properties of the questionnaire were analyzed. Firstly, the total sample was randomly divided into two halves. With the first subsample (n = 2296), a Parallel Analysis (PA) was carried out in order to explore the factorial structure of the instrument. In this case, the software Factor 10.4.01 [51] was used. The procedure selected to determine the number of dimensions was the optimal implementation of the PA [52], and the parameter estimation method used was the Diagonally Weighted Least Squares (DWLS) method. This method is the most appropriate when the variables are ordinal, as in the case of Likert-type scales [53]. Finally, taking into account that a one-dimensional solution was expected, no rotation method was applied. Thus, the results of this first analysis suggested a unifactorial structure for five items of the DCE block, two factors for perceived workload before and during lockdown, and two factors for positive and negative perceived emotions.
Based on these first exploratory results, a Confirmatory Factor Analysis was carried out with the second subsample (n = 2294). This analysis was performed with Lisrel 8.80 software [54], using the DWLS parameter estimation method. The quality of the fit was evaluated through the following goodness-of-fit indexes: the Root Mean Square Error of Approximation (RMSEA), whose value must be less than 0.08 [55], the Non-Normed Fit Index (NNFI), the Comparative Fit Index (CFI), and the Goodness of Fit Index (GFI), whose values must be greater than 0.95 [56]. The results of this analysis showed a satisfactory fit of the two-factor model to the data: RMSEA = 0.04, NNFI = 0.99, CFI = 0.99, GFI = 0.99.
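The RMSEA threshold used above can be made concrete with a small calculation: the index is derived from the model chi-square, its degrees of freedom, and the sample size. A minimal sketch (the chi-square and degrees-of-freedom values below are illustrative, not taken from the study):

```python
import math

def rmsea(chi2: float, df: int, n: int) -> float:
    """Root Mean Square Error of Approximation from a model chi-square.

    RMSEA = sqrt(max(chi2 - df, 0) / (df * (n - 1)))
    Values below 0.08 are conventionally taken as acceptable fit.
    """
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# Illustrative values only (not the study's fit statistics):
print(round(rmsea(chi2=180.0, df=40, n=2294), 3))
```

When the chi-square does not exceed the degrees of freedom, the index is exactly zero, which is why the `max(..., 0)` clamp is needed.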

2.4. Statistical Analysis

The statistical values of the responses to each of the five items included in the Digital Competence of Educators (DCE) factor were calculated, as were those regarding the question about the educational platform used. Next, the changes in the DCE factor were analyzed by means of ANOVA according to the socio-demographic variables: gender, age, type of educational center (public, private), and educational stage. The relationships between DCE proficiency and three other items—training in DCE, performance of students online, and use of educational platforms—were also analyzed. This was implemented through Spearman’s non-parametric correlation coefficient. Pearson’s method was applied to identify the correlations between perceived workload factors (during and previous to the confinement), perceived emotions (positive and negative), and digital competence.
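The two correlation measures named above differ only in whether raw scores or tie-adjusted ranks are compared. A minimal pure-Python sketch (the study presumably used statistical software; this is illustrative only):

```python
def pearson(x, y):
    """Pearson's product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def _ranks(v):
    """Average ranks, handling ties (needed for ordinal Likert data)."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    ranks = [0.0] * len(v)
    i = 0
    while i < len(v):
        j = i
        while j + 1 < len(v) and v[order[j + 1]] == v[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # 1-based average rank of the tie block
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho is Pearson's r applied to the ranks."""
    return pearson(_ranks(x), _ranks(y))
```

Because Spearman's coefficient works on ranks, it is invariant to monotone transformations of the scores, which is what makes it suitable for the ordinal Likert items here.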
Finally, by means of two hierarchical regressions, to what extent the workload before and during the lockdown and the DCE variable predict the appearance of positive emotions (regression 1) and negative emotions (regression 2) was analyzed. In both cases, the socio-demographic variables and the opposite emotion variable—negative emotions in the case of regression 1 and positive emotions in regression 2—were controlled as covariates. It was implemented in a two-step process, where the first step included the covariate variables (whose effect we are interested in controlling) and the second step added the predictor variables (workload before and during the confinement and DCE). This procedure allows us to determine to what extent an association of variables predicts positive or negative emotions, extending the one-to-one variable analysis performed by correlation.
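The two-step procedure can be sketched as a comparison of two nested ordinary-least-squares fits, with ΔR² measuring the gain from adding the predictors on top of the covariates. A minimal pure-Python illustration (the data values are synthetic, not the study's):

```python
def ols_r2(X, y):
    """R^2 of an OLS fit with intercept; X is a list of predictor columns."""
    n = len(y)
    cols = [[1.0] * n] + [list(c) for c in X]   # prepend intercept column
    k = len(cols)
    # Normal equations (X'X) b = X'y, solved by Gaussian elimination
    A = [[sum(cols[i][t] * cols[j][t] for t in range(n)) for j in range(k)]
         for i in range(k)]
    b = [sum(cols[i][t] * y[t] for t in range(n)) for i in range(k)]
    for p in range(k):
        piv = max(range(p, k), key=lambda r: abs(A[r][p]))  # partial pivoting
        A[p], A[piv] = A[piv], A[p]
        b[p], b[piv] = b[piv], b[p]
        for r in range(p + 1, k):
            f = A[r][p] / A[p][p]
            for c in range(p, k):
                A[r][c] -= f * A[p][c]
            b[r] -= f * b[p]
    beta = [0.0] * k
    for p in range(k - 1, -1, -1):  # back substitution
        beta[p] = (b[p] - sum(A[p][c] * beta[c] for c in range(p + 1, k))) / A[p][p]
    yhat = [sum(beta[i] * cols[i][t] for i in range(k)) for t in range(n)]
    ybar = sum(y) / n
    ss_res = sum((y[t] - yhat[t]) ** 2 for t in range(n))
    ss_tot = sum((y[t] - ybar) ** 2 for t in range(n))
    return 1.0 - ss_res / ss_tot

def hierarchical_delta_r2(covariates, predictors, y):
    """Step 1: covariates only; step 2: covariates + predictors."""
    r2_step1 = ols_r2(covariates, y)
    r2_step2 = ols_r2(covariates + predictors, y)
    return r2_step1, r2_step2, r2_step2 - r2_step1
```

A positive ΔR² with a significant F test for the change is what justifies the paper's claim that the predictor model improves on the covariate-only model.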

3. Results

This section describes the results of the analysis, attending to each of the three research questions.

3.1. Have Teachers Perceived Themselves as Digitally Competent to Deliver Emergency Remote Teaching (ERT)?

In relation to the first research question, teachers have perceived themselves as only partially competent to deliver ERT. They consider themselves more skilled in the use of digital tools for general communication but feel less confident with the specific tools used to facilitate teaching-learning processes.
The distributions of responses to statements about DCE are fairly symmetrical and close to a normal distribution. The statement “I have the knowledge and skills to use online communication tools (chat, forum, videoconference, e-mail...)” stands out from the rest because it has the highest mean (4.00 out of 5), the lowest standard deviation (less dispersion), the highest negative asymmetry (the distribution tail lengthens for values below the mean), and the only positive kurtosis (data concentrated on the mean, pointed curve). “I have the knowledge and skills to use the educational platform” scores 0.53 points less, which is a significant difference (13.25%), while “I have basic knowledge and skills to create and edit online activities” has a mean value of 3.60; “I have basic knowledge and skills to search for online activities” has a value of 3.70; and “During the confinement I have had difficulty in making corrections and getting them to the students” has a mean of 3.00.
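The item-level descriptives discussed above (mean, dispersion, asymmetry, kurtosis) follow from the first four moments of each item's responses. A minimal sketch using population moments over illustrative 1–5 Likert responses (not the study's data):

```python
def describe(values):
    """Mean, SD, skewness and excess kurtosis of a list of Likert responses."""
    n = len(values)
    mean = sum(values) / n
    m2 = sum((v - mean) ** 2 for v in values) / n  # population variance
    m3 = sum((v - mean) ** 3 for v in values) / n
    m4 = sum((v - mean) ** 4 for v in values) / n
    sd = m2 ** 0.5
    skew = m3 / sd ** 3          # negative: tail toward low scores
    kurt = m4 / m2 ** 2 - 3.0    # positive: responses concentrated at the mean
    return mean, sd, skew, kurt
```

A perfectly uniform 1–5 response pattern, for instance, has zero skewness and negative excess kurtosis (flatter than normal), which is the baseline against which the pointed, left-tailed communication-tools item stands out.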
The averages in the DCE factor, which comprises the previous five statements, were analyzed according to the educational platform each teacher used. The highest value, 1.13 points above the average and with the smallest deviation, is given for Moodle, followed by Google Classroom, which is the most widely used platform. The lowest value is given among Microsoft users. It is significant that 14.9% do not use any platform, despite the fact that the use of an educational platform has shown a strong correlation with the students' online-learning performance as perceived by their teachers (0.40) and with the DCE factor (0.29).
In addition, DCE is also related to the existence or absence of training during confinement as well as to the familiarity with digital educational platforms before the pandemic. This has been studied by means of the correlation of the DCE factor with the answers to the questions “Have you received training in digital skills and educational integration of technologies?”, resulting in a correlation coefficient of 0.31, and “In the COVID-19 period, have you received training to adapt your subjects to digital format?”, scoring 0.14. This suggests that emergency training in digital literacy benefits DCE, but training received before the emergency is far more effective. In the same vein, there is a correlation (0.27) between the use of educational platforms and the quality of the previous training received.
What is also very significant (0.36) is the correlation between the DCE factor and online performance of students.

3.2. Is DCE Biased Depending on Gender, Age, Educational Institution, or Stage of Education?

The second research question inquires about the existence of gaps in DCE between teachers according to their age, gender, type of school (public, private), or educational stage.
With regard to gender, the results showed a significant effect, F(2, 4588) = 24.97, p < 0.001, with men (M = 12.46) scoring higher than women (M = 11.55) in DCE factor. The data show a gender digital gap quantified at 0.91 points, with the number of women in the sample more than triple that of men.
In relation to age, a significant effect was also found, F(7, 4588) = 32.46, p < 0.001. Hochberg’s Post Hoc GT2 test generally showed that older teachers were less technologically competent than younger ones. There is a marked linear decrease in the DCE mean value according to age, with the difference being quantified at 3.26 points (from 13.66 in the 21–25 years range to 10.40 in 61–65). The standard deviation also increases with age (3.22 points, from 3.030 to 4.250), showing that the DCE is more homogeneous among the youngest teachers. It should also be noted that the most numerous group (41–50) is 28.7% of the total, and its DCE corresponds to the total average.
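The group comparisons in this subsection rest on the one-way ANOVA F statistic: the ratio of between-group to within-group variance. A minimal sketch with synthetic groups (the reported F values come from the study's own data):

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over a list of groups of scores."""
    n_total = sum(len(g) for g in groups)
    k = len(groups)
    grand = sum(sum(g) for g in groups) / n_total
    # Between-group sum of squares: how far each group mean sits from the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: spread of scores around their own group mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_between, df_within = k - 1, n_total - k
    return (ss_between / df_between) / (ss_within / df_within)
```

When the group means coincide, the numerator vanishes and F is zero; a large F (relative to its degrees of freedom) is what licenses the post hoc tests used above to locate which groups differ.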
With regard to the type of school, a significant effect was also found, F(2, 4585) = 5.54, p = 0.004. Hochberg’s Post Hoc test showed that public schools (M = 11.66) scored lower in technological competence than private schools (M = 12.12). The digital divide by type of center is quantified in a difference of 0.46 points in DCE mean between public and private centers. Bearing in mind that half of all students in the Autonomous Community of the Basque Country (ACBC) study in mixed schools [57], participation in the sample has been overwhelmingly greater among public teachers than among teachers from private centers despite the fact that the questionnaire was sent to all school management bodies in the ACBC.
Regarding the educational stage, a significant effect of F(6, 4559) = 48.82, p < 0.001 was observed. Hochberg’s Post Hoc GT2 test revealed that, generally speaking, the higher the grade, the higher the digital competence. There is a linear increase in the DCE mean value as the educational grade grows, with a difference of 2.65 points (from 10.46 in Early Childhood education to 13.11 in University education), with the standard deviation also being lower as the educational level increases.

3.3. Did the Training Received in DCE Have an Impact on the Perception of Well-Being of Teachers during Emergency Remote Teaching?

The results also show that there is a relationship between DCE and the perception of well-being in the development of ERT.
As well as the DCE factor (questions about digital competence), which was composed of five items, other factors were created by the association of items related to the workload of teachers before and during the confinement and the emotions, positive and negative, experienced during the lockdown. Table 1 shows the result of analyzing all the correlations between these five factors using Pearson’s correlation coefficient.
Among the results, it is noticeable that negative emotions are strongly related to high workloads during COVID (0.47), and that better training in DCE is associated with more positive emotions (0.31) and fewer negative emotions (0.25). Furthermore, good training in digital competences is very slightly correlated with a lower workload during COVID (−0.03). Therefore, teachers who perceive themselves as digitally competent report more positive emotions, and some of them feel that they have had a smaller workload.
On the basis of meaningful relations among variables, a hierarchical regression can be designed in order to confirm the findings [58,59,60]. Table 2 reveals the results of a two-step hierarchical regression, which predicts positive emotions from the workload variables and DCE—the predictor variables—while controlling, as covariates, the effect of socio-demographic variables and negative emotions.
Hierarchical regression should be understood as a framework for model comparison [61]. In this sense, the prediction model that considers workloads and competencies as predictors is meaningfully better (ΔR2 = 0.15) than the first one, which is made of covariates. All predictor variables and most covariates have a significant effect on the prediction of positive emotions. Negative emotions are the most influential, with the COVID workload and the DCE also carrying significant weight. Covariates have little influence, so they are not relevant for predicting positive emotions. In conclusion, high DCE and the absence of negative emotions could predict positive emotions, and it is significant that a high workload can also be present in the mix.
On the other hand, Table 3 presents the results of the two-step hierarchical regression that predicts negative emotions from the workload and DCE variables, controlling, in turn, as covariates, the effect of socio-demographic variables and positive emotions.
The model explains 38% (total R2) of the variance shared by all variables together, which is higher than in the case of positive emotions (26%). The influence of competencies and workload predictors (ΔR2 = 0.25) is nearly double that of the covariates (R2 = 0.13). The most relevant variable in the prediction is by far the COVID workload, with the DCE also having a significant weight and a significant effect of the covariate ‘positive emotions’.
Therefore, the hierarchical regression confirms that negative emotions can be predicted by high workload and low DCE, as well as the absence of positive emotions. This combination is even worse in the case of women.

4. Discussion

The pandemic has exposed the fact that educational systems around the world should improve their resilience to unexpected situations so that “Quality Education” (SDG 4) can be delivered effectively even in such circumstances. Regarding educators and the goals of “Good health and well-being” (SDG 3) and “Decent work” (SDG 8), this research shows they have experienced high workload, stress, and negative emotions during the ERT. This study also verifies the fact that digital gaps have flourished in this emergency situation. Education systems should guarantee that the digital divide is reduced at all stages of education, which is linked to the goals of “Gender equality” (SDG 5) and “Reduced inequalities” (SDG 10).
The need for proper Digital Competence of Educators has been fundamental for the avoidance of disruption in teaching-learning processes. The results seem to draw a pattern: weaknesses in DCE increase when it comes to situations or tools specifically related to online teaching [62]. In other words, competence is greater in the regular digital communication skills (chat, forum, videoconference, email...) that most people usually use, regardless of their profession. This is an important nuance because it is the specific digital skills needed for the development of teaching methods (creating and managing meaningful activities online, knowing how to use the educational platform, structuring a subject online, etc.) that prove to be more related to good student performance [63,64]. This leads to the need for suitable training on DCE in a structured manner. Several studies [65,66] have addressed this demand for future teachers at faculties of education, as well as the demand for active teachers by means of Professional Development programs. It can be said that COVID-19 has emphasized the importance of teacher professional development for online and blended learning [67].
In the same vein, there are three other digital gaps that the results reveal: the gender gap between active teachers, the age gap, and the gap that arises in relation to the type of educational center (private and public).
In terms of the gender gap, this study has pointed out the lower mean values in DCE for female teachers, as well as the greater likelihood of suffering the mix of negative emotions and high workload during the confinement. The data from this study corroborate what has been observed in other geographical areas [68,69], as well as in other spheres outside of education [70]. In line with this, the European Commission’s publication “Women in the Digital Scoreboard” shows that Spain is in a low position in all the indicators with regard to all types of skills associated with Information and Communication Technologies (ICT) and that the difference between genders is very significant, clearly establishing a difference in favor for men in all the skills analyzed [71]. These data call for urgent action to put women on an equal footing in the so-called fourth industrial revolution. It is perhaps an indicator that, even within the education system, the transmission of social gender roles is maintained and that there is still a lack of role models to contribute to the equal use of ICTs.
An age gap regarding digital issues has also been detected among active teachers in the Autonomous Community of the Basque Country during the pandemic. The need for continuous training has been confirmed because, as [72] point out, the progressive increase in the age divide has a great impact on the instruction of students for full personal and professional life in a world where technology has taken on special relevance. Equal opportunities come hand in hand with the digital literacy of our students [73]. This article presents results based on the self-perception of teachers, and this fact leaves room for interpretation. Perhaps older teachers feel that their tech expertise is not sufficient to reproduce in a digital environment the complex inquiry- and project-based instruction that they can manage in a face-to-face environment thanks to their experience. This opens an interesting line for future research work.
Finally, with regard to the differences detected according to the type of the educational center, this result can be compared with other studies at an international level that show similar conclusions. In a study carried out in 80 public, private, and mixed centers by the Francisco de Vitoria University and the Complutense University of Madrid, it was established that half of the teachers had a very poor or a poor level regarding the use of technology and, at the same time, there were great differences between the digital competences of the mixed centers compared to the public centers, with the latter having worse results [74]. These findings should help the responsible institutions reflect on how they can make the same level of opportunities possible for every student in our increasingly digital world.
Therefore, the effective development of a quality education (SDG 4) that reduces inequalities (SDG 5, SDG 10) requires us to place people at the center of solutions (SDG 3, SDG 8) and avoid the danger of exploiting online learning only from economic perspectives. For this reason, both the diagnosis that we carried out and the possible solutions that the results have allowed us to outline respond to the model that we believe in, which places teachers and students at the center of the teaching-learning process. It is not only a matter of using technological tools but also of thinking digitally and respecting the technical, cognitive, and socio-emotional dimensions, as well as putting technology at the service of pedagogy as advocated by models such as TPACK [75] or its TPeCS review [76]. Any pedagogical alternative to improve the Digital Competence of Educators must be carried out through different practices where the training in the educational centers will go through different stages of both technical and conceptual appropriation of the technology [77,78]. The resilient response to any future situation that may arise should be composed of a sustainable infrastructure and innovation (SDG 9) at the service of competent professionals. In short, next time we should be able to offer quality distance teaching instead of ERT, and the development of DCE is a fundamental axis for the sustainable and resilient education system we need.

Author Contributions

Conceptualization, J.P., U.G., and E.T.; data curation, J.P.; formal analysis, E.T. and N.B.; funding acquisition, U.G.; investigation, U.G.; methodology, U.G. and N.B.; project administration, U.G.; resources, E.T. and N.B.; software, J.P.; supervision, J.P. and U.G.; validation, J.P. and U.G.; writing—original draft, J.P. and U.G.; writing—review and editing, E.T. and N.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the University of the Basque Country, grant number GIU 19/010, PPGI19/11 and by the Basque Government, grant number IT1195-19.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

References

  1. United Nations. Decade of Action. Ten Years to Transform Our World. 2020. Available online: https://www.un.org/sustainabledevelopment/decade-of-action/ (accessed on 10 November 2020).
  2. United Nations. The Sustainable Development Goals Report 2020. Available online: https://unstats.un.org/sdgs/report/2020/The-Sustainable-Development-Goals-Report-2020.pdf (accessed on 10 November 2020).
  3. Wang, G.; Zhang, Y.; Zhao, J.; Zhang, J.; Jiang, F. Mitigate the effects of home confinement on children during the COVID-19 outbreak. Lancet 2020, 395, 945–947. [Google Scholar] [CrossRef]
  4. Díez, E.J.; Gajardo, K. Educar y evaluar en tiempos de Coronavirus: La situación en España. REMIE Multidiscip. J. Educ. Res. 2020, 10. [Google Scholar] [CrossRef]
  5. Choi, B.; Jegatheeswaran, L.; Minocha, A.; Alhilani, M.; Nakhoul, M.; Mutengesa, E. The impact of the COVID-19 pandemic on final year medical students in the United Kingdom: A national survey. BMC Med. Educ. 2020, 20, 206. [Google Scholar] [CrossRef] [PubMed]
  6. INE Encuesta Sobre Equipamiento y Uso de Tecnologías de Información y Comunicación en los Hogares. 2019. Available online: https://cutt.ly/Tg1BVQ0 (accessed on 10 November 2020).
  7. Garmendia, M.; Jiménez, E.; Karrera, I.; Larrañaga, N.; Casado, M.; Martínez, G.; Garitaonandia, C. Actividades, Mediación, Oportunidades y Riesgos Online de Menores en la era de la Convergencia Mediática; UPV/EHU y INCIBE: León, Spain, 2018; Available online: https://www.is4k.es/sites/default/files/contenidos/informe-eukidsonline-2018.pdf (accessed on 10 November 2020).
  8. Hodges, C.; Moore, S.; Lockee, B.; Trust, T.; Bond, A. The difference between emergency remote teaching and online learning. 2020. Available online: https://er.educause.edu/articles/2020/3/the-difference-between-emergency-remote-teaching-and-online-learning (accessed on 5 November 2020).
  9. Moorhouse, B.L. Adaptations to a face-to-face initial teacher education course ‘forced’ online due to the COVID-19 pandemic. J. Educ. Teach. 2020. [Google Scholar] [CrossRef]
  10. Álvarez, D. COVID-19, Educación Digital y el Futuro que se Anticipa. In E-Aprendizaje. 2020. Available online: https://e-aprendizaje.es/2020/03/23/covid-19-educacion-digital-y-el-futuro-que-se-anticipa/ (accessed on 10 November 2020).
  11. Chen, Y.; Jin, J.L.; Zhu, L.J.; Comillo, Z.M.; Wu, N.; Du, M.X.; Jian, M.; Wang, J.; Jao, J.J. The network investigation on knowledge, attitude and practice about COVID-19 of the residents in Anhui Province. Chin. J. Prev. Med. 2020, 54, 367–373. [Google Scholar]
  12. Sivakumar, B. Educational Evaluation Survey on Corona Virus 19 (An Awareness)—South India. Stud. Indian Place Names 2020, 40, 228–234. Available online: https://cutt.ly/pyjZfm9 (accessed on 12 November 2020).
  13. Almodóvar-López, M.; Atiles, J.; Chavarría-Vargas, A.; Dias, M.; Zúñiga-León, I. La enseñanza remota no viene sin desafíos. Rev. Electrón. Educ. 2020, 24, 1–4. [Google Scholar] [CrossRef]
  14. UNESCO. COVID-19 Educational Disruption and Response. 2020. Available online: https://en.unesco.org/covid19/educationresponse (accessed on 11 September 2020).
  15. Gewin, V. Five tips for moving teaching online as COVID-19 takes hold. Nature 2020, 580, 295–296. [Google Scholar] [CrossRef]
  16. Davies, L.; Bentrovato, D. Understanding Education’s Role in Fragility: Synthesis of Four Situational Analyses of Education and Fragility: Afghanistan, Bosnia and Herzegovina, Cambodia, Liberia. UnesDOC. 2011. Available online: https://unesdoc.unesco.org/ark:/48223/pf0000191504 (accessed on 12 November 2020).
  17. Trujillo-Sáez, F.; Fernández-Navas, M.; Montes-Rodríguez, M.; Segura-Robles, A.; Alaminos-Romero, F.J.; Postigo-Fuentes, A.Y. Panorama de la educación en España tras la pandemia de COVID-19: La opinión de la comunidad educativa; Fundación de Ayuda contra la Drogadicción (Fad): Madrid, Spain, 2020. [Google Scholar]
  18. Portillo, S.; Castellanos, L.; Reynoso, O.; Gavotto, O. Enseñanza remota de emergencia ante la pandemia Covid-19 en Educación Media Superior y Educación Superior. Propósitos Represent. J. Educ. Psychol. 2020, 8, e589. [Google Scholar] [CrossRef]
  19. Galindo, D.; García, L.; García, R.; González, P.; Hernández, P.C.; López, M.; Luna, V.; Moreno, C.I. Recomendaciones didácticas para adaptarse a la enseñanza remota de emergencia. Rev. Digit. Univ. 2020, 21. [Google Scholar] [CrossRef]
  20. Bozkurt, A.; Sharma, R.C. Emergency remote teaching in a time of global crisis due to CoronaVirus pandemic. Asian J. Distance Educ. 2020, 15. [Google Scholar] [CrossRef]
  21. Toquero, C.M. Emergency remote education experiment amid COVID-19 pandemic. Int. J. Educ. Res. Innov. 2020, 162–172. [Google Scholar] [CrossRef]
  22. Noesgaard, S.S.; Ørngreen, R. The Effectiveness of E-Learning: An Explorative and Integrative Review of the Definitions, Methodologies and Factors that Promote e-Learning Effectiveness. Electron. J. e-Learn. 2015, 13, 278–290. [Google Scholar]
  23. Johnson, R.D.; Hornik, S.; Salas, E. An empirical examination of factors contributing to the creation of successful e-learning environments. Int. J. Hum. Comput. Stud. 2008, 66, 356–369. [Google Scholar] [CrossRef]
  24. Thalheimer, W. Does eLearning Work? What the Scientific Research Says! Work-Learning Research Website. 2017. Available online: https://bityl.co/4RET (accessed on 12 February 2020).
  25. Branch, L.; Dousay, T. Survey of Instructional Design Model, 2nd ed.; AECT: Indianapolis, IN, USA, 2015. [Google Scholar]
  26. European Commission. Digital Education Action Plan 2021–2027. 2020. Available online: https://ec.europa.eu/education/education-in-the-eu/digital-education-action-plan_en (accessed on 11 September 2020).
  27. Gobierno de España. Plan de España Digital 2025. 2020. Available online: https://cutt.ly/Jg1Mhes (accessed on 10 November 2020).
  28. Prendes, M.P. Competencias TIC Para la Docencia en la Universidad Pública Española. Indicadores y Propuestas Para la Definición de Buenas Prácticas; Universidad de Murcia. Area y Sanabria: Murcia, Spain, 2014; Available online: https://www.um.es/competenciastic (accessed on 10 November 2020).
  29. Alonso-Ferreiro, A. Competencia Digital y Escuela: Estudio de caso Etnográfico en dos CEIP de Galicia. Ph.D. Thesis, Universidad de Santiago de Compostela, Santiago de Compostela, Lugo, Galicia, 2016. [Google Scholar]
  30. Gisbert, M.; González, J.; Esteve, F.M. Competencia digital y competencia digital docente: Una panorámica sobre el estado de la cuestión. Rev. Interuniv. Investig. Tecnol. Educ. 2016, 74–83. [Google Scholar] [CrossRef]
  31. Rodríguez-García, A.M.; Raso, F.; Ruiz-Palmero, J. Competencia digital, educación superior y formación del profesorado: Un estudio de meta-análisis en la web of science. Pixel Bit. Rev. Medios Educ. 2019, 54, 65–81. [Google Scholar] [CrossRef]
  32. Tourón, J.; Martín, D.; Navarro, E.; Pradas, S.; Íñigo, V. Validación de constructo de un instrumento para medir la competencia digital docente de los profesores (CDD). Rev. Española Pedagog. 2018, 76, 25–54. [Google Scholar] [CrossRef]
  33. Durán, M.C.; Prendes, M.P.E.; Gutierrez, I.P. Certificación de la Competencia Digital Docente: Propuesta para el profesorado universitario. Rev. Iberoam. Educ. Distancia 2019, 22, 187–205. [Google Scholar] [CrossRef]
  34. Cabero, J.; Martínez, A. Las tecnologías de la información y comunicación y la formación inicial de los docentes. Modelos y competencias digitales. Profr. Rev. Currículum Form. Profr. 2019, 23, 247–268. [Google Scholar] [CrossRef]
  35. Cabero-Almenara, J.; Barroso-Osuna, J.; Palacios-Rodríguez, A.; Llorente-Cejudo, C. Marcos de Competencias Digitales para docentes universitarios: Su evaluación a través del coeficiente competencia experta. Rev. Electrón. Interuniv. Form. Profr. 2020, 23. [Google Scholar] [CrossRef]
  36. European Commission. European Framework for the Digital Competence of Educators (DigCompEdu). 2017. Available online: https://ec.europa.eu/jrc/en/publication/eur-scientific-and-technical-research-reports/european-framework-digital-competence-educators-digcompedu (accessed on 9 November 2020).
  37. Colbert, A.; Yee, N.; George, G. The digital workforce and the workplace of the future. Acad. Manag. J. 2016, 59, 781–789. [Google Scholar] [CrossRef]
  38. Trujillo, F. Aprender y Enseñar en Tiempos de Confinamiento; Catarata: Madrid, Spain, 2020. [Google Scholar]
  39. Hung, M.L.; Chou, C.H.; Chen, C.H.; Own, Z. Learner readiness for online learning; Scale development and student perceptions. Comput. Educ. 2010, 55, 1080–1090. [Google Scholar] [CrossRef]
  40. Gutiérrez Castillo, J.J.; Cabero, J. Estudio de caso sobre la autopercepción de la competencia digital del estudiante universitario de las titulaciones de grado de Educación Infantil y Primaria. Profr. Rev. Currículum Form. Profr. 2016, 20, 180–199. [Google Scholar]
  41. INTEF. Marco Común de Competencia Digital Docente 2017. Available online: https://cutt.ly/Xg1MPuE (accessed on 12 November 2020).
  42. Hart, S.G.; Staveland, L.E. Development of NASA-TLX (Task Load Index): Results of Empirical and Theoretical Research. Adv. Psychol. 1988, 52, 139–183. [Google Scholar] [CrossRef]
  43. Hill, S.; Iavecchia, H.; Byers, J.; Bittner, A.C.; Zaklad, A.L.; Christ, R.E. Comparison of four subjective workload rating scales. Hum. Factors 1992, 34, 429–439. [Google Scholar] [CrossRef]
  44. Guedes, S.; Mutti, C. Affections in learning situations: A study of an entrepreneurship skills development course. J. Workplace Learn. 2010, 23, 195–208. [Google Scholar] [CrossRef]
  45. Rebollo-Catalan, M.A.; García-Pérez, R.; Buzón-García, O.; Vega-Caro, L. Las emociones en el aprendizaje universitario apoyado en entornos virtuales: Diferencias según la actividad de aprendizaje y motivación del alumnado. Rev. Complut. Educ. 2013, 25, 69–93. [Google Scholar] [CrossRef]
  46. Garay, U.; Romero, A.; Tejada, E.; Portillo, J.; Bilbao, N.; de la Serna, A.L. Dataset of Adieraz Project; OSF (Open Science Framework): Frankfurt, Germany, 2020. [Google Scholar] [CrossRef]
  47. Sabariego, M. El proceso de investigación. In R. Bisquerra (Coord). Metodología de la Investigación Educativa, 3rd ed.; La Muralla: Madrid, Spain, 2012; pp. 127–163. [Google Scholar]
  48. Blanco, M.C.; Castro, A.B. El muestreo en la investigación cualitativa. Nure Investig. 2007, 27, 1–4. [Google Scholar]
  49. Instituto Vasco de Estadística. Personal Docente en la C.A. de Euskadi por Sexo, Nivel, Territorio Histórico y Titularidad, 2018/19. 2020. Available online: https://bityl.co/4eKJ (accessed on 12 November 2020).
  50. Instituto Vasco de Estadística. Encuesta Sobre la Sociedad de la Información; Familias (ESIF): Vitoria, Spain, 2019. [Google Scholar]
  51. Lorenzo-Seva, U.; Ferrando, P.J. FACTOR 9.2 A Comprehensive Program for Fitting Exploratory and Semiconfirmatory Factor Analysis and IRT Models. Appl. Psychol. Meas. 2013, 37, 497–498. [Google Scholar] [CrossRef]
  52. Timmerman, M.E.; Lorenzo-Seva, U. Dimensionality assessment of ordered polytomous items with parallel analysis. Psychol. Methods 2011, 16, 209–220. [Google Scholar] [CrossRef]
  53. Mîndrilă, D. Maximum Likelihood (ML) and Diagonally Weighted Least Squares (DWLS) estimation procedures: A comparison of estimation bias with ordinal and multivariate non-normal data. Int. J. Digit. Soc. 2010, 1, 60–66. [Google Scholar] [CrossRef]
  54. Jöreskog, K.; Sörbom, D. Lisrel 8: User’s Reference Guide; Scientific Software International: Lincolnwood, IL, USA, 1997. [Google Scholar]
  55. Browne, M.W.; Cudeck, R. Alternative ways of assessing fit. Sociological Methods and Research. In Testing Structural Equation Models; Bollen, K.A., Ed.; Sage: Newbury Park, CA, USA, 1992; pp. 136–162. [Google Scholar] [CrossRef]
  56. Hu, L.; Bentler, P.M. Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Struct. Equ. Model. 1999, 6, 1–55. [Google Scholar] [CrossRef]
  57. Gobierno Vasco. Enrolment in Year 2019–2020. Department of Education. 2020. Available online: https://www.euskadi.eus/matricula-graficos-evolutivos/web01-a2hestat/es/ (accessed on 8 November 2020).
  58. Herrero-Fernandez, D. Spanish adaptation of the behavioural dimension of the Displaced Aggression Questionnaire: Analysis of validity with general and on-the-road measures of anger and aggression. Rev. Psicol. Soc. 2013, 28, 273–284. [Google Scholar] [CrossRef]
  59. Herrero-Fernandez, D.; Oliva-Macias, M.; Parada-Fernandez, P. Development of the Pedestrian Anger Scale. A Pilot Study. Span. J. Psychol. 2019, 22. [Google Scholar] [CrossRef]
  60. Redondo, I.; Herrero-Fernandez, D. Validation of the Reading the Mind in the Eyes Test in a healthy Spanish sample and women with anorexia nervosa. Cogn. Neuropsychiatry 2018, 23, 201–217. [Google Scholar] [CrossRef]
  61. Cohen, J. Statistical Power Analysis for the Behavioral Sciences, 2nd ed.; Erlbaum: Hillsdale, NJ, USA, 1988. [Google Scholar]
  62. Andía, L.A.; Santiago, R.; Sota, S.M. Estamos técnicamente preparados para el Flipped Classroom? Un análisis de las competencias digitales de los profesores en España. Contextos Educ. 2020, 25, 275–311. [Google Scholar] [CrossRef]
  63. Poyo, S.R. Transforming Teacher Preparation: Assessing Digital Learners’ Needs for Instruction in Dual Learning Environments. Available online: https://bit.ly/2V5Kl3g (accessed on 5 November 2020).
  64. Cabero, J. Aprendiendo del tiempo de la COVID-19. Rev. Electron. Educ. 2020, 24, 1–3. [Google Scholar] [CrossRef]
  65. Marin, V.; Reche, E. Universidad 2.0. Actitudes y aptitudes ante las TIC del alumnado de nuevo ingreso de la Escuela Universitaria de Magisterio de la UCO. Pixel Bit. Rev. Medios Educ. 2012, 40, 197–211. [Google Scholar]
  66. Fernández-Márquez, E.; Vázquez-Cano, E.; López-Meneses, E.; Sirignano, F. La competencia digital del alumnado universitario de diferentes universidades europeas. Rev. Espac. 2020, 41, 1–15. [Google Scholar]
  67. Portillo, J.; Lopez de la Serna, A. An international perspective for ‘Improving teacher professional development for online and blended learning: A systematic meta-aggregative review’. Educ. Technol. Res. Dev. 2020. [Google Scholar] [CrossRef]
  68. López, J.; Pozo, S.; Fuentes, A. Analysis of electronic leadership and digital competence of teachers of educational cooperatives in Andalucia (Spain). Multidiscip. J. Educ. Res. 2019, 9, 194–223. [Google Scholar] [CrossRef]
  69. Cabezas, M.; Casillas, S.; Sanches-Ferreira, M.; Teixeira, F.L. Do gender and age affect the level of digital competence? A study with university students. Fonseca. J. Commun. 2017, 115–132. [Google Scholar] [CrossRef]
  70. Sáinz, M.; Arroyo, L.; Castaño, C. Mujeres y Digitalización. De las Brechas a los Algoritmos; Instituto de la Mujer y para la Igualdad de Oportunidades, Ministerio de Igualdad: Madrid, Spain, 2020. Available online: https://cpage.mpr.gob.es (accessed on 5 November 2020).
  71. ONTSI. Dossier de Indicadores del Índice de Desarrollo Digital de las Mujeres en España. 2019. Available online: https://cutt.ly/Ag1GEa8 (accessed on 12 November 2020).
  72. Peral, B.; Arenas, J.; Villarejo-Ramos, A.F. De la brecha digital a la brecha psico-digital: Mayores y redes sociales. Comun. Rev. Científica Iberoam. Comun. Educ. 2015, 57–64. [Google Scholar] [CrossRef]
  73. Tello Díaz, J.; Aguaded Gómez, J.I. Desarrollo profesional docente ante los nuevos retos de las tecnologías de la información y la comunicación en los centros educativos. Pixel Bit. Rev. Medios Educ. 2009, 34, 31–47. [Google Scholar]
  74. Fernández Cruz, F.J.; Fernández Díaz, M.J.; Rodríguez Mantilla, J.M. El proceso de integración y uso pedagógico de las TIC en los centros educativos madrileños. Educ. XX1 2018, 21, 395–416. [Google Scholar] [CrossRef]
  75. Mishra, P.; Koehler, M.J. Technological Pedagogical Content Knowledge: A new framework for teacher knowledge. Teach. Coll. Rec. 2006, 108, 1017–1054. [Google Scholar] [CrossRef]
  76. Kali, Y.; Sagy, O.; Benichou, M.; Atias, O.; Levin-Peled, R. Teaching expertise reconsidered: The Technology, Pedagogy, Content and Space (TPeCS) knowledge framework. Br. J. Educ. Technol. 2019, 50, 2162–2177. [Google Scholar] [CrossRef]
  77. Cejas-León, R.; Navío, A.; Barroso, J. Las competencias del profesorado universitario desde el modelo TPACK (conocimiento tecnológico y pedagógico del contenido). Píxel-Bit. Rev. Medios Educ. 2016, 49, 105–119. [Google Scholar] [CrossRef]
  78. Marín, V.; Vázquez, A.I.; Llorente, C.; Cabero, J. La alfabetización digital del docente universitario en el Espacio de Educación Superior. Edutec. Rev. Electrón. Tecnol. Educ. 2012, 39, 1–10. [Google Scholar] [CrossRef]
Table 1. Correlation coefficient between variables (Pearson’s r).

                                                   1         2         3         4         5
1. Pre-COVID workload factor                       -
2. COVID workload factor                           0.18 **   -
3. Positive Emotions factor                        0.17 **   0.12 **   -
4. Negative Emotions factor                        0.02      0.47 **   −0.32 **  -
5. Digital Competence of Educators (DCE) factor    0.05 **   −0.03 *   0.31 **   −0.25 **  -
* p < 0.05, ** p < 0.001.
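As an aside for readers checking the correlations in Table 1, Pearson’s r is the covariance of two variables normalized by the product of their standard deviations. A minimal sketch (the data below are illustrative, not the study’s dataset):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson's product-moment correlation coefficient."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc = x - x.mean()
    yc = y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

# Toy data chosen to show the sign convention of r.
a = [1, 2, 3, 4, 5]
b = [2, 4, 6, 8, 10]   # increases with a -> r = 1
c = [5, 4, 3, 2, 1]    # decreases with a -> r = -1

print(pearson_r(a, b))  # 1.0
print(pearson_r(a, c))  # -1.0
```

The sign therefore matches the reading of Table 1: for example, the negative r between the Negative Emotions and DCE factors means higher reported competence goes with fewer negative emotions.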
Table 2. Hierarchical regression analysis of predictors of positive emotions.

Measurement                                     B       E.T. B   β
Step 1—Covariate variables
Educational stage                               0.03    0.04     0.01
Type of center                                  0.40    0.11     0.05 ***
Gender (1 = man; 2 = woman; 3 = non-binary)     0.07    0.13     0.01
Age                                             0.11    0.03     0.05 **
Negative emotions                               −0.28   0.01     −0.32 ***
Step 2—Workload + competencies
Educational stage                               −0.05   0.03     −0.02
Type of center                                  0.22    0.10     0.03 *
Gender (1 = man; 2 = woman; 3 = non-binary)     −0.23   0.12     −0.03 *
Age                                             0.19    0.03     0.08 ***
Negative emotions                               −0.35   0.01     −0.40 ***
Workload before lockdown                        0.15    0.02     0.11 ***
Workload during lockdown                        0.39    0.02     0.29 ***
Competencies                                    0.24    0.01     0.23 ***
* p < 0.05; ** p < 0.01; *** p < 0.001. Positive emotions: R² = 0.11 (p < 0.001) in step 1; ΔR² = 0.15 (p < 0.001) in step 2; total R² = 0.26.
Table 3. Hierarchical regression analysis of predictors of negative emotions.

Measurement                                     B       E.T. B   β
Step 1—Covariate variables
Educational stage                               −0.17   0.04     −0.06 ***
Type of center                                  0.07    0.13     0.01
Gender (1 = man; 2 = woman; 3 = non-binary)     1.27    0.14     0.13 ***
Age                                             0.13    0.04     0.05 **
Positive emotions                               −0.37   0.02     −0.32 ***
Step 2—Workload + competencies
Educational stage                               −0.09   0.03     −0.03 **
Type of center                                  −0.02   0.11     −0.01
Gender (1 = man; 2 = woman; 3 = non-binary)     0.18    0.12     0.02
Age                                             −0.03   0.03     −0.01
Positive emotions                               −0.39   0.02     −0.34 ***
Workload before lockdown                        −0.02   0.02     −0.01
Workload during lockdown                        0.78    0.02     0.50 ***
Competencies                                    −0.14   0.02     −0.12 ***
** p < 0.01; *** p < 0.001. Negative emotions: R² = 0.13 (p < 0.001) in step 1; ΔR² = 0.25 (p < 0.001) in step 2; total R² = 0.38.
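The ΔR² values reported beneath Tables 2 and 3 are simply the gain in explained variance when the step-2 block (workload and competence predictors) is added on top of the step-1 covariates. A minimal sketch of that two-step computation via ordinary least squares on synthetic data (the variable names, coefficients, and sample size are illustrative stand-ins, not the survey data):

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an OLS fit of y on X (an intercept column is added)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    ss_res = float(resid @ resid)
    ss_tot = float(((y - y.mean()) ** 2).sum())
    return 1.0 - ss_res / ss_tot

rng = np.random.default_rng(0)
n = 500
covariates = rng.normal(size=(n, 2))   # step-1 block, e.g., age, educational stage
predictors = rng.normal(size=(n, 2))   # step-2 block, e.g., workload, competencies
y = covariates @ [0.3, 0.1] + predictors @ [0.5, 0.4] + rng.normal(size=n)

r2_step1 = r_squared(covariates, y)
r2_step2 = r_squared(np.hstack([covariates, predictors]), y)
delta_r2 = r2_step2 - r2_step1         # extra variance explained by the step-2 block
print(round(r2_step1, 3), round(r2_step2, 3), round(delta_r2, 3))
```

Read against Table 3, the analogous quantities are R² = 0.13 for the covariates alone and ΔR² = 0.25 once workload and competencies enter, i.e., the step-2 predictors roughly triple the explained variance in negative emotions.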
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.