Article

Digitalization of Educational Organizations: Evaluation and Improvement Based on DigCompOrg Model

by Ángel David Fernández-Miravete and Paz Prendes-Espinosa *
Research Group in Educational Technology, University of Murcia, 30100 Murcia, Spain
* Author to whom correspondence should be addressed.
Societies 2022, 12(6), 193; https://doi.org/10.3390/soc12060193
Submission received: 3 November 2022 / Revised: 12 December 2022 / Accepted: 13 December 2022 / Published: 18 December 2022
(This article belongs to the Special Issue Digital Transformation: Social and Educational Perspective)

Abstract

The digitalization of educational organizations is a political and social priority at the European level, and the model on which our analysis is based is DigCompOrg, part of the European framework of digital competences. This article summarizes the results of longitudinal evaluative research (2018–2022) on the digitalization process of a compulsory secondary education center. We applied a mixed method and an evaluative research design based on the use of questionnaires, focus groups and a research diary. This article focuses on the data from the last evaluation (2021–2022), in which the participants were 26 members of the management team, 46 teachers and 374 students. Our results show that progress has been made in the digitalization process, especially in areas such as leadership, infrastructure/equipment and pedagogy (supports and resources), which obtained high scores. The data also show areas with more scope for improvement, such as collaboration, digital networks and innovative assessment practices. This research can serve as an example of good practice in the digitalization of formal education institutions.

1. Introduction

Digital technologies have transformed the way we live and the way we learn, which has very relevant educational implications. Consequently, schools, as complex organizations, need to adapt to these new ways of learning and acquiring knowledge. That is why the European Commission has, over the last fifteen years, developed frameworks and tools to support the improvement of digital skills in education systems. At the same time, the Ministries of Education of European countries, and of many other countries worldwide, have increasingly supported the teaching of digital competence in formal school education, a need which has become more evident since the COVID-19 pandemic [1,2]. The European Commission recognizes this and reinforces the movement in this direction through the Digital Education Action Plan 2021–2027, which proposes a vision of high-quality, inclusive and accessible digital education in Europe and entails improving digital skills and capacities for digital transformation [3].
One of the fundamental aspects of this digital transformation of education is the improvement of digital competence, which not only involves the competence of the main agents of the educational process (students and teachers), but also guides their actions in the educational center, regarded as an enabling and dynamic agent [4] as well as an organizational learning environment from the perspective of organizational psychology [5,6].
In this way, digital competence is addressed in the analysis of organizations and, at the same time, emerges as an essential skill for citizens of the 21st century [7], one that guarantees the continuity of education in any social situation at a global level [8]. Initially defined as “the confident and critical use of Information Society Technology (IST) for work, leisure and communication” [9] (p. 394/15), it has since been the subject of many definitions and approaches that try to narrow down its meaning [10]. In this research we rely on the concept of digital competence based on the model “DigComp: The Digital Competence Framework for Citizens” [11], revised in “DigComp 2.0: The Digital Competence Framework for Citizens” [12]. This competency model distinguishes five dimensions (information, communication, content creation, safety and problem solving) and broadly defines digital competence as the safe, critical and creative use of digital technologies to achieve goals related to work, employability, learning, leisure, inclusion and participation in society [12].
Other key contributions to facilitating the digital transformation of education, training and the acquisition of digital skills are the “European Framework for the Digital Competence of Educators” (DigCompEdu) [13] and the “European Framework for Digitally Competent Educational Organisations” (DigCompOrg) [14,15]. The concept of a digitally competent educational organization refers to the effective use of digital technology by the organization in a way that provides a compelling student experience and ensures a good return on the investment made in digital technology [15]. In this line, different studies point out the suitability of this model for evaluating educational actions within a digital action plan [16,17,18,19]. Based on these frameworks, the European Commission, through the Joint Research Centre (JRC), developed in 2018 the online self-reflection tool on digital skills for schools called “SELFIE” (“Self-reflection on Effective Learning by Fostering the use of Innovative Educational technologies”) [20] and, in 2021, the “SELFIE for TEACHERS” tool [21] to help teachers reflect on how they use digital technologies in their professional practice.
As a consequence of the Council Recommendation to adapt education and training systems to the digital age, the member states of the European Union have begun to define official curricula and review study plans in accordance with these strategic lines. According to a recent report by the European Commission/EACEA/Eurydice [22], more than half of the European countries treat digital competence as a cross-cutting theme in primary education, but this percentage decreases in secondary and non-compulsory education. In the Spanish regulatory context, Organic Law 3/2020, of 29 December, which modifies Organic Law 2/2006, of 3 May, on Education (LOMLOE) [23], establishes the obligation to take into account the digital change that is taking place in society. The preamble of the law specifies that the educational system must cater to a new social reality with a broader and more modern approach to digital competence, in accordance with the European recommendations regarding competences for lifelong learning and the gender digital divide [24].
Similarly, the Spanish autonomous communities have begun to establish digital transformation plans at all levels of compulsory education and training, promoting the evaluation of skills with instruments based on European reference models [25,26,27]. The response of the autonomous community of the Region of Murcia (Spain), which is the geographical context of our research, is the implementation of the educational program called “Digital Centers” [28], offered for the compulsory primary and secondary education stages and in whose institutional context this research emerges. With a duration of four academic years, this program was created with the aim of advancing the widespread use of digital technologies in the educational centers that take part in it. Among its main purposes are to increase the digital competence of students, to promote active and participatory methodologies through the use of technologies and to improve teachers’ digital competence through training. It is relevant to note that “Digital Centers” was replaced in the 2021–2022 academic year by the “Prodigi-e Plan (2021–2025)” program [29]. This program continues along the path of the previous one, but goes further by proposing an educational digital transformation in educational centers that provide non-university education. This digital transformation is understood as the set of actions that guide the improvement and modernization of the processes, habits and behavior of educational organizations and their staff using digital technologies. In addition, the use of SELFIE is required as a tool for establishing the grounds for the elaboration of a digital plan in the center, understood as the instrument that adapts and facilitates the use of digital media in the teaching and learning process for the integral development of the student body [30].
The validity of the SELFIE tool is supported by numerous studies that use it to assess the digital maturity of educational institutions [31,32,33,34,35,36,37]. From these works, it can be concluded that SELFIE represents the practical implementation of DigCompOrg in terms of operationalizing and evaluating the digital capacity of schools: it offers valid information that can be used to prepare a school digital strategy regarding potential areas for improvement, and it also helps to monitor progress in digital maturity over time.

2. Research Problem and Objectives

This research is situated in the field of scientific studies aimed at educational improvement based on precise data about the context. Thus, the research problem is: how can the digitalization process of an educational organization be evaluated in order to promote improvement processes? To this end, the DigCompOrg model has been taken as a basis and the following research objectives have been specified:
  • Evaluating the digital capacity of the educational institution based on the perception of the main agents involved (school leaders, teachers and students).
  • Analyzing the improvement processes through a longitudinal analysis that allows us to observe the evolution of the study over time.
The main final outcome of these research objectives is the design of a Digitalization Plan for the future of our secondary school, aimed at reaching the top level of the model in every indicator and dimension. This research and the resulting improvement plan can serve as models for future research and for other educational organizations interested in the digitalization process as measured by the indicators of the European DigCompOrg model.

3. Materials and Methods

A non-experimental, change-oriented, evaluative research design was developed. In relation to the temporal dimension, it is a longitudinal design; in terms of participants, it is a single-case design [38]. The case was chosen by convenience, since the educational center in which the research is carried out has developed, over the last four academic years, an educational innovation program to improve the digital capacity of the center called “Digital Centers” [28], whose main purpose is to promote the progressive incorporation of digital resources and media into the teaching and learning process.
A mixed method has been used with quantitative techniques (SELFIE questionnaire) and qualitative techniques (focus groups and researcher’s diary) that have allowed the development of participatory strategies in order to build co-design processes for the proposal of improvement strategies [39]. This type of mixed design fits the definition of concurrent mixed method, based on the use of both quantitative and qualitative data “in order to provide a comprehensive analysis of the research problem” [40] (p. 31), thus integrating the information to improve the interpretation of the results.
For the successive implementations of the improvement processes, the instructional design is based on the ADDIE model [41,42], which allows the basic aspects of the educational innovation program to be integrated into the traditional sequence that goes from the analysis of the context to the evaluation. The data from the first phases of the research have been published in previous works [43,44,45]. This article focuses mainly on the third phase of the evaluation, while emphasizing the comparative analysis with data from previous cycles.

3.1. Participants

For the application of the SELFIE questionnaire, all compulsory secondary education students (ESO in Spain), who were the direct recipients of the educational program to improve digital skills, were invited to participate, as well as all teachers and the management team. The total number of participants in the 2021–2022 academic year was 446: 374 students (ST in the tables), 46 teachers (T) and 26 school leaders (SL), i.e., members of the management team. The numbers of participants in the different cycles of the study since the 2018–2019 academic year are shown in Table 1.
For the creation of the focus groups, the entire population that had previously answered the questionnaire was invited. Based on criteria of representativeness, three discussion groups were formed, consisting of key informants who volunteered. The criterion of representativeness among the students was the selection of students of different educational levels and genders, seeking parity among the members. In relation to the teachers and school leaders, specialists of different ages and work experience were selected, always trying to respect gender balance. Each focus group had between 5 and 10 members in order to establish regular and effective communication [46].
Finally, the third instrument was a research diary (RD), kept by a member of the management team who is also a member of the research group and was therefore involved both in the improvement process and in the research process, following a social approach to educational research.

3.2. Phases

The first phase of the research began during the 2018–2019 academic year, when the SELFIE questionnaire (school leaders, teachers and students) was applied, leading to the detection of the first needs, which allowed the design of improvement actions in relation to the use of digital technologies in the educational center [43,44]. The next application phase was scheduled for the 2019–2020 academic year, but had to be postponed for one academic year due to the lockdown which ensued from the COVID-19 pandemic. During the 2020–2021 academic year, the questionnaire was applied again and the analysis of the results made it possible to design a second digital improvement action plan [45]. These actions were implemented and their results were evaluated in the 2021–2022 academic year; those results are the ones presented in this article (Figure 1).

3.3. Instruments to Collect Data

For the collection of quantitative data, the SELFIE questionnaire, designed on the basis of the DigCompOrg model, was selected to provide schools with information on their effective use of digital technologies [20]. This tool guarantees the anonymous and secure collection of the opinions of students, teachers and school leaders on digital technologies for education through brief statements scored on a Likert scale from 1 (lowest score) to 5 (highest score). The questionnaire is structured around sixty-two items grouped into eight dimensions of analysis: (A) leadership; (B) collaboration and networking; (C) infrastructure and equipment; (D) continuing professional development; (E) pedagogy: supports and resources; (F) pedagogy: implementation in the classroom; (G) assessment practices and (H) student digital competence.
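The area-level averages reported in the Results section are the descriptive statistics produced by the SELFIE platform itself. As a minimal sketch, assuming invented item codes and scores (and the convention, used only for illustration, that the first letter of each item code identifies its dimension), the following Python fragment shows how such dimension means could be reproduced from raw Likert responses; it is not the platform's implementation.

```python
# Illustrative sketch only: aggregating hypothetical Likert responses (1-5)
# into dimension-level means, mirroring the kind of descriptive statistics
# the SELFIE platform returns for each area.
from statistics import mean

# The eight DigCompOrg-based areas used by SELFIE.
DIMENSIONS = {
    "A": "Leadership",
    "B": "Collaboration and networking",
    "C": "Infrastructure and equipment",
    "D": "Continuing professional development",
    "E": "Pedagogy: supports and resources",
    "F": "Pedagogy: implementation in the classroom",
    "G": "Assessment practices",
    "H": "Student digital competence",
}

# Each response: respondent group ("SL", "T", "ST") and item scores keyed by
# item code (e.g., "A5"); all values here are invented.
responses = [
    {"group": "T",  "scores": {"A5": 4, "A4": 2, "C13": 4, "G6": 3}},
    {"group": "SL", "scores": {"A5": 4, "A4": 3, "C13": 4, "G6": 3}},
    {"group": "ST", "scores": {"C13": 5, "G6": 2}},
]

def area_means(responses):
    """Return the mean score per area (area = first letter of the item code)."""
    by_area = {}
    for r in responses:
        for item, score in r["scores"].items():
            by_area.setdefault(item[0], []).append(score)
    return {DIMENSIONS[a]: round(mean(v), 1) for a, v in sorted(by_area.items())}

print(area_means(responses))
# {'Leadership': 3.2, 'Infrastructure and equipment': 4.3, 'Assessment practices': 2.7}
```

The same grouping could be repeated per respondent group to obtain the separate SL, T and ST averages shown in the tables.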
For the collection of qualitative information, the discussion group technique was chosen. To this end, an ad hoc semi-structured script was developed, organized around a battery of open-response questions based on the same dimensions of analysis as SELFIE [44]. It was also complemented with a research diary (RD), an instrument in which the main researcher’s observations were collected in an organized and systematic way in his role as moderator of the focus groups and coordinator of the digitalization process at the educational center. It consisted of a systematic record of information kept by a single key informant, using the chronology of the annotations as the main organizing criterion.
These three techniques have made it possible to triangulate the information obtained with the aim of giving the study greater validity, in addition to the value of qualitative information itself as a strategy for approaching the real context of educational research [40].

3.4. Procedure

At the beginning of this research, both students and teachers were informed and advised by an expert who guided them throughout the implementation process and during the days on which the information was collected. The questionnaire was filled out electronically at the center (mainly via mobile phone) or at home. The data obtained were anonymous, aggregated and stored on the European Commission’s server. Once collected, the data were analyzed using the descriptive statistics offered by SELFIE.
For the collection of qualitative data, different discussion groups were formed, made up of key informants who had previously filled out the questionnaire. The sessions took place at the center and were conducted by the lead investigator. Once the informed consent of the participants was obtained, the sessions were recorded and the verbal content was transcribed for subsequent analysis. For the analysis of these data, tables were created in which the verbatim quotations were categorized according to the thematic areas and competency descriptors included in SELFIE, as sketched below. This analysis was completed with the observations collected in the research diary during the three phases of the investigation.
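As a small illustration of the kind of categorization table described above (an assumed structure, not the authors' actual coding scheme), the following Python sketch groups transcribed quotations by SELFIE area and descriptor; the records, speaker codes and descriptor labels are invented examples.

```python
# Minimal sketch of a qualitative coding table: verbatim quotations grouped
# by SELFIE area and descriptor for later triangulation with the survey data.
from collections import defaultdict

# Invented example records: (area, descriptor, speaker code, quotation).
coded_quotes = [
    ("A", "A4 time to explore digital teaching", "T2",
     "I would like to have more time to educate myself properly..."),
    ("C", "C13 bring your own device", "ST2",
     "I did not know I could ask for a computer..."),
]

def build_table(records):
    """Group quotations by area, then by descriptor."""
    table = defaultdict(lambda: defaultdict(list))
    for area, descriptor, speaker, quote in records:
        table[area][descriptor].append((speaker, quote))
    return table

for area, descriptors in sorted(build_table(coded_quotes).items()):
    for descriptor, quotes in descriptors.items():
        for speaker, quote in quotes:
            print(f"{area} | {descriptor} | {speaker} | {quote}")
```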

4. Results

We have jointly analyzed the quantitative and qualitative results since, as explained, the qualitative information was used to triangulate and contrast the quantitative data obtained with the SELFIE questionnaires for school leaders, teachers and students. We present the results in relation to every dimension of the SELFIE model, considering that not all dimensions are included in the students’ questionnaire.

4.1. Leadership

This dimension does not appear in the student questionnaire. Of the five indicators of this dimension, in the 2021–2022 academic year the item with the highest score was “copyright and licensing rules” (A5), with an average value of 4.1, with the teaching staff rating it the highest. In contrast, the descriptor “time to explore digital teaching” (A4) obtained the lowest mean score (2.4), with values similar to the previous academic year (Table 2).
These scores are consistent with what was expressed in the discussion groups, since both the management staff and the teaching staff generally appreciated the existence of a digital strategy at the school. The majority of the participants also stated that the center usually respects copyright and licensing rules when using digital technologies. However, most teachers pointed to the lack of time to explore how to improve their methodology with digital technologies as one of the main drawbacks: “Since three years ago, we have been following a route in our center that allows us to continue with a digital plan” (SL1). “I would like to have more time to educate myself properly and put what I learn into practice in the classroom” (T2).
Most teachers still point to the lack of time as one of the main barriers limiting the exploration of technologies. This belief could explain the “scarce presence of teachers in the autonomous training related to new technologies” that the educational center has scheduled in the afternoons over the last four academic years (RD, 13 October 2021).
Therefore, in this area we can observe a coincidence between the quantitative and qualitative data. In addition, the qualitative techniques highlight that, in the teachers’ view, the lack of time is one of the main reasons hindering the use and exploration of digital technologies in the classroom.

4.2. Collaboration and Networking

The students were not asked any questions in relation to this topic. In the 2021–2022 academic year, out of a total of four indicators, the descriptor that obtained the highest average score was “progress review” (B1) with 3.5, while “synergies for blended learning” (B4) obtained 3.1, which shows that little progress has been made over time in the collaboration with other centers or organizations to support the use of digital technologies (Table 3).
In the discussion groups, most of the participants reported that the progress in teaching and learning with digital technologies made in the educational center has been assessed, although they did not specify how. All the members of the management team recognized that they have hardly collaborated with other organizations to support the use of digital technologies: “Since the pandemic we use more digital platforms and tools to work with students” (T3). “We barely have contact with other ‘digital centers’ to see what they are doing within the program (Digital Centers)” (SL2).
It is remarkable that the descriptor with the worst rating in this area is “synergies for blended learning”, since, according to the management team, during the 2020–2021 academic year the center collaborated with other international educational centers through the eTwinning program, which is part of Erasmus+, the European Union program for education, training, youth and sport that promotes collaboration and twinning between schools from different European countries through the use of technology. A possible explanation would be the “lack of publicity of this type of project among the entire educational community” (RD, 10 March 2022).
The contrast between quantitative and qualitative data in this area shows that the point of view of the management team differs from that of the teachers, especially in those items related to collaboration in projects with other schools or organizations to support the use of digital technologies. The discussion groups show that the teachers have a more negative view than the school leaders, despite the fact that the teachers scored this item higher in the questionnaire.

4.3. Infrastructure and Equipment

School leaders, teachers and students participated in this area. In the 2021–2022 academic year, out of a total of sixteen indicators, the descriptor with the highest average score was “bring your own device” (C13) with a 4.1. It was closely followed by the descriptor “digital devices for learning” (C8) with an average of 4 points, a very notable increase compared to the 2018–2019 academic year. The descriptor that again obtained the lowest score was “online libraries/repositories” (C16) with an average of 3.1, with management staff being the group that awarded the fewest points (Table 4).
In the discussion groups, all the participants agreed that they brought their own devices (mainly mobile phones) to the center, although the teachers thought that students’ use of mobile phones was excessive and frequently inappropriate. In addition, the management team stated that there were digital devices available for students if they needed them, although most students said they were unaware of this. On the other hand, most of the participants were not aware that there were online libraries with learning material in the center: “In this center we have many computers to be able to work with the students (…), but the use they make of their mobile phones, or the amount of time they used them should be limited” (T5). “I did not know I could ask for a computer, and take it with me if I need it” (ST2).
This area was one of the most highly valued, obtaining high average scores in all its items. This perception may have been motivated by the investment in electronic devices and infrastructure that has been made in the center mainly from the 2019–2020 academic year to the current one. The educational center has increased the charging points in several classrooms and has invested in the “purchase of 48 laptops and 10 mobile data SIM cards” which can be borrowed, with the aim of reducing the digital divide among students (RD, 1 November 2021).
In this area, the quantitative and qualitative data coincide, yielding similar results. In addition, the qualitative techniques have allowed us to highlight a potential source of conflict regarding students’ use of mobile devices in the classroom: teachers are calling for internal regulation of the use of mobile phones at school.

4.4. Continuing Professional Development

School leaders and teachers participated in this area. In the 2021–2022 academic year, out of a total of three descriptors, the one with the highest average score was “participation in CPD” (D2) with 3.7, with the teaching staff giving it the highest score. It was followed, with a slightly lower value, by “CPD needs” (D1) and “sharing experiences” (D3) with 3.4 points, values similar to previous academic years (Table 5).
In the discussion groups, most of the participants agreed that they have had many training opportunities, both internally and externally through the Center for Teachers and Resources (CPR) of the Region of Murcia: “There are many courses to train in new technologies (…) many of which are very interesting, although what we lack is time to do all the ones I would like” (T6).
Teachers and school leaders have been generally satisfied with the training offer related to digital technologies available both through the Center for Teachers and Resources and through the autonomous training offered by the center: “autonomous training seminars in the afternoon” and “short-term training for teachers that are scheduled weekly during recess and are taught by the teachers themselves” (RD 9 December 2021).
Therefore, in this area we can see that there is a coincidence between the quantitative and qualitative data, but qualitative techniques have been able to highlight that teachers often do not find the opportunity to put into practice what they have learned through internal and/or external training.

4.5. Pedagogy: Supports and Resources

School leaders and teachers participated in this area. In the 2021–2022 academic year, out of a total of five descriptors, “online educational resources” (E1) together with “communicating with the school community” (E4) were the descriptors that obtained the highest average score with 4.3 and 4.2, respectively. They were followed by “using virtual learning environments” (E3) and “open educational resources” (E5) with 3.8 points, which indicates a fairly positive assessment of the support and resources available at the center. On the other hand, “creating digital resources” (E2) obtained the lowest average score with 3.6. Similar values to last year were observed, but there was a very notable increase with respect to the 2018–2019 academic year (Table 6).
This appreciation was confirmed in the discussion groups, where most of the participants considered that they used online digital educational resources, as well as virtual learning environments and, to a lesser extent, open educational resources: “In all the thematic units, digital activities are proposed to achieve the objectives (…) we also use Google Classroom to communicate and upload the tasks or to offer other type of information” (T2).
The widespread use of digital platforms by teachers (mainly Google Classroom), as well as the use of online educational resources, had increased mainly from the 2019–2020 academic year onwards due to the outbreak of the COVID-19 pandemic. However, “the creation of online repositories with open educational resources or the center’s own digital resources is yet to be done”, which corroborates the negative perception that the educational community has in this regard (RD, 12 June 2022).
Also, in this area we found similar results between the quantitative and qualitative data, with this area being among the best valued overall both in the questionnaire and in the discussion groups.

4.6. Pedagogy: Implementation in the Classroom

School leaders, teachers and students participated in this area. In the last academic year, out of a total of six descriptors, the highest scored item was “engaging students” (F4) with an average of 3.8, with similar scores among the three groups of participants. In general, teachers were acknowledged to carry out digital learning activities that involve students. At the other end, the descriptor with the lowest average score, with 3.2 points, was “cross-curricular projects” (F6), and it was the members of the management team who once again gave it a lower score (Table 7).
These results are consistent with the information obtained from the discussion groups, where the majority of the informants positively valued the digital learning activities carried out with the students (3.6). Likewise, the majority of the teachers and students were satisfied with the type of activities proposed for individual and group work. However, they hardly considered technology for projects that combine different subjects: “This course in Language we are doing an ‘interactive panel’ where the whole class is uploading our material (…) and that way, we can also see what our classmates are doing” (ST2).
The descriptor “cross-curricular projects” was the one with the worst average score. However, according to the management team, “at least one interdisciplinary project has been carried out during the Cultural Week that is held every year at the center”. The negative assessment can be explained by the still limited use of digital technologies as a means of involving students (RD, 11 May 2021).
The quantitative and qualitative results coincide. However, in the items referring to cross-curricular projects, the students’ interviews tended to show a more negative perception compared with the scores they gave in the questionnaire.

4.7. Assessment Practices

School leaders, teachers and students participated in this area, although the latter did not answer all the items. In the 2021–2022 academic year, out of a total of ten descriptors, “using data to improve learning” (G9) obtained the highest average score with 3.6, followed by “assessing skills” (G1) and “digital assessment” (G7) with 3.5 points. These data indicate that digitally supported assessment practices have been implemented in the center, especially compared with the 2018–2019 academic year, although there is still plenty of room for improvement. The descriptor “feedback to other students” (G6) continued to be the one with the lowest rating, with 2.9, which indicates that technology is hardly used to give feedback on classmates’ work (Table 8).
In the discussion groups, all the participants agreed that teachers usually use technology to assess students’ skills, although the skills that students develop outside the school are valued little or not at all: “I think that the number of us who use technologies to carry out evaluations of our students is increasing (…)” (T1). “Many times, the teacher sends us homework through Classroom, but he does not explain how to do it” (ST4).
This area was one of the lowest rated by the educational community. Despite the fact that the majority of the teaching staff affirmed that they use digital technologies to evaluate the skills of their students, the researcher found little evidence in the institutional documents to which he had access (didactic programs, department minutes, evaluation minutes) that this practice is fully incorporated into the teaching methodology (RD, 10 June 2021).
In this area, both quantitative and qualitative data coincide: the use of digital technologies to improve student assessment practices is deficient.

4.8. Student Digital Competence

School leaders, teachers and students participated in this area. In the last academic year, out of a total of thirteen descriptors, the descriptor “learning to communicate” (H8) obtained the highest score, with an average of 3.9, which indicates that the center provides many communication opportunities using digital technologies. High scores were also obtained for “safe behavior” (H1) and “responsible behavior” (H3), with 3.8 and 3.7 points respectively, values similar to the previous year. At the other end of the spectrum, we find “solving technical problems” (H13) and “learning coding or programming” (H11) with 3 and 2.8 respectively, which shows that little progress has been made regarding these skills compared to the previous academic year (Table 9).
In the discussion groups, most of the participants agreed that technologies are used in the center to communicate, as well as to learn to act safely and responsibly on the internet, although this is not always achieved. On the other hand, the majority of the students believed that they barely learn to solve the technical problems that arise when using the technologies, or to program and/or code: “I have become accustomed to using a virtual platform to set up my courses at the beginning of the course (…) I give them the keys and show them (the students) how it works” (T7). “When I have a problem with the computer or with some homework, we go down to the Aula Plumier (computer room) during recess so that a teacher can help us” (ST1).
In this area, “solving technical problems” and “learning coding or programming” were the descriptors that obtained the lowest mean scores. However, since the 2019–2020 academic year, the management team has set up what have been called “ICT tutorials”: during four recesses a week, students who need help can go to a Plumier classroom (with computers), where an expert teacher helps them solve the technical problems that may arise in their daily practice. Therefore, it would be necessary to investigate the reason for the low rating of this descriptor among the students. On the other hand, the management team acknowledged that they still have to design “some educational project for the center in which learning related to computational thinking (coding and programming) is enhanced” (RD, 11 May 2021).
In this area, although the three participating groups give a similar score in the questionnaire, the students’ interviews tend to show a more negative point of view regarding the work that is carried out in the school to improve their own digital competence.
The following table (Table 10) shows the average score of all the participants by area during the three academic years in which the study was applied. We can see that the mean values are similar, with differences of one or two tenths. It is noteworthy that area E, “pedagogy: supports and resources”, continues to be the most favorably valued, with an average score of 3.9, while areas B, “collaboration and networking”, and G, “assessment practices”, continue to obtain fewer points than the rest, rising only one tenth on average in the 2021–2022 academic year. However, we want to highlight a decrease in the average score in area D, “continuing professional development”, which lost two tenths in the 2021–2022 academic year compared to the previous academic year, despite the positive assessment that most of the interviewed participants gave this area during the last academic year.
If we compare the averages in each of the areas, a notable increase is observed in the scores obtained in the last analysis with respect to those obtained in the 2018–2019 questionnaire. However, there are no significant changes between the 2020–2021 and 2021–2022 academic years, with the same total average score of 3.5 in both phases and a maximum difference of one or two tenths between areas (Figure 2). Therefore, the data presented indicate that work must continue in these areas to detect possible inconsistencies, as well as the causes that explain a stagnation or setback in the improvement process.
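To make the longitudinal comparison concrete, the following Python sketch computes the year-on-year change per area and flags areas whose change stays within one tenth; the figures are invented placeholders (the real values are in Table 10 and Figure 2), and the one-tenth threshold is our reading of the "one or two tenths" mentioned above.

```python
# Hedged sketch with invented values: flagging areas whose year-on-year change
# is within one tenth of a point, i.e., possible stagnation in the improvement process.

# Hypothetical mean scores per area for the three evaluation cycles.
area_scores = {
    "A Leadership":                       {"2018-2019": 3.1, "2020-2021": 3.6, "2021-2022": 3.6},
    "B Collaboration and networking":     {"2018-2019": 2.9, "2020-2021": 3.2, "2021-2022": 3.3},
    "E Pedagogy: supports and resources": {"2018-2019": 3.3, "2020-2021": 3.9, "2021-2022": 3.9},
    "H Student digital competence":       {"2018-2019": 3.0, "2020-2021": 3.2, "2021-2022": 3.5},
}

STAGNATION_THRESHOLD = 0.1  # assumed cut-off for "no meaningful change"

def yearly_change(scores, year_a, year_b):
    """Rounded difference between two academic years for one area."""
    return round(scores[year_b] - scores[year_a], 1)

for area, scores in area_scores.items():
    delta = yearly_change(scores, "2020-2021", "2021-2022")
    status = "possible stagnation" if abs(delta) <= STAGNATION_THRESHOLD else "clear change"
    print(f"{area}: {delta:+.1f} -> {status}")
```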

5. Discussion and Conclusions

The recent global public health crisis has accelerated the integration of digital technologies in education and training systems. In this regard, the European Union recognizes the potential of digital technologies for inclusive and high-quality education, as reflected in its new Digital Education Action Plan 2021–2027 [3]. However, for the development of effective digital skills, the education system needs the support of tools and processes to assess, plan and develop its needs [47]. In this context, the present work reflects the results of the evaluation carried out at a Spanish compulsory secondary school in relation to the development of its digital competence according to the areas contemplated in the European DigCompOrg model. This model helps to evaluate the actions undertaken to improve the digital capacity of an educational institution [19,48], while at the same time enabling the design of instruments that allow such evaluation. In this sense, the choice of the SELFIE tool has favored the development of a practice of joint reflection in the educational center and informed decision-making about its digital strategy. Impact studies carried out in Italy [33] and in Spain [34] collected evidence similar to that presented here on the adoption of the SELFIE tool to assess the digital capacity of schools.
Considering the quantitative results of the last SELFIE report, corresponding to the 2021–2022 academic year, and the total average score obtained from the opinions of the entire school community (school leaders, teachers and students), the area with the highest score is E, “pedagogy: supports and resources”, with 3.9 points, followed by areas A, “leadership”, and C, “infrastructure and equipment”, with an average of 3.6. This indicates that the center has invested a great deal of effort in improving competence areas included in the European model, such as “infrastructure” or “leadership and governance practices”, which are normally relegated to the background when considering the development of digital competence [49,50]. The incorporation of technology in schools therefore requires a redefinition of their organizational culture [51]. In this way, planning, management and leadership emerge over time as key dimensions for the achievement of competencies in an organizational system.
Thus, it can be concluded that the use of SELFIE, based on the DigCompOrg model, has been an excellent evaluation instrument, helping to focus on the organization as a whole rather than on the individual digital competence of educational agents [31,32,33,34,35,36,37]. This approach is in line with the theoretical current of learning organizations [52,53], and provides added value to the institution and to educational leadership as key elements of improvement and innovation in schools.
At the opposite end, among the worst rated areas are B, “collaboration and networking”, and G, “assessment practices”, with an average of 3.3. The data show that these areas tend to obtain lower scores than the rest, in line with other studies [35]. Although the scores for these areas experienced a small improvement in the 2020–2021 academic year, the results indicate that we must keep implementing practices that increase the use of digital technologies in collaboration with other institutions. It is also advisable to encourage assessment activities supported by digital technologies in order to improve the perception of both their effectiveness for student learning and adequate performance assessment [54].
It is worth highlighting that, comparing the average scores obtained over time, there is a notable improvement in the perception that the educational community has regarding the use of digital technologies in its center; however, very similar results (only two tenths of difference between them) are obtained during the last two phases of the study, which suggests that the digital improvement process could have stalled. Slightly lower average values are even obtained in the 2021–2022 academic year compared to the previous academic year in areas A, “leadership” (−0.1), D, “continuing professional development” (−0.2) and E, “pedagogy: supports and resources” (−0.1).
In this regard, the information obtained through the discussion groups, the researcher’s diary and observation shows growing discontent on the part of the teaching staff with respect to the little time they have available to explore the new technologies in the classroom, despite the fact that the social demand is ever greater. This perception has a negative impact on both the school’s digital strategy and continuing professional development. Likewise, the majority of the students thought that, although they consider working with digital resources motivating [55], they find content, tasks and varied resources lacking. That is why certain actions must be reviewed and reoriented to allow progress in the digitalization process being carried out in the educational center (Table 11).
The qualitative results show variations between the perceptions of school leaders and teachers, on the one hand, and of students, on the other, about the different levels of use of digital technology; similar conclusions are found in other studies [33]. Among the main needs that can be highlighted is the lack of time that teachers have to try out and explore teaching methods using digital methodologies, as pointed out in other works [35,56]. Also, the majority of those surveyed demand greater participation in interdisciplinary projects since, as previous research suggests [57,58], the methodology that contributes the most to the development of digital competence is based on multidisciplinary problem solving and project work. However, in teaching practice these types of learning activities are still underused [58].
The analysis shows other deficient aspects, such as the absence of online libraries or repositories and of activities designed to teach students to program, code or solve technical problems. In this sense, other studies also indicate that students feel more confident about their digital competence in communication and collaboration activities than in digital content creation and problem solving using digital technologies [59,60]. All these aspects are consistent with the scores obtained in the questionnaire and have been included in the improvement proposal (Table 11).
On the other hand, the majority of the teaching staff stated that they are accustomed to using technologies in the evaluation of students’ abilities; however, this statement is inconsistent with the average score given by the students, as well as with the data extracted from the documents to which the principal investigator had access. In the same sense, the students pointed out that their teachers do not acknowledge the digital skills that they develop outside the educational center. Similarly, there is no consensus between teachers and students regarding the appropriate amount of mobile phone use, whether safe and responsible behavior on the internet exists [61,62] or whether the teaching methodology is adapted to the individual needs of the students using digital technologies.
Among the main limitations of the study, we can point out that, since it presents a single case study, the results do not allow generalizations to be established. However, this type of study in the educational field encourages practical strategies for improvement in the particular context where the research is carried out [63,64]. There may also be limitations in the use of self-reflection data, since the opinions do not necessarily reflect the reality of the center, although, on the other hand, the value of the perception of the main agents involved in the evolution of their institution should be taken into account. In addition, these biases associated with self-perception do not invalidate the usefulness of this type of evaluation, because it promotes a deliberative process in which the educational community reflects on its knowledge and use of digital technologies and, by doing so, can identify practices that must be improved [58]. Similarly, the results would have been more robust if the statistical analysis had gone beyond average scores, since only the analysis offered by the SELFIE platform was used. This aspect therefore needs to be tackled in future lines of research.
In conclusion, the results presented in this report follow the path of current educational policies, which place the focus on the establishment of periodic evaluation mechanisms as a fundamental means for digital transformation [65]. In our case, the research carried out has allowed the educational center to engage in a longitudinal process of reflection and self-diagnosis of its digital capacity as an educational organization, which has served as the basis for the design of a Digitalization Plan adapted to its needs. Finally, we can highlight that studies such as this one offer an overview of the entire educational system, which should guide educational policymakers in making decisions [66] that help develop the capabilities of schools within the formal education system more efficiently.

Author Contributions

Conceptualization, Á.D.F.-M. and P.P.-E.; methodology, Á.D.F.-M. and P.P.-E.; formal analysis, Á.D.F.-M.; research, Á.D.F.-M.; writing—original draft preparation, Á.D.F.-M.; writing—review and editing, P.P.-E.; monitoring, P.P.-E.; project administration, Á.D.F.-M. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the ethics of educational research and approved by the Institutional School Board of the secondary school involved in the research.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study, including informed parental consent for students and the consent of the involved informants (school leaders and teachers).

Data Availability Statement

Not applicable.

Acknowledgments

We would like to thank the entire school community (school leaders, i.e., members of the management team; teachers; students and parents) for their active involvement in this research.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Rodríguez-Alayo, A.O.; Cabell-Rosales, N.V. Importancia de la competencia digital docente en el confinamiento social. Polo Conoc. 2021, 6, 1091–1109. [Google Scholar] [CrossRef]
  2. Teräs, M.; Suoranta, J.; Teräs, H.; Curcher, M. Post-Covid-19 education and education technology ‘solutionism’: A seller’s market. Postdigit. Sci. Educ. 2020, 2, 863–878. [Google Scholar] [CrossRef]
  3. European Commission. Digital Education Action Plan (2021–2027). 2020. Available online: https://education.ec.europa.eu/focus-topics/digital-education/action-plan (accessed on 3 October 2022).
  4. Castaño-Muñoz, J.; Weikert García, L. La Capacidad Digital de los Centros Educativos de España. Muestra Representativa a Través de la Herramienta SELFIE. CINE-2011 2. 1º, 2º y 3º ESO; Oficina de Publicaciones de la Unión Europea: Luxembourg, 2021. [Google Scholar] [CrossRef]
  5. Rawashdeh, M.; Almasarweh, M.S.; Alhyasat, E.B.; Rawashdeh, O.M. The relationship between the quality knowledge management and organizational performance via the mediating role of organizational learning. Int. J. Qual. Res. 2021, 15, 373–386. [Google Scholar] [CrossRef]
  6. Wang, Z.; Zong, K.; Jin, K.H. The Multinational New Ventures on Corporate Performance Under the Work Environment and Innovation Behavior. Front. Psychol. 2022, 13, 762331. [Google Scholar] [CrossRef] [PubMed]
  7. Tight, M. Twenty-first century skills: Meaning, usage and value. Eur. J. High. Educ. 2020, 11, 160–174. [Google Scholar] [CrossRef]
  8. United Nations Educational Scientific and Cultural Organization (UNESCO). Education: From Disruption to Recovery. 2020. Available online: https://en.unesco.org/covid19/educationresponse/ (accessed on 3 October 2022).
  9. European Council. Recommendation of the European Parliament and of the Council of 18 December 2006 on Key Competences for Lifelong Learning; Publications Office of the European Union: Luxembourg, 2006; Available online: https://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:L:2006:394:0010:0018:en:PDF (accessed on 3 October 2022).
  10. Valverde-Crespo, D.; Pro-Bueno, A.J.; González-Sánchez, J. La competencia informacional-digital en la enseñanza y aprendizaje de las ciencias en la educación secundaria obligatoria actual: Una revisión teórica. Rev. Eureka Enseñanza Divulg. Cienc. 2018, 15, 2105. [Google Scholar] [CrossRef]
  11. Ferrari, A. DIGCOMP: A Framework for Developing and Understanding Digital Competence in Europe; Publications Office of the European Union: Luxembourg, 2013. [Google Scholar] [CrossRef]
  12. Vuorikari, R.; Punie, Y.; Gomez, S.C.; Van Den Brande, G. DigComp 2.0: The Digital Competence Framework for Citizens. Update Phase 1: The Conceptual Reference Model; Publications Office of the European Union: Luxembourg, 2016. [Google Scholar] [CrossRef]
  13. Redecker, C.; Punie, Y. Digital Competence Framework for Educators (DigCompEdu); Publications Office of the European Union: Luxembourg, 2017; Available online: https://joint-research-centre.ec.europa.eu/digcompedu_en (accessed on 3 October 2022).
  14. Bacigalupo, M. Competence frameworks as orienteering tools. RiiTE Rev. Interuniv. Investig. Tecnol. Educ. 2022, 12, 20–33. [Google Scholar] [CrossRef]
  15. Kampylis, P.; Punie, Y.; Devine, J. Promoting Effective Digital-Age Learning. A European Framework for Digitally-Competent Educational Organisations; Publications Office of the European Union: Luxembourg, 2015; Available online: https://doi.org/10.2791/54070 (accessed on 3 October 2022).
  16. Balaban, I.; Redjep, N.B.; Calopa, M.K. The Analysis of Digital Maturity of Schools in Croatia. Int. J. Emerg. Technol. Learn. 2018, 6, 4–15. [Google Scholar] [CrossRef]
  17. Chopra, N. E-governance Framework to Measure Digital Competence of HEIs in India. Eur. Sci. J. 2019, 15, 181–193. [Google Scholar] [CrossRef]
  18. Fernández Miravete, Á.D.; Prendes Espinosa, M.P. Marco Europeo para Organizaciones Educativas Digitalmente Competentes: Revisión sistemática 2015–2020. Rev. Fuentes 2022, 24, 65–76. [Google Scholar] [CrossRef]
  19. Giunti, C.; Naldini, M.; Orlandini, L. Professional development to support teaching innovation. The experiences of the schools leading the Avanguardie Educative Movement. Form@re Re-Open J. Form. Rete 2018, 18, 103–115. [Google Scholar] [CrossRef]
  20. European Commission. SELFIE, Self-Reflection on Effective Learning by Fostering the Use of Innovative Educational Technologies. 2018. Available online: https://education.ec.europa.eu/es/selfie (accessed on 3 October 2022).
  21. European Commission. SELFIEforTEACHERS, Self-Reflection on Effective Learning by Fostering the Use of Innovative Educational Technologies for Teachers. 2021. Available online: https://education.ec.europa.eu/selfie-for-teachers (accessed on 3 October 2022).
  22. Bourgeois, A.; Birch, P.; Davydovskaia, O. Digital Education at School in Europe. Eurydice Report; Publications Office of the European Union: Luxembourg, 2019. [Google Scholar] [CrossRef]
  23. Gobierno de España. Ley Orgánica 3/2020, de 29 de Diciembre, por la que se Modifica la Ley Orgánica 2/2006, de 3 de Mayo, de Educación; Boletín Oficial del Estado: Madrid, España, 2020; Available online: https://boe.es/boe/dias/2020/12/30/pdfs/BOE-A-2020-17264.pdf (accessed on 3 October 2022).
  24. Agut, M.; del Pilar, M. Análisis de la LOMLOE (Ley Orgánica 3/2020, de 29 de diciembre, por la que se modifica la ley orgánica 2/2006, de 3 de mayo, de educación) y su repercusión en los profesionales de la educación no formal: Equidad, inclusión, servicio a la comunidad (APS), educación para la sostenibilidad y la ciudadanía mundial. Quad. Anim. Educ. Soc. 2021, 33, 1–20. Available online: https://hdl.handle.net/10550/80813 (accessed on 3 October 2022).
  25. Cabero-Almenara, J.; Fernández Romero, C.; Palacios Rodríguez, A.D.P. La competencia digital educativa en Andalucía (España). El programa #PRODIG. Temas Comun. 2020, 41, 59–71. Available online: https://revistasenlinea.saber.ucab.edu.ve/index.php/temas/article/view/4730 (accessed on 3 October 2022).
  26. Cabero-Almenara, J.; Barragán-Sánchez, R.; Palacios-Rodríguez, A.D.P. DigCompOrg: Marco de referencia para la transformación digital de los centros educativos andaluces. Eco Rev. Digit. Educ. Form. Profr. 2021, 18, 1–21. Available online: https://hdl.handle.net/11441/107955 (accessed on 3 October 2022).
  27. Casillas-Martín, S.; Cabezas-González, M.; García-Valcárcel, A. Análisis psicométrico de una prueba para evaluar la competencia digital de estudiantes de Educación Obligatoria. RELIEVE Rev. Electron. Investig. Eval. Educ. 2020, 26, 1–22. [Google Scholar] [CrossRef]
  28. Consejería de Educación y Universidades de la Región de Murcia. Resolución de 21 de Marzo de 2017, de la Dirección General de Innovación Educativa y Atención a la Diversidad para el Desarrollo del Programa: Centros Digitales; Boletín Oficial de la Región de Murcia: Región de Murcia, España, 2017; Available online: https://programaseducativos.es/programa/centros-digitales/ (accessed on 3 October 2022).
  29. Consejería de Educación y Cultura de la Región de Murcia. Resolución de 16 de Noviembre de 2021 de la Consejería de Educación y Cultura por la que se Dictan Instrucciones sobre el Plan Prodigi-e para la Transformación Digital Educativa de la Región de Murcia y su Implantación en el Curso Escolar 2021–2022; Boletín Oficial de la Región de Murcia: Región de Murcia, España, 2021; Available online: https://servicios.educarm.es/templates/portal/ficheros/websDinamicas/45/Res%20instrucciones%20y%20su%20implantacion%202021-22%20CI%20259846.pdf (accessed on 3 October 2022).
  30. Instituto Nacional de Tecnologías Educativas y de Formación del Profesorado (INTEF). El Plan Digital de Centro. Un Marco para la Integración de las Tecnologías. 2020. Available online: https://intef.es/wp-content/uploads/2020/07/2020_0707_Plan-Digital-deCentro_-INTEF.pdf (accessed on 3 October 2022).
  31. Beardsley, M.; Albó, L.; Aragón, P.; Hernández-Leo, D. Emergency education effects on teacher abilities and motivation to use digital technologies. Br. J. Educ. Technol. 2021, 52, 1455–1477. [Google Scholar] [CrossRef]
  32. Begicevic Redjep, N.; Balaban, I.; Zugec, B. Assessing digital maturity of schools: Framework and instrument. Technol. Pedagog. Educ. 2021, 30, 643–658. [Google Scholar] [CrossRef]
  33. Bocconi, S.; Panesi, S.; Kampylis, P. Fostering the digital competence of schools: Piloting SELFIE in the Italian education context. IEEE-Rev. Iberoam. Tecnol. Aprendiz. 2020, 15, 417–425. [Google Scholar] [CrossRef]
  34. Castaño-Muñoz, J.; Pokropek, A.; Weikert García, L. For to all those who have, will more be given? Evidence from the adoption of the SELFIE tool for the digital capacity of schools in Spain. Br. J. Educ. Technol. 2022, 53, 1937–1955. [Google Scholar] [CrossRef]
  35. Castaño-Muñoz, J.; Weikert García, L.; Herrero Rámila, C. Analysing the Digital Capacity of Spanish Schools Using SELFIE; Publications Office of the European Union: Luxembourg, 2021. [Google Scholar] [CrossRef]
  36. Mišianiková, A.; Hubeňáková, V.; Kireš, M.; Babinčáková, M.; Šveda, D.; Šafárik, P.J. Assessment of Digitalization in Primary and Secondary Schools by SELFIE Survey as a part of School Leaders Training. In Proceedings of the 2021 19th International Conference on Emerging eLearning Technologies and Applications (ICETA), Košice, Slovakia, 11–12 November 2021. [Google Scholar] [CrossRef]
  37. Panesi, S.; Bocconi, S.; Ferlino, L. Promoting Students’ Well-Being and Inclusion in Schools Through Digital Technologies: Perceptions of Students, Teachers, and School Leaders in Italy Expressed Through SELFIE Piloting Activities. Front. Psychol. 2020, 11, 1563. [Google Scholar] [CrossRef]
  38. Arnal, J.; Rincón, D.; Latorre, A. Bases Metodológicas de la Investigación Educativa; Ediciones Experiencia S.L.: Barcelona, España, 2003. [Google Scholar]
  39. Escudero, T. La investigación evaluativa en el Siglo XXI: Un instrumento para el desarrollo educativo y social cada vez más relevante. RELIEVE Rev. Electron. Investig. Eval. Educ. 2016, 22, 1–21. [Google Scholar] [CrossRef] [Green Version]
  40. Creswell, J.W. Research Design. Qualitative, Quantitative and Mixed Methods Approaches; Sage Publications: London, UK, 2009; p. 31. [Google Scholar]
  41. Molenda, M. In search of the elusive ADDIE model. Perform. Improv. 2003, 42, 34–37. [Google Scholar] [CrossRef]
  42. Morales-González, B.; Edel-Navarro, R.; Aguirre-Aguilar, G. Modelo ADDIE (Análisis, Diseño, Desarrollo, Implementación y Evaluación): Su Aplicación en Ambientes Educativos. In Los Modelos Tecno-Educativos, Revolucionando el Aprendizaje del Siglo XXI, 1st ed.; Esquivel Gámez, I., Ed.; Veracruz: México, Mexico, 2014; pp. 33–46. Available online: https://www.uv.mx/personal/iesquivel/files/2015/03/los_modelos_tecno_educativos__revolucionando_el_aprendizaje_del_siglo_xxi-4.pdf#page=33 (accessed on 3 October 2022).
  43. Fernández-Miravete, Á.D.; Prendes-Espinosa, M.P. Evaluación de la competencia digital de una organización educativa de enseñanza secundaria a partir del modelo DigCompOrg. Rev. Complutense de Educ. 2021, 32, 651–661. [Google Scholar] [CrossRef]
  44. Fernández-Miravete, Á.D.; Prendes-Espinosa, M.P. Análisis del proceso de digitalización de un centro de Enseñanza Secundaria desde el modelo DigCompOrg. RELATEC Rev. Latinoam. Tecnol. Educ. 2021, 20, 9–25. [Google Scholar] [CrossRef]
  45. Fernández-Miravete, Á.D.; Prendes-Espinosa, M.P. Evaluación del proceso de digitalización de un centro de Enseñanza Secundaria con la herramienta SELFIE. Contextos Educ. 2022, 30, 99–116. [Google Scholar] [CrossRef]
  46. Riesco González, M. La Investigación Cualitativa. In Fundamentos Básicos de Metodología de Investigación Educativa, 1st ed.; Quintanal, J., García, B., Eds.; Editorial CCS: Madrid, España, 2012; pp. 93–134. [Google Scholar]
  47. Hippe, R.; Brolpito, A.; Broek, S. SELFIE for Work-Based Learning; Publications Office of the European Union: Luxembourg, 2006. [Google Scholar] [CrossRef]
  48. Brolpito, A.; Lightfoot, M.; Radišic, J.; Šcepanovic, D. Digital and Online Learning in Vocational Education and Training in Serbia: A Case Study; European Training Foundation (ETF): Turin, Italy, 2016; Available online: https://www.etf.europa.eu/sites/default/files/m/DC024C02AA9B9384C12580280043A0B6_DOL%20in%20VET%20in%20Serbia.pdf (accessed on 3 October 2022).
  49. González, A.; Urdaneta, K.; Muñoz, D. Liderazgo organizacional y responsabilidad socioambiental, una mirada desde la complejidad y postmodernidad. Rev. Venez. Gerencia 2017, 22, 11–23. Available online: https://www.redalyc.org/pdf/290/29051457002.pdf (accessed on 3 October 2022). [CrossRef]
  50. Maureira Cabrera, Ó.J. Prácticas del liderazgo educativo: Una mirada evolutiva e ilustrativa a partir de sus principales marcos, dimensiones e indicadores más representativos. Rev. Educ. 2018, 42, 1–19. [Google Scholar] [CrossRef] [Green Version]
  51. Sosa-Díaz, M.J.; Sierra-Daza, M.C.; Arriazu-Muñoz, R.; Llamas-Salguero, F.; Durán-Rodríguez, N. “EdTech Integration Framework in Schools”: Systematic Review of the Literature. Front. Educ. 2022, 7, 895042. [Google Scholar] [CrossRef]
  52. López, A.J.G.; Lanzat, A.M.A.; González, M.L.C. Análisis de la capacidad de innovación escolar desde la perspectiva del profesorado de educación secundaria. La escuela como organización que aprende. Educar 2018, 54, 449–468. [Google Scholar] [CrossRef]
  53. Quispe, M.A.F.; García, R.S.B.; Borjas, L.G.R. Organización educativa que aprende: Transformación y gestión del conocimiento. Rev. Educ. 2018, 14, 13–23. Available online: http://fh.mdp.edu.ar/revistas/index.php/r_educ/article/view/2687/2918 (accessed on 3 October 2022).
  54. Capperucci, D.; Scierri, I.D.M.; Salvadori, I.; Batini, F.; Toti, G.; Barbisoni, G.; Pera, E. Remote Teaching during COVID-19 Emergency: Teaching and Assessment Strategies and the Role of Previous Training. Educ. Sci. 2022, 12, 646. [Google Scholar] [CrossRef]
  55. Acero, J.M.A.; Coca, M.M.; Coca, D.M. Motivación de alumnos de Educación Secundaria y Bachillerato hacia el uso de recursos digitales durante la crisis del Covid-19. Rev. Estilos Aprendiz. 2020, 13, 68–81. [Google Scholar] [CrossRef]
  56. Mercader, C. Las resistencias del profesorado universitario a la utilización de las tecnologías digitales. Aula Abierta 2019, 48, 167–174. [Google Scholar] [CrossRef]
  57. Tierney, R.J.; Bond, E.; Bresler, J. Examining Literate Lives as Students Engage with Multiple Literacies. Theory Pract. 2006, 45, 359–367. [Google Scholar] [CrossRef]
  58. Pruulmann-Vengerfeldt, P.; Kalmus, V.; Runnel, P. Creating Content or Creating Hype: Practices of Online Content Creation and Consumption in Estonia. Cyberpsychol. J. Psychosoc. Res. Cybersp. 2008, 2. Available online: https://cyberpsychology.eu/article/view/4209 (accessed on 3 October 2022).
  59. Costa, P.; Castaño-Muñoz, J.; Kampylis, P. Capturing schools’ digital capacity: Psychometric analyses of the SELFIE self-reflection tool. Comput. Educ. 2021, 162, 104080. [Google Scholar] [CrossRef]
  60. Fraillon, J.; Ainley, J.; Schulz, W.; Friedman, T.; Duckworth, D. Preparing for Life in a Digital World: IEA International Computer and Information Literacy Study 2018 International Report; International Association for the Evaluation of Educational Achievement (IEA): Amsterdam, The Netherlands, 2020. [Google Scholar] [CrossRef]
  61. García-Ruiz, R.; Escoda, A.P. La Competencia Digital Docente como Clave para Fortalecer el Uso Responsable de Internet. Camp. Virtu. 2021, 10, 59–71. Available online: http://www.uajournals.com/ojs/index.php/campusvirtuales/article/view/781 (accessed on 3 October 2022).
  62. Tejada Garitano, E.; Castaño Garrido, C.; Romero Andonegui, A. Los hábitos de uso en las redes sociales de los preadolescentes. RIED Rev. Iberoam. Educ. Distancia 2019, 22, 119–133. [Google Scholar] [CrossRef]
  63. González, W.O.L. El estudio de casos: Una vertiente para la investigación educativa. Educere 2013, 17, 139–144. Available online: https://www.redalyc.org/pdf/356/35630150004.pdf (accessed on 3 October 2022).
  64. Salinas, J. La investigación ante los desafíos de los escenarios de aprendizaje futuros. RED Rev. Educ. Distancia 2012, 32, 1–23. Available online: https://revistas.um.es/red/article/view/233091 (accessed on 3 October 2022). [CrossRef]
  65. García-Aretio, L. Necesidad de una educación digital en un mundo digital. RIED Rev. Iberoam. Educ. Distancia 2019, 22, 9–22. [Google Scholar] [CrossRef]
  66. Kampylis, P.; Hodson, D.; Petkova, S.; Hippe, R.; Cachia, R.; Sala, A.; Weikert García, L.; Castaño-Muñoz, J.; Punie, Y. SELFIE Forum–Teaching and Learning in the Digital Age; Publications Office of the European Union: Luxembourg, 2019; Available online: https://publications.jrc.ec.europa.eu/repository/handle/JRC117482 (accessed on 3 October 2022).
Figure 1. Phases of evaluation and cycles of improvement.
Figure 2. Comparative average score of the SELFIE areas during the three academic years.
Table 1. Participants in the academic years 2018–2019, 2020–2021 and 2021–2022.
Education Agent | Invited Sample (2018–19 / 2020–21 / 2021–22) | Participant Sample, SELFIE (2018–19 / 2020–21 / 2021–22) | Participant Sample, Focus Group (2018–19 / 2020–21 / 2021–22)
School Leaders (SL) | 30 / 27 / 27 | 25 / 25 / 26 | 5 / 4 / 9
Teachers (T) | 75 / 49 / 48 | 61 / 43 / 46 | 7 / 6 / 8
Students (ST) | 450 / 542 / 490 | 440 / 393 / 374 | 9 / 8 / 6
TOTAL | 555 / 618 / 565 | 526 / 461 / 446 | 21 / 18 / 23
Table 2. Descriptive statistics on Leadership.
A. Leadership | 2018–2019 (SL / T) | 2020–2021 (SL / T) | 2021–2022 (SL / T)
“Copyright and licensing rules” (A4) | 3 / 3.4 | 4 / 4.3 | 3.9 / 4.3
“Time to explore digital teaching” (A5) | No data / No data | 2.1 / 2.6 | 2.2 / 2.6
Note: Items are measured on a 5-point Likert scale, from 1 = strongly disagree to 5 = strongly agree. Some items were not included in the 2018–2019 SELFIE questionnaire, so no data are available for them.
Table 3. Descriptive statistics on Collaboration and Networking.
B. Collaboration and Networking | 2018–2019 (SL / T) | 2020–2021 (SL / T) | 2021–2022 (SL / T)
“Progress review” (B1) | 2.8 / 3 | 3.1 / 3.7 | 3.4 / 3.7
“Synergies for blended learning” (B4) | 2 / 2.2 | 2.8 / 3.1 | 3 / 3.3
Table 4. Descriptive statistics on Infrastructure and Equipment.
C. Infrastructure and Equipment | 2018–2019 (SL / T / ST) | 2020–2021 (SL / T / ST) | 2021–2022 (SL / T / ST)
“Bring your own device” (C13) | No data / No data / No data | 4.2 / 3.9 / 3.6 | 4.3 / 4.2 / 3.9
“Digital devices for learning” (C8) | 2.8 / 2.9 / 3.3 | 4 / 4 / 3.4 | 4 / 4.2 / 3.8
“Online libraries/repositories” (C16) | No data / No data / No data | 2.6 / 3 / 3.1 | 2.5 / 3 / 3.7
Table 5. Descriptive statistics on continuing professional development.
D. Continuing Professional Development | 2018–2019 (SL / T) | 2020–2021 (SL / T) | 2021–2022 (SL / T)
“Participation in CPD” (D2) | 3.7 / 3.2 | 4 / 3.8 | 3.6 / 3.8
“CPD needs” (D1) | 3.2 / 2.9 | 3.5 / 3.6 | 3.2 / 3.5
“Sharing experiences” (D3) | 3.2 / 2.8 | 3.4 / 3.7 | 3 / 3.7
Table 6. Descriptive statistics on pedagogy: supports and resources.
E. Pedagogy: Supports and Resources | 2018–2019 (SL / T) | 2020–2021 (SL / T) | 2021–2022 (SL / T)
“Online educational resources” (E1) | 3.8 / 4.1 | 4.1 / 4.6 | 4 / 4.5
“Communicating with the school community” (E4) | 3.6 / 4.1 | 4.5 / 4.3 | 3.9 / 4.4
“Using virtual learning environments” (E3) | 3.2 / 3.2 | 3.8 / 3.6 | 3.7 / 3.9
“Open educational resources” (E5) | No data / No data | 3.8 / 4 | 3.6 / 3.9
“Creating digital resources” (E2) | 3.1 / 3.3 | 3.6 / 3.9 | 3.3 / 4
Table 7. Descriptive statistics on Pedagogy: Implementation in the classroom.
F. Pedagogy: Implementation in the Classroom | 2018–2019 (SL / T / ST) | 2020–2021 (SL / T / ST) | 2021–2022 (SL / T / ST)
“Engaging students” (F4) | 3.2 / 3.6 / 3.4 | 3.9 / 4 / 3.4 | 3.8 / 3.9 / 3.6
“Cross-curricular projects” (F6) | 2.3 / 2.6 / 3.2 | 2.9 / 3.2 / 3 | 3 / 3.2 / 3.3
Table 8. Descriptive statistics on Assessment Practices.
G. Assessment Practices | 2018–2019 (SL / T / ST) | 2020–2021 (SL / T / ST) | 2021–2022 (SL / T / ST)
“Using data to improve learning” (G9) | No data / No data / Does not apply | 3.2 / 3.7 / Does not apply | 3.4 / 3.8 / Does not apply
“Assessing skills” (G1) | 2.5 / 2.9 / Does not apply | 3.5 / 3.5 / Does not apply | 3.3 / 3.7 / Does not apply
“Digital assessment” (G7) | 2.8 / 2.9 / Does not apply | 3.6 / 3.5 / Does not apply | 3.4 / 3.5 / Does not apply
“Feedback to other students” (G6) | 2.3 / 2.3 / 2.6 | 2.8 / 2.9 / 2.3 | 2.9 / 2.9 / 2.8
Table 9. Descriptive statistics on Student Digital Competence.
H. Student Digital Competence | 2018–2019 (SL / T / ST) | 2020–2021 (SL / T / ST) | 2021–2022 (SL / T / ST)
“Learning to communicate” (H8) | 3.4 / 3.3 / 3.2 | 4.2 / 4 / 3.2 | 4 / 4.2 / 3.5
“Safe behavior” (H1) | 3 / 3 / 3.7 | 3.8 / 3.7 / 3.5 | 3.8 / 3.8 / 3.8
“Responsible behavior” (H3) | 3 / 3.1 / 3.2 | 3.6 / 3.7 / 3.6 | 3.7 / 3.6 / 3.8
“Solving technical problems” (H13) | No data / No data / No data | 3 / 3.3 / 2.7 | 2.8 / 3.1 / 3.2
“Learning coding or programming” (H11) | No data / No data / No data | 2.7 / 2.7 / 2.5 | 2.6 / 2.8 / 2.9
Table 10. Average score of the answers of all participants for each of the SELFIE areas.
SELFIE Areas | 2018–2019 | 2020–2021 | 2021–2022
A. Leadership | 3 | 3.7 | 3.6
B. Collaboration and Networking | * | 3.2 | 3.3
C. Infrastructure and Equipment | 3.2 | 3.5 | 3.6
D. Continuing Professional Development | 3.1 | 3.7 | 3.5
E. Pedagogy: Support and Resources | * | 4 | 3.9
F. Pedagogy: Implementation in the Classroom | * | 3.3 | 3.5
G. Assessment Practices | 2.6 | 3.2 | 3.3
H. Student Digital Competence | 3.1 | 3.4 | 3.5
Note: * The area “Collaboration and Networking” was introduced in the 2020–2021 questionnaire. The areas “Pedagogy: Support and Resources” and “Pedagogy: Implementation in the Classroom” were also introduced in the 2020–2021 questionnaire; in the 2018–2019 questionnaire they were included in a single area called “Teaching and Learning”.
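To make the arithmetic behind Table 10 explicit, the sketch below (Python, not the authors' code) computes an area score as the unweighted mean of all pooled 1–5 Likert responses from school leaders, teachers and students, rounded to one decimal; the response lists are hypothetical placeholders.
```python
from statistics import mean

# Hypothetical pooled 1-5 Likert answers per SELFIE area
# (school leaders, teachers and students together).
responses = {
    "A. Leadership": [4, 3, 4, 3, 4],
    "G. Assessment Practices": [3, 3, 4, 3, 4],
}

# Unweighted mean per area, rounded to one decimal as in Table 10.
area_averages = {area: round(mean(scores), 1) for area, scores in responses.items()}
print(area_averages)  # {'A. Leadership': 3.6, 'G. Assessment Practices': 3.4}
```
Whether the published averages pool all answers or weight the three agent groups equally is an assumption of this sketch, not a detail reported in the tables.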
Table 11. 2021–2022 digital improvement goals for each SELFIE area.
SELFIE Area | SELFIE Descriptor | Average | Objective
A. Leadership | A.4. Time to explore digital teaching | 2.4 | Design a schedule to explore how to improve teaching with digital technologies.
B. Collaboration and Networking | B.4. Synergies for blended learning | 3.1 | Establish collaborative networks with other schools or organizations to support the use of digital technologies.
C. Infrastructure and Equipment | C.16. Online libraries/repositories | 3.1 | Create online libraries or repositories with teaching and learning materials.
D. Continuing Professional Development | D.3. Sharing experiences | 3.4 | Support teachers in sharing experiences about teaching with digital technologies within the school community.
E. Pedagogy: Supports and Resources | E.5. Open educational resources | 3.8 | Promote the use of open educational resources among teachers and students.
F. Pedagogy: Implementation in the Classroom | F.6. Cross-curricular projects | 3.2 | Engage students in using digital technologies for cross-curricular projects.
G. Assessment Practices | G.6. Feedback to other students | 2.9 | Use digital technologies to enable students to provide feedback on other students’ work.
H. Student Digital Competence | H.11. Learning coding or programming | 2.8 | Promote the learning of coding or programming among students.
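Readers who want to derive similar priorities from their own SELFIE results could start from a simple rule such as flagging the lowest-scoring descriptor in each area. The sketch below (Python) illustrates that rule only; it is not the selection procedure used in this study, and the descriptor averages are indicative values loosely based on the tables above.
```python
# Illustrative descriptor averages per SELFIE area (2021-2022).
descriptor_averages = {
    "A. Leadership": {
        "Time to explore digital teaching": 2.4,
        "Copyright and licensing rules": 4.1,
    },
    "G. Assessment Practices": {
        "Feedback to other students": 2.9,
        "Digital assessment": 3.5,
    },
}

# For each area, flag the descriptor with the lowest average as a candidate goal.
candidate_goals = {area: min(items, key=items.get) for area, items in descriptor_averages.items()}
print(candidate_goals)
# {'A. Leadership': 'Time to explore digital teaching',
#  'G. Assessment Practices': 'Feedback to other students'}
```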
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
