What Factors Determine the Value of an Online Teacher Education Experience from a Teacher’s Perspective?

E-learning is currently at the center of interest in the educational community, in a global pandemic that forces us to reduce face-to-face attendance in all aspects of our lives. Online courses have had a bad reputation due to the distance between tutor and learner and to high dropout rates. The main purpose of this study is to analyze the evaluations made by teachers of different features of a course, examining variables such as content, technology, activities, final work, and format. The analysis covers the 50 online courses that form part of the continuous training offer of the Regional Ministry of Education of the Community of Madrid in the 2017–18 and 2018–19 academic years, in which 7501 teachers participated. The opinion of each participant was collected through a questionnaire at the end of the training activity, validated by a group of experts from the Regional Centre for Innovation and Training of the Community of Madrid using the Delphi method. As methodology, a linear multivariate model was calculated on the independent variables and the dependent variable Net Promoter Score (NPS). Most of the findings concern two central variables in the educational approaches that occur in digital scenarios: what the group of teachers values most is the content of the proposal, while the factor they consider least important is the technology that supports the development of the course. The remaining variables analyzed have little impact on the recommendation of the courses. Nevertheless, the conclusions suggest that combining factors such as content, technology, and pedagogy is essential in experiences like these.


Introduction
Continuous or ongoing teacher training responds to the need posed both by changes and advances in society and by challenges in understanding educational processes and curricula that question teachers' practices when dealing with new educational realities and their stakeholders. In this sense, practice must be reflected upon and fed back into a commitment that favors better intervention in the educational environment, with new proposals and new pedagogical methods based on the development of new competencies and skills pertinent to that action [1]. This ongoing training includes all the formative actions a teacher undertakes in order to advance and seek, on a permanent basis, closer contact with pedagogical and disciplinary knowledge. This constant updating obeys both personal needs and the processes and interests of educational institutions and of external bodies with competence in the matter, and it can be pursued individually or collectively. García [10] points out that one of the factors that can influence the effectiveness of teacher training processes is the possibility for participants to communicate with each other. In this way, they can reflect on their experiences and changes and negotiate meanings with peers facing similar contexts and situations. All this appears to have a positive effect on their practice and on the effectiveness of the course. In online training, this interactivity is encouraged through debates, collaborative work, asynchronous communication, forums, wikis, and other tools. As for the tasks, different typologies are available, such as questionnaires, tests, and projects, among others.
A second feature of analysis is the content of the course: teachers tend to evaluate more positively those training processes whose contents they can emphasize in their own experience. In the same way, they value the possibility of improving skills that interest and motivate them, and in which the development of a particular skill can be experienced directly (empirical and inductive learning) [10]. Regarding the materials used to develop these contents, it is important that all of them have been prepared for the objective of the course, so that they can bring teachers, with increasing complexity, closer to the concepts and knowledge required [9]. The author draws attention to the use of scientific documents or articles that often, because they are pitched at a very high level, are not understood at the initial stage of the process and must be integrated gradually.
In reference to the means and resources used for online training, these must be consistent with the proposed objectives. It is noteworthy that they must arouse the interest and motivation of the student and achieve their involvement in learning. At the same time, the need to reach all students through them is pointed out, taking into account, for example, their learning styles. Here, resources that target each of the sensory channels are positively valued; these can include simulated experiences, animations, audiovisuals, graphics, demonstrations, forums, blogs, wikis, and the large number of other resources that ICTs make possible [11].
Another aspect to consider is the format in which online training is offered, which depends largely on the technological software used and on its design and architecture. This is an aspect that can greatly influence the motivation of the learner [9]. Thus, studies carried out on format identify common elements that respond to a shared pedagogical approach: title, introductory video, presentation of the objectives, presentation of the teaching team, weekly dedication, duration of the course, functioning of the system, modules with their activities, and ways of relating and evaluation [7].
It is necessary to emphasize the importance of generating online learning communities capable of producing networked learning. To do this, there must be interaction between the participants of e-learning experiences, using tools and spaces that encourage the exchange of impressions, resources, and knowledge. To produce connective learning, it is necessary to design activities that favor interaction between peers through workflows that empower learners in creation and participation, and to offer a wide diversity of content in different media and formats (videos, blogs, wikis, interactive activities, etc.) [12].
In the current situation of the Covid-19 pandemic, in which face-to-face attendance has been limited, distance education becomes an important opportunity for the continuity of teacher training. It is therefore necessary to advance studies that, from the teachers' own perspective, can shed light on its possibilities, advantages, and disadvantages.
In the case of Spain, teacher training is offered in different formats: Face-to-face, mixed, or distance. However, it is basically face-to-face. Therefore, it is considered a strategic priority to develop online courses that facilitate access to and monitoring of training programs for active teachers, providing both geographical and time flexibility. For this purpose, the Ministry of Education of the Community of Madrid offers annually various online distance training courses through its digital platform. The Community of Madrid has 6.5 million inhabitants and almost 51,000 teachers of non-university education. Between 2017 and 2018, around 50 tutored courses were offered in the online modality, plus classroom courses and mass online courses (MOOC), modalities that we will not address in this article. Each course follows a common structure designed by the Regional Centre for Innovation and Training (CRIF) and is assisted by tutors who have also been trained at the same Centre. All activities are carried out online and communication is also through the Internet. At the end of the course, a questionnaire is offered to evaluate the different aspects of the course and the reasons for accessing the training process. The online courses are offered from the digital teaching platform of the Regional Ministry of Education. The participants have access to different course modules and communication tools in a unique environment. In addition, the courses are optionally accompanied by external tools to the platform, such as social networks or forums.
The objective of this study is precisely to analyze the evaluations made by teachers of the different aspects of a course: the contents that make up these courses, the technology through which these online experiences have been carried out, the activities that had to be solved to complete the course, the final work that demonstrates that the skills have been acquired and the training successfully completed, and, finally, the format that structures the different elements of the sequence. This evaluation takes as a reference a predictive factor of satisfaction, the Net Promoter Score (NPS). The analysis offers elements that can be useful in the implementation of relevant training actions with a better impact on educational quality.

Participants
The participants in this research are teachers who registered and developed the online training courses offered by the Ministry of Education of the Community of Madrid in the academic years 2017-2018 and 2018-2019. Therefore, this is a non-probabilistic sample. The teachers sign up voluntarily for the training courses offered by the Community of Madrid. If there are more registrations than places available, scales based on merit and seniority are applied to award them.
These courses are sequential and are developed throughout the corresponding academic year through the Moodle platform of the Community of Madrid. The training programs are modular in character: each block must be passed in order to access the next one. The dynamics follow the basic principles of an e-learning and collaborative learning proposal [13]. The student advances through the course autonomously, under the guidance of a tutor. The tutor's functions consist of acting as a guide throughout the entire teaching and learning process, resolving individual or collective doubts, and correcting tasks while providing feedback on each of them. In addition, the tutor energizes the common collaboration spaces. The experiences are characterized by being ubiquitous and connected [14], allowing learners to access the experience from any place at any time while remaining in constant contact with the course learning community. Specifically, the virtual classroom has general and specific forums where questions are resolved by the teacher or by any student. Online activities and file attachments are also available, and some courses additionally use external applications and services such as Padlet and Twitter, among others, in order to enrich the digital competence of the teachers.
The questionnaire is completed at the end of the training experience, within the same platform where the course takes place. The total number of teachers who participated was 7810; however, as an inclusion criterion for the research, only those teachers who completed the questionnaire were considered, for a total of 7501 participants in this study. They were distributed as shown in Table 1. For the subsequent analyses, as well as for the following tables and figures, only the course number will be used, to facilitate reading. The high response rate is due to the fact that the questionnaire is located within the same space in which the training itinerary is developed and, furthermore, is an essential requirement for successfully completing the course. The following table describes the number of students in each of the courses analyzed in this paper.

Research Instrument
Data collection was carried out through an online questionnaire located in the last module of each course. This survey was designed by a discussion group made up of professionals from the Regional Centre for Innovation and Training, which met over several sessions. The first session defined the type of survey (areas of interest, number of items, writing style, rating scale, and presentation format). The survey was then administered to a pilot group of people who had taken one of the online courses. Next, a session was held in which the items were written and placed inside the questionnaire's skeleton. These items contemplated different types of questions, such as open, closed, and multiple-choice questions. The evaluation questions were constructed using a 6-point Likert scale. Using the Delphi method, an expert consultation was carried out to validate the final product. This step culminated in the final construction of the tool, ensuring a high degree of validity for the purpose of the instrument. Once validated, it was digitized using the Google Drive forms tool. The final product collects information about the teachers who participate in the training experiences, their perception of these courses, their origin, the information needed to complete the courses, and a final item asking whether they would recommend this type of training to other fellow teachers. The reliability of the instrument was checked using Cronbach's Alpha coefficient. The score was 0.961, which allowed us to establish the internal consistency of the instrument.
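For readers who wish to reproduce the reliability check, Cronbach's Alpha can be computed directly from the item-score matrix. The sketch below is illustrative only: the `responses` data are invented Likert-style scores, not the study's data, and both variance terms use the population estimator consistently.

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's Alpha for a list of k item-score columns.

    `items` is a list of k columns, each holding the same respondents'
    scores on one questionnaire item.
    """
    k = len(items)
    # Total score per respondent (sum across items).
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(pvariance(col) for col in items)
    total_var = pvariance(totals)
    return (k / (k - 1)) * (1 - item_var / total_var)

# Invented Likert responses: 4 items, 5 respondents (NOT the study's data).
responses = [
    [5, 4, 6, 5, 3],
    [5, 5, 6, 4, 3],
    [4, 4, 5, 5, 2],
    [5, 4, 6, 4, 3],
]
print(round(cronbach_alpha(responses), 3))
```

Values close to 1, like the 0.961 reported above, indicate that the items measure a common underlying construct consistently.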

Data Analysis
This research works with data on the evaluation of different aspects of each course, as well as its degree of recommendation. These data were therefore used as variables of three types: dependent, independent, and categorical.
• Dependent variables: the NPS and the item Would you recommend this course to another teacher?, measured on a scale of 0 to 10. The NPS calculation can range from −100% to +100%; the central value is therefore zero. An NPS greater than zero is favorable, and an NPS less than zero is unfavorable.
• Independent variables (measured on a 6-point Likert scale): the questionnaire items were grouped under each variable according to their subject matter, and an average score was then calculated for the subsequent analysis.
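The NPS arithmetic itself can be made explicit. A minimal Python sketch, assuming the conventional cutoffs on the 0–10 recommendation item (9–10 promoters, 0–6 detractors, 7–8 passives), which the paper does not state explicitly; the ratings below are invented for illustration.

```python
def net_promoter_score(ratings):
    """NPS from 0-10 'would you recommend' ratings.

    Conventional cutoffs (assumed here): 9-10 promoters, 0-6 detractors,
    7-8 passives. The result ranges from -100 to +100; zero is neutral.
    """
    n = len(ratings)
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / n

# Invented ratings for one course (NOT the study's data).
ratings = [10, 9, 9, 8, 7, 10, 6, 9, 5, 10]
print(net_promoter_score(ratings))  # → 40.0 (6 promoters, 2 detractors)
```

A course-level NPS such as the 38.75 average reported later falls on this same −100 to +100 scale.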
To develop the methodology, a linear multivariate model was calculated on the independent variables and the dependent variable NPS. Each of these variables was then compared with the course variable in order to establish an analysis of variance between them.
Initially, the data provided by CRIF are checked for normality for the variables content, technology, activities, final work, and format, applying the Kolmogorov-Smirnov test, since the sample size exceeds 50 (n > 50). As can be observed in Table 2, the p-value for all the variables analyzed in this study is 0.000. Therefore, at a 95% confidence level, we can conclude that none of the variables follows a normal distribution. Next, the normality statistic is presented, which indicates whether the sample should be treated parametrically or not.
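The normality check can be reproduced in outline. The sketch below computes only the Kolmogorov-Smirnov D statistic against a normal distribution fitted to the sample (strictly the Lilliefors variant, since the parameters are estimated from the data); obtaining the p-values reported in Table 2 would require the corresponding tables or a statistics library. The sample is invented, Likert-style data.

```python
from math import erf, sqrt
from statistics import mean, stdev

def normal_cdf(x, mu, sigma):
    """CDF of a normal distribution, via the error function."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

def ks_statistic(sample):
    """One-sample Kolmogorov-Smirnov D against a normal distribution
    fitted to the sample. D is the largest gap between the empirical
    CDF and the normal CDF; large D is evidence against normality.
    """
    xs = sorted(sample)
    n = len(xs)
    mu, sigma = mean(xs), stdev(xs)
    d = 0.0
    for i, x in enumerate(xs):
        cdf = normal_cdf(x, mu, sigma)
        # Compare against the ECDF just before and just after the step.
        d = max(d, abs((i + 1) / n - cdf), abs(i / n - cdf))
    return d

# Invented, heavily skewed Likert-style scores (mostly 5s and 6s), the
# typical shape of course-evaluation data: D comes out large.
skewed = [6, 6, 6, 5, 6, 5, 6, 6, 4, 6, 5, 6, 6, 6, 2, 6, 5, 6, 6, 6]
print(round(ks_statistic(skewed), 3))
```

Skewed, ceiling-heavy score distributions like this one are exactly why the analysis below falls back on non-parametric tests.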

Results
The following are the results of the statistical analysis of the data, which begin with a Spearman correlation because the sample does not follow a normal distribution. The next step is to examine the average reputation of each course (NPS), as well as its recommendation, as described in Tables 4 and 5. This provides a general perspective on each of the variables before analyzing in detail the variance of each of the independent variables in relation to the course. Then, since the sample cannot be said to follow a normal distribution, non-parametric tests are applied; specifically, the Kruskal-Wallis test, to confirm the existence of significant differences between the independent variables and the dependent variable NPS for each of the courses taken by the teachers.
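Both non-parametric procedures are standard and easy to sketch. The illustrative Python code below computes a Spearman correlation (Pearson correlation of the rank vectors, with average ranks for ties) and the Kruskal-Wallis H statistic; all data shown are invented, and the p-values that the study's analysis would also require are omitted.

```python
from statistics import mean

def _ranks(values):
    """Average ranks (1-based); tied values share the mean of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend j over the run of values tied with position i.
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman correlation: Pearson correlation of the two rank vectors."""
    rx, ry = _ranks(x), _ranks(y)
    mx, my = mean(rx), mean(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

def kruskal_h(groups):
    """Kruskal-Wallis H statistic over several groups of scores
    (no tie correction; a statistics package would also give the p-value)."""
    pooled = [v for g in groups for v in g]
    ranks = _ranks(pooled)
    n = len(pooled)
    h, start = 0.0, 0
    for g in groups:
        r = ranks[start:start + len(g)]
        h += sum(r) ** 2 / len(g)
        start += len(g)
    return 12.0 / (n * (n + 1)) * h - 3.0 * (n + 1)

# Invented scores: content evaluation vs. recommendation for 8 teachers.
content = [6, 5, 6, 4, 5, 6, 3, 5]
recommend = [10, 8, 9, 6, 8, 10, 5, 7]
print(round(spearman_rho(content, recommend), 3))

# Invented per-course score groups for three courses.
courses = [[6, 5, 6, 6], [4, 5, 4, 3], [5, 6, 5, 5]]
print(round(kruskal_h(courses), 3))
```

A large H indicates that at least one course's score distribution differs from the others, which is what the per-variable comparisons below report.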

Evaluation of ICTs
When the sample is compared in pairs, significant elements emerge, allowing differences between courses to be checked and marked. It can be seen that the course that best valued the use of ICT was No. 1, while course No. 5 gave the worst valuation of the same section (Figure 1). The average evaluation of the use of ICT is 4.53.

Evaluation of the Contents
Analyzing the variable Contents Evaluation, differences between the courses can be verified. The course with the best evaluation of the Contents was No. 32, while course No. 5 had the worst evaluation of the same section (Figure 2). For this variable, the average rating was 5.16.

Evaluation of Activities
Focusing on the Activity Evaluation, discrepancies can be seen across the different courses. Course No. 33 was the one with the best evaluation of the activities, while course No. 7 had the worst evaluation in the same section (Figure 3). In this case, the average rating of the activities was 5.10.

Evaluation of Activities
By focusing on the Activity Evaluation, we can see discrepancies across the different courses. We can see that the course No. 33 was the one with the best evaluation of the activities. The course No. 7 was the one with the worst evaluation in the same section ( Figure 3). In this case, the average value of the activities was 5.10.

Evaluation of the Final Work
Focusing on the evaluation of the Final Work of the different courses, significant differences between them can be observed. The course in which the Final Work was best evaluated was No. 14, while course No. 5 had the worst evaluation of the same section (Figure 4). The average rating of the Final Work was 5.35.

Evaluation of the Format
In the pairwise comparison of the sample for the Format Evaluation, significant elements are identified, allowing differences between courses to be checked and marked. The course in which the Format was best valued was No. 14, while course No. 5 gave the worst valuation of the same section (Figure 5). Here, the average rating of the Format was 4.85.

NPS
Finally, when valuing the reputation of the courses, as with the previous variables, significant differences between them can also be confirmed. The course that obtained the best NPS was No. 34, while No. 5 received the worst valuation in the same section (Figure 6). The average NPS score was 38.75, a value within a positive reputation range.

Discussion
At least for the next decade, training in ICT skills will be a burning challenge in all educational systems around the world [15]. For an educational system to be able to meet the demands of the knowledge society, it must have teachers who design, experiment, and evaluate learning experiences enriched with ICT [16].
Furthermore, there is no doubt that teacher training must become one of the challenges to be faced in order to ensure that citizens, especially children, adolescents, and young people who are still in the teaching-learning process, receive an adequate media literacy [17]. Having a good system for valuing this training is a fundamental requirement for a good evaluation [18].
The main conclusion of this study reveals that, for a good recommendation of an online teacher training course, the most important factor is the content that constitutes it, as Table 3 shows. There are other important factors in the evaluation of a course, such as the use of ICT or the methodology, but the one given the most consideration is the content itself, with a positive correlation of 0.646. This suggests that the better the evaluation of the course content, the more the course is recommended among teachers. Reinforcing this conclusion, Table 5 indicates a difference of 2.71 points on average in the evaluation of content between the best rated course in this section (Cedar) and the worst rated (Technology, Programming, and Robotics). Similarly, Rivero and Mur [19] find that methodology, content, and a good pedagogical foundation form the basis for innovative approaches in the future.
It is possible that findings such as those of Roig-Vila, Mengual-Andrés, and Quinto-Medrano [20] are at play here: primary school teachers' disciplinary and pedagogical knowledge weighs more than their technological knowledge. From the opposite perspective, there are tendencies linked to training actions that address the content-creation area of digital teaching competence. In addition to teaching teachers how to use authoring tools, such actions should pay special attention to technical aspects such as the accessibility and reusability of content, as well as going in depth into the creation of adaptive resources and the provision of feedback [21].
Although the lack of good technological training is one of the key reasons for the difficulties in implementing new technologies [22], in this case, as observed in Table 3, teachers do not consider the ICT variable an influential factor when recommending a training experience. This is evident in its low positive correlation (0.403), the lowest value of all the factors that influence teacher satisfaction with the training proposal. The conclusion is reinforced by Figure 1, which shows that the average evaluation of the use of ICT across all the training courses is the lowest score obtained, exactly 4.53. This indicates that, of the factors considered, ICT is the one with the least influence on teacher satisfaction with the course. This premise coincides with the proposal of Cabero and Marín [23], according to which the training and improvement of teachers in ICT must be approached differently from how it has been done so far, with too much focus on instrumental and technological aspects.
Furthermore, these formative actions for the virtual training of teachers should not focus so much on the instrumental component, that is, the functioning of the platform or LMS, but rather on didactic aspects, referring to the design of materials for the network or the strategies and e-activities to be applied on it [24]. Technology needs to be infused as a systemic and systematic process throughout the teacher training program [25]. From the contrary position, for the trends that give great value to technical aspects in teacher training processes, one of the fundamental premises for incorporating ICTs into the educational world is the technical knowledge needed to handle and apply digital tools; without these skills, it will be difficult to implement technology-mediated experiences in education [26].
At this same level, Erdogan and Sahin [27] found that poor results in technological knowledge also lead to lower results at the intersections of this knowledge with the other basic domains (pedagogical and content knowledge). This implies low results in the technological and pedagogical knowledge of content necessary for a good integration of ICT in teaching, according to the TPACK model [28]. It should be noted, however, that in order to use ICTs effectively in teaching, teachers must be trained in technological, content, and pedagogical skills; education, teacher development, and professional development programs should therefore provide learning opportunities for teachers to develop the three areas addressed in the TPACK model [29]. The continuous training of teachers has to be redirected along other paths: not teaching a teacher how to handle a tool like Twitter, but presenting it as a space through which they can get in touch with other teachers and share experiences and learning. In other words, the challenge lies in fostering relationships between education professionals and the media, not only as application technologies, but as cultural forms. We must change the culture of continuous training by generating a network in which professionals with more knowledge and experience with the media take on a mentoring role towards the newest ones, until the latter acquire sufficient mastery and confidence in their use [30].
Across the courses taken, the scores vary in the evaluation of the sections measured in the questionnaire. Although a pattern can be observed (Figures 2, 4 and 6), it can be concluded that, in general terms, course No. 33 is the best valued on most of the factors, and even when it is not the best valued, it tends to rank in high positions. In a similar way, but at the opposite end, the same happens with course No. 5: it is usually the worst valued on most of the variables, and on those where it is not, it is usually among the worst ranked.
With regard to the reputation of the course (NPS), the variables considered (contents, ICT, activities, final work, and format) have little influence on it, as can be seen in Table 3. It is therefore convenient, as a prospective line of future work, to continue studying which variables can influence it (evaluation systems, feedback tools, course methodology, gender of the participants, age of the teachers, etc.). Along these lines, we can progress in accordance with Felini [31], who offers a series of parameters that can guide the way forward and highlights, among other issues, the need to incorporate co-teaching processes in which more than one teacher participates, the configuration of effective working groups, and the originality of the training activity on media literacy. Furthermore, these training processes must necessarily be linked to good teaching practices that serve as a reference model [30].
What is clear is that a competence-based training proposal cannot be addressed through an approach based on the mere transmission of knowledge [32]; it requires applied practical experience to develop the professional competence of teachers in the educational use of ICT [33], together with the use of didactic resources and continuous, effective feedback [34].
Teachers involved in ICT professional development tend to use computers more often, place more emphasis on teaching ICT skills, and have a stronger sense of ICT self-efficacy than their more skeptical colleagues [35].
Regarding the rest of the variables, the activities of the training proposals analyzed do not seem to strongly determine the decisions that teachers make regarding their preference for continuous training. Looking at the final work of the courses, we find that, as with the activities, it is a factor that does not exert a strong influence on teachers' decision making about the training opportunities they intend to access. Finally, the format of the courses follows the pattern of the two previous factors: it cannot be considered a variable that decisively influences teachers' preferences when selecting a training opportunity.

Conclusions
In closing, the objective of this study was to analyze the assessments made by teachers of different aspects of a course, identifying the factors (content, technology, activities, final work, and format) that predict the selection of a training experience by this profile. It was found that the content is what attracts the most attention, while the technology used for its development turns out to be the least attractive for the teaching profile. The NPS of the courses analyzed reinforces these conclusions: the courses that obtained the best valuation are those in which the contents stand out, while the least valued are those directly linked to technology. We can therefore conclude that a teacher will positively recommend a training opportunity when the course emphasizes the design of its contents, aligning them with the needs of the target population. Conversely, the factor with the least influence on this decision is the technology proposed as a means to carry out the different practices and dynamics that make up the training experience.
This can be clearly seen in course No. 33 (Digital Collaborative Art), which has a very marked focus on content, whereas course No. 5 (Technology, Programming, and Robotics) focuses mainly on instrumental technological content. Table 5 presents the average course recommendation in support of this idea. As with the course evaluation, courses 33 and 34 are the most recommended, while courses 5 and 7 are the least recommended. This reasserts the importance of the content, since the central proposal of training experiences 33 and 34 is directly linked to this variable, while courses 5 and 7 are more focused on technological aspects.
For teachers to design new learning environments and incorporate technologies, it is essential that they experience a variety of modalities and initiatives in their own training, even more so in the current pandemic. The skills and competencies that teachers must develop in order to teach how to learn in the new society are not acquired in training contexts that fail to promote them. New training contexts are demanded in a rapidly changing digital environment. In short, new training scenarios must be considered in order to respond to new forms of learning [36]. To succeed in these scenarios, teachers need sound techno-pedagogical training that gives them control over the use of technological resources in their teaching practice, in parallel with technical training on the different digital tools that gives them confidence in the implementation of their sessions; all of this at the service of an innovative pedagogical experience [37,38]. This need has been magnified by the Covid-19 pandemic, which has forced teachers to move their teaching to a digital environment in order to ensure continuity in their students' learning.

Limitations and Further Developments of the Research
Finally, in relation to the limitations of the study, we can highlight the absence of control over variables such as teachers' specific area of work, which could directly influence their preferences regarding the different aspects that make up the training courses. In addition, no sociodemographic information was collected that would allow the sample to be segmented by gender and age. Both aspects could emerge as elements that directly influence teachers' opinions about the training opportunities they encounter in their continuous professional development.
As future lines of research, we propose to develop this construct at the national level of teacher training, extending the sample to the other autonomous communities of Spain in order to consolidate the findings of this work from a broader perspective. In addition, hypotheses will be raised to delve into the perceptions of the teaching group, approaching the research through different methods. This will make it possible to identify the factors that most influence teachers' decision making about their ongoing training, providing concrete recommendations for improving training experiences of this nature. At the same time, it is necessary to check whether the effects of the Covid-19 pandemic have modified teachers' preferences regarding the features of the training they attend.

Conflicts of Interest:
The authors declare no conflict of interest.