Systematic Review
Peer-Review Record

Educational Technology in Teacher Training: A Systematic Review of Competencies, Skills, Models, and Methods

Educ. Sci. 2025, 15(8), 1036; https://doi.org/10.3390/educsci15081036
by Henry David Osorio Vanegas 1, Yasbley de María Segovia Cifuentes 1,* and Angel Sobrino Morrás 2
Reviewer 1: Anonymous
Reviewer 2: Anonymous
Submission received: 11 May 2025 / Revised: 13 July 2025 / Accepted: 17 July 2025 / Published: 13 August 2025
(This article belongs to the Section Teacher Education)

Round 1

Reviewer 1 Report

Comments and Suggestions for Authors

The manuscript presents a timely and comprehensive systematic review of teacher training in educational technology, focusing on competencies, skills, models, and methods. It is well structured, follows PRISMA guidelines, and includes useful bibliometric and qualitative analyses. However, several areas need clarification or improvement to enhance the overall clarity, theoretical grounding, and rigor of the review.

Introduction:

The introduction section (lines 26–51) is underdeveloped. It briefly mentions the digital era and COVID-19 but does not fully establish the research gap or the importance of this review in relation to the existing literature.

As the authors note, there are several frameworks that address teachers’ digital competences. I therefore recommend that the introduction discuss prior reviews and gaps in the existing research, clearly articulating why a systematic review on this topic is needed now.

Lines 35-37: I agree that COVID-19 forced teachers to adapt to a new teaching reality. I believe, however, that the rise of AI in education in recent years is also a significant change that potentially requires teachers to cultivate additional competences.

The UNESCO ICT Competency Framework is presented as a point of reference (lines 58–65), but the manuscript does not clarify why this framework is prioritized over others (e.g., DigCompEdu, ISTE). The analysis in the paper does not build on the UNESCO framework. I see that it is used as an example, but I do not think the graph is necessary, since the framework is not examined in depth.

  1. The research questions (Questions A and B with subquestions A.1, A.2, B.1, B.2) are presented inside the eligibility criteria section (Section 2.1, lines 120–137), which is unconventional and potentially confusing. I recommend moving the questions to the end of the introduction and presenting them in a new section that highlights the research gap, the research aims, and the research questions. The subquestions also overlap with the main questions A and B; I would recommend a single, clear, and concise question per domain.
  2. Considering the information sources, I agree with the justification for selecting Scopus; however, using only Scopus restricts the scope and breadth of the systematic review, potentially omitting key studies indexed in other reputable databases such as ERIC and Web of Science. I recommend using additional sources. The title of the section is also misleading, as there is essentially only one source.

The manuscript claims to systematically review the competencies and skills in educational technology required by in-service teachers. However, the search strategy, as currently presented, does not include any keywords explicitly related to technology, such as “educational technology,” “ICT,” “digital skills,” “technology integration,” or similar terms.

This omission is concerning for several reasons:

Inconsistency between the review’s aim and the search terms used:

    • The core concepts of the review—technology-related skills and competencies—are not represented in the search string provided (lines 147–149). The listed terms focus solely on teacher training and professional development without anchoring the search to the technological dimension.

Lack of transparency in the exclusion criteria:

    • The exclusion criteria (lines 164–170) do not filter out studies that are unrelated to educational technology. Without a clearly defined technological focus in either the inclusion or exclusion criteria, it becomes difficult to validate the claim that all 55 final studies meaningfully address digital competencies or technology integration.

Implications for replicability and review validity:

    • This limits reproducibility, a key requirement in systematic reviews. Readers cannot evaluate whether the final set of studies truly represents the landscape of research on digital competencies in teacher education.

I recommend that the authors revise the search string to explicitly include keywords related to educational technology. They should also clarify in the selection criteria how studies were evaluated for relevance to technology-related competencies.
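For illustration only, a revised query could combine the existing training-related terms with the technology terms listed above; in Scopus syntax (the training terms below merely stand in for the authors’ actual string), it might take a form such as:

    TITLE-ABS-KEY ( "teacher training" OR "professional development" ) AND TITLE-ABS-KEY ( "educational technology" OR "ICT" OR "digital skills" OR "technology integration" )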

Furthermore, the manuscript does not clearly describe how the quality of the included studies was assessed. In systematic reviews, assessing the methodological quality or risk of bias of each included study is critical for ensuring the validity and reliability of the synthesis. While the authors mention using an Excel matrix for data extraction (line 172) and briefly refer to a “minimal risk of bias” (line 179), there is no transparent or systematic description of a quality appraisal process.

As emphasized by Gough et al. (2017), there is considerable variation in how study quality is assessed in systematic reviews, and this variation must be explicitly addressed.

4. The fonts used in the figures and graphs (e.g., Figures 3, 4, and 5) do not match the rest of the manuscript.

5. In Table 3, the authors list various models and theoretical frameworks guiding teacher training in educational technology. However, HLM (Hierarchical Linear Modeling) is not a conceptual model relevant to teacher training or educational technology; rather, it is a statistical technique used for analyzing nested data structures (e.g., students within schools).

In this case, Wu et al. (2021) likely used HLM to analyze relationships within multilevel data, not to conceptualize or guide teacher training practices. Its inclusion in this table may confuse readers about its purpose and relevance. While Table 3 provides a comprehensive list of models referenced in the reviewed studies, it currently lacks contextual information explaining why each model was included and how it relates to the development of competencies and skills in educational technology.

To improve clarity and scholarly rigor, I recommend adding an additional column that offers a brief justification for the inclusion of each model.

6. The concept of training methods requires clearer definition and consistent classification. While the manuscript provides examples such as PBL and PLCs, it remains unclear what criteria were used to categorize an approach or activity as a “training method.”

7. The Discussion section is largely descriptive, reiterating the results without offering deeper analysis or critical interpretation. While the authors successfully summarize which competencies and methods were most frequently identified, they do not sufficiently explore why certain competencies are more prominent than others, or what this reveals about the current state of educational technology integration in teacher training.

For example, an important and overlooked point is that competencies related to Technological Content Knowledge (TCK) — which reflect more sophisticated pedagogical design using technology — are not prominently featured in the reviewed studies. Instead, the identified competencies and skills largely reflect a basic or instrumental use of technology, such as operating platforms, managing digital content, or using messaging tools.

The review covers studies from various regions (Europe, Asia, North America), but the discussion does not reflect on regional or cultural variations.

8. The Recommendations section is not well connected to the main body of the review. While it acknowledges general challenges and proposes broad directions for future research, it lacks specific, actionable next steps that are directly informed by the review’s findings.

Given that this is a systematic review, the recommendations should be grounded in the patterns, gaps, and limitations identified in the included studies. Many of the current suggestions (e.g., explore barriers, adapt training programs) are too general and speculative — they could apply to any context or topic.

Author Response

Thanks for your comments. Please take a look at the file I've attached.

Author Response File: Author Response.pdf

Reviewer 2 Report

Comments and Suggestions for Authors

This manuscript offers a valuable and well-organised synthesis of research into competencies, skills, models, and methods related to in-service teacher training in educational technology. It is commendable in both scope and intention: the inclusion of 55 empirical studies published across 11 years, adherence to PRISMA guidelines, and comprehensive thematic categorisation of competencies and training approaches all point to a systematic and transparent research process.

In my opinion, the article contributes meaningfully to the educational technology literature in four main ways:

First, by mapping competencies across seven dimensions with fine-grained skill descriptors.

Second, by highlighting dominant theoretical models (particularly TPACK and its variations) and widely used training methods such as Professional Learning Communities (PLCs) and Problem-Based Learning (PBL).

Third, by addressing barriers and enablers such as infrastructure, institutional support, and mentorship - often underexplored in reviews of this nature.

Finally, through the provision of actionable recommendations for future research and policy.

However, to further strengthen the manuscript, the following areas should be addressed:

Expression and Academic Tone:
While the structure is generally clear, the manuscript would benefit from substantive language editing. At times, the phrasing is awkward, overly conversational, or imprecise. These instances detract from the professional tone and clarity expected in a peer-reviewed systematic review.

Repetition and Redundancy:
There is significant overlap between the results and discussion sections. For example, entire blocks of text are restated with minimal critical extension. Instead of reiterating the frequency of competencies and associated authors, the discussion should interrogate the significance of patterns, tensions, and absences in the literature.

Scope and Limitations:
The review includes only articles from Scopus, which could bias the findings by excluding relevant literature indexed in other databases such as Web of Science or ERIC. While Scopus is justified as a high-quality source, a clearer acknowledgment of this methodological limitation and its impact on the findings is necessary.

Critical Depth:
The review is descriptive in parts where more analytical depth would be appropriate. For example, the discussion of digital ethics and AI integration remains surface-level despite being identified as emerging areas. A more critical exploration of underrepresented or contested areas - such as resistance to technology or variation across geographic contexts - would enhance the scholarly contribution.

Presentation of Tables:
The tables are rich with data but can be overwhelming due to their length and formatting. Consider segmenting them or synthesising the key findings into more concise summaries.

Comments on the Quality of English Language

The manuscript is written in generally understandable English; however, the quality of language use varies throughout and would benefit from careful copyediting to improve clarity, fluency, and academic tone. Several sentences are overly conversational, imprecise, or awkwardly constructed, which detracts from the professionalism and readability of the paper.

Common issues include:

Redundant phrasing (e.g., "the skills that were found... that go with them"),

Repetitive structures, particularly in the results and discussion sections,

Informal or vague expressions (e.g., "a good starting point", “more difficult problems”, “might work better”),

Inconsistent use of verb tense and passive/active voice,

Occasional grammatical errors or structural ambiguities.

Improving these aspects would significantly enhance the clarity and scholarly quality of the manuscript.

Author Response

Thanks for your comments. Please take a look at the file I've attached.

Author Response File: Author Response.pdf

Round 2

Reviewer 1 Report

Comments and Suggestions for Authors

Dear authors,

The manuscript has improved significantly. Please consider the following comments:

1. The discussion section is now much clearer. However, the first part still closely mirrors the results section, as it primarily restates the competencies and skills identified. In the discussion, the main findings should be interpreted and contextualized in light of the research objectives and existing literature. For example, which competency or skill was most frequently reported across the studies? Can this be attributed to broader educational trends or systemic priorities? Strengthening the discussion with such interpretations would enhance the manuscript’s analytical depth.

It is evident from your methods that Communities of Practice emerged in the majority of studies, and you have critically discussed this well. Similarly, the emphasis on the TPACK model was clearly highlighted and appropriately elaborated upon.

2. You mention both COVID-19 and Artificial Intelligence (AI) in the introduction, which is very relevant and timely. However, a reference is needed to support the point about AI. There are many recent studies you could draw on to strengthen this statement.

Thank you

Comments on the Quality of English Language

The manuscript presents important insights, but the quality of English needs improvement to ensure clarity and readability. 

 

Thank you

Author Response

Thanks for your comments. Please review the file I've attached.

Author Response File: Author Response.pdf
