Review

Teacher Education Interventions on Teacher TPACK: A Meta-Analysis Study

1 School of Mathematics and Statistics, Guangxi Normal University, Guilin 541006, China
2 School of Mathematical Sciences, Beijing Normal University, Beijing 100875, China
3 New Century School, Dongguan 523700, China
* Authors to whom correspondence should be addressed.
Sustainability 2022, 14(18), 11791; https://doi.org/10.3390/su141811791
Submission received: 22 August 2022 / Revised: 11 September 2022 / Accepted: 14 September 2022 / Published: 19 September 2022
(This article belongs to the Special Issue Digital Teaching Competences for Sustainable Development)

Abstract:
Teacher education is an important strategy for developing teachers’ technological pedagogical content knowledge (TPACK). Many schools around the world have incorporated such training into their teacher education plans. However, there has been controversy in academic circles concerning the effect of teacher education interventions in promoting the development of teacher TPACK. Therefore, this study used a meta-analysis approach to review the published literature on teacher education programs and determine their impact on TPACK. The results showed that teacher education interventions positively affected TPACK (d = 0.839, p < 0.0001). Differences in cultural background, experimental participants, experimental types, sample types, intervention durations, measurement methods, intervention types, and learning environments account for the differences in the effects of the interventions. Research designs using randomized experiments had a positive effect size significantly higher than that of quasi-experiments. The longer the duration of the teaching intervention, the stronger the improvement in teachers’ TPACK. There are significant differences among intervention types in improving TPACK, with method interventions showing the most obvious effect. Teacher education interventions have a greater impact on theoretical knowledge and a slightly smaller impact on practical knowledge. However, cultural background, experimental participant, sample type, and learning environment have no significant moderating effect on teacher education interventions.

1. Introduction

The Technological Pedagogical Content Knowledge (TPACK) framework provides both empirical and theoretical guidance for technology integration in the classroom. The TPACK framework is an important framework for current teacher education. Since the formal introduction of TPACK theory, many studies have recognized the broad appeal and potential of the framework. It is a theoretical basis for developing teachers’ understanding of using technology to support student learning constructively and has become one of the frontiers in educational technology [1]. Teachers’ TPACK is not fixed and can be cultivated by designing a specific system of education programs [2]. Since the concept was introduced, many teacher education programs have been developed internationally to cultivate TPACK, and many professional development courses have been reorganized to promote its development [3]. However, the effect of education programs and their substantive impact have always been debated. Some studies find that program interventions significantly promote teachers’ TPACK [4,5], whereas others report that the intervention had no significant effect on TPACK development [6,7]. Are teacher education program interventions effective for TPACK development? Do experiment type, sample type, and intervention duration significantly moderate the effect of teacher education interventions? To answer these questions, this study used the meta-analysis method to quantitatively integrate relevant experimental research conclusions, analyze the influence of different moderator variables on the improvement of teachers’ TPACK, and provide guidance for implementing program interventions.

2. Literature Review

2.1. TPACK Theory

Since the concept of technological pedagogical content knowledge was proposed, studies have put forward a variety of theoretical frameworks to analyze its connotation from different theoretical orientations. This study clarifies the connotation by reviewing the development of the concept.
The conceptualization is mainly derived from Shulman’s theoretical framework of pedagogical content knowledge (PCK) [8]. In the early 2000s, adding technological knowledge to Shulman’s pedagogical knowledge base was suggested, and in 2001, Pierson began to use the concept [9]. Niess defines TPCK as a complex of disciplinary, technical, and teaching and learning knowledge development, and studied how technology integration projects affect pre-service teachers’ use of technology in teaching practice [10]. Because the abbreviation TPCK is composed entirely of consonants and is difficult to pronounce and remember, the vowel A was added, and the subject teaching knowledge of integrated technology was officially renamed TPACK [11]. In a five-year graduate professional development program, Koehler and Mishra developed the TPACK theoretical framework by engaging students in designing online courses and educational videos or redesigning existing websites [12]. The TPACK framework proposed by Koehler and Mishra has become a generally accepted framework. It presents an overall framework for understanding the complexity of TPACK, but it is difficult to clearly define its internal concepts and their relationships. Angeli and Valanides questioned and explicitly criticized, from the perspective of epistemology, the superposition view of TPACK, in which the growth of any one knowledge base (technical knowledge, pedagogical knowledge, or content knowledge) will spontaneously lead to the growth of TPACK [2]. Cox and Graham emphasized the relationship between TPACK and PCK. Furthermore, they questioned the TPACK framework, pointing out its dynamic nature from the perspective of rapid technological change [13]. Doering et al. explained the dynamic nature of the bidirectional relationship between knowledge and practice [14].
In summary, the following three views of TPACK have developed over time: TPACK as an augmentation of PCK [8,15], TPCK as a unique and distinct body of knowledge [9,10], and TPACK as the interactions among the three types of knowledge and their intersections in specific contexts [12]. The most widely recognized TPACK framework consists of eight parts, as seen in Figure 1 [12].

2.2. Teacher Education Interventions

In the context of TPACK, teacher education is an intervention for developing teacher TPACK [16]. Studies have confirmed that teachers’ TPACK is affected by individual and teaching factors [17]. By changing teaching strategies and continuously strengthening effective teacher education interventions, it is possible to promote teacher TPACK development. Since the framework was introduced, there have been many ways to help teachers develop TPACK [18], including focusing on learning technology [19] or injecting it into other education courses, such as educational psychology or teaching methods [20].
According to different intervention methods, research on developing teachers’ TPACK strategies can be divided into the following three categories: method, tool, and technical intervention [21]. Methods interventions are the most widely used strategies, specifically learning by design [22], scaffolding teaching [23], collaborative learning [24], problem-based teaching [25], case study [26], and game learning [27]. Studies have also attempted to build models such as TPACK-COIR [28], TPACK-COPR [24], and TPACK-IDDIRR [29] during design learning.
Tool intervention tools can be divided into multimedia, including graphics, audio, video, and 2D/3D animation, micro-lectures [30], presentation tools such as Spreadsheets [31], and Web2.0 tools such as WebQuest [32]. Technical intervention can be divided into the following three categories: AI (computer intelligent tutoring system) [24], data collection and analysis software [33], and interactive whiteboard [34]. Many teaching strategies and information technology tools are used in teacher education, and there are diverse combinations.
In order to explore the effect of teacher education interventions on teachers’ TPACK, researchers have conducted experimental demonstrations, but the conclusions have not reached a consensus. Most studies agree that teacher education intervention has a significant effect on improving teachers’ TPACK. For example, Ching Sing Chai studied the development of TPACK in 78 pre-service teachers under blended learning conditions and found that both the integration of information and communication technology in education and student-centered approaches are effective strategies for developing science teachers’ TPACK [35]. Similarly, a study by Irina Lyublinskaya and Nelly Tournaki found that pre-service special education teachers significantly improved their TPACK in math and science courses [36], and Karl Wollmann et al. came to the same conclusion [37]. However, some studies have confirmed that teacher education interventions have no significant effect on improving teachers’ TPACK. For example, Fatih Saltan found that online cases significantly improved participants’ technological knowledge and technological content knowledge, but that focusing only on technological knowledge, technological content knowledge, and technological pedagogical knowledge is not enough to develop teachers’ TPACK [38]; Seong-Won Kim et al. also came to a similar conclusion [39]. These mixed findings suggest that the impact of teacher education interventions on teacher TPACK is unclear; therefore, a systematic approach is needed to review the effects of teacher education interventions. Meta-analysis is a quantitative research method widely used worldwide. It can avoid the biases and deficiencies of traditional research methods to a certain extent and obtain more general and regular research conclusions. Therefore, this study adopts the method of meta-analysis to deeply explore the impact of teacher education on TPACK and provide a reference for improving teacher education.

3. Methods and Materials

3.1. Literature Search and Screening

3.1.1. Literature Search

The strategy of combining literature retrieval and snowballing was adopted in the first round. This study searched several international databases to ensure the representativeness and comprehensiveness of the literature. The literature types include journal papers, conference papers, and dissertations. To ensure the comprehensiveness of the search, the authors searched both English and Chinese databases. Literature was retrieved from databases such as Web of Science, Google Scholar, ProQuest, and Scopus. The keywords and qualifiers were set according to the search criteria. The specific screening process and criteria are as follows:
(1)
The author’s preset search criteria are as follows: the research content is teacher TPACK, so “technological pedagogical content knowledge OR TPACK” is selected as the search term. According to the connotation and structure of TPACK, the core of the TPACK framework is technical knowledge, so we can further filter “technological knowledge OR TK OR technological content knowledge OR TCK OR technological pedagogical knowledge OR TPK” as the search term;
(2)
The literature is experimental research, which can provide quantitative data results, so “experiment” is selected as the search term;
(3)
The literature is from the field of education, so “education” is selected as the qualifier;
(4)
Since TPACK was officially proposed in June 2006, the time span is determined to be from June 2006 to July 2022;
(5)
The age range of the tested teachers, countries, and regions is not limited;
(6)
The sources of literature are journals and dissertations, so the type of literature is limited to journals and dissertations.
This study also conducted a secondary search through citation backtracking to ensure comprehensive coverage. The above search strategy yielded 2490 pieces of literature, and a total of 1105 remained after excluding duplicates.

3.1.2. Literature Selection Criteria

The inclusion criteria for the literature are as follows:
(1)
Research object: The impact of teacher education on TPACK;
(2)
Study results: The results of selected studies should show changes in TPACK. This meta-analysis takes teacher education as the independent variable and TPACK as the dependent variable;
(3)
Research type: Quasi-experimental or randomized experimental studies. During the experiment, participants should not be informed of the purpose of the research, and the experimental intervention should last more than 1 week, since research findings may be skewed when the study duration is too brief;
(4)
Study content: To avoid a disproportionate impact on the overall results, the content of selected studies was reviewed to exclude the same study published in different formats;
(5)
Research results: The literature should present clear and complete statistical results. Studies should use standardized tests to measure TPACK and contain sufficient statistical information, including mean, standard deviation, sample size, or t-value, and F-value, to ensure that effect sizes can be calculated.
The exclusion criteria for the literature are:
(1)
Research object: Studies of teacher TPACK interventions whose topic is not a teacher education program were excluded;
(2)
Study results: Studies with findings that did not show changes in TPACK were excluded;
(3)
Study type: Referring to the criteria of Cheung and Slavin, studies with large pre-test differences (effect size > 0.5) were excluded, as were randomized experiments with no pre-test and studies with an intervention duration of less than one week [40];
(4)
Research content: Exclude the same research published in different formats;
(5)
Research results: Literature without sufficient statistical information was excluded.

3.1.3. Literature Screening Process

The first step was retrieval: a total of 2490 articles were retrieved, followed by three rounds of screening. The second step was the primary screening: the titles were screened, and the documents were imported into EndNote X9. A total of 1105 documents remained after removing duplicates and research published in multiple formats.
The third step was confirmation: after screening the abstracts and excluding research that was irrelevant to the topic or non-experimental, a total of 199 papers were obtained. The fourth step was inclusion: the full texts were reviewed, and 59 papers were finally obtained. When a single piece of literature included several independent samples, each was coded separately, yielding 59 effect sizes in total (Figure 2).

3.2. Document Coding

The purpose of coding is to facilitate moderator analysis and group comparison based on the organization of the literature. The coding scheme for this study is as follows:
(1)
Literature information, including author names, publication years, and journal sources (graduation thesis, international conferences, journal papers);
(2)
Cultural background, including Eastern and Western cultures, coded according to Hofstede’s cross-border cultural survey data [41]. Hofstede’s cross-cultural analysis model includes the following six dimensions: Power Distance, Uncertainty Avoidance, Individualism/Collectivism, Masculinity/Femininity, Long/Short-Term Orientation, and Indulgence/Restraint. Among them, power distance refers to the unequal distribution of power within an organization. According to Hofstede’s cross-border cultural survey data [42], countries with higher power distance, such as mainland China, Singapore, Malaysia, Arab countries (such as Kuwait), Indonesia, Turkey, South Korea, and Spain, are coded as East. Meanwhile, countries with lower power distance, such as the United States, Germany, and Australia, are coded as West. Furthermore, when English-language literature comes from an Eastern country, it is also coded as East;
(3)
Experimental participants refer to the types of teachers tested, including pre-service and in-service;
(4)
The types of experiments can be divided into randomized experiments, quasi-experiments, and true experiments according to the degree of control in educational experiments. The independent variables of educational experiments often have comprehensive characteristics, and there is no true experiment in the strict sense. Therefore, educational experiments include randomized experiments and quasi-experiments;
(5)
The sample type is divided into the following two types according to size: large samples are coded as L, while small samples are coded as S. Referring to Cheung and Slavin [40], a sample with a size greater than 250 is considered large, while one less than or equal to 250 is considered small;
(6)
Intervention duration refers to the duration of TPACK training and teaching using the intervention. According to the length of the experimental time, the duration is standardized as 0~3 months, 3~6 months, and more than 6 months;
(7)
Measurement methods that use the instruments of Schmidt [43], Chai [20], or Koehler [44] are coded as standardized measurements. Adaptations of Schmidt, Chai, or Koehler, or combinations of multiple frameworks, are coded as self-edited measurements;
(8)
Types of interventions, according to the names of the teaching interventions reported by the authors, are divided into method interventions, including design learning, scaffolding teaching, case learning, problem-based teaching, and game learning; technical interventions, comprising interactive whiteboards and micro-lectures; and tool interventions, such as AI, multimedia, digital resources, and robotics;
(9)
The types of knowledge are divided into theory and practice. Theoretical courses focus on students’ acquisition of concepts, rules, and principles, while practical courses focus on mastery of operational skills;
(10)
Teaching environment refers to the environment in which the process is completed. In the context of TPACK, teacher education environments include online, offline, and hybrid [45];
(11)
Intervention results, according to the p value provided by the literature, the intervention results of teacher education programs are divided into the following two types: improvement and no significant difference;
The coding procedure was as follows: the first author coded independently, and the corresponding author then checked and proofread the codes. The agreement between the two independent coders was 97%, apart from individual data biases, and the specific codes of the 59 studies are shown in Table 1.

3.3. Literature Quality Assessment

In a meta-analysis, the quality of the included literature affects the final results; this study refers to Cooper et al.’s literature quality assessment method [92]. It scores literature quality according to how clearly the included literature describes the sample characteristics, experimental design, interventions, measurement tools, and measurement process. Uncertainty receives a score of 0, relative clarity receives 1 point, and clarity receives a score of 2, so a directly related document can be awarded a maximum of 10 points. Two researchers independently carried out the evaluation process to ensure the objectivity of the literature quality evaluation results. The inter-rater correlation was 0.867 (p < 0.001), indicating that the quality of the included literature met the standard’s requirements.

3.4. Research Tools

The data analysis tool was the meta-analysis software Comprehensive Meta-Analysis (CMA). The data used in the meta-analysis include the sample sizes and the means and standard deviations of the experimental and control groups before and after the test. These raw data are input into the CMA software to generate the effect size of each sample. Teacher TPACK is a continuous variable, and the included studies are all randomized experiments or quasi-experimental designs that compare differences between or within groups. Because the sample sizes of the included studies are small, this study uses Hedges’ g as the effect size.
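As an illustration of how each study’s effect size is obtained, the following is a minimal sketch of the Hedges’ g computation that CMA performs: Cohen’s d multiplied by a small-sample correction factor J. The post-test scores below are hypothetical, not drawn from the included studies.

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Hedges' g: Cohen's d with a small-sample correction factor J."""
    # Pooled standard deviation of the experimental and control groups
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                   # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)     # small-sample correction
    return j * d

# Hypothetical post-test TPACK scores (experimental vs. control group)
g = hedges_g(m1=4.1, sd1=0.6, n1=30, m2=3.6, sd2=0.7, n2=30)
```

For the small samples typical of the included studies, g is slightly smaller than the uncorrected d, which is why it is preferred here.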

4. Analysis of Results

4.1. Heterogeneity Test

Due to heterogeneity between studies, such as cultural background, sample size, intervention measures, and teaching methods, it is necessary to choose the model based on the test results. Heterogeneity refers to “the differences between all studies included in the same meta-analysis.” The purpose of the heterogeneity test, also called the statistical homogeneity test or consistency test, is to check the consistency of the results of each independent study. Combinability and heterogeneity testing include statistical and graphical methods. Commonly used heterogeneity indicators include the Q statistic, the H statistic, and the I² statistic. The Q statistic is affected by the number of included documents. The H and I² statistics correct the Q statistic for its degrees of freedom (the number of documents), do not change with the number of documents included, and give more stable and reliable results. The heterogeneity test here mainly refers to the I² value [93].
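The Q and I² statistics described above can be computed directly from the study effect sizes and their variances. A minimal sketch follows; the four input studies are illustrative values, not the actual data of this review.

```python
def heterogeneity(effects, variances):
    """Cochran's Q and the I^2 statistic for a set of study effect sizes."""
    w = [1 / v for v in variances]                     # inverse-variance weights
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - pooled) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    # I^2 expresses the share of variability beyond chance, as a percentage
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return q, i2

# Four hypothetical studies: effect sizes and their sampling variances
q, i2 = heterogeneity([0.2, 0.5, 0.9, 1.2], [0.04, 0.05, 0.04, 0.06])
```

Unlike Q, I² does not grow mechanically with the number of included studies, which is why the analysis relies mainly on the I² value.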
Heterogeneity was further tested by plotting a forest plot of the fixed-effects model for the 59 studies. The chi-square value is 1807.85, the degrees of freedom are 58, and the p value is less than 0.01, indicating obvious heterogeneity among the 59 studies, as illustrated by the forest plot. At the same time, the I² value is 92%, verifying the significant heterogeneity among the 59 studies, the same result presented by the funnel plot. Hence, further analysis of the sources of heterogeneity is required. Subgroup analysis, meta-regression, and sensitivity analysis were used to explore the sources of heterogeneity. After accounting for the obvious heterogeneity, a random-effects model was used for the meta-analysis.
This study used meta-regression analysis to explore sources of heterogeneity in the pooled effects. A meta-regression test with the measurement method as a covariate gave p = 0.000 < 0.05, I² = 93.8%, and R² = 0.47, indicating that the measurement method could explain 47% of the heterogeneity. The same test with the intervention type as a covariate gave p = 0.0086 < 0.05, I² = 95.61%, and R² = 0.26, indicating that the intervention measures can explain 26% of the heterogeneity. Together, these covariates explain a substantial share of the heterogeneity.

4.2. Evaluation of Publication Bias

Publication bias is caused by relying on the direction and strength of research findings when selecting papers for publication. Publication is selective to a certain degree, and studies with statistically significant positive results are easier or faster to publish. Commonly used detection methods include funnel plots, Egger’s test, Begg’s test, and the fail-safe N [94]. Therefore, to fully consider the impact of publication bias on the results and ensure the reliability of the meta-analysis, a funnel plot was used to evaluate publication bias.
The funnel plot mainly relies on visual observation to identify publication bias. It takes the effect size as the abscissa and the standard error as the ordinate, with the two slanted lines marking the 95% confidence interval. Effect sizes obtained from small samples are more dispersed and hence sit toward the bottom of the funnel plot, while those from large samples are less dispersed and sit toward the top under normal circumstances. The publication bias of the 59 experimental studies included in the meta-analysis was assessed using CMA software and graphed as shown in Figure 3. The funnel plot in Figure 3 is not symmetrical. This study then applied Egger’s test, which is slightly more powerful than Begg’s test and more sensitive to small samples. The results (t = 1.757, p_1, p_2 < 0.05) further indicated the existence of bias. However, the fail-safe N is 911, which is much larger than “N × 5 + 10”. Therefore, publication bias exists to a certain extent, but the result is still safe.
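The fail-safe N check above reduces to a single comparison against Rosenthal’s tolerance level of 5k + 10, where k is the number of included studies. A sketch with the figures reported here:

```python
def failsafe_tolerance(k):
    """Rosenthal's tolerance level: a pooled result is treated as robust to
    publication bias when the fail-safe N exceeds 5 * k + 10."""
    return 5 * k + 10

k = 59          # studies included in this meta-analysis
n_fs = 911      # fail-safe N computed by CMA
robust = n_fs > failsafe_tolerance(k)   # 911 > 305, so the result holds
```

In other words, more than 305 unpublished null-result studies would be needed to overturn the pooled effect, far more than the 911 bound allows for concern.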

4.3. Main Effect Test

There is heterogeneity among the initial studies, indicating that, in addition to random error, other factors lead to real differences between the effect sizes of the studies. Therefore, this study uses a random-effects model to analyze teacher education interventions (see Table 2). Taking the effect sizes of the 59 pieces of literature as the outcome variable, the combined effect size of teacher education interventions on teacher TPACK is 0.839. It can be seen that teacher education intervention has a significant positive effect on teacher TPACK.
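Random-effects pooling of the kind CMA performs is commonly implemented with the DerSimonian–Laird estimator: the between-study variance tau² is estimated from Q, and each study is then re-weighted by its total (within- plus between-study) variance. A minimal sketch with illustrative inputs, not the 59 actual effect sizes:

```python
def random_effects_pooled(effects, variances):
    """DerSimonian-Laird random-effects pooled effect size."""
    w = [1 / v for v in variances]
    sw = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sw
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    # Between-study variance tau^2, truncated at zero
    tau2 = max(0.0, (q - df) / (sw - sum(wi ** 2 for wi in w) / sw))
    # Re-weight each study by total (within + between) variance
    w_star = [1 / (v + tau2) for v in variances]
    return sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)

pooled = random_effects_pooled([0.2, 0.5, 0.9, 1.2], [0.04, 0.05, 0.04, 0.06])
```

When the studies are homogeneous, tau² is zero and the estimate coincides with the fixed-effects result; under heterogeneity, the random-effects weights pull the pooled value toward an unweighted average.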
Among the included literature, 22 studies reported statistically significant effects, and 13 had an effect size greater than 1. The forest plot drawn with the CMA software is shown in Figure 4.

4.4. Moderating Effect Test

The above heterogeneity test results show that the effect sizes of the individual studies vary to a statistically significant degree, so it is necessary to analyze the source of this variability through a moderating effect test. To understand the impact of teacher education interventions on teacher TPACK, this study uses cultural background, experimental participant, experimental type, sample type, intervention duration, measurement method, intervention type, and teaching environment as moderator variables to test the effect. The results showed that cultural background, experimental participants, experimental types, sample types, intervention durations, measurement methods, intervention types, and learning environments were the reasons for the differences in the effect sizes of each study.
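Each moderator test reported below rests on a between-group heterogeneity statistic Q_b, which compares the fixed-effect pooled estimate of each subgroup against the overall estimate. A simplified sketch with two hypothetical moderator levels (the labels and numbers are illustrative only):

```python
def q_between(groups):
    """Between-group Q for a subgroup (moderator) analysis.
    `groups` maps each moderator level to a list of (effect, variance) pairs."""
    def pooled(pairs):
        w = [1 / v for _, v in pairs]
        mean = sum(wi * e for wi, (e, _) in zip(w, pairs)) / sum(w)
        return mean, sum(w)

    all_pairs = [p for pairs in groups.values() for p in pairs]
    overall, _ = pooled(all_pairs)
    # Weighted squared deviation of each subgroup mean from the overall mean
    return sum(wg * (mg - overall) ** 2
               for mg, wg in (pooled(pairs) for pairs in groups.values()))

qb = q_between({
    "east": [(0.3, 0.04), (0.5, 0.04)],   # hypothetical subgroup 1
    "west": [(0.9, 0.05), (1.1, 0.05)],   # hypothetical subgroup 2
})
# qb is compared against a chi-square distribution with (#groups - 1) df
```

A Q_b exceeding the chi-square critical value (with number of groups minus one degrees of freedom) indicates that the moderator has a significant effect, which is the criterion applied to the Q values in Table 3.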

4.4.1. Moderating Effect Test of Different Cultural Backgrounds

Table 3 presents the combined effect size obtained by statistics to analyze the differences in the effect of teacher education intervention in different cultural backgrounds. The cultural background has no significant moderating effect on teacher education intervention (Q = 0.603, p > 0.05). From the perspective of different cultural backgrounds, the combined effect size of Eastern and Western cultures is 0.900 and 0.730, respectively. The p-values of both groups were less than 0.001, reaching a statistically significant level. The combined effect value of Western culture is between 0.5 and 0.8, indicating that under the background of Western culture, teacher education intervention has a moderate promotion effect on teacher TPACK. In contrast, the combined effect value of Eastern culture is greater than 0.8, implying that teachers’ educational intervention significantly affected TPACK. Under different cultural backgrounds, the TPACK level of teachers is different. However, the programs are all based on the TPACK model, and there is a great similarity in training goals and design that can significantly improve the TPACK of teachers in the region. Therefore, the moderating effect of cultural background is not significant. Eastern countries are more inclined to develop TPACK through a separate program. In contrast, most Western countries integrate it into the teacher education curriculum through the entire teacher education process [95]. Therefore, compared to short-term teacher education programs, courses that integrate information technology have a more profound impact on teacher TPACK development.

4.4.2. Moderating Effect Test of Different Experimental Participants

To analyze the differences in the intervention effect of teacher education on different participants, Table 3 presents the combined effect size obtained by statistics. The experimental participant has no significant moderating effect on the effect of teacher education intervention (Q = 3.355, p > 0.05). The combined effect size for pre-service teachers was 0.864, and that for mixed groups of pre-service and in-service teachers was 1.204. The p-values of both groups were less than 0.001, reaching a statistically significant level. The intervention has a significant positive effect for pre-service teachers, followed by mixed groups of pre-service and in-service teachers. The main reason for this result is that pre-service teachers are still developing their TPACK, so the effect of the intervention is significant. In contrast, the teaching methods of in-service teachers are relatively fixed and their TPACK level is stable, so the effect of teacher education intervention is not obvious. The number of studies on in-service teachers is small, so the impact on this group needs to be further explored [96].

4.4.3. Moderating Effect Test of Different Experimental Types

Table 3 presents the combined effect size obtained by statistics to analyze the differences in the effects of teacher education intervention under different experimental types. The experimental type significantly moderates teacher education intervention (Q = 5.696, p < 0.05). From the perspective of different experimental types, the combined effect sizes of the quasi-experiments and randomized experiments are 0.580 and 1.026, respectively. The p-values of both groups were less than 0.001, reaching the level of statistical significance. This shows significant differences in the effect of teacher education intervention under different experiment types. Research designs using randomized experiments have a positive effect size that is significantly higher than that of quasi-experiments. The size of the effect is highly correlated with the quality of the experimental design.

4.4.4. Moderating Effect Test for Different Sample Types

To analyze the differences in the effect of teacher education intervention under different sample types, Table 3 presents the combined effect size obtained by statistics. The moderating effect of sample type on the effect of teacher education intervention was insignificant (Q = 1.658, p > 0.05). The combined effect sizes of the small and large samples were 0.109 and 0.246, respectively. The p-values were all less than 0.001, reaching a statistically significant level. The sample type of the included studies has no significant moderating effect on the effect of teacher education intervention; therefore, there is no significant difference in moderating effects between small and large samples. The number of large-sample studies is small, a limitation of the available literature, and the impact needs to be further explored.

4.4.5. Moderating Effect Test of Different Intervention Durations

Table 3 presents the combined effect size obtained by statistics to analyze the differences in the effect of teacher education intervention under different intervention durations. The intervention durations had no significant moderating effect on the teacher education intervention (Q = 1.574, p > 0.05). The combined effect sizes of the 0–3 month and 3–6 month groups were both 0.177, that of the group of more than 6 months was 0.023, and the p-values of the three groups were all less than 0.05, reaching a statistically significant level. The effect of a teacher education intervention on teacher TPACK is small when the intervention duration is less than 6 months, while the combined effect value of the research with an intervention duration of more than 6 months is the highest among the subgroups. This may be because the formation of teacher TPACK is a long-term process: the intervention duration in the included studies is mostly within 6 months, so the impact is insignificant, and internal TPACK knowledge structures are not yet established when the trial time is too short, making a major impact problematic. The meta-analysis results showed that the longer the intervention duration, the more significant the effect of a teacher education intervention on TPACK [44]. However, a more detailed analysis could not be conducted due to the lack of studies with durations over 6 months [97].

4.4.6. Moderating Effect Test of Different Measurement Methods

Table 3 presents the combined effect sizes used to analyze differences in the effect of teacher education interventions across measurement methods. The measurement method significantly moderated the intervention effect (Q = 3.863, p < 0.05). Specifically, studies using standardized tests reported a greater intervention effect. The significance of standardized tests indicates that validated instruments can more accurately capture the relationship between teacher education interventions and teacher TPACK.

4.4.7. Moderating Effect Test of Different Intervention Types

Table 3 exhibits the combined effect sizes used to analyze differences in the effect of teacher education interventions across intervention types. The type of intervention has a significant moderating effect (Q = 8.936, p < 0.05). Among the intervention measures, method intervention has the best effect, followed by tool intervention, while technical intervention has the weakest effect. Pure technical intervention is not as effective as a systematic method, which also shows that the learning and application of TPACK should emphasize integrating the technology and information environment with “teaching and learning theories” and methods [98]. Technology interventions may be combined with other teaching strategies to facilitate teachers’ TPACK development [99]. In addition, because the number of studies involved is relatively small and the TPACK development process is complex, it cannot be concluded that method intervention alone is sufficient for teaching. Follow-up research therefore needs to pay continued attention to technology intervention in developing teachers’ TPACK.

4.4.8. Moderating Effect Test of Different Knowledge Types

To analyze differences in the effect of teacher education interventions across knowledge types, Table 3 presents the combined effect sizes. From the perspective of the between-group effect, Q = 2.270 and p = 0.321 > 0.05; therefore, there are no significant differences in the effect of a teacher education intervention on the learning of different knowledge types. The combined effect values of theoretical and practical knowledge were 1.013 and 0.586, respectively, and that of comprehensive knowledge was 0.155; all were significant at the 0.001 level (p < 0.001). Teacher education intervention thus has a significant positive effect on teachers’ TPACK, with a greater impact on the learning of theoretical knowledge and a smaller impact on practical knowledge. This is because the transformation from theory to practice is complex, and teacher education programs address theory more directly while providing insufficient guidance for practice. Even when teachers use technology in training and show evidence of TPACK, their pedagogical methods are not entirely consistent with those emphasized in the professional development process [100].

4.4.9. Moderating Effect Test of Different Teaching Environments

Table 3 presents the combined effect sizes used to analyze differences in the effect of teacher education interventions across teaching environments. With the development of technology, cooperative learning is no longer limited to offline classrooms; it can also be carried out in online and blended environments. The teaching environment did not significantly moderate the teacher education intervention (Q = 0.272, p > 0.05). Teacher education interventions can positively affect TPACK in all three environments, which is consistent with existing research conclusions [101]. The intervention in the blended learning environment had the highest effect size; blended and online teaching environments therefore have a greater impact on teachers’ TPACK [102]. This is because the development of TPACK requires a technology-rich environment: both presenting TPACK theoretical knowledge and applying TPACK in teaching practice rely on the support of technology. Therefore, online teaching is more effective than offline teaching, and blended teaching, which integrates online teaching, inherits its positive effect [33].

5. Discussion and Inspiration

5.1. Teacher Education Intervention Has a Significant Impact on Teacher TPACK

Using a meta-analysis approach, this study statistically analyzed 59 randomized or quasi-experimental studies on the effects of teacher education interventions on TPACK. The combined effect value of teacher education intervention on the overall outcome was 0.839, consistent with the conclusion of Lyublinskaya et al. [83]. The collected research supports the conclusion that teacher education interventions can significantly improve TPACK; however, their design and implementation should fully consider the intervention measures, knowledge types, and teaching environment. This result can be explained by the literature included in the analysis as follows: First, teacher education programs combine new technological knowledge with content and pedagogy, shifting from merely presenting existing theoretical knowledge and technological resources toward practice and application, thereby improving teachers’ TPACK level [103]. Second, with the development of information technology, teacher education has integrated online and offline teaching environments [104].
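For context, overall pooled values such as the reported d = 0.839 are conventionally obtained with a random-effects model. The sketch below shows one common choice, the DerSimonian-Laird estimator, applied to invented study data; the paper does not specify its exact estimator, so this is an assumption for illustration only:

```python
from math import fsum

def dersimonian_laird(effects, variances):
    """Random-effects pooled effect size via the DerSimonian-Laird estimator.

    `effects` are per-study standardized mean differences (Cohen's d) and
    `variances` are their sampling variances.
    """
    w = [1.0 / v for v in variances]                              # fixed-effect weights
    fixed = fsum(wi * d for wi, d in zip(w, effects)) / fsum(w)
    q = fsum(wi * (d - fixed) ** 2 for wi, d in zip(w, effects))  # Cochran's Q
    c = fsum(w) - fsum(wi * wi for wi in w) / fsum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)                 # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]                # random-effects weights
    pooled = fsum(wi * d for wi, d in zip(w_star, effects)) / fsum(w_star)
    return pooled, tau2

# Invented example: five hypothetical studies
d_pooled, tau2 = dersimonian_laird(
    [0.95, 0.70, 1.10, 0.60, 0.85],
    [0.04, 0.06, 0.05, 0.03, 0.05],
)
```

When the included studies are heterogeneous, as reported here, the estimated between-study variance tau2 inflates each study's weight denominator, so large studies dominate the pooled value less than under a fixed-effect model.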

5.2. The Effect of Moderator Variables on Teacher Education Intervention

This study also analyzed the reasons for differences among study results through moderation effect tests. The included moderator variables fall into two categories. The first comprises variables related to the research design: cultural background, experimental type, sample type, intervention duration, and measurement method. The second comprises variables related to the teaching intervention: experimental participants, intervention types, knowledge types, and teaching environments. The results are as follows:
The moderating effect of cultural background on teacher education intervention is insignificant, although the effect is somewhat better under an Eastern cultural background. Although existing literature reports significant differences in teachers’ TPACK ability across cultural backgrounds [105], the influence of cultural factors on the effect of teacher education interventions is rarely discussed. The most likely explanation is that there are also significant cultural and educational differences within Eastern and within Western countries, so a two-category moderation analysis cannot capture all of the variation across studies. The relationship between cultural background and the effect of teacher education interventions needs further clarification.
The type of experiment has a significant moderating effect on teacher education intervention, and the design using randomized experiments has a significant positive impact. The effect size of teacher education intervention depends on the outcome measurement method used and the rigor of the experimental design. It is highly correlated with the quality of the experimental design [106].
The moderating effect of sample type on the teacher education intervention is insignificant, and the combined effect values of both large and small samples are not high. This can be seen from the small effects of the two categories and from the p-value indicating the contribution of this moderator variable. The result should be interpreted cautiously, because effect sizes are susceptible to outliers when sample sizes are small, while results from larger studies are given more weight. This bias in the heterogeneity analysis may arise because too little literature was included, allowing outliers to interfere with the estimates for teacher education interventions and preventing reliable conclusions [107]. Since the number of studies varies across subgroups, the difference in effect size does not simply mean that large samples are better than small ones; the difference has various causes, such as variation in sample sizes, experimental participants, and interventions.
The intervention duration has no significant moderating effect on the teacher education intervention, although the longer the duration, the more significant the effect on TPACK. This finding suggests that duration may be a useful variable to consider in future research. Teacher TPACK training has a certain “life cycle,” because effective improvement must pass through a series of development stages: cognition, acceptance, adaptation, exploration, and improvement [108]. The intervention duration is therefore one factor determining the effectiveness of teacher education interventions. When the intervention is too short, teachers’ internal TPACK structure has not yet formed, and it is difficult to significantly affect their TPACK levels.
The measurement method significantly moderates the teacher education intervention, with standardized tests showing a greater influence. Studies using standardized tests found larger intervention effects on TPACK than studies using self-administered tests, which is consistent with existing research conclusions [109]. Because the number of studies using self-made tests is relatively large, their effect size is more likely to be affected by other factors. The effect size of a teacher education intervention depends on the rigor of the experimental design, which is highly correlated with how TPACK is measured.
The moderator variables related to the research design have been discussed above. Regarding the variables related to the teaching intervention, the results of the study are as follows:
The experimental participants had no significant moderating effect on the teacher education intervention; pre-service and in-service teacher groups benefited comparably from technology. Among all sample-related study characteristics, teacher education programs had the greatest impact on pre-service teachers’ TPACK. Existing empirical research shows that technology is more attractive to pre-service teachers: they are more willing to receive TPACK training and invest more effort in learning. For in-service teachers, by contrast, the TPACK level with respect to technical knowledge tends to be mature and stable [96]. A dedicated TPACK education program can therefore be used to improve the training effect for pre-service teachers.
The type of intervention has a significant moderating effect on the teacher education intervention. As the most direct measure, technical intervention has no significant effect, whereas method intervention is grounded in an analysis of the psychological mechanisms by which teachers’ TPACK forms. Technology and tools are key factors, but their current integration has not kept pace with the rapid changes in the quality and quantity of information technology [110]. Limited by teachers’ current cognitive level of TPACK, technical intervention still requires further theoretical research and teaching practice to be improved and developed.
In teacher education interventions, the effect on the learning of theoretical knowledge is greater and the effect on practical knowledge is slightly smaller. The teacher education program provides a platform for learning both kinds of knowledge; however, teachers are more receptive to theoretical knowledge and often fail to establish an organic connection between theory and practice. Some can use TPACK in their own learning, but for various reasons technical knowledge is rarely applied in real classroom teaching, so a gap remains between theoretical and practical teaching [111]. For this reason, TPACK teaching should be designed differently for different knowledge types. For theoretical classes, advance organizers providing background knowledge can be prepared beforehand, while important and difficult points are internalized and explained in class. For practical knowledge, basic operations can be arranged before class, so that class time focuses on deeper skills and targeted designs give full play to learners’ initiative and creative exploration.
From the perspective of the teaching environment, the effect size is greater than 0.8, indicating that the intervention had a significant positive impact on teacher TPACK. A blended teaching environment, which combines the advantages of online and offline learning, is notably more effective than purely online or traditional offline environments. The main reason is its enhanced situation creation and interaction [33].

6. Conclusions and Implications

6.1. Conclusions

Through meta-analysis, this study comprehensively reviewed 59 studies from the past ten years on the effect of teacher education interventions on teachers’ TPACK and objectively analyzed and evaluated the intervention effect, including the differences produced by different moderator variables. In conclusion, the comprehensive effect value of teacher education intervention on TPACK is 0.839; teacher education intervention therefore has a positive promoting effect. From the perspective of moderating effects, research designs using randomized experiments significantly affect the effect size, which is significantly higher than that of quasi-experiments. The longer the duration of the teaching intervention, the stronger the improvement in teachers’ TPACK. There are significant differences among teaching interventions in improving TPACK, and the effect is more obvious. Teacher education intervention has a greater impact on the learning of theoretical knowledge and a slightly smaller impact on practical knowledge. However, cultural background, experimental participant, sample type, and learning environment have no significant moderating effect. The meta-analysis results affirm the importance and positive role of teacher education interventions in developing TPACK. Because programs differ in intervention cycles, intervention strategies, and knowledge types, their effects differ; no region, institution, or school should require all education programs to adopt a unified TPACK model.

6.2. Limitations and Future Research

Regarding the limitations, the analyses of moderator variables based on a small number of studies should be interpreted cautiously and should not support strong inferences. First, some moderator variables involve few studies, leaving one of the subgroups with little research literature. Second, some moderator variables (e.g., sample type and intervention duration) may have been operationalized inappropriately: the dichotomous criteria for defining subgroups may be inaccurate, and such operationalization may increase the variability of the analysis and affect the results. Third, the heterogeneity in the effect sizes of teacher education interventions suggests that there may be other methodological and non-methodological moderators. Future research should identify the variables moderating the effects of teacher education interventions, which will improve our understanding of the relationship between teacher education interventions and their pre- and post-variables.
Moreover, although the meta-analysis revealed the effect of teacher education interventions on the development of TPACK, the specific mechanism of action and its practical application are not clearly described. Future research could survey schools, institutions, and the teachers involved to gain a comprehensive understanding of the interventions’ impact on TPACK development.

Author Contributions

Conceptualization, Y.N. and T.T.W.; methodology, Y.N.; validation, J.C. and Y.Z.; formal analysis, Y.N. and J.C.; writing—original draft preparation, Y.N.; writing—review and editing, Y.N., T.T.W., and J.C.; visualization, Y.N.; supervision, Y.Z.; funding acquisition, Y.N. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the 2021 Guangxi Postgraduate Education Innovation Plan Project “Research on Mathematical Higher-Order Thinking Evaluation Based on Cognitive Maps” (YCSW2021102) and “reform of degree and postgraduate education in Guangxi (JGY2022053)”.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

This research was funded by the postgraduate education innovation project in Guangxi (YJSCX2021103) and reform of degree and postgraduate education in Guangxi (JGY2022053), and thanks to all who participated in this research.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Mishra, P.; Koehler, M. Technological pedagogical content knowledge: A framework for teacher knowledge. Teach. Coll. Rec. 2006, 108, 1017–1054. [Google Scholar] [CrossRef]
  2. Angeli, C.; Valanides, N. Epistemological and methodological issues for the conceptualization, development, and assessment of ICT-TPCK: Advances in technological pedagogical content knowledge (TPCK). Comput. Educ. 2009, 52, 154–168. [Google Scholar] [CrossRef]
  3. Wijaya, T.T.; Jianlan, T.; Purnama, A. Developing an Interactive Mathematical Learning Media Based on the TPACK Framework Using the Hawgent Dynamic Mathematics Software. In Emerging Technologies in Computing; Springer: Cham, Switzerland, 2020; pp. 318–328. [Google Scholar] [CrossRef]
  4. Han, I.; Eom, M.; Shin, W.S. Multimedia case-based learning to enhance pre-service teachers’ knowledge integration for teaching with technologies. Teach. Teach. Educ. 2013, 34, 122–129. [Google Scholar] [CrossRef]
  5. Cengiz, C. The development of TPACK, Technology Integrated Self-Efficacy and Instructional Technology Outcome Expectations of pre-service physical education teachers. Asia-Pac. J. Teach. Educ. 2015, 43, 411–422. [Google Scholar] [CrossRef]
  6. Koh, J.H.L.; Chai, C.S. Teacher clusters and their perceptions of technological pedagogical content knowledge (TPACK) development through ICT lesson design. Comput. Educ. 2014, 70, 222–232. [Google Scholar] [CrossRef]
  7. Koh, J.H.L. TPACK design scaffolds for supporting teacher pedagogical change. EtrD-Educ. Technol. Res. Dev. 2019, 67, 577–595. [Google Scholar] [CrossRef]
  8. Shulman, L.S. Those who understand: Knowledge growth in teaching. Educ. Res. 1986, 15, 4–14. [Google Scholar] [CrossRef]
  9. Pierson, M.E. Technology integration practice as a function of pedagogical expertise. J. Res. Comput. Educ. 2001, 33, 413–430. [Google Scholar] [CrossRef]
  10. Niess, M.L. Preparing teachers to teach science and mathematics with technology: Developing a technology pedagogical content knowledge. Teach. Teach. Educ. 2005, 21, 509–523. [Google Scholar] [CrossRef]
  11. Thompson, A.D.; Mishra, P. Editors’ remarks: Breaking news: TPCK becomes TPACK! J. Comput. Teach. Educ. 2007, 24, 38–64. [Google Scholar]
  12. Koehler, M.; Mishra, P. What is technological pedagogical content knowledge (TPACK)? Contemp. Issues Technol. Teach. Educ. 2009, 9, 60–70. [Google Scholar] [CrossRef]
  13. Cox, S.; Graham, C.R. Using an elaborated model of the TPACK framework to analyze and depict teacher knowledge. TechTrends 2009, 53, 60–69. [Google Scholar]
  14. Doering, A.; Veletsianos, G.; Scharber, C.; Miller, C. Using the technological, pedagogical, and content knowledge framework to design online learning environments and professional development. J. Educ. Comput. Res. 2009, 41, 319–346. [Google Scholar] [CrossRef]
  15. Shulman, L. Knowledge and teaching: Foundations of the new reform. Harv. Educ. Rev. 1987, 57, 1–23. [Google Scholar] [CrossRef]
  16. Harris, J.; Hofer, M. Instructional planning activity types as vehicles for curriculum-based TPACK development. In Society for Information Technology & Teacher Education International Conference; Association for the Advancement of Computing in Education (AACE): Chesapeake, VA, USA, 2009; pp. 4087–4095. [Google Scholar]
  17. Koh, J.H.L.; Chai, C.S.; Tay, L.Y. TPACK-in-Action: Unpacking the contextual influences of teachers’ construction of technological pedagogical content knowledge (TPACK). Comput. Educ. 2014, 78, 20–29. [Google Scholar] [CrossRef]
  18. Mishra, P.; Koehler, M.J. Introducing technological pedagogical content knowledge. In Proceedings of the Annual Meeting of the American Educational Research Association, New York, NY, USA, 24–28 March 2008; Volume 1, p. 16. [Google Scholar]
  19. Gronseth, S.; Brush, T.; Ottenbreit-Leftwich, A.; Strycker, J.; Abaci, S.; Easterling, W.; Roman, T.; Shin, S.; van Leusen, P. Equipping the next generation of teachers: Technology preparation and practice. J. Digit. Learn. Teach. Educ. 2010, 27, 30–36. [Google Scholar] [CrossRef]
  20. Chai, C.S.; Koh, J.H.L.; Tsai, C.-C. A Review of Technological Pedagogical Content Knowledge. Educ. Technol. Soc. 2013, 16, 31–51. [Google Scholar]
  21. Harris, J.B. In-service teachers’ TPACK development: Trends, models, and trajectories. In Handbook of Technological Pedagogical Content Knowledge (TPACK) for Educators; Routledge: London, UK, 2016; pp. 201–216. [Google Scholar]
  22. Koehler, M.J.; Mishra, P.; Yahya, K. Tracing the development of teacher knowledge in a design seminar: Integrating content, pedagogy and technology. Comput. Educ. 2007, 49, 740–762. [Google Scholar] [CrossRef]
  23. Chai, C.S.; Koh, J.H.L. Changing teachers’ TPACK and design beliefs through the Scaffolded TPACK Lesson Design Model (STLDM). Learn. Res. Pract. 2017, 3, 114–129. [Google Scholar] [CrossRef]
  24. Chen, Y.H.; Jang, S.J.; Chen, P.J. Using wikis and collaborative learning for science teachers’ professional development. J. Comput. Assist. Learn. 2015, 31, 330–344. [Google Scholar] [CrossRef]
  25. Tee, M.Y.; Lee, S.S. From socialisation to internalisation: Cultivating technological pedagogical content knowledge through problem-based learning. Australas. J. Educ. Technol. 2011, 27, 89–104. [Google Scholar] [CrossRef]
  26. Mouza, C.; Karchmer-Klein, R. Promoting and assessing pre-service teachers’ technological pedagogical content knowledge (TPACK) in the context of case development. J. Educ. Comput. Res. 2013, 48, 127–152. [Google Scholar] [CrossRef]
  27. Foster, A. Assessing learning games for school content: The TPACK-PCaRD framework and methodology. In Assessment in Game-Based Learning; Springer: Berlin/Heidelberg, Germany, 2012; pp. 201–215. [Google Scholar] [CrossRef]
  28. Jang, S.-J. Integrating the interactive whiteboard and peer coaching to develop the TPACK of secondary science teachers. Comput. Educ. 2010, 55, 1744–1751. [Google Scholar] [CrossRef]
  29. Lee, C.-J.; Kim, C. An implementation study of a TPACK-based instructional design model in a technology integration course. EtrD-Educ. Technol. Res. Dev. 2014, 62, 437–460. [Google Scholar] [CrossRef]
  30. Zhan, Y.; Quan, J.; Ren, Y. An empirical study on the technological pedagogical content knowledge development of pre-service mathematics teachers in China. Int. J. Soc. Media Interact. Learn. Environ. 2013, 1, 199–212. [Google Scholar] [CrossRef]
  31. Niess, M.L.; van Zee, E.H.; Gillow-Wiles, H. Knowledge growth in teaching mathematics/science with spreadsheets: Moving PCK to TPACK through online professional development. J. Digit. Learn. Teach. Educ. 2010, 27, 42–52. [Google Scholar] [CrossRef]
  32. Graham, C.; Cox, S.; Velasquez, A. Teaching and measuring TPACK development in two preservice teacher preparation programs. In Society for Information Technology & Teacher Education International Conference; Association for the Advancement of Computing in Education (AACE): Chesapeake, VA, USA, 2009; pp. 4081–4086. [Google Scholar]
  33. Qasem, A.A.A.; Viswanathappa, G. Blended learning approach to develop the teachers’ TPACK. Contemp. Educ. Technol. 2016, 7, 264–276. [Google Scholar] [CrossRef]
  34. Jang, S.J.; Tsai, M.F. Exploring the TPACK of Taiwanese elementary mathematics and science teachers with respect to use of interactive whiteboards. Comput. Educ. 2012, 59, 327–338. [Google Scholar] [CrossRef]
  35. Kim, S.W.; Youngjun, L. Development of TPACK-P Education Program for Improving Technological Pedagogical Content Knowledge of Pre-service Teachers. J. Korea Soc. Comput. Inf. 2017, 22, 141–152. [Google Scholar] [CrossRef]
  36. Tournaki, N.; Lyublinskaya, I. Preparing special education teachers for teaching mathematics and science with technology by integrating TPACK framework into the curriculum: A study of teachers’ perceptions. J. Technol. Teach. Educ. 2014, 22, 243–259. [Google Scholar]
  37. Wollmann, K.; Lange-Schubert, K. The Development of Prospective Primary School Science Teachers’ TPaCK Fostered by Innovative Science-Teacher Education. Educ. Sci. 2022, 12, 381. [Google Scholar] [CrossRef]
  38. Saltan, F. Online Case-Based Learning Design for Facilitating Classroom Teachers’ Development of Technological, Pedagogical, and Content Knowledge. Eur. J. Contemp. Educ. 2017, 6, 308–316. [Google Scholar]
  39. Kim, S.-W.; Youngjun, L. Effects of Programming-Based TPACK Education Program on TPACK of Pre-service Teachers. J. Korean Assoc. Comput. Educ. 2020, 23, 79–89. [Google Scholar]
  40. Cheung, A.C.K.; Slavin, R.E. The effectiveness of educational technology applications for enhancing mathematics achievement in K-12 classrooms: A meta-analysis. Educ. Res. Rev. 2013, 9, 88–113. [Google Scholar] [CrossRef]
  41. Hofstede, G. Cultural dimensions in management and planning. Asia Pac. J. Manag. 1984, 1, 81–99. [Google Scholar] [CrossRef]
  42. Hofstede, G.; Minkov, M. Long-versus short-term orientation: New perspectives. Asia Pac. Bus. Rev. 2010, 16, 493–504. [Google Scholar] [CrossRef]
  43. Schmidt, D.A.; Baran, E.; Thompson, A.D.; Mishra, P.; Koehler, M.J.; Shin, T.S. Technological pedagogical content knowledge (TPACK) the development and validation of an assessment instrument for preservice teachers. J. Res. Technol. Educ. 2009, 42, 123–149. [Google Scholar] [CrossRef]
  44. Koehler, M.J.; Mishra, P.; Kereluik, K.; Shin, T.S.; Graham, C.R. The technological pedagogical content knowledge framework. In Handbook of Research on Educational Communications and Technology; Springer: Berlin/Heidelberg, Germany, 2014; pp. 101–111. [Google Scholar] [CrossRef]
  45. McGarr, O.; McDonagh, A. Digital Competence in Teacher Education; University of Limerick: Limerick, Ireland, 2019. [Google Scholar]
  46. Kim, S.W.; Lee, Y. Development and Application of the TPACK-P Education Program for Pre-Service Teachers’ TPACK. Int. J. Eng. Technol. 2018, 7, 636–643. [Google Scholar]
  47. Jung, L.E.; Ji Young, Y. Exploring the effects and experiential significance of technology-based capstone design class on technology pedagogical and content knowledge(TPACK) of pre-service early childhood teachers. J. Learn.-Cent. Curric. Instr. 2020, 20, 525–546. [Google Scholar] [CrossRef]
  48. Cabero, J.; Barroso, J. ICT teacher training: A view of the TPACK model/Formación del profesorado en TIC: Una visión del modelo TPACK. Cult. Educ. 2016, 28, 633–663. [Google Scholar] [CrossRef]
  49. Meng, C.C.; Sam, L.C. Developing Pre-Service Teachers’ Technological Pedagogical Content Knowledge for Teaching Mathematics with the Geometer’s Sketchpad through Lesson Study. J. Educ. Learn. 2013, 2, 1–8. [Google Scholar] [CrossRef]
  50. Chai, C.S.; Koh, J.H.L.; Tsai, C.-C. Facilitating Preservice Teachers’ Development of Technological, Pedagogical, and Content Knowledge (TPACK). Educ. Technol. Soc. 2010, 13, 63–73. [Google Scholar]
  51. Alayyar, G.M.; Fisser, P.; Voogt, J. Developing technological pedagogical content knowledge in pre-service science teachers: Support from blended learning. Australas. J. Educ. Technol. 2012, 28. [Google Scholar] [CrossRef]
  52. Valtonen, T.; Sointu, E.; Kukkonen, J.; Makitalo, K.; Hoang, N.; Hakkinen, P.; Jarvela, S.; Naykki, P.; Virtanen, A.; Pontinen, S.; et al. Examining pre-service teachers’ Technological Pedagogical Content Knowledge as evolving knowledge domains: A longitudinal approach. J. Comput. Assist. Learn. 2019, 35, 491–502. [Google Scholar] [CrossRef]
  53. Chai, C.S.; Koh, J.H.L.; Tsai, C.C.; Tan, L.L.W. Modeling primary school pre-service teachers’ Technological Pedagogical Content Knowledge (TPACK) for meaningful learning with information and communication technology (ICT). Comput. Educ. 2011, 57, 1184–1193. [Google Scholar] [CrossRef]
  54. Hall, J.A.; Lei, J.; Wang, Q. The first principles of instruction: An examination of their impact on preservice teachers’ TPACK. Educ. Technol. Res. Dev. 2020, 68, 3115–3142. [Google Scholar] [CrossRef]
  55. Sug Shin, W.; In Sook, H.; Mi Ri, E. Influence of Technology Integration Course on Preservice Teachers’ Technological Pedagogical and Content Knowledge (TPACK). J. Korean Assoc. Inf. Educ. 2012, 16, 71–80. [Google Scholar]
  56. Choi, K.; Hye, B. Effect of classes considering TPACK developmental stage on self-efficacy and developmental level of prospective teachers. J. Learn.-Cent. Curric. Instr. 2020, 20, 1371–1391. [Google Scholar] [CrossRef]
  57. Choi, Y.; Hong, S.-H. Designing and implementing integrated lessons for pre-service elementary teachers’ technological pedagogical content knowledge development. J. Korean Elem. Sci. Educ. 2019, 38, 287–304. [Google Scholar] [CrossRef]
  58. Adipat, S. Developing Technological Pedagogical Content Knowledge (TPACK) through Technology-Enhanced Content and Language-Integrated Learning (T-CLIL) Instruction. Educ. Inf. Technol. 2021, 26, 6461–6477. [Google Scholar] [CrossRef]
  59. Choi, Y.; Hong, S.H. The Effects of an Integrated Science Content-Based Course on the TPACK of Elementary Pre-Service Teachers according to Content Domain Choice in Learning Activity Type Planning. Biol. Educ. 2021, 49, 362–379. [Google Scholar] [CrossRef]
  60. Chai, C.S.; Rahmawati, Y.; Jong, M.S.-Y. Indonesian Science, Mathematics, and Engineering Preservice Teachers’ Experiences in STEM-TPACK Design-Based Learning. Sustainability 2020, 12, 9050. [Google Scholar] [CrossRef]
  61. Banas, J.R.; York, C.S. Authentic learning exercises as a means to influence preservice teachers’ technology integration self-efficacy and intentions to integrate technology. Australas. J. Educ. Technol. 2014, 30, 728–746. [Google Scholar] [CrossRef]
  62. Cetin-Dindar, A.; Boz, Y.; Sonmez, D.Y.; Celep, N.D. Development of pre-service chemistry teachers’ technological pedagogical content knowledge. Chem. Educ. Res. Pract. 2018, 19, 167–183. [Google Scholar] [CrossRef]
  63. Agyei, D.D.; Voogt, J.M. Pre-service teachers’ TPACK competencies for spreadsheet integration: Insights from a mathematics-specific instructional technology course. Technol. Pedagog. Educ. 2015, 24, 605–625. [Google Scholar] [CrossRef]
  64. Chang, C.Y.; Chien, Y.T.; Chang, Y.H.; Lin, C.Y. MAGDAIRE: A model to foster pre-service teachers’ ability in integrating ICT and teaching in Taiwan. Australas. J. Educ. Technol. 2012, 28, 983–999. [Google Scholar] [CrossRef]
  65. Park, Y.; Jihae, S. An Analysis of Preservice Music Teachers’ Perception on Technological Pedagogical and Content Knowledge (TPACK) Capacity. J. Res. Curric. Instr. 2021, 25, 326–337. [Google Scholar] [CrossRef]
  66. Young-Joo, J.; Hyuk, S.; Yoon, S. A Study on Technology Content Teaching Knowledge (TPACK) and Teacher Efficacy of Preservice Teachers for Online Korean Teacher Training Program. Korean Educ. 2014, 145, 379–404. [Google Scholar]
  67. Choe, H.; Tae Wuk, L. Implementation and Analysis about Technology Knowledge Education Program for Pre-service Teacher based on the TPACK Model. J. Korea Soc. Comput. Inf. 2015, 20, 231–239. [Google Scholar] [CrossRef]
  68. Lachner, A.; Fabian, A.; Franke, U.; Preiss, J.; Jacob, L.; Fuehrer, C.; Kuechler, U.; Paravicini, W.; Randler, C.; Thomas, P. Fostering pre-service teachers’ technological pedagogical content knowledge (TPACK): A quasi-experimental field study. Comput. Educ. 2021, 174, 104304. [Google Scholar] [CrossRef]
  69. Zimmermann, F.; Melle, I.; Huwer, J. Developing Prospective Chemistry Teachers’ TPACK-A Comparison between Students of Two Different Universities and Expertise Levels Regarding Their TPACK Self-Efficacy, Attitude, and Lesson Planning Competence. J. Chem. Educ. 2021, 98, 1863–1874. [Google Scholar] [CrossRef]
  70. Kaplon-Schilis, A. Development and Transfer of Technological Pedagogical Content Knowledge (TPACK) of Special Education Teachers; City University of New York: New York, NY, USA, 2018. [Google Scholar]
  71. Cheng, P.H.; Molina, J.; Lin, M.C.; Liu, H.H.; Chang, C.Y. A New TPACK Training Model for Tackling the Ongoing Challenges of COVID-19. Appl. Syst. Innov. 2022, 5, 32. [Google Scholar] [CrossRef]
  72. Izgi Onbasili, U.; Avsar Tuncay, A.; Sezginsoy Seker, B.; Kiray, S.A. An Examination Of Pre-Service Teachers’ Experiences In Creating A Scientific Digital Story In The Context Of Their Self Confidence In Technological Pedagogical Content Knowledge. J. Balt. Sci. Educ. 2022, 21, 207. [Google Scholar] [CrossRef]
  73. Zhigao, Z.; Liguo, Z. Practical Research of Pre-service Teachers’ TPACK Development Based on Design-based Learning. China Electron. Educ. 2019, 389, 86–94. [Google Scholar]
  74. Liang, Y. An Empirical Study on TPACK Ability Cultivation of Normal Interns Based on Lesson Study. Master’s Thesis, Nanchang University, Nanchang, China, 2021. [Google Scholar]
  75. Xile, C. Curriculum Construction and Implementation Based on TPACK Development for Pre-Service Biology Teachers. Master’s Thesis, East China Normal University, Shanghai, China, 2017. [Google Scholar]
  76. Mingrui, Z. Research on Supporting Strategies of Learner Collaborative Design in Teacher Education MOOC Oriented by TPACK Development. Master’s Thesis, Ludong University, Yantai, China, 2021. [Google Scholar]
  77. Thephavongsa, S. CDIO-based Training Model for Teachers’ TPACK and Its Application. Ph.D. Thesis, Huazhong Normal University, Wuhan, China, 2019. [Google Scholar]
  78. Yonghan, L. Research on TPACK Development of Pre-Service Teachers Driven by Project Learning—Taking Microteaching of Information Technology as an Example. Master’s Thesis, Sichuan Normal University, Chengdu, China, 2021. [Google Scholar]
  79. Cunliang, L.; Minjie, D. The Design Research on University Teachers’ Modern Educational Technology Training Based on TPACK. Mod. Educ. Technol. 2015, 25, 45–51. [Google Scholar]
  80. Chunli, W. Research on Training Information Instructional Design Ability for Pre-service Teachers in the Perspective of TPACK. Master’s Thesis, Henan Normal University, Xinxiang, China, 2012. [Google Scholar]
  81. Ansari, I. The 5Es inquiry-based Lesson plan activities and the Preservice Science Teachers’ Technological Pedagogical Content Knowledge (TPACK) Development. J. Res. Method Educ. 2019, 9, 58–64. [Google Scholar] [CrossRef]
  82. Bhagat, K.K.; Chang, C.-Y.; Huang, R. Integrating GeoGebra with TPACK in improving pre-service mathematics teachers’ professional development. In Proceedings of the 2017 IEEE 17th International Conference on Advanced Learning Technologies (ICALT), Timisoara, Romania, 3–7 July 2017; pp. 313–314. [Google Scholar]
  83. Lyublinskaya, I.; Tournaki, N. A study of special education teachers’ TPACK development in mathematics and science through assessment of lesson plans. J. Technol. Teach. Educ. 2014, 22, 449–470. [Google Scholar]
  84. Moon, J.; Lee, S.; Xu, X. Exploring pre-service teachers’ technology-integration belief and scientific inquiry in a teacher-development course. Int. J. Technol. Des. Educ. 2022, 32, 1777–1798. [Google Scholar] [CrossRef]
  85. Wu, B.; Peng, X.; Hu, Y. How to foster pre-service teachers’ STEM learning design expertise through virtual internship: A design-based research. Educ. Technol. Res. Dev. 2021, 69, 3307–3329. [Google Scholar] [CrossRef]
  86. Meroño, L.; Calderón, A.; Arias-Estero, J.L. Pedagogía digital y aprendizaje cooperativo: Efecto sobre los conocimientos tecnológicos y pedagógicos del contenido y el rendimiento académico en formación inicial docente. Rev. Psicodidáctica 2021, 26, 53–61. [Google Scholar] [CrossRef]
  87. Sointu, E.; Valtonen, T.; Kukkonen, J.; Kärkkäinen, S.; Koskela, T.; Pöntinen, S.; Rosenius, P.; Mäkitalo-Siegl, K. Quasi-experimental study for enhancing pre-service teachers’ TPACK. In Society for Information Technology & Teacher Education International Conference; Association for the Advancement of Computing in Education (AACE): Chesapeake, VA, USA, 2016; pp. 3067–3074. [Google Scholar]
  88. Horzum, M.B. An investigation of the technological pedagogical content knowledge of pre-service teachers. Technol. Pedagog. Educ. 2013, 22, 303–317. [Google Scholar] [CrossRef]
  89. Kartal, T.; Dilek, I. Preservice Science teachers’ TPACK development in a technology-enhanced Science teaching method course. J. Educ. Sci. Environ. Health 2021, 7, 339–353. [Google Scholar] [CrossRef]
  90. Nordin, H.; Davis, N.; Ariffin, T.F.T. A case study of secondary pre-service teachers’ technological pedagogical and content knowledge mastery level. Procedia-Soc. Behav. Sci. 2013, 103, 1–9. [Google Scholar] [CrossRef]
  91. Lehiste, P. The impact of a professional development program on in-service teachers’ TPACK: A study from Estonia. Probl. Educ. 21st Century 2015, 66, 18. [Google Scholar] [CrossRef]
  92. Cooper, H.; Hedges, L.V.; Valentine, J.C. The Handbook of Research Synthesis and Meta-Analysis; Russell Sage Foundation: New York, NY, USA, 2019. [Google Scholar]
  93. Wijaya, T.T.; Cao, Y.; Weinhandl, R.; Tamur, M. A Meta-Analysis of the Effects of E-Books on students’ mathematics achievement. Heliyon 2022, 8, e09432. [Google Scholar] [CrossRef]
  94. Juandi, D.; Kusumah, Y.S.; Tamur, M.; Perbowo, K.S.; Wijaya, T.T. A meta-analysis of Geogebra software decade of assisted mathematics learning: What to learn and where to go? Heliyon 2021, 7, e06953. [Google Scholar] [CrossRef]
  95. Farjon, D.; Smits, A.; Voogt, J. Technology integration of pre-service teachers explained by attitudes and beliefs, competency, access, and experience. Comput. Educ. 2019, 130, 81–93. [Google Scholar] [CrossRef]
  96. Dong, Y.; Chai, C.S.; Sang, G.-Y.; Koh, J.H.L.; Tsai, C.-C. Exploring the Profiles and Interplays of Pre-service and In-service Teachers’ Technological Pedagogical Content Knowledge (TPACK) in China. Educ. Technol. Soc. 2015, 18, 158–169. [Google Scholar]
  97. Doering, A.; Hughes, J.; Huffman, D. Preservice teachers: Are we thinking with technology? J. Res. Technol. Educ. 2003, 35, 342–361. [Google Scholar] [CrossRef]
  98. Koehler, M.J.; Mishra, P. What happens when teachers design educational technology? The development of technological pedagogical content knowledge. J. Educ. Comput. Res. 2005, 32, 131–152. [Google Scholar] [CrossRef]
  99. Koh, J.H.; Divaharan, H. Developing pre-service teachers’ technology integration expertise through the TPACK-developing instructional model. J. Educ. Comput. Res. 2011, 44, 35–58. [Google Scholar] [CrossRef] [Green Version]
  100. Polly, D. Developing teachers’ technological, pedagogical, and content knowledge (TPACK) through mathematics professional development. Int. J. Technol. Math. Educ. 2011, 18, 83–96. [Google Scholar]
  101. Anderson, A.; Barham, N.; Northcote, M. Using the TPACK framework to unite disciplines in online learning. Australas. J. Educ. Technol. 2013, 29, 549–565. [Google Scholar] [CrossRef]
  102. Archambault, L.; Kennedy, K. Teacher preparation for K-12 online and blended learning. In Handbook of Research on K-12 Online Blended Learning; ETC Press: Dartmouth, NS, Canada, 2014; pp. 225–244. [Google Scholar]
  103. Ertmer, P.A.; Stepich, D.A.; Flanagan, S.; Kocaman-Karoglu, A.; Reiner, C.; Reyes, L.; Santone, A.L. Impact of guidance on the problem-solving efforts of instructional design novices. Perform. Improv. Q. 2009, 21, 117–132. [Google Scholar] [CrossRef]
  104. Dooly, M.; Sadler, R. Filling in the gaps: Linking theory and practice through telecollaboration in teacher education. Recall 2013, 25, 4–29. [Google Scholar] [CrossRef]
  105. Wong, E.M.; Li, S.S.; Choi, T.-H.; Lee, T.-N. Insights into innovative classroom practices with ICT: Identifying the impetus for change. J. Educ. Technol. Soc. 2008, 11, 248–265. [Google Scholar]
  106. Tobler, N.S. Meta-analysis of 143 adolescent drug prevention programs: Quantitative outcome results of program participants compared to a control or comparison group. J. Drug Issues 1986, 16, 537–567. [Google Scholar] [CrossRef]
  107. Higgins, J.; Thompson, S.; Deeks, J.; Altman, D. Statistical heterogeneity in systematic reviews of clinical trials: A critical appraisal of guidelines and practice. J. Health Serv. Res. Policy 2002, 7, 51–61. [Google Scholar] [CrossRef]
  108. Niess, M.L.; Ronau, R.N.; Shafer, K.G.; Driskell, S.O.; Harper, S.R.; Johnston, C.; Browning, C. Mathematics teacher TPACK standards and development model. Contemp. Issues Technol. Teach. Educ. 2009, 9, 4–24. [Google Scholar]
  109. Li, Q.; Ma, X. A Meta-analysis of the Effects of Computer Technology on School Students’ Mathematics Learning. Educ. Psychol. Rev. 2010, 22, 215–243. [Google Scholar] [CrossRef]
  110. Scherer, R.; Tondeur, J.; Siddiq, F.; Baran, E. The importance of attitudes toward technology for pre-service teachers’ technological, pedagogical, and content knowledge: Comparing structural equation modeling approaches. Comput. Hum. Behav. 2018, 80, 67–80. [Google Scholar] [CrossRef] [Green Version]
  111. Liu, S.-H.; Tsai, H.-C.; Huang, Y.-T. Collaborative Professional Development of Mentor Teachers and Pre-Service Teachers in Relation to Technology Integration. Educ. Technol. Soc. 2015, 18, 161–172. [Google Scholar]
Figure 1. TPACK framework and its components.
Figure 2. Literature screening process.
Figure 3. Funnel plot for publication bias assessment.
Figure 4. Forest plot. Note: Mingrui Zhang (2021) [79]. Ghaida M. Alayyar et al. (2012) [52]. Lourdes Meroño et al. (2021) [89]. Chew Cheng Meng et al. (2013) [50]. Kyungsik Choi et al. (2019) [58]. Julio Cabero et al. (2016) [48]. Chun-Yen Chang et al. (2012) [67]. Ching Sing Chai et al. (2017) [23]. Joyce Hwee Ling Koh et al. (2014) [17]. Dr. Imran Ansari (2019) [84]. Ching Sing Chai et al. (2020) [63]. Joyce Hwee Ling Koh et al. (2014) [17]. Franziska Zimmermann et al. (2021) [72]. Insook Han et al. (2013) [4]. Teemu Valtonen et al. (2019) [53]. Tezcan Kartal et al. (2021) [92]. Choi Young-mi et al. (2021) [59]. Ping-Han Cheng et al. (2022) [74]. Xile Chen (2017) [78]. Chunli Wang (2012) [83]. Douglas D. Agyei et al. (2015) [66]. Choi Young-mi et al. (2019) [59]. Irina Lyublinskaya et al. (2014) [86]. Yonghan Li (2021) [81]. Joyce Hwee Ling Koh et al. (2014) [17]. Piret Lehiste (2015) [94]. Meng Yew Tee et al. (2011) [50]. Jennifer R. Banas et al. (2014) [64]. Ching Sing Chai et al. (2010) [51]. Park Ye-Rang et al. (2021) [68]. Hasniza Nordin et al. (2013) [93]. Seong-Won Kim et al. (2016) [39]. Hyun-Jong Choi et al. (2015) [70]. Karl Wollmann et al. (2022) [37]. Kaushal Kumar Bhagat et al. (2017) [85]. Bian Wu et al. (2021) [88]. Umit Izgi-Onbasili et al. (2022) [75]. Sug Shin Won et al. (2012) [57]. Ching Sing Chai et al. (2011) [54]. Cevdet Cengiz (2014) [5]. Andreas Lachner et al. (2021) [71]. Seong-Won Kim et al. (2017) [35]. Joo Young-Joo et al. (2012) [69]. Zhigao Zheng et al. (2019) [76]. Erkko Sointu et al. (2016) [90]. Ayla Cetin-Dindar et al. (2017) [65]. Jewoong Moon et al. (2022) [87]. Cunliang Liang (2015) [82]. Liang Yao (2021) [77]. Surattana Adipat (2021) [61]. Jacob A. Hall et al. (2019) [56]. Fatih Saltan (2017) [38]. Mehmet Barış Horzum (2013) [91]. Aleksandra Kaplon-Schilis (2018) [73]. Joyce Hwee Ling Koh (2018) [49]. Syh-Jong Jang et al. (2012) [60]. Thephavongsa (2019) [80]. Eunjung Lee et al. (2019) [47].
Table 1. Literature coding results.
| Author (Year) | Cultural Background | Experimental Participant | Experiment Type | Sample Type | Intervention Duration | Measurement Method | Intervention Type | Knowledge Type | Teaching Environment | Intervention Outcome |
| Seong-Won Kim et al. (2018) [46] | E | P | R | S | 3–6 months | SD | M | In | Off | Improve |
| Eunjung Lee et al. (2019) [47] | E | P | Q | S | 3–6 months | SM | M | Th | Off | Improve |
| Karl Wollmann et al. (2022) [37] | W | P | R | L | 3–6 months | SD | M | Th | Off | Improve |
| Julio Cabero et al. (2016) [48] | E | B | Q | L | 3–6 months | SM | M | Th | Off | Improve |
| Joyce Hwee Ling Koh (2018) [7] | E | I | Q | S | 3–6 months | SD | M | Th | On | Improve |
| Seong-Won Kim et al. (2017) [35] | E | P | R | S | 3–6 months | SD | M | In | Off | No significant difference |
| Cevdet Cengiz (2014) [5] | W | P | Q | S | 3–6 months | SM | T | Th | Off | Improve |
| Meng Yew Tee et al. (2011) [49] | E | I | Q | S | 3–6 months | SM | M | Pr | Off | Improve |
| Ching Sing Chai et al. (2010) [50] | E | P | R | L | 0–3 months | SD | M | Th | Off | Improve |
| Insook Han et al. (2013) [4] | E | P | Q | S | 0–3 months | SM | T | In | Off | Improve |
| Ghaida M. Alayyar et al. (2012) [51] | W | P | Q | S | 3–6 months | SM | T | In | Mix | Improve |
| Seong-Won Kim et al. (2016) [39] | E | P | R | S | >6 months | SD | M | Th | Off | Improve |
| Teemu Valtonen et al. (2019) [52] | W | P | Q | S | 3–6 months | SM | M | In | Off | Improve |
| Ching Sing Chai et al. (2011) [53] | E | P | R | L | >6 months | SD | M | Th | Off | Improve |
| Joyce Hwee Ling Koh et al. (2017) [6] | E | I | Q | S | 3–6 months | SM | M | Th | Off | Improve |
| Jacob A. Hall et al. (2019) [54] | W | P | R | S | 3–6 months | SD | M | Th | Mix | Improve |
| Sug Shin Won et al. (2012) [55] | E | P | R | S | 0–3 months | SD | M | Th | Off | Improve |
| Kyungsik Choi et al. (2019) [56] | E | P | Q | S | 3–6 months | SM | M | Th | Off | Improve |
| Choi Young-mi et al. (2019) [57] | E | P | Q | S | 0–3 months | SM | M | In | Off | Improve |
| Syh-Jong Jang et al. (2012) [34] | E | I | R | L | 0–3 months | SD | T1 | Th | Off | Improve |
| Surattana Adipat (2021) [58] | E | P | R | S | 3–6 months | SD | M | Th | Off | Improve |
| Choi Young-mi et al. (2021) [59] | E | P | R | S | 0–3 months | SM | M | Th | Off | Improve |
| Ching Sing Chai et al. (2020) [60] | E | P | Q | S | 3–6 months | SM | M | Th | Mix | Improve |
| Jennifer R. Banas et al. (2014) [61] | W | P | Q | S | 3–6 months | SM | M | Pr | Off | Improve |
| Ayla Cetin-Dindar et al. (2017) [62] | W | P | Q | S | 0–3 months | SD | T1 | Th | Off | Improve |
| Douglas D. Agyei et al. (2015) [63] | W | P | Q | S | 0–3 months | SM | M | Pr | Off | Improve |
| Chun-Yen Chang et al. (2012) [64] | E | P | Q | S | 3–6 months | SM | T1 | In | Mix | Improve |
| Park Ye-Rang et al. (2021) [65] | E | P | Q | S | 3–6 months | SM | M | Th | Off | Improve |
| Joo Young-Joo et al. (2012) [66] | E | P | R | S | 0–3 months | SD | M | Th | On | Improve |
| Hyun-Jong Choi et al. (2015) [67] | E | P | Q | S | 3–6 months | SM | M | In | Mix | Improve |
| Arwa Ahmed Abdo Qasem (2016) [33] | E | P | R | S | 0–3 months | SD | M | Th | Mix | Improve |
| Andreas Lachner et al. (2021) [68] | W | P | R | S | 0–3 months | SD | M | Th | Mix | Improve |
| Franziska Zimmermann et al. (2021) [69] | W | P | R | S | 3–6 months | SM | T1 | Th | Off | Improve |
| Aleksandra Kaplon-Schilis (2018) [70] | W | B | Q | S | >6 months | SD | M | Th | Off | Improve |
| Ching Sing Chai et al. (2017) [23] | E | P | Q | L | 3–6 months | SM | M | In | Mix | No significant difference |
| Ping-Han Cheng et al. (2022) [71] | E | P | Q | S | 0–3 months | SM | T | In | Mix | Improve |
| Umit Izgi-Onbasili et al. (2022) [72] | W | P | R | S | 3–6 months | SD | T1 | Th | Off | No significant difference |
| Zheng Zhigao et al. (2019) [73] | E | P | Q | S | 0–3 months | SM | M | In | Mix | No significant difference |
| Yao Liang (2021) [74] | E | P | R | S | 3–6 months | SD | T1 | Pr | Off | Improve |
| Chen Xile (2017) [75] | E | P | R | S | 0–3 months | SD | M | In | Mix | Improve |
| Zhang Mingrui (2021) [76] | E | P | R | S | 3–6 months | SM | M | Pr | Mix | Improve |
| Souphanh Thephavongsa (2019) [77] | E | P | Q | S | >6 months | SM | M | In | Mix | No significant difference |
| Li Yonghan (2021) [78] | E | P | Q | S | 3–6 months | SM | M | In | Off | Improve |
| Fatih Saltan (2017) [38] | W | P | R | S | 0–3 months | SD | T | Th | Mix | No significant difference |
| Liang Cunliang (2015) [79] | E | I | R | S | 3–6 months | SD | T | In | Off | Improve |
| Wang Chunli (2012) [80] | E | P | Q | S | 0–3 months | SM | M | Pr | Off | Improve |
| Dr. Imran Ansari (2019) [81] | E | P | Q | S | 3–6 months | SM | M | Pr | Mix | Improve |
| Kaushal Kumar Bhagat et al. (2017) [82] | E | P | Q | S | 3–6 months | SD | T1 | Th | Mix | Improve |
| Irina Lyublinskaya et al. (2014) [83] | W | P | Q | S | >6 months | SM | T | Th | Off | Improve |
| Jewoong Moon et al. (2022) [84] | W | P | R | S | 3–6 months | SD | M | Pr | Off | Improve |
| Joyce Hwee Ling Koh et al. (2014) [17] | E | B | Q | L | 0–3 months | SM | M | Th | Off | Improve |
| Bian Wu et al. (2021) [85] | E | P | Q | S | 0–3 months | SD | M | Pr | On | Improve |
| Lourdes Meroño et al. (2021) [86] | W | P | Q | L | 3–6 months | SM | T1 | Pr | Off | Improve |
| Erkko Sointu et al. (2016) [87] | W | P | R | S | 3–6 months | SD | T1 | Pr | On | Improve |
| Chew Cheng Meng et al. (2013) [49] | E | P | Q | S | 3–6 months | SM | M | In | Off | Improve |
| Mehmet Barış Horzum (2013) [88] | W | P | R | L | 3–6 months | SD | M | In | Off | Improve |
| Tezcan Kartal et al. (2021) [89] | W | P | Q | S | 0–3 months | SM | M | Th | Off | Improve |
| Hasniza Nordin et al. (2013) [90] | W | P | Q | S | 3–6 months | SM | M | Th | Off | Improve |
| Piret Lehiste (2015) [91] | W | I | Q | S | >6 months | SD | M | In | Off | Improve |
Note: Cultural background—east (E), west (W). Experimental participant—pre-service teachers (P), in-service teachers (I), both pre- and in-service teachers (B). Experiment type—random experiment (R), quasi-experiment (Q). Sample type—large sample (L), small sample (S). Measurement method—standardized test (SD), self-made test (SM). Intervention type—method intervention (M), technical intervention (T), tool intervention (T1). Knowledge type—theoretical (Th), practical (Pr), integrated (In). Teaching environment—online (On), offline (Off), mixed type (Mix).
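Each row of Table 1 contributes one standardized mean difference to the pooled estimate. As a minimal sketch of how such an effect size is typically derived from group statistics (the scores below are invented for illustration and are not taken from any study in the table):

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference between a treatment and a control group."""
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m1 - m2) / sp

def hedges_g(d, n1, n2):
    """Small-sample bias correction often applied to d before pooling."""
    return d * (1 - 3 / (4 * (n1 + n2) - 9))

# Hypothetical post-test TPACK scores: treatment (M = 4.1) vs. control (M = 3.7)
d = cohens_d(4.1, 0.6, 30, 3.7, 0.7, 30)
print(round(d, 3))  # 0.614
```

The small-sample correction matters for the many studies coded S (small sample) above, since uncorrected d overestimates the population effect at low n.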
Table 2. Main effects test.
| Model | Effect Size (k) | Effect Value | 95% CI Lower Limit | 95% CI Upper Limit | Z Value | p Value | df (Q) | I-Squared (%) |
| Random effects | 59 | 0.839 | 0.632 | 1.045 | 7.971 | 0.000 | 58 | 96.050 |
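The statistics in Table 2 are linked by standard meta-analytic formulas: the standard error can be recovered from the confidence-interval width, the Z statistic is the effect divided by that standard error, and I² expresses heterogeneity as a share of the Q statistic. A minimal sketch (the Q value passed to `i_squared` below is illustrative, since Table 2 reports only df and I²):

```python
# Values reported in Table 2 (random-effects model, k = 59 effect sizes)
d, ci_lo, ci_hi = 0.839, 0.632, 1.045

z975 = 1.959964                      # two-sided 95% normal quantile
se = (ci_hi - ci_lo) / (2 * z975)    # standard error recovered from CI width
z = d / se                           # Wald Z statistic
print(round(se, 3), round(z, 2))     # 0.105 7.96 (Table 2 reports Z = 7.971)

def i_squared(q, df):
    """Higgins' I-squared: percentage of variation due to heterogeneity."""
    return max(0.0, (q - df) / q) * 100
```

The slight gap between the recomputed Z (7.96) and the published 7.971 is rounding in the reported CI bounds.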
Table 3. Moderating effect test.
| Moderating Variable | Category | Effect Size (k) | Effect Value | 95% CI Lower Limit | 95% CI Upper Limit | Z Value | p Value |
| Cultural background | East | 38 | 0.900 | 0.642 | 1.157 | 6.852 | 0.000 |
| | West | 21 | 0.730 | 0.389 | 1.072 | 4.191 | 0.000 |
| | Total between | | | | | Q = 0.603 | p = 0.437 |
| Experimental participant | Pre-service teachers | 50 | 0.864 | 0.670 | 1.059 | 8.726 | 0.000 |
| | In-service teachers | 6 | 0.399 | −0.156 | 0.955 | 1.408 | 0.159 |
| | Mixed type | 3 | 1.204 | 0.444 | 1.964 | 3.106 | 0.002 |
| | Total between | | | | | Q = 3.355 | p = 0.187 |
| Experiment type | Quasi-experiment | 35 | 0.580 | 0.299 | 0.861 | 4.041 | 0.000 |
| | Random experiment | 24 | 1.026 | 0.791 | 1.260 | 8.574 | 0.000 |
| | Total between | | | | | Q = 5.696 | p = 0.017 |
| Sample type | Small sample | 50 | 0.109 | 0.575 | 1.001 | 7.247 | 0.000 |
| | Large sample | 9 | 0.246 | 0.652 | 1.616 | 4.614 | 0.000 |
| | Total between | | | | | Q = 1.658 | p = 0.198 |
| Intervention duration | 0–3 months | 20 | 0.177 | 0.345 | 1.039 | 3.906 | 0.000 |
| | 3–6 months | 33 | 0.139 | 0.689 | 1.232 | 6.937 | 0.000 |
| | >6 months | 6 | 0.321 | 0.100 | 1.360 | 2.271 | 0.023 |
| | Total between | | | | | Q = 1.574 | p = 0.455 |
| Measurement method | Self-made test | 32 | 0.367 | 0.139 | 0.595 | 3.160 | 0.002 |
| | Standardized test | 27 | 1.250 | 1.037 | 1.463 | 11.521 | 0.000 |
| | Total between | | | | | Q = 3.863 | p = 0.000 |
| Intervention type | Method intervention | 43 | 1.013 | 0.785 | 1.242 | 8.685 | 0.000 |
| | Tool intervention | 9 | 0.586 | 0.083 | 1.088 | 2.285 | 0.022 |
| | Technical intervention | 7 | 0.155 | −0.406 | 0.715 | 0.541 | 0.588 |
| | Total between | | | | | Q = 8.936 | p = 0.011 |
| Knowledge type | Theoretical | 29 | 1.013 | 0.023 | 0.396 | 4.551 | 0.000 |
| | Practical | 11 | 0.586 | 0.063 | 0.619 | 4.431 | 0.000 |
| | Integrated | 19 | 0.155 | 0.037 | 0.553 | 4.823 | 0.000 |
| | Total between | | | | | Q = 2.270 | p = 0.321 |
| Teaching environment | Online | 4 | 0.896 | 0.216 | 1.576 | 2.582 | 0.010 |
| | Offline | 38 | 0.808 | 0.583 | 1.032 | 7.054 | 0.000 |
| | Mixed type | 17 | 0.912 | 0.569 | 1.254 | 5.215 | 0.000 |
| | Total between | | | | | Q = 0.272 | p = 0.873 |
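The "Total between" rows of Table 3 report a Q test comparing the subgroup means against their precision-weighted average. A minimal sketch of that computation, reconstructed under standard inverse-variance subgroup weights (not the authors' own code; weights are backed out of the published 95% CIs, so small rounding differences from the reported values are expected):

```python
def q_between(groups):
    """Between-group heterogeneity Q for a subgroup (moderator) analysis.

    groups: list of (effect, ci_lo, ci_hi) tuples, one per subgroup.
    Each subgroup is weighted by 1/SE^2, with SE recovered from its 95% CI.
    """
    z975 = 1.959964
    ws = [1 / (((hi - lo) / (2 * z975)) ** 2) for _, lo, hi in groups]
    ds = [d for d, _, _ in groups]
    pooled = sum(w * d for w, d in zip(ws, ds)) / sum(ws)
    return sum(w * (d - pooled) ** 2 for w, d in zip(ws, ds))

# Experiment type subgroups from Table 3
q = q_between([(0.580, 0.299, 0.861),    # quasi-experiment
               (1.026, 0.791, 1.260)])   # random experiment
print(round(q, 2))  # 5.7 (Table 3 reports Q = 5.696, p = 0.017)
```

Compared against a chi-squared distribution with (number of subgroups − 1) degrees of freedom, Q = 5.70 gives p ≈ 0.017, matching the significant experiment-type moderation reported above.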
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Ning, Y.; Zhou, Y.; Wijaya, T.T.; Chen, J. Teacher Education Interventions on Teacher TPACK: A Meta-Analysis Study. Sustainability 2022, 14, 11791. https://doi.org/10.3390/su141811791
