# Analysis of Differences in the Levels of TPACK: Unpacking Performance Indicators in the TPACK Levels Rubric


## Abstract


## 1. Introduction

## 2. Assessing TPACK

## 3. Conceptual Framework

#### 3.1. Progressive Levels of TPACK

#### 3.2. Cognitive Demand of Tasks Framework

#### 3.3. Inquiry-Based Learning Framework

#### 3.4. Development of Conceptual Framework

#### 3.5. TPACK Levels Rubric

## 4. Research Methodology

Bartlett's test of sphericity was significant (χ² = 1026.14, df = 6, p < 0.001), confirming that the R-matrix is not an identity matrix. The determinant of the correlation matrix was 0.003 > 0.000001, supporting the assumption that the R-matrix was not singular. The Kaiser–Meyer–Olkin measure (KMO = 0.880) verified the sampling adequacy, and the KMO values for the individual items ranged between 0.849 and 0.919, well above the acceptable limit of 0.5, while the nondiagonal values of the anti-image matrix were relatively small [47]. Both the scree plot (Figure 3) and Kaiser's criterion suggested a one-factor solution, with an eigenvalue of 3.725 explaining 93.14% of the variance and item loadings ranging from 0.953 to 0.973.
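The two factor-retention criteria used here (Kaiser's eigenvalue-greater-than-one rule and parallel analysis, which compares observed eigenvalues against eigenvalues of random data of the same shape) can be sketched in a few lines of NumPy. This is an illustrative reconstruction on synthetic one-factor data, not the authors' SPSS procedure; the function name and the simulated dataset are ours.

```python
import numpy as np

def kaiser_and_parallel(data, n_sims=200, seed=0):
    """Return the correlation-matrix eigenvalues, the Kaiser retention
    count (eigenvalue > 1), and the parallel-analysis retention count
    (observed eigenvalue exceeds the mean eigenvalue of random data)."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    R = np.corrcoef(data, rowvar=False)
    eig = np.sort(np.linalg.eigvalsh(R))[::-1]  # descending order
    # Parallel analysis: eigenvalues of correlation matrices of
    # uncorrelated standard-normal data with the same n and p.
    rand_eig = np.empty((n_sims, p))
    for i in range(n_sims):
        X = rng.standard_normal((n, p))
        rand_eig[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
    pa_mean = rand_eig.mean(axis=0)
    kaiser_k = int(np.sum(eig > 1.0))
    pa_k = int(np.sum(eig > pa_mean))
    return eig, kaiser_k, pa_k

# Illustrative data: four indicators driven by one latent score, so a
# one-factor solution should be recovered (as in the study's EFA).
rng = np.random.default_rng(42)
latent = rng.standard_normal((500, 1))
data = latent + 0.3 * rng.standard_normal((500, 4))  # highly correlated items
eig, kaiser_k, pa_k = kaiser_and_parallel(data)
print(kaiser_k, pa_k)  # both criteria retain a single factor
```

The eigenvalues of a p-by-p correlation matrix always sum to p, so a dominant first eigenvalue (here, as in the study's 3.725 out of 4) directly translates into the large share of variance explained by the single factor.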

## 5. Unpacking TPACK Levels Rubric

#### 5.1. Lesson Plan Exemplars

#### 5.1.1. Recognizing Level of TPACK

#### 5.1.2. Accepting Level of TPACK

#### 5.1.3. Adapting Level of TPACK

#### 5.1.4. Exploring Level of TPACK

#### 5.1.5. Advancing Level of TPACK

#### 5.2. Unpacking Teacher-Related Indicators

#### 5.2.1. Overarching Conception

#### 5.2.2. Knowledge of Student Understanding

#### 5.2.3. Knowledge of the Curriculum

#### 5.2.4. Instructional Strategies

#### 5.3. Unpacking Student-Related Indicators

#### 5.3.1. Overarching Conception

#### 5.3.2. Knowledge of Student Understanding

#### 5.3.3. Knowledge of Curriculum

#### 5.3.4. Instructional Strategies

## 6. Discussion

## 7. Conclusions

## Supplementary Materials

## Author Contributions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Data Availability Statement

## Conflicts of Interest

## References

1. Darling-Hammond, L.; Oakes, J. Preparing Teachers for Deeper Learning; Harvard Education Press: Cambridge, MA, USA, 2019.
2. Hyndman, B. Ten reasons why teachers can struggle to use technology in the classroom. Sci. Educ. News **2018**, 67, 41–42.
3. OECD. TALIS 2018 Results (Volume II): Teachers and School Leaders as Lifelong Learners; TALIS, OECD Publishing: Paris, France, 2020.
4. Ottenbreit-Leftwich, A.T.; Brush, T.A.; Strycker, J.; Gronseth, S.; Roman, T.; Abaci, S.; Shin, S.; Plucker, J. Preparation versus practice: How do teacher education programs and practicing teachers align in their use of technology to support teaching and learning? Comput. Educ. **2012**, 59, 399–411.
5. Niess, M.L. Preparing teachers to teach science and mathematics with technology: Developing a technology pedagogical content knowledge. Teach. Teach. Educ. **2005**, 21, 509–523.
6. Mishra, P.; Koehler, M. Technological pedagogical content knowledge: A framework for teacher knowledge. Teach. Coll. Rec. **2006**, 108, 1017–1054.
7. Shulman, L.S. Those who understand: Knowledge growth in teaching. Educ. Res. **1986**, 15, 4–14.
8. Niess, M.L. Central component descriptors for levels of technological pedagogical content knowledge. J. Educ. Comput. Res. **2013**, 48, 173–198.
9. Lyublinskaya, I. Evolution of a course for special education teachers on integrating technology into math and science. In Handbook of Research on Teacher Education in the Digital Age; Niess, M.L., Gillow-Wiles, H., Eds.; IGI Global: Hershey, PA, USA, 2015; pp. 532–559.
10. Voogt, J.; McKenney, S. TPACK in teacher education: Are we preparing teachers to use technology for early literacy? Technol. Pedagog. Educ. **2017**, 26, 69–83.
11. Niess, M.L.; Gillow-Wiles, H. Online instructional strategies for enhancing teachers’ TPACK: Experiences, discourse, and critical reflection. In Research Anthology on Developing Effective Online Learning Courses; Information Resources Management Association, Ed.; IGI Global: Hershey, PA, USA, 2021; pp. 326–348.
12. Lyublinskaya, I.; Tournaki, N. A study of special education teachers’ TPACK development in mathematics and science through assessment of lesson plans. J. Technol. Teach. Educ. **2014**, 22, 449–470.
13. Purwaningshi, E.; Nurhadi, D.; Masjkur, K. TPACK development of prospective physics teachers to ease the achievement of learning objectives: A case study at the State University of Malang, Indonesia. J. Phys. Conf. Ser. **2019**, 1185, 012042.
14. Oner, D. A virtual internship for developing technological pedagogical content knowledge. Australas. J. Educ. Technol. **2020**, 36, 27–42.
15. Hall, J.A.; Lei, J.; Wang, Q. The first principles of instruction: An examination of their impact on preservice teachers’ TPACK. Educ. Technol. Res. Dev. **2020**, 68, 3115–3142.
16. Koehler, M.J.; Shin, T.S.; Mishra, P. How do we measure TPACK? Let me count the ways. In Educational Technology, Teacher Knowledge, and Classroom Impact: A Research Handbook on Frameworks and Approaches; Ronau, R., Rakes, C., Niess, M., Eds.; IGI Global: Hershey, PA, USA, 2012; pp. 16–31.
17. Abbitt, J.T. Measuring technological pedagogical content knowledge in preservice teacher education: A review of current methods and instruments. J. Res. Technol. Educ. **2011**, 43, 281–300.
18. Lyublinskaya, I.; Tournaki, E. The effects of teacher content authoring on TPACK and on student achievement in algebra: Research on instruction with the TI-Nspire handheld. In Educational Technology, Teacher Knowledge, and Classroom Impact: A Research Handbook on Frameworks and Approaches; Ronau, R., Rakes, C., Niess, M., Eds.; IGI Global: Hershey, PA, USA, 2012; pp. 295–322.
19. Niess, M.L.; Sadri, P.; Lee, K. Dynamic spreadsheets as learning technology tools: Developing teachers’ technology pedagogical content knowledge (TPCK). In Proceedings of the American Education Research Association Annual Conference, Chicago, IL, USA, 9–13 April 2007.
20. Balgalmis, E.; Cakiroglu, E.; Shafer, K. An investigation of a pre-service elementary mathematics teacher’s TPACK within the context of teaching practices. In Proceedings of the Society for Information Technology & Teacher Education International Conference, Jacksonville, FL, USA, 17 March 2014; Searson, M., Ocho, M., Eds.; AACE: Chesapeake, VA, USA, 2014; pp. 2210–2217.
21. Handal, B.; Campbell, C.; Cavanagh, M.; Petosz, P. Characterising the perceived value of mathematics educational apps in preservice teachers. Math. Educ. Res. J. **2016**, 26, 199–221.
22. Shinas, V.H.; Yilmaz-Ozden, S.; Mouza, C.; Karchmer-Klein, R.; Glutting, J.J. Examining domains of technological pedagogical content knowledge using factor analysis. J. Res. Technol. Educ. **2013**, 45, 339–360.
23. Mouza, C.; Karchmer-Klein, R. Promoting and assessing pre-service teachers’ technological pedagogical content knowledge (TPACK) in the context of case development. J. Educ. Comput. Res. **2013**, 48, 127–152.
24. Archambault, L. Exploring the use of qualitative methods to examine TPACK. In Handbook of Technological Pedagogical Content Knowledge for Educators, 2nd ed.; Herring, M.C., Koehler, M.J., Mishra, P., Eds.; Routledge: New York, NY, USA, 2016; pp. 65–86.
25. Chai, C.S.; Koh, J.H.L.; Tsai, C.C. A review of the quantitative measures of technological pedagogical content knowledge (TPACK). In Handbook of Technological Pedagogical Content Knowledge for Educators, 2nd ed.; Herring, M.C., Koehler, M.J., Mishra, P., Eds.; Routledge: New York, NY, USA, 2016; pp. 87–106.
26. Karatas, I.; Tunc, M.P.; Yilmaz, N.; Karaci, G. An investigation of Technological Pedagogical Content Knowledge, self-confidence, and perception of pre-service middle school mathematics teachers towards instructional technologies. Educ. Technol. Soc. **2017**, 20, 122–132.
27. Saubern, R.; Urbach, D.; Koehler, M.; Phillips, M. Describing increasing proficiency in teachers’ knowledge of the effective use of digital technology. Comput. Educ. **2020**, 47, 103784.
28. Schmidt, D.A.; Baran, E.; Thompson, A.D.; Mishra, P.; Koehler, M.J.; Shin, T.S. Technological Pedagogical Content Knowledge (TPACK): The development and validation of an assessment instrument for preservice teachers. J. Res. Technol. Educ. **2009**, 42, 123–149.
29. Lyublinskaya, I.; Tournaki, N. Examining the relationship between self and external assessment of TPACK of pre-service special education teachers. In Proceedings of the Society for Information Technology & Teacher Education International Conference, Las Vegas, NV, USA, 2 March 2015; Rutledge, D., Slykhuis, D., Eds.; AACE: Chesapeake, VA, USA, 2015; pp. 2977–2983.
30. Tomayko, M. Pre-service teachers self-assessing TPACK using a visual quantitative mode. In Proceedings of the Society for Information Technology & Teacher Education International Conference, Washington, DC, USA, 26 March 2018; Langran, E., Borup, J., Eds.; AACE: Chesapeake, VA, USA, 2018; pp. 2123–2127.
31. Voogt, J.; Fisser, P.; Roblin, N.P.; Tondeur, J.; van Braak, J. Technological Pedagogical Content Knowledge–a review of the literature. J. Comput. Assist. Learn. **2013**, 29, 109–121.
32. Akyuz, D. Measuring technological pedagogical content knowledge (TPACK) through performance assessment. Comput. Educ. **2018**, 125, 212–225.
33. Harris, J.; Grandgenett, N.; Hofer, M. Testing a TPACK-based technology integration assessment rubric. In Proceedings of the Society for Information Technology & Teacher Education International Conference, San Diego, CA, USA, 29 March 2010; Gibson, D., Dodge, B., Eds.; AACE: Chesapeake, VA, USA, 2010; pp. 3833–3840.
34. Lachner, A.; Fabian, A.; Franke, U.; Preiß, J.; Jacob, L.; Führer, C.; Küchler, U.; Paravicini, W.; Randler, C.; Thomas, P. Fostering pre-service teachers’ technological pedagogical content knowledge (TPACK): A quasi-experimental field study. Comput. Educ. **2021**, 174, 104304.
35. Stein, M.K.; Smith, M.S. Mathematical tasks as a framework for reflection: From research to practice. Math. Teach. Middle Sch. **1998**, 3, 268–275.
36. Tekkumru-Kisa, M.; Stein, M.K.; Schunn, C. A framework for analyzing cognitive demand and content-practices integration: Task analysis guide in science. J. Res. Sci. Teach. **2015**, 52, 659–685.
37. Levy, P.; Little, S.; McKinney, P.; Nibbs, A.; Wood, J. The Sheffield Companion to Inquiry-Based Learning; Centre for Inquiry-Based Learning in the Arts and Social Sciences, The University of Sheffield: Sheffield, UK, 2010.
38. Rogers, E.M. Diffusion of Innovations; The Free Press of Simon and Schuster Inc.: New York, NY, USA, 1995.
39. Niess, M.L.; Ronau, R.N.; Driskell, S.O.; Kosheleva, O.; Pugalee, D.; Weinhold, M.W. Inquiry into Mathematics Teacher Education; Association of Mathematics Teacher Educators: Houghton, MI, USA, 2009.
40. Grossman, P.L. The Making of a Teacher: Teacher Knowledge and Teacher Education; Teachers College Press: New York, NY, USA, 1990.
41. Maeng, J.L.; Mulvey, B.K.; Smetana, L.K.; Bell, R.L. Preservice teachers’ TPACK: Using technology to support inquiry instruction. J. Sci. Educ. Technol. **2013**, 22, 838–857.
42. Bell, R.L.; Smetana, L.; Binns, I. Simplifying inquiry instruction. Sci. Teach. **2005**, 72, 30–33.
43. Marzano, R.J. Formative Assessment & Standards-Based Grading; Solution Tree Press: Bloomington, IN, USA, 2008.
44. Clements, D.H.; Sarama, J.; DiBiase, A.M. (Eds.) Engaging Young Children in Mathematics: Standards for Early Childhood Mathematics Education; Routledge: New York, NY, USA, 2003.
45. Moskal, B.M. Recommendations for developing classroom performance assessments and scoring rubrics. Pract. Assess. Res. Eval. **2003**, 8, 14.
46. Tierney, R.; Simon, M. What’s still wrong with rubrics: Focusing on the consistency of performance criteria across scale levels. Pract. Assess. Res. Eval. **2004**, 9, 2.
47. Field, A. Discovering Statistics Using SPSS, 5th ed.; Sage Publications: Thousand Oaks, CA, USA, 2018.
48. Hayton, J.C.; Allen, D.G.; Scarpello, V. Factor retention decisions in Exploratory Factor Analysis: A tutorial on parallel analysis. Organ. Res. Methods **2004**, 7, 191–205.
49. BrainPOP Jr. Available online: https://jr.brainpop.com/ (accessed on 27 September 2021).
50. Didax Virtual Manipulatives. Available online: https://www.didax.com/math/virtual-manipulatives.html (accessed on 27 September 2021).
51. PhET Interactive Simulations. Available online: https://phet.colorado.edu/en/simulation/make-a-ten (accessed on 27 September 2021).
52. Sbar, E. Schemas Are Key to Deep Conceptual Understanding: A Blog from MIND Research Institute. Available online: https://blog.mindresearch.org/blog/schemas-deep-conceptual-understanding (accessed on 27 September 2021).
53. Ritchhart, R.; Perkins, D.N. Learning to think: The challenges of teaching thinking. In The Cambridge Handbook of Thinking and Reasoning; Holyoak, K.J., Morrison, R.G., Eds.; Cambridge University Press: Cambridge, UK, 2005; pp. 775–802.
54. National Council of Teachers of Mathematics. Principles and Standards for School Mathematics; NCTM: Reston, VA, USA, 2000.

**Figure 1.** Technological pedagogical content knowledge (TPACK) framework. Reproduced with the permission of the publisher, © 2022 by tpack.org.

**Figure 2.** Model for progressive development of teachers’ knowledge, from PCK to TPACK. Reproduced with permission from Niess et al. [39] (copyright permission received from the author).

TPACK Level | Cognitive Demand of Tasks with Technology | Inquiry-Based Level of Tasks with Technology
---|---|---
Recognizing | Memorization | No inquiry
Accepting | Procedures without connections | Confirmation inquiry
Adapting | Procedures with connections | Structured inquiry
Exploring | Doing mathematics | Guided inquiry
Advancing | Doing mathematics | Open inquiry

Factor | PCA Eigenvalue | PA Eigenvalue
---|---|---
1 | 3.725 | 1.172
2 | 0.125 | 1.046
3 | 0.086 | 0.948
4 | 0.064 | 0.835

**Table 3.** Teacher-related performance indicators for the overarching conception component of TPACK.

TPACK Level | Teacher-Related Performance Indicator
---|---
Recognizing | Teacher uses instructional technology for motivation only, rather than for subject matter development. New ideas are presented by the teacher mostly without technology.
Accepting | Teacher uses instructional technology for subject matter development. However, a larger part of technology use is for teacher demonstrations, which include presentations of new knowledge.
Adapting | Teacher uses instructional technology as a way to enhance student learning. This use of technology supports subject matter development.
Exploring | Teacher plans for instructional technology to be used mostly by students, who explore and experiment with technology for subject matter development.
Advancing | Teacher develops instructional technology tasks for students that provide them with a deeper conceptual understanding of the subject matter.

**Table 4.** Teacher-related performance indicators for the knowledge of student understanding component of TPACK.

TPACK Level | Teacher-Related Performance Indicator
---|---
Recognizing | Teacher uses instructional technology in a way that does not support student thinking and learning of new content.
Accepting | Teacher uses instructional technology in a teacher-led/student-followed format, without focusing on the students’ thinking.
Adapting | Teacher structures students’ use of instructional technology to promote the students’ thinking about mathematics.
Exploring | Teacher facilitates students’ use of instructional technology to develop thinking leading to a conceptual understanding of mathematics.
Advancing | Teacher facilitates students’ use of instructional technology to develop higher-order thinking, leading to a deep understanding of mathematics.

**Table 5.** Teacher-related performance indicators for the knowledge of the curriculum component of TPACK.

TPACK Level | Teacher-Related Performance Indicator
---|---
Recognizing | Teacher selects instructional technology that is not aligned with curriculum topics.
Accepting | Teacher selects instructional technology that is partially aligned with one or more curriculum topics. Technology use is not effective for the curriculum topics.
Adapting | Teacher selects instructional technology that is aligned with curriculum topics, but only replaces nontechnology-based tasks with technology-based tasks. Technology use is partially effective for the curriculum topics.
Exploring | Teacher selects instructional technology that is aligned with curriculum topics and provides an alternative way of topic exploration. Technology use is effective for the curriculum topics.
Advancing | Teacher selects instructional technology that is aligned with curriculum topics, but also challenges the traditional curriculum by engaging students to learn about different topics with technology. Technology use is highly effective for the curriculum topics.

**Table 6.** Teacher-related performance indicators for the instructional strategies component of TPACK.

TPACK Level | Teacher-Related Performance Indicator
---|---
Recognizing | Teacher focuses on how to use instructional technology, rather than on mathematical ideas.
Accepting | Teacher structures lessons with instructional technology but without student explorations. The instruction is teacher-led.
Adapting | Teacher uses a deductive approach to teaching with instructional technology, to maintain control of the progression of the exploration activities.
Exploring | Teacher uses deductive and inductive instructional strategies that support the students’ thinking about mathematics.
Advancing | Teacher mostly uses multiple inductive instructional strategies that support the students’ experimentation with mathematical ideas with instructional technology.

**Table 7.** Student-related performance indicators for the overarching conception component of TPACK.

TPACK Level | Student-Related Performance Indicator
---|---
Recognizing | Technology-based activities do not include inquiry tasks. Technology procedures do not provide space for students to use or make connections.
Accepting | Technology-based activities include confirmation inquiry tasks. Technology procedures do not provide space for students to use or make connections.
Adapting | Technology-based activities include structured inquiry tasks towards intended ideas. Technology procedures concentrate on mathematical tasks that use or make connections.
Exploring | Technology-based activities include guided inquiry tasks of high cognitive demand. Technology procedures concentrate on doing mathematics while using or making connections.
Advancing | Technology-based activities include open inquiry tasks of high cognitive demand. Technology procedures concentrate on tasks that use or develop deep mathematical knowledge representing connections and strategic knowledge.

**Table 8.** Student-related performance indicators for the knowledge of student understanding component of TPACK.

TPACK Level | Student-Related Performance Indicator
---|---
Recognizing | Digital materials only provide space for student practice and drills.
Accepting | Digital materials for students mirror the structure of the traditional textbook presentation of mathematics.
Adapting | Digital materials provide an environment for students to engage in active explorations of mathematics with teacher guidance.
Exploring | Digital materials provide an environment for students to deliberately take mathematically meaningful actions on mathematical objects, but the teacher still guides students to recognize the meaningful consequences of those actions.
Advancing | Digital materials provide an environment for students to deliberately take mathematically meaningful actions on mathematical objects, and to immediately see the meaningful consequences of those actions.

**Table 9.** Student-related performance indicators for the knowledge of curriculum component of TPACK.

TPACK Level | Student-Related Performance Indicator
---|---
Recognizing | Students’ tasks with technology do not support making connections between topics in the curriculum.
Accepting | Students’ tasks with technology do not support making connections between topics in the curriculum.
Adapting | Students are given curriculum-based tasks with technology to develop a basic understanding of curriculum topics with teacher guidance.
Exploring | Students are given curriculum-based tasks with technology and are asked to expand mathematical ideas on the basis of technology explorations.
Advancing | Students’ tasks with technology focus on deepening their understanding of mathematical concepts and on making connections between topics, inside and outside of the curriculum.

**Table 10.** Student-related performance indicators for the instructional strategies component of TPACK.

TPACK Level | Student-Related Performance Indicator
---|---
Recognizing | Digital materials are built around drill and practice only.
Accepting | Digital materials are built around delivery of information as well as drill and practice.
Adapting | Digital materials are built around mathematical objects but do not promote student reflection.
Exploring | Digital materials are built around mathematical objects and explicitly promote student reflection, especially the posing of questions for sense making.
Advancing | Digital materials are built around mathematical objects and explicitly promote student reflection, especially the posing of questions for sense making and reasoning, including explanation and justification.

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Lyublinskaya, I.; Kaplon-Schilis, A.
Analysis of Differences in the Levels of TPACK: Unpacking Performance Indicators in the TPACK Levels Rubric. *Educ. Sci.* **2022**, *12*, 79.
https://doi.org/10.3390/educsci12020079
