Article

Beyond Beliefs: Understanding Instructor Framing and the Uptake of Educational Technology in Engineering Education

1 Department of Mechanical Engineering, California Polytechnic State University, San Luis Obispo, CA 93407, USA
2 Department of Chemical and Biological Engineering and Department of Education, Tufts University, Medford, MA 02155, USA
3 Department of Education, University of California Santa Barbara, Santa Barbara, CA 93106, USA
* Author to whom correspondence should be addressed.
Educ. Sci. 2026, 16(2), 221; https://doi.org/10.3390/educsci16020221
Submission received: 2 December 2025 / Revised: 10 January 2026 / Accepted: 16 January 2026 / Published: 2 February 2026

Abstract

Teachers’ expressed beliefs are often misaligned with their observed instructional practices. We propose that instructor beliefs do not exist solely as isolated entities within instructors’ minds, but that instructors’ classroom decisions and actions are thoroughly linked with the environment in which they teach. As part of a larger study, we conducted interviews with three experienced instructors, each in a different undergraduate engineering context. Utilizing the theory of resources and frames, we explored how they organized instructional resources into frames and how that framing interacted with their use of educational technology. We found significant differences in their framings and corresponding differences in how they integrated a new technological tool into their classes. We identify implications for teacher professional development, in particular the importance of attending to context and giving teachers agency in professional development and learning community settings.

1. Introduction

This study resides at the intersection of teacher professional development and the productive adaptation of educational technology in university contexts. We focus on the problem that, although numerous research studies have shown benefits of educational technology (Escueta et al., 2017; Luschei, 2014), adoption and effective use of new educational technologies are slow and inconsistent (Chugh et al., 2023; Luschei, 2014). To realize the potential benefits of educational technology, we need to understand how to better support instructors to develop effective ways to use educational technology.
Change strategies often assume that changing instructors’ beliefs will prompt change in instructional practice, the idea being that if an instructor believes a certain way about teaching and learning, that belief will translate into their actions (Burns, 1992; Fives & Gill, 2015; Kagan, 1992). In this view, instructor beliefs are stable ideas that persist when instructors return to their classrooms after a professional development experience; if an instructor believes in the importance of integrating a new technology tool into their practice, they will do it. However, evidence shows that an instructor’s expressed beliefs do not necessarily match their instructional practices (Fives & Gill, 2015; Haynes-Brown, 2024; Kelly, 2006; Tondeur et al., 2017) and that beliefs are complex and can contradict one another (DeGlopper et al., 2023; Hora, 2014). Recent research points to the need to tackle this challenge. Haynes-Brown (2024) suggests some ways to address this problem, including having instructors “first confront the inconsistencies that exist between their espoused beliefs and practices” (p. 14) through methods such as examining actual classroom practice and reflection. Additionally, Foley et al. (2025) point to the need to consider instructors and their engagement holistically, in conjunction with their context.
We propose that instructor beliefs do not exist solely as isolated entities within instructors’ minds. Rather, a fundamental source of the misalignment between expressed beliefs and observed practices is that instructors’ decisions and actions are thoroughly linked with the environment in which they teach (DeGlopper et al., 2023; Desimone et al., 2002; Haynes-Brown, 2024). This linkage is reciprocal: instructors shape, and are shaped by, their teaching environments. To interrogate the instructor–environment interaction, we utilize the construct of resources: elements of classroom context that instructors identify to support their learning goals (Hammer et al., 2005; Stroupe, 2016). These resources assemble into frames, an instructor’s overall sense of “what is going on here.” How an instructor frames instruction emerges from the resources they activate (Elby & Hammer, 2010). Whereas beliefs are posited to be consistent across environments, this study is based on the premise that instructors develop frames that vary in response to their teaching context. For example, the same instructor might teach differently in a small class versus a large class, even if their purported beliefs are stable. Therefore, to change how instructors integrate new educational technology tools and associated practices into their teaching, we must look not at beliefs, but at the resources that instructors activate, the framings they develop, and how these relate to their uptake of the new technology tool.
In this study, we examine how instructors’ framing and their uptake of a new technology tool are related. We focus on three experienced engineering instructors, each at different universities, to first understand and describe their unique context-related framing and then examine how these framings influence their uptake of a new educational technology tool.

1.1. Background Literature

As educational technology becomes more prevalent in classroom settings, there has been corresponding attention to approaches that support instructors’ adoption and use of these new technology tools. Research on teacher professional development has shown that single, isolated professional development workshops are not sufficient to change teaching practices (Birman et al., 2000). Rather, approaches that provide ongoing support, develop community among instructors, attend to daily challenges, and account for an instructor’s teaching context appear to be more successful (Foley et al., 2025; Zech et al., 2000). Professional development programs often oversimplify individual experiences, ideas, and perspectives and approach teacher professional development with a “one-size-fits-all” strategy. The focus is often on the technology tool itself, with only a single approach to integrating it. However, without sufficient attention to instructors’ fundamental ideas about teaching, we miss an important aspect of implementation and adoption. Importantly, each instructor brings their own history, experiences, and ideas about teaching that they must operationalize within the opportunities and constraints of their specific contexts. Therefore, each instructor involved in a professional development program has a unique experience with the learning opportunity and implements the new technology tool in different ways.
Factors that have been reported to influence instructors’ uptake of new technology tools often treat instructor dispositions as immutable. For example, researchers have reported that instructors are more likely to implement changes in their use of technology when they exhibit a willingness to commit time to making changes, an attitude open to change and risk taking, and constructivist and student-centered views of teaching (Li et al., 2019; Tondeur et al., 2017; Vannatta & Fordham, 2004). Additionally, instructors’ self-efficacy about technology, confidence, and agency have been shown to influence their uptake of new technology tools. While we believe this body of research is important and has provided useful knowledge to support use of educational technology, we propose that the relation between context and uptake remains undertheorized and understudied. For example, when instructors are involved in the development of a technology innovation, they become more confident in their use of it and more likely to take risks in using it (Li et al., 2019). The teaching context interacts with instructors’ ideas about the value of technology in ways that shape how they use educational technology (Bowman et al., 2022; Cheng et al., 2020; Desimone et al., 2002; Tondeur et al., 2017). These findings point to the difficulties of preparing instructors for the diversity of environments and teaching conditions, and to the need for further understanding of how to most effectively support instructors.
Researchers have suggested that a key to changing classroom practice is to change what instructors believe about teaching and student learning. Programs looking to change how instructors interact with new educational technology tools have examined instructor beliefs as a way to understand how different instructors implement instructional changes. These beliefs are influenced by a variety of internal and external factors, including instructors’ personal beliefs, their experiences with mentor teachers, and their own educational experiences (Basckin et al., 2021). However, there are shortcomings in equating changes in instructors’ beliefs with changes in practice. For example, instructors’ actual practice is often misaligned with their espoused beliefs (Haynes-Brown, 2024). Beliefs alone do not account for the contingencies of a teaching situation or the adaptations needed in a particular context.
The points summarized here suggest a need to look beyond individual instructor beliefs. Rather, instructors work within specific learning contexts and there are reciprocal interactions between the experiences they bring and their instructional practices within their current environment. In a longitudinal study, Desimone et al. (2002) found that teacher professional development “is more effective in changing teachers’ classroom practice when it has collective participation of teachers from the same school, department, or grade; and active learning opportunities, such as reviewing student work or obtaining feedback on teaching; and coherence, for example, linking to activities or building on teachers’ previous knowledge” (p. 102). This finding suggests that consideration of the environment instructors work within (e.g., same school, the specific assignments they give students) is useful to effectively promote changes in their practice. Additionally, one of the conclusions of Mouza et al. (2022) was “that teachers rarely implemented pedagogical strategies shared during the PD [professional development] in uniform ways. Rather, they frequently adapted them to fit their particular context, their own knowledge and comfort level, and their student needs” (p. 14). This finding supports the idea that there is more than just an instructor’s beliefs and experience that mediate their practice. In this study, we address the instructor–context dynamic by using a theoretical framework of resources and frames; this framework shifts focus from beliefs contained within the instructors’ minds to how their ideas and experiences interact with their context. We outline this framework in the following section and use it to situate this study.

1.2. Theoretical Framework

We take a sociocultural, or situative, orientation to instructors’ professional development around educational technology (Borko et al., 2000; Greeno, 1998; Nolen et al., 2009). Instructors’ identities and goals develop within social contexts, and the two are inextricably linked. Our ideas build on the work of motivational filters (Nolen et al., 2009) and situative learning in teacher development more broadly (Borko et al., 2000). As such, we apply a theory of learning and epistemological development, resources and frames (Elby & Hammer, 2010; Hammer et al., 2005), to interpret how instructors approach teaching broadly and their interactions with educational technology specifically. Russ & Luna (2013) have applied this framework to the learning of a high school teacher implementing educational technology.
The resources and frames approach offers an alternative perspective to beliefs. From a cognitivist orientation, instructors’ beliefs are treated as an individual characteristic that “causes” behavior. As discussed above, this view has limitations, including the assumption that beliefs are independent of context and that an individual’s ideas directly influence actions in a causal fashion. In contrast, a sociocultural perspective interrogates how instructional decisions are “co-constructed” between an instructor and their social context through negotiation within particular systems of meaning. That is, rather than a belief an instructor brings to the classroom, we see framing as a way to understand how decisions are made at the boundary between the individual and the social context.
We utilize resources and framing as an analytical lens to explore variations in how instructors approach their teaching (Elby & Hammer, 2010; Hammer et al., 2005; Levin et al., 2009). Framing refers to the expectations individuals bring to a social context, or their perception of “what is happening here?” (Goffman, 1986; Levin et al., 2009). This framing shapes how instructors interpret in-the-moment teaching decisions, design course activities, and decide what to modify. Resources, on the other hand, are fine-grained elements of the classroom environment that instructors identify to support their learning goals (Rodriguez, 2024). Embedded in the activation of resources are elements of knowledge, often subconscious, that can be acquired through professional development (e.g., teaching workshops) or through lived experience. As Stroupe (2016) puts it, “resources are physical and intellectual commodities that instructors recognize, generate, and use to solve problems of practice” (p. 53). Thus, resources can include tools, practices, goals, and epistemological stances (DeGlopper et al., 2023; Hammer et al., 2005). The activation of resources varies depending on the situation, and these resources are interconnected, with the activation of one potentially enabling or inhibiting others. When multiple resources are activated simultaneously, they can coalesce to form a cohesive frame.
Traditionally, resources and framing have been applied to examine learners’ cognitive processes when solving disciplinary problems and to investigate students’ personal epistemologies as they engage in such work (Elby & Hammer, 2010; Hammer et al., 2005; Rodriguez, 2024). However, some have extended this perspective to analyze instructional practice (DeGlopper et al., 2023; Russ & Luna, 2013).

1.3. Research Questions

We address the following research questions:
  • Within different contexts, what resources do three instructors identify to support their teaching goals? How do instructors appear to assemble these resources into frames?
  • How do the resources and frames interact with instructors’ use of the new technology tool?

2. Materials and Methods

2.1. Research Design

The work presented here is part of a broader initiative aimed at understanding the variation in the adoption, use, and impact of an educational technology tool, the Concept Warehouse (CW), across a range of institutions with distinct missions, instructional constraints, and student populations (Koretsky et al., 2014; Koretsky et al., 2024). The core group promoting the initiative is a collaborative effort involving nine institutions that differ along several dimensions: public vs. private, small vs. large, minority-serving vs. majority-serving, four-year vs. two-year, and teaching-focused vs. research-focused. Prior work has focused on student reasoning (Koretsky et al., 2016a, 2016b), development of a variety of instructional tools (Bowen et al., 2014; Cook et al., 2022; Gavitte et al., 2025; Nigon et al., 2024), propagation to other contexts (Friedrichsen et al., 2017), and student feedback (Auby & Koretsky, 2025).
For the study discussed in this report, we utilized a multiple case study methodology, which is designed to develop theory through the detailed, contextual analysis of complex phenomena (George & Bennet, 2005; Yin, 2013, 2018). Specifically, in prior analyses, we noticed that instructors appeared to frame their instruction in different ways and that framing could influence how they approached their instruction, including adaptation of the CW. Based on that observation, we selected three instructors from different institution types (2-year, 4-year teaching-focused, 4-year research-intensive) to explore how they organized instructional resources into frames and how that framing interacted with their use of the technology tool. Our data set and analysis are qualitative, allowing for deep analysis of individual experiences in line with case study research methods. We limited the analysis to three instructors to allow for an in-depth analysis that could lead to a “thick” description of each case (Geertz, 2008).

2.2. Concept Warehouse (CW)

The CW is a free web-based instructional tool designed to support concept-based active learning and to lower the activation barrier for instructors to implement these pedagogical practices (Koretsky et al., 2014). The tool provides three distinct but complementary functions: (a) a repository of educational materials, (b) an audience response system, and (c) learning analytics that provide data to instructors and researchers. The tool currently houses over 3,500 concept questions, 12 research-based Concept Inventories, 11 virtual laboratories, 6 inquiry-based learning activities, 5 adaptive learning modules, and a set of reflection activities. The CW has been used by over 1,700 Chemical Engineering and Mechanical Engineering faculty (Friedrichsen et al., 2017).

2.3. Participants and Setting

We selected three cases from a larger set of interviews with 24 instructors from nine institutions. We use gender-neutral names to provide anonymity. We chose these three instructors based on apparent differences in (1) how they framed their instruction, (2) their integration of the CW in classroom activities, and (3) their institution types. All three were experienced instructors who appeared to have well-developed frames. Table 1 shows their salient characteristics. Each of the instructors taught multiple courses within their department; the target course for this study is listed in Table 1. Although each of these courses was different, they are all introductory engineering science courses focused on foundational principles and among the first engineering courses taken by undergraduate students. In addition, Cameron contributed content to the CW.

2.4. Data Sources

Each instructor was interviewed twice. The first interview provided the primary source for analysis of instructor resources and framing. The second interview was used to understand how the instructors integrated the CW. We also examined CW use data to confirm the ways that instructors described their use of the tool in class. Additionally, we verified that each instructor’s framing in the second interview aligned with the findings from analysis of the first interview.
In the first interview, we gathered data on instructors’ professional histories; instructional context; beliefs about students, learning, and instruction; goals for student learning; instructional practices; and experiences with the technology tool. In the second interview, instructors logged into the CW and demonstrated how they had used the tool in their classroom. This involved showing a recently used question along with its associated student response data, while also explaining their rationale for using the tool and how they interpreted the student responses. Interview protocols are included in Appendix A.

2.5. Data Analysis

Interview responses were segmented into quotations, each demarcated by a single idea being described. A new idea led to a new quotation, often set off by an interviewer remark or question. The authors worked together to code the transcripts, identifying resources through emergent codes. We took an expansive perspective, considering a resource to be any element of the classroom environment that an instructor identified in support of their learning goals. One author began by coding Cameron’s interview, after which the authors convened to review and refine the codes, introducing new ones as needed, until consensus was achieved. The other authors then independently coded Bailey’s interview, applying this refined set of codes while adding new codes as they emerged. This initial coding was followed by another group review in which the author team reviewed all resource codes, negotiating differences until consensus was reached. One author then coded Alex’s first interview, followed by review by the author team. After coding for individual resources, we identified the co-occurrences of resources for each instructor. A co-occurrence was defined as two different resources appearing in the same quotation.
After identifying the resources each instructor described in their teaching practice, we worked to develop a holistic narrative of their framing. In particular, we considered the overarching threads in each instructor’s frame that we interpreted as guiding the instructor’s moment-to-moment and planning decisions. We compared which resources were present and absent for each instructor; a summary of these is included in Appendix B. We also used features of the qualitative analysis software (Atlas TI 9) to examine patterns in the data, for example, the co-occurrence of resources within a quotation. Finally, to illustrate how resources connected to these instructor frames, we identified key quotes that represented the resources described by each instructor and distilled those examples into the descriptions in the Findings section.
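To make the co-occurrence tallies concrete, the sketch below illustrates one way counts of this kind could be computed from coded quotations. It is an illustration only, with invented example codes and data; it does not reproduce the Atlas TI workflow used in the study.

```python
from itertools import combinations
from collections import Counter

# Hypothetical coded quotations: each quotation maps to the set of resource
# codes applied to it (the code names here mirror examples from the Findings).
coded_quotations = [
    {"Learning from peers", "Material tools", "Negotiate confusion", "Responsive teaching"},
    {"Traditional lecture", "Learn concepts"},
    {"Guided problem solving", "Learning from peers", "More problems are better"},
]

# Count every unordered pair of codes that appears within the same quotation;
# such counts correspond to edge weights in a co-occurrence diagram.
cooccurrence = Counter()
for codes in coded_quotations:
    for pair in combinations(sorted(codes), 2):
        cooccurrence[pair] += 1

for (code_a, code_b), count in cooccurrence.most_common():
    print(f"{code_a} <-> {code_b}: {count}")
```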
Our primary aim was to link instructor resources to the ways they framed their instruction, and from this analysis, explore how their use of educational technology aligns with those instructional frames. To do this, we analyzed the second interview after the resources had been defined from the first interview and the instruction had been described. In the second interview, we focused primarily on examples that demonstrated use of the educational technology tool. One author did the primary coding with frequent collaboration from the other authors.

2.6. Positionality and Trustworthiness

The research team attended to established criteria of qualitative trustworthiness across the study’s design, data collection, and analysis, including credibility, dependability, confirmability, and transferability (Forero et al., 2018; Stahl & King, 2020; Yin, 2018). The team’s diverse composition brought multiple and sometimes contrasting perspectives to the study’s design, data collection, coding, and analysis. The team was intentionally multidisciplinary, bringing complementary forms of expertise (Bransford et al., 2010). The team included a scholar entirely new to the data corpus (first author), a developer of the focal tool (second author), a researcher who had participated in previous interview analyses (third author), and a researcher who conducted the interviews (fourth author). As a collective, we were committed to understanding how individual intentions and social contexts interact in complex and generative ways, and to foregrounding the value of the varied experiences and perspectives that instructors bring to educational practice. Attending to our positionality, all analytic and interpretive decisions were made collaboratively through iterative discussion, with attention to surfacing and negotiating differing perspectives until shared interpretations were reached. Throughout analytic discussions, we intentionally created space for differing perspectives and attended to the motives and positionalities of each researcher as we worked to reconcile interpretive differences. This process supported consistency across analytic work. Rather than seeking a single “typical” or “ideal” framing, we examined longitudinal data from each instructor working across varied institutional contexts.

3. Results

3.1. Summary of Instructor Resources

This section addresses Research Question 1. Figure 1 shows the resources identified for each instructor (rectangles); the edges (lines) depict their co-occurrences, with thicker lines representing more frequent co-occurrences. A complete list of all resources is included in Appendix B. For each case, we present our holistic description of the instructor’s framing, followed by a detailed description of the evidence we used to interpret how the resources assembled to support that framing.

3.2. Case 1: Alex

3.2.1. Framing Summary

Alex’s instructional framing centers on didactically dispensing knowledge to students. Their instructional decision-making relies on the course textbook, organizing content around its chapters, and emphasizes constraints of the teaching environment, such as low TA support and a large class size. Alex’s frame, therefore, is largely bounded and siloed by this chapter-based organization. This is illustrated in Figure 1 by the co-occurrences between the resources “Learn concepts” and “Traditional lecture” as well as the absence of many of the student-centered resources the other instructors described.

3.2.2. Assembling of Resources into an Instructional Frame

In the first interview, Alex described their view that teaching is a fixed skill that they worked to develop early in their career but have since mastered:
The classes that I’ve been teaching, it’s sort of very standardized, and I’ve been doing that for a few years so, I haven’t been interacting that much [with other faculty and staff] in terms of what can be changed, or how can we improve, that kind of thing. But early on, I would go take the … there is a Center for Teaching and Learning, I would attend their workshops and what can be incorporated, there was project-based learning workshop that I attended … So I tried different things, it seems it [has] gotten more and more tough as the class sizes kept increasing. And accordingly the TA help and other things didn’t increase as much so it’s, to deviate from what the standard is, becomes harder, that’s what I’ve found. (Coded as Teaching mastery not ongoing practice and Managing instructor’s work)
  • Here, Alex described how early in their career, they worked to develop their teaching practice, but, in their framing, the course content has calcified because they have mastered it. They also frame course constraints, such as large class sizes and insufficient TA support, as reasons to bypass course innovation.
When asked to describe a typical class period for their fluids course, Alex said:
I’ll have lectures where we describe some theory … I put them [the slides] on Canvas without any solution so they can write on. And then I’ll solve sample questions explaining the process on how to solve them, and then towards the end we’ll have similar short answer type questions as well as more comprehensive questions that I’ll solve. (Coded as Material tools and Traditional lecture)
  • Here, Alex illustrates their ideas that students learn through didactic processes by being shown how to solve problems that build on theory. They go on to describe how they organize the course content according to chapters in the canonical textbook as well as their ideas of students’ motivation and engagement:
Usually that’s how I end up teaching most of the chapters, I’ll introduce the theory, although most of the students are not too excited about learning the theory with the derivation and so I kind of keep it short and then directly go more problem based and help them learn through visual graphics or through basic sample examples. So right now we are in more application like pipe flows and so I also make use of videos, those classic videos that have been developed … And they seem to like that, although I’ve received comments in the past that these videos are very old. (Coded as Learn concepts, Material tools, Responding to student feedback, and Traditional lecture).
  • Alex values showing students how to do things during lectures and believes that direct instruction has a greater impact than other methods, such as reading the textbook. They described this by saying: “They retain things only after doing or discussing these concepts in maybe 10 minutes of a lecture or so. If I never discussed I’d let them just have reading based, then they will not even retain any of that.” (Coded as Traditional lecture and Learn concepts). Here, Alex described how they cover topics with traditional lecture that includes introducing the theory, working basic examples, and then going through some applications. They described how they sometimes tailor their teaching to what students like (such as less emphasis on theory), but that this feedback is from only a few students and is not comprehensive data about student learning.
Alex also described their approach to assessing homework, which is not graded:
Currently I don’t do assigned homework which is graded … I do assign problems that I’ll say, these are the problems that you should look at, but we are not going to collect it and grade them, and that has become a norm because of the class sizes. I don’t have sufficient TA help for that … And then I will post solutions for those discussed, and then if there’s no grade associated with it, they tend to not be more motivated by that. But I tell them that your midterms will be based on these problems … really for this course, problem solving is critical. Rather than just reading the text or anything … there isn’t a weekly feedback to them in terms of how they are progressing, I just let them be aware of what’s coming. (Coded as Responding to instructional constraints and Students are motivated by points)
  • In these examples, Alex sees themselves as the conveyer of knowledge to the students and sees students as unmotivated and unable to learn on their own. They recognize both that practicing problem solving is critical for the course and that students are not really motivated without points assigned, and they acknowledge that this conflict does not support student motivation. However, Alex still believes this is the best approach, citing reasons such as the large class sizes and lack of TA support. Thus, in Alex’s framing, the constraints of the course workload weigh more heavily than what might be best for the students. They have not found alternative ways to address these tensions, nor have they more closely examined the needs of the students.

3.3. Case 2: Bailey

3.3.1. Framing Summary

Bailey’s instructional framing centers on developing students’ engineering problem-solving skills. They differentiate college engineering problems, where the solution path is not obvious at first glance, from the more straightforward “high school problems” that students have experienced. Their focus is to prepare students for a common final exam and for subsequent courses by teaching students to recognize certain types of problems and learn the procedures for solving them. This is illustrated in Figure 1 in the connections between resources.

3.3.2. Assembling of Resources into an Instructional Frame

Much of Bailey’s framing is influenced by the pressure they feel to cover certain topics and prepare students for both the next course and the common final exam.
Theoretically, I could just go in there and teach whatever, but … All of these classes … are classes that students go on … to classes for which dynamics is a prerequisite … It would be a great disservice to the students not to cover everything on the syllabus. (Coded as Prioritize content coverage, and Responding to instructional constraints)
  • Within Bailey’s framing, content coverage is an important resource. They feel a responsibility to the students to cover certain topics and make sure they understand those topics well. Another significant aspect of Bailey’s framing relates to organizing the delivery order of class topics to account for their complexity. For example, Bailey states:
I start off [my statics class] with equilibrium and as soon as we’ve finished equilibrium … I do friction … because I think it’s one of the hardest topics in statics. If you do it the week before the final, the students have barely had a chance to do the homework and they have very little understanding. But if I do it in the middle of the quarter … While I’m doing [the other topics] I can give them quizzes, like my five-minute quizzes … on friction to keep them thinking on friction and developing that a little more when they’ve had more time to think about it [coded as More problems are better, More time on topic = more learning, Short quizzes, and Timing of complex material]
  • Here, Bailey described their thinking about the order of topics in the course. This supports their framing about the importance of covering certain topics so that students are prepared for future courses, in this case dynamics. In addition, Bailey frames instruction around teaching problem solving and teaching students to think more like engineers:
I think a large part of statics and dynamics is an ability to solve problems. An ability to look at problems the way engineers look at problems, rather than the way high schoolers look at problems … One thing that’s really hard for people, especially coming into statics, is … they’ve done high school math and they’ve done some physics in college, and basically they know how to look at a problem and go, “Okay, well if I do this, this, and this I get my answer.” So, they’ve got the whole thing laid out in their mind beforehand how they’re going to solve this problem … it’s about teaching them like, “Okay, you have to start by reading the problem and understanding the problem,” which they’ve never had issues with before … Then to think about the problem maybe from a couple of different angles. (coded as Guided problem solving)
  • Here, Bailey described that their framing values teaching students to recognize and utilize problem-solving abilities for more complex problems than they have solved previously. What they refer to as “an ability to look at problems the way engineers look at problems” does not completely reflect engineering practice, in that they are not asking students to tackle ambiguous, open-ended problems (Jonassen et al., 2006; Vincenti, 1993), but rather how to approach a problem when the solution path is not immediately obvious.
Bailey’s framing influences the use of class time and instructional practices in several ways. For example, they said:
A typical class, I would probably talk as briefly as possible about the concepts. Then I would give the students a question. I would sort of make sure they set up the question and make sure they understand the question. … Then I would have the students work in groups to answer the question. … I’d give them a few minutes and then I’d be like, “Does everybody have a free body diagram? … If you’ve got diagrams like this, you’re on the right track. Okay, keep going.” Then I give them a little more time. … I feel like I don’t have enough class time … to go through as many examples as I’d like. I don’t feel I can just say, “Okay, here do this question,” and half an hour later restart the problem. I try to guide them as they’re going through the questions. [coded as Guided problem solving, Learning from peers, More problems are better, Responsive teaching, and Scaffold problem solving]
  • Here, Bailey described the guided problem-solving methods they use. This approach reflects their framing around the importance of learning to solve problems with non-obvious solutions and demonstrates how their framing influences their instruction. As described, Bailey controls the flow and timing of the class period and gives students small pieces of the problem to work through, emphasizing developing procedural knowledge.
Many of Bailey’s decisions are related to their framing of how students are motivated. In the interviews, Bailey described several instructional decisions based on students’ motivation:
The points thing just drives me nuts. [The students are] like, “How much is this worth?” In some ways it’s useful because you can get students to do stuff by saying, “Here’s this quiz. It’s worth two points.” I’m like, “Two points is nothing. It’s two points.” But they do it because it’s worth two points. [coded as Students are motivated by points]
  • Here, Bailey described their experiences with students being motivated by points. Their response to this framing was to give short quizzes worth points to get students to do the problems. This also reinforces the resource “More problems are better.”

3.4. Case 3: Cameron

3.4.1. Framing Summary

Cameron’s instructional framing centers on creating an active learning environment that foregrounds students’ interactions with material tools and with one another. Cameron also values providing students with opportunities for reflection. They elicit students’ thinking about the content and invite them to reflect on their own learning in ways that develop agency and prepare them for careers in engineering. This is illustrated in Figure 1 by the high connectedness of “Learning from peers” (6 co-occurrences) as well as the integration of several resources not identified for the other instructors, including “Student reasoning”, “Negotiate confusion”, “Connect to real life”, and “Responsive teaching.”

3.4.2. Assembling of Resources into an Instructional Frame

Cameron values preparing their students to be reflective, confident learners who can tackle challenges after leaving the class:
Most important to me is what they take away in their approach to learning and their approach to problem-solving. I hope that they are developing as a student and developing a better self-regulation. I really want them to learn how to learn. … We can talk all we want about content and technical stuff in engineering, but really, what we’re doing is we’re putting students through the ringer, really intense material for four years and training them how to learn complex material efficiently. And that’s what makes a successful engineer. Because you’re not going to go graduate and get a job doing what you learn, solving textbook problems. Right? … And so that’s really what I’m trying to engender in the students and get them to really take responsibility for the learning, not have me dictate so much what they get out of the class and have them articulate what they’re getting out of the class. [Coded as: Learning how to learn, Prepare students for professional practice, Problem solving over content coverage, Reflective thinking of students, and Connect to real life]
  • This quote illustrates Cameron’s thinking that their highest priority for students is broad—to learn how to learn. Rather than prioritizing specific technical content, as Bailey does, Cameron believes that learning how to learn, self-confidence, and taking responsibility for learning are critical characteristics for future engineers.
Consistent with this broad framing, Cameron described several strategies they use to put students in positions of confusion and the pedagogies they use to help students navigate this confusion:
[During class] I’ll have a series of questions that increase in difficulty and/or nuance, and we’ll kind of go through those until we get to that point, like, “Okay, here’s what we need to talk about, because this is where like more than half the class is not getting there.” … They will first answer on their own, then I’ll have them report their answers, select their answer on Concept Warehouse, and then, without even discussing it as a class, I have them discuss as a group at their table and talk about it and try and convince their neighbors that they’re correct. And then they answer again after that discussion. And then based on that answer, I decide whether or not we’re going to talk more about it as a class. Or I’ll just ask one or two students to share their explanations. … It all depends on how confused people are. [Coded as Learning from peers, Material tools, Negotiate confusion, and Responsive teaching]
  • Here, Cameron described how they put students in positions to make sense of their own thinking and how they adjust their teaching to guide students based on the support that students need. Cameron uses learning from peers in ways that require students to rely on their peers and work together in meaningful ways.
Across several timescales, their teaching is responsive. While the quotation above illustrates how they respond during an instructional activity, Cameron describes a responsive approach to course design as well. For example, they said, “I’m a constant tinkerer. Every course, I’m always tinkering around the edges. … My approach is basically like every year, I’ll pick one or two courses that I want to do pretty significant overhaul.” (Coded as Responding to instructional constraints.) In contrast to Alex, Cameron frames their own teaching as constantly working to improve and learn, just as they expect from their students.

3.5. Integration of Educational Technology

This section addresses Research Question 2, describing how each instructor’s frames and resources interacted with their use of the new technology tool. In the second interview, instructors described how they used the CW, using a shared screen to walk through a specific example within the instructor interface. We analyzed how their CW use aligned with each instructor’s framing.
In summary, Alex’s framing positioned the instructor as the deliverer of information in a lecture and the students as responsible for following the instructor’s lead. In line with this frame, Alex used the tool to assign students problems that were auto-graded. If students did not know the correct answer, they were told to discuss with their peers outside of class or ask questions in office hours. Alex did not require them to provide an explanation for their reasoning, instead emphasizing correctness. Bailey’s framing valued preparing students for future classes by teaching them the relevant concepts and procedures they would need. They viewed students as motivated by points and used points as leverage through short quizzes. They used the concept questions in the CW as a type of reward, an opportunity to think about problems differently than in regular lecture, separating the ideas in the concept questions from the lecture material they considered more central to the course. Finally, Cameron’s framing valued students learning how to learn and taking ownership of what they learned, with a particular emphasis on the importance of sense making. In line with this frame, when incorporating the new technology tool, Cameron used a format that presented students with a question designed to elicit confusion and then a student-led discussion to resolve that confusion with guidance from their peers and the instructor.
Figure 1 illustrates these differences. Alex had no co-occurrences between the resource “Material tools” and any other resources, indicating limited integration of the CW with their other instructional resources. Bailey had five co-occurrences of “Material tools” with other resources, especially with “Responding to instructional constraints” and “Learn concepts.” Cameron had ten co-occurrences of “Material tools” with other resources, most frequently with “Learning from peers”, “Negotiate confusion”, and “Responsive teaching.”
We next provide evidence for the ways the uptake of the technology tool aligned with each of the three instructors’ framings.

3.5.1. Case 1: Alex

Alex integrated the CW educational technology tool in ways that required minimal changes to their class. Alex continued to rely primarily on “traditional lecture” and “mini-projects”, having the students work on the concept questions in groups outside of class, and prioritizing “learn concepts”, with those concepts drawn from the topics covered in certain chapters of the textbook. These practices align with the resource “teaching mastery not ongoing practice”, in that Alex adapted the use of the new technology tool in ways that corresponded with their current teaching practice, rather than making changes to their teaching practice to utilize the new technology tool. For example, they wrote questions into the system that closely resembled questions they already asked the students, stating:
I had created these five or six different folders [in the CW] with my own questions … on first two chapters fluid properties and hydrostatics … I can click on one, these are the standard multiple choice, and I would ask them to work on this. They were allowed to discuss it with their peers because in my mini projects, I formed groups of up to three students. They would work in a group and submit one solution for the mini project. But these were all individual, the Concept Warehouse. But then I still ask them to discuss with your peers, or they could also ask me questions if they had … I wouldn’t pull up Concept Warehouse, but I would have a question similar on my slide and then discuss.
  • This approach did not require Alex to change the content of the course; rather, they used the CW as a tool to give students questions similar to those they had previously asked. In the same vein, they did not utilize the concept questions to encourage collaboration and spark student thinking in the ways that Cameron did. Similar to how they described their teaching style in the first interview, Alex did not discuss the concept questions in class but did encourage students to come to them with questions.
I use six of them [the questions], one for each sort of topic or two chapters in the text. I would select like 15, I had my own problems, I selected some of the already published problems in the warehouse, and I would assign them. … It was not a test, but I’d give them sufficient time, four or five days, and then they can ask me questions on those.
  • In this example, Alex, as the instructor, is still the authority and arbiter of knowledge. As another example, they described their approach to seeing the students’ responses and how they addressed them:
Once this was due and submitted, I would go through each question and I would sort of solve the problem myself and make a video of it not during the lecture, but separate video of how I would approach them. And I would post that video. In that sense, it was good because it didn’t really take too much of my class time. And also it gave them [students] feedback as to how to approach these questions.
  • In line with their framing, Alex was very focused on whether questions were answered right or wrong; for example, they described using auto-grading for concept questions to give students points:
Immediately, they said, “Are you actually grading us based on the response?” And I said, “No.” If the answer is wrong and even if they have a good explanation, we could grade that, but that’s just too much grading to go through. And so I said, “No, the grading is automatic based on whether your answer is correct or not, but I would encourage you to give an explanation.”
  • Here, Alex describes how the resources “managing instructor’s work” and “students are motivated by points” play into their use of the tool, in particular how the auto-grading function allows them to save time grading. Because Alex could not use auto-grading for the explanation, they did not require students to provide an explanation justifying their choice.

3.5.2. Case 2: Bailey

Like Alex, Bailey implemented the concept questions in ways that aligned with their framing. For example, in their second interview, they stated:
I was using the Concept Warehouse on Friday. I called it Concept Fun Friday. … I think the students just think they’re a kind of a fun thing … they get to think about stuff then. I think a lot of times lectures can get boring and they just sit there and write notes. Like I say, I have them do the examples, but I still talk about theory to some extent.
  • Here, they described how, to implement the CW, they taught the class as normal and gave the concept questions at the end of the week, as a treat for students who made it through the “boring” lectures. This aligns with how they position students for learning. For example, in the first interview they described their ideas about students being “motivated by points” and their role in manipulating those points as a tool to motivate students, such as through their method of using “short quizzes.” Here, they describe using the concept questions, and the fun environment they bring, to motivate students and frame learning as fun.
Bailey’s framing is also reflected in how they modified the CW to fit into their teaching. They stated:
The way I use it, I didn’t really use Concept Warehouse, I think, the way you intended it to be used. I just pulled the questions off and put them up on the projector. … I would like, “Okay, silence for the next minute or two, and I want you to think about what the answer is.” So they all got to sit there and read the question and in their mind have an answer, but they weren’t required. I’d be like, “Okay, do you have an answer? Which answer is it? … Is it A, B, C, or D?” … And then I would just have them discuss, and the learning assistants would go around and help discuss with each of the groups. We’d stop by, we’d be like, “Okay, so what are we thinking here?” And we’d actually have them verbalize some of it. And if they were way off track, we’d be like, “Well, have you thought about this?” Or give them some little hint or maybe point out the flaw in their argument.
  • Here, Bailey describes how they gave the students time to consider the questions and discuss with their peers and the learning assistants. This practice provided students opportunity to “learn from their peers” and near peers and have time to think about the concepts. Both of these are in line with their use of resources “learning from peers” and “more time on topic = more learning.” However, in this process, Bailey maintained most of the control of the pace and content of the discussions. The agency for learning and for the pace of learning stayed with the instructor, consistent with the “scaffolding problem solving” resource.
Another component of how their framing affected their use of the CW was in how they integrated it into their course schedule. For example, in showing an example question during the interview, they said:
This is the first question we do. Generally, we’ve just talked about pulleys at this point, or we’re just about to talk about pulleys. So this is nice because students don’t really need an equation for this. They don’t really need to understand about the method to solve for velocities of things with pulleys. They should be able to look at that and go, as block A moves to the left, if it moves half a meter to the left, B will move a meter up. And then there’s the same relationship between velocities. … It gives them an idea of thinking about something before they’ve actually learned about it. It’s a simple pulley situation. … This is simple enough that they could be like, I can figure that out without actually being told much because it’s a simple situation.
  • In this example, Bailey uses concept questions that they feel are “simple enough” that students can reason through the answer without numerical procedures. The CW tools are not displacing Bailey’s ideas about how to teach problem solving, but are pushing those ideas in ways that provide students new tools for thinking about problems. The CW fits within Bailey’s framing, but also stretches student thinking. Bailey has a frame that students must learn to recognize types of problems and be able to identify and apply the method needed for each problem. However, in this example, they state that the students have not learned the material yet, i.e., Bailey has not told them about it yet, rather than seeing this example as part of the learning process. Another component of their framing reflected here is the importance of the “timing of complex material.” Bailey views the order of topics and the pacing of the content as important parts of their role as the instructor.
Bailey felt significant pressure on their time and on the workload of the course. Bailey holds the resources “more problems are better” and “more time on topic = more learning.” Therefore, they feel compelled to include many things in the course, making it challenging to implement a new technology tool because it becomes an add-on to the pre-existing activities and content. In the second interview, Bailey was asked if they collected and used data through the CW. They replied:
I wasn’t. It was both the time issue and the student response to that, which I just felt was negative. I would’ve liked to have more information. But at the same time, I just didn’t have time. When you only have 50 minutes three days a week to get through the syllabus in dynamics, it’s just a constant push. … I’ve been teaching that class for so long now that I put so much stuff in there that it’s a real push to get it done.
  • This quote aligns with their framing that, as the instructor, they must maintain a rigorous pace for both themself and the students, and that the time available for coursework is a driving factor in their instructional decisions. Other components of their framing demonstrated here are using “conventional HW” and covering the content to “prepare students for following classes.” They view these as essential components of the course and prioritize time spent on them.

3.5.3. Case 3: Cameron

Like the other instructors, Cameron implemented the concept questions in ways that aligned with their framing. They integrated the CW in ways that prompted students’ confusion and then pushed students to confront that confusion in ways that were productive for their developing understanding. For example, in the second interview, they described their process for how they introduced the topic of moments showing a specific example from the CW:
So they have a reading assignment before each class meeting. … And this is a question that kind of focuses in, are they getting some of the basic concepts from the reading? And they use this to clarify misconceptions, and do some peer to peer instruction. … I’ll put this question up with explanations, … give them five, eight minutes to answer individually, and then once everyone’s answered, I will look at the results. … I will probably show that histogram to the students. … And I’ll say, “Go into your breakout rooms, and convince your classmates that you’re right.” … They go for five minutes, or whatever, and hash it out. And then I call them back, and ask them to answer again … then they answer again, and then I shut off the poll. Depending on timing, I may or may not prompt explanations for the second time. … If almost everyone’s converged on one answer, and hopefully it’s the correct answer, I’ll just ask the class. I’ll tell people that that was correct. [If not], then I’ll ask if someone wants to explain the reasoning … and as they’re explaining the reasoning, I will scribe.
  • Here, Cameron uses the CW questions to prompt students to think deeply about the assigned reading, requiring both individual explanations and group discussion. Their use aligns with their framing, utilizing “responsive teaching” as well as “learning from peers”, “connecting to prior knowledge”, and “learning concepts.” They take a data-driven approach and are transparent with the students about the data they are seeing. Additionally, they shift agency and responsibility for learning to the students, requiring students to rely on the reading and each other, rather than positioning themself as the dispenser of knowledge. They act as the “scribe” rather than the authority conveying the information.
An additional component of Cameron’s framing that came up frequently in the interviews was “negotiating confusion,” which Cameron described as a strategy they incorporated into their use of the CW:
This is one of those, where there’s just this really common point of confusion for students as we’re getting into 3D moments, and it’s a really simple geometry, and it’s proven to be a pretty good question to just elicit that confusion, and talk about it.
  • Here, Cameron explained how they used the CW questions in line with their framing, i.e., as a tool to surface a confusion that students had about the concepts and then talk about it.

4. Discussion

Building on those who have taken this approach to instructional decision making in biology (Russ & Luna, 2013) and chemistry (DeGlopper et al., 2023), we utilize a theoretical orientation of resources and frames to address the adaptation of an educational technology tool in engineering. As compared to instructor beliefs, which are considered fixed and stable, this orientation accounts more explicitly for the complex interactions between an instructor’s ideas and history and the context in which their moment-to-moment decisions and instructional planning occur. Importantly, by foregrounding the interactions between instructional choices and context, we avoid a deficit view in which we judge an instructor’s approach as good or bad; instead, we try to understand their decision making within their teaching context and build from that understanding.
We have presented evidence from three instructors, each with their own framing, and explicated how that framing coincided with the ways they adapted the use of a new technology tool. These instructors’ practices fit within observed teaching styles that have been reported as student-centered or instructor-centered (DeGlopper et al., 2023; Ferrare, 2019; Hora, 2014; Popova et al., 2021). For example, Cameron might be classified as student-centered with a framing based on supporting students to learn how to learn. Building on this framing, Cameron structured classes around prompting confusion about topics and supporting students to work with their peers to navigate that confusion. Correspondingly, Cameron used the CW to pursue increasingly complex concepts, promoting collaboration and sensemaking among students. In contrast, Alex might be classified as instructor-centered, framing teaching as a set of skills that involves transmitting concepts and problem-solving procedures to students, with the instructor as the arbiter of knowledge. Alex primarily used traditional lecture in large classes with little TA support and used the CW to assign auto-graded problems, prompting students to ask questions of their peers outside of class or during office hours, rather than using class time.
However, the labels of student- and instructor-centered are problematic in that instructors can simultaneously exhibit both, as Bailey did and as others in the literature have confirmed (Ferrare, 2019; Hora, 2014). A resources and frames orientation allows researchers and professional developers to untangle these aspects of practice. A shift in theoretical lens from beliefs to resources and frames is consistent with the trend in professional development models toward increasingly accounting for the influence of contextual factors on instructional approaches and decision making (Popova et al., 2021). Instructors value pedagogical collaboration and the development of communities as key aspects in their motivation and development (Smith & Wyness, 2024). Rather than considering the question of how beliefs change over time, we suggest a more fine-grained approach in which we examine and prompt instructional growth by considering which salient resources from the instructional context are activated and how they are assembled into frames.
Alex, Bailey, and Cameron each bring their own past experiences and ideas about learning to their instructional practice. They are also, undoubtedly, influenced by the contexts in which they teach, e.g., their class sizes, student populations, department culture, and work expectations outside of the classroom. We selected these three participants precisely because their combined attributes led to a wide set of frames. Our goal is not to establish causal relations between the specific individual characteristics of each instructor and their framing. Rather, by examining instructors with very different framings, we sought to unpack how an instructor’s framing, generally conceived, can influence their approach and specific choices in integrating educational technology in their teaching. Such an approach illustrates the critical link between resources, frames, and the uptake of a new educational technology tool.
This study has implications for the professional development of faculty, especially in regard to the adaptation of educational technology. Our study reveals how instructors tend to focus on the features that fit into their current resources and framings. The experienced instructors in this study adapted the use of the new technology tool in ways that aligned with their current teaching practice, rather than making changes to their teaching practice to utilize the new technology tool. Therefore, instructors, each with their own framing, will foreground aspects of a professional development experience that align with their framings. The application of these professional development experiences will in turn affect student experiences and learning, which is an area for further study. For example, it would be useful to study the extent to which the foci of an instructor’s framing correspond to student outcomes (e.g., How do Bailey’s students perform in future classes? How do Cameron’s students negotiate confusion and connect to real-world cases?).
We propose that professional development focused on the use of new technology tools must first attend to the salient resources instructors identify and how the technology fits within the corresponding framing. What are the instructors’ goals and constraints? How will those affect their use of the technology? There are, of course, contradictions and tensions here. For example, if research has shown that a particular method is effective for learning, it is logical to assume that instructors should use that method. However, if we tell instructors to use that method without attending to the practicalities of their current context, they might return to their classrooms and use it in ways that fit within their framing, not in the ways intended (Li et al., 2019; Tondeur et al., 2017). If a group of instructors is working together to implement a new technology tool, each comes with a unique perspective to add. If we attend to these perspectives by supporting instructors to understand their framings, goals, and practices, we can enrich how the technology is used. Additionally, in this study, Cameron added some content to the tool and felt ownership of that content. As shown in other work (e.g., Guzey et al., 2016), involving the instructor in the development of the tool can play an important role in their confidence with and application of the tool. Learning communities should therefore support instructor agency, including a role in the development of the educational technology itself.
We suggest that to make change in the use of educational technology, professional development programs must couple learning the technology with broader conversations about instructor framing and context. Fostering a culture of improvement and continual development, rather than a dichotomous view of experts “training” instructors, has the potential to integrate instructors’ framing with the use of the technology in ways that will be most effective for their implementation. Essential components of this structure include giving instructors material agency during the development of the tools and in learning how to use them, and having open conversations about the tools and their use. One way to conceptualize this structure is that the instructor has a conversation with the tool, about both what it can do and the obstacles to its use, and that conversation interacts with their framing to produce the ways they use the tool and to challenge their ideas about how to teach. When we interact with new technology tools, that process should be generative rather than limiting. Additionally, we need to help instructors connect to as many resources as possible, in meaningful ways that help them make connections between those resources.
This study has several limitations. To generate rich, longitudinal accounts of tool use and instructional decision-making, we focused on three instructors across different institutional contexts. The limited sample size, descriptive design, and reliance primarily on self-report data, hallmarks of case study research, require caution in interpretation. We selected the three instructors because they were experienced instructors with a variety of teaching approaches teaching in distinctly different contexts. The intent here is not to develop a comprehensive list of possible resources and frames. Rather, we sought to unpack the different resources and frames of these instructors and illustrate how such a theoretical lens could provide insight into their adaptation of a novel educational technology tool. Although the interview protocol incorporated elements of stimulated recall related to tool use, we did not directly observe classroom enactments of the tool in practice. At the same time, we took several steps to strengthen the analytic value of the study. We sought variation in instructors’ backgrounds and institutional contexts, worked as a multidisciplinary research team, and drew on detailed tool usage data to corroborate and extend our interpretations of instructors’ accounts. Together, these features support our analysis of how and why instructors take up a tool such as the one studied here. Because the tool was intentionally designed for flexible use, the patterns described here may not fully capture the adoption and adaptation of more constrained technologies. However, our emphasis on the resources the instructors identify and the frames they assembled from those resources is likely to be relevant to the uptake of other educational innovations more broadly.

5. Conclusions

In this multiple case study, we examined the framing of three experienced instructors at three different types of institutions and how their framing related to their uptake of a new educational technology tool. Instructional framing, like instructor beliefs, encompasses conceptions about teaching and learning, but it is distinct from that construct in foregrounding how instructors interact with context. Specifically, our approach shifts from a psychological framework that interprets instructors’ uses of technology based on their cognitive mental models (i.e., what’s “inside their head”) to emphasizing how instructors act, position themselves, and make technology choices within the systems and cultures in which they teach. We argue that this shift is needed to understand reported discrepancies between instructors’ expressed beliefs and their observed practice. Understanding an instructor’s framing, and the resources from which that framing is constituted, provides insight into how cultural elements (e.g., institutional pressures, disciplinary norms, and student expectations) influence practice and can, therefore, provide direct levers to enact strategies to shift practice.
We found notable differences in the framings of the three instructors and associated differences in how they integrated the technological tool in their classes. Overall, we found that the instructors adapted the use of the tool in ways that aligned with their framing, rather than making changes to their teaching practice to utilize the tool. Our findings suggest that to make change in how technological tools are used in teaching, professional development programs must attend to instructors’ framings. Professional development programs should provide agency for instructors to examine their own framings and make sense of how they will use the tools in their context. Integration of new technology tools should be a co-creation process in which instructors help shape the tools and their use. Co-creation increases the potential for instructors to align their framing with research-based, effective ways to integrate new technology tools while prompting them to better understand how to support student learning in their unique contexts.

Author Contributions

Conceptualization, A.E., M.K., B.S. and J.G.; methodology, A.E., M.K., B.S. and J.G.; formal analysis, A.E., M.K., B.S. and J.G.; investigation, A.E., M.K., B.S. and J.G.; resources, M.K. and B.S.; data curation, M.K. and J.G.; writing—original draft preparation, A.E., M.K., B.S. and J.G.; writing—review and editing, A.E., M.K., B.S. and J.G.; visualization, A.E. and M.K.; project administration, M.K. and B.S.; funding acquisition, M.K. and B.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the U.S. National Science Foundation, grant number 2135190.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Ethics Committee of Tufts University (protocol code IRB STUDY00001651, with approval granted on 21 June 2018).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The participants of this study did not give written consent for their data to be shared publicly.

Acknowledgments

We would like to thank the instructors for their participation in this study. We would like to thank Sandra Huffman for support on the graphic.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
CW: Concept Warehouse

Appendix A

Faculty Interview Questions
Level 1
1. [Institutional Context/Instructor Histories/Beliefs] I’d like to know more about your position at XXX. Specifically:
    • What is your official title?
    • How did you come to this position?—looking for a little professional history—did they come from industry, right after grad school, or from other institution…
    • What proportion of your job is teaching? What are the expectations other than teaching for your position?
    • What classes did you teach this academic year, 201x–201x?
    • How much autonomy do you have over what and how you teach?
    • How many years have you been teaching university classes?
2. [Learning Context (Including practices)] Tell me about the class you are using the CW in?
Probes:
    • How many students are enrolled?
    • What is the format (e.g., lecture, recitation, lab, studio)? What does a typical class look like for this course?
    • Do you have TAs or other instructional support? Probe: How many, what do they do?
    • Does this course have sections taught by other instructors?
    • How do you coordinate between sections?
    • To what extent do you take into account courses that come after this one in the curriculum?
    • How often and to what extent do you change aspects of the course (each time you teach, when you notice something problematic…)? Ask for an example.
3. [CW Context (including Features)] How are you or do you think you might use the CW in this class? And why?
  • What ways might it help students learn?
  • What challenges do you anticipate?
4. [Learning Context (Including practices)] I’d like to hear about your current assessment practices while teaching.
  • Describe the data/information you collect about student learning. Follow up for exams: what types of questions do you use (mc, problems, open-ended …) and why.
  • How do you use the information you collect?
  • Are there opportunities for students to use the information to reflect on their learning? Describe
5. [Instructor Histories/Beliefs] Think of a successful student in this course. What do they take away that’s most important to you?
  • Overall, what do you consider as the most effective teaching strategies towards developing these things?
  • To what extent do you employ these teaching strategies?
  • What are the most challenging aspects for students to learn? How do you approach those?
  • Based on what you know now about CW, how well does it align with your teaching practices?
6. [Institutional Context/Instructor Histories/Beliefs] Do you interact regularly with any others concerning issues of teaching and learning?
  • Are these people in your discipline/department/program?
  • What encourages or discourages these interactions?
  • Have these interactions had impact on how you think about or approach teaching and learning? Please explain how/why.
7. [Instructor Histories/Beliefs] Have you had any professional development experiences that have been influential in the way you teach? [If yes] Describe.
8. Do you have any other comments that might be useful to us for this project?
9. [Follow-up email question to instructors who decline Level 2] Can you let us know why you declined to participate in Level 2 of the Concept Warehouse project?
Level 2
1. [Institutional Context/Instructor Histories/Beliefs] I’d like to know more about your position at XXX. Specifically:
    • What is your official title?
    • How did you come to this position?—looking for a little professional history—did they come from industry, right after grad school, or from other institution…
    • What proportion of your job is teaching? What are the expectations other than teaching for your position?
    • What classes did you teach this academic year, 20xx–20xx?
    • How much autonomy do you have over what and how you teach?
    • How many years have you been teaching university classes?
2. [Learning Context (Including practices)] Tell me about the class you are using the CW in?
Probes:
    • How many students are enrolled?
    • What is the format (e.g., lecture, recitation, lab, studio)? What does a typical class look like for this course?
    • Do you have TAs or other instructional support? Probe: How many, what do they do?
    • Does this course have sections taught by other instructors?
    • How do you coordinate between sections?
    • To what extent do you take into account courses that come after this one in the curriculum?
    • How often and to what extent do you change aspects of the course (each time you teach, when you notice something problematic…)? Ask for an example.
3. Is this your first time using CW for in-class work?
4. How did you learn how to use the technology to deliver in class (or in some way other than the CI alone)? How did that go? Did you find it harder or easier than other technologies you have learned or adopted?
5. What kinds of ways did you use the CW this term? Did you need to change your approach to teaching to use it? In what ways?
6. (How) did you assign credit for student work in the CW?
7. [Learning Context (Including practices)] I’d like to hear about your current assessment practices while teaching.
  • Describe the data/information you collect about student learning. Follow up for exams: what types of questions do you use (mc, problems, open-ended …) and why.
  • How do you use the information you collect?
  • Are there opportunities for students to use the information to reflect on their learning? Describe
8. [We captured Zoom video of the responses to the following questions] Log onto CW and walk us through a question they have used in class, how they used it, why they used it, how did it go? What was the student response? Anything surprise you? [If we have observed them, direct these questions to the class session in which they were observed]
  • How did your experience with using CW go?
  • How much time does it take you to prepare and use the CW? Going forward, do you think preparation time will influence your choice to use the CW?
  • How did you use the students’ responses to questions? [refer to observation if we have observed them]
  • Did you use CW in any other ways this term?
9. [Instructor Goals/Conceptions] Think of a successful student in this course. What do they take away that’s most important to you?
  • Overall, what do you consider as the most effective teaching strategies towards developing these things?
  • To what extent do you employ these teaching strategies?
  • What are the most challenging aspects for students to learn? How do you approach those?
  • Based on what you know now about CW, how well does it align with your teaching practices?
10. In what ways do you think the CW might support student learning?
11. In what ways might the CW provide resources for you as an instructor?
12. [Institutional Context/Instructor Histories/Beliefs] Do you interact regularly with any others concerning issues of teaching and learning?
  • Are these people in your discipline/department/program?
  • What encourages or discourages these interactions?
  • Have these interactions had impact on how you think about or approach teaching and learning? Please explain how/why.
13. [Instructor Histories/Beliefs] Have you had any professional development experiences that have been influential in the way you teach? [If yes] Describe.
14. What suggestions do you have for making these experiences better for instructors or students?
15. Anything else you want to tell me or the developers?

Appendix B

Resources identified in the three cases
Complete list of resources identified for each instructor. Grey indicates resources that were mentioned and black indicates resources that were discussed in depth.
  • Instructional tools: “mini projects”; Conventional HW; Exams as growth opportunity; Material tools; Short quizzes; Traditional lecture
  • Instructional priorities: Extend examples to new problems; Guided problem solving; Having students take roles; Negotiate confusion; Scaffold problem solving; Managing instructor’s work; More problems are better; More time on topic = more learning; Responding to instructional constraints; Timing of complex material
  • Technical learning objectives: Learn concepts; Prioritize content coverage; Problem solving over content coverage
  • Socio learning objectives: Engineering identity; Learning how to learn; Reflective thinking of students; Growth in confidence; Participation NOT correctness; Process goals; Student reasoning
  • Student collaboration: Learning from peers; Common language; Student contributes to whole class discussion
  • Responsive: Responding to student feedback; Responsive teaching; Assessing student state of mind
  • Connections outside the class: Connect to prior knowledge; Connect to prior knowledge NOT; Prepare students for PE; Prepare students for professional practice; Prepare students for the following classes; Prepare students to pass the final; Connect to real life
  • Instructor self-reflective practices: Adapting practices from colleagues; Changing practice over the term
  • Mastery of teaching: Teaching mastery not ongoing practice
  • Student motivation: Student engagement strategies; Students are motivated by points; Motivating students

References

  1. Auby, H., & Koretsky, M. D. (2025, June 22–25). Student Perceptions of Learning and Engagement Using an Educational Technology Tool. 2025 ASEE Annual Conference & Exposition, Montreal, QC, Canada. [Google Scholar]
  2. Basckin, C., Strnadová, I., & Cumming, T. M. (2021). Teacher beliefs about evidence-based practice: A systematic review. International Journal of Educational Research, 106, 101727. [Google Scholar] [CrossRef]
  3. Birman, B. F., Desimone, L., Porter, A. C., & Garet, M. S. (2000). Designing professional development that works. Educational Leadership, 57(8), 28–33. [Google Scholar]
  4. Borko, H., Peressini, D., Romagnano, L., Knuth, E., Willis-Yorker, C., Wooley, C., Hovermill, J., & Masarik, K. (2000). Teacher education does matter: A situative view of learning to teach secondary mathematics. Educational Psychologist, 35(3), 193–206. [Google Scholar] [CrossRef]
  5. Bowen, A. S., Reid, D. R., & Koretsky, M. (2014, June 15–18). Development of interactive virtual laboratories to help students learn difficult concepts in thermodynamics. 2014 ASEE Annual Conference & Exposition, Indianapolis, Indiana. [Google Scholar]
  6. Bowman, M. A., Vongkulluksn, V. W., Jiang, Z., & Xie, K. (2022). Teachers’ exposure to professional development and the quality of their instructional technology use: The mediating role of teachers’ value and ability beliefs. Journal of Research on Technology in Education, 54(2), 188–204. [Google Scholar] [CrossRef]
  7. Bransford, J., Mosborg, S., Copland, M. A., Honig, M. A., Nelson, H. G., Gawel, D., Phillips, R. S., & Vye, N. (2010). Adaptive people and adaptive systems: Issues of learning and design. In Second international handbook of educational change (pp. 825–856). Springer Netherlands. [Google Scholar] [CrossRef]
  8. Burns, A. (1992). Teacher beliefs and their influence on classroom practice. Prospect, 7(3), 56–66. [Google Scholar]
  9. Cheng, S. L., Lu, L., Xie, K., & Vongkulluksn, V. W. (2020). Understanding teacher technology integration from expectancy-value perspectives. Teaching and Teacher Education, 91, 103062. [Google Scholar] [CrossRef]
  10. Chugh, R., Turnbull, D., Cowling, M. A., Vanderburg, R., & Vanderburg, M. A. (2023). Implementing educational technology in higher education institutions: A review of technologies, stakeholder perceptions, frameworks and metrics. Education and Information Technologies, 28(12), 16403–16429. [Google Scholar] [CrossRef]
  11. Cook, J., Ekstedt, T., Self, B., & Koretsky, M. (2022). Bridging the gap: Computer simulations and video recordings for remote inquiry-based laboratory activities in mechanics. Advances in Engineering Education, 10(2). [Google Scholar] [CrossRef]
  12. DeGlopper, K. S., Russ, R. S., Sutar, P. K., & Stowe, R. L. (2023). Beliefs versus resources: A tale of two models of epistemology. Chemistry Education Research and Practice, 24(2), 768–784. [Google Scholar] [CrossRef]
  13. Desimone, L. M., Porter, A. C., Garet, M. S., Yoon, S., & Birman, B. F. (2002). Effects of professional development on teachers’ instruction: Results from a three-year longitudinal study. Educational Evaluation and Policy Analysis, 24(2), 81–112. [Google Scholar] [CrossRef]
  14. Elby, A., & Hammer, D. (2010). Epistemological resources and framing: A cognitive framework for helping teachers interpret and respond to their students’ epistemologies. In Personal epistemology in the classroom: Theory, research, and implications for practice (pp. 409–434). Cambridge University Press. [Google Scholar]
  15. Escueta, M., Quan, V., Nickow, A. J., & Oreopoulos, P. (2017). Education technology: An evidence-based review. National Bureau of Economic Research. [Google Scholar] [CrossRef]
  16. Ferrare, J. J. (2019). A multi-institutional analysis of instructional beliefs and practices in gateway courses to the sciences. CBE Life Sciences Education, 18(2), ar26. [Google Scholar] [CrossRef]
  17. Fives, H., & Gill, M. G. (2015). International handbook of research on teachers’ beliefs. Routledge. [Google Scholar]
  18. Foley, K., O’Sullivan, D., & Cahill, K. (2025). Factors affecting primary teachers’ ability to engage in transformative professional learning. Professional Development in Education. [Google Scholar] [CrossRef]
  19. Forero, R., Nahidi, S., De Costa, J., Mohsin, M., Fitzgerald, G., Gibson, N., McCarthy, S., & Aboagye-Sarfo, P. (2018). Application of four-dimension criteria to assess rigour of qualitative research in emergency medicine. BMC Health Services Research, 18(1), 120. [Google Scholar] [CrossRef] [PubMed]
  20. Friedrichsen, D. M., Smith, C., & Koretsky, M. D. (2017). Propagation from the start: The spread of a concept-based instructional tool. Educational Technology Research and Development, 65, 177–202. [Google Scholar] [CrossRef]
  21. Gavitte, S. B., Koretsky, M. D., & Nason, J. A. (2025). Connecting affordances of physical and virtual laboratory modes to engineering epistemic practices. Journal of Computing in Higher Education, 37(1), 442–476. [Google Scholar] [CrossRef]
  22. Geertz, C. (2008). Thick description: Toward an interpretive theory of culture. In The cultural geography reader (pp. 41–51). Routledge. [Google Scholar]
  23. George, A. L., & Bennett, A. (2005). Case studies and theory development in the social sciences. MIT Press. [Google Scholar]
  24. Goffman, E. (1986). Frame analysis: An essay on the organization of experience. Northeastern University Press. [Google Scholar]
  25. Greeno, J. G. (1998). The situativity of knowing, learning, and research. American Psychologist, 53(1), 5. [Google Scholar] [CrossRef]
  26. Guzey, S. S., Moore, T. J., & Harwell, M. (2016). Building up STEM: An analysis of teacher-developed engineering design-based STEM integration curricular materials. Journal of Pre-College Engineering Education Research, 6(1), 2. [Google Scholar] [CrossRef]
  27. Hammer, D., Elby, A., Scherr, R. E., & Redish, E. F. (2005). Resources, framing, and transfer. In J. P. Mestre (Ed.), Transfer of learning from a modern multidisciplinary perspective (pp. 89–119). Emerald Publishing Limited. [Google Scholar]
  28. Haynes-Brown, T. K. (2024). Beyond changing teachers’ beliefs: Extending the impact of professional development to result in effective use of information and communication technology in teaching. Professional Development in Education, 51(1), 23–38. [Google Scholar] [CrossRef]
  29. Hora, M. T. (2014). Exploring faculty beliefs about student learning and their role in instructional decision-making. The Review of Higher Education, 38(1), 37–70. [Google Scholar] [CrossRef]
  30. Jonassen, D., Strobel, J., & Lee, C. B. (2006). Everyday problem solving in engineering: Lessons for engineering educators. Journal of Engineering Education, 95(2), 139–151. [Google Scholar] [CrossRef]
  31. Kagan, D. M. (1992). Implications of research on teacher beliefs. Educational Psychologist, 27(1), 65–90. [Google Scholar] [CrossRef]
  32. Kelly, P. (2006). What is teacher learning? A socio-cultural perspective. Oxford Review of Education, 32(4), 505–519. [Google Scholar] [CrossRef]
  33. Koretsky, M. D., Brooks, B. J., & Higgins, A. Z. (2016a). Written justifications to multiple-choice concept questions during active learning in class. International Journal of Science Education, 38(11), 1747–1765. [Google Scholar] [CrossRef]
  34. Koretsky, M. D., Brooks, B. J., White, R. M., & Bowen, A. S. (2016b). Querying the questions: Student responses and reasoning in an active learning class. Journal of Engineering Education, 105(2), 219–244. [Google Scholar] [CrossRef]
  35. Koretsky, M. D., Falconer, J. L., Brooks, B. J., Silverstein, D. L., Smith, C., & Miletic, M. (2014). The AIChE concept warehouse: A web-based tool to promote concept-based instruction. Advances in Engineering Education, 4(1), n1. [Google Scholar]
  36. Koretsky, M. D., Nolen, S. B., Galisky, J., Auby, H., & Grundy, L. G. (2024). Progression from the Mean: Cultivating instructors’ unique trajectories of practice using educational technology. Journal of Engineering Education, 113(1), 164–194. [Google Scholar] [CrossRef]
  37. Levin, D. M., Hammer, D., & Coffey, J. E. (2009). Novice teachers’ attention to student thinking. Journal of Teacher Education, 60(2), 142–154. [Google Scholar] [CrossRef]
  38. Li, Y., Garza, V., Keicher, A., & Popov, V. (2019). Predicting high school teacher use of technology: Pedagogical beliefs, technological beliefs and attitudes, and teacher training. Technology, Knowledge and Learning, 24(3), 501–518. [Google Scholar] [CrossRef]
  39. Luschei, T. F. (2014). Assessing the costs and benefits of educational technology. In Handbook of research on educational communications and technology (4th ed., pp. 239–248). Springer New York. [Google Scholar] [CrossRef]
  40. Mouza, C., Codding, D., & Pollock, L. (2022). Investigating the impact of research-based professional development on teacher learning and classroom practice: Findings from computer science education. Computers and Education, 186, 104530. [Google Scholar] [CrossRef]
  41. Nigon, N., Tucker, J. D., Ekstedt, T. W., Jeong, B. C., Simionescu, D. C., & Koretsky, M. D. (2024). Adaptivity or agency? Educational technology design for conceptual learning of materials science. Computer Applications in Engineering Education, 32(6), e22790. [Google Scholar] [CrossRef]
  42. Nolen, S. B., Ward, C. J., Horn, I. S., Childers, S., Campbell, S. S., & Mahna, K. (2009). Motivation development in novice teachers: The development of utility filters. In M. Wosnitza, S. A. Karabenick, A. Efklides, & P. Nenniger (Eds.), Contemporary motivation research: From global to local perspectives (pp. 265–278). Hogrefe & Huber. [Google Scholar]
  43. Popova, M., Kraft, A., Harshman, J., & Stains, M. (2021). Changes in teaching beliefs of early-career chemistry faculty: A longitudinal investigation. Chemistry Education Research and Practice, 22(2), 431–442. [Google Scholar] [CrossRef]
  44. Rodriguez, J. M. G. (2024). Using analytic autoethnography to characterize the variation in the application of the resources framework: What is a resource? Journal of Chemical Education, 101(9), 3676–3690. [Google Scholar] [CrossRef]
  45. Russ, R. S., & Luna, M. J. (2013). Inferring teacher epistemological framing from local patterns in teacher noticing. Journal of Research in Science Teaching, 50(3), 284–314. [Google Scholar] [CrossRef]
  46. Smith, B., & Wyness, L. (2024). What makes professional teacher development in universities effective? Lessons from an international systematised review. Professional Development in Education, 1–23. [Google Scholar] [CrossRef]
  47. Stahl, N. A., & King, J. R. (2020). Expanding approaches for research: Understanding and using trustworthiness in qualitative research. Journal of Developmental Education, 44(1), 26–28. [Google Scholar]
  48. Stroupe, D. (2016). Beginning teachers’ use of resources to enact and learn from ambitious instruction. Cognition and Instruction, 34(1), 51–77. [Google Scholar] [CrossRef]
  49. Tondeur, J., van Braak, J., Ertmer, P. A., & Ottenbreit-Leftwich, A. (2017). Understanding the relationship between teachers’ pedagogical beliefs and technology use in education: A systematic review of qualitative evidence. Educational Technology Research and Development, 65(3), 555–575. [Google Scholar] [CrossRef]
  50. Vannatta, R. A., & Fordham, N. (2004). Teacher dispositions as predictors of classroom technology use. Journal of Research on Technology in Education, 36(3), 253–271. [Google Scholar] [CrossRef]
  51. Vincenti, W. G. (1993). What engineers know and how they know it: Analytical studies from aeronautical history. Johns Hopkins University Press. [Google Scholar]
  52. Yin, R. K. (2013). Validity and generalization in future case study evaluations. Evaluation, 19(3), 321–332. [Google Scholar] [CrossRef]
  53. Yin, R. K. (2018). Case study research and applications: Design and methods (6th ed.). SAGE Publications Inc. [Google Scholar]
  54. Zech, L. K., Gause-Vega, C. L., Bray, M. H., Secules, T., & Goldman, S. R. (2000). Content-based collaborative inquiry: A professional development model for sustaining educational reform. Educational Psychologist, 35(3), 207–217. [Google Scholar] [CrossRef]
Figure 1. Co-Occurrences of resources for each instructor.
Table 1. Instructor characterization.
  • Alex: Full Professor; 15 years teaching; Large, Research intensive institution; Target class: Fluid Mechanics; Typical class size: 160; Context notes: Minimal teaching assistant support; Interview 1 (Resources and Frames): 26 February 2020; Interview 2 (Connection to educational technology tool): 16 March 2021
  • Bailey: Lecturer; 15 years teaching; Large, Teaching focused institution; Target class: Dynamics; Typical class size: 35; Context notes: High teaching load, Common final exam; Interview 1 (Resources and Frames): 10 July 2019; Interview 2 (Connection to educational technology tool): 3 March 2022
  • Cameron: Full Professor; 19 years teaching; Small, Community College; Target class: Statics; Typical class size: 20; Context notes: Small program, Teaches most of the engineering courses; Interview 1 (Resources and Frames): 12 August 2020; Interview 2 (Connection to educational technology tool): 2 January 2021
