Beyond Beliefs: Understanding Instructor Framing and the Uptake of Educational Technology in Engineering Education
Abstract
1. Introduction
1.1. Background Literature
1.2. Theoretical Framework
1.3. Research Questions
- Within different contexts, what resources do three instructors identify to support their teaching goals? How do instructors appear to assemble these resources into frames?
- How do the resources and frames interact with instructors’ use of the new technology tool?
2. Materials and Methods
2.1. Research Design
2.2. Concept Warehouse (CW)
2.3. Participants and Setting
2.4. Data Sources
2.5. Data Analysis
2.6. Positionality and Trustworthiness
3. Results
3.1. Summary of Instructor Resources
3.2. Case 1: Alex
3.2.1. Framing Summary
3.2.2. Assembling of Resources into an Instructional Frame
The classes that I’ve been teaching, it’s sort of very standardized, and I’ve been doing that for a few years so, I haven’t been interacting that much [with other faculty and staff] in terms of what can be changed, or how can we improve, that kind of thing. But early on, I would go take the … there is a Center for Teaching and Learning, I would attend their workshops and what can be incorporated, there was project-based learning workshop that I attended … So I tried different things, it seems it [has] gotten more and more tough as the class sizes kept increasing. And accordingly the TA help and other things didn’t increase as much so it’s, to deviate from what the standard is, becomes harder, that’s what I’ve found. (Coded as Teaching mastery not ongoing practice and Managing instructor’s work)
- Here, Alex described how, early in their career, they worked to develop their teaching practice; in their framing, however, the course has calcified because they have mastered its content. They also frame course constraints, such as large class sizes and insufficient TA support, as reasons to forgo course innovation.
I’ll have lectures where we describe some theory … I put them [the slides] on Canvas without any solution so they can write on. And then I’ll solve sample questions explaining the process on how to solve them, and then towards the end we’ll have similar short answer type questions as well as more comprehensive questions that I’ll solve. (Coded as Material tools and Traditional lecture)
- Here, Alex illustrates their idea that students learn through didactic processes, by being shown how to solve problems that build on theory. They go on to describe how they organize the course content according to chapters in the canonical textbook, as well as their ideas about students’ motivation and engagement:
Usually that’s how I end up teaching most of the chapters, I’ll introduce the theory, although most of the students are not too excited about learning the theory with the derivation and so I kind of keep it short and then directly go more problem based and help them learn through visual graphics or through basic sample examples. So right now we are in more application like pipe flows and so I also make use of videos, those classic videos that have been developed … And they seem to like that, although I’ve received comments in the past that these videos are very old. (Coded as Learn concepts, Material tools, Responding to student feedback, and Traditional lecture).
- Alex values showing students how to do things during lectures and believes that direct instruction has a greater impact than other methods, such as reading the textbook. They described this by saying: “They retain things only after doing or discussing these concepts in maybe 10 minutes of a lecture or so. If I never discussed I’d let them just have reading based, then they will not even retain any of that.” (Coded as Traditional lecture and Learn concepts). Here, Alex described how they cover topics with traditional lecture that includes introducing the theory, working basic examples, and then going through some applications. They described how they sometimes tailor their teaching to what students like (such as less emphasis on theory), but that this feedback is from only a few students and is not comprehensive data about student learning.
Currently I don’t do assigned homework which is graded … I do assign problems that I’ll say, these are the problems that you should look at, but we are not going to collect it and grade them, and that has become a norm because of the class sizes. I don’t have sufficient TA help for that … And then I will post solutions for those discussed, and then if there’s no grade associated with it, they tend to not be more motivated by that. But I tell them that your midterms will be based on these problems … really for this course, problem solving is critical. Rather than just reading the text or anything … there isn’t a weekly feedback to them in terms of how they are progressing, I just let them be aware of what’s coming. (Coded as Responding to instructional constraints and Students are motivated by points)
- In these examples, Alex sees themselves as the conveyor of knowledge to the students and sees students as unmotivated and unable to learn on their own. They recognize both that practicing problem solving is critical for the course and that students are not really motivated without points assigned, and they acknowledge that this conflict does not support student motivation. However, Alex still believes this is the best approach, citing reasons such as the large class sizes and lack of TA support. Thus, in Alex’s framing, the constraints of the course workload weigh more heavily than what might be best for the students. They have not found alternative ways to address these tensions, nor have they more closely examined the needs of the students.
3.3. Case 2: Bailey
3.3.1. Framing Summary
3.3.2. Assembling of Resources into an Instructional Frame
Theoretically, I could just go in there and teach whatever, but … All of these classes … are classes that students go on … to classes for which dynamics is a prerequisite … It would be a great disservice to the students not to cover everything on the syllabus. (Coded as Prioritize content coverage, and Responding to instructional constraints)
- Within Bailey’s framing, content coverage is an important resource. They feel a responsibility to students to cover certain topics and to make sure students understand those topics well. Another significant aspect of Bailey’s framing relates to organizing the delivery order of class topics to account for their complexity. For example, Bailey states:
I start off [my statics class] with equilibrium and as soon as we’ve finished equilibrium … I do friction … because I think it’s one of the hardest topics in statics. If you do it the week before the final, the students have barely had a chance to do the homework and they have very little understanding. But if I do it in the middle of the quarter … While I’m doing [the other topics] I can give them quizzes, like my five-minute quizzes … on friction to keep them thinking on friction and developing that a little more when they’ve had more time to think about it [coded as More problems are better, More time on topic = more learning, Short quizzes, and Timing of complex material]
- Here, Bailey described their thinking about the order of topics in the course. This supports their framing about the importance of covering certain topics so that students are prepared for future courses, in this case dynamics. In addition, Bailey frames instruction around teaching problem solving and teaching students to think more like engineers:
I think a large part of statics and dynamics is an ability to solve problems. An ability to look at problems the way engineers look at problems, rather than the way high schoolers look at problems … One thing that’s really hard for people, especially coming into statics, is … they’ve done high school math and they’ve done some physics in college, and basically they know how to look at a problem and go, “Okay, well if I do this, this, and this I get my answer.” So, they’ve got the whole thing laid out in their mind beforehand how they’re going to solve this problem … it’s about teaching them like, “Okay, you have to start by reading the problem and understanding the problem,” which they’ve never had issues with before … Then to think about the problem maybe from a couple of different angles. (coded as Guided problem solving)
- Here, Bailey described how their framing values teaching students to recognize and use problem-solving abilities on more complex problems than they have solved previously. What they refer to as “an ability to look at problems the way engineers look at problems” does not completely reflect engineering practice, in that they are not asking students to tackle ambiguous, open-ended problems (Jonassen et al., 2006; Vincenti, 1993), but rather to learn how to approach a problem when the solution path is not immediately obvious.
A typical class, I would probably talk as briefly as possible about the concepts. Then I would give the students a question. I would sort of make sure they set up the question and make sure they understand the question. … Then I would have the students work in groups to answer the question. … I’d give them a few minutes and then I’d be like, “Does everybody have a free body diagram? … If you’ve got diagrams like this, you’re on the right track. Okay, keep going.” Then I give them a little more time. … I feel like I don’t have enough class time … to go through as many examples as I’d like. I don’t feel I can just say, “Okay, here do this question,” and half an hour later restart the problem. I try to guide them as they’re going through the questions. [coded as Guided problem solving, Learning from peers, More problems are better, Responsive teaching, and Scaffold problem solving]
- Here, Bailey described the guided problem-solving methods they use. This approach reflects their framing around the importance of learning to solve problems with non-obvious solutions and demonstrates how their framing influences their instruction. As described, Bailey controls the flow and timing of the class period and gives students small pieces of the problem to work through, emphasizing developing procedural knowledge.
The points thing just drives me nuts. [The students are] like, “How much is this worth?” In some ways it’s useful because you can get students to do stuff by saying, “Here’s this quiz. It’s worth two points.” I’m like, “Two points is nothing. It’s two points.” But they do it because it’s worth two points. [coded as Students are motivated by points]
- Here, Bailey described their experience of students being motivated by points. Their response to this framing was to give short quizzes worth points to get students to do the problems. This also reinforces the resource “More problems are better.”
3.4. Case 3: Cameron
3.4.1. Framing Summary
Cameron’s instructional framing centers on creating an active learning environment that foregrounds students’ interactions with material tools and with one another. Cameron also values providing students with opportunities for reflection. They elicit students’ thinking about the content and invite them to reflect on their own learning in ways that develop agency and prepare them for careers in engineering. This is illustrated in Figure 1 by the high connectivity of “Learning from peers” (6 co-occurrences), as well as by the integration of several resources not identified for the other instructors, including “Student reasoning”, “Negotiate confusion”, “Connect to real life”, and “Responsive teaching.”
3.4.2. Assembling of Resources into an Instructional Frame
Most important to me is what they take away in their approach to learning and their approach to problem-solving. I hope that they are developing as a student and developing a better self-regulation. I really want them to learn how to learn. … We can talk all we want about content and technical stuff in engineering, but really, what we’re doing is we’re putting students through the ringer, really intense material for four years and training them how to learn complex material efficiently. And that’s what makes a successful engineer. Because you’re not going to go graduate and get a job doing what you learn, solving textbook problems. Right? … And so that’s really what I’m trying to engender in the students and get them to really take responsibility for the learning, not have me dictate so much what they get out of the class and have them articulate what they’re getting out of the class. [Coded as: Learning how to learn, Prepare students for professional practice, Problem solving over content coverage, Reflective thinking of students, and Connect to real life]
- This quote illustrates Cameron’s thinking that their highest priority for students is broad—to learn how to learn. Rather than prioritizing specific technical content, as Bailey does, Cameron believes that learning how to learn, self-confidence, and taking responsibility for learning are critical characteristics for future engineers.
[During class] I’ll have a series of questions that increase in difficulty and/or nuance, and we’ll kind of go through those until we get to that point, like, “Okay, here’s what we need to talk about, because this is where like more than half the class is not getting there.” … They will first answer on their own, then I’ll have them report their answers, select their answer on Concept Warehouse, and then, without even discussing it as a class, I have them discuss as a group at their table and talk about it and try and convince their neighbors that they’re correct. And then they answer again after that discussion. And then based on that answer, I decide whether or not we’re going to talk more about it as a class. Or I’ll just ask one or two students to share their explanations. … It all depends on how confused people are. [Coded as Learning from peers, Material tools, Negotiate confusion, and Responsive teaching]
3.5. Integration of Educational Technology
3.5.1. Case 1: Alex
- Alex integrated the CW educational technology tool in ways that required minimal changes to their class. Alex continued to rely primarily on “traditional lecture” and “mini-projects”, having the students work on the concept questions in groups outside of class, prioritizing “learn concepts”, and basing those concepts on the topics covered in certain chapters of the textbook. These practices align with the resource “teaching mastery not ongoing practice”, in that they adapted the use of the new technology tool in ways that corresponded with their current teaching practice, rather than making changes to their teaching practice to utilize the new technology tool. For example, they wrote questions into the system that closely resembled questions they already asked the students, stating:
I had created these five or six different folders [in the CW] with my own questions … on first two chapters fluid properties and hydrostatics … I can click on one, these are the standard multiple choice, and I would ask them to work on this. They were allowed to discuss it with their peers because in my mini projects, I formed groups of up to three students. They would work in a group and submit one solution for the mini project. But these were all individual, the Concept Warehouse. But then I still ask them to discuss with your peers, or they could also ask me questions if they had … I wouldn’t pull up Concept Warehouse, but I would have a question similar on my slide and then discuss.
- This approach did not require Alex to change the content of the course; rather, they used the CW as a tool to give students questions similar to those they had previously asked. In the same vein, they did not utilize the concept questions to encourage collaboration and spark student thinking in the ways that Cameron did. Consistent with the teaching style they described in the first interview, Alex did not discuss the concept questions in class but did encourage students to come to them with questions.
I use six of them [the questions], one for each sort of topic or two chapters in the text. I would select like 15, I had my own problems, I selected some of the already published problems in the warehouse, and I would assign them. … It was not a test, but I’d give them sufficient time, four or five days, and then they can ask me questions on those.
- In this example, Alex, as the instructor, is still the authority and arbiter of knowledge. As another example, they described their approach to seeing the students’ responses and how they addressed them:
Once this was due and submitted, I would go through each question and I would sort of solve the problem myself and make a video of it not during the lecture, but separate video of how I would approach them. And I would post that video. In that sense, it was good because it didn’t really take too much of my class time. And also it gave them [students] feedback as to how to approach these questions.
- In line with their framing, Alex was very focused on whether answers were right or wrong. For example, they described using auto-grading for concept questions to give students points:
Immediately, they said, “Are you actually grading us based on the response?” And I said, “No.” If the answer is wrong and even if they have a good explanation, we could grade that, but that’s just too much grading to go through. And so I said, “No, the grading is automatic based on whether your answer is correct or not, but I would encourage you to give an explanation.”
- Here, Alex describes how the resources “managing instructor’s work” and “students are motivated by points” play into their use of the tool, in particular how the auto-grading function allows them to save time grading. Because Alex could not use auto-grading for the explanations, they did not require students to provide an explanation justifying their choice.
3.5.2. Case 2: Bailey
I was using the Concept Warehouse on Friday. I called it Concept Fun Friday. … I think the students just think they’re a kind of a fun thing … they get to think about stuff then. I think a lot of times lectures can get boring and they just sit there and write notes. Like I say, I have them do the examples, but I still talk about theory to some extent.
- Here, they described how, to implement the CW, they taught the class as normal and gave the concept questions at the end of the week, as a treat for students who made it through the “boring” lectures. This aligns with how they position students for learning. For example, in the first interview they described their ideas about students being “motivated by points” and their role in manipulating these points as a tool to motivate students, such as with their method of using “short quizzes.” Here, they describe using the concept questions, and the fun environment the questions bring, to motivate students and to present learning as fun.
The way I use it, I didn’t really use Concept Warehouse, I think, the way you intended it to be used. I just pulled the questions off and put them up on the projector. … I would like, “Okay, silence for the next minute or two, and I want you to think about what the answer is.” So they all got to sit there and read the question and in their mind have an answer, but they weren’t required. I’d be like, “Okay, do you have an answer? Which answer is it? … Is it A, B, C, or D?” … And then I would just have them discuss, and the learning assistants would go around and help discuss with each of the groups. We’d stop by, we’d be like, “Okay, so what are we thinking here?” And we’d actually have them verbalize some of it. And if they were way off track, we’d be like, “Well, have you thought about this?” Or give them some little hint or maybe point out the flaw in their argument.
- Here, Bailey describes how they gave the students time to consider the questions and discuss them with their peers and the learning assistants. This practice provided students the opportunity to “learn from their peers” and near peers and to have time to think about the concepts. Both of these are in line with their use of the resources “learning from peers” and “more time on topic = more learning.” However, in this process, Bailey maintained most of the control of the pace and content of the discussions. The agency for learning and for the pace of learning stayed with the instructor, consistent with the “scaffold problem solving” resource.
This is the first question we do. Generally, we’ve just talked about pulleys at this point, or we’re just about to talk about pulleys. So this is nice because students don’t really need an equation for this. They don’t really need to understand about the method to solve for velocities of things with pulleys. They should be able to look at that and go, as block A moves to the left, if it moves half a meter to the left, B will move a meter up. And then there’s the same relationship between velocities. … It gives them an idea of thinking about something before they’ve actually learned about it. It’s a simple pulley situation. … This is simple enough that they could be like, I can figure that out without actually being told much because it’s a simple situation.
- In this example, Bailey uses concept questions that they feel are “simple enough” that students can reason through the answer without numerical procedures. The CW tools are not displacing Bailey’s ideas about how to teach problem solving, but are pushing those ideas in ways that provide students new tools to think about problems. The CW fits within Bailey’s framing, but also stretches student thinking. Bailey’s frame holds that students must learn to recognize types of problems and be able to identify and apply the method a problem requires. However, in this example, they state that the students have not yet learned the material, i.e., Bailey has not yet told them about it, rather than seeing this example as part of the learning process. Another component of their framing reflected here is the importance of the “timing of complex material.” Bailey views the order of topics and the pacing of the content as important parts of their role as the instructor.
I wasn’t. It was both the time issue and the student response to that, which I just felt was negative. I would’ve liked to have more information. But at the same time, I just didn’t have time. When you only have 50 minutes three days a week to get through the syllabus in dynamics, it’s just a constant push. … I’ve been teaching that class for so long now that I put so much stuff in there that it’s a real push to get it done.
- This quote aligns with their framing that, as the instructor, they must maintain a rigorous pace in the course for both themselves and the students, and that the time they have for coursework is a driving factor in their instructional decisions. Other components of their framing demonstrated here are using “conventional HW” and covering the content to “prepare students for following classes.” They view these as essential components of the course and prioritize time spent on them.
3.5.3. Case 3: Cameron
So they have a reading assignment before each class meeting. … And this is a question that kind of focuses in, are they getting some of the basic concepts from the reading? And they use this to clarify misconceptions, and do some peer to peer instruction. … I’ll put this question up with explanations, … give them five, eight minutes to answer individually, and then once everyone’s answered, I will look at the results. … I will probably show that histogram to the students. … And I’ll say, “Go into your breakout rooms, and convince your classmates that you’re right.” … They go for five minutes, or whatever, and hash it out. And then I call them back, and ask them to answer again … then they answer again, and then I shut off the poll. Depending on timing, I may or may not prompt explanations for the second time. … If almost everyone’s converged on one answer, and hopefully it’s the correct answer, I’ll just ask the class. I’ll tell people that that was correct. [If not], then I’ll ask if someone wants to explain the reasoning … and as they’re explaining the reasoning, I will scribe.
- Here, Cameron uses the CW questions to have students think deeply about the assigned reading, requiring both individual explanations and group discussion. Their use aligns with their framing, drawing on “responsive teaching” as well as “learning from peers”, “connecting to prior knowledge”, and “learning concepts.” They use a data-driven approach and are transparent with the students about the data they are seeing. Additionally, they shift agency and responsibility for learning to the students, requiring students to rely on the reading and each other, rather than positioning themself as the dispenser of knowledge. They act as the “scribe” rather than the authority conveying the information.
This is one of those, where there’s just this really common point of confusion for students as we’re getting into 3D moments, and it’s a really simple geometry, and it’s proven to be a pretty good question to just elicit that confusion, and talk about it.
- Here, Cameron explained how they used the CW questions in line with their framing, i.e., as a tool to surface a confusion that students had about the concepts and then talk about it.
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
| CW | Concept Warehouse |
Appendix A
Interview 1 Protocol (Resources and Frames)
- 1. [Institutional Context/Instructor Histories/Beliefs] I’d like to know more about your position at XXX. Specifically:
- What is your official title?
- How did you come to this position?—looking for a little professional history—did they come from industry, right after grad school, or from another institution…
- What proportion of your job is teaching? What are the expectations other than teaching for your position?
- What classes did you teach this academic year, 201x–201x?
- How much autonomy do you have over what and how you teach?
- How many years have you been teaching university classes?
- 2. [Learning Context (Including practices)] Tell me about the class you are using the CW in?
- How many students are enrolled?
- What is the format (e.g., lecture, recitation, lab, studio)? What does a typical class look like for this course?
- Do you have TAs or other instructional support? Probe: How many, what do they do?
- Does this course have sections taught by other instructors?
- How do you coordinate between sections?
- To what extent do you take into account courses that come after this one in the curriculum?
- How often and to what extent do you change aspects of the course (each time you teach, when you notice something problematic…)? Ask for an example.
- 3. [CW Context (including Features)] How are you or do you think you might use the CW in this class? And why?
- What ways might it help students learn?
- What challenges do you anticipate?
- 4. [Learning Context (Including practices)] I’d like to hear about your current assessment practices while teaching.
- Describe the data/information you collect about student learning. Follow up for exams: what types of questions do you use (mc, problems, open-ended …) and why.
- How do you use the information you collect?
- Are there opportunities for students to use the information to reflect on their learning? Describe
- 5. [Instructor Histories/Beliefs] Think of a successful student in this course. What do they take away that’s most important to you?
- Overall, what do you consider as the most effective teaching strategies towards developing these things?
- To what extent do you employ these teaching strategies?
- What are the most challenging aspects for students to learn? How do you approach those?
- Based on what you know now about CW, how well does it align with your teaching practices?
- 6. [Institutional Context/Instructor Histories/Beliefs] Do you interact regularly with any others concerning issues of teaching and learning?
- Are these people in your discipline/department/program?
- What encourages or discourages these interactions?
- Have these interactions had impact on how you think about or approach teaching and learning? Please explain how/why.
- 7. [Instructor Histories/Beliefs] Have you had any professional development experiences that have been influential in the way you teach? [If yes] Describe.
- 8. Do you have any other comments that might be useful to us for this project?
- 9. [Follow-up email question to instructors who decline Level 2] Can you let us know why you declined to participate in Level 2 of the Concept Warehouse project?
Interview 2 Protocol (Connection to the Educational Technology Tool)
- 1. [Institutional Context/Instructor Histories/Beliefs] I’d like to know more about your position at XXX. Specifically:
- What is your official title?
- How did you come to this position?—looking for a little professional history—did they come from industry, right after grad school, or from another institution…
- What proportion of your job is teaching? What are the expectations other than teaching for your position?
- What classes did you teach this academic year, 20xx–20xx?
- How much autonomy do you have over what and how you teach?
- How many years have you been teaching university classes?
- 2. [Learning Context (Including practices)] Tell me about the class you are using the CW in?
- How many students are enrolled?
- What is the format (e.g., lecture, recitation, lab, studio)? What does a typical class look like for this course?
- Do you have TAs or other instructional support? Probe: How many, what do they do?
- Does this course have sections taught by other instructors?
- How do you coordinate between sections?
- To what extent do you take into account courses that come after this one in the curriculum?
- How often and to what extent do you change aspects of the course (each time you teach, when you notice something problematic…)? Ask for an example.
- 3. Is this your first time using CW for in-class work?
- 4. How did you learn how to use the technology to deliver in class (or in some way other than the CI alone)? How did that go? Did you find it harder or easier than other technologies you have learned or adopted?
- 5. What kinds of ways did you use the CW this term? Did you need to change your approach to teaching to use it? In what ways?
- 6. (How) did you assign credit for student work in the CW?
- 7. [Learning Context (Including practices)] I’d like to hear about your current assessment practices while teaching.
- Describe the data/information you collect about student learning. Follow up for exams: what types of questions do you use (mc, problems, open-ended …) and why.
- How do you use the information you collect?
- Are there opportunities for students to use the information to reflect on their learning? Describe
- 8. [We captured Zoom video of the responses to the following questions] Log onto CW and walk us through a question they have used in class: how they used it, why they used it, how did it go? What was the student response? Anything surprise you? [If we have observed them, direct these questions to the class in which they were observed]
- How did your experience with using CW go?
- How much time does it take you to prepare and use the CW? Going forward, do you think preparation time will influence your choice to use the CW?
- How did you use the students’ responses to questions? [refer to observation if we have observed them]
- Did you use CW in any other ways this term?
- 9. [Instructor Goals/Conceptions] Think of a successful student in this course. What do they take away that’s most important to you?
- Overall, what do you consider as the most effective teaching strategies towards developing these things?
- To what extent do you employ these teaching strategies?
- What are the most challenging aspects for students to learn? How do you approach those?
- Based on what you know now about CW, how well does it align with your teaching practices?
- 10. In what ways do you think the CW might support student learning?
- 11. In what ways might the CW provide resources for you as an instructor?
- 12. [Institutional Context/Instructor Histories/Beliefs] Do you interact regularly with any others concerning issues of teaching and learning?
- Are these people in your discipline/department/program?
- What encourages or discourages these interactions?
- Have these interactions had impact on how you think about or approach teaching and learning? Please explain how/why.
- 13. [Instructor Histories/Beliefs] Have you had any professional development experiences that have been influential in the way you teach? [If yes] Describe.
- 14. What suggestions do you have for making these experiences better for instructors or students?
- 15. Anything else you want to tell me or the developers?
Appendix B
| Type of Resource | Resource | Alex | Bailey | Cameron |
|---|---|---|---|---|
| Instructional tools | “mini projects” | | | |
| | Conventional HW | | | |
| | Exams as growth opportunity | | | |
| | Material tools | | | |
| | Short quizzes | | | |
| | Traditional lecture | | | |
| Instructional priorities | Extend examples to new problems | | | |
| | Guided problem solving | | | |
| | Having students take roles | | | |
| | Negotiate confusion | | | |
| | Scaffold problem solving | | | |
| | Managing instructor’s work | | | |
| | More problems are better | | | |
| | More time on topic = more learning | | | |
| | Responding to instructional constraints | | | |
| | Timing of complex material | | | |
| Technical learning objectives | Learn concepts | | | |
| | Prioritize content coverage | | | |
| | Problem solving over content coverage | | | |
| Socio learning objectives | Engineering identity | | | |
| | Learning how to learn | | | |
| | Reflective thinking of students | | | |
| | Growth in confidence | | | |
| | Participation NOT correctness | | | |
| | Process goals | | | |
| | Student reasoning | | | |
| Student Collaboration | Learning from peers | | | |
| | Common language | | | |
| | Student contributes to whole class discussion | | | |
| Responsive | Responding to student feedback | | | |
| | Responsive teaching | | | |
| | Assessing student state of mind | | | |
| Connections outside the class | Connect to prior knowledge | | | |
| | Connect to prior knowledge NOT | | | |
| | Prepare students for PE | | | |
| | Prepare students for professional practice | | | |
| | Prepare students for the following classes | | | |
| | Prepare students to pass the final | | | |
| | Connect to real life | | | |
| Instructor self-reflective practices | Adapting practices from colleagues | | | |
| | Changing practice over the term | | | |
| Mastery of teaching | Teaching mastery not ongoing practice | | | |
| Student motivation | Student engagement strategies | | | |
| | Students are motivated by points | | | |
| | Motivating students | | | |
References
- Auby, H., & Koretsky, M. D. (2025, June 22–25). Student perceptions of learning and engagement using an educational technology tool. 2025 ASEE Annual Conference & Exposition, Montreal, QC, Canada. [Google Scholar]
- Basckin, C., Strnadová, I., & Cumming, T. M. (2021). Teacher beliefs about evidence-based practice: A systematic review. International Journal of Educational Research, 106, 101727. [Google Scholar] [CrossRef]
- Birman, B. F., Desimone, L., Porter, A. C., & Garet, M. S. (2000). Designing professional development that works. Educational Leadership, 57(8), 28–33. [Google Scholar]
- Borko, H., Peressini, D., Romagnano, L., Knuth, E., Willis-Yorker, C., Wooley, C., Hovermill, J., & Masarik, K. (2000). Teacher education does matter: A situative view of learning to teach secondary mathematics. Educational Psychologist, 35(3), 193–206. [Google Scholar] [CrossRef]
- Bowen, A. S., Reid, D. R., & Koretsky, M. (2014, June 15–18). Development of interactive virtual laboratories to help students learn difficult concepts in thermodynamics. 2014 ASEE Annual Conference & Exposition, Indianapolis, Indiana. [Google Scholar]
- Bowman, M. A., Vongkulluksn, V. W., Jiang, Z., & Xie, K. (2022). Teachers’ exposure to professional development and the quality of their instructional technology use: The mediating role of teachers’ value and ability beliefs. Journal of Research on Technology in Education, 54(2), 188–204. [Google Scholar] [CrossRef]
- Bransford, J., Mosborg, S., Copland, M. A., Honig, M. A., Nelson, H. G., Gawel, D., Phillips, R. S., & Vye, N. (2010). Adaptive people and adaptive systems: Issues of learning and design. In Second international handbook of educational change (pp. 825–856). Springer Netherlands. [Google Scholar] [CrossRef]
- Burns, A. (1992). Teacher beliefs and their influence on classroom practice. Prospect, 7(3), 56–66. [Google Scholar]
- Cheng, S. L., Lu, L., Xie, K., & Vongkulluksn, V. W. (2020). Understanding teacher technology integration from expectancy-value perspectives. Teaching and Teacher Education, 91, 103062. [Google Scholar] [CrossRef]
- Chugh, R., Turnbull, D., Cowling, M. A., Vanderburg, R., & Vanderburg, M. A. (2023). Implementing educational technology in higher education institutions: A review of technologies, stakeholder perceptions, frameworks and metrics. Education and Information Technologies, 28(12), 16403–16429. [Google Scholar] [CrossRef]
- Cook, J., Ekstedt, T., Self, B., & Koretsky, M. (2022). Bridging the gap: Computer simulations and video recordings for remote inquiry-based laboratory activities in mechanics. Advances in Engineering Education, 10(2). [Google Scholar] [CrossRef]
- DeGlopper, K. S., Russ, R. S., Sutar, P. K., & Stowe, R. L. (2023). Beliefs versus resources: A tale of two models of epistemology. Chemistry Education Research and Practice, 24(2), 768–784. [Google Scholar] [CrossRef]
- Desimone, L. M., Porter, A. C., Garet, M. S., Yoon, S., & Birman, B. F. (2002). Effects of professional development on teachers’ instruction: Results from a three-year longitudinal study. Educational Evaluation and Policy Analysis, 24(2), 81–112. [Google Scholar] [CrossRef]
- Elby, A., & Hammer, D. (2010). Epistemological resources and framing: A cognitive framework for helping teachers interpret and respond to their students’ epistemologies. In Personal epistemology in the classroom: Theory, research, and implications for practice (pp. 409–434). Cambridge University Press. [Google Scholar]
- Escueta, M., Quan, V., Nickow, A. J., & Oreopoulos, P. (2017). Education technology: An evidence-based review. National Bureau of Economic Research. [Google Scholar] [CrossRef]
- Ferrare, J. J. (2019). A multi-institutional analysis of instructional beliefs and practices in gateway courses to the sciences. CBE Life Sciences Education, 18(2), ar26. [Google Scholar] [CrossRef]
- Fives, H., & Gill, M. G. (2015). International handbook of research on teachers’ beliefs. Routledge. [Google Scholar]
- Foley, K., O’Sullivan, D., & Cahill, K. (2025). Factors affecting primary teachers’ ability to engage in transformative professional learning. Professional Development in Education. [Google Scholar] [CrossRef]
- Forero, R., Nahidi, S., De Costa, J., Mohsin, M., Fitzgerald, G., Gibson, N., McCarthy, S., & Aboagye-Sarfo, P. (2018). Application of four-dimension criteria to assess rigour of qualitative research in emergency medicine. BMC Health Services Research, 18(1), 120. [Google Scholar] [CrossRef] [PubMed]
- Friedrichsen, D. M., Smith, C., & Koretsky, M. D. (2017). Propagation from the start: The spread of a concept-based instructional tool. Educational Technology Research and Development, 65, 177–202. [Google Scholar] [CrossRef]
- Gavitte, S. B., Koretsky, M. D., & Nason, J. A. (2025). Connecting affordances of physical and virtual laboratory modes to engineering epistemic practices. Journal of Computing in Higher Education, 37(1), 442–476. [Google Scholar] [CrossRef]
- Geertz, C. (2008). Thick description: Toward an interpretive theory of culture. In The cultural geography reader (pp. 41–51). Routledge. [Google Scholar]
- George, A. L., & Bennett, A. (2005). Case studies and theory development in the social sciences. MIT Press. [Google Scholar]
- Goffman, E. (1986). Frame analysis: An essay on the organization of experience. Northeastern University Press. [Google Scholar]
- Greeno, J. G. (1998). The situativity of knowing, learning, and research. American Psychologist, 53(1), 5. [Google Scholar] [CrossRef]
- Guzey, S. S., Moore, T. J., & Harwell, M. (2016). Building up STEM: An analysis of teacher-developed engineering design-based STEM integration curricular materials. Journal of Pre-College Engineering Education Research, 6(1), 2. [Google Scholar] [CrossRef]
- Hammer, D., Elby, A., Scherr, R. E., & Redish, E. F. (2005). Resources, framing, and transfer. In J. P. Mestre (Ed.), Transfer of learning from a modern multidisciplinary perspective (pp. 89–119). Emerald Publishing Limited. [Google Scholar]
- Haynes-Brown, T. K. (2024). Beyond changing teachers’ beliefs: Extending the impact of professional development to result in effective use of information and communication technology in teaching. Professional Development in Education, 51(1), 23–38. [Google Scholar] [CrossRef]
- Hora, M. T. (2014). Exploring faculty beliefs about student learning and their role in instructional decision-making. The Review of Higher Education, 38(1), 37–70. [Google Scholar] [CrossRef]
- Jonassen, D., Strobel, J., & Lee, C. B. (2006). Everyday problem solving in engineering: Lessons for engineering educators. Journal of Engineering Education, 95(2), 139–151. [Google Scholar] [CrossRef]
- Kagan, D. M. (1992). Implications of research on teacher beliefs. Educational Psychologist, 27(1), 65–90. [Google Scholar] [CrossRef]
- Kelly, P. (2006). What is teacher learning? A socio-cultural perspective. Oxford Review of Education, 32(4), 505–519. [Google Scholar] [CrossRef]
- Koretsky, M. D., Brooks, B. J., & Higgins, A. Z. (2016a). Written justifications to multiple-choice concept questions during active learning in class. International Journal of Science Education, 38(11), 1747–1765. [Google Scholar] [CrossRef]
- Koretsky, M. D., Brooks, B. J., White, R. M., & Bowen, A. S. (2016b). Querying the questions: Student responses and reasoning in an active learning class. Journal of Engineering Education, 105(2), 219–244. [Google Scholar] [CrossRef]
- Koretsky, M. D., Falconer, J. L., Brooks, B. J., Silverstein, D. L., Smith, C., & Miletic, M. (2014). The AIChE concept warehouse: A web-based tool to promote concept-based instruction. Advances in Engineering Education, 4(1), n1. [Google Scholar]
- Koretsky, M. D., Nolen, S. B., Galisky, J., Auby, H., & Grundy, L. G. (2024). Progression from the Mean: Cultivating instructors’ unique trajectories of practice using educational technology. Journal of Engineering Education, 113(1), 164–194. [Google Scholar] [CrossRef]
- Levin, D. M., Hammer, D., & Coffey, J. E. (2009). Novice teachers’ attention to student thinking. Journal of Teacher Education, 60(2), 142–154. [Google Scholar] [CrossRef]
- Li, Y., Garza, V., Keicher, A., & Popov, V. (2019). Predicting high school teacher use of technology: Pedagogical beliefs, technological beliefs and attitudes, and teacher training. Technology, Knowledge and Learning, 24(3), 501–518. [Google Scholar] [CrossRef]
- Luschei, T. F. (2014). Assessing the costs and benefits of educational technology. In Handbook of research on educational communications and technology (4th ed., pp. 239–248). Springer New York. [Google Scholar] [CrossRef]
- Mouza, C., Codding, D., & Pollock, L. (2022). Investigating the impact of research-based professional development on teacher learning and classroom practice: Findings from computer science education. Computers and Education, 186, 104530. [Google Scholar] [CrossRef]
- Nigon, N., Tucker, J. D., Ekstedt, T. W., Jeong, B. C., Simionescu, D. C., & Koretsky, M. D. (2024). Adaptivity or agency? Educational technology design for conceptual learning of materials science. Computer Applications in Engineering Education, 32(6), e22790. [Google Scholar] [CrossRef]
- Nolen, S. B., Ward, C. J., Horn, I. S., Childers, S., Campbell, S. S., & Mahna, K. (2009). Motivation development in novice teachers: The development of utility filters. In M. Wosnitza, S. A. Karabenick, A. Efklides, & P. Nenniger (Eds.), Contemporary motivation research: From global to local perspectives (pp. 265–278). Hogrefe & Huber. [Google Scholar]
- Popova, M., Kraft, A., Harshman, J., & Stains, M. (2021). Changes in teaching beliefs of early-career chemistry faculty: A longitudinal investigation. Chemistry Education Research and Practice, 22(2), 431–442. [Google Scholar] [CrossRef]
- Rodriguez, J. M. G. (2024). Using analytic autoethnography to characterize the variation in the application of the resources framework: What is a resource? Journal of Chemical Education, 101(9), 3676–3690. [Google Scholar] [CrossRef]
- Russ, R. S., & Luna, M. J. (2013). Inferring teacher epistemological framing from local patterns in teacher noticing. Journal of Research in Science Teaching, 50(3), 284–314. [Google Scholar] [CrossRef]
- Smith, B., & Wyness, L. (2024). What makes professional teacher development in universities effective? Lessons from an international systematised review. Professional Development in Education, 1–23. [Google Scholar] [CrossRef]
- Stahl, N. A., & King, J. R. (2020). Expanding approaches for research: Understanding and using trustworthiness in qualitative research. Journal of Developmental Education, 44(1), 26–28. [Google Scholar]
- Stroupe, D. (2016). Beginning teachers’ use of resources to enact and learn from ambitious instruction. Cognition and Instruction, 34(1), 51–77. [Google Scholar] [CrossRef]
- Tondeur, J., van Braak, J., Ertmer, P. A., & Ottenbreit-Leftwich, A. (2017). Understanding the relationship between teachers’ pedagogical beliefs and technology use in education: A systematic review of qualitative evidence. Educational Technology Research and Development, 65(3), 555–575. [Google Scholar] [CrossRef]
- Vannatta, R. A., & Fordham, N. (2004). Teacher dispositions as predictors of classroom technology use. Journal of Research on Technology in Education, 36(3), 253–271. [Google Scholar] [CrossRef]
- Vincenti, W. G. (1993). What engineers know and how they know it: Analytical studies from aeronautical history. Johns Hopkins University Press. [Google Scholar]
- Yin, R. K. (2013). Validity and generalization in future case study evaluations. Evaluation, 19(3), 321–332. [Google Scholar] [CrossRef]
- Yin, R. K. (2018). Case study research and applications: Design and methods (6th ed.). SAGE Publications Inc. [Google Scholar]
- Zech, L. K., Gause-Vega, C. L., Bray, M. H., Secules, T., & Goldman, S. R. (2000). Content-based collaborative inquiry: A professional development model for sustaining educational reform. Educational Psychologist, 35(3), 207–217. [Google Scholar] [CrossRef]

| Characteristic | Alex | Bailey | Cameron |
|---|---|---|---|
| Rank | Full Professor | Lecturer | Full Professor |
| Years teaching | 15 years | 15 years | 19 years |
| Institution type | Large, research intensive | Large, teaching focused | Small, community college |
| Target class | Fluid Mechanics | Dynamics | Statics |
| Typical class size | 160 | 35 | 20 |
| Context notes | Minimal teaching assistant support | High teaching load; common final exam | Small program; teaches most of the engineering courses |
| Interview 1 (Resources and Frames) | 26 February 2020 | 10 July 2019 | 12 August 2020 |
| Interview 2 (Connection to educational technology tool) | 16 March 2021 | 3 March 2022 | 2 January 2021 |