STEM Faculty Instructional Data-Use Practices: Informing Teaching Practice and Students’ Reflection on Students’ Learning
Abstract
1. Introduction
1.1. Calls for Increasing Instructional Data-Use Practices in Postsecondary Education
1.2. What We Know About Postsecondary Faculty’s Instructional Data-Use Practices
1.3. Instructional Technology That Influences Instructional Data-Use Practices
1.4. Instructional Data-Use Practices That Involve Students Reflecting on Their Learning
1.5. Paper Focus
2. Methodology
2.1. Conceptual Frameworks
2.2. Research Questions
- What are the instructional data-use practices of a sample of STEM faculty from one U.S. research university and why?
- What affordances and constraints, including instructional technologies, do these faculty claim regarding their instructional data-use practices?
- To what extent do faculty engage students in reflecting on their own learning data?
2.3. Study Context
Participant Sample and Data Collection
- I know how to gather, analyze, and respond to data that informs my teaching.
- I regularly gather, analyze, and respond to data that informs my teaching.
- I am committed to gathering, analyzing, and responding to data that informs my teaching.
2.4. Data Analysis
- I’d like to hear more about your assessment practices while teaching.
- To what extent do you collect data/information about student learning?
- Are your teaching practices informed by data/information about student learning?
- Are there means in the classes/courses that you teach for students to reflect on their own learning data? (If yes), Can you detail these processes?
2.5. Limitations
3. Results
3.1. Types of Instructional Data Faculty Collected
3.1.1. Summative Data-Use Practices
Let’s start by saying roughly 70% of my student grades come from exams. There’s two midterms, and a final and those are the best way that I know that student is presenting me the information that they personally know and they’re not working with others.
We do very traditional assessments in a sense because, in the fall term, we have fourteen hundred students. We have ten weekly quizzes, and those are individual. We have ten small group activities, one per week. We have two midterm exams and a final consisting of a section of multiple-choice, which is out of practicality, and about forty percent of that exam is open-ended, so it’s free-response for students.
3.1.2. Perception of Changes to Summative Data Practices
This year was a pretty dramatic change to over fifty percent of the grade, and the assessment was not exam-based. So the students were writing papers, which they got formative feedback on, and they were developing presentations that they gave in class and also published on the website where they also had some feedback and revision steps there. Teaching assistants were assigned to some of those activities, so they hopefully got some fairly frequent feedback. Most people were working in groups rather than individually on some of those assignments, sharing their results, presenting them, and all of that was pretty high stakes because the total for those activities like I said, was over half the grade. So we deemphasized formal exams, there were midterms and final exams, but they were lower stakes. That came out of both the desire to get students more actively engaged in the material. On the assessment end, I think we’ve recognized, I’ve seen over many years, that exams are great for many students, but I think they don’t measure all student activity and success and learning.
3.1.3. Formative Data-Use Practices
In terms of formative assessment, I have used things like exit cards, where at the end of a class, I just have students [write down a question or comment]. There’s no grade at all attached to this. It’s just for me to get some sense of what did you [student] think was the most significant thing you learned today, what was the muddiest point. These kinds of quick questions that people jot down on a card and can even be anonymous, and then a quick look through all of that gives me a sense of, ‘oh wow,’ I really missed the boat here. I need to re-address that topic again.
...whether they want more coverage on a specific subject. In terms of reflecting on their own performance, I certainly think that when you make an assignment like full credit for participation or sort of the check if they are there working and engaged, it also sends a message to them about how they engage in the material. Both of those were sort of meant to reward them for being there and engaging, but not making it so high stakes, so it wasn’t supposed to be a stress out sort of thing.
The Tuesday problem or something like this, where I take a break in a two-hour lecture, a ten-minute break, and I put up a problem, and I say you guys are welcome to solve it or not, but when we get back from the ten minutes I’ll solve it, and then we’ll talk about it, and you put it up, and you walk around, and you see if people are trying and you kind of help them, or you give them pointers on what direction to go. So there is a way to create way more informal engagement by doing things that way because there’s very little stress because it doesn’t count for any points, really.
Most of my assessment, and I think this is true for most people, comes from informal interactions. Of course, it depends on the course, but oftentimes I informally really try to just talk to students as much as I can and see how things are going. I often say, ‘Hey, what do you like about the course? What don’t you like?’—again, about as informal as could be, but I sometimes find those are most valuable.
Then students in office hours, if they seem willing, I’ll often ask how do you feel about this content area, or even more specific things like I tried to tell you this, did you notice that in class, or do I need to do that differently.
I meet with my students a lot, so I hear what they’re saying, and I use that to inform where they’re at and where I think they need to be.
Qualitatively, I’m talking to my students constantly. I’ve developed a learning assistant program, so I have ten learning assistants, and they’re constantly giving me feedback about what’s working, what’s not working, helping me try to guide the students. And I have seven T.A.’s [teaching assistants] at any given moment, and we also have meetings every week.
3.2. Affordances That Influenced Instructional Data-Use Practices
3.2.1. Faculty and Organizational Student Assessment Norms
3.2.2. Instructional Technologies
Whenever we have a clicker response that’s less than 85%, we’ll spend time talking about why the right answer is right, why the wrong answer is wrong, and I’m always soliciting their voices for that. I’ve moved away from me explaining to getting them to explain and then affirming.
I do formative assessment in my classes through clickers, but I also have daily what I call submission sheets, so the groups work together to answer a couple of concept type questions, and they turn those into me, and I read those each day.
Students are supposed to take a pre-class quiz, but the catch is that in order to be able to access it, they have to have first viewed the video. So ideally, it sort of forces them to watch the video and then take the quiz. The nice thing was that I and the colleague I taught with, we would have a discussion before class every time of, ‘Hey, what questions on the quiz were they really getting? Which questions didn’t they get?’… we were flexible enough that we could go into class that day and say, ‘Hey, you know we realized we should spend a little more time on this.’ That was a huge change that we hadn’t [done previously]—we might have gotten a feel for it kind of walking around talking to students, but there we had very nice concrete data to inform what we would do and enough flexibility built in that we could say, ‘Hey, today we’re going to spend some more time going through Topic A quickly because most of you seem to be fine with that and spend more time on Topic B.
Adaptive learning has certainly been a refinement that I made because I went from an adaptive learning model where changes were being made to the curriculum based on student understanding, perhaps term by term, and now I’ve shortened that gap where feedback is immediate, evaluation is immediate, and then changes could be made for the very next assignment, which would be the next meeting. So I think that has been a real eye-opener in refining the response time, in that a change to the curriculum is not occurring the next term it’s occurring within the term, and, as a matter of fact, up to within two days.
I think the predictive analytics things are useful if it’s things like underrepresented minorities, first-generation college students, information like that, like more of the demographics of who my students are to figure out if there are pockets of the population that aren’t doing really well in the class.
3.3. Impediments to Instructional Data-Use Practices
3.3.1. Perceived Lack of Time to Engage in Instructional Data-Use Practices
Time constraints definitely hinder it [data-use practices], and they hinder actually doing anything innovative. That’s actually a huge problem.
That’s definitely something I’ve wanted to do more of, just the issue of where do you fit that into the curriculum, but I think that’s important, and I wish I was doing more.
A lot of times, I’ll do a short writing assignment, especially if I think they’re struggling with a concept, I’ll have them write about it. But there’s hundreds of them, so it’s difficult to get a lot out of that, although the students get a lot out of it.
Because of the class sizes and some people are teaching, some of the instructors are teaching four courses per quarter. I have another portion of my job is managing the math learning center, so I usually only teach three, but when they have that size classes, a large portion of the exam needs to be multiple choice.
3.3.2. Constraints Due to the Standardization of Course Content
I’m slowly trying to have conversations with the powers that be in the department to be adjustable [with doing exams] so maybe we’ll do group exams, or maybe we’ll try some other things other than just those very traditional midterm and final exam structures.
3.3.3. Perceived Lack of Confidence and Competence in Instructional Data-Use Practices
Yeah, they’re [instructional data-use practices] terrible. I know enough to reject a lot of common practices, but not enough to replace them with better alternatives. So I am really struggling with that right now. It’s not formulated at all.
I would say that’s probably the weakest part of my teaching practice. I’m not really formal about incorporating results of assessment into teaching, which sounds pretty bad. Yeah. Formative assessment, I read the literature, I drink the kool-aid, but that is the thing I drop the most in terms of my teaching practice. What I do is so informal. I don’t know if I can even describe it.
You know, gender-wise, honestly, I’ve come into this profession, and I’ve been an outsider. I’m not going to take something I do that’s different than what other faculty members do and advertise it. I may be very successful at it, but if I advertise it, there will be repercussions.
3.4. Engaging Students in Reflecting on Their Own Learning Data
Oh, no. Not aside for their own grades. They see averages and things like that. I guess that’s really professor-dependent, but for me, whenever I go over the exam, I always put out the bell curve and say this is the average, this is the standard deviation, this is the range of grades.
That’s the one where we add the question on their electronic evaluation of teaching, so in the electronic evaluation of teaching for the students, there’s a series of standard questions, I think there’s ten, how was the course basically, what was the instructor’s contribution to the course and there’s a few others, and then I add, and I advocate for all other faculty to do this as well, you add at the end of this course [a specific question related to the content].
[in the student’s voice] “I understood this material, I feel comfortable with this material,” and then they [students] produce a little bit of evidence and they will say things like “I am completely lost on buffer systems, I have no idea what is happening in a buffer system. I don’t even know what a buffer system is.” I think that’s part of the empowerment [of students]. I think that’s part of their confidence in that this seems to be very meta. So, students are plugged into their empowerment and their own understanding. They’re not looking at it as how I did on an exam. They’re looking at it as, I think I get this, I’m supposed to be learning these key concepts.
In some courses, I’ve gone as far as actually having students keep a journal of what they struggled with that they actually turn in with the homework. So there’s actually some “credit” awarded for going through that exercise. But I think the bigger value of that is getting the students themselves to reflect on their own learning.
When I’m teaching in the classroom, I also have them do some real-time writing. I think writing is a really good way to start to help them see what they don’t understand. So I have them do an individual note card where they write down an answer to a prompt, and then I have someone else, not me, read them, because there are seven hundred of them, and give me some summaries, and then I go back over that with them in the class as sort of a way to see if their thinking is right or what is a good response to these things versus what’s not a good response to these things. So those are kind of the way I think that they get to reflect on what they’re learning.
I often will do the muddiest point type thing, which has them reflect not so much on performance but on their level of understanding.
Sometimes they give them to me. They are in groups, they have assigned friends in my class, so they do share among their group members also.
It [student reflection on their own learning data] gets facilitated in smaller groups, in like a recitation situation. So normally my lecture would have a hundred people and then one day a week there’s four different classes of twenty-five. When [students] get their written homework back with some sort of marks on it, they’re encouraged to look over that and discuss the solutions that have been provided by me. They’re asked to compare and contrast between what their answer looks like, what the solution organization looks like. It’s the logical thought process of putting things together that I want them to focus on. So it’s sort of done in small groups, face-to-face discussions.
4. Discussion
4.1. Instructional Data-Use Practices and Motivations
4.2. Impediments to (Meaningful) Faculty Instructional Data-Use Practices
4.3. Supports of Faculty Data-Use Practices
4.4. Engaging Students in Reflecting on Their Learning
4.5. Further Recommendations
- Faculty leaders and professional development experts must foster ongoing and targeted professional development activities that support faculty toward improving their instructional data-use practices based on best practice research. Professional development activities can elicit perceptions and experiences that will help faculty see which instructional data-use practices afford them the greatest potential for analyzing instructional data that improve their teaching practices and student learning.
- Faculty leaders and faculty must commit to innovating and developing their instructional data-use practices, recognizing that research-based practices can inform teaching and improve student learning. Faculty are encouraged to explore the potential of collecting more formative data and to find efficient ways to gather and use those data in timely, relevant ways that inform their teaching and students’ learning.
- Faculty must recognize their shared responsibility for providing students the opportunity to reflect on and improve their learning. Fostering students’ ability to reflect on their learning benefits both students, through increased learning and achievement, and faculty, through data that inform adjustments to teaching that can enhance student learning.
- Faculty are encouraged to take advantage of instructional technologies available to them to enhance the gathering, analysis, and response to instructional data. Increasing faculty competence in collecting and responding to data that involves instructional technologies is critical. Faculty leaders and professional development experts must guard against mandating technology that is not perceived as relevant and usable.
- Faculty leaders and faculty are encouraged to reevaluate curricular content development and processes that may stifle faculty instructional data-use practices and explore changes to policies and norms that promote more research-based instructional data-use practices.
4.6. Future Research
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A
- Specifically: What is your official title?
- What classes did you teach this academic year, 2016–2017?
- Have your teaching responsibilities changed since last interviewed for this project?
- How much autonomy do you have over what and how you teach?
- (If yes), please provide detail regarding those interactions, including:
- who? Are these people in your discipline/department/program?
- how often?
- regarding what specifically?
- What encourages or discourages these interactions?
- Has (Name of Initiative) influenced these interactions in any way?
- What has been your affiliation with the (Name of Initiative) project? What activities have you attended?
- Have you noted any impact of (Name of Initiative) on you?
- Have you noted any impact of (Name of Initiative) on others?
- Have any university or departmental initiatives or teaching professional development opportunities impacted this evolution?
- Has (Name of Initiative) influenced your evolution in any way?
- To what extent do you collect data/information about student learning?
- Are your teaching practices informed by data/information about student learning?
- Are there means in the classes/courses that you teach for students to reflect on their own learning data? (If yes), Can you detail these processes?
- Overall, what do you consider as the most effective teaching strategies towards developing these things?
- To what extent do you employ these teaching strategies?
- What do you think about this goal and strategy? Do you have any evidence that widespread improvement to teaching practices and learning outcomes in undergraduate STEM education has happened in the last couple of years at (Name of University)?
- Can you attribute any changes to the (Name of Initiative) project?
- Have you noted any affordances and barriers towards widespread improvement to teaching practices and learning outcomes in undergraduate STEM education, that can inform efforts like (Name of Initiative)?
- What do you think about this goal and strategy?
- Do you have any evidence that active learning and cooperative learning have increased in large, introductory, gateway courses in the last couple of years at (Name of University)?
- (If so) Can you attribute any changes to the (Name of Initiative) project?
- Have you noted any affordances and barriers towards increased active learning and cooperative learning in large, introductory, gateway courses that can inform efforts like (Name of Initiative)?
Appendix B
Sample of Codes and Definitions

Code Names | Code Descriptions |
---|---|
Narrative Data Practices | Informal methods of collecting student learning data. Different formats: (a) examination of curricular artifacts, (b) notes to self about curricular artifacts, (c) qualitative data such as exit slips or muddiest-point exercises. |
Numeric Data Practices | Numeric data practices include formative data, summative data, and student evaluations. |
Formative Data | Strategies that were formative in nature and usually done throughout the course. Data are collected on the spot using technology such as clickers and online pre-quiz activities. Data are analyzed in real time, and changes or adjustments are made to teaching practices and decisions quickly. |
Summative Data | Commonly used forms of assessment such as weekly homework assignments and quizzes, midterms, final exams, and essays. Analyzed at the end of a section or term to guide decisions about teaching and course design for the next term. |
Student Evaluations | Data collected from students at the end of the term regarding feedback related to the instructor’s teaching practices. Generally done through an institution-wide process. |
Verbal Data Practices | Verbal data collected by the instructor either through talking with students or talking with teaching assistants or other instructors. |
References
- Bouwma-Gearhart, J.; Collins, J. What We Know about Data-Driven Decision Making in Higher Education: Informing Educational Policy and Practice. In Proceedings of the International Academic Conferences, Florence, Italy, 16–19 September 2015; pp. 89–131. [Google Scholar]
- Ewell, P.T.; Kuh, G.D. The state of learning outcomes assessment in the United States. High. Educ. Manag. Policy 2010, 22, 1–20. [Google Scholar]
- Jenkins, D.; Kerrigan, M.R. Evidence-Based Decision Making in Community Colleges: Findings from a Survey of Faculty and Administrator Data Use at Achieving the Dream Colleges; Columbia University: New York, NY, USA, 2008. [Google Scholar]
- Bouwma-Gearhart, J. Bridging the Disconnect between How We Do and Teach Science: Cultivating a Scientific Mindset to Teach in an Era of Data-Driven Education; IAP—Information Age Publishing Inc.: Charlotte, NC, USA, 2021. [Google Scholar]
- Hora, M.; Bouwma-Gearhart, J.; Park, H. Data driven decision-making in the era of accountability: Fostering faculty data cultures for learning. Rev. High. Educ. 2017, 40, 391–426. [Google Scholar] [CrossRef]
- McClenney, K.M.; McClenney, B.N.; Peterson, G.F. A culture of evidence: What is it? Do we have one? Plan. High. Educ. 2007, 35, 26–33. [Google Scholar]
- Bouwma-Gearhart, J.; Ivanovitch, J.; Aster, E.; Bouwma, A. Exploring postsecondary biology educators’ planning for teaching to advance meaningful education improvement initiatives. CBE Life Sci. Educ. 2018, 17, ar37. [Google Scholar] [CrossRef]
- Bouwma-Gearhart, J.; Hora, M. Supporting faculty in the era of accountability: How postsecondary leaders can facilitate the meaningful use of instructional data for continuous improvement. J. High. Educ. Manag. 2016, 31, 44–56. [Google Scholar]
- Meyer, J.W.; Rowan, B. Institutionalized organizations: Formal structure as myth and ceremony. Am. J. Sociol. 1977, 83, 340–363. [Google Scholar] [CrossRef] [Green Version]
- Spillane, J.P. Data in practice: Conceptualizing the data-based decision-making phenomena. Am. J. Educ. 2012, 118, 113–141. [Google Scholar] [CrossRef] [Green Version]
- Ahren, C.; Ryan, H.G.; Massa-McKinley, R. Assessment matters: The why and how of cracking open and using assessment results. Campus 2008, 13, 29–32. [Google Scholar] [CrossRef]
- Coburn, C.E.; Turner, E.O. The practice of data use: An introduction. Am. J. Educ. 2012, 118, 99–111. [Google Scholar] [CrossRef] [Green Version]
- Datnow, A.; Hubbard, L. Teacher capacity for and beliefs about data-driven decision making: A literature review of international research. J. Educ. Chang. 2016, 17, 7–28. [Google Scholar] [CrossRef]
- Halverson, R.; Prichett, R.; Grigg, J.; Thomas, C. The New Instructional Leadership: Creating Data-Driven Instructional Systems in Schools. WCER Working Paper No. 2005-9. Wis. Cent. Educ. Res. 2005, 25, 447–481. [Google Scholar]
- Mandinach, E.B. A perfect time for data use: Using data-driven decision making to inform practice. Educ. Psychol. 2012, 47, 71–85. [Google Scholar] [CrossRef]
- Blaich, C.F.; Wise, K.S. Moving from assessment to institutional improvement. New Dir. Institutional Res. 2010, 2010, 67–78. [Google Scholar] [CrossRef]
- Andrews, T.C.; Lemons, P.P. It’s personal: Biology instructors prioritize personal evidence over empirical evidence in teaching decisions. CBE Life Sci. Educ. 2015, 14, ar7. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Anderson, G.; Gregory, M.; Sun, J.C.; Alfonso, M. Effectiveness of statewide articulation agreements on the probability of transfer: A preliminary policy analysis. Rev. High. Educ. 2006, 29, 261–291. [Google Scholar] [CrossRef]
- Evans, C. Making sense of assessment feedback in higher education. Rev. Educ. Res. 2013, 83, 70–120. [Google Scholar] [CrossRef] [Green Version]
- Nicol, D.J.; Macfarlane-Dick, D. Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Stud. High. Educ. 2006, 31, 199–218. [Google Scholar] [CrossRef]
- Taras, M. Using assessment for learning and learning from assessment. Assess. Eval. High. Educ. 2002, 27, 501–510. [Google Scholar] [CrossRef]
- Kuh, G.D.; Ikenberry, S.O.; Jankowski, N.A.; Cain, T.R.; Ewell, P.T.; Hutchings, P.; Kinzie, J. Using Evidence of Student Learning to Improve Higher Education; Jossey-Bass: San Francisco, CA, USA, 2015; ISBN 13: 1118903391. [Google Scholar]
- Hora, M.; Holden, J. Exploring the role of instructional technology in course planning and classroom teaching: Implications for pedagogical reform. J. Comput. High. Educ. 2013, 25, 68–92. [Google Scholar] [CrossRef]
- Lester, J.; Klein, C.; Rangwala, H.; Johri, A. Learning analytics in higher education. ASHE High. Educ. Rep. 2017, 43, 1–145. [Google Scholar] [CrossRef]
- Austin, A.E. Promoting evidence-based change in undergraduate science education, Paper commissioned by National Academies Research Council Board on Science Education. In Fourth Committee Meeting on Status, Contributions, and Future Directions of Discipline-Based Education Research; Michigan State University: East Lansing, MI, USA, 2011. [Google Scholar]
- Fairweather, J. Linking evidence and promising practices in science, technology, engineering, and mathematics (STEM) undergraduate education. In Board of Science Education, National Research Council; The National Academies: Washington, DC, USA, 2008. [Google Scholar]
- Klein, C.; Lester, J.; Rangwala, H.; Johri, A. Technological barriers and incentives to learning analytics adoption in higher education: Insights from users. J. Comput. High. Educ. 2019, 31, 604–625. [Google Scholar] [CrossRef]
- Poulos, A.; Mahony, M.J. Effectiveness of feedback: The students’ perspective. Assess. Eval. High. Educ. 2008, 33, 143–154. [Google Scholar] [CrossRef]
- Sadler, D.R. Beyond feedback: Developing student capability in complex appraisal. Assess. Eval. High. Educ. 2010, 35, 535–550. [Google Scholar] [CrossRef] [Green Version]
- Ryan, M.; Ryan, M. Chapter 2—A model for reflection in the pedagogic field of higher education. In Teaching Reflective Learning in Higher Education: A Systematic Approach Using Pedagogic Patterns; Ryan, M., Ed.; Springer International Publishing AG: Cham, Switzerland, 2015; pp. 15–30. [Google Scholar]
- Halverson, R.R. Systems of practice: How leaders use artifacts to create professional community in schools. Educ. Policy Anal. Arch. 2003, 11, 37. [Google Scholar] [CrossRef] [Green Version]
- Gibson, J. The theory of affordances. In Perceiving, acting, and knowing; Shaw, R.E., Bransford, J., Eds.; Lawrence Erlbaum Associates: Hillsdale, NJ, USA, 1977. [Google Scholar]
- Norman, D.A. Affordance, conventions, and design. Interactions 1999, 6, 5. [Google Scholar] [CrossRef]
- Bouwma-Gearhart, J.; Lenz, A.; Ivanovitch, J. The interplay of postsecondary science educators’ problems of practice and competencies: Informing better intervention designs. J. Biol. Educ. 2018, 52, 1–13. [Google Scholar] [CrossRef]
- Hora, M. Organizational factors and instructional decision-making: A cognitive perspective. Rev. High. Educ. 2012, 35, 207–235. [Google Scholar] [CrossRef]
- Carnegie Classification. The Carnegie Classifications of Institutions of Higher Education. Available online: https://carnegieclassifications.iu.edu/ (accessed on 17 April 2020).
- Auerbach, C.F.; Silverstein, L.B. Qualitative Data: An Introduction to Coding and Analysis; New York University Press: New York, NY, USA, 2003. [Google Scholar]
- Montgomery, P.; Bailey, P.H. Field notes and theoretical memos in grounded theory. West. J. Nurs. Res. 2007, 29, 65–79. [Google Scholar] [CrossRef]
- Creswell, J.W. Research Design: Qualitative, Quantitative, and Mixed Methods Approaches, 4th ed.; SAGE: Los Angeles, CA, USA, 2014. [Google Scholar]
- Guinier, L. The Tyranny of the Meritocracy: Democratizing Higher Education in America; Beacon Press: Boston, MA, USA, 2015. [Google Scholar]
- McLaren, P. Critical pedagogy: A look at the major concepts. In The Critical Pedagogy Reader; Darder, A., Torres, R.D., Baltodano, M., Eds.; Routledge: New York, NY, USA, 2017; pp. 56–78. [Google Scholar]
- Kuh, G.D. What we’re learning about student engagement from NSSE: Benchmarks for effective educational practices. Change 2003, 35, 24–32. [Google Scholar] [CrossRef]
- Kahu, E.R.; Nelson, K. Student engagement in the educational interface: Understanding the mechanisms of student success. High. Educ. Res. Dev. 2018, 37, 58–71. [Google Scholar] [CrossRef]
- Shapiro, C.A.; Sax, L.J. Major selection and persistence for women in STEM. New Dir. Institutional Res. 2011, 2011, 5–18. [Google Scholar] [CrossRef] [Green Version]
- Bouwma-Gearhart, J. Teaching Professional Development of Science and Engineering Professors at a Research-Extensive University: Motivations, Meaningfulness, Obstacles, and Effects; University of Wisconsin-Madison: Madison, WI, USA, 2008. [Google Scholar]
- Bouwma-Gearhart, J. Research university STEM faculty members’ motivation to engage in teaching professional development: Building the choir through an appeal to extrinsic motivation and ego. J. Sci. Educ. Technol. 2012, 21, 558–570. [Google Scholar] [CrossRef]
- Fisher, K.Q.; Sitomer, A.; Bouwma-Gearhart, J.; Koretsky, M. Using social network analysis to develop relational expertise for an instructional change initiative. Int. J. STEM Ed. 2019, 6, 1–12. [Google Scholar]
- Pajares, M.F. Teachers’ beliefs and educational research: Cleaning up a messy construct. Rev. Educ. Res. 1992, 62, 307–332. [Google Scholar] [CrossRef]
- Bouwma-Gearhart, J.; Adumat, S. Fostering successful interdisciplinary postsecondary faculty collaborations. Int. J. Univ. Teach. Fac. Dev. 2011, 2, 207. [Google Scholar]
- Reinholz, D.L.; Matz, R.L.; Cole, R.; Apkarian, N. STEM is not a monolith: A preliminary analysis of variations in STEM disciplinary cultures and implications for change. CBE Life Sci. Educ. 2019, 18, mr4. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Bouwma-Gearhart, J.; Sitomer, A.; Fisher, K.; Smith, C.; Koretsky, M. Studying organizational change: Rigorous attention to complex systems via a multi-theoretical research model. In Proceedings of the 2016 ASEE Annual Conference & Exposition, New Orleans, LA, USA, 26 June 2016. [Google Scholar]
Discipline | n | % |
---|---|---|
Physics | 12 | 9 |
Chemistry | 18 | 14 |
Biology | 13 | 10 |
Mathematics | 27 | 21 |
Engineering | 57 | 45 |
Total | 127 | 30 |
Survey Item | M1 | SD |
---|---|---|
I know how to gather, analyze, and respond to data that informs my teaching. | 2.75 | 0.94 |
I regularly gather, analyze, and respond to data that informs my teaching. | 2.50 | 1.04 |
I am committed to gathering, analyzing, and responding to data that informs my teaching. | 2.74 | 1.02 |
Discipline | Participant | Professional Position |
---|---|---|
Physics | Robin | Fixed-Term Faculty |
Physics | Jamie | Tenure-Track Faculty |
Chemistry | Jordan | Fixed-Term Faculty |
Chemistry | Alex | Fixed-Term Faculty |
Chemistry | Sidney | Fixed-Term Faculty |
Chemistry | Casey | Fixed-Term Faculty |
Chemistry | Tracy | Tenure-Track Faculty |
Biology | Jodi | Fixed-Term Faculty |
Biology | Peyton | Fixed-Term Faculty |
Mathematics | Leslie | Tenure-Track Faculty |
Mathematics | Jackson | Fixed-Term Faculty |
Mathematics | Madison | Fixed-Term Faculty |
Mathematics | Kelly | Fixed-Term Faculty |
Mathematics | Drew | Tenure-Track Faculty |
Mathematics | Shannon | Tenure-Track Faculty |
Engineering | Lee | Tenure-Track Faculty |
Engineering | Bailey | Tenure-Track Faculty |
Engineering | Logan | Tenure-Track Faculty |
Engineering | Lynn | Tenure-Track Faculty |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).