Article

The Use of Research in Schools: Principals’ Capacity and Contributions

by
Elizabeth N. Farley-Ripple
School of Education, University of Delaware, Newark, DE 19716, USA
Educ. Sci. 2024, 14(6), 561; https://doi.org/10.3390/educsci14060561
Submission received: 18 February 2024 / Revised: 8 May 2024 / Accepted: 10 May 2024 / Published: 23 May 2024
(This article belongs to the Special Issue Educational Leadership in School Improvement)

Abstract:
Policy expectations for the role of research evidence in educational decision-making have grown exponentially in the U.S. and globally, yet there has been limited attention to school capacity to engage in evidence-informed improvement. In this paper, I address this gap by first conceptualizing principal leadership for evidence use and, second, using this conceptual lens to examine large-scale survey data about school evidence-use practices and capacity. Drawing on data from a national survey of more than 4000 educators in 134 schools in the U.S., I explore school practices and capacity to use research and surface opportunities and needs for principal leadership in evidence-informed improvement. Findings suggest that there is an opportunity to improve the role of research in school improvement decision-making, and that principals may contribute to school capacity in specific ways that relate to developing human capital, influencing culture, leveraging resources, and shaping decision-making. Data reveal moderate evidence of research use in agenda setting and of organizational routines that support research use, but limited uptake of those routines for research use as well as limited investment in resources (e.g., time). Further, decision-making was distributed across a wide range of improvement initiatives, with evidence of a lack of clarity about goals. Although principals report confidence and experience with using research, school staff overall reported limited prior experience with research, including coursework or participation in research, and low confidence in critically consuming research. Implications point to the need to strengthen principals’ own evidence-use capacity as well as to focus on dimensions of school capacity as part of evidence-use initiatives. Recommendations suggest strategies for developing principals’ knowledge and skills around leadership for evidence-informed improvement.

1. Introduction

Expectations regarding the role of research evidence in educational improvement efforts have grown exponentially across the globe. In the U.S., beginning with No Child Left Behind (NCLB) and continuing through the Every Student Succeeds Act, federal funding for education requires the use of evidence in improvement efforts. More recently, in the wake of the COVID-19 pandemic, the imperative to implement effective strategies for mitigating unfinished learning and learning loss has heightened those expectations, with Elementary and Secondary School Emergency Relief (ESSER) funding requiring evidence-based interventions [1,2].
Underlying these expectations are a number of assumptions about how evidence use—including the use of education research specifically—should play out in school contexts [3,4]. For example, there is an underlying assumption that mandated or incentivized research use will lead to changes in decision-making processes and school improvement efforts. Similarly, there are assumptions about evidence use itself, including the emphasis placed on research in decision making, and the relevance and availability of research to address the challenges schools face—particularly the unprecedented challenges presented by the pandemic.
Perhaps most importantly, expectations assume that schools and educators have the capacity to find, access, interpret, and take up findings from research in their policies and practices. Empirical studies, discussed later in this paper, consistently reveal that evidence use is a multifaceted, complex practice [5], influenced by various organizational conditions. Additionally, many aspects of capacity are important for supporting evidence-informed improvement, including individuals’ knowledge, skills, and beliefs, as are schools’ norms, culture, structure, and leadership. Despite these findings, the United States has seen relatively scant investment in building such capacity at the school level. Efforts in this area have predominantly focused on improving accessibility to research through initiatives like clearinghouses [6] and collaborative research endeavors like research–practice partnerships (RPPs) [7]. However, there has been a notable lack of attention paid to understanding and building organizational capacity for evidence use within schools, in contrast to, for example, the U.K., where there have been longstanding efforts to promote “research engaged schools” (e.g., [8,9,10]), albeit with mixed success [11].
In addressing this gap, particularly in the U.S. context, it is important to recognize the pivotal role of principals in shaping school capacity for evidence-informed improvement. Principals serve as critical levers in integrating research into school practice, acting as gatekeepers and brokers that influence what research is incorporated into decision-making processes [12,13,14]. They are also instrumental in mediating external policy demands within their organizational context, supporting sensemaking, and facilitating the implementation of evidence-based practices among staff (e.g., [15,16,17,18,19,20,21]). Moreover, these leaders are a critical lever in organizational change, ultimately influencing a wide range of outcomes, ranging from student achievement to the creation of more affirming and equitable learning environments [22,23]. They also exert indirect influence over a number of critical aspects of schooling, including those in-school conditions that enable instructional improvement [22,24,25]. Accordingly, leadership itself is a central component of, as well as a key mechanism for building, school capacity [26,27,28].
Presently, there is little empirical evidence about school capacity in terms of evidence-informed improvement and, more specifically, the role of school leaders—principals in particular—in building capacity for this work. Amid calls for the greater use of research evidence, a lack of evidence limits our understanding of how to support schools and their leaders in engaging with research evidence and serves as a barrier to achieving goals relating to evidence use across the US education system. This paper seeks to address this gap by first conceptualizing principal leadership in evidence use and, second, using this conceptual lens to examine large-scale survey data about school evidence-use practices and capacity. I am guided by the following overarching research question: In what ways can principals contribute to building schools’ capacity for evidence-informed improvement in the US?

2. Conceptualizing the Principal’s Role in Evidence-Informed Improvement

2.1. Evidence, Research, and Improvement

The unstated but understood logic of evidence-use policies and initiatives is that the use of evidence is a lever for systemic and school improvement [4,29,30,31]. In spite of this widely held belief regarding evidence-informed improvement, there is little shared understanding of what that means in practice. In fact, the role of evidence is described in many ways in the literature and in practice, varying in terms of both what is meant by evidence and what is meant by its use. Federal policy and discourse tend to privilege research and evaluation (for example, in the use of “scientific research” under NCLB), as well as systematically collected administrative data about teaching and learning (e.g., student demographic and assessment data). Even within the broader umbrella of scientific research, there are many different understandings about what “counts” as evidence that should inform action [32,33,34,35]. Still, studies of evidence use consistently find that other sources of evidence are frequently used to inform policy and practice, including anecdotal information, experiential knowledge, expert advice, and community input, among others (e.g., [4,36,37,38]). Furthermore, there are differences in the meaning of various phrases that invoke evidence, with evidence-informed or research-informed being distinct from evidence- or research-based [39]. Additionally, the use of evidence is multidimensional. Two distinct ways of framing this issue include the implementation of evidence-based practices and the use of evidence to inform educational decisions [40]. Use is also widely understood as instrumental (e.g., within decisions, such as the adoption of a curriculum based on research evidence), conceptual (e.g., engaging with evidence to develop new understandings or ideas about improvement work, such as through frameworks), and tactical (e.g., the use of evidence to justify or support a particular course of action).
It is important to acknowledge that all of these conceptualizations of evidence and use may be part of school improvement, which includes all efforts designed to generate growth in student learning and other educational outcomes. They offer valid and valuable ways of considering the role of evidence in education, although it is beyond the scope of this paper to elaborate on the underlying assumptions and merits of each. Rather, for the purposes of this paper, I refer primarily to the use of research, both in order to focus on a particular form of evidence and to be inclusive of the various ways research may influence schools’ work. It is worth noting that this broader conceptualization is also a limitation, as there may be important nuances in schools’ evidence-use and improvement practices that cannot be explored as a result.

2.2. Capacity for Research Use

Newmann, King, and Youngs describe school capacity as “the collective power of the full staff to improve student achievement schoolwide” [28] (p. 261), and the subsequent literature has identified many dimensions of that collective power. King and Bouchard [41], building off prior syntheses, define key dimensions of capacity as inclusive of teachers’ knowledge, dispositions, and skills; professional community; technical resources; program coherence; and principal leadership. Thoonen and colleagues [42] productively frame the literature on school capacity as consisting of two views: inside and outside. The inside view of capacity, much like that of King and Bouchard [41], draws on the literature on organizational change and learning, emphasizing structural, cultural, and political conditions which support or constrain capacity for improvement. Summarizing prior studies, they point to practices such as participatory decision making, teacher collaboration, an open and trustful climate, cultures which value shared responsibilities and values, and transformational leadership as the characteristics of capacity for improvement. Thoonen and colleagues add an external view of capacity to this understanding, attending to the implementation of new policies and practices, including the scale-up of reforms. From this external lens, conditions such as strong principal leadership, educator beliefs and buy-in, resources and time, and district support are necessary for successful implementation.
Studies of research use in education suggest that both internal and external views of capacity are relevant to whether and how research informs improvement efforts. For example, prior research establishes the importance of and variability in individuals’ confidence and ability to effectively analyze and apply evidence, their beliefs about the role and value of evidence to their work, and perceptions of feasibility and available resources (e.g., [43,44,45,46,47,48,49,50]). Similarly, research suggests that decision makers’ individual experiences and beliefs impact the way in which evidence is processed (e.g., [51,52,53,54]). Studies also show that educators’ and leaders’ own professional networks, including both their relationships with colleagues and connections to external resources, matter in terms of the kinds of evidence they search for and use [37,50,55].
Beyond individuals, studies consistently find that the organizational context enables or constrains research use. Schools and districts, as well as other levels of the education policy system, vary in terms of the human and financial resources available to support evidence use; the allocation of time for evidence use; whether their culture features trust, collaboration, and norms for evidence use; leadership; and structures and processes that facilitate the communication of and about evidence [3,16,43,44,49,56,57,58].

2.3. Principal Contributions to Capacity for Research Use

Within school capacity, leadership is conceptualized as having a critical role. For example, building on prior conceptualizations, Dimmock [26] characterized leadership as having the “aim of building capacity by optimizing available resources towards the achievement of shared goals” (p. 7). The prior literature clearly establishes that leadership figures prominently as a critical element of school capacity in general, as well as of capacity for research use specifically [8,59]. In fact, Brooks and colleagues argue that “the effective use of data/evidence and research to inform school improvement changes is now considered to be a foundational aspect of instructional principals’ educational ‘toolkit’” [60] (p. 168). Biddle and Saha [61] further explain that principals’ influence on research use can be both indirect, by building school capacity, and direct, through their own use of research.
In terms of principals’ own capacity and practice, several studies offer insight. For example, Penuel and colleagues [62] find frequent reports of using research instrumentally and conceptually, of high value being assigned to research, of moderate effort being dedicated to finding and accessing educational research, and of the use of professional networks to access research. These findings are similar to Biddle and Saha’s earlier study of U.S. and Australian principals, which found that most school leaders value educational research, are regularly exposed to and access information about educational research, and engage their schools in using that information [61]. Principals’ own capacity can also influence schools’ evidence use, through modeling, for example, as prior studies suggest principals “play a key role in nurturing or disrupting how research is used in schools depending on how they conceive of what constitutes valid evidence” [44], cited in [60].
Indirect influences on capacity can be understood in several ways. First, leaders can develop human capital for the use of research evidence through staffing decisions and professional development [57,63,64,65]. Similarly, in schools characterized as deep users of research, school and district leaders hire staff with specific evidence-use capabilities, build capacity through selecting staff to serve on teams that engage with evidence in improvement decisions, and engage consultants or district supports to assist teachers with evidence use [66]. Leaders can also influence the culture of evidence use through the establishment of organizational routines and tools centered on evidence use [43,59,67,68]; the allocation of resources to enable evidence use [57,65]; cultivating a culture of collaboration, trust, and inquiry [65,67,69]; strengthening teams and interpersonal processes [67]; and fostering internal social capital and networks [57,65]. Relatedly, they can leverage external resources and expertise by building connections to external networks for accessing research [61,63,64,66], buffering external influences that distract from evidence-based practice [65,67], and advocating for the use of evidence-based practices [65]. Last, leaders can shape decision-making processes. This includes diagnostic and tactical leadership to better understand school processes and context [67], forming goals linked to organizational needs [65], developing shared leadership [65,67], and establishing an evidence-based agenda [57].

3. The Present Study

Drawing on the literature, I conceptualize the relationship between school capacity for research use and principals’ contributions to school capacity, as illustrated in Figure 1. With this conceptualization in mind, the purpose of this paper is to answer the question: In what ways can principals contribute to building schools’ capacity for evidence-informed improvement? This is accomplished by examining schools’ capacity for and use of research in the U.S. context (i.e., the right side of Figure 1) and using principals’ contributions (i.e., the left side of Figure 1) as a lens for identifying specific needs and opportunities for principals to build school capacity for the use of research in educational improvement. Specifically, I use large-scale survey data from a national sample of U.S. schools to describe (1) the extent to which research evidence informs school decisions, (2) schools’ capacity for research use, (3) principals’ capacity (e.g., knowledge and skills) for research use, and (4) the ways in which principals may contribute to improving schools’ capacity for research use.
Analyses feature cross-sectional survey data from a national study of schools’ use of research conducted by the Center for Research Use in Education between 2018 and 2020. The Survey of Evidence in Education-Schools (SEE-S) [70] is organized around a conceptual framework [71] that identifies multiple dimensions of schools’ research use practice and the factors that shape those processes. The larger project uniquely explores the role of different forms of evidence and information in school decision making at scale, enabling a landscape view of evidence-use practices in the context of U.S. schools.

3.1. Sample

The SEE-S was administered online to schools’ instructional staff, including teachers, coaches, other specialists, administrators, and paraprofessionals, and to a member of the district office—which I refer to collectively as practitioners or educators throughout this report—during the 2018–2019 and 2019–2020 school years. A total of 134 traditional public schools from 21 districts, as well as 20 schools from 10 charters (5 rural, 13 suburban, and 13 urban), were successfully recruited into the sample for the SEE-S field trial administration. These schools and districts represent 18 different states, with the proportions of elementary, middle, and high schools mirroring national proportions. The overall individual-level response rate for the SEE-S was 53%. Response rates by school ranged from 0% to 100%, with the average school response rate being 56%. The final sample comprised 4415 school-based practitioners, including 25 district staff. About 4% (n = 181) of the sample identified as a school administrator (principal or assistant principal; heretofore, principals), 64% (n = 2818) identified as a classroom teacher, 10% (n = 420) identified as a special education teacher, 7% (n = 298) as an arts or elective teacher, and 6% identified as an interventionist or coach (n = 255), with the remaining respondents occupying other instructional support positions in schools. In this paper, I attend specifically to principals as school leaders, though school leadership can be construed more broadly to include other formal leaders (e.g., leadership teams, coaches), informal leaders, and all other respondents as school staff.

3.2. Data and Measures

The SEE-S survey was designed to capture multiple dimensions of school-based decision making and factors related to research use, as informed by the prior literature, and organized using the center’s conceptual framework [71]. Our multi-year survey design process was informed by standards for educational and psychological assessment [72,73] and consisted of open-ended interviews with educators (n = 18), researchers (n = 28), and educational intermediaries (n = 32); a blueprint for items; cognitive interviews with 35 educators; and two rounds of piloting across a total of 64 schools. To examine the properties of survey scales, we first conducted exploratory factor analysis with an oblimin rotation (weight of zero), consistent with our assumption that the sub-factors were correlated. The number of factors extracted for each set of items was determined based on the examination of scree plots and the interpretation of subsets of items relative to our measurement blueprint and conceptual framework; we then verified our findings via confirmatory factor analysis (CFA).
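As an illustration of the factor-extraction step described above, the eigenvalues that underlie a scree plot can be computed from the item correlation matrix. The sketch below is a minimal illustration with simulated data, not the SEE-S analysis itself; the variable names and the use of the Kaiser (eigenvalue > 1) cut-off are illustrative assumptions.

```python
import numpy as np

def scree_eigenvalues(responses: np.ndarray) -> np.ndarray:
    """Eigenvalues of the item correlation matrix, largest first.

    responses: (n_respondents, n_items) array of survey responses.
    """
    corr = np.corrcoef(responses, rowvar=False)
    return np.linalg.eigvalsh(corr)[::-1]

def n_factors_kaiser(eigenvalues: np.ndarray) -> int:
    """Count eigenvalues above 1.0 (the Kaiser criterion)."""
    return int((eigenvalues > 1.0).sum())

# Simulated responses driven by two independent latent factors,
# three noisy items loading on each.
rng = np.random.default_rng(0)
n = 2000
f1, f2 = rng.standard_normal(n), rng.standard_normal(n)
items = np.column_stack(
    [f1 + 0.3 * rng.standard_normal(n) for _ in range(3)]
    + [f2 + 0.3 * rng.standard_normal(n) for _ in range(3)]
)
eigs = scree_eigenvalues(items)
print(n_factors_kaiser(eigs))  # → 2: the two simulated factors are recovered
```

In practice, as the paper notes, the eigenvalue plot is read alongside the measurement blueprint rather than applying a single numeric cut-off mechanically.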
There are five sections in the survey that mirror the Center’s conceptual framework: the depth of research use in decision making, perspectives about research and practice, networks through which research travels, capacity, and brokerage. For the purposes of this paper, I focus on items that are consistent with the dimensions of capacity associated with research-informed improvement and which the prior literature suggests are connected to principals’ roles. Appendix A includes more complete item descriptions and psychometric details, as appropriate. For further details on the survey design, please see author and colleagues [74].
Evidence use. To capture current practices related to evidence use, I rely on items related to school decision making and the sources of evidence reported to be influential in those decisions. With reference to respondent-reported organizational decisions, educators were asked about fourteen types of evidence and the extent of their influence on the decision. For items relating to research evidence, we used an open-ended item asking for a piece of research that was used (e.g., name of a product, link, reference) to validate responses. Responses were validated using a rubric designed to (a) confirm the existence of research cited by the respondent, (b) classify the citation as direct versus indirect (e.g., providing information about the research source versus the name or entity who provided the research), and (c) classify the type of citation (e.g., a book, an author’s name, a research article, etc.—see author and colleagues [74] for more information).
Individual capacity. Several items and scales within the SEE-S are used to measure individual and school capacity for evidence use. Staff knowledge and experience are captured using items asking about specific training and experience related to engaging with evidence. Skills are captured using a scale constructed from seven items related to self-reported capacity to critically consume research (α = 0.97). Beliefs about research are captured from educators’ perspectives on the value of research in addressing problems of practice. Five items indicate the extent of agreement with statements about the value of research. These have high internal consistency (α = 0.88) and are used to generate a scale based on item means. Additionally, capacity is conceptualized as access to external resources through professional networks. I capture personal networks for accessing research through open-ended items asking respondents to identify up to 10 individuals, 10 organizations, and 10 media sources they rely on to connect with educational research, which are then categorized as direct ties to research, ties through intermediary organizations, or local ties (e.g., within school or district).
Organizational capacity. To indicate cultures of research use, I utilize SEE-S items related to school processes, such as guidelines and incentives for using research. Individual items provide insight into the prevalence of these elements across schools and also into their internal coherence as a scale (processes and incentives, α = 0.89). Other dimensions of culture related to research use, as noted in the prior literature, relate to the nature of relationships within schools. Items related to knowledge brokerage, including the sharing of research and other types of knowledge, are used as indicators of these aspects of culture. In particular, I focus on items asking about the frequency with which educators share research evidence as well as local research (district-, school-, or student-generated research). These data are combined into a scale (brokerage, α = 0.85) constructed from item means. Other dimensions of organizational capacity pertain to resources for supporting access and engagement with research. The survey includes two sets of items that capture resources that support research use. The first is a two-part item about local structures; respondents were asked about the availability of these structures. Then, a follow-up item asked how frequently those structures supported educators in connecting research and practice. Structures include general school structures, such as PLCs, and research-specific structures, such as subscriptions to research-based periodicals.
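The internal-consistency figures reported for these scales (e.g., α = 0.97, 0.89, 0.85) are Cronbach’s alpha values. As a hedged illustration of the statistic itself, not of the SEE-S computation, alpha can be computed from a respondents-by-items matrix as follows; the simulated data and variable names are assumptions for the example.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) response matrix."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

# Five noisy indicators of a single latent trait should yield high alpha.
rng = np.random.default_rng(1)
latent = rng.standard_normal(500)
consistent = np.column_stack(
    [latent + 0.4 * rng.standard_normal(500) for _ in range(5)]
)
print(round(cronbach_alpha(consistent), 2))  # high internal consistency (> 0.9)
```

Alpha rises both with the average inter-item correlation and with the number of items, which is why scale length is reported alongside the coefficient.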
A final dimension of organizational capacity is the decision process. Two open-ended items captured the nature of organizational decisions (i.e., decisions about policy and practice made at the school or district level that affect a significant number of teachers and/or students) made in the current or prior school year. Respondents who reported such a decision were asked what decision was made and why (i.e., what challenge or problem did it address?). Thirty percent (n = 1343) of respondents answered the questions about an organizational decision. Sixty percent (n = 2660) of respondents were routed to the items focused on personal-practice decisions, which are not reported here. Notably, all roles were represented in both paths, with administrators more likely to report on organizational than individual decisions (68.5% versus 28.2%). Overall, 1343 educators reported on an organizational decision. These same individuals were asked who participates in decision making, with specific reference to earlier responses about organizational decisions. Twenty categories of participants are listed, with the option to indicate involvement in gathering evidence, evaluating evidence, or making decisions, or no involvement. Because schools vary widely in their broader district and community context, I focus here on immediate members of the school community: principals, classroom teachers, special educators, coaches, support staff, parents, and students.

3.3. Analysis and Interpretation of Findings

As the purpose of this paper is to use national data to explore principals’ opportunities for and contributions to evidence-informed improvement, my analytical approach is descriptive, presenting summaries of three different item types. For closed-ended survey items and scales, I present frequencies, means, standard deviations, and other distributional statistics for responses to individual and organizational items in order to establish a portrait of capacity for research use. When describing principals’ capacity, I disaggregate their responses from other respondents and, where appropriate, test the statistical significance of these differences. When describing networks for accessing research, I utilize an ego network data approach to analyze open-ended data about the resources to which educators turn. Responses are coded into categories (e.g., web-based resources, professional associations, etc.), which are subsequently coded as local, external, or directly related to research. Each respondent’s set of resources is then characterized in terms of the size and composition of their network. Finally, when describing organizational decisions, I present summaries of codes about problems and decisions. These are generated from an emergent framework developed by the research team. Specifically, the content and types of responses were categorized through an iterative discussion in which different codes were created, tested with sample responses, and modified. Through this process, multiple categories of problems and decisions were identified. Inter-rater reliability achieved 80% agreement among the research team.
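The inter-rater reliability reported here is simple percent agreement between coders. A minimal sketch of that calculation follows; the decision code labels are hypothetical, not taken from the study’s codebook.

```python
def percent_agreement(codes_a: list[str], codes_b: list[str]) -> float:
    """Share of items that two raters coded identically."""
    if len(codes_a) != len(codes_b):
        raise ValueError("raters must code the same set of items")
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return matches / len(codes_a)

# Hypothetical decision codes assigned by two raters to five responses.
rater_1 = ["curriculum", "staffing", "schedule", "curriculum", "policy"]
rater_2 = ["curriculum", "staffing", "policy", "curriculum", "policy"]
print(percent_agreement(rater_1, rater_2))  # → 0.8
```

Percent agreement does not correct for chance agreement (as statistics such as Cohen’s kappa do), so it is typically reported, as here, alongside a description of how disagreements were resolved.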
Findings are interpreted through the lens of the four mechanisms through which principals may contribute to the capacity for evidence-informed improvement: developing human capital, influencing culture, leveraging resources, and shaping decision processes. Specifically, each set of findings was considered in terms of the needs or opportunities they suggested for principal leadership in these areas.

3.4. Limitations

There were several limitations to this study. First, the survey was designed for and administered to a national sample of U.S. schools. The U.S. context is distinct in its organization and governance, and conclusions should be understood with that in mind. Second, the survey relied on respondent perception and recall, offering important but partial insight into school capacity. As acknowledged widely in the research-use literature (e.g., [75]), multiple methods are needed to fully understand a phenomenon. Third, the survey does not directly observe leadership for research use, but rather examines current school capacity and practice as a means of revealing leadership needs and opportunities. Additional research directly observing practice would make an important contribution to this growing area of work. Last, as described earlier, “research” and “use” are complex concepts and there are many different understandings of their meaning in research, policy, and practice. Survey responses therefore may reflect different ideas about research use. Relatedly, research is but one type of evidence schools are expected to use, and any findings are unable to speak to other evidence-use practices.

4. Results and Discussion

In this section, I present the results of the analyses conducted and interpret those findings through the lens of principal contributions, described in the conceptual framework. First, I present an overview of the use of research evidence in school decision making as the context for studying principals’ leadership in schools’ use of research for improvement. Second, I present results regarding the various dimensions of school capacity that are within the sphere of principals’ work and influence, as described earlier: staff knowledge, skills, beliefs, and networks; cultures, structures, and resources that support engagement with and use of research; and decision-making processes. These findings are offered as a means by which to identify opportunities for strengthening the role of research in school improvement, and by proxy, opportunities for principals’ leadership in those areas. Last, I present evidence of principals’ capacity and role in schools’ use of research, reflecting leadership as a form of school capacity.

4.1. What Influence Does Research Evidence Have on School Decisions?

Of the 1343 respondents who reported on an organizational decision, 64% reported that external research had some or heavy influence on the decision, 34% of which were confirmed through our validation process, reducing the proportion of decisions that were influenced by research to 22%. Furthermore, 31 of 134 schools (23%) reported no organizational decisions in which research was influential. However, respondents also indicated that they use research outside of specific decisions. For example, 76% of respondents reported that their school/district used frameworks from research to organize improvement efforts, and 71% reported that research has provided a common language and set of ideas. Educators in our sample also reported that research is used to influence others to agree with a point of view (65%) or to mobilize support for important issues (62.7%).
These results suggest that research evidence does, in fact, influence decision making and improvement in schools—though perhaps not across all schools and not to the extent that one might hope, given the consequentiality of schools’ improvement actions. That said, it is also important to acknowledge that it is not reasonable to assume that all decisions should be based on research, nor that all decisions could be based on research. Such assumptions presuppose the relevance, availability, and quality of research evidence for all decisions and the alignment between research findings and school goals and needs—both of which are contested ideas. Nonetheless, there appears to be substantial opportunity to improve the role of research evidence in school improvement initiatives, which in turn warrants paying attention to the role of the principal. Moving forward in this section, I present findings around individual and school capacity which signal opportunities for principal leadership in strengthening that role.

4.2. How Can We Characterize School Capacity for Research Use and What Opportunities Arise for Principal Leadership in Research Use?

In this section, I present findings related to staff capacity, including knowledge and skills, beliefs about research use, and networks. This is followed by a discussion of the organizational dimensions of capacity, including culture, structures, and resources that support engagement with research, and lastly, decision processes, which include the improvement agenda and participatory decision making.

4.2.1. Staff Capacity

Regarding items related to knowledge and skills, survey responses suggest that educators have limited experience and confidence using research, which in turn reflects opportunities for further growth. In terms of training or experiences (Figure 2), educators report few experiences related to using research, with nearly a third having had none of the experiences listed and only another third having had more than one experience. Most often, educators were involved in collecting or analyzing data for a research project or engaging with research within a professional learning community (PLC); yet, still only a third reported those experiences. Perhaps not surprisingly, then, respondents report relatively low levels of confidence in terms of their ability to critically consume research. When considering the overall scale, the mean score was 2.17 (SD = 0.81), indicating that practitioners, on average, are only somewhat confident. Approximately a third of respondents indicated they were more than somewhat confident in each activity, and about 16% of respondents had the lowest possible value on the scale, indicating that they were not at all confident in critiquing the aspects of research described in our survey items.
Educators were also asked about their beliefs about the value of research in addressing meaningful problems of practice in schools. Overall, educators are mixed in their beliefs, with a mean scale score of 2.68 (SD = 0.61), which is in between agreement and disagreement on our response scale of 1 to 4. However, a few items suggested reasons for optimism. For example, most (83%) agreed or strongly agreed that research is actionable, though only half (55%) agreed that research considered the resources available to implement research findings.
Furthermore, in terms of professional networks for connecting with and accessing research, respondents across schools report relying largely on colleagues within their school or district, with on average 53% of their networks comprising local resources (SD = 0.38), compared to the 36% of their networks comprising intermediary sources, such as professional associations, media sources, curriculum, professional development providers, and others (SD = 0.30), and the 9% comprising researchers, research organizations, or research outlets (SD = 0.17).
Results suggest that most educators have little experience with or exposure to educational research and, correspondingly, report low confidence when critically consuming research, mixed beliefs about its value, and few means of connecting with research or researchers. This is consistent with prior findings suggesting that educators may lack both confidence in their research-use abilities and the capacity to critically interpret research. This is perhaps not surprising, given the breadth of knowledge, skills, and experiences educational professionals at all levels of the system are expected to master. However, it is an area where school leaders have an opportunity to intervene. First, school leaders are often (though not always) involved in staffing their schools, which means that hiring decisions could include criteria related to experience with or training in evidence use. School leaders are also often key decision makers in decisions about annual professional learning opportunities—both school-wide and in terms of attending conferences or other events. Further, they are often gatekeepers for direct school participation in educational research, a role which enables them to create (or limit) engagement with the research community directly. Such roles fit within the roles and responsibilities that characterize leadership for learning and effective school leadership, such as strategic hiring and the provision of opportunities for high-quality professional learning.

4.2.2. Organizational Capacity

Moving away from individuals’ capacity, I now turn to focus on the characteristics of schools. School cultures for research use, as reflected in responses to items about processes and incentives, suggest moderately supportive cultures, with mean responses on an agreement scale from 1 to 4 of 2.51 (SD = 0.613). However, the large standard deviation indicates substantial variation in this dimension. Item-based responses (Figure 3) reveal that about two-thirds of respondents agree that research use is a priority, but around half or less of respondents agree that there are expectations and incentives for using research. On the other hand, sharing research evidence appears to be a common practice, with 70% of respondents indicating they shared this resource at least once in the past year (Figure 4).
The items capturing available school structures reveal variability, not only in the presence of structures but in terms of their utility in supporting engagement with research (Figure 5). Notably, professional learning communities (PLCs), instructional coaches, and leadership teams are common structures across nearly all schools in our sample, and are also the structures most frequently leveraged for engaging with research evidence. Specialized structures that are specific to research use—such as specific professional learning, subscriptions, and district research offices—are less commonly available. However, when they exist, they appear to be helpful for promoting engagement with research. While most respondents have at least one specialized support structure available within their schools (most often professional development on research use), access is not evenly distributed among educators. In fact, 15% of educators reported that none of these structures were available.
The findings regarding culture and resources reveal some evidence that the use of research is a priority, and that time is available for its use. However, limited expectations and incentives, coupled with the mixed beliefs about the value of research described earlier, may suggest a need for leaders to promote evidence use more explicitly as part of “how we do things”. This means moving from rhetoric to action, including investing resources such as time, space, and finances in efforts to support the use of research evidence.
The results show that common school structures (i.e., PLCs, instructional support teams, and instructional coaches) play an important supporting role in research use, at least for some schools. However, their widespread presence in schools—and the space and time they afford—is a potential opportunity for strengthening research use. Therefore, these common structures may hold untapped potential to support engagement with research in ways that impact student learning.
Second, specialized research-use supports, such as research offices and subscriptions to research journals, are not often available to educators, possibly contributing to a gap between stated priorities and actual practice. Specialized support can be an important contributor to organizational capacity, and school leaders seeking to clarify expectations and support access to and engagement with research can consider such investments. However, the limited availability of these supports may not reflect a lack of priorities, but rather a lack of resources (e.g., costs associated with accessing research databases can be prohibitive) and capacity (e.g., small school districts may not have an internal research office). These findings further underscore the importance of primarily building school capacity through the use of common structures to support research use.
A final dimension of organizational capacity relates to an improvement agenda. The survey’s items solicit information about not only the types of organizational decisions that schools make, but who participates in them. The educators reporting on problems and decisions covered extraordinarily diverse issues. Concerns included, among others, creating safe schools, improving academic performance, designing standards-based report cards, and implementing Response to Intervention (RTI). The most frequently reported organizational problems were related to academic performance (39%, n = 553), followed by non-academic issues (20%, n = 282) and systemic issues (16%, n = 224). The least commonly reported organizational problems were those related to community-centered issues (1%, n = 17). The most frequently reported organizational decisions were decisions to adopt something new (44%, n = 575), structural changes (17%, n = 216), and professional development (12%, n = 152). Not only did concerns vary in terms of content focus, but they varied within content categories. For example, some respondents focused on technology, but could be referring to training teachers to use devices, to the adoption of digital course materials, or to concerns about access to technology. Furthermore, educators within schools reported multiple organizational decisions; in 25% of schools, more than six major organizational decisions were named, with some reporting as many as 15 different decisions.
In the data, it was sometimes difficult to grasp how the solution reported addressed the problem described. Broadly defined problems were linked to broadly defined and difficult-to-interpret theories of change. Drawing on the examples shown in Table 1, respondents describe the adoption of a new reading curriculum. Here, the problem is framed in two ways. One respondent states that the problem lies in students reading below the appropriate grade level, whereas the other states that the problem lies in the inadequacy of the reading program that was replaced in the subsequent decision. The challenge here is in the clarity of the problem frame. The first frame is vague and offers no diagnosis for low performance, which likely makes it difficult to identify an evidence-based approach to improvement. The second frame points to the effectiveness of the reading program, which, again, does not offer a specific diagnosis about key instructional issues that might explain its lack of effectiveness. This framing led decision makers to a wholesale shift in reading curriculum, which might or might not address the underlying causes of low performance. Several other examples suggest that this is not a unique challenge. For instance, one respondent described the challenges their school faced as “failure rate, student engagement in learning/ownership of learning”, and the decision made to address the problem was implementing a 1:1 technology initiative. Another example included addressing low reading scores by giving free books to students who completed reading logs.
Although the survey does not ask for fully articulated theories of change and is limited in the scope and description of problems and decisions, the nature of these reports suggests a number of challenges for research-informed improvement. First, improvement agendas—captured in the diverse array of problems and decisions reported by educators, even within schools—are incredibly diverse and often broad. This means that resources—including time, staff, and funding—must be distributed across many different issues and needs and that a wide range of research evidence must be available and accessible in order to inform improvement efforts. Second, broadly defined (or ill-defined) problems may lead school decision makers into a time-consuming, frustrating search for potentially relevant research, in which they will likely encounter diverse and competing alternative solutions. This might diminish educators’ already mixed beliefs about the value of research. Third, selecting one of the many research-based solutions to a broadly defined problem is a high-risk venture when student learning is at stake. The selection of programs and practices ill-suited to the problem is unlikely to generate the desired change, resulting in the loss of critically important time, resources, and learning, and, again, is likely to negatively impact the perceived value of research.
School leaders have a role in addressing these challenges through facilitating collaborative sensemaking as well as diagnostic and tactical leadership. Through these agenda-setting roles, principals and other school leaders can focus the improvement agenda and direct valuable resources towards finding and using research evidence in more productive and impactful ways.
In terms of who participates in research-informed decision making, most educators report some degree of shared decision making. Figure 6 represents the proportion of organizational decision responses in which respondents indicated which stakeholders were involved, and in what ways they were involved. Principals were involved in making most organizational decisions (73.1%), and were the stakeholder most often involved in evaluating (40.4%) and collecting evidence (34.7%) as well (this item refers to evidence broadly rather than research evidence specifically). Notably, teachers were most likely to collect evidence that informed decisions (41.7%) and nearly as likely as principals to participate in its evaluation (38.1%), though much less likely to be involved in making the decision (42.0%). Coaches were also regularly involved in these aspects of decisions, as were special education teachers to a lesser extent. Other stakeholders, including support staff, students, and parents, were least likely to be engaged in the decision-making process.
These data suggest that, although principals are primary decision makers, there are many ways in which other school staff contribute to those decisions and to the use of evidence in those decisions. Principals that promote shared responsibility for collecting and evaluating evidence can leverage staff knowledge, expertise, and professional networks for research use, as described earlier. This also creates opportunities for different kinds of evidence to be used in planning and collaborative interpretation. Furthermore, participatory decision making that engages school staff in evidence use has implications for individual capacity. Specifically, the nature and quality of evidence use in decision making reflects the individual knowledge, skills, and beliefs of those participating; in schools with limited educator capacity in these areas, participation in these roles might not yield high-quality research evidence or use. On the other hand, participation can build capacity by supporting the development of research-related knowledge, skills, and beliefs if engagement and use is effectively modeled. Principals clearly have a role in determining the extent of shared decision making in schools, and this role intersects with their role in building educator capacity, as described earlier.

4.3. How Can We Characterize Principals’ Capacity for Research Use?

To this point, I have noted several dimensions of individual and school capacity which principals may have influence on and outlined how these actors are positioned to support the use of research in school improvement. However, several of these imply a level of capacity among principals, which prior research has suggested matters for evidence use. For example, school leaders who themselves do not value educational research may be less likely to invest resources in organizational structures or leverage routines to support its use. Similarly, school leaders who lack knowledge or skills related to using research may find it challenging to model research use in shared decision-making contexts. Therefore, it is important to explore data regarding principals’ own capacity.
When disaggregated from other school staff, principals were more likely to have had most of the research-use-related experiences listed in the survey. Table 2 demonstrates that these differences are statistically significant, except for participation in professional development around using research and having taken another course beyond introductory statistics or research methods. The magnitude of these differences ranges from 7% (involvement in a research–practice partnership) to 27% (having taken an introductory statistics course). Furthermore, only 6% of principals reported not having any of these experiences, compared with 30% of teachers. Relatedly, a comparison of confidence in critically consuming research reveals significantly greater confidence among school leaders, with mean scores of 2.47 (SD = 0.730) for leaders and 2.16 (SD = 0.806) for other staff (p < 0.001).
Nonetheless, no research-use-related opportunity was universally experienced by principals, with the most common—having taken an introductory statistics course—being experienced by 75% of principals. Most events were experienced by far fewer staff. Similarly, though principals expressed greater confidence in using research, their responses still fell below 3, which is between “somewhat” and “mostly” confident.
Additionally, school leaders’ networks for accessing information did not differ significantly from those of other school staff, although principals’ networks, on average, had greater direct ties to research and fewer local resources (Table 3). Principals were, however, significantly more likely to share research evidence, most often doing so 3 to 5 times or more than 5 times per year (Table 4). This finding suggests that principals are key brokers of research evidence within their organizations, can facilitate or constrain the flow of information into schools, and may rely on more direct channels to access that research evidence (though the variability in responses means the network differences are not statistically significant).
A last point about principal capacity is made with regard to their beliefs about the value of research evidence in addressing problems of practice. Mean scale scores for leaders (M = 2.79) were higher than those for school staff (M = 2.67; p = 0.014), but still did not achieve a mean of overall agreement (a score of 3 on our scale). There were few statistical differences between school leaders and other school staff in terms of individual items, with principals most often being less likely to strongly disagree with statements about educational research.
Overall, principals’ responses indicate greater capacity for engaging in research use, as measured by the items selected in the survey. Greater capacity in terms of access through networks and confidence critically consuming research may enable leaders to integrate research evidence more effectively into improvement decisions, as per earlier findings showing that principals were notably more likely to be involved in these decisions. Greater sharing of research and stronger beliefs about the value of research may also enable principals to model research use, serve as a resource for accessing research, and direct resources toward research-use activities.

5. Implications and Conclusions

This study explores school capacity—both individual and organizational dimensions—for research use in school improvement through the lens of the principalship in the U.S. context. Drawing on the prior literature describing principal leadership and the ways in which principals can support evidence use in schools, I present national survey data on school decision making as a means of revealing opportunities for principal leadership in terms of research-informed improvement. The findings reveal several opportunities for building school capacity that are aligned to principals’ roles and areas of influence. While these findings are based on data collected in the U.S. context, the conceptual lens of principals’ contributions to school capacity for evidence use is likely to extend to other contexts where leaders engage in capacity-building roles, and it adds to a growing discussion in the literature (e.g., [59,60,63]).
One set of opportunities relates to principal roles focused on the knowledge and skills of school staff. The data presented in this study suggest that teachers have limited experiences that relate to using research and have low confidence in critically consuming research, but have moderately positive perspectives about the value of research to practice. Principals’ roles in assessing and addressing professional learning needs, in hiring school staff, and in participating in research create opportunities to influence the knowledge, skills, and beliefs within a school community.
A second set of opportunities relates to creating supportive school conditions for research-informed improvement. The findings demonstrate that research use is a priority, and that sharing research is a relatively frequent occurrence in schools. On the other hand, data show that expectations, incentives, and structures that foster engagement with research are less common. The role of principals in allocating time and funding, and in leveraging structures and routines for engaging with research, can be used to promote greater alignment between expectations and actual improvement work. For example, research by Brown [76] suggests that school and district leaders can adapt routines to support educators in “engag[ing] in a facilitated process of learning, designed to help them make explicit connections between research knowledge and their own assumptions and knowledge” (p. 389), stating that, in their research, this work is transforming teaching and learning. Relatedly, by distributing leadership and promoting shared decision making through routines, including through leadership teams, principals can maximize the knowledge, skills, and professional networks of staff [36,57]. This serves as an opportunity to model expectations for using research [25].
A third set of opportunities relates to the role of principals in setting the improvement agenda for their schools. Reports about organizational decision making suggest that there is a need for principals to engage in diagnosing needs and focusing improvement efforts in ways that may make the process and value of using research clearer, which may result in more appropriate and impactful decisions. Similar points are emphasized in the research of Hoffman and Illie [11], who articulate specific challenges faced in research-informed improvement related to articulating and defining outcomes and identifying appropriate research to inform action, among other challenges, leading the authors to emphasize that such work is not merely a tweaking practice but far more substantial and, therefore, in need of particular forms of leadership.
These findings not only reaffirm the critical importance of leadership, but also offer several lessons for how we can support research-informed improvement and principals’ leadership in achieving evidence-use goals. First, empirical evidence on school capacity for research use suggests that there are not merely opportunities but systemwide needs to substantially invest in capacity building. This confirms that it is problematic to assume that increased demands and expectations for evidence use will lead to actual evidence use. Rather, in spite of rising expectations and multiple policies, schools still experience challenges using research, and specific investments in individual and organizational capacity for evidence use are warranted.
Second, some of the findings here give reason to be optimistic about research use in schools, including moderately positive beliefs about the value of research, widely held priorities for using research, and the leveraging of common school structures to support research engagement. These are positive developments and represent progress towards evidence-informed practices. However, the fact that these are not (yet) leading to more widespread use of research in school improvement signals a “knowing–doing” gap [77]. For example, I find that educators value research but lack confidence and experience using it or lack routines and structures to engage with it. These findings echo calls for greater attention to capacity building, but also suggest some specific areas to target.
Third, as explored here, principals are a key lever in building school capacity, which makes their own capacity and preparation a critical issue in evidence-informed improvement. Principals demonstrate greater capacity for research use than other school staff, as measured by a limited set of indicators, yet those differences are often marginal and capacity varies widely across principals and, subsequently, schools. Therefore, an implication of this study is that there needs to be greater attention to research-use- or evidence-use-specific supports for principals, both before and during their time as school leaders. This may happen in several ways.
One shift might occur in the discourse about effective leadership, frameworks for which often guide the development of standards for preparation, professional learning, and evaluation. Findings suggest that principal leadership for research use is aligned with the roles described in broader leadership frameworks. For example, Murphy and colleagues’ [25] description of “leadership for learning” identifies the activities of principals that focus on vision, the instructional program, the curricular program, assessment, communities of learning, resource acquisition and use, organizational culture, and social advocacy. Another approach to understanding the work of principals comes from the principal effectiveness literature, with effectiveness tied to improving student outcomes (e.g., [22,24,78,79,80]). This body of research suggests that effective principals engage in direct instructional leadership and management, including setting the improvement agenda; organizational leadership and management; and managing the external environment. Leadership support and preparation aligned to these frameworks may therefore already support leadership for research use indirectly. On the other hand, these existing frameworks rarely, if ever, explicitly address leadership for evidence-informed improvement, and it is unclear whether developing more knowledge and skills related to these roles translates into leadership in evidence use.
Therefore, it is likely important to build knowledge and skills more explicitly in leading research use, which evidence suggests has been rare to date [60]. This has been taken up to some degree in the U.S. through the Professional Standards for Educational Leaders (PSEL) [81], and PSEL standards have been adopted for the certification and program accreditation of principals and other leaders (e.g., by the Council for the Accreditation of Educator Preparation (CAEP)). Three standard elements connect to leadership in terms of research or evidence use. PSEL standard 6e articulates that leaders should “deliver actionable feedback about instruction and other professional practice through valid, research-anchored systems of supervision and evaluation to support the development of teachers’ and staff members’ knowledge, skills, and practice” (p. 14). PSEL standard 10f states that leaders should “assess and develop the capacity of staff to assess the value and applicability of emerging educational trends and the findings of research for the school and its improvement” (p. 18), while 10d states that leaders should “engage others in an ongoing process of evidence-based inquiry, learning, strategic goal setting, planning, implementation, and evaluation for continuous school and classroom improvement” (p. 18).
The inclusion of evidence use in these dimensions of leadership is a promising start towards strengthening principals’ preparedness to lead research-informed improvement, as standards are likely to trickle down into programs through alignment with accreditation processes. Furthermore, formal recognition of research-use leadership within standards and frameworks may promote change at scale, as most programs must be responsive to some set of standards, which may help to address the unequal distribution of capacity across schools and, ultimately, contribute to more systemic improvement. However, these represent three of nearly one hundred standard elements in PSEL, and still only some of the roles identified in this paper. Furthermore, the three standard elements represent complex leadership activities that require a range of strategies, knowledge, and skills for principals to effectively enact them. This means that additional guidance and support are likely to be needed.
Because research-use leadership falls largely within already-recognized leadership roles, one viable approach may be as simple as integrating a research or evidence-use lens into existing courses and activities, similar to Levin’s [64] suggestion that all professional learning create the space to engage with evidence. However, preparation programs may also strengthen principals’ capacity by creating more research-related experiences—which were notably lacking among our sample—beyond the current curriculum. This could include research-centered coursework, participating in research, leading research, facilitating engagement with research, or other activities.
At the same time, change in principal preparation is difficult because of licensure, accreditation, and other processes, not to mention the already sizeable number of standards and practices that must be fitted into any single program. Therefore, another viable mechanism for capacity building is professional learning for leaders. Most educators, including principals, must participate in continuous learning, and professional learning providers can offer research-use-related options. I echo calls from others (e.g., [60,61,64]) for attention to be paid to issues such as strategic engagement with evidence in the context of improvement initiatives, the critical evaluation and synthesis of different sources of evidence, the present status of research knowledge regarding key education issues and practices, and strategies to share research knowledge and use it in schools.
While principal preparation and professional learning may prove important, it is also important to consider how principals’ roles in leading research use may be shaped by their district context. As Brooks and colleagues [60] note, principals’ practices in using evidence to make decisions are “tightly coupled with the practices of education districts in which they and their schools are nested” (p. 170). Principals may themselves lack the access, time, and support needed to engage with research, and district leaders can be an important influence in research use. Prior research has shown that evidence-related support in districts is important to the capacity for improvement [82,83,84,85]. Districts can also support principal learning through many of the strategies identified as related to principal leadership: offering professional learning centered on research use, setting expectations and modeling research use, encouraging connections with researchers, and allocating resources that support their engagement with research. However, district variability in commitments to and supports for evidence use may also contribute to the variability in school capacity found in this paper, making district-led efforts only a partial solution to building capacity system-wide.
To conclude, this paper contributes to the literature on both leadership and evidence use in several ways. First, I draw on the literature related to school capacity, school leadership, and evidence use to offer a conceptualization of principal leadership for research use. This approach integrates the lenses of leaders’ use of research as individuals and their roles in building school capacity to highlight more explicitly the different mechanisms by which leaders contribute to school capacity for evidence use. Importantly, this work explores such mechanisms across diverse school contexts and at scale, providing evidence of needs and opportunities across the U.S. educational system. Although additional research on leadership for evidence use more broadly—including the use of data—is warranted, the findings presented here offer directions for current and future leadership preparation and development. They also underscore the need for evidence-use initiatives and guidance to address the complexity of implementation and to include significant investments in capacity building. I note, however, that the context of this research is specific to the United States and thus applies primarily to the roles and responsibilities of principals in a particular national context. Although leadership is widely regarded as influential in student outcomes and evidence-use expectations have risen globally, the role of school leaders and the conceptualization of leadership itself vary widely across the globe. As a result, findings and recommendations should be considered in light of this limitation. Echoing calls for cross-cultural and comparative leadership research [86], the study of leadership and of evidence-informed improvement would benefit considerably from further consideration of leadership for research use across international contexts.

Funding

This research was funded by the Institute of Education Sciences, U.S. Department of Education, through grant number R305C150017 to the University of Delaware.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Institutional Review Board of The University of Delaware (777166-34, approved 28 July 2023).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The author declares no conflicts of interest.

Appendix A. Survey Items

Knowledge and skills for research use

Staff and leader capacity (scale): n = 4082; items = 7; alpha = 0.97; range = 1–4; mean = 2.17; SD = 0.81
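For readers unfamiliar with the scale statistics reported in this appendix, the alpha values are internal-consistency (Cronbach’s alpha) coefficients. The standard definition (a textbook formula, not reproduced from the original article) is:

```latex
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^2_{Y_i}}{\sigma^2_X}\right)
```

where $k$ is the number of items in the scale, $\sigma^2_{Y_i}$ is the variance of item $i$, and $\sigma^2_X$ is the variance of the total scale score. Values near 1 (e.g., the 0.97 reported above) indicate that the items are highly intercorrelated and measure a common construct.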
Q93: What training and/or experiences have you had related to using research?
Select all that apply.
I have collected and/or analyzed data for a research project.
I have been involved in a formal research–practice partnership.
I have participated in other professional development around critically consuming research.
I have engaged with research through a Professional Learning Community.
I have attended research conferences.
Other (please specify) ________________________________________________
None of the above.
Q94: I have participated in an undergraduate/graduate level course in…
introductory statistics or research methods.
intermediate or advanced statistics or research methods.
another course on understanding and interpreting research (please specify).
none of the above.
Q95: How confident do you feel in determining whether
(Response scale: Not at all (1); Somewhat (2); Mostly (3); Very confident (4))
a research study used appropriate statistical analyses?
a research study had an adequate sample size?
a program evaluation demonstrated real impacts versus improvement that would have happened even without the program?
the surveys and assessments used in a research study were reliable and valid?
results from a research study are generalizable to different schools, districts, etc.?
Cultures that support engagement with and uptake of research

Processes, incentives (scale): n = 4042; items = 6; alpha = 0.89; range = 1–4; mean = 2.51; SD = 0.61
Q75: Please rate your level of agreement with the following statements.
(Response scale: Strongly disagree (1); Disagree (2); Agree (3); Strongly agree (4))
Our school/district has a documented process (e.g., guidelines) for using research to inform decisions.
Our school/district provides time to discuss research.
Our school/district prioritizes research in decision making.
There are school/district incentives for me to use research in my practice.
Using research is part of my evaluation as a practitioner.
We use research because a supervisor or administrator requires it.
Q96: In the last year, have you shared any of the following types of research with others? This could include bringing them to a meeting, sending them in an e-mail, posting to social media (e.g., Twitter, Pinterest, etc.), or having a conversation about any of them.

Articles, reports, books, or summaries based on external research or program evaluation (paper or web-based)

(Response options: Never; 1–2 times; 3–5 times; More than 5 times)
Beliefs about the value and relevance of research evidence

Problems of practice (scale): n = 4112; items = 5; alpha = 0.88; range = 1–4; mean = 2.68; SD = 0.61
Q72: Please rate your level of agreement with the following statements.
(Response scale: Strongly disagree (1); Disagree (2); Agree (3); Strongly agree (4))
Most education research suggests actionable steps to take in practice.
Researchers have a solid grasp on evolving problems in schools/districts.
Research addresses the most important issues schools/districts face.
Research takes into consideration the varying levels of resources available to schools/districts to implement research findings.
Research is produced quickly enough for me to make use of it.
Participatory decision making

Participation

Q20: Who were the key people involved in gathering and evaluating evidence to inform the decision? Were they also involved in making the decision?
(Response options: Not involved (0); Involved in gathering evidence (1); Involved in evaluating evidence (2); Involved in making decision (3); I don’t know/remember (4))
School board
Principal
Assistant principal
Teachers
Students
Instructional coaches
Special educators
Para-educators/teaching assistants
Curriculum/instructional supervisor
Support services staff
District superintendent
Research/evaluation staff
Other central office staff
PTA/parents
State/federal staff
External program developer(s)
External researcher(s)
External consultant(s)
Someone else (please specify)
Other community members/community organizations
Improvement agenda

Problems and decisions

Please think of an organizational decision related to student outcomes that was made by your school or district this year or the previous year. If you cannot think of a school or district decision, please leave the following items blank and continue.
What decision was made (i.e., what was changed or introduced/what actions were taken)?
Why was the decision necessary (i.e., what challenge/problem did it address, what was the goal)?
Resources that support access and engagement with research

Structures, networks

Q73: Are the following currently present in your school or district?
Q74: Over the last two years, how often have these supported you in connecting research and practice?
(Response scale for Q74: Never (1); 1–2 times per year (2); 3–6 times per year (3); 1–2 times each month (4); Every week (5))
Professional learning communities (PLCs)
Instructional leadership teams
Instructional coaches
Professional development on research use
Research–practice partnership
District research office
Subscription(s) to research databases (e.g., JSTOR, EBSCO)
Subscription(s) to research-based periodicals (e.g., journals, magazines, Marshall Memo, etc.)
Membership in a network/association of researchers and practitioners
External consultant(s) that help(s) us use research
System or tool to help store and/or share research
Process for sharing/communicating about research
Q100: To what extent is sharing research expected of you in your role in your organization?
(Response scale: Not at all (1); Slightly (2); Moderately (3); Very (4))
Q82: The following three questions will ask you to list separately the people, organizations, and media sources you rely on for education research.

Q83: Please list up to 10 people (we will ask about organizations and media next) whom you rely on for education research. Please make sure to list their name and select their title or role.
Q84: Please list up to 10 organizations (we will ask about media next) you rely on for education research. Please make sure to list the name and select their category.

Q85: Please list up to 10 media sources you rely on for education research. Please make sure to list the name and select the category.

References

  1. Kuhfeld, M.; Soland, J.; Lewis, K.; Morton, E. The Pandemic Has Had Devastating Impacts on Learning. What Will It Take to Help Students Catch Up? [Commentary]. Brookings Institution. Available online: https://www.brookings.edu/articles/the-pandemic-has-had-devastating-impacts-on-learning-what-will-it-take-to-help-students-catch-up/ (accessed on 15 January 2024).
  2. U.S. Department of Education Fact Sheet. United States Department of Education. Available online: https://oese.ed.gov/files/2021/03/FINAL_ARP-ESSER-FACT-SHEET.pdf (accessed on 15 January 2024).
  3. Farley-Ripple, E.N. Research Use in School District Central Office Decision Making: A Case Study. Educ. Manag. Adm. Leadersh. 2012, 40, 786–806. [Google Scholar] [CrossRef]
  4. Honig, M.I.; Coburn, C. Evidence-based decision making in school district central offices: Toward a policy and research agenda. Educ. Policy 2008, 22, 578–608. [Google Scholar] [CrossRef]
  5. Maxwell, B.; Sharples, J.; Coldwell, M. Developing a systems-based approach to research use in education. Rev. Educ. 2022, 10, e3368. [Google Scholar] [CrossRef]
  6. Wei, T.; Johnson, E. How States and Districts Support Evidence Use in School Improvement; Study Snapshot, NCEE 2020-004; National Center for Education Evaluation and Regional Assistance: Washington, DC, USA, 2020.
  7. Farrell, C.C.; Penuel, W.R.; Coburn, C.E.; Daniel, J.; Steup, L. Research Practice Partnerships in Education: The State of the Field; William T. Grant Foundation: New York City, NY, USA, 2021. [Google Scholar]
  8. Godfrey, D.; Brown, C. (Eds.) An Ecosystem for Research-Engaged Schools: Reforming Education through Research; Routledge: London, UK, 2019. [Google Scholar]
  9. Handscomb, G.; MacBeath, J.E. The Research Engaged School; Essex County Council: Chelmsford, UK, 2003. [Google Scholar]
  10. NFER. The Research-Engaged School/College Award. Available online: http://www.nfer.ac.uk/schools/research-engaged-award/assessment-criteria.cfm (accessed on 17 January 2024).
  11. Hofmann, R.; Ilie, S. A Theory-Led Evaluation of a Scalable Intervention to Promote Evidence-Based, Research-Informed Practice in Schools to Address Attainment Gaps. Educ. Sci. 2022, 12, 353. [Google Scholar] [CrossRef]
  12. Leithwood, K.; Louis, K.S. Linking Leadership to Student Learning; John Wiley & Sons: Hoboken, NJ, USA, 2011. [Google Scholar]
  13. Neal, J.W.; Neal, Z.P.; Kornbluh, M.; Mills, K.J.; Lawlor, J.A. Brokering the research–practice gap: A typology. Am. J. Community Psychol. 2015, 56, 422–435. [Google Scholar] [CrossRef] [PubMed]
  14. Wahlstrom, K.; Louis, K.S. Adoption revisited: Decision-making and school district policy. In Advances in Research and Theories of School Management and Educational Policy; Bachrach, S., Ogawa, R., Eds.; JAI Publishing House: Cuttack, India, 1993; Volume 1, pp. 61–119. [Google Scholar]
  15. Coburn, C.E. Shaping teacher sensemaking: School leaders and the enactment of reading policy. Educ. Policy 2005, 19, 476–509. [Google Scholar] [CrossRef]
  16. Coburn, C.E.; Touré, J.; Yamashita, M. Evidence, interpretation, and persuasion: Instructional decision making at the district central office. Teach. Coll. Rec. 2009, 111, 1115–1161. [Google Scholar] [CrossRef]
  17. Ganon-Shilon, S.; Chen, S. No school principal is an island: From individual to school sense-making processes in reform implementation. Manag. Educ. 2019, 33, 77–85. [Google Scholar] [CrossRef]
  18. Spillane, J.P.; Diamond, J.B.; Burch, P.; Hallett, T.; Jita, L.; Zoltners, J. Managing in the middle: School leaders and the enactment of accountability policy. Educ. Policy 2002, 16, 731–762. [Google Scholar] [CrossRef]
  19. Spillane, J.P.; Miele, D.B. Evidence in practice: A framing of the terrain. Teach. Coll. Rec. 2007, 109, 46–73. [Google Scholar] [CrossRef]
  20. Shaked, H.; Schechter, C. School principals as mediating agents in education reforms. Sch. Leadersh. Manag. 2017, 37, 19–37. [Google Scholar] [CrossRef]
  21. Stoll, L.; Bolam, R.; Collarbone, P. Leading for Change: Building Capacity for Learning. In Second International Handbook of Educational Leadership and Administration; Leithwood, K., Hallinger, P., Eds.; Kluwer Academic Publishers: New York, NY, USA, 2002; pp. 41–73. [Google Scholar]
  22. Grissom, J.A.; Egalite, A.J.; Lindsay, C.A. How Principals Affect Students and Schools; The Wallace Foundation: New York, NY, USA, 2021; Available online: http://www.wallacefoundation.org/principalsynthesis (accessed on 15 January 2024).
  23. Khalifa, M.A.; Gooden, M.A.; Davis, J.E. Culturally responsive school leadership: A synthesis of the literature. Rev. Educ. Res. 2016, 86, 1272–1311. [Google Scholar] [CrossRef]
  24. Leithwood, K.; Louis, K.S.; Anderson, S.; Wahlstrom, K. How Leadership Influences Student Learning; Review of Research; The Wallace Foundation: New York, NY, USA, 2004. [Google Scholar]
  25. Murphy, J.; Elliott, S.N.; Goldring, E.; Porter, A.C. Leadership for learning: A research-based model and taxonomy of behaviors. Sch. Leadersh. Manag. 2007, 27, 179–201. [Google Scholar] [CrossRef]
  26. Dimmock, C. Leadership, Capacity Building and School Improvement: Concepts, Themes and Impact; Routledge: London, UK, 2011. [Google Scholar]
  27. Lai, E. Enacting principal leadership: Exploiting situated possibilities to build school capacity for change. Res. Papers Educ. 2015, 30, 70–94. [Google Scholar] [CrossRef]
  28. Newmann, F.M.; King, M.B.; Youngs, P. Professional development that addresses school capacity: Lessons from urban elementary schools. Am. J. Educ. 2000, 108, 259–299. [Google Scholar] [CrossRef]
  29. National Research Council. Scientific Research in Education; National Academies Press: Washington, DC, USA, 2002. [Google Scholar]
  30. Cousins, J.B.; Leithwood, K.A. Enhancing knowledge utilization as a strategy for school improvement. Knowledge 1993, 14, 305–333. [Google Scholar] [CrossRef]
  31. Robinson, V. Why Doesn’t Educational Research Solve Educational Problems? Educ. Philos. Theory 1992, 24, 8–28. [Google Scholar] [CrossRef]
  32. Biesta, G. Why “what works” won’t work: Evidence-based practice and the democratic deficit in educational research. Educ. Theory 2007, 57, 1–22. [Google Scholar] [CrossRef]
  33. Fishman, B.J.; Penuel, W.R.; Allen, A.R.; Cheng, B.H.; Sabelli, N.O.R.A. Design-based implementation research: An emerging model for transforming the relationship of research and practice. Natl. Soc. Study Educ. 2013, 112, 136–156. [Google Scholar] [CrossRef]
  34. Joyce, K.E.; Cartwright, N. Bridging the gap between research and practice: Using a design-based approach to support professional development. Stud. Educ. Eval. 2020, 67, 100832. [Google Scholar]
  35. Ming, T.Y.; Goldenberg, C. Unlocking the potential of school-based research. Educ. Leadership 2021, 78, 53–59. [Google Scholar]
  36. Farley-Ripple, E.; Van Horne, S.; Tilley, K.; Shewchuk, S.; May, H.; Micklos, D.A.; Blackman, H. Survey of Evidence in Education for Schools (SEE-S) Descriptive Report; Center for Research Use in Education: Newark, DE, USA, 2022. [Google Scholar]
  37. Finnigan, K.S.; Daly, A.J.; Che, J. Systemwide reform in districts under pressure: The role of social networks in defining, acquiring, using, and diffusing research evidence. J. Educ. Adm. 2013, 51, 476–497. [Google Scholar] [CrossRef]
  38. Rickinson, M.; De Bruin, K.; Walsh, L.; Hall, M. What can evidence-use in practice learn from evidence-use in policy? Educ. Res. 2017, 59, 173–189. [Google Scholar] [CrossRef]
  39. Owen, K.L.; Watkins, R.C.; Hughes, J.C. From evidence-informed to evidence-based: An evidence building framework for education. Rev. Educ. 2022, 10, e3342. [Google Scholar] [CrossRef]
  40. Hood, P. Scientific Research and Evidence-Based Practice; WestEd: San Francisco, CA, USA, 2003. [Google Scholar]
  41. King, M.B.; Bouchard, K. The capacity to build organizational capacity in schools. J. Educ. Adm. 2011, 49, 653–669. [Google Scholar] [CrossRef]
  42. Thoonen, E.E.; Sleegers, P.J.; Oort, F.J.; Peetsma, T.T. Building school-wide capacity for improvement: The role of leadership, school organizational conditions, and teacher factors. Sch. Eff. Sch. Improv. 2012, 23, 441–460. [Google Scholar] [CrossRef]
  43. Cain, T.; Graves, S. Building a research-informed culture. In Becoming a Research-Informed School; Routledge: London, UK, 2018; pp. 99–119. [Google Scholar]
  44. Coburn, C.E.; Talbert, J.E. Conceptions of evidence use in school districts: Mapping the terrain. Am. J. Educ. 2006, 112, 469–495. [Google Scholar] [CrossRef]
  45. Cooper, A.; Klinger, D.A.; McAdie, P. What do teachers need? An exploration of evidence-informed practice for classroom assessment in Ontario. Educ. Res. 2017, 59, 190–208. [Google Scholar] [CrossRef]
  46. Gorard, S. (Ed.) Getting Evidence into Education: Evaluating the Routes to Policy and Practice; Routledge: London, UK, 2020. [Google Scholar]
  47. Hill, P.T.; Briggs, D.C. State school finance policy: A literature review. Educ. Policy 2020, 34, 703–724. [Google Scholar]
  48. Honig, M.I.; Venkateswaran, N.; McNeil, P. Research use as learning: The case of fundamental change in school district central offices. Am. Educ. Res. J. 2017, 54, 938–971. [Google Scholar] [CrossRef]
  49. Supovitz, J.A.; Klein, V. Mapping a Course for Improved Student Learning: How Innovative Schools Systematically Use Student Performance Data to Guide Improvement. Research Report. Consortium for Policy Research in Education. Available online: https://repository.upenn.edu/cpre_researchreports/39 (accessed on 15 January 2024).
  50. Williams, D.; Coles, L. Teachers’ approaches to finding and using research evidence: An information literacy perspective. Educ. Res. 2007, 49, 185–206. [Google Scholar] [CrossRef]
  51. Corcoran, T.; Fuhrman, S.H.; Belcher, C.L. The district role in instructional improvement. Phi Delta Kappan 2001, 83, 78–84. Available online: https://journals.sagepub.com/doi/pdf/10.1177/003172170108300116 (accessed on 15 January 2024). [CrossRef]
  52. Jabbar, H.; La Londe, P.G.; Debray, E.; Scott, J.; Lubienski, C. How Policymakers Define ‘Evidence’: The politics of research use in New Orleans. Policy Futures Educ. 2014, 12, 1013–1027. [Google Scholar] [CrossRef]
  53. Neal, J.W.; Neal, Z.P.; Lawlor, J.A.; Mills, K.J.; McAlindon, K. What makes research useful for public school educators? Admin. Policy Ment. Health Ment. Health Serv. Res. 2018, 45, 432–446. [Google Scholar] [CrossRef] [PubMed]
  54. Penuel, W.R.; Farrell, C.C.; Allen, A.; Toyama, Y.; Coburn, C.E. What research district leaders find useful. Educ. Policy 2018, 32, 540–568. [Google Scholar] [CrossRef]
  55. Massell, D.; Goertz, M.E.; Barnes, C.A. State Education Agencies’ Acquisition and Use of Research Knowledge for School Improvement. Peabody J. Educ. 2012, 87(5), 609–626. [Google Scholar] [CrossRef]
  56. Asen, R.; Gurke, D.; Conners, P.; Solomon, R.; Gumm, E. Research Evidence and School Board Deliberations: Lessons from Three Wisconsin School Districts. Educ. Policy 2013, 27, 33–63. [Google Scholar] [CrossRef]
  57. Brown, C.; Zhang, D. How can school leaders establish evidence-informed schools: An analysis of the effectiveness of potential school policy levers. Educ. Manag. Adm. Leadersh. 2017, 45, 382–401. [Google Scholar] [CrossRef]
  58. Cordingley, P. Research and evidence-informed practice: Focusing on practice and practitioners. Camb. J. Educ. 2008, 38, 37–52. [Google Scholar] [CrossRef]
  59. Gleeson, J.; Rickinson, M.; Walsh, L.; Salisbury, M.; Cirkony, C. Quality leadership, quality research use: The role of school leaders in improving the use of research. Aust. Educ. Lead. 2020, 42, 27–30. [Google Scholar]
  60. Brooks, J.S.; Rickinson, M.; Wilkinson, J. School principals and evidence use: Possibilities and problems for preparation and practice. In Evidence and Public Good in Educational Policy, Research and Practice; Springer: Berlin, Germany, 2017; pp. 159–174. [Google Scholar]
  61. Biddle, B.J.; Saha, L.J. The Untested Accusation: Principals, Research Knowledge, and Policy Making in Schools; Greenwood Publishing Group: Westport, CT, USA, 2002. [Google Scholar]
  62. Penuel, W.R.; Briggs, D.C.; Davidson, K.L.; Herlihy, C.; Sherer, D.; Hill, H.C.; Farrell, C.; Allen, A.R. How school and district leaders access, perceive, and use research. AERA Open 2017, 3, 2332858417705370. [Google Scholar] [CrossRef]
  63. Godfrey, D. Leadership of schools as research-led organisations in the English educational environment: Cultivating a research-engaged school culture. Educ. Manag. Adm. Leadersh. 2016, 44, 301–321. [Google Scholar] [CrossRef]
  64. Levin, B. Leadership for evidence-informed education. Sch. Leadersh. Manag. 2010, 30, 303–315. [Google Scholar] [CrossRef]
  65. Levin, J.A.; Datnow, A. The principal role in data-driven decision making: Using case-study data to develop mul-ti-mediator models of educational reform. Sch. Eff. Sch. Improv. 2012, 23, 179–201. [Google Scholar] [CrossRef]
  66. Farley-Ripple, E.N.; Tilley, K.; Mead, H.; Van Horne, S.; Agboh, D. How Is Evidence Use Enacted in Schools? A Mixed Methods Multiple Case Study of "Deep-Use" Schools; Center for Research Use in Education: Newark, DE, USA, 2022. [Google Scholar]
  67. Cosner, S. Supporting the initiation and early development of evidence-based grade-level collaboration in urban elementary schools: Key roles and strategies of principals and literacy coordinators. Urban Educ. 2011, 46, 786–827. [Google Scholar] [CrossRef]
  68. Coburn, C.E.; Spillane, J.P.; Bohannon, A.X.; Allen, A.R.; Ceperich, R.; Beneke, A.; Wong, L.S. The Role of Organizational Routines in Research Use in Four Large Urban School Districts; Tech. Rep. No. 5; National Center for Research in Policy and Practice: Boulder, CO, USA, 2020. [Google Scholar]
  69. Earl, L.M. Leadership for evidence-informed conversations. In Professional Learning Conversations: Challenges in Using Evidence for Improvement; Springer: Berlin, Germany, 2009; pp. 43–52. [Google Scholar]
  70. May, H.; Farley-Ripple, E.N.; Blackman, H.W.; Tilley, K.; Wang, R.; Maynard, R.A.; Louis, K.S.; Karpyn, A. Survey of Evidence in Education-Schools; Center for Research Use in Education, University of Delaware: Newark, DE, USA, 2018. [Google Scholar]
  71. Farley-Ripple, E.; May, H.; Karpyn, A.; Tilley, K.; McDonough, K. Rethinking Connections between Research and Practice in Education: A Conceptual Framework. Educ. Res. 2018, 47, 235–245. [Google Scholar] [CrossRef]
  72. Standards for Educational and Psychological Testing; American Educational Research Association, American Psychological Association, National Council on Measurement in Education: Washington, DC, USA, 2014.
  73. Gelbach, L.B.; Brinkworth, M.E. Designing a System of Assessments to Support Learning; Council of Chief State School Officers: Washington, DC, USA, 2011. [Google Scholar]
  74. May, H.; Blackman, H.; Van Horne, S.; Tilley, K.; Farley-Ripple, E.N.; Shewchuk, S.; Agboh, D.; Micklos, D.A. Survey of Evidence in Education for Schools (SEE-S) Technical Report. Center for Research Use in Education. October 2022. Available online: https://files.eric.ed.gov/fulltext/ED628009.pdf (accessed on 15 January 2024).
  75. Gitomer, D.H.; Crouse, K. Studying the Use of Research Evidence: A Review of Methods; William T. Grant Foundation: New York City, NY, USA, 2019. [Google Scholar]
  76. Brown, C. Research learning communities: How the RLC approach enables teachers to use research to improve their practice and the benefits for students that occur as a result. Res. All 2017, 1, 387–405. [Google Scholar] [CrossRef]
  77. Sheard, M.K.; Sharples, J. School leaders’ engagement with the concept of evidence-based practice as a management tool for school improvement. Educ. Manag. Adm. Leadersh. 2016, 44, 668–687. [Google Scholar] [CrossRef]
  78. Bryk, A.S.; Sebring, P.B.; Allensworth, E.; Luppescu, S.; Easton, J.Q. Survey Measures, Factors, Composite Variables, and Items Used in Organizing Schools for Improvement: Lessons from Chicago; University of Chicago Press: Chicago, IL, USA, 2009. [Google Scholar]
  79. Grissom, J.A.; Loeb, S. Triangulating principal effectiveness: How perspectives of parents, teachers, and assistant principals identify the central importance of managerial skills. Am. Educ. Res. J. 2011, 48, 1091–1123. [Google Scholar] [CrossRef]
  80. Hallinger, P.; Heck, R.H. Exploring the principal’s contribution to school effectiveness: 1980–1995. Sch. Eff. Sch. Improv. 1998, 9, 157–191. [Google Scholar] [CrossRef]
  81. Professional Standards for Educational Leaders 2015; National Policy Board for Educational Administration: Reston, VA, USA, 2015.
  82. Bickel, W.E.; Cooley, W.W. Decision-oriented educational research in school districts: The role of dissemination processes. Stud. Educ. Eval. 1985, 11, 183–203. [Google Scholar] [CrossRef]
  83. Coburn, C.E. Partnership for district reform: The challenges of evidence use in a major urban district. In Research and Practice in Education: Building Alliances, Bridging the Divide; Rowman & Littlefield: Lanham, MD, USA, 2010; pp. 167–182. [Google Scholar]
  84. Honig, M.I. Building policy from practice: District central office administrators’ roles and capacity for implementing collaborative education policy. Educ. Adm. Q. 2003, 39, 292–338. [Google Scholar] [CrossRef]
  85. Hubbard, L. Research to practice: The case of Boston Public Schools, Education Matters and the Boston Plan for Excellence. In Research and Practice in Education: Building Alliances, Bridging the Divide; Rowman & Littlefield: Lanham, MD, USA, 2010; pp. 55–72. [Google Scholar]
  86. Dimmock, C.; Walker, A. Developing comparative and international educational leadership and management: A cross-cultural model. Sch. Leadersh. Manag. 2000, 20, 143–160. [Google Scholar] [CrossRef]
Figure 1. Conceptual framework underlying the present study.
Figure 2. Percentage of respondents reporting having experiences related to research use (n = 4082).
Figure 3. Percentage of respondents reporting agreeing or strongly agreeing about the presence of organizational conditions supporting research use (n = 4109).
Figure 4. Percentage of respondents for each reported frequency (in the last year) of sharing of educational research among school staff (n = 3954).
Figure 5. Percentage of respondents reporting the presence and use of school structures to support research use in the last year (n = 4082).
Figure 6. Percentage of organizational decisions named by respondents in which stakeholders were reported to participate (n = 1311).
Table 1. Illustrative examples of problem framing.
| Problem Response | Decision Response |
| --- | --- |
| Lack of writing instruction, lack of teacher knowledge of effective writing practices and strategies. | Adoption of the Teacher’s College Writing Workshop. |
| Our district intended to increase writing and language arts scores. | |
| A large number of students are reading below grade-level standards. | A new reading program was implemented throughout the district. |
| The problem was the effectiveness of the reading program in the elementary schools. | |
Table 2. Differences in the percentage of school staff and school leaders with research-use-related experiences.
| | School Staff (n = 3913) | Principals (n = 169) | p |
| --- | --- | --- | --- |
| Collected or analyzed data for a research project | 37% | 58% | 0.000 |
| Been involved in a formal research–practice partnership | 8% | 15% | 0.001 |
| Participated in professional development around using research | 27% | 30% | 0.448 |
| Engaged with research through a PLC | 36% | 60% | 0.000 |
| Attended a research conference | 17% | 36% | 0.000 |
| Have taken an introductory statistics course | 48% | 75% | 0.000 |
| Have taken a research methods course | 16% | 31% | 0.000 |
| Another course | 7% | 7% | 0.890 |
| No training or experience | 30% | 6% | 0.000 |
Table 3. Differences in networks for accessing research among school staff and principals.
| | School Staff (n = 2639) | Principals (n = 133) | p |
| --- | --- | --- | --- |
| Proportion of ties that are through intermediaries | 0.35 (SD = 0.30) | 0.30 (SD = 0.28) | 0.256 |
| Proportion of network that is local | 0.54 (SD = 0.33) | 0.33 (SD = 0.32) | 0.304 |
| Proportion of ties that are direct to research | 0.09 (SD = 0.17) | 0.17 (SD = 0.14) | 0.828 |
Table 4. Differences between school staff and principals in percent reporting sharing of research in the last year.
| | School Staff (n = 3788) | Principals (n = 166) |
| --- | --- | --- |
| Never | 34.7% | 7.2% * |
| 1–2 times | 43.0% | 41.0% |
| 3–5 times | 13.7% | 34.9% * |
| More than 5 times | 8.6% | 16.9% * |
* indicates statistically significant difference at the p < 0.05 level, as determined according to post hoc pairwise comparison of column proportions using the Bonferroni correction.
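The significance flags in Table 4 come from pairwise comparisons of column proportions with a Bonferroni correction. As a minimal sketch of that procedure (the paper's exact post hoc routine may differ in detail), each row can be tested with a two-proportion z-test and the significance threshold divided by the number of comparisons; the helper function and the `rows` dictionary below are illustrative, not from the original analysis:

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """Two-sided p-value for a pooled two-proportion z-test."""
    x1, x2 = p1 * n1, p2 * n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF via erf
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Table 4 proportions: school staff (n = 3788) vs. principals (n = 166)
rows = {
    "Never": (0.347, 0.072),
    "1-2 times": (0.430, 0.410),
    "3-5 times": (0.137, 0.349),
    "More than 5 times": (0.086, 0.169),
}
alpha = 0.05 / len(rows)  # Bonferroni-adjusted threshold for 4 comparisons
for label, (staff, principal) in rows.items():
    p = two_prop_z(staff, 3788, principal, 166)
    print(f"{label}: p = {p:.4f}, significant = {p < alpha}")
```

Run on the Table 4 figures, this flags the same three rows (Never, 3–5 times, More than 5 times) and leaves "1–2 times" non-significant, matching the asterisks in the table.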
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Cite as: Farley-Ripple, E.N. The Use of Research in Schools: Principals’ Capacity and Contributions. Educ. Sci. 2024, 14, 561. https://doi.org/10.3390/educsci14060561