Article

Conceptualizing the Education Doctorate (EdD) as a Lever for Improving Education Leaders’ Use of Research Evidence

by Jill Alexa Perry 1,*, Elizabeth Farley-Ripple 2,*, Andrew Leland 3, Samantha Shewchuk 2 and William Firestone 4

1 School of Education, University of Pittsburgh, Pittsburgh, PA 15260, USA
2 School of Education, University of Delaware, Newark, DE 19716, USA
3 City Year Philadelphia, Philadelphia, PA 19123, USA
4 Graduate School of Education, Rutgers University, New Brunswick, NJ 08901, USA
* Authors to whom correspondence should be addressed.
Educ. Sci. 2025, 15(6), 747; https://doi.org/10.3390/educsci15060747
Submission received: 31 January 2025 / Revised: 27 May 2025 / Accepted: 29 May 2025 / Published: 13 June 2025
(This article belongs to the Special Issue Strengthening Educational Leadership Preparation and Development)

Abstract
This paper explores how redesigned Education Doctorate (EdD) programs in educational leadership can serve as a lever for strengthening the use of research evidence (URE) in schools and districts. Drawing on the COM-B framework, we conceptualize a theory of action that links EdD program design to leaders’ capabilities, motivations, and behaviors in applying research to educational improvement. We identify key dimensions of leadership preparation that align with well-documented URE practices. Finally, we discuss how these insights can inform doctoral programs as well as in-service professional learning and suggest the need for additional empirical work on these relationships and a deeper understanding of how local contexts shape the effectiveness of leadership preparation in supporting research-informed decision-making.

1. Introduction

The use of research evidence (URE) in education is widely recognized as a critical lever for improving opportunities and outcomes for our nation’s youth. Educational research, which we define as the product of an activity in which people employ systematic empirical methods to answer a specific question (whether in the form of books and journal articles or local needs assessments and evaluations; Penuel et al., 2016), can provide high-quality information to guide consequential decisions about educational improvement. K-12 education leaders play a critical role in determining what research is seen or used in schools and districts for planning, decision making, and implementing improvement initiatives. Researchers have long observed that local leaders are often the gateway through which research is brought into local practice (Daft & Becker, 1978; Leithwood & Seashore Louis, 2011; J. W. Neal et al., 2015; Wahlstrom & Louis, 1993). In K-12 systems, education leaders serve as lead teachers, school principals, department heads, and district administrators. These leaders are a critical lever in organizational change and improvement, ultimately influencing a wide range of outcomes, from student achievement (Grissom et al., 2021; Supovitz, 2006) to creating more affirming and equitable learning environments (Grissom et al., 2021; Khalifa et al., 2016). Studies of the effects of principals and other school leaders on student achievement reveal that these effects are second only to teacher effects, explaining about one quarter of all school effects (Leithwood et al., 2004). More specifically, research shows that principals exert indirect influence over a number of critical aspects of schooling, including those in-school conditions that enable instructional improvement (Bossert et al., 1982; Hallinger & Murphy, 1985; Leithwood & Montgomery, 1982).
Studies show that leaders often engage with evidence in their practice (Biddle & Saha, 2006; Penuel et al., 2017), broker research evidence (Daly et al., 2014; Farley-Ripple, 2021), and act in ways that support URE (Brown & Zhang, 2017; Honig et al., 2017). Yet, these same leaders may also have limited knowledge about and confidence in using research (Hill & Briggs, 2020; May et al., 2020), as well as difficulty applying research to their context (Finnigan et al., 2013). This discrepancy may mean that many leaders lack the skills and knowledge to make sense of existing research and best use it to improve schools (L. V. Lysenko et al., 2016; Nelson et al., 2009; Supovitz & Klein, 2003). There has been limited inquiry into how leaders develop the capability and motivation to use and lead the use of research evidence. For instance, at the doctoral level, the documentation of exemplary programs (Cosner, 2019; Honig & Donaldson Walsh, 2019) and examination of particular approaches like action research (Buss & Zambo, 2016; Osterman et al., 2014) provide richer descriptions of how educational leaders are prepared, but there remains limited research conceptualizing leadership preparation as leading to improved URE.
This conceptual paper explores the potential for redesigned Education Doctorate (EdD) programs in educational leadership to strengthen school and district leaders’ capacity for research use, describing the extent and ways in which the principles and learning experiences of redesigned EdD programs can support system-wide goals to improve URE among those in school- or district-based educational leadership positions. We argue that EdD programs can be purposefully designed to cultivate leaders’ capabilities and motivation to engage with research. These conditions can support meaningful, sustained URE when aligned with the opportunities available in local contexts. First, we offer a framework linking EdD preparation to URE—a connection that has been suggested but is underdeveloped. Second, the model provides practical guidance for EdD program designers, faculty, and policymakers seeking to strengthen the research-use capacity of education leaders.

2. Foundations of This Work

In this work, we connect the ideas from a behavioral change approach to URE—the Capabilities–Opportunity–Motivation framework for Behavioral Change (COM-B, Michie et al., 2011)—to the underlying principles of the Carnegie Project on the Education Doctorate (CPED) to identify ways in which EdD programs can contribute to stronger URE for educational improvement through the development of educational leaders.
First, we draw on the COM-B framework (Michie et al., 2011) to organize the complex nature of URE. URE, like educational improvement more generally, is complex. It is often described as comprising multiple activities (e.g., search and incorporation; Honig & Coburn, 2008), requiring significant sensemaking and negotiation (e.g., Spillane & Miele, 2007; Honig & Coburn, 2008; Honig & Venkateswaran, 2012), engaging many actors and interactions across levels of the education system (e.g., Coburn & Talbert, 2006; Honig & Venkateswaran, 2012; Finnigan et al., 2013), and being shaped by the local context (e.g., Asen et al., 2013; Huguet et al., 2021; Robinson, 1992). These characteristics suggest that URE is less like an administrative task in which decision-makers might engage (as suggested by evidence use policy) and more like a practice or routine (Rickinson et al., 2020; Spillane & Miele, 2007).
The COM-B framework describes a behavior system in which individual capability, motivation, and opportunity interact to shape behavior (Michie et al., 2011; Yanovitzky & Blitz, 2017). Developed in health psychology, COM-B is a widely recognized framework used across disciplines to understand the conditions that shape behavior. While several theoretical models have been designed to explain behavior change—including the Theory of Planned Behavior (Ajzen, 1991), Social Cognitive Theory (Bandura, 1986), and the Health Belief Model (Rosenstock, 1974)—we chose COM-B because of its parsimonious structure, its emphasis on the interactions between individual and contextual factors, and its demonstrated adaptability across fields, including public health, environmental science, and education. For example, Yanovitzky and Blitz (2017) use the model to examine how public health leaders engage with research in policymaking, identifying capability, opportunity, and motivation as central to whether and how evidence is used. COM-B provides a practical, integrative lens for examining how educational leaders’ behaviors related to research use can be supported through intentional program design. We conceptualize the target behavior as leaders’ URE and opportunity as being provided by the organizational improvement efforts in their local context. We argue here that the specific principles and learning experiences of redesigned EdD programs, described below, can lead to greater URE capabilities and motivations.
Second, the framework is informed by the underlying principles of the EdD in the context of CPED. When it was introduced in 1920, the EdD was intended to offer a rigorous course of study that would enhance candidates’ prior knowledge and skills and better prepare them to lead as school practitioners (Cremin, 1978); it was conceptualized as the professional practice doctorate in education. Although the degree proliferated throughout the 20th century, its academic versus practical nature was frequently debated (D. G. Anderson, 1983; Freeman, 1931; Levine, 2005; Ludlow, 1964; Shulman et al., 2006). A key point in this ongoing debate has been the appropriate amount of, and approach to, research training for students intending to be leaders, not researchers (Clifford & Guthrie, 1988; Levine, 2005; Osguthorpe & Wong, 1993). Some critics have suggested that the EdD only needs to prepare students to understand research methods (Andrews & Grogan, 2005), reducing the research requirements of existing PhD programs to form what Shulman et al. (2006) have described as “PhD-lite” (p. 27) EdD programs. Reformers, on the other hand, argue that the EdD should maintain high expectations for students’ understanding and use of research methods and evidence (Hochbein & Perry, 2013; Shulman et al., 2006), while also having students examine practical problems and identify solutions in their local contexts (G. L. Anderson & Herr, 1999; Cochran-Smith & Lytle, 2009). Reformers contend that using research to address local problems is especially crucial for EdD programs focused on educational leadership, given students’ well-developed professional identities and practical knowledge (Golde, 2013), as well as their current and future positions as key brokers of evidence for their institutions (Daft & Becker, 1978; Daly et al., 2014; J. W. Neal et al., 2015).
EdD redesign efforts often adopt a professional doctorate model that makes research and practice equally salient in developing students’ knowledge, skills, and dispositions (Hoffman & Perry, 2016). This model serves as the bedrock for CPED, a consortium of over 150 EdD programs across the United States, Canada, and Ireland, as well as for other programs that may not be CPED members but are similarly interested in redesigning EdD programs to prepare leaders to effect change by meaningfully and explicitly connecting research to practice. The purpose of these redesigned EdD programs is to integrate research, theory, and practice to provide K-12 educational professionals with the knowledge and tools necessary to identify, investigate, and lead change to address real school system problems (Hoffman & Perry, 2016). We consider EdD programs to be ‘redesigned EdD programs’ when they carefully integrate adult learning principles and engage leaders with student-chosen problems of practice (POPs), applied research courses that provide hands-on experiences, and a dissertation in practice that is embedded into the program. We argue here that such programs are consequently more likely to produce education leaders who engage in URE.
By integrating the COM-B model and the principles underlying CPED programs, we conceptualize a framework that links redesigned EdD program features and learning experiences that promote URE, the resulting capabilities and motivation for URE in graduating education leaders, and graduates’ URE (behavior) in the context of organizational improvement efforts (opportunity). We elaborate on these components and their relationship by drawing on extant literature, beginning with framing URE using COM-B and then mapping the resulting ideas onto CPED.

3. URE and the COM-B Framework

The COM-B model posits that behavior (B), conceptualized as URE, results from the interaction of three essential components: capability (C)—the individual’s psychological and physical capacity to engage in the behavior, including the necessary knowledge and skills; opportunity (O)—external factors that enable or prompt the behavior, including environmental and social influences; and motivation (M)—internal processes that energize and guide behavior, such as beliefs, emotions, goals, and intentions. This framework is beneficial for understanding research use not as a simple or rational decision but as a situated practice shaped by individual characteristics and systemic conditions.
Applied to the context of EdD programs, COM-B helps conceptualize how leadership preparation can build the necessary capabilities (e.g., research literacy, critical thinking), motivation (e.g., valuing evidence-informed decision-making), and opportunities (e.g., collaborative settings, applied inquiry) to support sustained engagement with research in practice.

3.1. Behavior as URE

The process of using research evidence happens in a range of ways. Weiss’s typology is often referenced in explaining these differences (Weiss, 1979; Weiss & Bucuvalas, 1980); however, the types of uses have evolved over time, as has the language used to describe them. We describe here the ways in which URE is operationalized in the study of education. Normative expectations for research use often focus on instrumental use—the direct uptake of research in decision-making. For example, Penuel et al. (2017) describe instrumental uses as including cases where research is used in purchasing an intervention, adopting a curriculum, or designing professional development. Conceptual use refers to the role of research in shaping how individuals think about or understand an issue. Farrell and Coburn (2016) offer examples of how research can be useful in introducing new concepts, seeing an issue in a new light, broadening the set of potential solutions to a problem, or providing a framework to guide action. Persuasive use, which is sometimes referred to as political use, occurs when research is used to generate buy-in or discredit a potential policy or practice. Asen et al. (2013) illustrate this type of use in their description of school board deliberations. Often lumped in under persuasive use is what we call symbolic use. Symbolic use (originally termed tactical; Weiss, 1979) captures attempts to meet normative expectations for engaging in evidence-based decisions, but often in inauthentic or rhetorical ways, for example with “the research says” (Farley-Ripple, 2012). Imposed or sanctioned use refers to external requirements for the use of research or research-based programs, including requirements to choose programs or curricula from a pre-selected set of evidence-based options (Kochanek et al., 2015; Penuel et al., 2017; Yoshizawa, 2020), reflecting increased expectations for evidence use in education policy. Latent or embedded use captures the integration of research into the development of routines, tools, and other resources used by practitioners or policymakers (Cain & Allan, 2017; Coburn et al., 2020).
Different types of URE are enacted in different aspects of leadership. As noted earlier, educational leaders are important contributors to organizational change, as both significant users of research and mediators of evidence use in the schools and districts they lead. Findings from recent literature suggest three ways in which leaders engage in URE: their own use, brokering research, and leading URE (Farley-Ripple, 2024).
Leaders at school and district levels often engage in research use as part of their own decision-making and as part of collective decision-making. In a study of research use in central offices, for example, Honig et al. (2017) explored how leaders use research to shape their own practices. Other studies have identified that leaders’ modeling of research use in their own practice has an influence on larger URE practices in schools (Brown & Zhang, 2017; Earl, 2009). A study of school and district leaders (Penuel et al., 2017) found that leaders frequently reported using research in supporting a wide range of activities, most notably designing professional learning for teachers and staff and directing resources to school programs or practices. Importantly, studies of leaders’ use of research find that leaders engage in a wide range of the uses described above, including instrumental, conceptual, tactical, etc., and that the particular form of URE may depend on the issue at hand, the decision context, the particular users and audience of the research, and the level of the system at which use occurs.
A second critical URE function for leaders is their role as knowledge brokers. As Meyer (2010) explains, knowledge brokers can be broadly understood as those who facilitate the creation, sharing, and use of knowledge and who establish and maintain links between research and practice via the translation of research. In this way, leaders engage in URE by sharing research, facilitating its use, and providing a link to research resources. Through planning efforts, decision making, and overseeing the implementation of improvement initiatives, local leaders are often the gateway to research being brought into local practice (Leithwood & Seashore Louis, 2011; J. W. Neal et al., 2015). Studies of URE in education highlight the ways in which leaders facilitate the discussion of research (e.g., Cosner, 2011) and leverage their own knowledge and networks to bring research into the conversation (e.g., Daly et al., 2014; Honig et al., 2014). Recent findings from the Center for Research Use in Education (CRUE), an IES-funded knowledge utilization center, point to principals and district central offices as two of the most frequently cited sources for research-based information among school staff (Shewchuk & Farley-Ripple, 2021).
A third way in which education leaders engage in URE is through their leadership. Research suggests that leaders develop human capital for URE through staffing decisions and professional development (Brown & Zhang, 2017; B. Levin, 2010; J. A. Levin & Datnow, 2012). Honig et al. (2017) also point to the influence of superintendent support on the URE capabilities of staff. Similarly, qualitative data from CRUE (Farley-Ripple et al., 2022a, 2023) reveal that school and district leaders hire staff with specific evidence use capabilities, build capacity through selecting staff to serve on teams that engage with evidence in improvement decisions, and engage consultants or district supports to assist teachers with evidence use. Leaders also integrate research and local knowledge to set the vision and direction for school improvement. This includes diagnostic and tactical leadership to better understand school processes and context (Cosner, 2011), forming goals linked to organizational needs (J. A. Levin & Datnow, 2012), and establishing an evidence-based agenda (Brown & Zhang, 2017). Relatedly, leaders strengthen the culture of evidence use through the establishment of organizational routines and tools centered on evidence use (Cosner, 2011; B. Levin, 2010; J. A. Levin & Datnow, 2012); the allocation of resources to enable evidence use (Brown & Zhang, 2017; J. A. Levin & Datnow, 2012); developing shared leadership (Cosner, 2011; J. A. Levin & Datnow, 2012); cultivating a culture of collaboration, trust, and inquiry (Cosner, 2011; Earl, 2009; J. A. Levin & Datnow, 2012); strengthening teams and interpersonal processes (Cosner, 2011); and fostering internal social capital and networks (Brown & Zhang, 2017; J. A. Levin & Datnow, 2012).
Both the types of URE and the leaders’ role in URE are integral components of what we conceptualize as behavior. Leaders may engage in multiple types of URE as part of educational decision-making and improvement efforts. Thus, improving URE in educational policy and practice may entail strengthening leaders’ engagement in any or all types of URE.

3.2. Opportunity for URE

Returning to the COM-B framework, opportunity is defined as “the factors that lie outside the individual that make the behaviour possible or prompt it” (Michie et al., 2011, p. 4). That is, URE does not occur in a vacuum, but rather opportunity is created or suppressed by the affordances in the work context in which the research user operates. In some instances, it may be a predictable consequence of a planned change, while in others it may be unanticipated and time-bound (M. D. Cohen et al., 1972). Several environmental factors shape leaders’ opportunities for URE, including characteristics of the organization and characteristics of the decision.
Studies consistently find that organizational context enables or constrains research use. Schools and districts, as well as other levels of the education policy system, vary in terms of human and financial resources to support evidence use; allocation of time for evidence use; a culture that features trust, collaboration, and norms for evidence use; leadership; and structures and processes that facilitate communication of and about evidence (Asen et al., 2013; Brown & Zhang, 2017; Cordingley, 2008; Coburn & Talbert, 2006; Coburn et al., 2009; Penuel et al., 2017; Supovitz & Klein, 2003). Studies also note that the nature and extent of search depends on a range of individual and organizational factors, such as the internal resources and expertise available, relationships with external organizations and networks, and the extent to which search is a formally designated activity (Honig & Coburn, 2008; Massell et al., 2012; Penuel et al., 2017). Together, these factors relate to the organization’s capacity to incorporate external information—often described as absorptive capacity (W. M. Cohen & Levinthal, 1990; Farrell & Coburn, 2017)—and the institutional logics that shape how work is conducted (Thornton et al., 2015; Horn et al., 2015).
Additionally, the decision context matters. There may be decisions for which the use of research is not necessary, or for which evidence is overwhelming and conflicting, or for which many different research bases are relevant (Farley-Ripple et al., 2020). Prior research suggests that URE is shaped by research evidence’s availability, compatibility with the local context, and relevance to the issue at hand (Jabbar et al., 2014; Z. Neal et al., 2018; Penuel et al., 2018; Robinson, 1992). Decision contexts may also vary in terms of participants, social and cultural norms surrounding the issue, and anticipated controversy (Huguet et al., 2021). Further, how problems are understood can influence which solutions are proposed and the arguments used to support them (Coburn et al., 2009). The relationship between problems and decisions reflects theories of action, which may shape the opportunity for evidence use, the kind of evidence that might be used, and how it is used. As Robinson (1992) explains, practitioners’ judgments of the relevance of research findings “reflect [their] understanding of the problem, the solution, and their role in each” (p. 10), so research that does not reflect those understandings is likely to be deemed irrelevant.
Organizational and decision characteristics reflect the context of leaders’ work, demanding both knowledge of how to navigate opportunities for URE as well as knowledge of how to cultivate those opportunities.

3.3. Conceptualizing Leaders’ URE Capabilities

Prior research suggests a wide range of knowledge and skills associated with individuals’ URE capabilities. Hochbein and Perry (2013) suggest that educational leaders need capabilities across three broad domains in terms of research use: decipher, discuss, and design. Drawing on recent literature, we extend these domains to include the additional capabilities of discover, disseminate, and drive. These six capabilities form a framework for conceptualizing the demands of leaders’ URE work.
Two core skills for all leaders’ URE roles are discovering and deciphering research; education leaders need to discover research before they can decipher it. Discovery requires leaders to have skills related to obtaining evidence from different sources (Bayley et al., 2018; Firestone et al., 2021; Mallidou et al., 2018), including knowing how to identify and access reliable sources of research. Education leaders must then be able to decipher research findings, which includes understanding different research methodologies and learning how to critically appraise research as well as to evaluate its fit for the local context (Farley-Ripple et al., 2022b; Bayley et al., 2018; Mallidou et al., 2018). Prior research establishes that educators, including educational leaders, may vary in their capabilities and motivation to use research evidence. This includes their ability to effectively analyze and apply evidence, their beliefs about the role and value of evidence to their work, and their perceptions of feasibility and available resources (Cooper et al., 2017; Farley-Ripple et al., 2022b; Massell et al., 2012; Gorard, 2020; Honig et al., 2017; Supovitz & Klein, 2003; Williams & Coles, 2007). Further, research suggests that decision-makers’ individual experiences and beliefs impact the way in which evidence is processed (Corcoran et al., 2001; Jabbar et al., 2014; Z. Neal et al., 2018; Penuel et al., 2018).
In their roles as brokers and leaders of organizational URE, education leaders need additional capabilities, such as skills related to disseminating research, including strategies for synthesizing, summarizing, translating, and communicating research knowledge to audiences from diverse backgrounds (Bayley et al., 2018; Mallidou et al., 2018). Dissemination also relies on the individual’s social ties and networks to effectively mobilize information throughout their organization. Studies also show that educators’ and leaders’ own professional networks matter in terms of the kinds of evidence they search for and use (Finnigan et al., 2013; Massell et al., 2012; Williams & Coles, 2007). Therefore, leaders need to understand how to develop and leverage social capital. Hochbein and Perry (2013) also argue that education leaders must be able to discuss “the ideological and methodological merits” (p. 187) of research and evidence-based practices with constituents. For example, Coburn (2001) describes leaders’ roles in the sensemaking process of the school community. The capacity for such discussion includes a mix of strong social capital as well as more specific skills in facilitating, negotiating, questioning, and influencing stakeholders to support change (Bayley et al., 2018; Mallidou et al., 2018). Additionally, education leaders must be able to design practical solutions to local problems of practice using research knowledge. Firestone et al. (2024) report examples of leaders who used research when designing a teacher evaluation and professional growth program and when designing and adopting a district-level curriculum.
Finally, leaders need to be able to drive evidence-informed change within their respective organizations. This entails integrating aforementioned capacities with improvement-focused skills such as change management, relationship management, and capacity building (Bayley et al., 2018; Mallidou et al., 2018; Thoonen et al., 2012; Cain & Graves, 2018).

3.4. Conceptualizing Leaders’ URE Motivation

Motivation is understood as a “function of key cognitions that underlie the decision to enact a certain behavior” (Yanovitzky & Blitz, 2017, p. 5897). Although motivation is a core component of the COM-B framework, few studies or frameworks explicate the dimensions of motivation that are relevant to URE. In their application of the COM-B framework to data use, Yanovitzky and Blitz (2017) highlight several dimensions of motivation that reflect the larger theoretical literature on motivation, which links expectancy theories (e.g., Vroom, 1965) with self-efficacy (e.g., Bandura, 1986) and goal theory (e.g., Karoly, 1993). Relatedly, research from a learning sciences perspective suggests that factors such as educators’ perceptions of their own agency, their relational expertise, the normative aspects of institutional practice, and risk-taking influence educators’ URE (Hofmann, 2024). We narrow these ideas into three broad sources of motivation related to leaders’ URE: self- and collective efficacy, outcome expectancy, and value expectancy.
Self-efficacy refers to confidence in one’s ability to be successful in achieving a certain goal or standard, whereas collective efficacy reflects a belief in a team’s or organization’s ability to be successful. In the context of URE, these sources of motivation refer to leaders’ beliefs in their own ability to successfully engage in URE as well as beliefs about their organization’s collective ability, reflecting both their own use as well as their ability to lead for URE. Greater self- and collective-efficacy may increase the motivation to engage in URE. Another source of motivation, outcome expectancy, pertains to confidence that behavior will lead to desired outcomes, in this case, leaders’ belief that URE will lead to desired improvements (e.g., student learning). Finally, value expectancy refers to the belief that the anticipated outcomes resulting from behavior are valuable. In other words, the personal, organizational, or student outcomes associated with engaging in URE have a high level of value to educational leaders. Thus, the more important leaders believe the benefits of URE are, the more motivated they will be to engage in it.
Research suggests that decision makers’ individual experiences and beliefs impact the way in which evidence is processed in ways consistent with these sources of motivation. For example, studies have shown that educators who believe research or data are valuable to their practice are more likely to use it (Brown, 2017; L. Lysenko et al., 2014; Coburn & Turner, 2011; Rickinson et al., 2020). Others have suggested that research self-efficacy or research literacy may result in greater use (Coldwell et al., 2017; Ion et al., 2021).
Yanovitzky and Blitz (2017) also describe additional sources of motivation, such as norms for evidence use, anticipated barriers to implementing evidence-based practice (EBP), or external accountability for EBP (e.g., to state or federal governments). However, these are contextually driven, and, given our purposes of understanding the potential role of EdD programs in shaping URE, are a better fit to our conceptualization of opportunity for URE. The three sources of motivation described here might be considered ‘internal’ motivations, which Langer et al. (2016) describe as subject to influence through evidence-based decision making (EBDM) interventions, such as EdD programs.

4. The Redesigned EdD and Leaders’ Preparation for URE

The premise of this paper is that EdD programs can be conceptualized as a lever for strengthening URE and, in turn, strengthening the role of research in educational improvement. To this point, we have laid out the behaviors, opportunities, capabilities, and motivations related to leaders’ engagement in URE. Here, we use CPED-redesigned EdD programs as an illustration of program features that can support URE. As noted above, the purpose of these redesigned programs is to integrate research, theory, and practice to provide K-12 educational professionals with the knowledge and tools necessary to identify, investigate, and lead change to address real school system problems (Hoffman & Perry, 2016).
Redesigned EdD programs are shorter than traditional PhD programs (often 3–4 years) and are geared towards part-time practitioners who remain in practice and engage their work settings in their courses and dissertations. Specifically, redesigned EdD programs center professional practice and the practitioner in their design: students enter the program with a focus on a complex problem of practice (POP), defined as “a persistent, contextualized, and specific issue embedded in the work of a professional practitioner, the addressing of which has the potential to result in improved understanding, experience, and outcomes” (CPED, 2010). Students are taught how to address their POP by applying theories and research methodologies, with the goal of ameliorating the problem by testing interventions, conducting needs assessments, evaluating outcomes, and using data to determine the impacts on the POP. They complete a dissertation in practice (DiP), defined as a “scholarly endeavor that impacts a complex problem of practice” (CPED, 2010), which is often embedded into the program and completed during coursework. The DiP gives students the opportunity to practice research skills alongside several faculty instructors and provides hands-on practice in applying theory and research to solve problems.
Below, we explain more specifically the principles and learning experiences of redesigned EdD programs, followed by a discussion of how their implementation advances goals for improving the use of research evidence.

4.1. Principles and Learning Experiences of Redesigned EdD Programs

Adult learning research suggests that an effective EdD program follows three principles: it (a) builds upon the experience-based mental models learners use to guide their thinking, (b) engages learners in rich and interactive learning experiences to expand their mental models, and (c) gives learners experiences in which they use their new knowledge and skills to address local problems of practice (Knowles et al., 2015; Merriam, 2018; Mezirow, 2000; Sheckley et al., 2010). Based on prior empirical work (Firestone et al., 2021, 2024), we argue that redesigned EdD programs often incorporate these adult learning principles as a means to improve evidence-based decision-making. Evidence further suggests that incorporating adult learning principles into program design increases both learners’ capacity and motivation to use evidence (Langer et al., 2016).
Adult learning theory suggests that adult learners learn best when their existing knowledge and the problems they face are incorporated into the teaching of a concept (Knowles et al., 2015; Merriam, 2018; Mezirow, 2000). Thus, when EdD programs are connected to a problem of practice that adult learners face, learners are more engaged because of the link between academic studies and professional practice. The POPs that students in redesigned EdD programs choose often build upon their prior experiences and become the focus of their doctoral work early in the program. Research on intrinsic motivation suggests that self-chosen POPs will energize students’ work over the course of the program (Deci & Ryan, 2000). Additionally, students will link new learning to a rich network of prior experiences (Damasio, 1999) and build upon their prior explicit and tacit knowledge (Clark & Elen, 2006).
To expand the mental models they use, students engage in coursework that provides new and different ways of thinking about their leadership work through rich and interactive learning experiences with peers. For example, redesigned EdD programs devote a substantial portion of coursework to teaching research methods through hands-on opportunities. What distinguishes this research methods training, whether in methods courses or in assignments in other courses, is that students practice using research designs, studies, evaluations, and even searches for best practices to collect data (sometimes quantitative, sometimes qualitative, sometimes prior research) and to analyze those data—all skills they will need as future research users, brokers, and leaders (Osterman et al., 2014). Redesigned EdD programs also require students to complete courses in educational leadership. These courses provide students with research-based knowledge about their social contexts, the political–social environment of schools and schooling, the sociology and psychology of learning (especially as it applies to schools), and leadership practices. Through opportunities to apply such knowledge to their own work situations and those of their peers during leadership courses, students gain experience in reviewing and discussing research and in leading URE (e.g., integrating research and local knowledge to set the vision and direction, building human capital).
Consistent with other research highlighting the importance of learning from peers (e.g., Dreier, 2009; Penuel et al., 2016; DiGiacomo et al., 2016; Wenger, 1998), Langer et al. (2016) found that multi-mechanism interventions that include interaction among professionals can build a professional identity and common practices and standards of conduct in relation to evidence use. Redesigned EdD programs have two mutually reinforcing mechanisms to promote such interactions: cohorts and collaborative, deprivatized learning activities related to coursework. Cohorts are a model of learning that creates a group of students who enter the program together, move through the various learning experiences together, and graduate at about the same time. This mechanism gives students a general sense of belonging and camaraderie (Browne-Ferrigno & Muth, 2012; Leland et al., 2020), with the objective of cultivating collaborative practice. The EdD cohort model can create an especially helpful context for continuous interactions among students through the establishment of norms and values that promote free, honest communication (Browne-Ferrigno & Muth, 2012; Jones et al., 2005). As students enter their cohorts having identified a POP and go on to analyze shared professional issues, the physical and institutional settings of their working contexts become prominent learning components, allowing them to draw from each other’s backgrounds when making sense of new information (Leland et al., 2020). In redesigned EdD programs, the cohort model is built upon students having a shared dedication to remaining in practice during and after the program, as well as a shared dedication to becoming agents of educational change. Coursework and assignments that allow for collective learning as well as reflection help to develop collaborative, deprivatized work relationships among students and between students and faculty (Firestone et al., 2021; Leland et al., 2020).
Finally, redesigned programs support students in gaining experience applying new knowledge and skills to local problems of practice. Sheckley et al. (2010) argued that bringing an individual’s existing knowledge and experience together with interactive learning opportunities in a specific learning environment can enhance adult learning. Through POPs and other activities, redesigned EdD programs require students to apply knowledge and skills to local issues during their coursework and their dissertation projects. Redesigned EdD programs help their students make connections between the seemingly abstract ideas that are taught in academic courses and the day-to-day realities they face. For instance, when helping students assess the quality of research evidence, in addition to introducing students to academic ideas of methodological rigor and designs that control for rival hypotheses, faculty help students think about how to assess the relevance of findings to their situation and their utility for addressing particular questions. Moreover, specific data collection and analysis activities—such as equity audits—may be carried out to analyze particular questions in specific schools or districts, and results are often shared with the people from whom the data are collected. Similarly, the POP organizing the student’s doctoral work is typically a real issue in their organization, so the social science and research methods knowledge learned is connected to the student’s own settings (Firestone et al., 2020). Thus, redesigned EdD programs use these applied experiences and the POP as strategies for building students’ capabilities (i.e., discover, decipher, discuss, disseminate, design, drive) for URE across all academic courses. After completing their coursework, students in redesigned programs should have a well-formulated and complex POP that they will use as the focus for their respective dissertation projects. During the project, students apply the skills they have learned throughout the program to address their POP and implement their ideas in their work settings.
Our intent is not to present the redesigns we describe as the best or only approach to organizing EdD programs; indeed, there are critiques of practitioner-oriented research preparation, the quality of dissertations, and the limitations of such programs for preparing researchers (Storey & Hesbol, 2014; Zacharakis & Thompson, 2013). Rather, our purpose is to highlight how these redesigns may support learning that strengthens URE among educational leaders. We acknowledge that a discussion of the advantages and disadvantages of EdD program designs is important, but that topic goes beyond the scope of this paper.

4.2. How Redesigned EdD Principles Can Influence Capability, Motivation, and Opportunity

In what follows, we illustrate how EdD programs—particularly those grounded in adult learning and applied inquiry—can shape each element of this behavior system to promote meaningful and sustained engagement with research.
Redesigned EdD programs build leaders’ capabilities for URE through both coursework and the dissertation phase. Across methods and leadership courses, students gain experience in searching for research (e.g., using databases), critically appraising evidence (e.g., evaluating validity, reliability, applicability), synthesizing and translating research findings, and applying evidence in workplace contexts. They also learn to plan and implement practice change (e.g., conduct needs assessments, develop implementation plans), enhancing their skills to promote evidence-based practices. Structured and interactive classroom practices also help build critical URE skills. Activities such as group papers, fishbowl discussions, and think-pair-share encourage students to evaluate the ideological and methodological merits of research collaboratively. These structured opportunities strengthen students’ abilities to communicate findings to diverse audiences, thus developing both individual and collective URE capabilities. These experiences develop students’ URE capabilities, such as discover, decipher, discuss, disseminate, and drive, and help grow their design capabilities and value expectancies for engaging in URE (Firestone et al., 2020, 2024; Langer et al., 2016). The dissertation phase further develops these capabilities. Students begin by situating their POP within the research literature (discover), analyzing how it applies to their local contexts (decipher), and applying appropriate research methodologies (decipher). They then gather and analyze data (design), develop recommendations and action plans (drive), synthesize findings into a practitioner-focused product (disseminate), and defend their conclusions before faculty (discuss). These experiences align with evidence from other fields suggesting that engaging in applied research can enhance orientation toward and confidence in URE (Janssen et al., 2013).
Redesigned EdD programs not only prepare leaders to use research evidence themselves but also to navigate and shape opportunities for URE. Through sustained engagement with their own POP, students examine how structural, cultural, and political conditions affect their ability—and that of their organization—to apply evidence in decision-making. This reflection encourages students to develop an awareness of external motivators and barriers to research use, helping them diagnose opportunities and constraints within their organizational contexts. Students also learn how to cultivate environments conducive to URE. For example, through assignments and cohort interactions, they explore how to lead collaborative inquiry, promote evidence-informed practices, and create systems for sharing knowledge within their workplaces. These are not just technical skills, but also strategic leadership competencies—graduates learn how to influence the social and structural conditions that shape decision-making, making them better equipped to foster a research-informed culture. Furthermore, both coursework and the cohort model support this capacity by helping students develop bridging and bonding social capital (Patulny & Svendsen, 2007). These relational assets—nurtured through group projects, peer discussion, and faculty feedback—allow students to access diverse information sources (bridging capital) and to mobilize collective action within teams and networks (bonding capital). Research suggests these forms of social capital, developed in EdD programs and carried into work contexts, are empirically linked to increased research use (Leland et al., 2020; Firestone et al., 2021, 2024).
Focusing on a POP throughout the program enhances students’ motivation by increasing relevance, ownership, and perceived efficacy. This focus reinforces both self-efficacy and response efficacy for URE—students believe they can engage in research and that doing so will produce valuable outcomes. It also supports awareness of how external motivating factors and conditions influence one’s ability to use research effectively. This “honest attempt to focus on the learner” aligns with adult learning principles (Knowles et al., 2015, p. 5) and shows how EdD programs intentionally center student motivations in their design. Experiential coursework also plays a key role in increasing internal motivation. Firestone et al. (2020, 2024) found that graduates who had opportunities to apply methods in real-life settings reported more extensive use of research post-graduation. Similarly, Langer et al. (2016) found that applied research training—like that found in redesigned EdD programs—enhances decision-makers’ capability and motivation to use evidence. Through these experiences, students develop a sense of efficacy and begin to see themselves and their peers as capable of driving evidence-informed change in their organizations. Cohort-based learning further strengthens motivation, as extended peer interaction fosters a collective sense of purpose. The social reinforcement of seeing others value and practice URE encourages students not only to learn these skills during the program but to apply them long-term in decision-making contexts (Firestone et al., 2020).
While formal inquiry into the effects of EdD preparation on URE is sorely lacking in the field, emerging evidence suggests that these principles and learning experiences translate to URE later in leaders’ careers, notably demonstrating links between experiential methods instruction, leaders’ extended time together in cohorts, and bridging and bonding social capital, on the one hand, and more extensive use of research evidence after graduation, on the other (Firestone et al., 2020, 2021, 2024).

5. Framework Linking Redesigned EdD Programs to URE in Practice

Incorporating our understanding of URE in schools and districts, including the relevant capabilities, motivations, opportunities, and behaviors, as well as our understanding of redesigned EdD programs, Figure 1 (below) integrates these components into a cohesive framework. From left to right, the principles and learning experiences in redesigned EdD programs are intended to develop URE-related capabilities and motivations among their graduates, as well as the ability to navigate and cultivate opportunity, represented both as an outcome and as the larger context of leaders’ URE in practice. These collectively contribute to EdD graduates’ URE behavior, represented as their own use of research, their work as brokers, and their leadership for URE.
This theory of action clarifies the key dimensions of URE, identifying the underlying capabilities and motivations that support leaders’ roles in fostering evidence-informed decision-making, as well as the features of redesigned programs that support their development. By drawing on the COM-B framework, this perspective highlights how redesigned EdD programs can cultivate the necessary knowledge, skills, and dispositions that enable leaders to effectively interpret, apply, and facilitate the use of research within their schools and districts in the ways described in the literature. Additionally, this theory suggests specific dimensions of doctoral leadership preparation that may enhance well-documented URE practices. By embedding targeted experiences that develop leaders’ capacity for URE, EdD programs can serve as a critical lever in advancing evidence-informed decision-making.
If EdD programs are to serve as levers for strengthening URE in educational leadership, they must influence the three key components of behavior identified in the COM-B framework: capability, motivation, and opportunity (Michie et al., 2011). While many programs already include elements that support URE, using COM-B as a conceptual lens brings clarity and coherence to how these components can be purposefully cultivated through program design.
Building capability begins with equipping leaders with the knowledge and skills to locate, interpret, evaluate, and apply research in practice. Research methods coursework is essential, but its effectiveness depends on design: students must learn about research and engage with it in ways that reflect their real-world leadership challenges (Osterman et al., 2014). Activities such as conducting needs assessments, designing equity audits, and leading cycles of inquiry enable students to practice using research tools in context, thereby deepening technical fluency and situational awareness (Janssen et al., 2013). Leadership courses that explore the political, social, and organizational dimensions of schooling further support students in integrating evidence into their strategic thinking (J. A. Levin & Datnow, 2012). Across coursework, scaffolded opportunities to analyze, synthesize, and communicate research help students become not just consumers of evidence but leaders of its use across schools and systems (Brown & Zhang, 2017; Mallidou et al., 2018).
EdD programs also play a critical role in cultivating leaders’ motivation to use research. When programs center learning around student-selected problems of practice (POPs), they create authentic contexts that increase engagement and reinforce the relevance of evidence (Deci & Ryan, 2000; Knowles et al., 2015). As students use research to address challenges that matter to them and their communities, their confidence in its utility grows (Clark & Elen, 2006; Damasio, 1999). Repeated opportunities for success, supported by targeted feedback and reflection, strengthen self-efficacy (Bandura, 1986; Coldwell et al., 2017). Witnessing the tangible impact of research-informed actions further reinforces outcome and value expectancy—key psychological drivers of sustained motivation (Yanovitzky & Blitz, 2017). Through these experiences, students come to see URE not as an external requirement but as a meaningful, integral part of their professional identity (Langer et al., 2016).
To influence opportunity, EdD programs must prepare leaders to navigate and shape the environments in which research is—or isn’t—used. This means developing an understanding of how organizational routines, structures, and norms can either enable or constrain URE (Asen et al., 2013; Coburn & Talbert, 2006; Supovitz & Klein, 2003). Through peer discussion, structured reflection, and applied coursework, students learn to diagnose contextual conditions and strategize ways to build absorptive capacity within their schools and districts (W. M. Cohen & Levinthal, 1990; Farrell & Coburn, 2017). Leadership preparation can also foreground the political and social dynamics of evidence use, helping students learn how to build cultures of trust, create time for collaborative inquiry, and advocate for research-informed change (Cosner, 2011; B. Levin, 2010). Program structures such as cohorts and collaborative assignments mirror the professional communities supporting evidence use in practice, helping students develop social capital and habits of de-privatized inquiry that carry into their workplaces (Leland et al., 2020; Firestone et al., 2021).
Framing EdD preparation through the COM-B model clarifies the pathway from program design to behavioral change. It shifts the focus from general leadership development to the targeted cultivation of the learning conditions that enable URE. When EdD programs embed opportunities to build capability, strengthen motivation, and expand opportunity into their curriculum and pedagogy, they function as more than academic credentials—they become systems-level interventions. In doing so, they prepare graduates not only to use research but also to lead its use, fostering a culture of evidence-informed decision-making in their organizations.

6. Broader Implications and Conclusions

As described earlier, URE is widely regarded as a lever for educational improvement, and leaders are highly influential in URE practices in schools and districts. At the same time, there has been little attention to how to design and support leadership development that leads to greater engagement with research in improvement efforts. The framework presented in this paper offers much-needed direction as to how leadership programs can strengthen URE and identifies the specific principles and learning experiences by which that can happen. These mechanisms may be important building blocks for developing effective leaders with the knowledge and skills to lead evidence-informed improvement. Furthermore, these mechanisms offer strategies that are or can be embedded in courses across common leadership preparation curricula, such that preparing leaders for URE is part of broader preparation rather than separate from or in addition to it. Given the expansive roles and responsibilities of leaders, this embeddedness is an important feature of the theory of action and one that makes these mechanisms realistic and feasible levers for strengthening URE. Importantly, these are not merely hypothetical mechanisms, but ones that are currently enacted in a number of redesigned EdD programs, thereby offering models that can be adopted or adapted by other programs.
While the primary focus of this paper has been on the design and implementation of EdD programs in educational leadership, these insights likely have implications for other forms of in-service professional learning, offering a broader pathway for strengthening URE across the careers of educational leaders. For example, professional learning available through leadership professional learning communities, coaching, or principal supervisors may incorporate learning experiences that parallel those described here in redesigned EdD programs and/or may explicitly address capabilities and motivations associated with URE. Furthermore, expanding this theory of action to in-service learning may also create means of addressing opportunities for URE more directly, as such learning is situated in, and therefore better able to influence, local contexts than EdD programs, which have limited influence on local leadership contexts.
Of course, our theory of action is a first step in clarifying EdD programs as a lever for strengthening URE. Future research should empirically test these proposed relationships to assess both the presence and strength of the anticipated connections between EdD preparation and URE outcomes. Investigating the variability in program implementation will be essential to understanding which design elements most effectively support research engagement. Furthermore, a deeper exploration of how these preparation experiences interact with the local contexts in which URE is enacted may uncover additional supports necessary for leaders to successfully support and engage in URE in their schools and districts. Nonetheless, we offer an important conceptual framework for better aligning collective efforts to improve education through the design and leveraging of leadership programs to advance URE in education, with the ultimate goal of strengthening educational outcomes and opportunities for all.

Author Contributions

All authors contributed to the conceptualization, methodology, analysis, writing, and review of this manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not Applicable.

Informed Consent Statement

Not Applicable.

Data Availability Statement

No data were used for this conceptual paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50(2), 179–211. [Google Scholar] [CrossRef]
  2. Anderson, D. G. (1983). Differentiation of the EdD and Ph.D. in education. Journal of Teacher Education, 34(3), 55–58. [Google Scholar] [CrossRef]
  3. Anderson, G. L., & Herr, K. (1999). The new paradigm wars: Is there room for rigorous practitioner knowledge in schools and universities? Educational Researcher, 28(5), 12–40. [Google Scholar] [CrossRef]
  4. Andrews, R., & Grogan, M. (2005). Form should follow function: Removing the EdD dissertation from the PhD straight jacket. UCEA Review, 46(2), 10–13. Available online: https://digitalcommons.chapman.edu/education_articles/178/ (accessed on 25 September 2021).
  5. Asen, R., Gurke, D., Conners, P., Solomon, R., & Gumm, E. (2013). Research evidence and school board deliberations: Lessons from three Wisconsin school districts. Educational Policy, 27(1), 33–63. [Google Scholar] [CrossRef]
  6. Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory. Prentice-Hall. [Google Scholar]
  7. Bayley, J. E., Phipps, D., Batac, M., & Stevens, E. (2018). Development of a framework for knowledge mobilisation and impact competencies. Evidence & Policy: A Journal of Research, Debate and Practice, 14(4), 725–738. [Google Scholar] [CrossRef]
  8. Biddle, B. J., & Saha, L. J. (2006). How principals use research. Educational Leadership, 63(6), 72–77. Available online: https://eric.ed.gov/?id=EJ745567 (accessed on 13 April 2025).
  9. Bossert, S. T., Dwyer, D. C., Rowan, B., & Lee, G. V. (1982). The instructional management role of the principal. Educational Administration Quarterly, 18(3), 34–64. [Google Scholar] [CrossRef]
  10. Brown, C. (2017). Enabling teachers to critically engage with education research: Insights from the UK. Teaching and Teacher Education, 63, 147–157. [Google Scholar] [CrossRef]
  11. Brown, C., & Zhang, D. (2017). How can school leaders establish evidence-informed schools: An analysis of the effectiveness of potential school policy levers. Educational Management Administration & Leadership, 45(3), 382–401. [Google Scholar] [CrossRef]
  12. Browne-Ferrigno, T., & Muth, R. (2012). Call for research on Candidates in leadership preparation programs. Planning and Changing, 43, 10–24. [Google Scholar]
  13. Buss, R. R., & Zambo, D. (2016). Using action research to develop educational leaders and researchers. In J. A. Perry (Ed.), The EdD and the scholarly practitioner: The CPED path (pp. 137–152). Information Age Publishing. [Google Scholar]
  14. Cain, T., & Allan, D. (2017). The invisible impact of educational research. Oxford Review of Education, 43(6), 718–732. [Google Scholar] [CrossRef]
  15. Cain, T., & Graves, S. (2018). Building a research-informed culture. In Becoming a research-informed school (pp. 99–119). Routledge. [Google Scholar]
  16. Carnegie Project on the Education Doctorate (CPED). (2010). Guiding principles for program design. Available online: https://www.cpedinitiative.org/the-framework (accessed on 19 September 2020).
  17. Clark, R. E., & Elen, J. (2006). When less is more: Research and theory insights about instruction for complex learning. In J. Elen, & R. E. Clark (Eds.), Handling complexity in learning environments: Theory and research (pp. 283–295). Elsevier Publishing Company. [Google Scholar]
  18. Clifford, G. J., & Guthrie, J. W. (1988). Ed school: A brief for professional education. University of Chicago Press. [Google Scholar]
  19. Coburn, C. E. (2001). Collective sensemaking about reading: How teachers mediate reading policy in their professional communities. Educational Evaluation and Policy Analysis, 23(2), 145–170. [Google Scholar] [CrossRef]
  20. Coburn, C. E., Spillane, J. P., Bohannon, A. X., Allen, A.-R., Ceperich, R., Beneke, A., & Wong, L.-S. (2020). The role of organizational routines in research use in four large urban school districts (Technical Report No. 5). National Center for Research in Policy and Practice. Available online: https://files.eric.ed.gov/fulltext/ED612257.pdf (accessed on 11 April 2025).
  21. Coburn, C. E., & Talbert, J. E. (2006). Conceptions of evidence use in school districts: Mapping the terrain. American Journal of Education, 112(4), 469–495. [Google Scholar] [CrossRef]
  22. Coburn, C. E., Touré, J., & Yamashita, M. (2009). Evidence, interpretation, and persuasion: Instructional decision making at the district central office. Teachers College Record, 111(4), 1115–1161. [Google Scholar] [CrossRef]
  23. Coburn, C. E., & Turner, E. O. (2011). Research on data use: A framework and analysis. Measurement: Interdisciplinary Research & Perspective, 9(4), 173–206. [Google Scholar] [CrossRef]
  24. Cochran-Smith, M., & Lytle, S. (2009). Teacher research as stance. In S. E. Noffke, & B. Somekh (Eds.), The SAGE handbook of educational action research (pp. 39–49). Sage Publications. [Google Scholar]
  25. Cohen, M. D., March, J. G., & Olsen, J. P. (1972). A garbage can model of organizational choice. Administrative Science Quarterly, 17(1), 1–25. [Google Scholar] [CrossRef]
  26. Cohen, W. M., & Levinthal, D. A. (1990). Absorptive capacity: A new perspective on learning and innovation. Administrative Science Quarterly, 35(1), 128–152. [Google Scholar] [CrossRef]
  27. Coldwell, M., Greany, T., Higgins, S., Brown, C., Maxwell, B., Stiell, B., Stoll, L., Willis, B., & Burns, H. (2017). Evidence-informed teaching: An evaluation of progress in England. Department for Education. Available online: https://assets.publishing.service.gov.uk/media/5a749aca40f0b61938c7ece0/Evidence-informed_teaching_-_an_evaluation_of_progress_in_England.pdf (accessed on 11 April 2025).
  28. Cooper, A., Klinger, D. A., & McAdie, P. (2017). What do teachers need? An exploration of evidence-informed practice for classroom assessment in Ontario. Educational Research, 59(2), 190–208. [Google Scholar] [CrossRef]
  29. Corcoran, T., Fuhrman, S. H., & Belcher, C. L. (2001). The district role in instructional improvement. Phi Delta Kappan, 83(1), 78–84. [Google Scholar] [CrossRef]
  30. Cordingley, P. (2008). Research and evidence-informed practice: Focusing on practice and practitioners. Cambridge Journal of Education, 38(1), 37–52. [Google Scholar] [CrossRef]
  31. Cosner, S. (2011). Supporting the initiation and early development of evidence-based grade-level collaboration in urban elementary schools: Key roles and strategies of principals and literacy coordinators. Urban Education, 46(4), 786–827. [Google Scholar] [CrossRef]
  32. Cosner, S. (2019). What makes a leadership preparation program exemplary? Journal of Research on Leadership Education, 14(1), 98–115. [Google Scholar] [CrossRef]
  33. Cremin, L. A. (1978). The education of the educating professions [Speech transcript]. American Association of Colleges for Teacher Education. Available online: https://files.eric.ed.gov/fulltext/ED148829.pdf (accessed on 19 September 2020).
  34. Daft, R. L., & Becker, S. W. (1978). The innovative organization: Innovation adoption in school organizations. Elsevier Publishing Company. [Google Scholar]
  35. Daly, A. J., Moolenaar, N. M., Finnigan, K. S., Jordan, S., & Che, J. (2014). Misalignment and perverse incentives: Examining the politics of district leaders as brokers in the use of research evidence. Educational Policy, 28(2), 145–174. [Google Scholar] [CrossRef]
  36. Damasio, A. (1999). The feeling of what happens: Body and emotion in the making of consciousness. Harcourt Brace & Company. [Google Scholar]
  37. Deci, E., & Ryan, R. (2000). The “what” and “why” of goal pursuits: Human needs and the self determination of behavior. Psychological Inquiry, 11(4), 227–268. [Google Scholar] [CrossRef]
  38. DiGiacomo, D. K., Prudhomme, J. J., Jones, H. R., Welner, K. G., & Kirshner, B. (2016). Why theory matters: An examination of contemporary learning time reforms. Education Policy Analysis Archives, 24, 44. Available online: https://epaa.asu.edu/index.php/epaa/article/view/2334 (accessed on 11 April 2025). [CrossRef]
  39. Dreier, O. (2009). Persons in structures of social practice. Theory and Psychology, 9(2), 193–212. [Google Scholar] [CrossRef]
  40. Earl, L. M. (2009). Leadership for evidence-informed conversations. In L. M. Earl, & H. Timperely (Eds.), Professional learning conversations: Challenges in using evidence for improvement (pp. 43–52). Springer. [Google Scholar]
  41. Farley-Ripple, E. N. (2012). Research use in school district central office decision making: A case study. Educational Management Administration & Leadership, 40(6), 786–806. [Google Scholar] [CrossRef]
  42. Farley-Ripple, E. N. (2021). Research brokerage: How research enters and moves through schools (Research Brief). The Center for Research Use in Education, University of Delaware (IES Grant R305C150017). [Google Scholar]
  43. Farley-Ripple, E. N. (2024). The use of research in schools: Principals’ capacity and contributions. Education Sciences, 14(6), 561. [Google Scholar] [CrossRef]
  44. Farley-Ripple, E. N., Mead, H., & Tilley, K. (2023). Organising for research use: Lessons from four deep-use schools. In Who really cares about using education research in policy and practice? (pp. 174–190). Educational Research and Innovation, OECD Publishing. [Google Scholar] [CrossRef]
  45. Farley-Ripple, E. N., Tilley, K., Mead, H., Van Horne, S., & Agboh, D. (2022a). How is evidence use enacted in schools? A mixed methods multiple case study of "deep-use" schools. Center for Research Use in Education. [Google Scholar]
  46. Farley-Ripple, E. N., Tilley, K., Sheridan, S., & Gallimore, R. (2020). What improvement challenges do schools face? Searching for ways research might help. Teachers College Record. [Google Scholar]
  47. Farley-Ripple, E. N., Van Horne, S., Tilley, K., Shewchuk, S., May, H., Micklos, D. A., & Blackman, H. (2022b). Survey of evidence in education for schools (SEE-S) descriptive report. Center for Research Use in Education, University of Delaware. [Google Scholar]
  48. Farrell, C. C., & Coburn, C. E. (2016). What is conceptual use of research, and why is it so important? (blog). William T. Grant Foundation. Available online: https://wtgrantfoundation.org/conceptual-use-research-important (accessed on 11 April 2025).
  49. Farrell, C. C., & Coburn, C. E. (2017). Absorptive capacity: A conceptual framework for understanding district central office learning. Journal of Educational Change, 18(2), 135–159. [Google Scholar] [CrossRef]
  50. Finnigan, K. S., Daly, A. J., & Che, J. (2013). Systemwide reform in districts under pressure: The role of social networks in defining, acquiring, using, and diffusing research evidence. Journal of Educational Administration, 51(4), 476–497. [Google Scholar] [CrossRef]
  51. Firestone, W. A., Louis, K. S., Leland, A. S., & Perry, J. A. (2024). Learning to use research evidence: The case of the education doctorate. Journal of Research on Leadership Education, 19(1), 122–143. [Google Scholar] [CrossRef]
  52. Firestone, W. A., Perry, J. A., Leland, A. S., & McKeon, R. T. (2021). Teaching research and data use in the education doctorate. Journal of Research on Leadership Education, 16(1), 81–102. [Google Scholar] [CrossRef]
  53. Firestone, W. A., Seashore, L. K., Perry, J. A., & Leland, A. (2020). How the education doctorate teaches leaders to use research evidence. Society for Research on Educational Effectiveness Conference. Available online: https://www.youtube.com/watch?v=z8Q8Xl_GChg&feature=youtu.be (accessed on 19 September 2020).
  54. Freeman, F. N. (1931). Practices of American universities in granting higher degrees in education: A series of official statements (Vol. 19). University of Chicago Press. [Google Scholar]
  55. Golde, C. (2013). Mapping the transformation of the EdD student. In J. A. Perry, & D. L. Carlson (Eds.), In their own words: A journey to stewardship of practice of education (pp. 139–151). Information Age Publishing. [Google Scholar]
  56. Gorard, S. (Ed.). (2020). Getting evidence into education: Evaluating the routes to policy and practice. Routledge. [Google Scholar]
  57. Grissom, J. A., Egalite, A. J., & Lindsay, C. A. (2021). How principals affect students and schools: A systematic synthesis of two decades of research. The Wallace Foundation. Available online: http://www.wallacefoundation.org/principalsynthesis (accessed on 1 September 2020).
  58. Hallinger, P., & Murphy, J. (1985). Assessing the instructional management behavior of principals. The Elementary School Journal, 86(2), 217–247. [Google Scholar] [CrossRef]
  59. Hill, H. C., & Briggs, D. C. (2020). Education leaders’ knowledge of causal research design: A measurement challenge (EdWorkingPaper: 220–298). Annenberg Institute at Brown University. [Google Scholar] [CrossRef]
  60. Hochbein, C., & Perry, J. A. (2013). The role of research in the professional doctorate. Planning & Changing, 44(3/4), 181–195. Available online: https://eric.ed.gov/?id=EJ1145893 (accessed on 1 September 2020).
  61. Hoffman, R. L., & Perry, J. A. (2016). The cped framework. In J. A. Perry (Ed.), The EdD and the scholarly practitioner. Information Age Publishing. [Google Scholar]
  62. Hofmann, R. (2024). The four paradoxes that stop practitioners from using research to change professional practice and how to overcome them. Education Sciences, 14(9), 996. [Google Scholar] [CrossRef]
  63. Honig, M. I., & Coburn, C. (2008). Evidence-based decision making in school district central offices: Toward a policy and research agenda. Educational Policy, 22(4), 578–608. [Google Scholar] [CrossRef]
  64. Honig, M. I., & Donaldson Walsh, E. (2019). Learning to lead the learning of leaders: The evolution of the university of Washington’s education doctorate. Journal of Research on Leadership Education, 14(1), 51–73. [Google Scholar] [CrossRef]
  65. Honig, M. I., & Venkateswaran, N. (2012). School–central office relationships in evidence use: Understanding evidence use as a systems problem. American Journal of Education, 118(2), 199–222. [Google Scholar] [CrossRef]
  66. Honig, M. I., Venkateswaran, N., & McNeil, P. (2017). Research use as learning: The case of fundamental change in school district central offices. American Educational Research Journal, 54(5), 938–971. [Google Scholar] [CrossRef]
  67. Honig, M. I., Venkateswaran, N., McNeil, P., & Twitchell, J. M. (2014). Leaders’ use of research for fundamental change in school district central offices: Processes and challenges. In K. S. Finnigan, & A. J. Daly (Eds.), Using research evidence in education: From the schoolhouse door to capitol hill (pp. 13–32). Springer. [Google Scholar]
  68. Horn, I. S., Kane, B. D., & Wilson, J. (2015). Making sense of student performance data: Data use logics and mathematics teachers’ learning opportunities. American Educational Research Journal, 52(2), 208–242. [Google Scholar] [CrossRef]
  69. Huguet, A., Coburn, C. E., Farrell, C. C., Kim, D. H., & Allen, A. R. (2021). Constraints, values, and information: How leaders in one district justify their positions during instructional decision making. American Educational Research Journal, 58(4), 1–38. [Google Scholar] [CrossRef]
  70. Ion, G., Suárez, C. I., & Díaz-Vicario, A. (2021). Evidence-informed educational practice in Catalan education: From public agenda to teachers’ practice. Center for Educational Policy Studies Journal, 11(2), 37–57. [Google Scholar] [CrossRef]
  71. Jabbar, H., La Londe, P. G., Debray, E., Scott, J., & Lubienski, C. (2014). How policymakers define ‘evidence’: The politics of research use in New Orleans. Policy Futures in Education, 12(8), 1013–1027. [Google Scholar] [CrossRef]
  72. Janssen, J., Hale, L., Mirfin-Veitch, B., & Harland, T. (2013). Building the research capacity of clinical physical therapists using a participatory action research approach. Physical Therapy, 93(7), 923–934. [Google Scholar] [CrossRef] [PubMed]
  73. Jones, S., Gilbride-Brown, J., & Gasiorski, A. (2005). Getting inside the “underside” of service-learning: Student resistance and possibilities. In D. Butin (Ed.), Service learning in higher education: Critical issues and directions (pp. 3–24). Palgrave Macmillan. [Google Scholar]
  74. Karoly, P. (1993). Mechanisms of self-regulation: A systems view. Annual Review of Psychology, 44(1), 23–52. [Google Scholar] [CrossRef]
  75. Khalifa, M. A., Gooden, M. A., & Davis, J. E. (2016). Culturally responsive school leadership: A synthesis of the literature. Review of Educational Research, 86(4), 1272–1311. [Google Scholar] [CrossRef]
  76. Knowles, M. S., Holton, E. F., & Swanson, R. A. (2015). The adult learner: The definitive classic in adult education and human resource development (8th ed.). Routledge Publishing. [Google Scholar]
  77. Kochanek, J. R., Scholz, C., & Garcia, A. N. (2015). Mapping the collaborative research process. Education Policy Analysis Archives, 23(21), 121. [Google Scholar] [CrossRef]
  78. Langer, L., Tripney, J. S., & Gough, D. (2016). The science of using science: Researching the use of research evidence in decision-making (EPPI-Centre reports 3504). EPPI-Center, Social Science Research Unit, UCL Institute of Education. Available online: https://discovery.ucl.ac.uk/id/eprint/1493171/ (accessed on 1 September 2020).
  79. Leithwood, K., Louis, K. S., Anderson, S., & Wahlstrom, K. (2004). How leadership influences student learning. Review of research. The Wallace Foundation. Available online: https://eric.ed.gov/?id=ED485932 (accessed on 1 September 2020).
  80. Leithwood, K., & Montgomery, D. J. (1982). The role of the elementary school principal in program improvement. Review of Educational Research, 52(3), 309–339. [Google Scholar] [CrossRef]
  81. Leithwood, K., & Seashore Louis, K. (Eds.). (2011). Linking leadership to student learning. John Wiley & Sons. [Google Scholar]
  82. Leland, A. S., Firestone, W. A., Perry, J. A., & McKeon, R. T. (2020). Examining cohort models in the education doctorate. Studies in Graduate and Postdoctoral Education, 11(3), 249–262. [Google Scholar] [CrossRef]
  83. Levin, B. (2010). Leadership for evidence-informed education. School Leadership and Management, 30(4), 303–315. [Google Scholar] [CrossRef]
  84. Levin, J. A., & Datnow, A. (2012). The principal role in data-driven decision making: Using case-study data to develop multi-mediator models of educational reform. School Effectiveness and School Improvement, 23(2), 179–201. [Google Scholar] [CrossRef]
  85. Levine, A. (2005). Educating school leaders. The Education Schools Project. Available online: https://files.eric.ed.gov/fulltext/ED504142.pdf (accessed on 19 September 2020).
  86. Ludlow, H. G. (1964). The doctorate in education. American Association of Colleges for Teacher Education. [Google Scholar]
  87. Lysenko, L., Abrami, P. C., Bernard, R. M., Dagenais, C., & Janosz, M. (2014). Research use in education: An exploration of the role of research literacy. Educational Psychology, 34(3), 254–272. [Google Scholar]
  88. Lysenko, L. V., Abrami, P. C., Bernard, R. M., & Dagenais, C. (2016). Research use in education: An online survey of school practitioners. Brock Education: A Journal of Educational Research and Practice, 25(1), 35–54. Available online: https://files.eric.ed.gov/fulltext/EJ1105482.pdf (accessed on 1 September 2020). [CrossRef]
  89. Mallidou, A. A., Atherton, P., Chan, L., Frisch, N., Glegg, S., & Scarrow, G. (2018). Core knowledge translation competencies: A scoping review. BMC Health Services Research, 18(1), 1–15. [Google Scholar] [CrossRef]
  90. Massell, D., Goertz, M. E., & Barnes, C. A. (2012). State education agencies’ acquisition and use of research knowledge for school improvement. Peabody Journal of Education, 87(5), 609–626. [Google Scholar] [CrossRef]
  91. May, H., Farley-Ripple, E. N., Blackman, H., Wang, R., Tilley, K., & Shewchuk, S. (2020). Individual and school-level capacity to critically evaluate research: A multilevel organizational analysis. Presentation accepted for the Society of Research on Educational Effectiveness. [Google Scholar]
  92. Merriam, S. B. (2018). Adult learning theory: Evolution and future directions. In K. Illeris (Ed.), Contemporary theories of learning (pp. 83–96). Routledge. [Google Scholar]
  93. Meyer, M. (2010). The rise of the knowledge broker. Science Communication, 32(1), 118–127. [Google Scholar] [CrossRef]
  94. Mezirow, J. (2000). Learning to think like an adult: Core concepts of transformation theory. In J. Mezirow (Ed.), Learning as transformation: Critical perspectives on a theory in progress (pp. 3–33). Jossey Bass. [Google Scholar]
  95. Michie, S., Van Stralen, M. M., & West, R. (2011). The behaviour change wheel: A new method for characterising and designing behaviour change interventions. Implementation Science, 6(1), 1–12. [Google Scholar] [CrossRef]
  96. Neal, J. W., Neal, Z. P., Kornbluh, M., Mills, K. J., & Lawlor, J. A. (2015). Brokering the research–practice gap: A typology. American Journal of Community Psychology, 56(3), 422–435. [Google Scholar] [CrossRef]
  97. Neal, Z., Neal, J. W., Mills, K., & Lawlor, J. (2018). Making or buying evidence: Using transaction cost economics to understand decision making in public school districts. Evidence & Policy: A Journal of Research, Debate and Practice, 14(4), 707–724. [Google Scholar] [CrossRef]
  98. Nelson, S. R., Leffler, J. C., & Hansen, B. A. (2009). Toward a research agenda for understanding and improving the use of research evidence. Northwest Regional Educational Laboratory (NWREL). Available online: https://eric.ed.gov/?id=ED506962 (accessed on 1 September 2020).
  99. Osguthorpe, R. T., & Wong, M. J. (1993). The Ph.D. versus the Ed.D.: Time for a decision. Innovative Higher Education, 18(1), 47–63. [Google Scholar] [CrossRef]
  100. Osterman, K., Furman, G., & Sernak, K. (2014). Action research in EdD programs in educational leadership. Journal of Research on Leadership Education, 9(1), 85–105. [Google Scholar] [CrossRef]
  101. Patulny, R. V., & Svendsen, G. L. H. (2007). Exploring the social capital grid: Bonding, bridging, qualitative, quantitative. International Journal of Sociology and Social Policy, 27(1/2), 32–51. [Google Scholar] [CrossRef]
  102. Penuel, W. R., Briggs, D. C., Davidson, K. L., Herlihy, C., Sherer, D., Hill, H. C., Farrell, C., & Allen, A.-R. (2017). How school and district leaders access, perceive, and use research. AERA Open, 3(2), 1–17. [Google Scholar] [CrossRef]
  103. Penuel, W. R., Briggs, D. C., Davidson, K. L., Herlihy, C., Sherer, D., Hill, H. C., Farrell, C. C., & Allen, A.-R. (2016). Findings from a national survey of research use among school and district leaders (Technical Report No. 1). National Center for Research in Policy and Practice. [Google Scholar]
  104. Penuel, W. R., Farrell, C. C., Allen, A., Toyama, Y., & Coburn, C. E. (2018). What research district leaders find useful. Educational Policy, 32(4), 540–568. [Google Scholar] [CrossRef]
  105. Rickinson, M., Walsh, L., Cirkony, C., Salisbury, M., & Gleeson, J. (2020). Quality use of research evidence framework. Monash University. Available online: https://www.monash.edu/education/research/projects/qproject/publications (accessed on 1 September 2020).
  106. Robinson, V. (1992). Why doesn’t educational research solve educational problems? Educational Philosophy and Theory, 24(2), 8–28. [Google Scholar] [CrossRef]
  107. Rosenstock, I. M. (1974). The health belief model and preventive health behavior. Health Education Monographs, 2(4), 354–386. [Google Scholar] [CrossRef]
  108. Sheckley, B. G., Donaldson, M. L., Mayer, A. P., & Lemons, R. W. (2010). An Ed.D. program based on principles of how adults learn best. In G. Jean-Marie, & A. Normore (Eds.), Educational leadership preparation (pp. 173–202). Palgrave Macmillan. [Google Scholar]
  109. Shewchuk, S., & Farley-Ripple, E. N. (2021, April 8–12). Mapping the "third space" between research and practice. Presented at the Annual Meeting of the American Educational Research Association, Virtual. [Google Scholar]
  110. Shulman, L. S., Golde, C. M., Bueschel, A. C., & Garabedian, K. J. (2006). Reclaiming education’s doctorates: A critique and a proposal. Educational Researcher, 35(3), 25–32. [Google Scholar] [CrossRef]
  111. Spillane, J. P., & Miele, D. B. (2007). Evidence in practice: A framing of the terrain. Yearbook of the National Society for the Study of Education, 106(1), 46–73. [Google Scholar] [CrossRef]
  112. Storey, V. A., & Hesbol, K. A. (2014). Can the dissertation in practice bridge the researcher-practitioner gap? The education professional practice doctorate and the impact of the carnegie project on the education doctorate consortium. Journal of School Public Relations, 35(3), 324–327. [Google Scholar] [CrossRef]
  113. Supovitz, J. A. (2006). The case for district-based reform: Leading, building and sustaining school improvement. Harvard Education Press. [Google Scholar]
  114. Supovitz, J. A., & Klein, V. (2003). Mapping a course for improved student learning: How innovative schools systematically use student performance data to guide improvement (Research Report). Consortium for Policy Research in Education. Available online: https://repository.upenn.edu/cpre_researchreports/39 (accessed on 1 September 2020).
  115. Thoonen, E. E., Sleegers, P. J., Oort, F. J., & Peetsma, T. T. (2012). Building school-wide capacity for improvement: The role of leadership, school organizational conditions, and teacher factors. School Effectiveness and School Improvement, 23(4), 441–460. [Google Scholar] [CrossRef]
  116. Thornton, P. H., Ocasio, W., & Lounsbury, M. (2015). The institutional logics perspective. In Emerging Trends in the Social and Behavioral Sciences (pp. 1–22). Wiley. [Google Scholar] [CrossRef]
  117. Vroom, V. H. (1965). Motivation in management. American Foundation for Management Research. [Google Scholar]
  118. Wahlstrom, K., & Louis, K. S. (1993). Adoption revisited: Decision-making and school district policy. In S. Bachrach, & R. Ogawa (Eds.), Advances in research and theories of school management and educational policy (Vol. 1, pp. 61–119). JAI Publishing House. [Google Scholar]
  119. Weiss, C. H. (1979). The many meanings of research utilization. Public Administration Review, 39(5), 426–431. [Google Scholar] [CrossRef]
  120. Weiss, C. H., & Bucuvalas, M. J. (1980). Social science research and decision-making. Columbia University Press. [Google Scholar]
  121. Wenger, E. (1998). Communities of practice: Learning, meaning, and identity. Cambridge University Press. [Google Scholar]
  122. Williams, D., & Coles, L. (2007). Teachers’ approaches to finding and using research evidence: An information literacy perspective. Educational Research, 49(2), 185–206. [Google Scholar] [CrossRef]
  123. Yanovitzky, I., & Blitz, C. (2017, July 3–5). The Capacity-Opportunity-Motivation (COM) model of data-informed decision making in education. International Conference on Education and New Learning Technologies (pp. 5895–5901), Barcelona, Spain. [Google Scholar]
  124. Yoshizawa, L. (2020). Research use under pressure: State and district implementation of the ESSA evidence requirements [Doctoral dissertation, Harvard University]. Available online: https://www.proquest.com/docview/2457700532?pq-origsite=gscholar&fromopenview=tru (accessed on 1 September 2020).
  125. Zacharakis, J., & Thompson, D. C. (2013). In response to J. A. Perry: Professional thoughts on what is the meaning and unintended consequence of the Carnegie Project on the Education Doctorate? UCEA Review, 54(3), 10–14. [Google Scholar]
Figure 1. Elaborated framework linking redesigned EdD programs to URE in practice.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
