1. Introduction
Data use in education has been defined as “drawing on and interacting with information in the course of decision making” [
1] (p. 99). On its face, the purpose of data use seems relatively simple. For school leaders, the basic idea is that they should have access to various types of data that they can analyze to help determine the best path forward for their schools. A number of scholars—in the U.S. and internationally—have designed programs that provide a straightforward process for using data to inform educational decisions [
2,
3], yet the spectrum of policies, standards, and regulations focused directly or indirectly on informing and guiding data use is vast [
4,
5]. The simplicity of data use processes in theory seems to belie the complexity of data use processes in practice.
Despite relatively clear policy expectations in the Every Student Succeeds Act (ESSA) to use data to improve U.S. schools [
6], the process of preparing leaders for executing this work is complex. National principal leadership standards, such as those from the National Policy Board for Educational Administration (NPBEA), offer some high-level direction without explicit instruction [
7]. State leadership standards, like those advanced by the Virginia Department of Education (VDOE), can be more specific but still include an array of responsibilities that can have competing demands (e.g., instructional leadership, school climate, and organizational management) [
8]. Additionally, leadership preparation programs (LPPs) have their own set of credentialing standards to meet, such as those set forth by the Council for the Accreditation of Educator Preparation (CAEP) [
9]. Perhaps it should not be surprising that approaches to integrating data use in LPPs appear to be inconsistent [
10], or that research suggests that principals are frequently underprepared to lead effective data use practices in schools [
11]. Given the exploratory nature of this study and the lack of consistent data use terminology and framing across national and state standards, policy, and research, we loosely define data use as the systematic collection, analysis, and interpretation of data for the purpose of responding strategically in ways that improve education and educational outcomes. This conception of data use is closely associated with the definition provided by Kippers et al. as “educators’ ability to implement DBDM [data-based decision making] … defined as educators’ ability to set a purpose, collect, analyze, and interpret data and take instructional action” [
12] (p. 21). Our review of the Virginia state performance standards for school leaders later in this article enumerates the many—and varying—ways data and their use can be conceptualized for school leaders and their preparation programs.
Although there is a substantial and growing body of research on educator data use, there is considerably less research focused on school leaders [
13] and essentially none on how LPPs prepare leaders to understand different types of data or retrieve, analyze, interpret, and act on data in the U.S. [
14] or elsewhere [
15,
16]. Despite the many responsibilities that school leaders hold, U.S. policies and accreditation standards center the school principal as the person primarily responsible for ensuring that data use efforts result in strategic organizational and instructional improvement. Determining whether school leaders should facilitate data team meetings, observe and support teacher data use practices, or engage in myriad other potential ways to improve schools was beyond the scope of this research. In this landscape study of one state, we relied on Virginia state leadership standards as our conceptual framing, and LPPs’ interpretations of and responses to those standards as the drivers of what counts with regard to data use. Moreover, this study is a direct response to Mandinach and Gummer’s nearly decade-old call for more intentional study of data use in educator preparation programs, as well as this special issue’s focus on strengthening leadership preparation and development [
17]. To do so, we conducted a mixed methods study inclusive of all 19 LPPs in the state of Virginia to answer the following research question:
How are LPPs designed to prepare pre-service leaders to lead and engage in data use in schools in ways that are responsive to Virginia state leadership standards?
This study offers an important contribution to understanding if and how future school leaders are being prepared to lead data use in schools. Given substantial national [
18,
19] and international [
20,
21] investment in data use for informing school and instructional improvement decisions, it is critically important that we expand upon our current limited understanding of the challenges LPPs face in preparing pre-service leaders to lead data use [
13,
14]. In the remainder of this article, we first provide a review of related strands of the literature on data use and leadership preparation, including a consideration of standards and competencies. Next, we report on the methods, including an overview of the Virginia LPP context. We then present our findings before concluding with a discussion of implications.
4. Methods
This study examined leadership preparation programs in Virginia to understand how these programs are designed to prepare pre-service leaders to lead and engage in data use. Program design includes courses and content, delivery approach, and experiential components (e.g., internship). We examined how program coursework addressed data use and prepared students to lead data use in schools. We employed a case study methodology and collected multiple forms of qualitative data to gather information about LPPs [
71]. We chose a case study approach because the methodology allows for the exploration of a complex phenomenon in its real-life setting [
72].
4.1. Study Sample of Institutions
There are 37 public or private four-year colleges and universities across Virginia. Of these, 20 offer leadership preparation programs and 19 currently enroll pre-service leaders (see
Table 2). In the cases where institutions offered programs with multiple degree and/or certificate options, a core set of courses was common across the programs. As a result, our study focused on school leadership programs (
n = 18) that led to administration and supervision endorsement and a graduate degree (e.g., M.Ed.). These programs offered a graduate degree in leadership and supervision and included more coursework than post-graduate certification programs. However, one institution offered a certificate program only; we included this program in the study because it contributed to a full understanding of the coursework offered at all institutions with leadership preparation programs.
Ten public and nine private institutions describe their leadership preparation programs as offering graduate certificates or master’s degrees in educational leadership (n = 13), administration and supervision (n = 5), or educational administration (n = 1). All institutions offer master’s degrees requiring between 30 and 36 credit hours, while nine institutions also offer graduate certificates requiring between 18 and 24 credit hours. Pre-service leaders in certificate programs previously earned a master’s degree and enrolled in the program for the purpose of obtaining an administrative license. Programs include a mixture of delivery formats: nine are offered fully online, four fully in-person, two hybrid (part online, part in-person), two both fully online and fully in-person, one both hybrid and fully in-person, and one hybrid, in-person, or online. In sum, there are a variety of course-taking options among LPPs in the state.
4.2. Data Sources
We drew on three data sources to explore how programs were designed to include data use content and experiences in school leadership programs: (1) publicly available program course descriptions from each institution’s academic catalog, (2) interviews with program coordinators or administrators, and (3) course syllabi. Course descriptions served as the primary data source and were supplemented by interviews and sample course syllabi. To mask participants’ identities, we reference data sources in the findings section as PU#/Data Source (public institution plus randomly assigned number, followed by data source [CD = course description, SYL = syllabus, or interviewee title]) or PR#/Data Source (private institution plus randomly assigned number, followed by data source).
4.2.1. Course Descriptions
We obtained 197 course descriptions for the 214 publicly listed course titles (descriptions for 17 courses were not available in the public academic catalogs). Course descriptions were found on each institution’s website and were brief summary statements about the main focus and content of the course. It is worth noting that we did not have a record of whether course descriptions were written by the professor of record or by someone else. Course descriptions averaged 56 words and ranged between 13 and 179 words in length. Guidelines for one institution, for example, stipulated a 400-character limit for course descriptions inclusive of text, punctuation, spaces, etc. In general, descriptions included 2–4 sentences about the primary focus of the course and the knowledge and skills pre-service leaders would acquire. The following is a typical course description:
This course provides educators with tools to initiate and sustain continuous improvement to promote all student’s academic success and well-being. Drawing on improvement science, the course provides frameworks and protocols for understanding and leading systemic change in schools and school systems. Activities include authentic application of approaches used to support high-quality teaching and leading in P-12 school systems across the country. (PU2/CD8).
4.2.2. Interviews with Program Coordinators
Each program’s coordinator was invited to participate in a semi-structured interview held via Zoom; seven program coordinators agreed to be interviewed, representing three large public and four private universities. Interviews lasted between 35 and 41 min, were recorded and professionally transcribed for analysis, and produced transcripts totaling 82 pages, with an average of 11.7 pages per interview.
The interviews were guided by a protocol that included nine open-ended questions and follow-up prompts designed to obtain information about how the program approached the teaching of data use and the specific courses that emphasized data use or data-informed decision-making. Questions gathered specific information about the level of the course, the modality, and the types of data emphasized (e.g., achievement, behavioral, attendance, demographic, or environmental). More in-depth questions focused on pre-service leader experiences and opportunities to examine simulated or authentic forms of student performance data to guide plans for school improvement. Additional questions asked how courses developed pre-service leader skills in supporting the use of data and building data use capacities within schools. Probing questions explored the extent to which pre-service leaders were exposed to different strategies, such as modeling, team building, establishing norms and routines for creating school cultures, establishing missions or visions that value inquiry, and making evidence-based decisions.
4.2.3. Course Syllabi
Six program coordinators submitted a total of 23 syllabi that were deemed by the coordinators to be most relevant to understanding data use in their programs. The syllabi were for courses in general school leadership, organizational theory and leadership, community relationships, leading instruction, evaluation and supervision, and internships.
4.3. Limitations of the Data Sources
We note here that our data sources have limitations. Course descriptions are relatively short accounts of course content that highlight what the instructor of record elected to emphasize and likely do not capture all content or many of the nuances within the courses. Further, interviews were conducted primarily with program coordinators, who might have had varying levels of knowledge about the courses. Interviews were also conducted with representatives from only 7 of the 19 institutions. Similarly, program coordinators from only six of the institutions provided syllabi. Thus, the findings we report (e.g., programs’ commitments to clinical experiences) should be understood within the parameters of a limited overall response rate. Institutions and institutional practices might have differed depending on their willingness to engage in this study, as well as the ways in which they engaged. Given the researchers’ employment at two of the universities in Virginia and potential concerns regarding proprietary information and competition for pre-service leaders as students, we anticipated some reluctance to participate.
4.4. Codebook Development and Data Analysis
This section outlines the processes and procedures used to analyze the three data sources: course descriptions, interviews, and course syllabi. We defined data use as the practice of using systematically gathered evidence to inform organizational improvement and/or instructional responses.
Prior to undertaking any analysis, we developed a codebook informed by the conceptual framework that was used to facilitate data exploration, pattern identification, and insight development. A deductive coding system was designed to capture the content and focus of each course description based on our definition of data use and the leadership competencies in the Virginia
Uniform Performance Standards and
the National Professional Standards for Education Leaders [
7,
8]. An initial set of deductive codes was developed to capture leadership competency (e.g., school improvement, instructional improvement, student academic progress, culturally responsive and equitable leadership); nature of data use (e.g., assessment, evaluation, research); data use processes and skills (e.g., collection, organization, analysis, sensemaking); and specific approaches to data use (e.g., action research, inquiry cycle, continuous improvement). These initial codes were applied to a subset of course descriptions, and additional (emergent) codes were identified alongside some refinement of the a priori codes. For example, we found it important to document when course descriptions did not reference data use at all. The final set of codes allowed for an analysis of data use course content and of how data use was described relative to specific leadership areas of responsibility. The analytic approach enabled findings that documented LPP course content and gaps in ways that may guide LPP program development and inform further research. The final version of the codebook was agreed upon by the research team.
4.4.1. Analysis of Course Descriptions
Course descriptions were uploaded into MAXQDA 2022, a software program for computer-assisted analysis of qualitative and mixed methods data. The first step was the application of the codebook to sort and organize the data: codes were applied to sentences within each course description. This initial coding was followed by a second level of coding in which similar codes were collapsed to formulate categories within the data. At the completion of this stage, the researchers noted their impressions, interpretations, and questions (i.e., memoing), which were brought to team meetings for discussion. We also engaged in inductive coding, which involved closely reading the course descriptions and having one team member identify relevant text segments and generate new codes. Typical emergent codes captured course descriptions that did not reference data or data use, or noted whether a course was a stand-alone class on data use. The interviews and course syllabi were analyzed separately, with these analyses serving as supplemental insights into the course description analysis.
Two research team members double-coded 20% of the course descriptions to ensure consistency in code applications. Two rounds of double coding were conducted until a sufficient level of inter-coder agreement was achieved (93%) [
73], after which the raw data were coded independently.
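For readers less familiar with this index, percent agreement of the kind reported here is typically the share of double-coded decisions on which both coders assigned the same code. Because the exact computation is not spelled out in this article, the following formula is a general illustration rather than the authors’ specific procedure:
\[
\text{Percent agreement} = \frac{\text{number of coding decisions on which both coders agreed}}{\text{total number of double-coded decisions}} \times 100\%
\]
For instance, two coders who agree on 93 of 100 double-coded decisions would reach the 93% agreement level reported above.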
4.4.2. Analysis of Interviews
All seven interview transcripts were read by two research team members independently to identify initial emergent themes and patterns. One team member created detailed summaries of each interview. A narrative and thematic approach to analysis was used to identify emergent themes and to understand program design and learning experiences more fully, in ways that supplemented the findings of the course description analysis [
74]. For example, a key area of the transcript summaries was focused on pre-service leaders’ learning opportunities to engage with data.
4.4.3. Analysis of Course Syllabi
One team member analyzed all syllabi, while another team member performed spot reviews to check the coding [
75]. Similar to the analysis of course descriptions, syllabi were first reviewed for explicit references to content associated with data use. This resulted in binary codes indicating whether data were referenced in the syllabus’s course description (not necessarily the same as the course description provided on the university website), in the learning objectives, in any particular week of the course, in reading assignments, or in class assignments (e.g., activity log). The coding also captured whether data use was framed in terms of Virginia school leader competencies, such as instructional leadership. In cases where data were mentioned in syllabi, the relevant words, phrases, assignments, etc., were copied verbatim into a separate file for comparison of the content (e.g., how data were referenced across syllabi within learning objectives).
5. Findings
The research question of this study centered on understanding the ways in which (if any) LPPs are designed to prepare pre-service leaders to lead and engage in data use. Our findings indicate that LPPs in Virginia have undertaken this work to a limited degree. Further, evidence suggests that LPPs designed their programs to focus on the development of leadership moves, often without consideration for information gleaned through data use. Our findings also suggest that LPPs tend to rely on internships for preparing pre-service leaders to lead data use efforts, despite the fact that LPPs infrequently have control over what transpires in clinical settings. In the subsequent sections, we provide a more detailed analysis of these findings as they relate to this study’s research question.
5.1. Disconnect between State Standards and Course Content
The analysis of course descriptions revealed far less attention to data use and data-informed decision-making than the expectations outlined in Virginia’s principal evaluation standards would suggest. Significant gaps between course content and professional standards for school leaders were apparent. For example, 59% of the descriptions did not reference data, data use, or data-informed decision-making. Of the 77 course descriptions with references to data, 10% mentioned data-informed decision-making, 10% referenced assessment and/or the use of assessment data, 8% described research and/or research data, and 6% cited evaluation. Most of the courses that referenced data included data use content and skills in relation to other course content, such as the evaluation of instructional programs, organizational change, and school improvement. This is illustrated in the following description of a course focused on curriculum, instruction, and assessment: “The assessment of student learning, including student learning data analyses, will be the third area of focus for this course” (PU1/CD6).
Of the 197 courses, there were only two standalone courses dedicated to data use and data-informed decision-making. These courses emphasized different approaches to and roles for data use. The first connected data use to school improvement and student learning as follows:
Collaboratively lead, design, implement, and evaluate school improvement and classroom culturally responsive practices that increase student learning and well-being. Includes information sources and processing, data literacy (use, analysis, and decision making), equitable resource planning, and program evaluation to inform and lead change. (PR4/CD7).
In comparison, the other course emphasized the technical and process aspects of data use within a research or action research framework to identify needs and monitor learning as follows:
A survey of tools and techniques used in conducting and utilizing assessment data. Includes current research approaches, project design, and data collection. Also included are methods for using data to identify school needs, evaluate personnel, track student performance, and develop strategies for increasing performance as necessary. (PR7/CD2).
These descriptions of courses focused exclusively on data use reflected the wide variation in how programs considered and approached data use and the role of data in decision-making by school leaders. Based on our review, it was clear that a shared understanding of data use preparation, especially in ways that aligned with state professional expectations, was not evident. Courses that referenced data use, in combination with other technical topics such as research methods, assessment, and evaluation, tended to focus on the procedural aspects of data use and data analysis. In these cases, data use was often disconnected from leadership roles and responsibilities. Few courses connected data and/or data use with a specific area of leadership; school improvement (4%) and instructional leadership (3%) were referenced most frequently. However, the vast majority of course descriptions that referenced data-specific content did not mention a leadership domain. Data and data use were most frequently connected to ideas of continuous improvement.
5.2. Coursework Provides Few Opportunities for PSLs to Learn Data Use within a Standards Framework
Nearly all 23 syllabi from the six LPPs referenced one of the NELP standards that included data use, but only seven of the syllabi referenced data in their learning objectives, and of those seven, only four mentioned data use in their course overviews. Few syllabi provided information about weekly course content coverage. Only PU8/SYL8 focused on data use as a standalone topic (rather than embedding it as part of a broader initiative such as instructional leadership). Not a single syllabus made clear what counts as data proficiency. More of the syllabi, however, identified various data use assignments requiring differing levels of detail. For example, PR2/SYL1 required pre-service leaders to analyze multiple types of data to list three to five goals for school improvement, and PU2/SYL5 prioritized using student achievement data to make recommendations about resources.
The interview data suggested a number of possible reasons for so few explicit references to data use in course materials. While considering course offerings, one program coordinator admitted the following: “Data is one of those things that when [pre-service leaders] come to your classes and you tell them to do these projects, you just assume they know how to do it. And you asked a really good question. ‘Do we intentionally introduce these types of data?’ And I don’t know that we do” (PU2/Program Coordinator). Instead, pre-service leaders seemed to be tasked regularly with rational and situational decision-making that required them to consider data as they believed was warranted. In other cases, pre-service leaders were instructed to take “deep dives” into district and school data “to look at the demographics” (PR8/Program Coordinator). An imperfect balance seemed to be at play where, on the one hand, “throughout our courses, the expectation is for all of our instructors to integrate data literacy one way or another” (PR5/Program Coordinator). Yet, on the other hand, “I use data not in an intentional way such as, ‘In this unit, we’re going to talk about demographic data’, or ‘We’re going to talk about student academic achievement data’” (PU2/Program Coordinator, emphasis added).
5.3. Coursework Contained Few Clear References to Data Use for Instructional Change
When the course descriptions and syllabi referenced data use, a focus on instructional change was seldom evident. As highlighted above, course descriptions that included references to data use almost always did so generically (e.g., “data analysis”). Even when they provided more detail—“designing projects to enhance school culture through the application of assessment and research data” (PR8/CD4)—instructional change was not usually prioritized.
Of the four syllabi course overviews that referenced data, only one clearly stated using data for instructional purposes (PU2/SYL4). Similarly, of the seven syllabi that specified data in their learning objectives, only three referenced data use for instructional purposes (PU2/SYL4, PU2/SYL5, and PU7/SYL8). Few syllabi provided information about weekly course content coverage. Two syllabi referenced data use for instructional purposes in their list of weekly content coverage (PU2/SYL7 and PU3/SYL8). Only PU3/SYL8 included a reading assignment—
Data Wise [
2]—focused on data use and instruction. Three syllabi (PU2/SYL4, PU2/SYL8, and PU8/SYL8) identified data for instructional improvement within class content. Two syllabi tasked pre-service leaders with assignments that required them to use data to inform instructional decisions. Specifically, PU8/SYL8 required pre-service leaders to log the use of school data to improve a specific Virginia learning standard (i.e., SOL) instructional issue from state standardized testing, and PU3/SYL8 required the creation of an educational intervention plan based on various data.
Programmatic inclusion of data use, including the idea of supporting teachers in using data for their instructional planning, seemed infrequently connected to instructional change. For example, one program administrator noted that pre-service leaders “will focus a little bit more on data pertaining to student behavior and…family engagement” because “this class connects to the school improvement plan developed in another course and illustrates some scaffolding or intention to develop a comprehensive understanding of varied roles and uses of data” (PR8/Program Coordinator). Even efforts to focus on instructional support in response to data typically failed to make connections between data and instructional responses explicit:
In the [supervision and evaluation course], another assignment that they have, is to write a plan of improvement with a teacher. And oftentimes, it depends on the scenario that they have, but it may be lack of test score progress, showing progress. And so, looking at that data, their assignment, or their task, is to say what type of instructional support might you give [the teacher] based on the data if we see that reading scores are low in a particular area, comprehension, what type of resources, or what type of professional development might you provide this teacher, or recommend…? (PR5/Program Coordinator).
Moreover, course descriptions from two programs indicated a focus on using data for research and analysis purposes, but not for school or instructional improvement. For example, one program provided the following course description for a research and assessment course: “Overview of the nature of research on human development, learning, and pedagogical knowledge and skills. Topics include current trends and issues in education, skills in data collection and assessment, and application of research in the school setting” (PR2/CD8).
Although the evidence suggested an overall limited focus on data use for instructional change, some program coordinators highlighted courses that seemed to be focused on building pre-service leader capacity to use data to improve instruction. For example, the PU3 Program Coordinator stated the following after highlighting two courses, one focused on educational change and improvement and the other enhancing and supporting instruction: “[pre-service leaders] obviously are doing some deep data dives in those classes, and they’re thinking about how those enhance instruction and decisions that they need to make as a leader, and really looking at the disaggregated and the aggregated, and thinking about how those decisions should be made differently”. Similarly, the PR2 Program Coordinator focused on the importance of data-inquiry cycles anchored to a data-based goal where pre-service leaders engaged in a process as follows:
They use a data chart to scaffold them through the process…Based on your case, what data sources are most relevant? Now go explore those. And after you’ve explored those, then are there any other data sources that now, maybe you didn’t find what you were expecting to find in those data sources?…They’re required to develop one to three goals …that come out of the data. So, yes, we scaffold that. It’s really a scaffolding process.
We emphasize, however, that few programs communicated an approach involving inquiry cycles when teaching pre-service leaders how to use data.
5.4. Coursework Prioritized Leadership Moves in Data Contexts over Building Data Knowledge and Use in Alignment with Standards
LPPs were frequently designed to teach pre-service leaders how to make leadership moves instead of first demonstrating an understanding of various data types and how to use them effectively. One relatively common way programs accomplished this was by having pre-service leaders leverage various data types to inform improvement planning processes. For example, in one program,
[Pre-service leaders] have to go ahead and audit their comprehensive school improvement plan [from the internship site]. They have to pick a content area that they’re really going to focus on—we ask literacy or math for that particular purpose in that course. Then we also have [pre-service leaders] conduct—basically use—a variety of qualitative and quantitative data to kind of figure out where we need to go with the school improvement plan—they have to conduct observations, classroom observations of those contact areas, they have to collect qualitative data from PLC meetings they have to look at student assessment data—and then they have to evaluate and then if they need to tweak and adjust their plan they have to do that, and then they have to present that to their mentor or their building administrator essentially. (PR8/Program Coordinator)
This example suggests that pre-service leaders were seldom asked to demonstrate an understanding of different data types or select and justify the data used to make leadership decisions. Further, data use assignments and responsibilities for pre-service leaders were often limited to organizational change, not instructional improvement.
Interestingly, in that program, the focus on data to inform improvement planning came from a course focused on instructional leadership and student achievement. No evidence was provided to suggest if or how pre-service leaders were using data to inform instruction in that course. Similarly, in another program, pre-service leaders “have to develop a plan as to how [they] would help this school improve. And they present that as though they are presenting it to a group of teachers, or a group of community members, parents…where we’re able to see whether they can interpret data and explain data to the layman” (PR5/Program Coordinator).
As an extension of this approach to leadership and data use, programs regularly appeared to be designed based on an assumption that pre-service leaders (and current ones, discussed more in the next section) already understand data and data use processes (i.e., retrieval, analysis, interpretation). In the following example, pre-service leaders were responsible for bringing various data types to bear to inform teacher instructional practice, but there was no evidence that they identified the right or best data for the situation, analyzed those data accurately, or responded to teachers with suggestions for appropriate instructional approaches based on those data.
We’re teaching them how to conduct observations of teachers where they do the post-conference that is based off of that data. And sometimes if there is an area of need in a department or on a grade level through a PLC (professional learning community) they’re looking at their scores, benchmark assessments, whatever it might be through that data analysis and specifically writing goals and strategies for improvement. So, that would be modeled through that observation component, but it would certainly be modeled through grade-level meetings, department meetings and continuous improvement planning. And it really and all this that’s what it goes towards is developing that school’s continuous improvement plan, professional development needs, and then ultimately increasing student achievement is ultimately the goal. (PU5/Program Coordinator)
This quotation actually suggests that much of the responsibility to ensure pre-service leaders know how to use data for instructional purposes is placed on their internships or field-based experiences.
5.5. LPPs Rely on Internships for Pre-Service Leaders to Develop Data Knowledge and Skills Instead of Directly Overseeing PSLs’ Ability to Execute Data Practices According to Standards
Many LPPs appeared to rely on their pre-service leaders’ internships with mentor school leaders as the primary way to grow pre-service leader data knowledge and skills. The following syllabi (from one program) included site-based assignments to advance data use for instructional change: PU2/SYL7 included field-based learning applications to “build coherence of practice by analyzing your school data”, while PU2/SYL8 required pre-service leaders to shadow mentors and write reflections on data-use observations. Program coordinators across the LPPs discussed the flexibility of the internship. The following coordinator spoke at length about how important the internship was for developing pre-service leader’s data knowledge:
One of those standards emphasizes the use of data to drive instructional practices as well as organizational issues, etc. So, they really can’t get out of the internship without looking at data and being involved in data use. But the internship is flexible enough that if a [pre-service leader’s] readiness does not afford them the chance to lead through data, then they can observe or participate in data use, and there’s no penalty for that…But I would say all [pre-service leaders] in some way engage with data. (PU2/Program Coordinator)
The internships had numerous, varied requirements. One coordinator pointed out that over the course of a 16-month internship, pre-service leaders were responsible for meeting “35 internship objectives, 350 h in the field”, so the internship was deemed “essential” (PU5/Program Coordinator). Not only were the internships demanding in terms of content coverage and time, but they were also typically conducted one-to-one (e.g., a pre-service leader working with and learning from a school principal). Another program coordinator articulated some of the concerns with this approach: “We cover…the reliability and validity aspect of that and bias as well…‘All right, well, what’s good data?’ When there’s so much data, how do you figure out which data is the best data?” (PR8/Program Coordinator). Moreover, no course descriptions of internships or interview data clearly stated what aspects of data knowledge and use were to be learned through internships, or whether internship mentors themselves met any standards of data knowledge and capacity.
Decisions related to the depth of data use and type of engagement depended “on the level of comfort of their school leader and whether those doors are open for them, and whether they’re ready for it to be honest with you…because they have to have a discussion with their principal to approve these projects” (PU2/Program Coordinator), and access depended on the school context and “what [data] the mentor is willing to give them access to” (PU3/Program Coordinator). Thus, “if I’m your principal and I don’t think you’re ready to take the reins on a data team, then I’m [going to] put you on the data team to observe it. Or you can participate, but I’m not letting you lead it. And so, that could happen” (PU2/Program Coordinator). As a result, pre-service leaders’ opportunities to engage with data during internships depended heavily on the mentor and the school context.
Like all research studies, our findings and conclusions are limited by several factors. First, we relied on publicly available information in several instances. We used course descriptions found in institutional academic records and bulletins to explore the ways in which data use is addressed in LPP coursework. These descriptions, by design, are brief (an average of 56 words) and highlight the main topics of the course. The descriptions do not provide in-depth information about how data use is addressed in a course, and data may be a topic of a course yet not be mentioned in its description. Additionally, we assumed that the implementation of each course aligned with the public course description (and syllabus), which may or may not be accurate. Second, relatively few syllabi were made publicly available or shared with us. Third, we reviewed the individual web pages of program faculty to examine the extent to which they had expertise and interests in data use; the content of these web pages may not have been current. Fourth, a relatively small number of program coordinators participated in the interviews (37%). We note that several responded to invitations indicating that they lacked the time needed for participation, whereas others expressed some hesitation until the study purposes were clarified to confirm that we were not conducting individual program evaluations. We suspect that similar reasons applied to non-respondents. Lastly, this study was limited to one state, which constrains the generalizability of our claims despite some evidence that preparation programs across states are likely similar in scope and design [
17].
6. Discussion
In this study, we investigated pre-service leaders’ opportunities to lead and engage in data use in schools in ways that are responsive to Virginia state leadership standards. We found few clear references to data use in course descriptions or syllabi, and even fewer of those references specified data use for instructional change. Instead, some assumptions seemed to be embedded within LPPs, including that pre-service leaders should already know and understand data processes (i.e., data types, retrieval, analysis, interpretation, etc.) and that gaps in knowledge would be filled in through ongoing interactions with data across classes and through internships. Indeed, program coordinators suggested that LPP content is more about learning to lead than ensuring an understanding of what is being led (that is, leadership moves are prioritized over content knowledge, which in this case was an understanding of data and data processes).
Research suggests that principals often take responsibility for scheduling data meetings and creating conditions for data conversations to occur [
46,
47], but there is relatively little evidence that principals are actively engaged in facilitating meaningful data conversations. Yet, data use programming regularly places principals figuratively—and literally—at the head of the table [
2,
3]. However, in some cases, principal engagement in data team meetings actually seems to impede data conversations [
11]. There is little evidence that principals fully understand the nuance and complexity of different types of data or are able to match those data to appropriate organizational and/or instructional improvement goals. Thus, if LPPs are responsible for developing school leaders who can meet the many, various ways data are included in federal and state standards, a first order of business is to become more intentional in closing the data knowledge gap that many pre-service (and in-service) school leaders appear to have.
LPPs likely also need to consider more deeply what they want pre-service leaders to gain from internships. Given the research noted above, there is little guarantee that mentors have deep knowledge of data and how to use data effectively to make instructional decisions, or that internship experiences are crafted to include data use opportunities. If mentors received preparation similar to that provided by the pre-service programs central to this study, it cannot be assumed that current mentors have the competencies needed to enact effective data use for instructional improvement. Seemingly every program in the state, however, relies on mentors to help bridge data course content—however limited it is—and practice. This is not to diminish the practical elements of “on-the-job” learning opportunities, which can be invaluable [
76]. Yet, to specifically help pre-service leaders better understand data types, data processes, and/or how to lead others to use data, we find little evidence to suggest that clear expectations or structures are in place for mentors to orchestrate deeper learning of data use processes. The roles of mentors and internship placements are worthy of further exploration. At a minimum, more clearly defined structures and expectations for both pre-service leaders and their mentors seem warranted [
14].
When the findings of this study are considered jointly with research on principals’ somewhat limited knowledge of data and facilitation of data use, especially for instructional purposes [
11], there are a number of implications for programming. Among them, LPP faculty and district leaders could more regularly collaborate to design programs—or at least components of programs (e.g., internship)—perhaps in university–district partnerships to be more responsive to contextualized needs [
77,
78]. In international settings, further consideration should be given to how ministries of education, especially in smaller and/or more centralized countries, might better engage with LPPs [
79]. The extent to which districts have data leadership capacity and services to build strong data leaders likely varies considerably, but LPPs currently appear to offer little flexibility other than mentor/internship designations, which, as noted above, seem mostly unregulated. LPPs could instead embed data use opportunities in leadership courses to better align with the expectations for data use across multiple leadership domains. In addition, more intentional development and design of induction programs [
80] at the state or local levels within the U.S. or internationally [
3] could be a way to ensure pre-service leaders have robust opportunities to learn more about data use knowledge and facilitation as they transition into leadership positions. Similarly, state departments of education could be more intentional in establishing professional learning opportunities for principals, especially novice ones, to build skills [
81]. In that vein, state departments could also work with LPP faculty to better establish expectations for pre-service leader mentors.
Despite leadership standards and LPP accreditation standards, there appears to be an overall lack of direction about the collective vision for school leader data use. Perhaps assumptions about educator preparedness more broadly contribute. In most cases, pre-service leaders have already been teachers who presumably learned about data for instruction. Yet, a complementary study we conducted similarly shows inconsistent expectations for teacher data knowledge and use (Author, in press). Mentors currently lead buildings and are responsible for data team meetings, so, again, we might be inclined to assume an understanding of data and capacity to execute responses, but in-service leadership development programming [
2,
3] and research on principals leading data teams [
11,
82] suggest otherwise. Even within LPPs, there is a disconnect about whether and how to prioritize data: are pre-service leaders being developed for organizational management (e.g., personnel, master scheduling, etc.), instructional leadership (e.g., leading teachers and others to better understand students and how to respond to their learning needs), or both? More explicit goal-setting about the purposes of data use within programs seems necessary to clearly design courses, devise expectations and tasks, and prepare mentors to prioritize pre-service leader data use learning experiences.
There are some related considerations about this study worth noting. The research literature—including this study—and U.S. policy and accreditation standards feature the principal as a central figure, if not the primary one, in leading data work in schools. Given the many responsibilities already assigned to school leaders across the world, it is reasonable to consider the extent to which they should be a central figure in this work, or even be responsible for ensuring that the work is being conducted with reasonable levels of competence. The ability to lead data use might also depend on pre-service leaders’ background knowledge, interests, and attitudes regarding data, as research has suggested that these factors influence actual data use [
42]. As such, researchers might design future studies that clearly define what effective data use looks like in schools and then determine what pathways exist that reflect successful data practices, regardless of how those practices are led and by whom. International comparative studies in this space would also be helpful for understanding how policy context and local interpretation vary, as well as how those differences influence what principals are responsible for and how they facilitate data use to advance organizational goals in schools. Such studies would also be instructive for LPPs (and those who develop the policies and standards that drive them) as they aim to make strategic program design decisions for better preparing pre-service leaders to ensure good data use practices in schools, even if others are charged with championing the work.