Article

School Leader Preparation in the U.S. State of Virginia: Exploring the Relationship between Data Use in Standards and Program Delivery

by Coby V. Meyers 1,*, Lisa Abrams 2, Tonya R. Moon 3 and Michelle Hock 3

1 Education Leadership, Foundations & Policy, School of Education and Human Development, University of Virginia, Charlottesville, VA 22903, USA
2 Department of Foundations of Education, School of Education, Virginia Commonwealth University, Richmond, VA 23284, USA
3 Curriculum, Instruction & Special Education, School of Education and Human Development, University of Virginia, Charlottesville, VA 22903, USA
* Author to whom correspondence should be addressed.
Educ. Sci. 2024, 14(10), 1081; https://doi.org/10.3390/educsci14101081
Submission received: 6 June 2024 / Revised: 23 September 2024 / Accepted: 27 September 2024 / Published: 3 October 2024
(This article belongs to the Special Issue Strengthening Educational Leadership Preparation and Development)

Abstract:
National school leadership standards are now de facto curriculum for preparation programs. Data use is embedded throughout the standards to guide school improvement and classroom instruction. Yet, across a number of areas, pre-service principals do not appear ready to lead once in the field. Principals are responsible for using various data to guide internal policies, school cultures, and capacity building, largely supporting teachers by establishing norms, expectations, and clear visions for data use in instructional decisions. In this study, we examined leadership preparation programs in one U.S. state to understand how data use is addressed in leader preparation. Our analysis of course descriptions, syllabi, and program director interview data resulted in the following findings: (1) programs and courses seldom explicitly acknowledged data use as a topic; (2) when data use was acknowledged as a topic, it was infrequently tied to standards; (3) connections between data use and instructional change were limited; and (4) most programs relied on internships for pre-service leaders to learn data use practices. There are opportunities for programs to make connections between standards, data use, and instructional improvement more explicit, as well as to clarify expectations for and increase oversight of field-based mentors.

1. Introduction

Data use in education has been defined as “drawing on and interacting with information in the course of decision making” [1] (p. 99). On its face, the purpose of data use seems relatively simple. For school leaders, the basic idea is that they should have access to various types of data that they can analyze to help determine the best path forward for their schools. A number of scholars—in the U.S. and internationally—have designed programs that provide a straightforward process for using data to inform educational decisions [2,3], yet the spectrum of policies, standards, and regulations focused directly or indirectly on informing and guiding data use is vast [4,5]. The simplicity of data use processes in theory seems to belie their complexity in practice.
Despite relatively clear policy expectations in the Every Student Succeeds Act (ESSA) to use data to improve U.S. schools [6], the process of preparing leaders for executing this work is complex. National principal leadership standards, such as those of the National Policy Board for Educational Administration (NPBEA), offer some high-level direction without explicit instruction [7]. State leadership standards, like those advanced by the Virginia Department of Education (VDOE), can be more specific but still include an array of responsibilities that can have competing demands (e.g., instructional leadership, school climate, and organizational management) [8]. Additionally, leadership preparation programs (LPPs) have their own set of credentialing standards to meet, such as those set forth by the Council for the Accreditation of Educator Preparation (CAEP) [9]. Perhaps it should not be surprising that approaches to integrating data use in LPPs appear to be inconsistent [10], or that research suggests that principals are frequently underprepared to lead good data use practices effectively in schools [11]. Given the exploratory nature of this study and the lack of consistent data use terminology and framing across national and state standards, policy, and research, we loosely define data use as the systematic collection, analysis, and interpretation of data for the purpose of responding strategically in ways that improve education and educational outcomes. This conception of data use is closely associated with the definition provided by Kippers et al. as “educators’ ability to implement DBDM [data-based decision making] … defined as educators’ ability to set a purpose, collect, analyze, and interpret data and take instructional action” [12] (p. 21). Our review of the Virginia state performance standards for school leaders later in this article enumerates the many—and varying—ways data and their use can be conceptualized for school leaders and their preparation programs.
Although there is a substantial and growing body of research on educator data use, there is considerably less research focused on school leaders [13] and essentially none on how LPPs prepare leaders to understand different types of data or retrieve, analyze, interpret, and act on data in the U.S. [14] or elsewhere [15,16]. Despite the many responsibilities that school leaders hold, U.S. policies and accreditation standards center the school principal as the person primarily responsible for ensuring that data use efforts result in strategic organizational and instructional improvement. Determining whether school leaders should facilitate data team meetings, observe and support teacher data use practices, or engage in myriad other potential ways to improve schools was beyond the scope of this research. In this landscape study of one state, we relied on Virginia state leadership standards as our conceptual framing, and LPPs’ interpretations of and responses to those standards as the drivers of what counts with regard to data use. Moreover, this study is a direct response to Mandinach and Gummer’s nearly decade-old call for more intentional study of data use in educator preparation programs, as well as this special issue’s focus on strengthening leadership preparation and development [17]. To do so, we conducted a mixed methods study inclusive of all 19 LPPs in the state of Virginia to answer the following research question:
How are LPPs designed to prepare pre-service leaders to lead and engage in data use in schools in ways that are responsive to Virginia state leadership standards?
This study offers an important contribution to understanding if and how future school leaders are being prepared to lead data use in schools. Given substantial national [18,19] and international [20,21] investment in data use for informing school and instructional improvement decisions, it is critically important that we expand upon our current limited understanding of the challenges LPPs face in preparing pre-service leaders to lead data use [13,14]. In the remainder of this article, we first provide a review of related strands of the literature on data use and leadership preparation, including a consideration of standards and competencies. Next, we report on the methods, including an overview of the Virginia LPP context. We then present our findings before concluding with a discussion of implications.

2. Literature Review

In this brief literature review, we begin with an overview of data use expectations in policy, program standards, and accreditation requirements that establish the overarching expectation that principals in the U.S. and elsewhere in the world are responsible for leading data use practices in schools. We then report on what is known about principals’ efforts to lead data use in schools and demonstrate that those expectations are not always realized. The literature suggests that, at least in part, data use expectations are often not met in schools because principals have been inadequately prepared to meet them. Thus, in the final section of the literature review, we discuss what is known about the preparation of principals to lead data use, including LPPs specifically.

2.1. Data Use Expectations in U.S. Policies, Program Standards, and Accreditation Requirements

Initiatives around the world have prioritized the development of standards to make good school leadership practice (and the support of such practice) explicit [22]. Schildkamp emphasizes that in the Netherlands—and elsewhere—data policies and standards are reflective of local contexts and particular times and are thus never value-neutral [16]. For example, in Ireland, the notion of schools as data-free zones changed rapidly when the Department of Education developed guidelines inclusive of data use for post-primary schools [23]. Despite the local, contextual nature of data and other policies, some international overlap in the content of data use policies and standards seems evident. Australian standards, for example, identify the use of data as critically important for leading the management of the school, including school organizational performance (i.e., operations) and support of student learning outcomes [24]. In English schools, data are regularly used to support teaching and learning, as well as school improvement [4]. Studies inclusive of school leaders in relation to national policy and standards contexts are available in Belgium [25], Norway [26], and Trinidad and Tobago [27], among others.
Since the turn of the century, there has been a coalescence of accountability standards in the United States tied to student achievement results and increased access to various data. Specifically, federal [6,28] and state [8] policies have ushered in explicit expectations that data use be leveraged as a formal strategy to support instruction and school improvement [29]. Schools and their principals are now held accountable for student scores on achievement tests [30], and data systems are increasingly responsive to the heightened scrutiny of student test scores [29]. Various data types—summative, benchmark, interim, or predictive—are designed to signal how well students learn content, especially English language arts and mathematics [31]. Other types of data—including formative assessments intended to measure immediate understanding [32], discipline and attendance data [33], and social–emotional learning data [34]—measure different aspects of the student experience. Collectively, the data types and sources can be numerous and potentially overwhelming, but to varying extents, they are all regularly considered to be tools for increasing student achievement.
Relatedly, accreditation standards for LPPs also underscore the importance of using data to guide school improvement and instructional change. CAEP, for example, provides LPPs with the following guiding question for clinical educators in internships or similar settings [9]: “How does the EPP [educator preparation provider] engage partners in the data informed decision-making for clinical educators?” (p. 20). Pre-service leaders should, in theory, receive clinical experiences in which they use “the impact data to guide instructional decision-making” (p. 24) and modify “instruction based on impact data” (p. 24). LPPs are also responsible for documenting evidence of pre-service leaders being prepared to understand and use “longitudinal data” (p. 27), “state-level data of student performance” (p. 35), and “state-level data of teacher performance” (p. 35). The CAEP standards also include expectations for understanding data quality and assessing stakeholder involvement, as well as measuring content pedagogical knowledge and skills. In sum, expectations of LPPs to provide (and measure) data use learning opportunities for pre-service leaders are substantial and varied.

2.2. Principals Leading Data Use

Many aspects of leading data use appear to span international borders. School leaders need access to quality data [16] and the analytic skills to interpret [35] and respond to those data in strategic ways [23,36,37]. Good data use approaches are regularly conceptualized as systematic, strategic, and inclusive of more than only student achievement [38]. In Germany, for example, there is evidence that leaders believe the data they receive are of limited quality and usefulness and instead prefer student feedback and other internal sources of information [39,40]. This is especially noteworthy given the positive relationship between German principals’ levels of data use and their teachers’ levels of data use [41].
In the context of how principals approach data and lead data use practices specifically, German researchers determined that principals’ evidence-oriented attitudes and epistemological beliefs influenced how they used data [42]. A study of principals in Norway revealed that many school leaders emphasized reaching consensus with teachers about how to respond to data, sometimes to the detriment of addressing problems or developing solutions [43]. Some principals in Kuwait similarly reported the need to establish cultures of data-driven decision-making in schools but without the level of teacher inclusion illustrated in the preceding Norway example [44]. In short, education systems worldwide increasingly underscore the importance of developing high-quality principals through preparation programs, but contexts influence content and preparation strategies.
In the U.S., whether the considerable emphasis on policies, standards, and accreditation translates to principal practice remains somewhat unclear [45]. Relatively little research focuses specifically on principals’ preparation for and ability to identify and retrieve the most relevant data for any given topic, much less analyze, interpret, and respond to those data. Instead, some of the most involved studies of principals and data use focus on what principals do to establish processes and routines in schools [2,46] and organize teachers and their interactions within those processes and routines [47,48], but not necessarily on how principals facilitate or effectively lead data use. Principals are instead regularly expected to create the conditions for teachers to engage fully with, learn from, and respond to student data [49]. When research does account for the principal as someone with a central role in analyzing data and responding as an instructional leader, the focus of studies still often veers toward managerial responsibilities such as establishing time and space for data team meetings [48], as well as resources and tools for teachers and others to analyze and respond to data [47,49].
Instructional teams and professional learning communities that prioritize data are typically among the data structures principals establish and lead [11,50], generally with the principal acting as the facilitator [2]. Data teams can go by a number of names, but for us, they refer to intentional, regular meetings of educators—some combination of principals, teachers, specialists, counselors, and others—who review data to develop and carry out a response. Although principals have a regular role in data teams [51], little research exists on how principals lead and facilitate them.

2.3. Data Use Expectations: Not Always Realized

Expectations for continuous improvement and evidence-informed policy and practice to support instructional effectiveness and learning have increased in many nations across the world [52,53,54]. In response, school leaders are responsible for routinely using data to inform decisions about a range of responsibilities, including internal policies, school culture, capacity building, and instructional improvement [49]. However, studies show that school leaders do not always use data well, if at all. For example, Aravena found that school leaders in Chile regularly made decisions without the use of data [55]. In some contexts, even when school leaders proactively use data, they struggle to develop purposeful goals that are measurable [12].
Although U.S. principals are frequently responsible for using data to lead learning in schools [56], the steps for classroom-level inquiry, learning, and action are almost exclusively reserved for teachers [2]. Given this disconnect, many U.S. principals seem unequipped to effectively navigate the various data processes that would help teachers translate what has been learned into instructionally responsive practice [52].
Still, expectations for principals to lead and conduct data use processes persist, as does their engagement in roles and responsibilities that support those processes. Studies of teacher data teams and their use of data frequently implicate principals as either not knowing how to help translate teacher data work into instructional improvement or not feeling responsible for it [57]. Too often, principals in data teams leave the weight of inquiry, learning, and action to teachers [11]. Research suggests that principals generally seem well-enough prepared in managerial, hard, and soft skills to lead teacher meetings. It is less evident, however, that they are strong enough in data skills: understanding different types of data and their purposes, tying the appropriate data to the questions or problems to be solved, and interpreting those data to plan for changes in policy or practice. Instead, there is some evidence that principals in U.S. contexts sometimes leverage data to advance deficit thinking about students and communities [11,58].

2.4. The Preparation of Principals to Lead Data Use in U.S. Schools

Intentional, purposeful training of school leaders to be able to analyze and interpret various data, effectively model data use processes for teachers, and provide high-quality coaching and feedback (i.e., leverage data to be complete, competent instructional leaders) appears to be a critical missing element to achieve transformative data-driven instruction at scale [51,52]. Data use expectations of school leaders are now established in a number of U.S. policies and accreditation standards [59]. The Professional Standards for Educational Leaders (PSEL) (i.e., what we might expect school leaders to be able to do) have been matched to those found in the National Educational Leadership Preparation (NELP) Recognition Standards (i.e., what programs should do to prepare those school leaders), but such matching processes can be disjointed [59]. Moreover, when state-level interpretations such as Virginia’s licensure regulations for school administrators are added to the mix, a clear purpose for and measurement of whether LPPs adequately prepare school leaders to use data effectively can become muddied.
Clear, coherent LPPs for data use and other leadership topics are critical to long-term principal success. In their comprehensive and systematic review of research on leadership preparation programs, Darling-Hammond and colleagues found that LPPs are associated with positive principal, teacher, and student outcomes [60]. Further, although access to learning opportunities for future school leaders has increased in the U.S., the quality of LPPs varies across contexts [60]. The need for high-quality and contextually relevant LPPs is not limited to the U.S.; it is an issue in leadership development globally [5]. The need to “critically examine existing views on the challenge of establishing more context-sensitive school leadership preparation programmes in an era of new public management in education” (p. 5) is immense and with far-reaching implications [61].
In the U.S., research seems to implicate LPPs, at least in part, for the lack of principal preparation to lead data use practices despite their centrality in preparation standards. Young and colleagues noted that national leadership standards have become the “de facto ‘recommended curriculum’ for preparation programs” [62] (p. 228). Data use is embedded throughout the standards to ensure that pre-service leaders are prepared to guide school improvement and classroom instruction upon entry into the profession. Yet, fewer than 30 percent of LPPs focused on data use and the preparation of leaders to make data-informed decisions or evidence-based policy or practice [63,64]. Leadership preparation programs must cover so much material across various topics that data use coverage appears to be limited and, unsurprisingly, pre-service principals do not appear ready to lead once in the field [13].
As a singular topic, data use requires considerable instructional time and focus to ensure pre-service school leaders are adequately prepared [65]. There are a number of considerations within LPPs that require coherence, including policies and standards that can constrain [66] or expand [62] programmatic focus, foundational content knowledge [11], real-world learning opportunities [14], and powerful learning opportunities [67] that are active and interactive [68]. There are also complexities in how leaders perceive data and their use [69], especially within school systems with varying priorities [70], which regularly remain unresolved in leadership preparation programs. Thus, despite increased policy attention, access to data, and expectations for data use, a gap in school leader preparation and the expectations to lead or facilitate data use policies and practices in schools persists. Many of these complexities of data use can be understood through performance standards, which is the focus of our next section and conceptual framework.

3. Conceptual Framework: Virginia Performance Standards on Data

Like the majority of states in the U.S., Virginia adopted the Guidelines for Uniform Performance Standards and Evaluation Criteria for Principals as the state’s leadership standards [8]. The standards are closely aligned with national professional standards (e.g., NELP [National Educational Leadership Preparation Standards]) and detail leadership competencies in the following eight areas of responsibility: (1) instructional leadership, (2) school climate, (3) human resources leadership, (4) organizational management, (5) communication and community relations, (6) culturally responsive and equitable school leadership, (7) professionalism, and (8) student academic progress. Expectations for data use are embedded throughout the standards, which outline the data use responsibilities and requirements of school leaders in almost all areas of leadership (see Table 1). LPPs are required to align coursework, internships, and experiential field-based activities with the state performance standards. As such, the Virginia standards provide a useful lens to explore the extent to which leadership preparation programs are designed to address expectations for data use and prepare pre-service leaders to lead and engage in data use.
In the state of Virginia, there are four pathways for someone seeking an endorsement to serve as a building principal or assistant principal (see Supplemental Table S1 for details of the four paths). Paths to endorsement typically require graduate-level degrees and practical experiences in schools, as well as satisfying other Virginia licensure requirements. Among them, per the Virginia Administrative Code regulations for administration and supervision in PreK-12 schools (8 Va. Admin. Code 20-23-620), LPPs must include an internship of at least 320 clock hours, with 120 of those hours spent in a field-based experiential placement.

4. Methods

This study examined leadership preparation programs in Virginia to understand how they are designed to prepare pre-service leaders to lead and engage in data use. Program design includes courses and content, delivery approach, and experiential components (e.g., internship). We examined how program coursework addressed data use and prepared students to lead data use in schools. We employed a case study methodology and collected multiple forms of qualitative data to gather information about LPPs [71]. We chose a case study approach because the methodology allows for the exploration of a complex phenomenon in its real-life setting [72].

4.1. Study Sample of Institutions

There are 37 public or private four-year colleges and universities across Virginia. Of these, 20 offer leader preparation programs and 19 currently enroll pre-service leaders (see Table 2). In the cases where institutions offered programs with multiple degree and/or certificate options, a core set of courses was common across the programs. As a result, our study focused on school leadership programs (n = 18) that led to an administration and supervision endorsement and a graduate degree (e.g., M.Ed.). These programs offered a graduate degree in leadership and supervision and included more coursework compared with post-graduate certification programs. However, one institution offered a certificate program only; we included this program in the study as it contributed to a full understanding of the coursework offered at all institutions with leadership preparation programs.
Ten public and nine private institutions describe their leadership preparation programs as offering graduate certificates or master’s degrees in educational leadership (n = 13), administration and supervision (n = 5), or educational administration (n = 1). All institutions offer master’s degrees requiring between 30 and 36 credit hours, while nine institutions also offer graduate certificates requiring between 18 and 24 credit hours. Pre-service leaders in certificate programs previously earned a master’s degree and enrolled in the program for the purpose of obtaining an administrative license. Programs include a mixture of delivery formats: nine are fully online, four fully in-person, two hybrid (part online, part in-person), two both fully online and fully in-person, one both hybrid and fully in-person, and one offered in hybrid, in-person, or online formats. In sum, there are a variety of course-taking options among LPPs in the state.

4.2. Data Sources

We drew on three data sources to explore how programs were designed to include data use content and experiences in school leadership programs: (1) publicly available program course descriptions from each institution’s academic catalog, (2) interviews with program coordinators or administrators, and (3) course syllabi. The primary data source was course descriptions, which were supplemented by interviews and sample course syllabi. To mask participants’ identities, we reference data sources in the findings section as PU#/Data Source (public institution plus randomly assigned number, followed by data source [CD = course description, SYL = syllabus, or interviewee title]) or PR#/Data Source (private institution plus randomly assigned number, followed by data source).

4.2.1. Course Descriptions

We obtained 197 course descriptions from the 214 publicly available course titles (descriptions for 17 courses were not available in the public academic catalog). Course descriptions were found on each institution’s website and were brief summary statements about the main focus and content of the course. It is worth noting here that we did not have a record of whether course descriptions were generated by the professor of record or someone else. Course descriptions averaged 56 words and ranged between 13 and 179 words in length. Guidelines for one institution, for example, stipulated a 400-character limit for course descriptions inclusive of text, punctuation, spaces, etc. In general, descriptions included 2–4 sentences about the primary focus of the course and the knowledge and skills pre-service leaders would acquire. The following is a typical course description:
This course provides educators with tools to initiate and sustain continuous improvement to promote all student’s academic success and well-being. Drawing on improvement science, the course provides frameworks and protocols for understanding and leading systemic change in schools and school systems. Activities include authentic application of approaches used to support high-quality teaching and leading in P-12 school systems across the country. (PU2/CD8).
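For readers interested in how such length summaries can be produced, the following is a minimal Python sketch; the example strings, variable names, and the 400-character compliance check are hypothetical illustrations under our own assumptions, not the study’s actual data or analysis code.

```python
# Minimal sketch: summarizing course description lengths.
# The example descriptions below are illustrative placeholders.
descriptions = [
    "This course provides educators with tools to initiate and sustain "
    "continuous improvement to promote students' academic success.",
    "A survey of tools and techniques used in conducting and utilizing "
    "assessment data.",
]

word_counts = [len(d.split()) for d in descriptions]
print(f"n = {len(word_counts)}")
print(f"mean words = {sum(word_counts) / len(word_counts):.1f}")
print(f"range = {min(word_counts)}-{max(word_counts)} words")

# One institution capped descriptions at 400 characters (text,
# punctuation, and spaces included); a simple compliance check:
over_limit = [d for d in descriptions if len(d) > 400]
print(f"descriptions over 400 characters: {len(over_limit)}")
```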

4.2.2. Interviews with Program Coordinators

Each program’s coordinator was invited to participate in semi-structured interviews held via Zoom. Seven program coordinators agreed to be interviewed; the institutions they represented included three large public and four private universities. Interviews were recorded and professionally transcribed for analysis. Interviews lasted between 35 and 41 min and produced transcripts totaling 82 pages, with an average of 11.7 pages per interview.
The interviews were guided by a protocol that included nine open-ended questions and follow-up prompts designed to obtain information about how the program approached the teaching of data use and the specific courses that emphasized data use or data-informed decision-making. Questions gathered specific information about the level of the course, modality, and type of data emphasized (e.g., achievement, behavioral, attendance, demographic, or environmental). More in-depth questions focused on pre-service leader experiences and opportunities to examine simulated or authentic forms of student performance data to guide plans for school improvement. Additional questions asked how courses developed pre-service leader skills in supporting the use of data and building data use capacities within schools. Probing questions explored the extent to which pre-service leaders were exposed to different strategies such as modeling, team building, establishing norms and routines for creating school cultures, establishing missions or visions that value inquiry, and making evidence-based decisions.

4.2.3. Course Syllabi

Six program coordinators submitted a total of 23 syllabi that were deemed by the coordinators to be most relevant to understanding data use in their programs. The syllabi were for courses in general school leadership, organizational theory and leadership, community relationships, leading instruction, evaluation and supervision, and internships.

4.3. Limitations of the Data Sources

We note here that our data sources have limitations. Course descriptions are relatively short accounts of course content that highlight what the instructor of record elected to emphasize and likely do not capture all content or much of the nuance within the courses. Further, interviews were conducted primarily with program coordinators who might have had varying levels of knowledge about the courses. Interviews were also conducted with representatives from only 7 of the 19 institutions. Similarly, program coordinators from only six of the institutions provided syllabi. Thus, the findings we report (e.g., programs’ commitments to clinical experiences) should be understood within the parameters of the limited overall response rate. Institutions and institutional practices might have differed depending on their willingness to engage in this study, as well as the ways in which they engaged in this study. Given the researchers’ employment at two of the universities in Virginia and potential concerns regarding proprietary information and competition for pre-service leaders as students, we assumed some reluctance to participate.

4.4. Codebook Development and Data Analysis

This section outlines the processes and procedures that were used to analyze the three data sources: course descriptions, interviews, and course syllabi. We defined data use as the practice of using systematically gathered evidence to inform organizational improvement and/or instructional responses.
Prior to undertaking any analysis, we developed a codebook informed by the conceptual framework that was used to facilitate data exploration, pattern identification, and insight development. A deductive coding system was designed to capture the content and focus of each course description based on our definition of data use and the leadership competencies in the Virginia Uniform Performance Standards and the National Professional Standards for Education Leaders [7,8]. An initial set of deductive codes was developed to capture leadership competency (e.g., school improvement, instructional improvement, student academic progress, culturally responsive and equitable leadership); nature of data use (e.g., assessment, evaluation, research); data use processes and skills (e.g., collection, organization, analysis, sensemaking); and specific approaches to data use (e.g., action research, inquiry cycle, continuous improvement). These initial codes were applied to a subset of course descriptions, and additional (emergent) codes were identified, with some refinement of the a priori codes. For example, we found it important to document when course descriptions did not reference data use at all. The final set of codes allowed for an analysis that examined data use course content. The coding structure provided findings specific to course content and how data use was described relative to specific leadership areas of responsibility. The analytic approach enabled findings that documented LPP course content and gaps in ways that may guide LPP program development and inform further research. The final version of the codebook was agreed upon by the research team.
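To make the codebook’s organization concrete, below is a hypothetical sketch of how the deductive categories and example codes named above might be represented for coding; the category and code labels are drawn from this section, but the structure itself is our illustrative assumption, not the study’s instrument.

```python
# Hypothetical representation of the deductive codebook described above;
# category and code labels come from this section, while the data
# structure itself is an illustrative assumption.
codebook = {
    "leadership_competency": [
        "school_improvement",
        "instructional_improvement",
        "student_academic_progress",
        "culturally_responsive_equitable_leadership",
    ],
    "nature_of_data_use": ["assessment", "evaluation", "research"],
    "data_use_processes_skills": [
        "collection", "organization", "analysis", "sensemaking",
    ],
    "approaches_to_data_use": [
        "action_research", "inquiry_cycle", "continuous_improvement",
    ],
}

# Emergent codes identified while piloting on a subset of descriptions,
# e.g., flagging descriptions with no reference to data use at all:
codebook["emergent"] = ["no_data_reference", "standalone_data_course"]
```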

4.4.1. Analysis of Course Descriptions

Course descriptions were uploaded into MAXQDA 2022, a software program for computer-assisted analysis of qualitative and mixed methods data. The first step was the application of the codebook for sorting and organizing the data: the course descriptions were analyzed first by applying the codes to sentences within each description. This initial coding was followed by a second level of coding in which similar codes were collapsed to formulate categories within the data. At the completion of this stage, the researchers noted their impressions, interpretations, and questions (i.e., memoing), which were brought to team meetings for discussion. We also engaged in inductive coding that involved closely reading the course descriptions, then having one team member identify relevant text segments and generate new codes. Typical emergent codes flagged course descriptions that did not reference data or data use, or courses that were stand-alone classes on data use. The interviews and course syllabi were analyzed separately, with these analyses serving as supplemental insights into the course description analysis.
Two research team members double-coded 20% of the course descriptions to ensure consistency in code applications. Two rounds of double coding were conducted until a sufficient level of inter-coder agreement was achieved (93%) [73], after which the remaining data were coded independently.
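As a rough illustration of the consistency check, the sketch below computes simple percent agreement between two coders on a double-coded subset. The study cites a published criterion for sufficient agreement [73]; this calculation is only one common way such a figure can be derived, and the code assignments shown are hypothetical.

```python
def percent_agreement(coder_a: list[str], coder_b: list[str]) -> float:
    """Proportion of coding decisions on which two coders agree."""
    if len(coder_a) != len(coder_b):
        raise ValueError("Coders must rate the same segments")
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

# Hypothetical codes assigned by two coders to the same four segments:
coder_a = ["assessment", "no_data_reference", "school_improvement", "evaluation"]
coder_b = ["assessment", "no_data_reference", "instructional_improvement", "evaluation"]
print(f"{percent_agreement(coder_a, coder_b):.0%}")  # 75% in this toy example
```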

4.4.2. Analysis of Interviews

All seven interview transcripts were read by two research team members independently to identify initial emergent themes and patterns. One team member created detailed summaries of each interview. A narrative and thematic approach to analysis was used to identify emergent themes and to understand program design and learning experiences more fully, in ways that supplemented the findings of the course description analysis [74]. For example, a key area of the transcript summaries was focused on pre-service leaders’ learning opportunities to engage with data.

4.4.3. Analysis of Course Syllabi

One team member analyzed all syllabi, while another team member performed spot reviews to check the coding [75]. Similar to the analysis of course descriptions, syllabi were first reviewed for explicit references to content associated with data use. This resulted in binary coding of whether data were referenced in the syllabus’s course description (not necessarily the same as the course description provided on university websites) or learning objectives, or whether data were featured in any particular week of the course, reading assignments, or class assignments (e.g., activity log). The coding also focused on whether data use was framed in terms of Virginia school leader competencies such as instructional leadership. In cases where data were mentioned in syllabi, words, phrases, assignments, etc., were copied verbatim into a separate file for comparison of the content (e.g., how data were referenced across syllabi within learning objectives).
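The binary coding described here can be pictured as a simple record per syllabus. The sketch below is our hypothetical rendering, with field names inferred from the locations checked rather than taken from the study’s instruments.

```python
from dataclasses import dataclass

@dataclass
class SyllabusCoding:
    """Hypothetical record for the binary syllabus coding described above."""
    syllabus_id: str               # masked identifier, e.g., "PU2/SYL5"
    data_in_description: bool      # syllabus's own course description
    data_in_objectives: bool       # learning objectives
    data_in_weekly_content: bool   # any particular week of the course
    data_in_readings: bool         # reading assignments
    data_in_assignments: bool      # class assignments (e.g., activity log)
    framed_by_competency: bool     # tied to a Virginia leader competency

# Example: a syllabus mentioning data in objectives and assignments,
# framed in terms of a leadership competency:
record = SyllabusCoding("PU2/SYL5", False, True, False, False, True, True)
```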

5. Findings

The research question of this study centered on understanding the ways in which (if any) LPPs are designed to prepare pre-service leaders to lead and engage in data use. Our findings indicate that LPPs in Virginia have undertaken this work to a limited degree. Further, evidence suggests that LPPs designed their programs to focus on the development of leadership moves, often without consideration of information gleaned through data use. Our findings also suggest that LPPs tend to rely on internships for preparing pre-service leaders to lead data use efforts, despite having little control over what transpires in clinical settings. In the subsequent sections, we provide a more detailed analysis of these findings as they relate to this study’s research question.

5.1. Disconnect between State Standards and Course Content

The analysis of course descriptions revealed far less attention to data use and data-informed decision-making compared with the expectations outlined in Virginia’s principal evaluation standards. Significant gaps between course content and professional standards for school leaders were apparent. For example, 59% of the descriptions did not reference data, data use, or data-informed decision-making. Of the 77 course descriptions with references to data, 10% mentioned data-informed decision-making, 10% referenced assessment and/or the use of assessment data, 8% described research and/or research data, and 6% cited evaluation. Most courses included data use content and skills in relation to other course content such as the evaluation of instructional programs, organizational change, and school improvement. This is illustrated in the following description of a course focused on curriculum, instruction, and assessment: “The assessment of student learning, including student learning data analyses, will be the third area of focus for this course” (PU1/CD6).
Of the 197 courses, there were only two standalone courses dedicated to data use and data-informed decision-making. These courses emphasized different approaches to and roles for data use. The first connected data use to school improvement and student learning as follows:
Collaboratively lead, design, implement, and evaluate school improvement and classroom culturally responsive practices that increase student learning and well-being. Includes information sources and processing, data literacy (use, analysis, and decision making), equitable resource planning, and program evaluation to inform and lead change. (PR4/CD7).
In comparison, the other course emphasized the technical and process aspects of data use within a research or action research framework to identify needs and monitor learning as follows:
A survey of tools and techniques used in conducting and utilizing assessment data. Includes current research approaches, project design, and data collection. Also included are methods for using data to identify school needs, evaluate personnel, track student performance, and develop strategies for increasing performance as necessary. (PR7/CD2).
These descriptions of courses focused exclusively on data use reflected the wide variation in how programs considered and approached data use and the role of data in decision-making by school leaders. Based on our review, it was clear that a shared understanding of data use preparation, especially in ways that aligned with state professional expectations, was not evident. Courses that referenced data use, in combination with other technical topics such as research methods, assessment, and evaluation, tended to focus on the procedural aspects of data use and data analysis. In these cases, data use was often disconnected from leadership roles and responsibilities. Few courses connected data and/or data use with a specific area of leadership; school improvement (4%) and instructional leadership (3%) were referenced most frequently. However, the vast majority of course descriptions that referenced data-specific content did not mention a leadership domain. Data and data use were most frequently connected to ideas of continuous improvement.

5.2. Coursework Provides Few Opportunities for Pre-Service Leaders to Learn Data Use within a Standards Framework

Nearly all 23 syllabi from the six LPPs referenced one of the NELP standards that included data use, but only seven of the syllabi included references to data in learning objectives, and of those seven, only four mentioned data use in their course overviews. Few syllabi provided information about weekly course content coverage. Only PU8/SYL8 focused on data use as a standalone topic (instead of embedding it within a broader initiative such as instructional leadership). Not a single syllabus made clear what counts as data proficiency. More syllabi, however, identified various data use assignments requiring differing levels of detail. For example, PR2/SYL1 assigned analyzing multiple types of data to list three to five goals for school improvement, and PU2/SYL5 prioritized using student achievement data to make recommendations about resources.
The interview data suggested a number of possible reasons for so few explicit references to data use in course materials. While considering course offerings, one program coordinator admitted the following: “Data is one of those things that when [pre-service leaders] come to your classes and you tell them to do these projects, you just assume they know how to do it. And you asked a really good question. ‘Do we intentionally introduce these types of data?’ And I don’t know that we do” (PU2/Program Coordinator). Instead, pre-service leaders seemed to be tasked regularly with rational and situational decision-making that required them to consider data as they believed warranted. In other cases, pre-service leaders were instructed to take “deep dives” into district and school data “to look at the demographics” (PR8/Program Coordinator). An imperfect balance seemed to be at play where, on the one hand, “throughout our courses, the expectation is for all of our instructors to integrate data literacy one way or another” (PR5/Program Coordinator). Yet, on the other hand, “I use data not in an intentional way such as, ‘In this unit, we’re going to talk about demographic data’, or ‘We’re going to talk about student academic achievement data’” (PU2/Program Coordinator, emphasis added).

5.3. Coursework Contained Few Clear References to Data Use for Instructional Change

When the course descriptions and syllabi referenced data use, focus on instructional change was seldom evident. As highlighted above, course descriptions that included references to data use almost always did so generically (e.g., “data analysis”). Even when they provided more detail—“designing projects to enhance school culture through the application of assessment and research data” (PR8/CD4)—instructional change was not usually prioritized.
Of the four syllabi course overviews that referenced data, only one clearly stated using data for instructional purposes (PU2/SYL4). Similarly, of the seven syllabi that specified data in their learning objectives, only three referenced data use for instructional purposes (PU2/SYL4, PU2/SYL5, and PU7/SYL8). Few syllabi provided information about weekly course content coverage. Two syllabi referenced data use for instructional purposes in their list of weekly content coverage (PU2/SYL7 and PU3/SYL8). Only PU3/SYL8 included a reading assignment—Data Wise [2]—focused on data use and instruction. Three syllabi (PU2/SYL4, PU2/SYL8, and PU8/SYL8) identified data for instructional improvement within class content. Two syllabi tasked pre-service leaders with assignments that required them to use data to inform instructional decisions. Specifically, PU8/SYL8 required pre-service leaders to log the use of school data to improve a specific Virginia learning standard (i.e., SOL) instructional issue from state standardized testing, and PU3/SYL8 required the creation of an educational intervention plan based on various data.
Even when programs included data use, the idea of supporting teachers in using data for their instructional planning was infrequently connected to instructional change. For example, one program administrator noted that pre-service leaders “will focus a little bit more on data pertaining to student behavior and…family engagement” because “this class connects to the school improvement plan developed in another course and illustrates some scaffolding or intention to develop a comprehensive understanding of varied roles and uses of data” (PR8/Program Coordinator). Even efforts to focus on instructional support in response to data typically failed to make connections between data and instructional responses explicit:
In the [supervision and evaluation course], another assignment that they have, is to write a plan of improvement with a teacher. And oftentimes, it depends on the scenario that they have, but it may be lack of test score progress, showing progress. And so, looking at that data, their assignment, or their task, is to say what type of instructional support might you give [the teacher] based on the data if we see that reading scores are low in a particular area, comprehension, what type of resources, or what type of professional development might you provide this teacher, or recommend…? (PR5/Program Coordinator).
Moreover, course descriptions from two programs indicated a focus on using data for research and analysis purposes, but not for school or instructional improvement. For example, one program provided the following course description for a research and assessment course: “Overview of the nature of research on human development, learning, and pedagogical knowledge and skills. Topics include current trends and issues in education, skills in data collection and assessment, and application of research in the school setting” (PR2/CD8).
Although the evidence suggested an overall limited focus on data use for instructional change, some program coordinators highlighted courses that seemed to be focused on building pre-service leader capacity to use data to improve instruction. For example, the PU3 Program Coordinator stated the following after highlighting two courses, one focused on educational change and improvement and the other enhancing and supporting instruction: “[pre-service leaders] obviously are doing some deep data dives in those classes, and they’re thinking about how those enhance instruction and decisions that they need to make as a leader, and really looking at the disaggregated and the aggregated, and thinking about how those decisions should be made differently”. Similarly, the PR2 Program Coordinator focused on the importance of data-inquiry cycles anchored to a data-based goal where pre-service leaders engaged in a process as follows:
They use a data chart to scaffold them through the process…Based on your case, what data sources are most relevant? Now go explore those. And after you’ve explored those, then are there any other data sources that now, maybe you didn’t find what you were expecting to find in those data sources?…They’re required to develop one to three goals …that come out of the data. So, yes, we scaffold that. It’s really a scaffolding process.
We emphasize, however, that few programs communicated an approach involving inquiry cycles when teaching pre-service leaders how to use data.

5.4. Coursework Prioritized Leadership Moves in Data Contexts over Building Data Knowledge and Use in Alignment with Standards

LPPs were frequently designed to teach pre-service leaders how to make leadership moves instead of first demonstrating an understanding of various data types and how to use them effectively. One relatively common way programs accomplished this was by having pre-service leaders leverage various data types to inform improvement planning processes. For example, in one program,
[Pre-service leaders] have to go ahead and audit their comprehensive school improvement plan [from the internship site]. They have to pick a content area that they’re really going to focus on—we ask literacy or math for that particular purpose in that course. Then we also have [pre-service leaders] conduct—basically use—a variety of qualitative and quantitative data to kind of figure out where we need to go with the school improvement plan—they have to conduct observations, classroom observations of those contact areas, they have to collect qualitative data from PLC meetings they have to look at student assessment data—and then they have to evaluate and then if they need to tweak and adjust their plan they have to do that, and then they have to present that to their mentor or their building administrator essentially. (PR8/Program Coordinator)
This example suggests that pre-service leaders were seldom asked to demonstrate an understanding of different data types or select and justify the data used to make leadership decisions. Further, data use assignments and responsibilities for pre-service leaders were often limited to organizational change, not instructional improvement.
Interestingly, in that program, the focus on data to inform improvement planning was from a course focused on instructional leadership and student achievement. No evidence was provided to suggest if or how pre-service leaders were using data to inform instruction in that course. Similarly, in another program, pre-service leaders “have to develop a plan as to how [they] would help this school improve. And they present that as though they are presenting it to a group of teachers, or a group of community members, parents…where we’re able to see whether they can interpret data and explain data to the layman” (PR5/Program Coordinator).
As an extension of this approach to leadership and data use, programs regularly appeared to be designed based on an assumption that pre-service leaders (and current ones, discussed more in the next section) already understand data and data use processes (i.e., retrieval, analysis, interpretation). In the following example, pre-service leaders were responsible for bringing various data types to bear to inform teacher instructional practice, but there was no evidence that they identified the right or best data for the situation, analyzed those data accurately, or responded to teachers with suggestions for appropriate instructional approaches based on those data.
We’re teaching them how to conduct observations of teachers where they do the post-conference that is based off of that data. And sometimes if there is an area of need in a department or on a grade level through a PLC (professional learning community) they’re looking at their scores, benchmark assessments, whatever it might be through that data analysis and specifically writing goals and strategies for improvement. So, that would be modeled through that observation component, but it would certainly be modeled through grade-level meetings, department meetings and continuous improvement planning. And it really and all this that’s what it goes towards is developing that school’s continuous improvement plan, professional development needs, and then ultimately increasing student achievement is ultimately the goal. (PU5/Program Coordinator)
This quotation actually suggests that much of the responsibility to ensure pre-service leaders know how to use data for instructional purposes is placed on their internships or field-based experiences.

5.5. LPPs Rely on Internships for Pre-Service Leaders to Develop Data Knowledge and Skills Instead of Directly Overseeing Their Ability to Execute Data Practices According to Standards

Many LPPs appeared to rely on their pre-service leaders’ internships with mentor school leaders as the primary way to grow pre-service leaders’ data knowledge and skills. The following syllabi (from one program) included site-based assignments to advance data use for instructional change: PU2/SYL7 included field-based learning applications to “build coherence of practice by analyzing your school data”, while PU2/SYL8 required pre-service leaders to shadow mentors and write reflections on data use observations. Program coordinators across the LPPs discussed the flexibility of the internship. The following coordinator spoke at length about how important the internship was for developing pre-service leaders’ data knowledge:
One of those standards emphasizes the use of data to drive instructional practices as well as organizational issues, etc. So, they really can’t get out of the internship without looking at data and being involved in data use. But the internship is flexible enough that if a [pre-service leader’s] readiness does not afford them the chance to lead through data, then they can observe or participate in data use, and there’s no penalty for that…But I would say all [pre-service leaders] in some way engage with data. (PU2/Program Coordinator)
The internships had numerous, varied requirements. One coordinator pointed out that over the course of a 16-month internship, pre-service leaders were responsible for meeting “35 internship objectives, 350 h in the field”, so the internship was deemed “essential” (PU5/Program Coordinator). Not only were the internships demanding in terms of content coverage and time, but they were also typically conducted one-to-one (e.g., a pre-service leader working with and learning from a school principal). Another program coordinator articulated some of the concerns with this approach: “We cover…the reliability and validity aspect of that and bias as well…‘All right, well, what’s good data?’ When there’s so much data, how do you figure out which data is the best data?” (PR8/Program Coordinator). Moreover, no course descriptions of internships or interview data clearly stated what aspects of data knowledge and use were to be learned through internships or whether internship mentors themselves met any standards of data knowledge and capacity.
Decisions related to the depth of data use and type of engagement depended “on the level of comfort of their school leader and whether those doors are open for them, and whether they’re ready for it to be honest with you…because they have to have a discussion with their principal to approve these projects” (PU2/Program Coordinator), because access was dependent on the school context and “what [data] the mentor is willing to give them access to” (PU3/Program Coordinator). Thus, “if I’m your principal and I don’t think you’re ready to take the reins on a data team, then I’m [going to] put you on the data team to observe it. Or you can participate, but I’m not letting you lead it. And so, that could happen” (PU2/Program Coordinator). In short, pre-service leaders’ access to data use opportunities in internships depended on the placement school’s context.
Like all research studies, our findings and conclusions are limited by several factors. First, we relied on publicly available information in several instances. We used course descriptions found in institutional academic records and bulletins to explore the ways in which data use is addressed in LPP coursework. These descriptions, by design, are brief (an average of 56 words) and highlight the main topics of the course. The descriptions do not provide in-depth information about how data use is addressed in a course, and data may be a topic of a course but not mentioned in its description. Additionally, we assumed that the implementation of each course aligned with the public course description (and syllabus), which may or may not be accurate. Second, relatively few syllabi were made publicly available or shared with us. Third, we reviewed the individual web pages of program faculty to examine the extent to which they had expertise and interests in data use; the content of these web pages may not have been current. Fourth, a relatively small number of program coordinators participated in the interviews (37%). We note that several responded to invitations indicating that they lacked the time needed for participation, whereas others expressed some hesitation until the study purposes were clarified to confirm that we were not conducting individual program evaluations. We suspect that non-respondents likely held similar reasons. Lastly, this study was limited to one state, which constrains the generalizability of our claims despite some evidence that preparation programs across states are likely similar in scope and design [17].

6. Discussion

In this study, we investigated pre-service leaders’ opportunities to lead and engage in data use in schools in ways that are responsive to Virginia state leadership standards. We found few clear references to data use in course descriptions or syllabi, and even fewer of those references specified data use for instructional change. Instead, some assumptions seemed to be embedded within LPPs, including that pre-service leaders should already know and understand data processes (i.e., data types, retrieval, analysis, interpretation, etc.) and that gaps in knowledge would be filled in through ongoing interactions with data across classes and through internships. Indeed, program coordinators suggested that LPP content is more about learning to lead than ensuring an understanding of what is being led (that is, leadership moves are prioritized over content knowledge, which in this case was an understanding of data and data processes).
Research suggests that principals often take responsibility for scheduling data meetings and creating conditions for data conversations to occur [46,47], but there is relatively little evidence that principals are actively engaged in facilitating meaningful data conversations. Data use programming nonetheless regularly places principals figuratively—and literally—at the head of the table [2,3], even though, in some cases, principal engagement in data team meetings actually seems to impede data conversations [11]. There is little evidence that principals fully understand the nuance and complexity of different types of data or are able to match those data to appropriate organizational and/or instructional improvement goals. Thus, if LPPs are responsible for developing school leaders who can meet the many, varied ways data are included in federal and state standards, a first order of business is to become more intentional about closing the data knowledge gap that many pre-service (and in-service) school leaders appear to have.
LPPs likely also need to consider more deeply what they want pre-service leaders to gain from internships. Given the research noted above, there is little guarantee that mentors have deep knowledge of data and how to use data effectively to make instructional decisions, or that internship experiences are crafted to include data use opportunities. If mentors received preparation training similar to that provided by the pre-service programs central to this study, it cannot be assumed that current mentors have the competencies needed to enact effective data use for instructional improvement. Seemingly every program in the state, however, relies on mentors to help bridge data course content—however limited it is—and practice. This is not to diminish the practical elements of “on-the-job” learning opportunities, which can be invaluable [76]. Yet, to specifically help pre-service leaders better understand data types, data processes, and/or how to lead others to use data, we find little evidence that clear expectations or structures are in place for mentors to orchestrate deeper learning of data-use processes. The roles of mentors and internship placements are worthy of further exploration. At a minimum, more clearly defined structures and expectations for both pre-service leaders and their mentors seem warranted [14].
When the findings of this study are considered jointly with research on principals’ somewhat limited knowledge of data and facilitation of data use, especially for instructional purposes [11], there are a number of implications for programming. Among them, LPP faculty and district leaders could collaborate more regularly to design programs—or at least components of programs (e.g., internships)—perhaps through university–district partnerships that are more responsive to contextualized needs [77,78]. In international settings, further consideration should be given to how ministries of education, especially in smaller and/or more centralized countries, might better engage with LPPs [79]. The extent to which districts have the data leadership capacity and services to build strong data leaders likely varies considerably, but LPPs currently appear to offer little flexibility other than mentor/internship designations, which, as noted above, seem mostly unregulated. LPPs could instead embed data use opportunities in leadership courses to better align with the expectations for data use across multiple leadership domains. In addition, more intentional development and design of induction programs [80] at the state or local levels within the U.S. or internationally [3] could ensure that pre-service leaders have robust opportunities to learn more about data use knowledge and facilitation as they transition into leadership positions. Similarly, state departments of education could be more intentional in establishing professional learning opportunities for principals, especially novice ones, to build skills [81]. In that vein, state departments could also work with LPP faculty to better establish expectations for pre-service leader mentors.
Despite leadership standards and LPP accreditation standards, there appears to be an overall lack of direction about the collective vision for school leader data use. Perhaps assumptions about educator preparedness more broadly contribute. In most cases, pre-service leaders have already been teachers who presumably learned about data for instruction. Yet, a complementary study we conducted similarly shows inconsistent expectations for teacher data knowledge and use [82]. Mentors currently lead buildings and are responsible for data team meetings, so, again, we might be inclined to assume an understanding of data and the capacity to execute responses, but in-service leadership development programming [2,3] and research on principals leading data teams [11,82] suggest otherwise. Even within LPPs, there is a disconnect about whether and how to prioritize data: are pre-service leaders being developed for organizational management (e.g., personnel, master scheduling, etc.), instructional leadership (e.g., leading teachers and others to better understand students and how to respond to their learning needs), or both? More explicit goal-setting about the purposes of data use within programs seems necessary to clearly design courses, devise expectations and tasks, and prepare mentors to prioritize pre-service leader data use learning experiences.
There are some related considerations about this study worth noting. The research literature—including this study—and U.S. policy and accreditation standards position the principal as a central figure, if not the primary one, in leading data work in schools. Given the many responsibilities already assigned to school leaders across the world, it is reasonable to ask to what extent they should be central to this work, or even responsible for ensuring that it is conducted with reasonable levels of competence. The ability to lead data use might also depend on pre-service leaders’ background knowledge, interests, and attitudes regarding data, as research has suggested that these factors influence actual data use [42]. As such, researchers might design future studies that clearly define what effective data use looks like in schools and then determine what pathways reflect successful data practices, regardless of how those practices are led and by whom. International comparative studies in this space would also help illuminate how policy contexts and local interpretations vary, and how those differences shape what principals are responsible for and how they facilitate data use to advance organizational goals. Such studies would also be instructive for LPPs (and those who develop the policies and standards that drive them) in making strategic program design decisions to better prepare pre-service leaders to ensure good data use practices in schools, even if others are charged with championing the work.

Supplementary Materials

The following supporting information can be downloaded at https://www.mdpi.com/article/10.3390/educsci14101081/s1: Table S1: Requirements for initial endorsement to be a school leader in Virginia, by pathway.

Author Contributions

Conceptualization, C.V.M., L.A. and T.R.M.; methodology, L.A. and T.R.M.; software, T.R.M.; validation, C.V.M., L.A. and T.R.M.; formal analysis, C.V.M., L.A., T.R.M. and M.H.; data curation, C.V.M., M.H. and T.R.M.; writing—original draft preparation, C.V.M., L.A. and T.R.M.; writing—review and editing, C.V.M. and M.H.; project administration, C.V.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by 4-VA at UVA Collaborative Research Grants (2022–2023), grant number 687.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Institutional Review Board of the University of Virginia (protocol code 5267 and date of approval of 19 October 2023).

Informed Consent Statement

Informed consent was obtained from all subjects involved in this study.

Data Availability Statement

The datasets presented in this article are not readily available because the small number of participants makes them susceptible to being identified.

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of this study; in the collection, analyses, or interpretation of data; in the writing of this manuscript; or in the decision to publish the results.

References

  1. Coburn, C.E.; Turner, E.O. The practice of data use: An introduction. Am. J. Educ. 2012, 118, 99–111. [Google Scholar] [CrossRef]
  2. Boudett, K.P.; City, E.A.; Murnane, R.J. (Eds.) Data Wise: A Step-by-Step Guide to Using Assessment Results to Improve Teaching and Learning; Harvard Education Press: Cambridge, MA, USA, 2013. [Google Scholar]
  3. Schildkamp, K.; Handelzalts, A.; Poortman, C.L.; Leusink, H.; Meerdink, M.; Smit, M.; Ebbeler, J.; Hubers, M.D. The Data Team™ Procedure: A Systematic Approach to School Improvement; Springer: Berlin/Heidelberg, Germany, 2018. [Google Scholar]
  4. Day, C.; Sammons, P.; Gorgen, K. Successful School Leadership. 2020. Available online: https://files.eric.ed.gov/fulltext/ED614324.pdf (accessed on 18 September 2024).
  5. Gurr, D.; Drysdale, L.; Goode, H. Global Research on Principal Leadership. Oxford Research Encyclopedias. 2020. Available online: https://oxfordre.com/education/display/10.1093/acrefore/9780190264093.001.0001/acrefore-9780190264093-e-714 (accessed on 26 September 2024).
  6. Every Student Succeeds Act, 20 U.S.C. § 6301. 2015. Available online: https://www.congress.gov/bill/114th-congress/senate-bill/1177 (accessed on 26 September 2024).
  7. National Policy Board for Educational Administration. Professional Standards for Educational Leaders. 2015. Available online: https://www.npbea.org/wp-content/uploads/2017/06/Professional-Standards-for-Educational-Leaders_2015.pdf (accessed on 22 July 2023).
  8. Virginia Department of Education. Guidelines for Uniform Performance Standards and Evaluation Criteria for Principals. 2022. Available online: https://www.doe.virginia.gov/teaching-learning-assessment/teaching-in-virginia/performance-evaluation/principals (accessed on 17 February 2024).
  9. Council for the Accreditation of Educator Preparation. CAEP Revised 2022 Standards Workbook. 2022. Available online: https://caepnet.org/~/media/Files/caep/accreditation-resources/caep-2022-standards-workbook-final.pdf?la=en (accessed on 17 August 2023).
  10. Mandinach, E.B.; Friedman, J.M.; Gummer, E.S. How can schools of education help to build educators’ capacity to use data? A systemic view of the issue. Teach. Coll. Rec. 2015, 117, 1–50. [Google Scholar] [CrossRef]
  11. Meyers, C.V.; Moon, T.R.; Patrick, J.; Brighton, C.M.; Hayes, L. Data use processes in rural schools: Management structures undermining leadership opportunities and instructional change. Sch. Eff. Sch. Improv. 2022, 33, 1–20. [Google Scholar] [CrossRef]
  12. Kippers, W.B.; Poortman, C.L.; Schildkamp, K.; Visscher, A.J. Data literacy: What do educators learn and struggle with during a data use intervention? Stud. Educ. Eval. 2018, 56, 21–31. [Google Scholar] [CrossRef]
  13. Dexter, S.; Moraguez, D.; Clement, D. Pedagogical gaps in the bridge from classroom to field for pre-service principal competence development. J. Educ. Adm. 2022, 60, 473–492. [Google Scholar] [CrossRef]
  14. Drake, T.A. Learning by doing: A daily life of principal interns’ leadership activities during the school year. J. Res. Leadersh. Educ. 2022, 17, 24–54. [Google Scholar] [CrossRef]
  15. Ahumada, L.; Galdames, S.; Clarke, S. Understanding leadership in schools facing challenging circumstances: A Chilean case study. Int. J. Leadersh. Educ. 2015, 19, 264–279. [Google Scholar] [CrossRef]
  16. Schildkamp, K. Data-based decision-making for school improvement: Research insights and gaps. Educ. Res. 2019, 61, 257–273. [Google Scholar] [CrossRef]
  17. Mandinach, E.B.; Gummer, E.S. A systemic view of implementing data literacy in educator preparation. Educ. Res. 2013, 42, 30–37. [Google Scholar] [CrossRef]
  18. Mandinach, E.B. A perfect time for data use: Using data-driven decision making to inform practice. Educ. Psychol. 2012, 47, 71–85. [Google Scholar] [CrossRef]
  19. Wohlstetter, P.; Datnow, A.; Park, V. Creating a system for data-driven decision-making: Applying the principal-agent framework. Sch. Eff. Sch. Improv. 2008, 19, 239–259. [Google Scholar] [CrossRef]
  20. Hardy, I. Data, numbers and accountability: The complexity, nature and effects of data use in schools. Br. J. Educ. Stud. 2015, 63, 467–486. [Google Scholar] [CrossRef]
  21. van Geel, M.; Keuning, T.; Visscher, A.J.; Fox, J.-P. Assessing the effects of a school-wide data-based decision-making intervention on student achievement growth in primary schools. Am. Educ. Res. J. 2016, 53, 360–394. [Google Scholar] [CrossRef]
  22. Brown, C.; Zhang, D. How can school leaders establish evidence-informed schools: An analysis of the effectiveness of potential school policy levers. Educ. Manag. Adm. Leadersh. 2017, 45, 382–401. [Google Scholar] [CrossRef]
  23. Young, C.; McNamara, G.; Brown, M.; O’Hara, J. Adopting and adapting: School leaders in the age of data-informed decision making. Educ. Assess. Eval. Account. 2018, 30, 133–158. [Google Scholar] [CrossRef]
  24. Australian Institute for Teaching and School Leadership. Australian Professional Standard for Principals and the Leadership Profiles. 2014. Available online: https://www.aitsl.edu.au/docs/default-source/national-policy-framework/australian-professional-standard-for-principals-and-the-leadership-profiles (accessed on 18 September 2024).
  25. Vanlommel, K. Drivers and obstacles for evidence-informed practice in an autonomous and decentralized educational system: Belgium. In The Emerald Handbook of Evidence-Informed Practice in Education; Brown, C., Malin, J.R., Eds.; Emerald Publishing: Leeds, UK, 2022; pp. 259–273. [Google Scholar] [CrossRef]
  26. Prøitz, T.S.; Mausethagen, S.; Skedsmo, G. District administrators’ governing styles in the enactment of data-use practices. Int. J. Leadersh. Educ. 2019, 24, 244–265. [Google Scholar] [CrossRef]
  27. De Lisle, J. Evolving data use policy in Trinidad and Tobago: The search for actionable knowledge on educational improvement in a small island developing state. Educ. Assess. Eval. Account. 2016, 28, 35–60. [Google Scholar] [CrossRef]
  28. No Child Left Behind (NCLB) Act of 2001, Pub. L. No. 107-110, 115 Stat. 1425. 2002. Available online: https://www.congress.gov/107/plaws/publ110/PLAW-107publ110.htm (accessed on 26 September 2024).
  29. Datnow, A.; Hubbard, L. Teachers’ use of data to inform instruction: Lessons from the past and prospects for the future. Teach. Coll. Rec. 2015, 117, 1–26. [Google Scholar] [CrossRef]
  30. Shirrell, M. New principals, accountability, and commitment in low-performing schools. J. Educ. Adm. 2016, 54, 558–574. [Google Scholar] [CrossRef]
  31. Konstantopoulos, S.; Miller, S.R.; van der Ploeg, A. The impact of Indiana’s system of interim assessments on mathematics and reading achievement. Educ. Eval. Policy Anal. 2013, 35, 481–499. [Google Scholar] [CrossRef]
  32. Dunn, K.E.; Mulvenon, S.W. A critical review of research on formative assessments: The limited scientific evidence of the impact of formative assessments in education. Pract. Assess. Res. Eval. 2009, 14, 7. [Google Scholar] [CrossRef]
  33. Stein, M.L.; Grigg, J.A. Missing bus, missing school: Establishing the relationship between public transit use and student absenteeism. Am. Educ. Res. J. 2019, 56, 1834–1860. [Google Scholar] [CrossRef]
  34. Forman, S.R.; Foster, J.L.; Rigby, J.G. School leaders’ use of social-emotional learning to disrupt Whiteness. Educ. Adm. Q. 2022, 58, 351–385. [Google Scholar] [CrossRef]
  35. Sakiz, H.; Abdurrahman, E.; Sarıçam, H. Teachers’ perceptions of their school managers’ skills and their own self-efficacy levels. Int. J. Leadersh. Educ. 2020, 23, 585–603. [Google Scholar] [CrossRef]
  36. Fernandes, V. Exploring leadership influence within data-informed decision-making practices in Australian independent schools. Stud. Paedagog. 2021, 26, 139–159. [Google Scholar] [CrossRef]
  37. Jenssen, M.M.F.; Paulsen, J.M. Combining capacity for instructional leadership with individual core practices in the Norwegian policy context. Educ. Manag. Adm. Leadersh. 2024, 52, 475–492. [Google Scholar] [CrossRef]
  38. Visscher, A.J. On the value of data-based decision making in education: The evidence from six intervention studies. Stud. Educ. Eval. 2021, 69, 100899. [Google Scholar] [CrossRef]
  39. Demski, D.; Racherbäumer, K. Principals’ evidence-based practice—Findings from German schools. Int. J. Educ. Manag. 2015, 29, 735–748. [Google Scholar] [CrossRef]
  40. Demski, D.; Racherbäumer, K. What data do practitioners use and why? Evidence from Germany comparing schools in different contexts. Nord. J. Stud. Educ. Policy 2017, 3, 82–94. [Google Scholar] [CrossRef]
  41. Hawlitschek, P.; Henschel, S.; Richter, D.; Stanat, P. The relationship between teachers’ and principals’ use of results from nationwide achievement tests: The mediating role of teacher attitudes and data use cultures. Stud. Educ. Eval. 2024, 80, 101317. [Google Scholar] [CrossRef]
  42. Buske, R.; Zlatkin-Troitschanskaia, O. Investigating principals’ data use in school: The impact of evidence-oriented attitudes and epistemological beliefs. Educ. Manag. Adm. Leadersh. 2019, 47, 925–942. [Google Scholar] [CrossRef]
  43. Mausethagen, S.; Prøitz, T.S.; Skedsmo, G. School leadership in data use practices: Collegial and consensus-oriented. Educ. Res. 2019, 61, 70–86. [Google Scholar] [CrossRef]
  44. Alshammari, I.; AlAjmi, M. School principals’ perspectives on applying data-driven decision making (DDDM) in centralized school settings. Asia-Pac. Educ. Res. 2024. [Google Scholar] [CrossRef]
  45. Datnow, A.; Park, V. Opening or closing doors for students? Equity and data use in schools. J. Educ. Chang. 2018, 19, 131–152. [Google Scholar] [CrossRef]
  46. Farley-Ripple, E.; Buttram, J. The development of capacity for data use: The role of teacher networks in an elementary school. Teach. Coll. Rec. 2015, 117, 1–34. [Google Scholar] [CrossRef]
  47. Cosner, S. Supporting the initiation and early development of evidence-based grade-level collaboration in urban elementary schools. Urban Educ. 2011, 46, 786–827. [Google Scholar] [CrossRef]
  48. Park, V.; Datnow, A. Ability grouping and differentiated instruction in an era of data-driven decision making. Am. J. Educ. 2017, 123, 281–306. [Google Scholar] [CrossRef]
  49. Huguet, A.; Farrell, C.C.; Marsh, J.A. Light touch, heavy hand: Principals and data-use PLCs. J. Educ. Adm. 2017, 55, 376–389. [Google Scholar] [CrossRef]
  50. Huguet, A.; Marsh, J.A.; Farrell, C.C. Building teachers’ data-use capacity: Insights from strong and developing coaches. Educ. Policy Anal. Arch. 2014, 22, 1–31. [Google Scholar] [CrossRef]
  51. Schildkamp, K.; Datnow, A. When data teams struggle: Learning from less successful data use efforts. Leadersh. Policy Sch. 2020, 20, 1–20. [Google Scholar] [CrossRef]
  52. Abrams, L.M.; Varier, D.; Mehdi, T. The intersection of school context and teachers’ data use practice: Implications for an integrated approach to capacity building. Stud. Educ. Eval. 2021, 69, 100868. [Google Scholar] [CrossRef]
  53. Schildkamp, K.; Ehren, M. From “intuition”- to “data”-based decision making in Dutch secondary schools? In Data-Based Decision Making in Education. Studies in Educational Leadership; Schildkamp, K., Lai, M., Earl, L., Eds.; Springer: Berlin/Heidelberg, Germany, 2013; Volume 17. [Google Scholar]
  54. Schildkamp, K.; Poortman, C.L.; Ebbeler, J.; Pieters, J.M. How school leaders can build effective data teams: Five building blocks for a new wave of data-informed decision making. J. Educ. Chang. 2019, 20, 283–325. [Google Scholar] [CrossRef]
  55. Aravena, F. Destructive leadership behavior: An exploratory study in Chile. Leadersh. Policy Sch. 2019, 18, 83–96. [Google Scholar] [CrossRef]
  56. Bryk, A.S.; Gomez, L.M.; Grunow, A.; LeMahieu, P.G. Learning to Improve: How America’s Schools can Get Better at Getting Better; Harvard Education Press: Cambridge, MA, USA, 2015. [Google Scholar]
  57. Park, V.; Daly, A.J.; Guerra, A.W. Strategic framing: How leaders craft the meaning of data use for equity and learning. Educ. Policy 2012, 27, 645–675. [Google Scholar] [CrossRef]
  58. Lasater, K.; Bengston, E.; Albiladi, W.S. Data use for equity? How data practices incite deficit thinking in schools. Stud. Educ. Eval. 2021, 69, 100845. [Google Scholar] [CrossRef]
  59. Jimerson, J.; Childs, J. Signal and symbol: How state and local policies address data-informed practice. Educ. Policy 2017, 31, 584–614. [Google Scholar] [CrossRef]
  60. Darling-Hammond, L.; Wechsler, M.E.; Levin, S.; Leung-Gagné, M.; Tozer, S. Developing Effective Principals: What Kind of Learning Matters? 2022. Available online: https://files.eric.ed.gov/fulltext/ED620192.pdf (accessed on 18 September 2024).
  61. Brauckmann, S.; Pashiardis, P.; Ärlestig, H. Bringing context and educational leadership together: Fostering the professional development of school principals. Prof. Dev. Educ. 2023, 49, 4–15. [Google Scholar] [CrossRef]
  62. Young, M.D.; Anderson, E.; Nash, A.M. Preparing school leaders: Standards-based curriculum in the United States. Leadersh. Policy Sch. 2017, 16, 228–271. [Google Scholar]
  63. Brown, C.; Greany, T. The evidence-informed school system in England: Where school leaders should be focusing their efforts. Leadersh. Policy Sch. 2018, 17, 115–137. [Google Scholar] [CrossRef]
  64. Grigsby, B.; Vesey, W. Assessment training in principal preparation programs. Adm. Issues J. 2011, 1, 18–31. [Google Scholar] [CrossRef]
  65. Mense, E.G.; Griggs, D.M.; Shanks, J.N. School Leaders in a Time of Accountability and Data Use: Preparing Our Future School Leaders in Leadership Preparation Programs; IGI Global: Hershey, PA, USA, 2018. [Google Scholar]
  66. Roegman, R.; Perkins-Williams, R.; Maeda, Y.; Greenan, K.A. Developing data leadership: Contextual influences on administrators’ data use. J. Res. Leadersh. Educ. 2018, 13, 348–374. [Google Scholar] [CrossRef]
  67. Cunningham, K.M.W.; VanGronigen, B.A.; Tucker, P.D.; Young, M.D. Using powerful learning experiences to prepare school leaders. J. Res. Leadersh. Educ. 2019, 14, 74–97. [Google Scholar] [CrossRef]
  68. Dexter, S.; Clement, D.; Moraguez, D.; Watson, G.S. (Inter)active learning tools and pedagogical strategies in educational leadership preparation. J. Res. Leadersh. Educ. 2020, 15, 173–191. [Google Scholar] [CrossRef]
  69. Albiladi, W.S.; Lasater, K.; Bengtson, E. Data use among principals and teachers: Divergent paths or common ground? Implications for the leadership preparation programs. J. Sch. Adm. Res. Dev. 2020, 5, 63–76. [Google Scholar] [CrossRef]
  70. Honig, M.I.; Honsa, A. Systems-focused equity leadership learning: Shifting practice through practice. J. Res. Leadersh. Educ. 2020, 15, 192–209. [Google Scholar] [CrossRef]
  71. Yin, R.K. Case Study Research and Applications: Design and Methods; Sage: Thousand Oaks, CA, USA, 2017. [Google Scholar]
  72. Gerring, J. Case Study Research: Principles and Practices; Cambridge University Press: Cambridge, UK, 2017. [Google Scholar]
  73. O’Connor, C.; Joffe, H. Intercoder reliability in qualitative research: Debates and practical guidelines. Int. J. Qual. Methods 2020, 19, 1609406919899220. [Google Scholar] [CrossRef]
  74. Altheide, D.; Coyle, M.; DeVriese, K.; Schneider, C. Emergent qualitative document analysis. In Handbook of Emergent Methods; Hesse-Biber, S.N., Leavy, P., Eds.; The Guilford Press: New York, NY, USA, 2008; pp. 127–151. [Google Scholar]
  75. Creswell, J.W.; Poth, C.N. Qualitative Inquiry and Research Design: Choosing among Five Approaches; Sage: Thousand Oaks, CA, USA, 2017. [Google Scholar]
  76. Davis, S.H.; Leon, R.J.; Fultz, M. How principals learn to lead: The comparative influence of on-the-job experiences, administrator credential programs, and the ISLLC standards in the development of leadership expertise among urban public school principals. Int. J. Educ. Leadersh. Prep. 2011, 8, 1–33. Available online: https://eric.ed.gov/?id=EJ1012996 (accessed on 26 September 2024).
  77. Abdelrahman, N.; Irby, B.J.; Lara-Alecio, R.; Tong, F. The influence of university principal preparation program policies on the program internship. Sage Open 2022, 12, 21582440221117110. [Google Scholar] [CrossRef]
  78. Bastian, K.C.; Drake, T.A. School leader apprenticeships: Assessing the characteristics of interns, internship schools, and mentor principals. Educ. Adm. Q. 2023, 59, 1002–1037. [Google Scholar] [CrossRef]
  79. Karami-Akkary, R.; Hammad, W. The knowledge base on educational leadership and management in Arab countries: Its current state and its implications for leadership development. In Teaching Educational Leadership in Muslim Countries; Samier, E.A., ElKaleh, E.S., Eds.; Springer: Berlin/Heidelberg, Germany, 2019; pp. 77–92. [Google Scholar]
  80. Bush, T. Preparation and induction for school principals: Global perspectives. Manag. Educ. 2018, 32, 66–71. [Google Scholar] [CrossRef]
  81. Lochmiller, C.R. Leadership coaching in an induction program for novice principals: A 3-year study. J. Res. Leadersh. Educ. 2014, 9, 59–84. [Google Scholar] [CrossRef]
  82. Hock, M.; Moon, T.; Meyers, C. Equipping pre-service teachers for data use: A study of secondary educator preparation programs in Virginia. J. Teach. Educ., in press.
Table 1. Data use expectations across Virginia’s Uniform Performance Standards and Evaluation Criteria for Principals.

VA Performance Standard | Expectations for Data Use or Data-Informed Decision-Making
Instructional Leadership | Analyzes current academic achievement data and instructional strategies to make appropriate educational decisions that improve classroom instruction, increase student achievement, and maximize overall school effectiveness. Promotes and supports professional development and instructional planning and delivery practices that incorporate the use of achievement data and result in increased student progress.
School Climate | Uses data and incorporates knowledge of the social, cultural, emotional, and behavioral dynamics of the school community to cultivate a positive, engaging academic learning environment.
Human Resource Leadership | Actively leads in the selection process, where applicable, and assigns highly effective staff in a fair and equitable manner based on school and division needs, assessment data, and local, state, and federal requirements.
Organizational Management | Analyzes data to identify and plan for organizational, operational, or resource-related problems and resolves them in a timely, consistent, and appropriate manner.
Communication and Community Relations | No reference to data use.
Culturally Responsive and Equitable School Leadership | Collects, interprets, and communicates student group disaggregated assessment, engagement, behavioral, and attendance data to identify and understand how and why inequities exist and implements procedures and strategies to address inequity.
Professionalism | No reference to data use.
Student Academic Progress | Collaborates with teachers and staff to monitor and improve multiple measures of student progress through the analysis of data, the application of educational research, and the implementation of appropriate intervention and enrichment strategies.
Source: Virginia Department of Education, 2022.
Table 2. Virginia institutions offering leadership preparation programs.

Public Institutions | Program | Cert. Credit Hours | Master’s (Credit Hours) | Format
George Mason University | Ed. Leadership | 24 | M.Ed. (30) | Online, In-Person
James Madison University | Ed. Leadership | 18 | M.Ed. (33) | Online
Longwood University | Ed. Leadership | - | M.S. (36) | Hybrid
Old Dominion University | Ed. Leadership | - | M.S.Ed. (30) | Online
Radford University | Ed. Leadership | - | M.S. (33) | Online
University of Virginia | Administration and Supervision | - | M.Ed. (33) | Online
Virginia Commonwealth University | Ed. Leadership | 21 | M.Ed. (33) | Online, In-Person
Virginia State University | Administration and Supervision | - | M.Ed. (36) | In-Person
Virginia Tech University | Ed. Leadership | - | M.A.Ed. (30) | Hybrid
William and Mary | Ed. Leadership | 18 | M.Ed. (33) | Hybrid, In-Person

Private Institutions | Program | Cert. Credit Hours | Master’s (Credit Hours) | Format
Averett University | Administration and Supervision | 22 | M.Ed. (31) | Online
Bluefield University | Ed. Leadership | - | M.A. (36) | In-Person
Hampton University | Ed. Leadership | - | M.A. (N/A) | In-Person
Liberty University | Administration and Supervision | 21 | M.Ed. (36) | Online
Marymount University | Administration and Supervision | - | M.Ed. (30) | Online
Regent University | Ed. Leadership | - | M.Ed. (30) | Online
Shenandoah University | Ed. Administration | 21 | M.S.Ed. (30) | In-Person
University of Lynchburg | Ed. Leadership | 24 | M.Ed. (33) | Online
University of Richmond | Ed. Leadership | 22 | M.Ed. (31) | Hybrid, In-Person, Online

n = 19.
