Article

Framing School Governance and Teacher Professional Development Using Global Standardized School Assessments

by
Estela Costa
* and
Luís Miguel Carvalho
Education and Training Research and Development Unit, Institute of Education, University of Lisbon, 1649-013 Lisboa, Portugal
*
Author to whom correspondence should be addressed.
Educ. Sci. 2023, 13(9), 873; https://doi.org/10.3390/educsci13090873
Submission received: 1 July 2023 / Revised: 5 August 2023 / Accepted: 22 August 2023 / Published: 27 August 2023
(This article belongs to the Special Issue Teacher Professional Development and Sustainability)

Abstract:
The OECD’s education agenda has been marked since the 1990s by the monitoring of quality and the manufacturing of problems and solutions for the so-called knowledge economy. Among the instruments used by the OECD is “PISA for Schools” (PISA-S), an assessment applied directly to schools worldwide since 2011. In Portugal, it was implemented in 2019 under the designation “PISA for schools in the municipalities” (PISA-M), claiming to create opportunities for collaborative work between schools to promote the success of local educational policies and the quality of student learning. Taking PISA-M as a policy instrument and building on Coburn’s problem-framing typology, in this article the adaptation of PISA-S into PISA-M is reread to analyze the regulatory rationale for the educational system that PISA-M encodes. This research draws on data from the OECD PISA/PISA-S/PISA-M websites, on two public hearings with the Portuguese PISA-M coordination (one in the Portuguese parliament and one with education unionists), and on existing research relating to PISA-S. Overall, PISA-M appears to be an instrument for reframing local school governance and teacher professional development practices, capturing problematizations and solutions raised about education, teachers’ development, and how school education should be ordered and coordinated at a municipal scale.

1. Introduction

Since the 1990s, the Organization for Economic Cooperation and Development (OECD) educational agenda has been marked by the monitoring of quality and by the continuous fabrication of problems and solutions for the so-called knowledge economy [1,2,3,4], ascribing growing importance to data collection on student performance through international large-scale student assessments (ILSAs). Over the last decade, ILSAs became an important research subject for education policy and comparative education studies drawing on diverse theoretical perspectives (e.g., [5,6,7,8,9,10,11,12,13,14,15,16,17,18,19]). The Program for International Student Assessment (PISA) is presently the most well-known and valued ILSA. Aiming to create “generalizable knowledge” based on the credibility of its learning metrics ([6], p. 489), it supports the use of comparison of educational systems as a resource for—and a driver of—educational policies [20] and boosts the learning of best practices from high-performing countries [21].
This article is part of the literature that seeks to understand the effects of ILSAs on the governing of education. It analyses a PISA-derived program created in 2011—PISA for Schools (PISA-S)—when several schools and districts were invited to participate in a pilot involving 126 secondary schools in the United States (USA), the United Kingdom (UK), and the Canadian province of Manitoba. In 2013, a pilot was also developed in Spain involving 225 schools, and in April 2013 the program was launched in the USA for all schools nationwide [22]. In other countries, PISA-S arrived a few years later, as in Brazil, with a first pilot in 2017 applied mainly in public schools of a municipality located in the northeastern state of Ceará, known for high-accountability policies and publicly acknowledged for good results in national tests [23]. By April 2020, according to the OECD, more than 5500 schools from 10 countries had participated in the program (PISA-S website, OECD). It is available for the following countries: Andorra, Australia, Brazil, Brunei Darussalam, China, Colombia, Japan, Kazakhstan, Russia, Spain, Thailand, United Arab Emirates, United Kingdom, and United States of America. In Portugal, it was implemented in 2019 under the label “PISA for schools in the municipalities” (PISA-M); the pilot phase, which took place between 2020 and 2021, involved 103 schools from nine municipalities and intermunicipal communities.
The PISA survey has been widely studied, and its normative influence in formulating policies and educational discourses is felt worldwide (e.g., [2,24,25,26]), legitimizing the implementation of different political measures in national/regional contexts (e.g., [13,27,28,29,30,31,32]); there is also abundant literature on how international agencies use their indicators to influence national and regional educational policies. However, the issue of how international organizations (IOs) use their knowledge to directly influence and eventually “govern” local education policies is less discussed in the literature [33]. Equally, the number of studies on the use of qualitative evidence to justify and guide measures and policy objectives is still not substantial, including research on how governments manage the complementarity of quantitative and qualitative evidence [34]. Research on PISA-S focuses mainly on the North American reality [22,33,34,35,36,37,38,39,40,41,42]. Still, the academic literature on PISA-S has not yet addressed the significant changes since its beginnings in the USA [43], and research addressing how PISA-S influences and perhaps governs educational policies and teachers’ development remains a less explored field of study.
This paper presents research in line with the existing literature, aimed at understanding ILSAs as policy tools that guide public policies. We argue that instruments such as PISA (and derivatives like PISA-M) are not neutral: they bear cognitive foundations and project representations of educational problems and of ways of solving them, inducing educational systems (and, in this case, school principals and teachers) to adopt particular solutions and means of materializing them. Therefore, this paper focuses on analyzing the regulation that PISA-M brings in, which we access by observing the reasoning behind its documents and the discursive elements woven around it by those responsible for the program. It follows a previous line of inquiry into PISA as an OECD soft power instrument [2] that influences the different contexts where it is implemented, as it holds representations, ideas about education, and meanings that make it a privileged analyzer of the processes and dynamics of sociopolitical regulation in education [26]. The analysis is also based on Coburn’s frame analysis [44]. PISA-M is reread [45] as a policy instrument to reframe school governance and teacher professional development practices; we then examine the underlying principles of the program as applied to the Portuguese context, in order to describe and analyze the problematizations and solutions raised by the OECD (through PISA-M) about education, namely regarding teacher development, and about how school education should be ordered and coordinated locally. Hence, the following research question guided this study: What is the regulatory rationale for the educational system that PISA-M encodes?

2. Background of the Study

2.1. PISA-S as Part of the Expansion of the OECD’s Soft Regulatory Agency

PISA for Schools (PISA-S) is one manifestation, among others, of the OECD’s expansion as a transnational regulatory agency, which has long occurred through extensions of the scope (expanding the set of skills and competencies), scale (covering more countries, systems, and schools), and explanatory power of the PISA assessment [46]. Contrasting with other ILSAs, PISA-S contributes to strengthening the international agency’s influence by going beyond the national level and deliberately addressing local actors (schools and municipalities) [47]. This is relevant because targeting schools directly positions the OECD as a local expert in education policy that knows how to measure what matters to schools. Moreover, it amplifies its role as an organization that knows what is needed to—and must—govern and manage schools [33,35].
Likewise, through PISA-S the OECD intensifies its participation in policy processes, which stems from: (i) the sophistication of PISA products, (ii) new thematic forums, and (iii) new inquiries. The first encourages emulation and political learning through reports on certain countries—under the title “Strong Performers and Successful Reformers”—and through new information products for immediate consumption that provide practical answers on specific topics to administrators, trainers, and teachers, based on PISA results (on attitudes and student performance, problems related to family contexts, and even political choices). There is also the digital platform “Education GPS”, which enables users of the OECD platform to build their own comparison models and reports based on PISA data [48]. The second mode of intensification and sophistication of transnational regulatory processes is the creation and stabilization of new approaches for validating good practices and agenda setting through thematic forums [47,49], such as the “International Summit on the Teaching Profession”, held since 2011. The third way is to expand the surveys, as exemplified by “PISA for Development” (e.g., [50,51,52]), the Program for the International Assessment of Adult Competencies (PIAAC) (e.g., [53,54]), and, naturally, PISA-S [22,36,40,46].

2.2. PISA-S Features (2010–2022)

2.2.1. Objectives and Particulars

The OECD justifies the relevance of PISA-S as a strategy to improve results and school practices through peer learning and networking. Developed by the Australian Council for Educational Research (ACER), the program aims to enable “schools around the world to connect and work together to build practical resources designed to help students learn better, teachers teach better and schools to become more engaging learning environments” (PISA-S website, OECD). The idea of an “Online Community” directed at educators and school leaders who participate in the program is advocated as seeking to provide “a secure space to share and solve problems, learn from other contexts and experiences, and to support school leaders, teachers, and educators as they pursue innovation in their classrooms and schools” (idem).
Moreover, under the PISA-S protocol, each school can assess 15-year-old students in reading, math, and scientific literacy and compare this information with the performance of education systems as measured by the PISA core test. This is possible because PISA-S is aligned with the PISA assessment frameworks, sharing the same metrics and proficiency levels ([36], p. 283). By making the PISA-S application procedures similar to those of the main PISA test, the OECD has made it possible to argue for comparisons between schools within the same country and across countries, and between schools and national PISA data and world education systems.
This option for a measurement design that treats schools as comparable to educational systems—since the units of analysis of PISA and PISA-S are treated as equivalent—is seen by some authors as disregarding cultural and historical factors and ignoring contextual elements related to students [22,36,40]. Research highlights that it affects the exercise of power, which does not primarily arise from controlling or communicating across physical space but emerges from the relationships and interactions of the individuals and entities involved ([40], p. 87). Moreover, the integration of schools into a global comparison context, as happens with PISA-S, reconfigures educational governance relations, which are subject to a re-spatialization [36].

2.2.2. Assessment Characteristics

PISA-S consists of the “PISA-based Test for Schools” assessment, seeking “to help school leaders understand their 15-year-old students’ abilities to think critically and apply their knowledge creatively in new contexts” (PISA-S website, OECD). Grounded on PISA frameworks, lasting two hours, it comprises 141 items (47 for reading, 40 for math, and 54 for sciences) reflecting the distribution of PISA items of response types and content. The minimum number of students tested is 35, although this varies according to the characteristics of the school (idem). A 30 min contextual questionnaire on information about students, the influence of specific aspects on student learning within the school (e.g., classroom climate) and outside (e.g., the student’s attitudes towards reading) [40] is also applied. Overall, the testing experience lasts 3 to 3.5 h, including the instructions and break periods (PISA-S website, OECD). It culminates in a final report on students’ performance, providing examples of “good practice” from high-performing education systems identified by the core PISA, with the stated purpose of supporting the improvement of schools.

2.2.3. Implementation Diversity

Since 2011, several changes have occurred among the actors associated with PISA-S. The supranational coordination of PISA-S, in the role of universal International Platform Provider (IPP), was assigned to an Australian company—Janison—which hosts and delivers the digital version of the test ([43], p. 100).
Regarding the national providers, for example, the USA national provider role started with CTB/McGraw-Hill (2012–2015), an edu-business company, which was followed by Northwest Evaluation (2015–2016), a not-for-profit organization, and is now held by Janison, an ed-tech company which, as stated above, is simultaneously the universal PISA-S IPP. Taking the example of two countries in southern Europe (Portugal and Spain) and two countries in the Americas (USA and Brazil), Table 1 shows the variations that PISA-S takes in the different places where it is implemented and the actors associated with it.
As can be seen in the table, PISA-S is funded by entities of different natures (e.g., philanthropic, research institutions, public authorities) in the countries where it is accommodated; the differences also extend to the national provider (HEIs, R&D institutions, etc.) and the type of schools covered (public and/or private). Such differences reflect efforts to find ways, in each specific context, to ensure that the PISA-S rationale and test structure remain unchanged (these negotiations make possible the mobility of PISA through diverse contexts without being altered [29], applying to PISA Latour’s notion of the immutable mobile).

2.2.4. PISA-M: The Portuguese Version of PISA-S

In Portugal, the program is called “PISA for schools in the municipalities” (PISA-M) because it is financed by municipalities (CM) and intermunicipal communities (CIM) (groups of municipalities), many of which joined the program during the pilot (Table 2). Scheduled to start in 2020 “with the application of the tests and analysis of the results and the elaboration of the report in the autumn of the same year” (PISA-M, IPL website), the implementation was postponed for one year due to the COVID-19 pandemic. The process was severely delayed, compromising the validity of the samples, which forced the schools (when students returned to the face-to-face regime) to reassess the sample for the test. Still, the project went ahead: 103 schools have participated, and in December 2021 the first results began to be discussed.
The national service provider (NSP) of PISA-M is the Instituto Politécnico de Lisboa (IPL), whose coordinator is an IPL academic with a background in engineering, who holds a Ph.D. in operational research and systems analysis, and a specialization in education, and who, since 2018, has been a researcher and consultant at the OECD.

2.2.5. PISA-M and the Complexification of Local Governance

PISA-M activates new actors in local governance processes: the state, local authorities, schools, the OECD, and higher education experts and institutions (Figure 1). These actors are involved in different ways: consenting to the program, funding it, operationalizing it, and mobilizing schools to use it. The OECD, for its part, is responsible for designing the program, for general supervision, and for certifying the public research and training organization (IPL) accredited to administer the test, analyze the data, and produce reports for each school (under OECD oversight).
In such a context, where the reconfiguration of powers at the local level is signaled, the OECD emerges as a new actant interacting with the local administration and schools, “speaking” directly with school management, teachers (as the targets of the program’s purposes), and the municipalities, as coordinators of the network.

3. Theoretical Approach

3.1. Taking PISA-M as a Policy Tool

This study seeks to reread PISA-M from the perspective of the political sociology of education, based on the concept of policy instrument [13,55,56] described as a “technical device with a generic vocation, carrying a specific conception of the political/society relationship and supported by a conception of regulation” ([57], p. 4). Thus, instruments are carriers of a conception of regulation, directing how to coordinate and control the educational systems [57].
Diverging from the understanding of the ILSAs—and of PISA-M, in particular—as merely technical, this conceptual approach allows us to explore the way policy tools guide public policies, highlighting the importance of their cognitive and normative dimensions [58].
Therefore, policy instruments structure policy according to their own logic, being subject to controversies and modifications throughout their trajectories, as explained in [59]:
[…] the instrument is never an isolated device: it is inextricably linked to contextualized modes of appropriation. Via the instrument it is possible to observe not only professional mobilizations (for example, the affirmation of new competencies) but also reformulations (serving specific interests and power relations between actors) and, finally, resistance (to reduce the impact of the instrument or circumvent it by creating paradoxical alliances).
(p. 15)

3.2. Frame Analysis as a Means of Capturing the Normative and Cognitive Purposes of PISA-M

Framing is an everyday social practice and a somewhat pervasive concept in the social sciences and humanities since the last quarter of the twentieth century. Traceable to Goffman’s frame analysis [60], it is conceived as a cultural practice by which humans make sense of and organize their ongoing experience. The way a policy issue is framed is relevant because it determines who bears responsibility for it and justifies certain policy solutions while ruling out others ([44], p. 344). According to [44], the process of framing a problem is of utmost importance not only for encouraging and organizing action, but also for potentially reshaping power dynamics and affecting the way teachers perceive and approach the issue at hand (idem). Therefore, the interest here lies not only in understanding how people define situations or use already incorporated definitions to make sense of contexts and act; it is also related to influence, as pursued in various studies on policy implementation or organizational processes of intentional change [44,61,62].
According to [44]’s analysis (p. 374), frames promote and justify specific ideas to guide social action, to persuade other people to be involved in the change-oriented process.
Three types of framing are considered to explore how PISA-M coordination shapes the messages communicated to launch the instrument: diagnostic framing; prognostic framing; and frame alignment [44]. The first two relate to the definition of problems (identification of problems) and the proposed solution to the same problems (identifying goals and ways to achieve those goals). The third one is a different component of framing, referring to discursive elements that try to mobilize the “interests, values, and beliefs” of the audiences to the invoked diagnostic–prognostic framing.

4. Materials and Methods

In this research, we used a qualitative and interpretive methodology [63]. Data collection included materials from the PISA-S/OECD websites and the PISA-M website, and two public hearings with the Portuguese coordination, one at parliament and the other with a federation of education unionists. Existing research relating to PISA-S was also mobilized. This paper thus observes PISA-M from two points of anchorage: official documents and the narratives of key actors in meetings. The public hearings of the PISA-M coordination were as follows: a parliamentary hearing lasting 53:35 min with deputies in the Portuguese parliament on 3 March 2020; and an online public hearing of 1:40:04 h with education unionists at the Forum of the National Education Federation (FNE) on 13 November 2020. (FNE affiliates teachers’ unions and unions of education professionals, e.g., unions of technicians, administrators, and education assistants working in schools and in organizations of education administration. It is registered with the International Trade Union Confederation (CSI) and the General Union of Workers (UGT), and with Education International (IE and IE-CSEE).)
Data were subjected to qualitative content analysis [64]. For the analysis of documents and public hearing transcripts, a deductive method was employed. PISA-M documents and written statements by PISA-M central actors were key to capturing the conceptualization of educational reality and policy agency that supports the policy tool. Documents and public hearing transcripts were therefore read, and the targeted text was segmented so that each segment represented an idea related to the predefined categories from [44]’s problem-framing typology (diagnostic framing, prognostic framing, and frame alignment). The two researchers analyzed the data independently, considering the categories and comparing their analyses with each other. Each difference in text-coding decisions was discussed until a consensus was reached. After this, interpretive codes were created, and three subcategories emerged for each category (Table 3).
The way policy messages about PISA-M are communicated is part of the PISA-M instrument and of the concepts it carries when entering local contexts. The ideas conveyed by those responsible for the program frame the thoughts of its target audience, committing the coordination and instrumentation of PISA-M to specific ideas and beliefs about the system, about the contexts of action it will enter, and about how such intervention must be carried out. We then analyze PISA-M as a policy instrument, observing the problematizations and preconizations (solutions) [65] developed in the OECD’s documents and in oral statements by the PISA-M coordination in contexts of public disclosure, with members of parliament and trade unions and through institutional websites.

5. Current Study

The relationship between problems and solutions occurs through framing processes that involve logical and causal associations. Problems do not exist outside solutions, and the way they are formulated carries a type of solution [66]. Considering this connection, we present the diagnostic-frame and prognostic-frame [44] results together, jointly analyzing the identified problems and the respective proposed solutions and ways to achieve them (Section 5.1, Section 5.2 and Section 5.3). Frame alignment [44], in turn, relates to the association of invoked frames with the interests and beliefs of those the frames seek to reach [67,68]. This is analyzed in Section 5.4.

5.1. Adherence to PISA Tests to Address the Deficit of Objective Knowledge by Schools and Teachers

Problem 1 is based on the argument that schools and their teachers do not possess enough information to consistently improve students’ learning outcomes. It is suggested that PISA provides a good that is needed at the national level, but that schools still lack the contextual information and objective data needed to make decisions:
Schools began to convey the idea that participation in PISA was very interesting, but it did not have any direct return for schools, something that they expressed as extremely necessary to be able to act based on the results
(National Coordinator, Parliamentary hearing) (Probl1)
Advocated as an upgrade of the national PISA, PISA-M is framed as providing support to school leaders and teachers in their decision-making processes:
The part of proposing actions and implementing these actions is exclusive to those who have this responsibility, and those who have this responsibility are those who experience the school in their day-to-day life and, specifically, school principals and teachers
(National Coordinator, Forum with Trade Unions) (Solut1)
One of the aspects referred to as an added value is ensuring that schools have access to disaggregated data at the school level, which stems from the fact that the policy tool belongs to the PISA universe, being praised for the worth of the data it provides for the school system:
[PISA] collects a lot of information about the characteristics of students, which has proved to be extremely useful at the country level, and we believe it [PISA-M] can also prove to be very useful at the school level. (…)
(National Coordinator, Parliamentary hearing) (Solut1)
Then, PISA-M is perceived as a tool that can catalyze improvement processes, because it not only provides knowledge about the unique realities of each context but also motivates all stakeholders to take action to drive improvement:
[PISA-M aims to] provide schools with a set of information that allows for an internal and external analysis process, with the aim of favoring possible improvement solutions—strategies, tools, etc.
(National Coordinator, Parliamentary hearing) (Solut1)
In this way, adhering to the PISA tests is described as a solution to bridge schools’ lack of information, and the tool is depicted as a technical–scientific device that acts as a “bridge language” connecting everyone. Furthermore, it is a sort of “vehicular tool”, appearing as common ground that facilitates the exchange of knowledge free from subjective views, with the capacity to enhance professional and organizational development practices:
PISA is a common language from the point of view of the results, and it seems to us that it will be able to enhance this reflection and this sharing because it is clear and objective information
(National Coordinator, Parliamentary hearing) (Solut1)

5.2. Adherence to PISA-M to Address the Lack of Collaboration and Reflection by Schools and Teachers

The second problem pertains to the lack of collaboration among schools and teachers, which results in the inability of schools and teachers to identify and solve obstacles regarding students’ learning outcomes. Schools are characterized as having a limited capacity to implement effective solutions and solve problems by themselves:
The schools, alone, based on that information, have a real picture of where they are and a clear notion of where they would like to go, but they are very limited in their capacity for action. (…) you shouldn’t leave schools alone to face their problems and look for… and hope that they find their solutions
(National Coordinator, Forum with Trade Unions) (Probl2)
Consequently, providing assistance to support school actors is suggested. The rationalization behind Problem 2 relies upon the argument that schools are not accustomed to sharing their experiences or engaging in collective reflection, and these are organizational and professional development practices that are regarded as necessary to cultivate in schools:
We are convinced that—if we manage to get schools to use this instrument in a network for subsequent sharing of results and for reflection and subsequent collaborative learning—we can have a greater impact on student learning. And it is in this sense that the PISA project was designed for schools in the municipalities of Portugal
(National Coordinator, Parliamentary hearing) (Solut2)
Moreover, best practices are valued to guide schools in their quest for improvement, and the onus of building the solutions is set within the education system itself, placing high expectations on collaboration as a kind of regenerative practice for the education system:
It is necessary that those who make up the educational system share, reflect together, learn collaboratively and then manage to implement their solutions, which do not need to be solutions from others, but solutions that can be inspired by solutions from others, depending on the circumstances
(National Coordinator, Parliamentary hearing) (Solut2)
But these expectations also require the education system to open itself cognitively and socially to external forces. Thus, the comparability conveyed by PISA-M is portrayed as advantageous, thanks to the trust and reputation of the OECD and because national experts will join the “Collaboratory”, integrating the network for reflection and sharing of practices:
And what is intended is to mobilize what the OECD can do well, which is an international mobilization of specialists in the field of education, but also the National Academy and this whole network of actors that work around education that must be able to contribute to this learning. Hence the name is collaboratory in this higher sense
(National Coordinator, Parliamentary hearing) (Solut2)
Likewise, the program is presented not only as the solution to the lack of collaboration but also as a form of training for teachers and schools:
Since the main objective of PISA is to enable schools to achieve permanent improvement in student learning outcomes, it is known that sharing, joint reflection, and networking, as well as collaborative learning, are essential approaches in the development of this training of schools
(PISA-M website, IPL) (Solut2)
The training of school actors is discursively highlighted, associated with mutual learning and collaboration. It can be taken as a sort of “new training script” for teachers, enrolled in networked and contextualized dynamics in schools, which appear as the right ones, and the ones most indicated by the OECD, to improve student learning:
It is a project that aims to train schools; that information effectively helps them to improve the teaching-learning processes, but (…) we all know that the school alone is not able to do this.
(National Coordinator, Parliamentary hearing) (Solut2)
(…) leads to the improvement of what schools seek to do every day, which is to teach better and that their students learn better.
(National Coordinator, Forum with Trade Unions) (Solut2)
Therefore, the solution to Problem 2 is that adherence to PISA-M allows schools to be part of a network that facilitates collaborative learning, thus impacting students’ learning. It mixes two main ideas: on the one hand, the “research laboratory”, a label associated with the mobilization of objective data “measured in a very transparent way”; on the other hand, collaboration, as “effective learning” is imagined as emerging from an educational environment characterized by collaboration and sharing (National Coordinator, Forum with Trade Unions). In other words, the results do not stand for themselves. They need to be examined by teachers and experts to become usable, and the space for such convergence is the concept-practice of the “research collaboratory”.
(Solut2)
The "research collaboratory" may well signal a new pattern in the relationship that schools and professionals establish with (expert) knowledge:
(…) the application to each School of the OECD Test based on PISA will provide the participating schools with clear, objective, relevant and comparable information, nationally and internationally, which will allow reflection and internal and external sharing, as well as collaborative learning in a network structure of nets made up of Schools, Municipalities, Technicians, and Researchers in Education, together with political decision-makers and those in educational management, with regional or national intervention
(PISA-M website, IPL) (Solut2)

5.3. Assigning the Leadership of the Network to Municipalities to Fill the Deficit of Coordination by Schools

Problem 3 stems from the previous one (lack of collaboration). It focuses on the actions of schools (and of their teachers) to improve results, which are not efficient/effective because they lack articulation and internal and external coordination. Thus, the problematization focuses on the governance of the advocated network, namely on its coordination and articulation:
Each of the actors in this network has its own picture, has its own diagnosis, whether it be an individual picture of each school, which concerns only each school, or a picture of the network … and can be related to a region, a municipality (…)
(National Coordinator, Forum with Trade Unions) (Probl3)
The coordination of this network advocated by the OECD at the local level is assigned to the Municipalities (and Intermunicipal Communities), which come to play a central role, becoming:
“facilitators, both from a financial point of view and from the point of view of articulating the whole process”.
(National Coordinator, Parliamentary Hearing)
Choosing the municipalities is related to the national coordinator's previous positive experience in a central education and training administrative body. What is relevant from an analytical perspective is the narrative that justifies such centrality: a narrative that places municipalities at the very heart of change, even as a change-driving force, strategically oriented by a "collaborative" ethos:
[Municipalities are] increasingly assuming a very active role in boosting educational networks. They stopped looking at territorial development (…) exclusively with the variables they looked at until 15 years ago, they started looking at education as something strategic from the point of view of the development of their territories, and they have sought to play a collaborative and active role in this regard, and I have witnessed this over the last few years, in particular at the National Agency for Qualification and Professional Education, where I found extremely important partners in the municipalities… participatory and collaborative in building solutions
(National Coordinator, Forum with Trade Unions) (Solut3)
Thus, the advocated solution to Problem 3 is that the local authorities are the central nodes for developing and directing school networking, and even the most relevant collective actors for ensuring the coordination of improvement efforts:
Municipalities have been the basis of what we are doing in Portugal, in an innovative way, within the scope of the OECD precisely to maximize the potential of the instrument
(National Coordinator, Forum with Trade Unions) (Solut3)
Moreover, they are represented as local network creators, and the OECD as the builder of “networks of networks” that encourages collaborative efforts to find solutions together through PISA-M:
It is important that the results of applying a test such as PISA, the reports produced, and the surveys carried out to students are accompanied by group analysis, sharing of joint approaches, and collaborative learning coordinated by those who assume in their mission the search for the improvement of learning outcomes of students from all schools in each network. For these reasons, countries are starting to design the application of PISA for Schools in a logic of local territorial network with the coordination of the actors responsible for the territories
(PISA-M website, IPL) (Solut3)
The argument is that schools benefit from having an entity that organizes and promotes territorial networks through which people's qualifications can be improved, which amounts to an investment in human capital, as can be read on the PISA-M website:
Municipalities are also empowered with municipal information on the characteristics and skills generation system for the future active population of their territory. Through this national and international benchmark, it is intended to enable the Municipalities, together with their Schools, to active participation in the development of the skills of the population they represent
(PISA-M website, IPL) (Solut3)

5.4. Frame Alignment: Prioritizing Students’ Wellbeing, Collaboration, and School Autonomy

In the analysis, we found frame alignment of the problems with two recurring criticisms of the PISA test reports: on the one hand, the overvaluation of the cognitive dimension of school learning; on the other hand, the conversion of PISA results into league tables, a practice adopted by most of the media in Portugal through the creation of annual school rankings based on national exam results. Both criticisms are addressed by the PISA-M coordination.
In the first case, frame alignment [44] is produced by associating the analysis of the results obtained by each school with the consideration of students' wellbeing, and by proposing a reference indicator for the latter: the student wellbeing index:
We cannot adopt strategies focused solely and exclusively on performance, which is relatively easy to do, except that it is often done at the expense of the well-being and overall happiness of the students who are involved
(National Coordinator, Forum with Trade Unions) (FrAlign1)
If we raise cognitive indices, lowering the level of student satisfaction, we do not do a good job. The objective is for this project to be able to contribute to these two dimensions and, therefore, it is called improving the learning outcomes and well-being of students
(National Coordinator, Parliamentary hearing) (FrAlign1)
The suggested norm is to improve students' learning outcomes and their wellbeing simultaneously, which paradoxically belongs more to the domain of desire than to that of the data and results published by PISA:
The countries that have had the best results in recent years have been many eastern countries (…), but when we cross this performance with the happiness index (…) the trend is reversed. Those who are performing better are experiencing lower happiness scores. However, when we look at countries like, for example, the Nordic countries that no longer have the best performance indexes, their students continue to have very high levels of happiness and well-being. And it is at this intersection, this balance between cognitive performance and well-being and happiness that we should find the solutions
(National Coordinator, Forum with Trade Unions) (FrAlign1)
In the second case, the alignment materializes both with the texts of national authorities that in recent years have spoken out against the media's use of national assessment results to produce school rankings, and with the OECD texts that lament the way politicians and the media use PISA results for the same ranking purpose:
We must be very careful in how the results will be mobilized, and how they will be shared outside the networks, and we will do everything to not influence the construction of rankings, in the wrong way as it has happened in terms of results between countries and at the national level. This critical factor has been identified and can weaken the whole process, as we all understand it
(National Coordinator, Parliamentary hearing) (FrAlign2)
PISA-M is presented and valued under the PISA/OECD "brand" and is justified by the comparison it allows for. However, even though assessment and comparison instrumentation are presented as a requirement for school improvement, they are framed as a means for cooperation and the search for shared solutions, not for competition and ranking:
Based on information that is clear, that is objective, that is relevant, but that is also comparable to be able to promote dialogues because when it is not comparable, it is not possible to establish a collaborative dialogue
(National Coordinator, Forum with Trade Unions) (FrAlign2)
A school that can position itself in relation to its country, will be able to position itself in relation to its municipality, it can position itself internationally if it is interested in that—and all this facilitates communication between these various actors
(National Coordinator, Parliamentary hearing) (FrAlign2)
Sharing shouldn’t be for comparison, for positioning. It is supposed to identify problems and solutions
(National Coordinator, Forum with Trade Unions) (FrAlign2)
In addition to these two alignments, a third one can be observed, related to the power relations between schools and municipalities. It is a matter of preventing the strengthening of municipal power from being interpreted as a loss of power for schools. Thus, in the relationship with the municipalities, there seems to be a safeguard of school autonomy:
The product for municipalities is their regional photography, the product for schools is their school photography. And the municipalities do not have the photo of the schools. This was defined from the beginning and there is an element of trust that must exist and that cannot be violated at any time, so what is school photography is for schools, and schools will do what they feel they should do with their photography
(National Coordinator, Forum with Trade Unions) (FrAlign3)

6. Discussion

In this article, we resorted to the sociology of public action and policy instrumentation [57], which conceives policy tools as bearers of thoughts and principles about education and about how the education system should be regulated (ordered, coordinated, and controlled). The OECD's PISA-M was the ILSA under examination. Aiming to study the regulation it carries, we tried to capture and understand the problems identified (constructed) by the OECD to justify its relevance, the problems it creates and the ways of solving them, and the ideas and representations behind it that are induced into public action in the attempt to influence schools and teachers' work. Coburn's frame analysis [44] was the methodological and conceptual tool used to comprehend the regulatory rationale for the educational system that PISA-M embodies. Framing involves creating meaning about an issue and implies adopting specific policy solutions while excluding others [69]. Policy tools like PISA-M carry forms of problematizing education that are accompanied by solutions and by formats for materializing them. By promoting particular solutions over others, policy tools implement "ways of doing" and "ways of thinking" based on representations of education that will influence educational reality and education governance. Observing the reasoning behind PISA-M, and the discursive elements woven around it, three problems (diagnostic framing) stand out. They are interrelated and correspond to three solutions (prognostic framing) (Table 4) which, given the consequential logic that connects them, are best analyzed as problem–solution binomials.
The first problem–solution binomial (the knowledge deficit is bridged by adhering to PISA tests) revolves around the idea that PISA is an improvement-oriented instrument whose information is objective and respected. PISA is described as highly appreciated by principals and teachers, but as not fully meeting their needs, because it does not offer data disaggregated at the school level, something that PISA-M allows for. Rationalizations about PISA-M are thus embedded in a sort of dependency vis-à-vis the national PISA: PISA-M is depicted as a valuable source of information whose credibility is associated with the success assigned to the national PISA, which, in turn, depends on the image of the OECD as a transnational regulator [70]. This is related to the fact that the perceptions associated with evidence-based approaches are founded on the idea of the OECD as an organization that performs as an independent expert knowledge provider, a "truth teller" ([70], p. 42), and a useful and trustworthy resource for the steering of educational systems [26]. Moreover, the argument addresses school principals and teachers by reinforcing the encouragement for schools to use their autonomy and make contextualized decisions based on results, which seems to meet the long-standing claims for school autonomy in Portugal [71], appealing to self-determination as a useful tool for school management and an answer to school problems.
The second problem–solution binomial (the collaboration deficit is bridged by adhering to PISA-M) is associated with the idea that isolation prevents schools and teachers from deciding, even when they possess objective and comparable data. Access to objective information, comparison, and access to best practices are highly valued, as is the idea of the research collaboratory: a network of diverse actors that includes the expert knowledge of the OECD and academia, together with multiple actors from schools and the territory. Diagnostic and prognostic framing converge to focus on the need for teachers and schools to be involved in networks to develop collaborative learning. Furthermore, collaborative work, sharing, joint reflection, and mutual learning are advocated to boost the teaching–learning process and improve students' outcomes, embodying a new script for teachers' professional development or, in the words of the PISA-M national coordinator, enabling teachers to teach better so that their students learn better.
The third binomial (the coordination deficit is bridged by assigning that responsibility to municipalities) holds that schools are isolated and lack internal and external articulation and coordination, something that the municipalities do well thanks to their experience. Described as the ideal coordinators of such innovative improvement-oriented networks, the municipalities are assigned a role that places them in a prominent position in local education governance, whether in financing it or in acting as network enablers and supervisors. The designation "PISA for schools in the municipalities" illustrates the objective stated by the promoters: to be part of "an intervention strategy in Education for territorial development" aiming to "enable the Municipalities, together with their Schools, for an active participation in the development of the population skills" (PISA-M website). In a way, there is a shift in the role of the institutional regulator, the state (national level), towards the municipalities (local level), through regulation coming from the transnational regulator (OECD). This is new in Portugal, where central administrative control over municipalities and schools persists, even though competencies have been distributed between central and local authorities through diverse means of deconcentration, decentralization, and school autonomy [72] since the 1980s. This new configuration of forces that PISA-M brings to the local level may well make the relationships between the actors, located at different levels of the network and assigned different roles, more complex, and may induce changes in the control exercised over schools among principals, teachers, and municipalities. Nevertheless, it should be noted that PISA-M's frame alignment addresses schools' perceived lack of autonomy by responding to the critique of the rankings generated by PISA results.
Thus, in the relationship with the municipalities, there seems to be a safeguard of school autonomy, as the individual results of each school are not disclosed, either publicly or to the local authorities (which only have access to the global report). Therefore, even though the rationale behind PISA-M is oriented to comparison, it is assumed that the aim is not to compete but to foster learning, leverage improvements, and seek solutions. This is challenging and worth following in future research, as it becomes particularly relevant to understand how the "government by results" and "by what works", of which the instrument is a vehicle, is established and evolves.
This study allows us to understand that the regulation embodied by PISA-M is evidence-based and knowledge-based, resting namely on the OECD's expert knowledge. This stresses the symbolic power of an IO that holds a valuable "knowledge monopoly" favoring the effectiveness of regulation, in line with the tendency of new forms of regulation by governments to legitimize their actions [73]. Like PISA, PISA-M is based on the OECD's image of credibility and expertise, upon which its key role as a transnational regulator depends. In fact, it is the authority and success allocated to PISA that enable the OECD's agenda in schools and municipalities, fine-tuning its capacity for influence through a new instrument focused on the school setting. ILSAs are soft power instruments with regulatory capacity through suggestion and recommendation [70]. PISA-M's frame alignment with the critiques of PISA introduces the idea of novelty and improvement of the instrument, namely by contradicting the emphasis on cognitive aspects and by explicitly acclaiming the good examples (the Nordic countries) in order to prioritize students' wellbeing and materialize it in a reference indicator, the student wellbeing index.
Likewise, the regulation inherent to PISA-M is network-based and responds to three main ideas: regulatory effectiveness, teacher professional development, and local school governance. In the first case, the policy tool assumes an orientation towards "what works", not only in quantitative terms, regarding better performance and the achievement of goals [74,75], but also in qualitative terms, through models and examples of best practices and the exchange of experiences, which suggests a new rationale in which the combination of statistical data and practices enables everything to be measured. In the second case, the network is seen as enhancing teacher professional development, promoting contact with examples from other contexts and disseminating and sharing organizational knowledge as well as procedural and pedagogical practices related to the school, the work of education, and teaching and learning. The third case concerns the OECD appearing as a new actant interacting directly with the local administration and schools, which may be seen as a break with the conventional trilogy of forces (state, schools, and municipalities) that has been considered central in local education policies [72]. In this sense, the "Research Collaboratory", grounded on networking and sharing and a motto for change, improvement, and effective learning, exemplifies how the instrument can contribute to redefining and reframing local school governance and teacher professional development practices.
This study shows that PISA-M stands as a variation on the assumption that evaluating and comparing the performances of educational systems leads to improvement in their effectiveness [76]. However, PISA-M does not ground its rationale in improving the country's economic competitiveness, but in putting evaluation and comparison at the service of better institutional practices and results. By placing teachers at the forefront of the debate, and drawing on the reports and the surveys of students and parents, the policy tool is oriented towards exploring the various factors that influence the quality of education. Therefore, rather than treating educational quality as an abstract concept or a fixed set of standards, PISA-M is framed to make it concrete, founded on contextual results, and specific to the setting under analysis. Moreover, it emphasizes that teachers play a critical role in shaping its ongoing development and evolution, stressing a need for mutual learning and comparison based on "sharing", "joint reflection", "networking", "collaborative learning", and "group analysis".
Finally, PISA-M may be a valuable resource in the dispute over the autonomy of schools vis-à-vis public authorities. Even so, it is not exactly autonomy that is at stake, but rather a relationship built on a new heteronomy, as schools will be dependent both on national standards and on a universal canon of supranational technical–scientific knowledge. Gathered around data and educational representations conveyed implicitly and explicitly by the OECD, schools are persuaded as to how they should be governed and managed and how teachers' professionalism should develop. This is a new dependency, no longer deriving from the state alone, but now directly from an expert organization, as part of a broader phenomenon of the reinforcement of the role of specialists and of so-called evidence-based policies.

Author Contributions

Conceptualization, E.C. and L.M.C.; methodology, E.C. and L.M.C.; analysis, E.C. and L.M.C.; investigation, E.C.; data curation, E.C.; writing—original draft preparation, E.C. and L.M.C.; writing—review and editing, E.C.; visualization, L.M.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Rinne, R.; Kallo, J.; Hokka, S. Too Eager to Comply? OECD Education Policies and the Finnish Response. Eur. Educ. Res. J. 2004, 3, 454–485. [Google Scholar] [CrossRef]
  2. Sellar, S.; Lingard, B. The OECD and global governance in education. J. Educ. Policy 2013, 28, 710–725. [Google Scholar] [CrossRef]
  3. Ydesen, C. The OECD’s Historical Rise in Education: The Formation of a Global Governing Complex; Palgrave Macmillan: London, UK, 2019. [Google Scholar]
  4. Carvalho, L.M. Revisiting the Fabrications of PISA. In Handbook of Education Policy Studies; Fan, G., Popkewitz, T.S., Eds.; Springer: Singapore, 2020; Volume 2, pp. 259–273. [Google Scholar]
  5. Addey, C.S.; Sellar, S.; Steiner-Khamsi, G.; Lingard, B.; Verger, A. Forum Discussion: The Rise of International Large-scale Assessments and Rationales for Participation. Compare 2017, 47, 434–452. [Google Scholar] [CrossRef]
  6. Fischman, G.E.; Marcetti Topper, A.; Silova, I.; Goebel, J.; Holloway, J.L. Examining the influence of international large-scale assessments on national education policies. J. Educ. Policy 2019, 34, 470–499. [Google Scholar] [CrossRef]
  7. Gorur, R. Seeing like PISA: A cautionary tale about the performativity of international assessments. Eur. Educ. Res. J. 2016, 15, 598–616. [Google Scholar] [CrossRef]
  8. Lindblad, S.; Petersson, D.; Popkewitz, T.S. International Comparisons of School Results: A Systematic Review of Research on Large-Scale Assessments in Education. 2015. Available online: https://publikationer.vr.se/produkt/international-comparisons-of-school-results-a-systematic-review-of-research-on-large-scale-assessments-in-education/ (accessed on 25 May 2023).
  9. Lindblad, S.; Pettersson, D.; Popkewitz, T. (Eds.) Education by the Numbers and the Making of Society: The Expertise of International Assessments; Routledge: New York, NY, USA, 2018. [Google Scholar]
  10. Lingard, B.; Martino, W.; Rezai-Rashti, G.; Sellar, S. Globalising Educational Accountabilities; Routledge: New York, NY, USA, 2016. [Google Scholar]
  11. Lewis, S.; Lingard, B. The multiple effects of international large-scale assessment on education policy and research. Discourse Stud. Cult. Politics Educ. 2015, 36, 621–637. [Google Scholar] [CrossRef]
  12. Lockheed, M.E.; Wagemaker, H. International large-scale assessments: Thermometers, whips, or useful policy tools? Res. Comp. Int. Educ. 2013, 8, 296–306. [Google Scholar] [CrossRef]
  13. Grek, S. Governing by numbers: The PISA ‘effect’ in Europe. J. Educ. Policy 2009, 24, 23–37. [Google Scholar] [CrossRef]
  14. Ozga, J. Assessing PISA. Eur. Educ. Res. J. 2012, 11, 166–171. [Google Scholar] [CrossRef]
  15. Rutkowski, D.; Thompson, G.; Rutkowski, L. Understanding the Policy Influence of International Large-Scale Assessments in Education. In Reliability and Validity of International Large-Scale Assessment. IEA Research for Education; Wagemaker, H., Ed.; Springer: Cham, Switzerland, 2020; Volume 10. [Google Scholar]
  16. Sahlberg, P. Finnish Lessons 2.0; Teachers College Press: New York, NY, USA, 2015. [Google Scholar]
  17. Steiner-Khamsi, G.; Waldow, F. PISA for scandalisation, PISA for projection: The use of international large-scale assessments in education policy making—An introduction. Glob. Soc. Educ. 2018, 16, 557–565. [Google Scholar] [CrossRef]
  18. Verger, A.; Parcerisa, L.; Fontdevila, C. The growth and spread of large-scale assessments and test-based accountabilities: A political sociology of global education reforms. Educ. Rev. 2019, 71, 5–30. [Google Scholar] [CrossRef]
  19. Winthrop, R.; Simons, K.A. Can International Large-Scale Assessments Inform a Global Learning Goal? Insights from the Learning Metrics Task Force. Res. Comp. Int. Educ. 2013, 8, 279–295. [Google Scholar] [CrossRef]
  20. Grek, S. International Organisations and the Shared Construction of Policy ‘Problems’: Problematisation and Change in Education Governance in Europe. Eur. Educ. Res. J. 2010, 9, 396–406. [Google Scholar] [CrossRef]
  21. Tan, C. Rethinking the notion of the high-performing education system: A Daoist response. Res. Comp. Int. Educ. 2021, 16, 100–113. [Google Scholar] [CrossRef]
  22. Lewis, S.; Sellar, S.; Lingard, B. PISA for Schools: Topological rationality and new spaces of the OECD’s global educational governance. Comp. Educ. Rev. 2016, 60, 27–57. [Google Scholar] [CrossRef]
  23. Oliveira, D.; Carvalho, L.M. Performance-Based Accountability in Brazil: Trends of Diversification and Integration. In World Yearbook of Education; Routledge: Oxford, UK, 2020. [Google Scholar]
  24. Breakspear, S. The Policy Impact of PISA: An Exploration of the Normative Effects of International Benchmarking in School System Performance; OECD Education Working Papers 71; OECD Publishing: Paris, France, 2012. [Google Scholar]
  25. Martens, K.; Niemann, D.; Teltemann, J. Effects of international assessments in education—A multidisciplinary review. Eur. Educ. Res. J. 2016, 15, 516–522. [Google Scholar] [CrossRef]
  26. Carvalho, L.M.; Costa, E.; Gonçalves, C. Fifteen years looking at the mirror: On the presence of PISA in education policy processes (Portugal, 2000–2016). Eur. J. Educ. 2017, 52, 154–166. [Google Scholar] [CrossRef]
  27. Acosta, F. Who Is Setting the Agenda? OECD, PISA, and Educational Governance in the Countries of the Southern Cone. Eur. Educ. 2020, 52, 87–101. [Google Scholar] [CrossRef]
  28. Białecki, I.; Jakubowski, M.; Wiśniewski, J. Education policy in Poland: The impact of PISA (and other international studies). Eur. J. Educ. 2017, 52, 167–174. [Google Scholar] [CrossRef]
  29. Gorur, R. The “thin descriptions” of the secondary analyses of PISA. Educ. Soc. 2016, 37, 647–668. [Google Scholar] [CrossRef]
  30. Morgan, C. Tracing the sub-national effect of the OECD PISA: Integration into Canada’s decentralized education system. Glob. Soc. Policy 2015, 16, 47–67. [Google Scholar] [CrossRef]
  31. Niemann, D.; Martens, K.; Teltemann, J. PISA and its consequences: Shaping international comparisons in education. Eur. J. Educ. 2017, 52, 175–183. [Google Scholar] [CrossRef]
  32. Tan, C. PISA and education reform in Shanghai. Crit. Stud. Educ. 2019, 60, 391–406. [Google Scholar] [CrossRef]
  33. Rutkowski, D. The OECD and the local: PISA-based Test for Schools in the USA. Discourse Stud. Cult. Politics Educ. 2015, 36, 683–699. [Google Scholar] [CrossRef]
  34. Simons, M. Governing education without reform: The power of the example. Discourse Stud. Cult. Politics Educ. 2015, 36, 712–731. [Google Scholar] [CrossRef]
  35. Lewis, S. Governing schooling through ‘what works’: The OECD’s PISA for Schools. J. Educ. Policy 2017, 32, 281–302. [Google Scholar] [CrossRef]
  36. Lewis, S. PISA for Schools: Respatializing the OECD’s Global Governance of Education. In The Impact of the OECD on Education Worldwide; Emerald Publishing Limited: Leeds, UK, 2017. [Google Scholar]
  37. Lewis, S. Policy, philanthropy and profit: The OECD’s PISA for Schools and new modes of heterarchical educational governance. Comp. Educ. 2017, 53, 518–537. [Google Scholar] [CrossRef]
  38. Lewis, S. PISA ‘Yet to Come’: Governing schooling through time, difference and potential. Br. J. Sociol. Educ. 2018, 39, 683–697. [Google Scholar] [CrossRef]
  39. Lewis, S. ‘Becoming European’? Respatialising the European Schools System through PISA for Schools. Int. Stud. Sociol. Educ. 2020, 29, 85–106. [Google Scholar] [CrossRef]
  40. Lewis, S. PISA, Policy and the OECD. Respatialising Global Education Governance through PISA for Schools; Springer: Singapore, 2020. [Google Scholar]
  41. Rutkowski, D.; Rutkowski, L. Measuring Socioeconomic Background in PISA: One Size Might not Fit all. Res. Comp. Int. Educ. 2013, 8, 259–278. [Google Scholar] [CrossRef]
  42. Rutkowski, D.; Rutkowski, L.; Plucker, J.A. Should individual U.S. schools participate in PISA? Phi Delta Kappan 2014, 96, 68–73. [Google Scholar] [CrossRef]
  43. Lewis, S.; Lingard, B. PISA for Sale? Creating Profitable Policy Spaces Through the OECD’s PISA for Schools. In The Rise of External Actors in Education: Shifting Boundaries Globally and Locally; Lubienski, C., Yemini, M., Maxwell, C., Eds.; Bristol University Press: Bristol, UK, 2022; pp. 91–112. [Google Scholar]
  44. Coburn, C.E. Framing the Problem of Reading Instruction: Using Frame Analysis to Uncover the Microprocesses of Policy Implementation. Am. Educ. Res. J. 2006, 43, 343–349. [Google Scholar] [CrossRef]
  45. Simons, M.; Olssen, M.E.H.; Peters, M.A. Re-Reading Education Policy; Sense: Rotterdam, The Netherlands, 2009. [Google Scholar]
  46. Sellar, S.; Lingard, B. The OECD and the expansion of PISA: New global modes of governance in education. Br. Educ. Res. J. 2014, 40, 917–936. [Google Scholar] [CrossRef]
  47. Williamson, B. Digital education governance: An introduction. Eur. Educ. Res. J. 2015, 15, 3–13. [Google Scholar] [CrossRef]
  48. Almeida, M.; Viana, J.; Carvalho, L. Processos e Sentidos da Regulação Transnacional da Profissão Docente: Uma análise das Cimeiras Internacionais sobre a Profissão Docente (2011–2017). Curriculo Front. 2020, 20, 62–84. [Google Scholar] [CrossRef]
  49. Robertson, S.L. ‘Placing’ Teachers in Global Governance Agendas; Centre for Globalisation, Education and Societies, University of Bristol: Bristol, UK, 2012; Available online: http://susanleerobertson.com/publications/ (accessed on 23 April 2023).
  50. Addey, C. Golden relics & historical standards: How the OECD is expanding global education governance through PISA for Development. Crit. Stud. Educ. 2017, 58, 311–325. [Google Scholar] [CrossRef]
  51. Auld, E.; Rappleye, J.; Morris, P. PISA for Development: How the OECD and World Bank shaped education governance post-2015. Comp. Educ. 2019, 55, 197–219. [Google Scholar] [CrossRef]
  52. Rutkowski, D.; Rutkowski, L. Running the Wrong Race? The Case of PISA for Development. Comp. Educ. Rev. 2021, 65, 147–165. [Google Scholar] [CrossRef]
  53. Valiente, O.; Lee, M. Exploring the OECD survey of adult skills (PIAAC): Implications for comparative education research and policy. Compare 2020, 50, 155–164. [Google Scholar] [CrossRef]
  54. Volante, L. The PISA Effect on Global Educational Governance; Routledge: Oxford, UK, 2017. [Google Scholar]
  55. Maroy, C. Comparing Accountability Tools and Rationales. Various Ways, Various Effects. In Governing Educational Spaces. Knowledge, Teaching and Learning in Transition; Kotthoff, H.-G., Klerides, E., Eds.; Sense Publishers: Rotterdam, The Netherlands, 2015; pp. 35–58. [Google Scholar]
  56. Verger, A.; Fontdevila, C.; Parcerisa, L. Reforming governance through policy instruments: How and to what extent standards, tests and accountability in education spread worldwide. Discourse Stud. Cult. Politics Educ. 2019, 40, 248–270. [Google Scholar] [CrossRef]
  57. Lascoumes, P.; Le Galès, P. (Eds.) Gouverner par les Instruments; Presses de Sciences Po: Paris, France, 2012. [Google Scholar]
  58. Muller, P. Les Politiques Publiques; Puf: Paris, France, 2018. [Google Scholar]
  59. Lascoumes, P.; Simard, L. L’action publique au prisme de ses instruments: Introduction. Rev. Française Sci. Polit. 2011, 61, 5–22. [Google Scholar] [CrossRef]
  60. Entman, R.M. Framing: Toward Clarification of a Fractured Paradigm. J. Commun. 1993, 43, 51–58. [Google Scholar] [CrossRef]
  61. Lane, J.L. Maintaining the Frame: Using Frame Analysis to Explain Teacher Evaluation Policy Implementation. Am. Educ. Res. J. 2020, 57, 5–42. [Google Scholar] [CrossRef]
  62. Woulfin, S.L.; Donaldson, M.L.; Gonzales, R. District Leaders’ Framing of Educator Evaluation Policy. Educ. Adm. Q. 2016, 52, 110–143. [Google Scholar] [CrossRef]
  63. Hennink, M.; Hutter, I.; Bailey, A. Qualitative Research Methods; Sage Publications: London, UK, 2020. [Google Scholar]
  64. Schreier, M. Qualitative Content Analysis in Practice; Sage: London, UK, 2012. [Google Scholar]
  65. Delvaux, B. Qual é o papel do conhecimento na acção pública? Educ. Soc. 2009, 30, 959–985. [Google Scholar] [CrossRef]
  66. Barroso, J. Conhecimento, Políticas e Práticas em Educação. In Políticas e Gestão da Educação: Desafios Tempo Mudanças; Martins, A.M., Caldéron, A.I., Ganzeli, P., Garcia, T.O.G., Eds.; ANPAE: Campinas, Brazil, 2013; pp. 1–25. [Google Scholar]
  67. Snow, D.A.; Rochford, E.B., Jr.; Worden, S.K.; Benford, R.D. Frame alignment processes, micromobilization, and movement participation. Am. Sociol. Rev. 1986, 51, 464–481. [Google Scholar] [CrossRef]
  68. Williams, R.H.; Kubal, T.J. Movement Frames and the Cultural Environment: Resonance, Failure, and the Boundaries of the Legitimate. In Research in Social Movements, Conflicts and Change; Emerald Publishing: Leeds, UK, 1999; Volume 21, pp. 225–248. [Google Scholar]
  69. Benford, R.D.; Snow, D.A. Framing Processes and Social Movements: An Overview and Assessment. Annu. Rev. Sociol. 2000, 26, 611–639. [Google Scholar] [CrossRef]
  70. Noaksson, N.; Jacobsson, K. The Production of Ideas and Expert Knowledge in OECD; SCORE: Stockholm, Sweden, 2003. [Google Scholar]
  71. Barroso, J. Descentralização, territorialização e regulação sociocomunitária da educação. Rev. Adm. Emprego Público 2018, 4, 7–29. [Google Scholar]
  72. Barroso, J. A transversalidade das regulações em educação: Modelo de análise para o estudo das políticas educativas em Portugal. Educ. Soc. 2018, 39, 1075–1097. [Google Scholar]
  73. Pons, X.; van Zanten, A. The Social and Cognitive Mapping of Policy, Report; Knowandpol: Paris, France, 2008. [Google Scholar]
  74. Davies, H.T.O.; Nutley, S.; Smith, P. (Eds.) Evidence-Based Policy and Practice in Public Services; The Policy Press: Bristol, UK, 2000. [Google Scholar]
  75. Luke, A. After the marketplace: Evidence, social science and educational research. Aust. Educ. Res. 2003, 30, 89–109. [Google Scholar] [CrossRef]
  76. Rochex, J.-Y. Chapter 5: Social, Methodological, and Theoretical Issues Regarding Assessment: Lessons from a Secondary Analysis of PISA 2000 Literacy Tests. Rev. Res. Educ. 2006, 30, 163–212. [Google Scholar] [CrossRef]
Figure 1. PISA-M (re)configuring power relationships.
Table 1. Characteristics of PISA-S in USA, Brazil, Portugal, and Spain.
| Role / Country | Portugal | Spain | USA | Brazil |
|---|---|---|---|---|
| Idealization | OECD | OECD | OECD | OECD |
| Certification | OECD | OECD | OECD | OECD |
| Funding | Local authorities (Public) | National authorities (Public) | Edu-businesses, philanthropic foundations, US Federal Government (Edu-businesses/philanthropy/public) | Lemann Foundation (Philanthropy) |
| National Service Provider | Polytechnic Institute of Lisbon (Public Higher Education Institution) | 2E es (Edu-business) | CTB/McGraw-Hill, 2012–2015 (Edu-business); Northwest Evaluation, 2015–2016 (Not-for-profit organization); Janison (Ed-tech) | Cesgranrio Foundation (R&D) |
| Authorization | State | State | State | State |
| Type of Schools | School clusters (Public) | Schools/school clusters (Public/private) | Schools (Public/private) | Schools (Public/private) |
Table 2. Participants in the PISA-M pilot.
| Adherent Entities | CIM/CM Participants | Number of Schools per CIM/CM | No. of Participating Schools |
|---|---|---|---|
| CIM AVE | 8 | 33 | 28 |
| CIM Médio Tejo | 10 | 19 | 15 |
| CIM Terras Trás-os-Montes | 9 | 12 | 11 |
| CIM Viseu Dão-Lafões | 14 | 35 | 20 |
| CM Amadora | 1 | 12 | 12 |
| CM Arouca | 1 | 2 | 2 |
| CM Barcelos | 1 | 9 | 9 |
| CM Braga | 1 | 14 | 6 |
| Total | 45 | 136 | 103 |
Table 3. Categories and subcategories of analysis based on [44]’s typology.
| Category | Subcategory | Code |
|---|---|---|
| Diagnostic framing (problems) | 1. Schools (and teachers) lack useful knowledge | Probl1 |
| | 2. Schools (and teachers) do not collaborate and reflect together | Probl2 |
| | 3. Schools are isolated and lack internal and external articulation and coordination | Probl3 |
| Prognostic framing (solutions) | 1. PISA tests provide objective information and expert knowledge | Solut1 |
| | 2. PISA-M guarantees collaborative work and networking | Solut2 |
| | 3. Municipalities ensure effective coordination of the network | Solut3 |
| Frame alignment (beliefs) | Students can improve their learning outcomes and wellbeing | FrAlign1 |
| | Comparisons for improvement promote dialogue and learning, not rankings | FrAlign2 |
| | Municipalities' coordination of the network will not affect schools' autonomy | FrAlign3 |
Table 4. Problems and Solutions framed to justify adherence to PISA-M.
| Problems | Solutions |
|---|---|
| Useful knowledge deficit | Adhering to PISA tests (objective data and expert knowledge) |
| Binomial 1—The knowledge deficit is bridged by adhering to PISA tests | |
| Collaboration deficit | Adhering to PISA-M (collaboration and networking) |
| Binomial 2—The collaboration deficit is bridged by adhering to PISA-M | |
| Coordination deficit | Assigning the network coordination to the municipalities (effective network coordination) |
| Binomial 3—The coordination deficit is bridged by assigning that responsibility to municipalities | |
Costa, E.; Carvalho, L.M. Framing School Governance and Teacher Professional Development Using Global Standardized School Assessments. Educ. Sci. 2023, 13, 873. https://doi.org/10.3390/educsci13090873