Evaluation of Education Programmes and Policies

A special issue of Education Sciences (ISSN 2227-7102).

Deadline for manuscript submissions: closed (30 September 2024) | Viewed by 4327

Special Issue Editors


Guest Editor
School of Education, Durham University, Durham, DH1 1TA, UK
Interests: teacher supply; teacher development; teacher effectiveness; education policy; parental involvement; critical thinking; arts education; evaluation of education programmes; research methods

Guest Editor Assistant
School of Education, Durham University, Durham DH1 1TA, UK
Interests: research synthesis; metacognition and self-regulated learning; teacher supply; evaluation of education programmes; STEM learning; English as a second language (ESL)

Special Issue Information

Dear Colleagues,

The last decade has been an exciting era for education research. It has seen a burgeoning of rigorous evaluations of education programmes and policies, not only in the UK, US and Europe, but also in Latin America, Africa, Asia and elsewhere. New and more robust approaches have been developed, tried and applied to the evaluation of education programmes.

More organisations and countries are recognising the importance of evidence-based practices in education. The Education Endowment Foundation (EEF) in the UK and the Institute of Education Sciences in the US are just two examples of organisations that are committed to evaluating education programmes and policies to ensure that programmes and policies used in schools are based on the best evidence.

International assessments, such as PISA, TIMSS, PIRLS and TALIS, provide valuable data on the performance of students and education systems across different countries and regions, which can be used to inform education policies and practices.  Access to administrative datasets has also made it possible to evaluate the effectiveness of education policies on a large scale. The use of robust research methods and the availability of data have greatly enhanced our ability to evaluate education policies and programmes and make evidence-based decisions. This Special Issue aims to curate a collection of such evaluations.

Why the need to evaluate education programmes and policies?

Whilst many education programmes have been evaluated, an abundance of classroom practices have not been rigorously tested. For example, there is still no clear evidence on whether class-size reduction, streaming or tracking, academic selection, the academisation of schools (in England) or grade retention is beneficial, nor for which phase of education and in what context. Further, the proliferation of education technology and online teaching resources, especially during the lockdowns of the recent COVID-19 pandemic, has seen many of these resources and technologies adopted by schools. Most of these have not been tested, and some may indeed be detrimental to learning.

Evaluating education policies and practices is crucial to ensure that our children receive the best possible education. It is important to recognise that different contexts and different groups of children may require different approaches to education. What works well in one school or community may not work as well in another, and what works for one group of students may not work for another. Evaluating education programmes and policies allows us to understand these nuances and tailor our approach to education to best meet the needs of all students.

In this Special Issue, we would like to invite papers that evaluate education policies, practices and programmes across all phases of education, from early years to higher education. We are particularly interested in rigorous research on the evaluation of education programmes and policies. These could be systematic reviews, meta-analyses and experimental studies, such as randomised controlled trials and quasi-experimental studies, e.g., regression discontinuity and difference-in-differences approaches. We also welcome articles that focus on methodological issues, conceptual pieces on policy evaluation and replication studies.
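To illustrate the logic behind one of the quasi-experimental designs mentioned above, the following is a minimal sketch of a two-group, two-period difference-in-differences estimate. The data, function names and numbers are purely hypothetical and are not drawn from any study in this issue; a real evaluation would use regression with covariates and clustered standard errors.

```python
# Minimal two-group, two-period difference-in-differences sketch.
# All scores below are invented for illustration only.

def mean(xs):
    return sum(xs) / len(xs)

def diff_in_diff(treat_pre, treat_post, control_pre, control_post):
    """Change in the treated group minus change in the control group."""
    treated_change = mean(treat_post) - mean(treat_pre)
    control_change = mean(control_post) - mean(control_pre)
    return treated_change - control_change

# Hypothetical pupil test scores before and after a programme.
treat_pre = [50, 52, 48]
treat_post = [58, 60, 59]
control_pre = [51, 49, 50]
control_post = [53, 52, 51]

effect = diff_in_diff(treat_pre, treat_post, control_pre, control_post)
print(effect)  # prints 7.0
```

The control group's change (here +2 points) stands in for what would have happened to the treated group without the programme, so the estimated effect is the treated group's change (+9) net of that counterfactual trend.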

If you would like to contribute to this Special Issue, please submit an abstract of between 250 and 300 words.

Prof. Dr. Beng Huat See
Guest Editor

Loraine Hitt
Guest Editor Assistant

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a double-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Education Sciences is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and written in clear English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • programme evaluation
  • policy evaluation
  • systematic reviews
  • education interventions
  • experimental designs
  • process or implementation evaluations

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (3 papers)


Research

28 pages, 882 KiB  
Article
Where Are the Costs? Using an Economic Analysis of Educational Interventions Approach to Improve the Evaluation of a Regional School Improvement Programme
by Emma Tiesteel, Richard C. Watkins, Carys Stringer, Adina Grigorie, Fatema Sultana and J. Carl Hughes
Educ. Sci. 2024, 14(9), 957; https://doi.org/10.3390/educsci14090957 - 29 Aug 2024
Viewed by 834
Abstract
Education systems are moving to a more evidence-informed paradigm to improve outcomes for learners. To help this journey to evidence, robust qualitative and quantitative research can help decisionmakers identify more promising approaches that provide value for money. In the context of the utilisation of scarce resources, an important source of evidence commonly used in health and social care research is an understanding of the economic impact of intervention choices. However, there are currently very few examples where these methodologies have been used to improve the evaluation of education interventions. In this paper we describe the novel use of an economic analysis of educational interventions (EAEI) approach to understand both the impact and the cost of activities in the evaluation of a formative assessment implementation project (FAIP) designed to improve teachers’ understanding and use of formative assessment strategies. In addition to utilising a mixed method quasi-experimental design to explore the impact on learner wellbeing, health utility and attainment, we describe the use of cost-consequence analysis (CCA) to help decisionmakers understand the outcomes in the context of the resource costs that are a crucial element of robust evaluations. We also discuss the challenges of evaluating large-scale, universal educational interventions, including consideration of the economic tools needed to improve the quality and robustness of these evaluations. Finally, we discuss the importance of triangulating economic findings alongside other quantitative and qualitative information to help decisionmakers identify more promising approaches based on a wider range of useful information. We conclude with recommendations for more routinely including economic costs in education research, including the need for further work to improve the utility of economic methods. Full article
(This article belongs to the Special Issue Evaluation of Education Programmes and Policies)

11 pages, 227 KiB  
Article
Research, Science Identity, and Intent to Pursue a Science Career: A BUILD Intervention Evaluation at CSULB
by Hector V. Ramos and Kim-Phuong L. Vu
Educ. Sci. 2024, 14(6), 647; https://doi.org/10.3390/educsci14060647 - 15 Jun 2024
Viewed by 676
Abstract
This paper presents an analysis of survey data to examine the association between participating in a National Institutes of Health (NIH)-funded Building Infrastructure Leading to Diversity (BUILD) Initiative program and students’ intent to pursue a career in science. Data were collected from students at California State University Long Beach (CSULB) to examine the effectiveness of the BUILD Scholars program. Both BUILD Scholars and non-BUILD students were surveyed. Propensity score matching was used to generate the non-BUILD comparison group. Multinomial logistic regression results revealed that students participating in the BUILD intervention were associated with significantly higher intent to pursue a career in science. Results also showed the importance of variables such as science identity and research participation when assessing interest in science careers. These findings have implications for STEM program evaluation and practice in higher education. Full article
(This article belongs to the Special Issue Evaluation of Education Programmes and Policies)
12 pages, 794 KiB  
Article
An Investigation of the Cross-Language Transfer of Reading Skills: Evidence from a Study in Nigerian Government Primary Schools
by Steve Humble, Pauline Dixon, Louise Gittins and Chris Counihan
Educ. Sci. 2024, 14(3), 274; https://doi.org/10.3390/educsci14030274 - 6 Mar 2024
Viewed by 1480
Abstract
This paper investigates the linguistic interdependence of Grade 3 children studying in government primary schools in northern Nigeria who are learning to read in Hausa (L1) and English (L2) simultaneously. There are few studies in the African context that consider linguistic interdependence and the bidirectional influences of literacy skills in multilingual contexts. A total of 2328 Grade 3 children were tested on their Hausa and English letter sound knowledge (phonemes) and reading decoding skills (word) after participating in a two-year English structured reading intervention programme as part of their school day. In Grade 4, these children will become English immersion learners, with English becoming the medium of instruction. Carrying out bivariate correlations, we find a large and strongly positively significant correlation between L1 and L2 test scores. Concerning bidirectionality, a feedback path model illustrates that the L1 word score predicts the L2 word score and vice versa. Multi-level modelling is then used to consider the variation in test scores. Almost two thirds of the variation in the word score is attributable to the pupil level and one third to the school level. The Hausa word score is significantly predicted through Hausa sound and English word score. English word score is significantly predicted through Hausa word and English sound score. The findings have implications for language policy and classroom instruction, showing the importance of cross-language transfer between reading skills. The overall results support bidirectionality and linguistic interdependence. Full article
(This article belongs to the Special Issue Evaluation of Education Programmes and Policies)
