Review

Mapping Collaborations in STEM Education: A Scoping Review and Typology of In-School–Out-of-School Partnerships

1  Chair of Educational Psychology and Research on Excellence, Friedrich-Alexander University Erlangen-Nuremberg, 91054 Erlangen, Germany
2  Chair of Educational Psychology and Research on Excellence, University of Regensburg, 93053 Regensburg, Germany
*  Author to whom correspondence should be addressed.
Educ. Sci. 2025, 15(11), 1513; https://doi.org/10.3390/educsci15111513
Submission received: 19 September 2025 / Revised: 24 October 2025 / Accepted: 5 November 2025 / Published: 10 November 2025
(This article belongs to the Topic Organized Out-of-School STEM Education)

Abstract

In-school–out-of-school collaborations are increasingly recognized as a key mechanism for enriching STEM education. Guided by conceptual frameworks on boundary crossing and STEM learning ecologies, this scoping review maps and synthesizes findings from 470 studies and 469 programs published between 2014 and 2024, focusing on how such partnerships are reported, structured, and distributed across educational contexts. Approximately 73% of the programs reported some form of collaboration, although often in general terms. The most common forms included shared infrastructure, recruitment coordination, and personnel involvement. More pedagogically grounded forms, such as curricular alignment and co-development of instruction, were rarely described. Collaboration patterns varied across program types, durations, subject areas, and participant target groups. A typology of seven collaboration categories was developed to organize the findings. Notable gaps include the near-total absence of collaboration in medicine-related programs and the underrepresentation of research from low- and middle-income countries. Although collaboration is frequently mentioned, it is seldom described in enough detail to support systematic analysis or theoretical insight. The review recommends more precise definitions, stronger reporting practices, and enhanced theoretical engagement with collaboration as a pedagogical and systemic component of STEM education. The proposed typology provides a foundation for more coherent future research and comparative studies.

1. Introduction

Over the past two decades, STEM education has evolved into a thriving and well-established field of research. One clear indicator is the proliferation of high-impact STEM-focused journals across educational research. For instance, several leading journals in the field of education, such as the “International Journal of STEM Education”, “Journal of Research in Science Teaching”, and “Educational Studies in Mathematics”, rank among the top quartiles in international citation metrics. In addition, a broader array of journals—such as “Computers & Education”, the “International Journal of Science and Mathematics Education”, and the “Journal of STEM Education”—document the field’s breadth and disciplinary diversity. This surge in publication venues reflects the field’s maturation and the increasing demand for specialized outlets to disseminate research findings.
The rapid increase in scholarly output further illustrates the flourishing nature of STEM education research (Li et al., 2022). Bibliometric analyses reveal that the volume of STEM-related educational publications has expanded dramatically over the past two decades. For example, Li et al. (2020) found an exponential growth of research output and thematic diversification within STEM education, documenting a sharp rise since the early 2000s. Several highly influential reviews have documented and shaped the development of STEM education research. Early conceptual clarifications, such as those by Breiner et al. (2012) and Bybee (2010), laid foundational understandings of STEM’s interdisciplinary nature and educational relevance.
In parallel with its quantitative growth, STEM education research has diversified thematically. Studies now encompass a diverse range of topics, including identity development, equity, interdisciplinary curriculum, and informal learning (Archer et al., 2013; English, 2017; Wang & Degol, 2017). Among these, school-based STEM education remains a central focus (Chiu et al., 2025; Honey et al., 2014), while out-of-school learning, for example, through science centers, competitions, or informal programs, has gained growing attention (Gupta et al., 2020; Falk & Dierking, 2010). Still, there is a clear preponderance of research studies in school settings compared to out-of-school settings (Allen et al., 2019).
A related and increasingly salient area concerns cooperation in STEM education, including cross-sectoral partnerships and inter-institutional networks (Foster et al., 2010; Godec et al., 2022). However, research that explicitly investigates joint in-school and out-of-school STEM collaborations remains scattered, with limited synthesis across models, goals, or outcomes (Braund & Reiss, 2006; Dierking & Falk, 2003). This fragmentation motivates the present scoping review, which aims to map existing research on in-school–out-of-school STEM education cooperation, clarify how such partnerships are conceptualized, and identify areas for future inquiry.

2. Cooperation in STEM Education

The education of an individual is a joint endeavour of many educators. The outcomes depend not only on the competencies of each educator but also on the quality of their collaboration. Thus, education is more than the sum of what each educator contributes to educational outcomes. Examples include the quality of collaboration between teachers (Murawski & Lochner, 2011; Vangrieken et al., 2015), parents and school (Fan & Chen, 2001; Jeynes, 2012), school counsellors and teachers (Clark & Breman, 2009), and multiple partnerships such as school, family and community (Bryan & Griffin, 2010). In terms of the interaction of many actors involved, STEM education is no exception.
The number of actors (individuals, groups, organisations) involved in an individual’s STEM education is immense (schools, families, museums, industries, etc.). Their influence takes place in a variety of settings and situations (Crane et al., 1994; Sahin et al., 2014), such as in the family (e.g., Dou et al., 2025), organised home projects (So et al., 2018), visits to museums and zoos (Hofstein & Rosenfeld, 1996), mobile science labs (Jones & Stapleton, 2017), summer programs (Moore et al., 2022), STEM clubs (Davis et al., 2023), mentoring (Stoeger et al., 2021), watching STEM-related TV and online videos and playing STEM-related video games (Chen et al., 2023), social media (He et al., 2016), and, of course, in-school STEM education. By the latter, we mean formal STEM learning, defined as “any experience or activity that takes place within a normal class period and in a school” (Xia et al., 2025).
Cooperation between schools and external partners is a well-established field of educational research. Several recent reviews have highlighted the crucial role of collaborative partnerships between schools and extracurricular entities in improving educational outcomes (e.g., Anderson-Butcher et al., 2022; McMullen et al., 2020; Mu et al., 2023; Walker & Bond, 2025). More recently, research has also begun to systematically explore various forms of cooperation in STEM education across disciplines, institutions, and sectors. However, it seems that only a fraction of the actors involved in STEM education collaborate, i.e., work together and leverage their respective expertise, resources, and perspectives to attain STEM education outcomes that would be difficult to achieve independently. Nevertheless, a growing body of research indicates that linking school and out-of-school STEM education can have a profoundly positive impact (Fallik et al., 2013; Itzek-Greulich & Vollmer, 2017; Noam & Tillinger, 2004). A parallel body of conceptual work examines the potential benefits of collaboration and partnership in STEM education (Allen et al., 2019; Falloon et al., 2024; Watters & Diezmann, 2016).
Despite their potential to enrich learning experiences and improve access to high-quality STEM education, research shows that often there are only relatively weak connections and coordination between formal and informal learning providers, leading to missed opportunities for supporting and diversifying STEM engagement and participation (Burke & Navas Iannini, 2021; Staus et al., 2023). Moreover, some authors are somewhat sobered by the observation that “the majority of these partnerships break down and fail” (Margherio et al., 2020, p. 1).
It has not yet been possible to determine precisely what causes the difficulties mentioned by some authors and, consequently, how effective and sustainable cooperation between schools and external partners can be established. Terminological and conceptual heterogeneity undoubtedly complicates the search for causes. While “school” is a fuzzy concept, especially when considering the wide variety of educational systems, structures, and practices across different countries and cultures, there is a common understanding of what a school typically represents (Moehlman, 2012; Torres et al., 2022). The terminology for organised out-of-school STEM education, on the other hand, is much more nebulous. Many different terms can be found referring to learning contexts and opportunities outside regular classroom teaching, for example, afterschool programs (Scott-Little et al., 2002), after-school education (Noam et al., 2002), out-of-school STEM enrichment program (Jaggy et al., 2025), extracurricular activities (Eccles et al., 2003; Feldman & Matjasko, 2005), school-based extracurricular activities (Fischer et al., 2014), organized activities (Mahoney et al., 2005), non-formal learning contexts (Maschke & Stecher, 2018), or structured informal contexts (Vadeboncoeur, 2006).
This diversity in terminology and conceptual framing creates significant challenges for synthesizing findings across studies. Without a consistent vocabulary or shared conceptual anchors, the research on in-school–out-of-school STEM collaborations remains fragmented and difficult to navigate. Terms such as “afterschool programs,” “non-formal learning,” or “structured informal contexts” may refer to overlapping but not identical phenomena, and their use often varies across national and disciplinary boundaries. This terminological heterogeneity complicates efforts to identify patterns, evaluate outcomes, or develop cumulative insights. As such, it represents a key methodological challenge—and a strong justification—for undertaking a scoping review that aims to map, organize, and clarify this complex body of work. For this review, we define in-school and out-of-school STEM education collaboration as the intentional exchange, integration, or co-creation of educational resources (e.g., curricula, staff expertise, facilities, technologies) to advance shared STEM learning goals. This working definition is intentionally broad to capture the range of collaborations in the literature.
Although scoping reviews are not always theory-driven in a narrow sense, conceptual framing is essential for interpreting patterns of collaboration and for developing cumulative insight. To guide our synthesis and typology development, we drew on conceptual frameworks from the literature on boundary crossing (Akkerman & Bakker, 2011) and STEM learning ecologies (Archer et al., 2025). These perspectives emphasize the systemic and relational nature of educational collaboration, particularly the challenges and opportunities that arise when actors from different institutional settings—such as schools and out-of-school learning providers—seek to co-construct educational experiences. While the boundary-crossing framework highlights mechanisms of coordination, brokerage, and transformation across institutional borders, the learning ecologies literature situates these efforts within broader structural and equity-related dynamics. Our review leverages these perspectives to organize and interpret the diverse forms of in-school–out-of-school collaboration described in the literature and to illuminate the conceptual gaps that limit current theorization.

3. The Present Study

While past reviews have often focused on student outcomes or single program types (e.g., Li et al., 2020; Staus et al., 2023; Xia et al., 2025), few have examined collaboration as the primary object of analysis. Guided by perspectives on boundary crossing (Akkerman & Bakker, 2011) and STEM learning ecologies (Archer et al., 2025), this review adopts a structural perspective on in-school–out-of-school partnerships. We focus on mapping how collaboration is enacted, reported, and distributed across program contexts while identifying conceptual and systemic patterns relevant to educational coordination. Accordingly, the review is structured around the following research questions:
Research question 1: What are the characteristics of in-school–out-of-school collaborations reported in the literature? More specifically, what types of programs, locations, timing, duration, and subject areas are reported for in-school–out-of-school collaborations?
Research question 2: What are the characteristics of the reviewed studies on in-school–out-of-school STEM education? More specifically, what article types on the topic can be found, and what research designs, measured outcomes, target groups, and sample sizes are reported?
Research question 3: What patterns of collaboration between in-school and out-of-school contexts are reported in the literature, particularly concerning personnel, curriculum, and infrastructure?
Research question 4: How do collaboration patterns vary across program and study characteristics, and what systemic or ecological factors may underlie this variation?
These questions guided the methodological design and data coding procedures, providing the basis for interpreting variation and trends across the included studies.

4. Method

This study employed a scoping review methodology to map the landscape of in-school–out-of-school collaborations in STEM education. Scoping reviews are beneficial in fields where evidence is diverse and terminologies are inconsistently applied (Munn et al., 2018). The review process was guided by the PRISMA Extension for Scoping Reviews (PRISMA-ScR) framework to ensure transparency and replicability (Tricco et al., 2018). The overall analysis consisted of four phases: (1) literature search and screening, (2) inclusion coding and interrater reliability, (3) data extraction and variable coding (cf. Tricco et al., 2018), and (4) typology development for collaboration types. These are described in the following subsections.

4.1. Literature Search and Inclusion Criteria

We searched for relevant articles published between 2014 and 2024 in the following four educational or psychological databases: the Web of Science Core Collection, PsycINFO, ERIC, and PSYNDEX. The search strategy was conducted in two steps. In the first step, we specifically targeted publications that explicitly addressed in-school–out-of-school collaborations by including terms such as “collaboration”, “cooperation”, and “partnership” in the search string. However, a preliminary analysis revealed that this approach yielded only a small number of relevant publications. Consequently, in the second step, the collaboration-related terms were removed from the search string to capture a broader body of research on out-of-school learning. This extended database was then systematically screened to identify studies that contained indications of collaboration with in-school settings, resources, or activities.
This modified approach was developed and refined in consultation with six leading experts in the field to ensure methodological rigor and relevance to the research questions. The final search string included three main groups of terms: (1) STEM subjects (e.g., science, mathematics), (2) out-of-school learning contexts (e.g., museum, summer camp), and (3) program-related concepts (e.g., program, intervention). The search term was constructed in a way that the relevant papers needed to contain at least one of the terms related to STEM subjects, out-of-school activities, and program-related terms. The Appendix A provides full details on the search.
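To illustrate the structure of this search string, the following minimal sketch (in Python, with abbreviated and partly hypothetical term lists) shows how the three groups of terms can be combined so that a record must match at least one term from each group; the authoritative string is given in Appendix A.

```python
# Minimal sketch (not the authors' actual script): assembling the three term
# groups described above into a single Boolean query for a database export.
# The term lists are abbreviated; see Appendix A for the full search string.

stem_terms = ["STEM", "STEAM", "science", "technology", "engineering",
              "mathematics", "physics", "chemistry"]
context_terms = ["out-of-school", "informal", "after-school", "museum",
                 '"summer NEAR/2 camp"', "extracurricular", "makerspace"]
program_terms = ["program", "intervention", "initiative",
                 "cooperation", "collaboration"]

def or_group(terms):
    """Join a term list into a parenthesized OR block."""
    return "(" + " OR ".join(terms) + ")"

# Each record must match at least one term from every group (AND of OR groups).
query = " AND ".join(or_group(g) for g in [stem_terms, context_terms, program_terms])
print(query)
```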
This search yielded 7355 articles, of which 6318 remained after automatic duplicate removal. These articles were then subjected to title and abstract screening. Our inclusion criteria were as follows:
(a)
The article had to be about STEM education, in whole or in part.
(b)
The article had to be about an out-of-school program and target school students as its primary audience.
(c)
The article needed to be a primary study.
We thus excluded articles unrelated to STEM education and those that did not report on out-of-school programs. Articles that reported extracurricular or informal programs for university students or children under school age were excluded, as they fell outside of our out-of-school definition. Reviews, meta-analyses, or research syntheses of any kind were also excluded. No limitations were placed on the language, publication, or peer-review status of the documents, as we sought to gain a broad overview that could benefit from dissertations, research reports, and peer-reviewed journal articles.
The exclusions from the abstract coding followed a hierarchical logic. If an article was not concerned with STEM education, we excluded it without checking whether it was about an out-of-school program or a review. Six hundred seventy-seven articles remained after the article screening. The reasons for exclusion and the overall coding are outlined in the PRISMA flow chart in Figure 1.
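The hierarchical exclusion logic can be illustrated with a minimal sketch (hypothetical field names; not the coding tool actually used), in which an abstract is checked against the criteria in order and the first failed criterion determines the recorded exclusion reason.

```python
# Minimal sketch of the hierarchical exclusion logic described above
# (hypothetical field names): an abstract is checked against the criteria in
# order, and the first failed criterion is recorded as the exclusion reason.

def screen_abstract(record):
    """Return 'include' or the first applicable exclusion reason."""
    if not record["is_stem_education"]:
        return "excluded: not STEM education"
    if not record["is_out_of_school_program_for_school_students"]:
        return "excluded: no out-of-school program / wrong target group"
    if not record["is_primary_study"]:
        return "excluded: review, meta-analysis, or synthesis"
    return "include"

# Example: a review article about a STEM summer camp is excluded at the
# third step, not the second.
print(screen_abstract({
    "is_stem_education": True,
    "is_out_of_school_program_for_school_students": True,
    "is_primary_study": False,
}))
```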
Five coders (four educational researchers and one student assistant) screened the abstracts for relevance. All coders first familiarized themselves with the coding manual. Given the team of five coders, ten coder–pair combinations were possible. Eight of these pairs independently screened between 45 and 100 articles to assess interrater agreement, resulting in Cohen’s kappas ranging from 0.53 to 0.95. One pair’s agreement can be interpreted as moderate and two as substantial; the remaining five pairs reached almost perfect agreement (McHugh, 2012). Disagreements in the coding were resolved through discussion within the respective coder dyads, and the remaining team members were consulted in cases that proved difficult to resolve. After establishing a satisfactory agreement, the remaining articles were coded by one of the five coders; other coders could be consulted in cases of uncertainty.
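For readers wishing to reproduce the agreement statistics, the following sketch shows how Cohen’s kappa for a single coder pair could be computed with scikit-learn; the inclusion decisions shown are illustrative, not drawn from our data.

```python
# Minimal sketch (assuming scikit-learn is available) of how agreement between
# one coder pair could be quantified; the decisions below are illustrative.

from sklearn.metrics import cohen_kappa_score

coder_a = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]  # 1 = include, 0 = exclude
coder_b = [1, 1, 0, 1, 0, 1, 1, 1, 0, 0]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa for this coder pair: {kappa:.2f}")
```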

4.2. Data Extraction and Coding

As the next step in the data analysis process, the four researchers screened the full texts of the remaining studies. The inclusion criteria described earlier were reapplied to ensure consistency. Articles identified at this stage as not meeting the inclusion criteria were excluded from further analysis. Additionally, some full texts could not be retrieved despite attempts to contact the authors via ResearchGate or email (37 articles). This process resulted in a final dataset of 470 articles for in-depth coding and synthesis. A complete list of the 470 included studies is provided in the Supplementary Materials.
Each researcher was responsible for some of the remaining articles and recorded the relevant data in a customized spreadsheet. We utilized a shared coding framework comprising standardized categories to ensure consistency. Each program described in the studies was treated as a separate entry. In cases where a single article included multiple programs, each was coded independently.
The included articles were then coded according to study-level characteristics and the presence of a program description. Study-level variables included (a) research design (qualitative, quantitative, or mixed methods, with further differentiation among quantitative subtypes); (b) article type (e.g., empirical, theoretical, descriptive); (c) sample size; (d) whether a follow-up or longitudinal design was used; (e) country of the first author and the sample; and (f) measured outcomes. If a program description was present, it was coded according to the following program-level variables: (a) type of program (e.g., summer camp, mentoring); (b) subject area; (c) location of delivery (e.g., in-school, out-of-school, hybrid, online); (d) timing (during school, after school, or both); (e) whether the program was conducted during regular school days or outside the school calendar; (f) duration; and (g) target group. If an article described several out-of-school programs, each program was coded separately. Variables were coded into consistent categories to enable cross-study comparison and pattern analysis.
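As an illustration of the shared coding framework, the sketch below represents the study- and program-level variables as one row per program in a tabular coding sheet; the column names and the example entry are hypothetical and abbreviated relative to the full coding manual.

```python
# Minimal sketch of the shared coding framework as a tabular schema
# (hypothetical column names); each row corresponds to one program entry,
# so an article describing two programs contributes two rows.

import pandas as pd

columns = [
    # study-level variables
    "research_design", "article_type", "sample_size", "longitudinal",
    "country_first_author", "country_sample", "measured_outcomes",
    # program-level variables
    "program_type", "subject_area", "location", "timing",
    "school_calendar", "duration", "target_group",
]

coding_sheet = pd.DataFrame(columns=columns)
coding_sheet.loc[0] = ["mixed methods", "empirical", 120, False,
                       "USA", "USA", "interest; self-efficacy",
                       "summer camp", "engineering", "out of school",
                       "after school", "outside school calendar",
                       "1-3 weeks", "students (grades 6-8)"]
print(coding_sheet.T)
```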
Additionally, if a program description was present in the article, the program contents were coded using constructed categories (e.g., lessons, hands-on activities, field trips) to allow comparability across the diverse initiatives. If collaboration between in-school and out-of-school actors was described, the specific forms of collaboration were categorized using a predefined scheme (see Table 1).

4.3. Collaboration Typology Development and Coding

As our primary goal for this scoping review was to provide an overview of the ways collaboration between in-school and out-of-school STEM education can occur, we needed categories to systematically assess the different forms of collaboration that can take place between in-school and out-of-school settings. To achieve this, we searched all articles in our database by close reading and looking for the program description (or descriptions, in cases of multiple programs) that would allow us to answer questions regarding the collaboration. If no program description was available or no form of collaboration was described, the publication was categorized as ‘no info’. Additionally, after reviewing examples from the articles and discussing the results as a team, we identified seven main categories, which represent different thematic areas of the collaboration. The different categories, along with a brief explanation and their corresponding subcategories, are listed in Table 1. For each out-of-school program, we assessed the types of collaboration present.
Importantly, many forms of collaboration can originate from either the in-school or the out-of-school partner. For example, under the category of infrastructural collaboration, a school might lend tablets to students participating in a field trip. Conversely, an out-of-school provider might supply laptops for students attending an after-school program held on school premises. Both instances represent infrastructural collaboration, illustrating that contributions can flow from both sides of the partnership.
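The coding of collaboration types can be illustrated with a minimal sketch in which each program receives a set of category codes (labels paraphrased from Table 1) or the code ‘no info’ when no collaboration is described; the example program and its codes are hypothetical.

```python
# Minimal sketch of coding one program against the seven collaboration
# categories (labels paraphrased from Table 1). A program can receive several
# codes, and 'no info' applies when no collaboration is described.

COLLABORATION_CATEGORIES = {
    "personnel", "infrastructural", "recruitment", "curricular",
    "funding", "didactic", "generic",
}

def code_program(description_flags):
    """Return the set of collaboration categories present, or {'no info'}."""
    present = {cat for cat, flag in description_flags.items() if flag}
    unknown = present - COLLABORATION_CATEGORIES
    if unknown:
        raise ValueError(f"Unknown categories: {unknown}")
    return present or {"no info"}

# Example: a program where a school lends tablets and recommends participants.
print(code_program({"infrastructural": True, "recruitment": True,
                    "didactic": False}))
```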

5. Results

5.1. Overview of Included Programs

This scoping review draws on 470 studies about out-of-school STEM learning: 48 unpublished dissertations and theses, 10 reports, and 412 articles published in scholarly journals. Of these, 398 publications contain a description of one or more out-of-school STEM education programs, totalling 469 program descriptions. These programs differ considerably in structure, duration, subject focus, and pedagogical orientation. They aim to engage school-aged learners in STEM-related learning opportunities outside formal classroom instruction (e.g., Bell et al., 2009). The included programs range from brief interventions, such as one-day events or week-long workshops, to multi-year initiatives embedded in broader institutional strategies. They are implemented in diverse settings, including science centres, universities, school buildings, online platforms, and community-based venues. Collectively, they offer a cross-section of contemporary practice in out-of-school STEM education and provide a robust empirical basis for analyzing how collaboration between schools and external partners is conceptualized and enacted. The following sections present a detailed synthesis of program characteristics, research designs, and the types and patterns of collaboration observed across the included studies.

5.2. Program Characteristics

5.2.1. Type of Program

The dataset includes a range of program formats, reflecting the heterogeneity of out-of-school STEM education. Table 2 displays the ten most frequently reported program types across the reviewed studies, based on consolidated labels (e.g., Krishnamurthi et al., 2014). This means that we grouped synonymous labels where appropriate, while preserving the diversity of the original terminology. The most common formats were after-school programs and summer programs, followed by camps and workshops. These programs are typically designed to foster extended engagement with STEM topics through hands-on learning, project-based work, and exploratory activities.
Beyond these frequently recurring types, the dataset also includes a range of specialized formats, such as mentoring programs, hackathons, robotics competitions, and virtual makerspace initiatives. While some appeared only once or twice, they illustrate the field’s creative diversity.

5.2.2. Subject Areas

The disciplinary scope of out-of-school STEM programs offers insight into which areas receive the most attention and how STEM is interpreted in practice. The five most frequently mentioned domains were general STEM (N = 117), science (N = 108), mathematics (N = 64), computer science (N = 39), and engineering (N = 30). These core fields reflect the continued emphasis on the main STEM subject areas within out-of-school education. Other commonly addressed subjects included STEAM (N = 23), biology (N = 19), robotics (N = 16), technology (N = 13), chemistry (N = 11), and physics (N = 10). Fewer programs focused on more specialized or interdisciplinary areas such as medicine, biomedical science, environmental science, coding AI, and spatial thinking.
Subject labelling varied considerably across studies. Some programs were described using broad umbrella terms (e.g., “STEM,” “STEAM,” or “STEMM”), while others specified narrow topics (e.g., “cybersecurity” or “pulsar science”) or situated their program within a broad discipline while also naming a narrow topic (e.g., “science: cancer research”). In such cases, multiple labels were assigned (e.g., “science”, “cancer research”). For clarity, closely related categories were grouped where appropriate, preserving the diversity of labels found in the literature. Table 3 summarizes the 11 most frequently mentioned subject areas.

5.2.3. Location and Timing

Table 4 gives an overview of the location- and timing-related categories analyzed in this study. Of the 469 program descriptions included in this review, 61 did not provide any information on the program location. Of the remaining ones, most were implemented exclusively out of school (N = 257), underscoring the prominence of informal and non-formal learning settings in STEM enrichment efforts. A smaller proportion occurred solely within school facilities (N = 75), while hybrid models—combining in-school and out-of-school elements—were reported in 35 cases. In addition, a subset of programs was delivered exclusively online (N = 11), reflecting the growing role of digital learning environments in extracurricular STEM education. Some programs reported a combination of an online and either an in-school or out-of-school setting (N = 8).
A total of 142 program descriptions did not provide any scheduling information. Of the remaining ones, most programs were offered after school (N = 263), with only a limited number taking place during school hours (N = 47). A small number of programs combined both formats (N = 17). These timing patterns align with the general goal of out-of-school STEM programs, which is to supplement rather than replace formal instruction.
A secondary timing variable concerned whether the programs were offered on school days or outside of the regular school calendar, such as during summer vacations or holidays. Here, 140 descriptions did not provide relevant information. Of the remaining programs, those scheduled outside regular school days (e.g., summer) accounted for 147 cases, which is nearly equal to the number offered during the school term (N = 148). An additional 34 programs spanned both types of timing. Together, these findings highlight the structural flexibility of out-of-school STEM initiatives and their responsiveness to institutional and learner scheduling constraints.

5.2.4. Duration

The reviewed programs differed considerably in duration, highlighting the structural flexibility that characterizes many out-of-school STEM initiatives. While some lasted only a day or a few days, others extended over several months or more than a year. The largest group spanned one to three months in duration (N = 78). This was followed by programs that lasted less than one week (N = 60) or between one and three weeks (N = 56), suggesting that many initiatives are designed as short, intensive learning experiences—often in workshop or compact camp formats. A smaller portion covered a whole academic semester or school year (N = 38). Long-term programs lasting more than one year were relatively rare but still notable (N = 23), reflecting efforts to provide sustained engagement in STEM pathways. Only one program was classified as self-paced, and 15 programs were explicitly identified as summer-based. Another 58 program descriptions provided details about the frequency of program sessions without specifying the overall duration; these programs ranged from one to 39 sessions. However, these figures are likely underestimates, given inconsistencies in reporting: 130 programs provided no information on their temporal extension at all. Programs designed to extend over longer periods, in particular, have great potential to facilitate continuity and alignment between in-school and out-of-school learning environments.

5.3. Characteristics of the Reviewed Studies

In addition to describing the programs themselves, we analyzed the methodological and publication characteristics of the studies (N = 398) that reported on them.

5.3.1. Research Designs and Article Types

The studies included in this review applied a variety of methodological approaches. Most were based on qualitative research designs (N = 154), while quantitative methods were used in 96 cases. Mixed-methods designs were reported in 107 studies. This distribution likely reflects the heterogeneity of programs and the different types of questions addressed in the literature. A smaller number of publications were not empirical but described programs (N = 33) or advanced theoretical perspectives (N = 8).
Among the empirical studies, the most frequently used design was the case study (N = 155), typically drawing on interviews, observations, or document analyses. Additional designs included pre-post studies without control groups (N = 106), experimental designs (N = 54), descriptive research (N = 48), phenomenological approaches (N = 23), and ethnographic approaches (N = 16). Some studies combined methods—for instance, using pre-post evaluations embedded within broader case studies—illustrating the field’s tendency toward methodological pluralism. Table 5 provides an overview of the most frequently reported research designs.

5.3.2. Measured Outcomes

The studies reviewed examined a concentrated set of outcome constructs related to student learning and development. Table 6 presents the nine most frequently reported categories, which were consolidated from overlapping labels. The most common outcomes included interest, engagement, and perceptions, followed closely by skills, knowledge, and attitudes. Measures related to self-efficacy, identity, and social–emotional factors were also frequently reported, though with slightly lower frequencies.
The dominance of affective and motivational outcomes reflects a broader trend in STEM education research, which increasingly emphasizes student dispositions—such as interest, identity, and self-efficacy—as predictors of long-term engagement (Maltese & Tai, 2011). Meanwhile, academic achievement and content mastery were rarely the primary focus, pointing to a broader preference for studying how students experience and engage with STEM learning.

5.3.3. Target Groups and Sample Sizes

Most programs targeted school-aged students, although the level of specificity varied across studies. Many referred to “students” in general or did not specify a particular age group (N = 65), while others identified more defined groups such as high school students (e.g., grades 9–12: N = 55), middle school students (e.g., grades 6–8: N = 46), or the entire K-12 span (N = 7). A few programs included older participants, such as undergraduate students or pre-service teachers, though these were rare.
Sample sizes of the included programs (N = 469) also varied considerably. For 88 programs, no information was given regarding the sample sizes, or the information was too vague or inconsistent to categorize. Of the remaining programs, 56% (N = 213) involved small samples of fewer than 50 participants, while 84 programs reported samples ranging from 50 to 200 participants, and another 84 programs included large samples comprising more than 200 participants. This further illustrates the wide range of empirical scope across the reviewed studies, from small, exploratory interventions to larger-scale evaluations.
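As an illustration, the following sketch (with invented sample sizes) shows how reported sample sizes can be binned into the three categories used here.

```python
# Minimal sketch (hypothetical data) of binning reported sample sizes into the
# three categories used above: <50, 50-200, and >200 participants.

import pandas as pd

sample_sizes = pd.Series([12, 48, 60, 150, 210, 800, None])  # None = not reported
bins = pd.cut(sample_sizes, bins=[0, 49, 200, float("inf")],
              labels=["<50", "50-200", ">200"])
print(bins.value_counts(dropna=False))
```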

5.3.4. Geographic Distribution

Table 7 gives an overview of the most frequently reported geographic origins of study samples and first authors. Of the 469 programs, 46 did not provide information about the sample’s geographic origin. Most reported samples from the United States (N = 276), reflecting the dominance of U.S.-based research in out-of-school STEM education. Other frequently represented sample locations included the United Kingdom (N = 20), Turkey (N = 19), Australia (N = 9), and Germany (N = 9). Additional samples were drawn from Spain (N = 8), Israel (N = 7), Canada (N = 6), and China (N = 6), among others. A small number of studies featured joint or international samples (e.g., “UK and USA”).
In terms of first author affiliation, the United States also dominated (N = 235), followed by the United Kingdom (N = 18), Turkey (N = 17), Spain (N = 8), Australia and Canada (N = 7, each), and Germany and Israel (N = 6, each). A small number of studies featured joint or international affiliations (e.g., “Sweden and Germany”), and 39 studies had missing author location data. This distribution highlights the concentration of research activity in high-income, English-speaking countries, although contributions from a broader international base are also emerging.

5.4. Collaboration Patterns

This section summarizes how in-school–out-of-school collaboration is reported and structured in the reviewed studies, highlighting patterns that may reflect different forms of boundary negotiation, coordination, or integration. We begin by examining the frequency of collaboration reported across programs, followed by a synthesis of the types of collaboration described. A more detailed analysis of how collaboration patterns vary across different program and study characteristics is presented separately in the next section.

5.4.1. Reporting of Collaboration

Out of the 469 programs included in the review, 354 (approximately 75%) reported some form of collaboration between in-school and out-of-school stakeholders. The nature and depth of these collaborations varied considerably. In some cases, schools were involved primarily in promoting or facilitating access to out-of-school offerings—for example, by distributing flyers or recommending participants. In others, the collaboration was more substantive, involving shared planning, personnel exchange, or curricular alignment.
Roughly one-quarter of the programs (25%) did not mention collaboration or provided too little detail to determine whether schools were involved. These programs presented out-of-school activities as self-contained initiatives without explicitly linking to formal education settings. This indicates that collaboration is common—but far from universal—and that the extent to which it is described varies widely across studies.

5.4.2. Types of Collaboration

The analysis of program descriptions revealed a wide range of collaboration types between in-school and out-of-school stakeholders. Among the 354 programs that reported collaboration, many combined multiple forms of interaction. Using a seven-category framework developed during coding, we identified the following patterns (see Table 8):
Personnel collaboration was reported most frequently (N = 147) and included the involvement of teachers in out-of-school activities, as well as professional development or joint staffing. Infrastructural collaboration (e.g., shared locations, materials, or scheduling) was the second most frequently mentioned type (N = 104). Among these, location-related collaborations dominated, featuring in 86 programs that referred to the use of school facilities for out-of-school activities or vice versa. Recruitment-based collaboration was also common (N = 87), often involving schools recommending participants, distributing information, or selecting students for external programs. Curricular alignment was explicitly reported in 71 programs, frequently framed as efforts to link the out-of-school content to state or national standards. Funding arrangements were mentioned in 27 cases, typically involving joint sponsorships or compensation for school-based involvement. Didactic collaboration, though more difficult to identify unambiguously, appeared in 16 programs, typically when methods were co-developed with school personnel or adapted to classroom practices. Finally, 31 programs reported collaboration only in generic terms, i.e., they mentioned partnerships with universities, industry, or other institutions but did not further specify the functional mechanisms.
In sum, infrastructural, recruitment, and personnel collaborations emerged as the most frequently reported forms of in-school–out-of-school linkage in STEM programs.

5.5. Variation in Collaboration Patterns

This section examines how collaboration patterns vary across program and study characteristics, with a focus on potential systemic or ecological influences on the enactment and reporting of collaborative practices. Such patterns may reflect underlying structural or contextual constraints—for example, the timing of a program, its disciplinary focus, or the setting in which it is implemented. By analyzing these dimensions, we identify where collaboration most likely occurs and what forms it takes in different educational settings.
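As an illustration of the analytic approach, the sketch below (with hypothetical rows and column names) computes the share of programs reporting collaboration within each category of a program characteristic; the same cross-tabulation pattern underlies the results reported in this section.

```python
# Minimal sketch (hypothetical rows and column names) of tabulating
# collaboration rates by a program characteristic from the coding sheet;
# the same pattern applies to setting, timing, duration, or subject area.

import pandas as pd

programs = pd.DataFrame({
    "location": ["out of school", "in school", "hybrid", "out of school",
                 "in school", "online", "hybrid", "out of school"],
    "collaboration_reported": [True, True, True, False, True, False, True, False],
})

# Share of programs reporting collaboration within each location category.
rates = pd.crosstab(programs["location"],
                    programs["collaboration_reported"],
                    normalize="index")
print(rates.round(2))
```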

5.5.1. Program Setting

Collaboration between in-school and out-of-school actors was most frequently reported in programs that either spanned multiple settings or were explicitly embedded in school contexts. Among programs delivered in both in-school and out-of-school locations, 100% reported collaboration. Programs that combined online and physical locations (e.g., ‘online, in school’, ‘out of school, in school’) also showed 100% collaboration reporting, though these categories were numerically small.
Collaboration patterns varied by delivery setting. Among the 75 in-school programs, 92% (N = 69) reported collaboration. Programs delivered outside the school premises, comprising the largest category (N = 257), had a lower collaboration rate of 63.8% (N = 164). Collaboration was also reported in 8 of the 11 online programs, though the small sample size limits interpretability.
Timing also shaped collaboration patterns. Programs conducted after school were by far the most frequent, with 182 of 263 programs (69.2%) reporting collaboration. Those that spanned both during- and after-school time slots also frequently included collaboration (13 of 17 programs), as did programs conducted exclusively during school hours (38 of 47 programs)—a small but notable subset.
These findings suggest that programs integrated into the temporal and physical structures of school life are particularly likely to involve collaboration with school personnel or utilize existing school infrastructure. Hybrid and in-school formats, in particular, appear to invite closer coordination, perhaps due to logistical necessity or policy mandates, whereas purely external programs without formal school linkage show more variation.

5.5.2. Program Duration and Type

Collaboration rates varied considerably depending on the program’s structural format, particularly its duration and type. While collaboration was observed across all duration categories, reporting was not uniform. Programs lasting one to three months (N = 78) reported collaboration in 60% of cases, and one-to-three-week programs (N = 56) showed a somewhat higher rate of 65.5%.
Programs lasting longer than six months, by contrast, were less consistent, with only 50% reporting collaboration, possibly due to the difficulty of sustaining partnerships over extended periods. Overall, the data suggest that mid-range durations (e.g., 1 week to 6 weeks) may offer optimal conditions for collaboration: long enough to justify coordination, but short enough to remain manageable.
Differences were also observed between program types. After-school programs showed the most consistent reporting, with 75% of such programs (N = 103) reporting collaboration. Other formats were more variable. For example, camps (N = 44) and summer or vacation programs (N = 80) reported collaboration rates of 59% and 55%, respectively, and workshops (N = 37) reported collaboration in only 58% of cases. Outreach programs (N = 24), by contrast, had higher rates (83%), likely reflecting their community-oriented goals. Interestingly, no collaboration was reported at all in the seven programs labelled “STEMM program,” which may reflect underreporting or a structural separation between the domains.

5.5.3. Subject Areas

Collaboration rates varied across subject areas, suggesting that a program’s disciplinary content may influence both the need for and the form of in-school–out-of-school coordination. The most frequently represented programs were broadly labelled as STEM (N = 117) or science (N = 107). Within these groups, collaboration was reported in 71.4% and 75.6% of programs, respectively. While these are moderately high proportions, more specific subject areas often showed clearer or more consistent collaboration patterns.
For instance, programs focused on engineering (N = 30) had one of the highest collaboration rates (79.4%), possibly reflecting the infrastructural or institutional demands of engineering-related learning (e.g., labs, tools, or industry involvement). Similarly, mathematics programs (N = 64) and robotics programs (N = 16) also showed high collaboration rates of 78.3% and 83.3%, respectively.
Some of the highest collaboration rates were found in biology (89.5%) and STEAM (94.7%) programs, though these categories were smaller (N = 19 and 23, respectively). In contrast, computer science programs (N = 39) showed somewhat lower collaboration rates (62.9%), and technology programs (N = 13) were even less likely to report collaborative elements (50.0%). Notably, none of the medicine-related programs (N = 8) reported collaboration. Although this number is too small to support broad generalizations, it may indicate that such programs are often organized independently by hospitals or medical faculties, without established links to schools. These patterns suggest that collaboration is not evenly distributed across subject areas. Fields that involve technical infrastructure (e.g., engineering), interdisciplinary integration (e.g., STEAM), or project-based learning (e.g., robotics) may more naturally invite cooperation between formal and informal learning providers.

5.5.4. Research and Target Group Characteristics

Collaboration reporting was also influenced by the study design and the program’s target population. Collaboration was reported across all major research designs, with similar overall frequencies. Qualitative studies—the most frequent type (N = 154)—reported collaboration in 73.7% of cases. Quantitative studies (N = 144) showed a nearly identical rate (75.0%), while mixed-methods studies (N = 96) reported collaboration somewhat more frequently (78.8%). Studies labelled as program descriptions or theoretical contributions were more variable in their content. For instance, “program description” articles (N = 33) reported collaboration in 57.2% of cases, and theoretical papers (N = 8) in 60.0%. These lower rates likely reflect differences in reporting conventions and objectives, rather than a lack of collaboration in practice.
Target group classification varied widely, with some entries using general terms like “students” and others referring to specific age groups or grade levels. Programs targeting middle school students, specifically grades 4–8, showed particularly high collaboration rates, reaching 100% in several categories. For example, all reviewed programs directed at “students (grades 6–8)” (N = 46) and “middle and high school students” (N = 4) reported collaboration. In contrast, more general entries, such as “students” (N = 21) and “high school students” (N = 10), reported lower rates of 52.4% and 50.0%, respectively. Broader labels may correspond to less detailed program descriptions, where collaborative components may not have been emphasized or explicitly stated. Programs with clearly defined age groups or a narrower educational focus were more likely to report structured collaboration, possibly reflecting the design intent and more precise reporting in studies with targeted populations.

6. Discussion

As conceptualized in the literature on STEM learning ecologies and boundary crossing, collaboration between in-school and out-of-school actors is increasingly recognized as essential to fostering equitable and engaging STEM education (National Academies of Sciences, Engineering, and Medicine, 2016). Nevertheless, while these collaborations are growing, research has not fully explored how they are described, organized, or conceptualized (Denton & Borrego, 2021; Salame et al., 2025). This scoping review aims to clarify these gaps by mapping the landscape of collaborations between in-school and out-of-school actors in STEM education programs. Drawing on 469 programs described across 398 studies, our analysis reveals that collaboration is a frequently reported feature, with approximately 73% of programs including some form of cooperative structure. However, the nature and depth of these collaborations vary considerably.
The most common forms of collaboration involved infrastructure sharing, recruitment coordination, and personnel exchange. Less frequently, studies reported curricular alignment, co-developed didactics, or shared funding mechanisms. Importantly, collaboration was not evenly distributed across program contexts. It was most prevalent in hybrid or in-school settings, after-school formats, and programs of moderate duration (e.g., one to six weeks). Specific subject areas—particularly engineering, STEAM, and robotics—showed higher collaboration rates, while others, such as medicine or computer science, reported fewer structured linkages.
Study-level patterns echoed this variability. Mixed-methods studies reported collaboration slightly more frequently than qualitative or quantitative designs, and programs with clearly defined target groups (e.g., middle school learners) tended to document more collaboration than those with broader or ambiguous participant labels.
These findings suggest that while collaboration is widely adopted in principle, its implementation is influenced by program structure, disciplinary culture, and reporting practices. The review not only highlights where collaboration occurs, but also exposes where it is absent, underdeveloped, or inconsistently reported—laying the foundation for the conceptual and practical insights discussed in the following sections.
The findings of this review point to a notable conceptual asymmetry in how collaboration between in-school and out-of-school actors is approached in STEM education research. While collaboration is frequently mentioned, few studies embed it in a theoretical framework that explains its mechanisms, outcomes, or conditions for success (Epstein et al., 2018). Instead, most studies treat collaboration as an organizational feature, described in practical terms (e.g., shared facilities, joint recruitment) but not examined conceptually or theoretically (Hofstein & Rosenfeld, 1996; De Jong et al., 2022). This concern echoes recent critiques of the “ecosystem” metaphor, which highlight that prevailing models often obscure the structural inequalities and competitive dynamics that shape collaboration between formal and informal learning sectors (Archer et al., 2025). This lack of theorization is particularly evident in the less frequently reported categories, such as didactic co-development, curricular alignment, or joint instructional design. These forms of collaboration are arguably the most pedagogically consequential, yet they are also the most conceptually opaque in the literature. Their rarity may reflect implementation challenges and the absence of guiding constructs, such as those offered by the literature on boundary crossing, distributed expertise, or networked learning, that could help interpret these more complex forms of pedagogical collaboration (Akkerman & Bakker, 2011).
Moreover, the variation across disciplines suggests that subject-specific teaching and learning cultures influence how collaboration is imagined and enacted (Becher & Trowler, 2002). For instance, engineering programs often involved infrastructural sharing, while biology and STEAM initiatives were more likely to feature co-developed programming. In sum, the review highlights the need to move beyond viewing collaboration as a logistical arrangement and toward conceptualizing it as a pedagogical practice shaped by relationships, epistemologies, and disciplinary norms.
The review highlights several methodological features that characterize current research on in-school–out-of-school collaboration in STEM education. One striking observation is the uneven quality and precision with which collaboration is reported. In many studies, it remains unclear how collaboration was initiated, who the relevant actors were, or what roles they played. Even when explicitly mentioned, collaboration is often described in broad terms, with few details about its organizational or pedagogical structure. This lack of specification makes it challenging to evaluate how the collaboration functioned, or to link its structure to the outcomes examined, highlighting a need for more consistent reporting standards in line with existing guidance for scoping syntheses (Tricco et al., 2018).
The reviewed literature leans heavily on qualitative approaches, with case studies and descriptive program evaluations being the most frequently cited. These studies offer rich contextual information and often capture the situated realities of STEM programming. However, they rarely enable comparisons across programs or provide a basis for generalizations (Sutton & Austin, 2015). Quantitative and mixed-methods studies are also present, but designs involving control groups, follow-ups, or longitudinal components were comparatively rare. As a result, very little is known about the longer-term effects of collaboration or the stability of partnership structures over time.
Another issue concerns the definition and operationalization of key variables. Constructs such as engagement, motivation, or identity are frequently used but not always clearly distinguished. Program characteristics, such as duration or delivery setting, are inconsistently reported and often ambiguous. These inconsistencies complicate efforts to synthesize findings across studies and limit the potential for cumulative knowledge building (Banerjee et al., 2024; Li et al., 2020).
Finally, there is relatively little use of theoretically informed coding frameworks, particularly in relation to the collaborative dimension itself. While this review introduces a seven-part typology of collaboration, the lack of comparable frameworks in the existing literature meant that few studies could be confidently mapped onto such a structure. Going forward, the field would benefit from more consistent use of shared descriptors and analytic categories, especially in multi-case and comparative research.
In summary, the reviewed literature highlights the growing importance and conceptual fragility of school-external collaboration in STEM education. Partnerships are frequently mentioned, but how they function and why they matter is often underexplored, and rarely situated within a broader theoretical framework. Advancing this field will require stronger conceptual frameworks, more precise reporting, and deliberate efforts to bridge disciplinary and institutional boundaries (Akkerman & Bakker, 2011). As this review suggests, boundary-crossing and ecological perspectives offer useful tools for conceptualizing collaboration not merely as logistical coordination but as a pedagogical and systemic practice.

6.1. Limitations

This review has several limitations, some inherent to the scoping review methodology (Tricco et al., 2018), while others are more specific to our topic and data sources. One limitation concerns the scope of the literature search. Although the strategy was carefully developed, it focused on selected educational and psychological databases. Studies published in other fields or in non-indexed outlets, such as institutional reports or dissertations, may not have been fully captured, particularly those outside the English-speaking academic world (Beller et al., 2013).
A second limitation pertains to how collaboration is reported in the literature. Our coding relied on what was explicitly described in the articles. It is likely that some forms of collaboration—especially less formal or incidental ones—took place but were not documented in sufficient detail to be identified (Penuel et al., 2015). This is particularly relevant for more integrated forms of cooperation, such as shared lesson planning or informal professional exchange, which are often underreported.
A third issue concerns the coding process itself. While we worked with a predefined framework and ensured interrater agreement during the initial stages (Campbell et al., 2020), applying this framework to such a diverse body of literature inevitably involved judgment calls. Not all program descriptions mapped easily onto our categories, and the granularity of reporting varied widely between studies.
Finally, although we could categorize the presence and type of collaboration, we did not—and could not—assess its depth, quality, or effectiveness. The studies reviewed rarely provided the kind of rich process data that would allow for such evaluations (Durlak & DuPre, 2008). As a result, the findings presented here should be understood as descriptive rather than evaluative. Nevertheless, the present review constitutes an important first step toward understanding the prevalence of collaboration patterns. Building on this overview, future research should investigate the effectiveness of different forms of collaboration in greater depth.

6.2. Directions for Future Research

The review not only reveals where collaboration in STEM education is already well-documented but also where it remains underexplored or inconsistently addressed. While many programs included some form of partnership between in-school and out-of-school actors, several areas stood out due to their lack of reporting or unexpectedly sparse patterns, given the collaborative demands of the field.
To highlight some of these areas, we compiled a set of high-priority evidence gaps—combinations of program features and collaboration types that were either rarely observed, insufficiently described, or entirely missing. These gaps were identified not only based on low frequency but also in light of broader expectations rooted in educational theory, policy documents, and practice-oriented literature. Table 9 summarizes the selected gaps and provides brief justifications for why they warrant further attention.
While the list is not exhaustive, it provides a starting point for researchers, funders, and practitioners seeking to extend the field in productive directions. Several of these gaps point to areas of high practical and conceptual relevance that have received limited attention to date. Future work might focus on understanding why these gaps exist, whether they reflect barriers in implementation, reporting, or conceptual framing, and how they might be addressed through targeted empirical or design-based research. Such work might also draw more explicitly on conceptual lenses such as boundary crossing or ecological models of STEM learning to analyze how collaboration functions across institutional divides and to design more integrated partnership models.

7. Conclusions

This scoping review examined how collaboration between in-school and out-of-school actors is represented in the STEM education literature. Drawing on a broad and diverse body of studies, it shows that while collaboration is widely reported, it is often described in general terms and unevenly distributed across program types, subject areas, and research designs. Some forms of collaboration, such as shared infrastructure and recruitment, are frequently observed; others, including didactic co-development and funding partnerships, are much less common.
The review also identifies areas where collaboration is largely absent, or at least not reported or documented, despite being educationally or institutionally relevant. While we do not claim that these omissions represent the entire practice landscape, they highlight collaboration types and contexts that remain underrepresented in the research. Making these gaps more visible may help shape future inquiry and draw attention to dimensions of partnership work that have so far received limited focus. The typology developed in this review may also serve as a conceptual scaffold for future studies seeking to theorize collaboration in STEM education more systematically.

Supplementary Materials

The list of publications included in this scoping review can be downloaded at: https://www.mdpi.com/article/10.3390/educsci15111513/s1.

Author Contributions

A.Z. conceptualized the study together with H.S. and played the lead role in writing the manuscript. D.W. coordinated the analysis procedure. D.W. and M.S. conducted the data extraction and annotation and contributed to writing the methodology and results sections. All authors have read and agreed to the published version of the manuscript.

Funding

The authors declare that the research was supported by the German Federal Ministry of Education and Research (BMBF) under project grant numbers 16DWMQP02A and 16DWMQP02B. The responsibility for the content of this publication lies with the authors.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The dataset used and analyzed during the current study is available from the corresponding author upon reasonable request.

Acknowledgments

We are very grateful to Charlotte Popp, Fabian Heller, Sophia Horina, Nicholas Sagberger, and Amelie Tahedl for assistance in data extraction, data annotation, and data curation. The authors have reviewed and edited the output and take full responsibility for the content of this publication.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

Search Strings (Carried Out on 28 June 2024)

  • Web of Science
  • ((TI = ((STEM OR STEAM OR STREAM OR STEMM OR iSTEM OR SMET OR pSTEM OR science OR technology OR engineering OR mathematics OR math OR maths OR biology OR chemistry OR physics OR computer) AND (out-of-school OR informal OR nonformal OR after-school OR (summer NEAR/2 camp) OR (summer NEAR/2 program) OR (optional NEAR/2 experience) OR workshop OR (outreach NEAR/2 activity) OR museum OR zoo OR science NEAR/2 center OR aquarium OR botanic garden OR club OR science fair OR exhibition OR field trip OR extracurricular OR fab lab OR makerspace OR out-of classroom) AND (program OR intervention OR initiative OR offering OR cooperation OR collaboration)))) OR AB = ((STEM OR STEAM OR STREAM OR STEMM OR iSTEM OR SMET OR pSTEM OR science OR technology OR engineering OR mathematics OR math OR maths OR biology OR chemistry OR physics OR computer) AND (out-of-school OR informal OR nonformal OR after-school OR (summer NEAR/2 camp) OR (summer NEAR/2 program) OR (optional NEAR/2 experience) OR workshop OR (outreach NEAR/2 activity) OR museum OR zoo OR science NEAR/2 center OR aquarium OR botanic garden OR club OR science fair OR exhibition OR field trip OR extracurricular OR fab lab OR makerspace OR out-of classroom) AND (program OR intervention OR initiative OR offering OR cooperation OR collaboration))
  • ERIC
  • ((TI = ((STEM OR STEAM OR STREAM OR STEMM OR iSTEM OR SMET OR pSTEM OR science OR technology OR engineering OR mathematics OR math OR maths OR biology OR chemistry OR physics OR computer) AND (out-of-school OR informal OR nonformal OR after-school OR (summer NEAR/2 camp) OR (summer NEAR/2 program) OR (optional NEAR/2 experience) OR workshop OR (outreach NEAR/2 activity) OR museum OR zoo OR science NEAR/2 center OR aquarium OR botanic garden OR club OR science fair OR exhibition OR field trip OR extracurricular OR fab lab OR makerspace OR out-of classroom) AND (program OR intervention OR initiative OR offering OR cooperation OR collaboration)))) OR AB = ((STEM OR STEAM OR STREAM OR STEMM OR iSTEM OR SMET OR pSTEM OR science OR technology OR engineering OR mathematics OR math OR maths OR biology OR chemistry OR physics OR computer) AND (out-of-school OR informal OR nonformal OR after-school OR (summer NEAR/2 camp) OR (summer NEAR/2 program) OR (optional NEAR/2 experience) OR workshop OR (outreach NEAR/2 activity) OR museum OR zoo OR science NEAR/2 center OR aquarium OR botanic garden OR club OR science fair OR exhibition OR field trip OR extracurricular OR fab lab OR makerspace OR out-of classroom) AND (program OR intervention OR initiative OR offering OR cooperation OR collaboration))
  • PsycINFO/PSYNDEX
  • TI ((STEM OR STEAM OR STREAM OR STEMM OR iSTEM OR SMET OR pSTEM OR science OR technology OR engineering OR mathematics OR math OR maths OR biology OR chemistry OR physics OR computer) AND (out-of-school OR informal OR nonformal OR after-school OR (summer camp) OR (summer program) OR (optional experience) OR workshop OR (outreach activity) OR museum OR zoo OR science center OR aquarium OR botanic garden OR club OR science fair OR exhibition OR field trip OR extracurricular OR fab lab OR makerspace OR out-of classroom) AND (program OR intervention OR initiative OR offering OR cooperation OR collaboration)) OR AB ((STEM OR STEAM OR STREAM OR STEMM OR iSTEM OR SMET OR pSTEM OR science OR technology OR engineering OR mathematics OR math OR maths OR biology OR chemistry OR physics OR computer) AND (out-of-school OR informal OR nonformal OR after-school OR (summer camp) OR (summer program) OR (optional experience) OR workshop OR (outreach activity) OR museum OR zoo OR science center OR aquarium OR botanic garden OR club OR science fair OR exhibition OR field trip OR extracurricular OR fab lab OR makerspace OR out-of classroom) AND (program OR intervention OR initiative OR offering OR cooperation OR collaboration))
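To show how these long Boolean strings can be kept consistent across databases, the following is a minimal sketch, assuming the query is assembled from three shared concept lists (subjects, out-of-school settings, and program formats); the abbreviated term lists and the helper functions are illustrative assumptions, not part of the actual search workflow.

```python
# Abbreviated concept lists; the full lists appear in the strings above.
SUBJECTS = ["STEM", "STEAM", "science", "technology", "engineering", "mathematics"]
SETTINGS = ["out-of-school", "informal", "after-school", "(summer NEAR/2 camp)",
            "museum", "makerspace"]
FORMATS = ["program", "intervention", "initiative", "cooperation", "collaboration"]

def or_block(terms):
    """Join a list of search terms into a parenthesized OR block."""
    return "(" + " OR ".join(terms) + ")"

def build_field_query(field):
    """Combine the three concept blocks with AND under one field tag (e.g., TI, AB)."""
    blocks = " AND ".join(or_block(t) for t in (SUBJECTS, SETTINGS, FORMATS))
    return f"{field} = ({blocks})"

# Title OR abstract search, mirroring the structure of the Web of Science string above.
query = build_field_query("TI") + " OR " + build_field_query("AB")
print(query)
```

Generating the strings from shared lists in this way helps ensure that only the database-specific syntax (e.g., proximity operators) varies between search engines.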

Note

1. This refers to, for instance, an abstract that mentions students without enough context to determine whether they are school or university students. We included such articles, checked the full texts, and then excluded them if the students turned out to be university students.

References

  1. Akkerman, S. F., & Bakker, A. (2011). Boundary crossing and boundary objects. Review of Educational Research, 81(2), 132–169. [Google Scholar] [CrossRef]
  2. Allen, P. J., Chang, R., Gorrall, B. K., Waggenspack, L., Fukuda, E., Little, T. D., & Noam, G. G. (2019). From quality to outcomes: A national study of afterschool STEM programming. International Journal of STEM Education, 6(1), 37. [Google Scholar] [CrossRef]
  3. Anderson-Butcher, D., Bates, S., Lawson, H. A., Childs, T. M., & Iachini, A. L. (2022). The community collaboration model for school improvement: A scoping review. Education Sciences, 12(12), 918. [Google Scholar] [CrossRef]
  4. Archer, L., DeWitt, J., Osborne, J., Dillon, J., Willis, B., & Wong, B. (2013). ‘Not girly, not sexy, not glamorous’: Primary school girls’ and parents’ constructions of science aspirations. Pedagogy, Culture & Society, 21(1), 171–194. [Google Scholar] [CrossRef]
  5. Archer, L., Freedman, E., Nag Chowdhuri, M., DeWitt, J., Garcia Gonzalez, F., & Liu, Q. (2025). From STEM learning ecosystems to STEM learning markets: Critically conceptualising relationships between formal and informal STEM learning provision. International Journal of STEM Education, 12(1), 22. [Google Scholar] [CrossRef]
  6. Banerjee, P., Graham, L., & Given, G. (2024). A systematic literature review identifying inconsistencies in the inclusion of subjects in research reports on STEM workforce skills in the UK. Cogent Education, 11(1), 2288736. [Google Scholar] [CrossRef]
  7. Becher, T., & Trowler, P. (2002). Academic tribes and territories: Intellectual enquiry and the cultures of disciplines (2nd ed.). Open University Press. [Google Scholar]
  8. Bell, P., Lewenstein, B., Shouse, A. W., & Feder, M. A. (Eds.). (2009). Learning science in informal environments: People, places, and pursuits. National Academies Press. [Google Scholar] [CrossRef]
  9. Beller, E. M., Glasziou, P. P., Altman, D. G., Hopewell, S., Bastian, H., Chalmers, I., Gøtzsche, P. C., Lasserson, T., & Tovey, D. (2013). PRISMA for abstracts: Reporting systematic reviews in journal and conference abstracts. PLoS Medicine, 10(4), e1001419. [Google Scholar] [CrossRef]
  10. Braund, M., & Reiss, M. (2006). Towards a more authentic science curriculum: The contribution of out-of-school learning. International Journal of Science Education, 28(12), 1373–1388. [Google Scholar] [CrossRef]
  11. Breiner, J. M., Harkness, S. S., Johnson, C. C., & Koehler, C. M. (2012). What is STEM? A discussion about conceptions of STEM in education and partnerships. School Science and Mathematics, 112(1), 3–11. [Google Scholar] [CrossRef]
  12. Bryan, J., & Griffin, D. (2010). A multidimensional study of school-family-community partnership involvement: School, school counselor, and training factors. Professional School Counseling, 14(1), 75–86. [Google Scholar] [CrossRef]
  13. Burke, L. E. C.-A., & Navas Iannini, A. M. (2021). Science engagement as insight into the science identity work nurtured in community-based science clubs. Journal of Research in Science Teaching, 58(9), 1425–1454. [Google Scholar] [CrossRef]
  14. Bybee, R. W. (2010). Advancing STEM education: A 2020 vision. Technology and Engineering Teacher, 70(1), 30–35. [Google Scholar]
  15. Campbell, M., McKenzie, J. E., Sowden, A., Katikireddi, S. V., Brennan, S. E., Ellis, S., Hartmann-Boyce, J., Ryan, R., Shepperd, S., Thomas, J., Welch, V., & Thomson, H. (2020). Synthesis without meta-analysis (SWiM) in systematic reviews: Reporting guideline. BMJ, 368, l6890. [Google Scholar] [CrossRef]
  16. Chen, C., Hardjo, S., Sonnert, G., Hui, J., & Sadler, P. M. (2023). The role of media in influencing students’ STEM career interest. International Journal of STEM Education, 10(1), 56. [Google Scholar] [CrossRef]
  17. Chiu, T. K. F., Li, Y., Ding, M., Hallström, J., & Koretsky, M. D. (2025). A decade of research contributions and emerging trends in the international journal of STEM education. International Journal of STEM Education, 12(1), 12. [Google Scholar] [CrossRef]
  18. Clark, M. A., & Breman, J. C. (2009). School counselor inclusion: A collaborative model to provide academic and social-emotional support in the classroom setting. Journal of Counseling & Development, 87(1), 6–11. [Google Scholar] [CrossRef]
  19. Crane, V., Chen, M., Bitgood, S., Serrel, B., Thompson, D., Nicholson, H., Weiss, F., & Campbell, P. (1994). Informal science learning: What the research says about television, science museums, and community-based projects. Science Press. [Google Scholar]
  20. Davis, K., Fitzgerald, A., Power, M., Leach, T., Martin, N., Piper, S., Singh, R., & Dunlop, S. (2023). Understanding the conditions informing successful STEM clubs: What does the evidence base tell us? Studies in Science Education, 59(1), 1–23. [Google Scholar] [CrossRef]
  21. De Jong, L., Meirink, J., & Admiraal, W. (2022). School-based collaboration as a learning context for teachers: A systematic review. International Journal of Educational Research, 112, 101927. [Google Scholar] [CrossRef]
  22. Denton, M., & Borrego, M. (2021). Funds of knowledge in STEM education: A scoping review. Studies in Engineering Education, 1(2), 71–92. [Google Scholar] [CrossRef]
  23. Dierking, L. D., & Falk, J. H. (2003). Optimizing out-of-school time: The role of free-choice learning. New Directions for Youth Development, 2003(97), 75–88. [Google Scholar] [CrossRef]
  24. Dou, R., Villa, N., Cian, H., Sunbury, S., Sadler, P. M., & Sonnert, G. (2025). Unlocking STEM identities through family conversations about topics in and beyond STEM: The contributions of family communication patterns. Behavioral Sciences, 15(2), 106. [Google Scholar] [CrossRef]
  25. Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41(3–4), 327–350. [Google Scholar] [CrossRef] [PubMed]
  26. Eccles, J. S., Barber, B. L., Stone, M., & Hunt, J. (2003). Extracurricular activities and adolescent development. Journal of Social Issues, 59(4), 865–889. [Google Scholar] [CrossRef]
  27. English, L. D. (2017). Advancing elementary and middle school STEM education. International Journal of Science and Mathematics Education, 15(1), 5–24. [Google Scholar] [CrossRef]
  28. Epstein, J. L., Sanders, M. G., Sheldon, S. B., Simon, B. S., Clark Salinas, K., Rodriguez Jansorn, N., van Voorhis, F. L., Martin, C. S., Thomas, B. G., Greenfeld, M. D., Hutchins, D. J., & Williams, K. J. (2018). School, family, and community partnerships: Your handbook for action (4th ed.). SAGE Publications Company. [Google Scholar]
  29. Falk, J., & Dierking, L. (2010). The 95 percent solution: School is not where most Americans learn most of their science. American Scientist, 98(6), 486–493. [Google Scholar] [CrossRef]
  30. Fallik, O., Rosenfeld, S., & Eylon, B.-S. (2013). School and out-of-school science: A model for bridging the gap. Studies in Science Education, 49(1), 69–91. [Google Scholar] [CrossRef]
  31. Falloon, G., Stevenson, M., Hatisaru, V., Hurrell, D., & Boden, M. (2024). Principal leadership and proximal processes in creating STEM ecosystems: An Australian case study. Leadership and Policy in Schools, 23(2), 180–202. [Google Scholar] [CrossRef]
  32. Fan, X., & Chen, M. (2001). Parental involvement and students’ academic achievement: A meta-analysis. Educational Psychology Review, 13(1), 1–22. [Google Scholar] [CrossRef]
  33. Feldman, A. F., & Matjasko, J. L. (2005). The role of school-based extracurricular activities in adolescent development: A comprehensive review and future directions. Review of Educational Research, 75(2), 159–210. [Google Scholar] [CrossRef]
  34. Fischer, N., Theis, D., & Zücher, I. (2014). Narrowing the gap? The role of all-day schools in reducing educational inequality in Germany. International Journal for Research on Extended Education, 2(1), 79–96. [Google Scholar] [CrossRef]
  35. Foster, K. M., Bergin, K. B., McKenna, A. F., Millard, D. L., Perez, L. C., Prival, J. T., Rainey, D. Y., Sevian, H. M., VanderPutten, E. A., & Hamos, J. E. (2010). Science education: Partnerships for STEM education. Science, 329(5994), 906–907. [Google Scholar] [CrossRef] [PubMed]
  36. Godec, S., Archer, L., & Dawson, E. (2022). Interested but not being served: Mapping young people’s participation in informal STEM education through an equity lens. Research Papers in Education, 37(2), 221–248. [Google Scholar] [CrossRef]
  37. Gupta, R., Voiklis, J., Rank, S. J., La Dwyer, J. D. T., Fraser, J., Flinner, K., & Nock, K. (2020). Public perceptions of the STEM learning ecology—Perspectives from a national sample in the US. International Journal of Science Education, Part B, 10(2), 112–126. [Google Scholar] [CrossRef]
  38. He, L., Murphy, L., & Luo, J. (2016). Using social media to promote STEM education: Matching college students with role models. In B. Berendt, B. Bringmann, É. Fromont, G. Garriga, P. Miettinen, N. Tatti, & V. Tresp (Eds.), Machine learning and knowledge discovery in databases (pp. 79–95). Springer International Publishing. [Google Scholar] [CrossRef]
  39. Hofstein, A., & Rosenfeld, S. (1996). Bridging the gap between formal and informal science learning. Studies in Science Education, 28(1), 87–112. [Google Scholar] [CrossRef]
  40. Honey, M., Pearson, G., & Schweingruber, H. (Eds.). (2014). STEM integration in K-12 education. National Academies Press. [Google Scholar] [CrossRef]
  41. Itzek-Greulich, H., & Vollmer, C. (2017). Emotional and motivational outcomes of lab work in the secondary intermediate track: The contribution of a science center outreach lab. Journal of Research in Science Teaching, 54(1), 3–28. [Google Scholar] [CrossRef]
  42. Jaggy, A.-K., Wagner, W., Fütterer, T., Göllner, R., & Trautwein, U. (2025). Teaching quality in STEM education: Differences between in- and out-of-school contexts from the perspective of gifted students. International Journal of STEM Education, 12(1), 53. [Google Scholar] [CrossRef]
  43. Jeynes, W. (2012). A meta-analysis of the efficacy of different types of parental involvement programs for urban students. Urban Education, 47(4), 706–742. [Google Scholar] [CrossRef]
  44. Jones, A. L., & Stapleton, M. K. (2017). 1.2 million kids and counting: Mobile science laboratories drive student interest in STEM. PLoS Biology, 15(5), e2001692. [Google Scholar] [CrossRef]
  45. Krishnamurthi, A., Ballard, M., & Noam, G. G. (2014). Examining the impact of afterschool STEM programs. Available online: http://files.eric.ed.gov/fulltext/ED546628.pdf (accessed on 14 January 2025).
  46. Li, Y., Wang, K., Xiao, Y., Froyd, J. E., & Nite, S. B. (2020). Research and trends in STEM education: A systematic analysis of publicly funded projects. International Journal of STEM Education, 7(1), 17. [Google Scholar] [CrossRef]
  47. Li, Y., Wang, K., Xiao, Y., & Wilson, S. M. (2022). Trends in highly cited empirical research in STEM education: A literature review. Journal for STEM Education Research, 5(3), 303–321. [Google Scholar] [CrossRef]
  48. Mahoney, J. L., Larson, R. W., & Eccles, J. S. (Eds.). (2005). Organized activities as contexts of development: Extracurricular activities, after-school and community programs. Lawrence Erlbaum. [Google Scholar]
  49. Maltese, A. V., & Tai, R. H. (2011). Pipeline persistence: Examining the association of educational experiences with earned degrees in STEM among U.S. students. Science Education, 95(5), 877–907. [Google Scholar] [CrossRef]
  50. Margherio, C., Doten-Snitker, K., Williams, J., Litzler, E., Andrijcic, E., & Mohan, S. (2020). Cultivating strategic partnerships to transform STEM education. In K. White, A. Beach, N. Finkelstein, C. Henderson, S. Simpkins, L. Slakey, M. Stains, G. Weaver, & L. Whitehead (Eds.), Transforming institutions: Accelerating systemic change in higher education (pp. 177–188). Pressbooks. [Google Scholar]
  51. Maschke, S., & Stecher, L. (2018). Non-formale und informelle Bildung. In A. Lange, H. Reiter, S. Schutter, & C. Steiner (Eds.), Handbuch kindheits- und jugendsoziologie (pp. 149–163). Springer VS Wiesbaden. [Google Scholar] [CrossRef]
  52. McHugh, M. L. (2012). Interrater reliability: The kappa statistic. Biochemia Medica, 22(3), 276–282. [Google Scholar] [CrossRef]
  53. McMullen, J. M., George, M., Ingman, B. C., Pulling Kuhn, A., Graham, D. J., & Carson, R. L. (2020). A systematic review of community engagement outcomes research in school-based health interventions. The Journal of School Health, 90(12), 985–994. [Google Scholar] [CrossRef]
  54. Moehlman, A. H. (2012). Comparative educational systems. Literary Licensing, LLC. [Google Scholar]
  55. Moore, E. M., Hock, A., Bevan, B., & Taylor, K. H. (2022). Measuring STEM learning in after-school summer programs: Review of the literature. Journal of Youth Development, 17(2), 75–105. [Google Scholar] [CrossRef]
  56. Mu, G. M., Gordon, D., Xu, J., Cayas, A., & Madesi, S. (2023). Benefits and limitations of partnerships amongst families, schools and universities: A systematic literature review. International Journal of Educational Research, 120, 102205. [Google Scholar] [CrossRef]
  57. Munn, Z., Peters, M. D. J., Stern, C., Tufanaru, C., McArthur, A., & Aromataris, E. (2018). Systematic review or scoping review? Guidance for authors when choosing between a systematic or scoping review approach. BMC Medical Research Methodology, 18(1), 143. [Google Scholar] [CrossRef] [PubMed]
  58. Murawski, W. W., & Lochner, W. W. (2011). Observing co-teaching: What to ask for, look for, and listen for. Intervention in School and Clinic, 46(3), 174–183. [Google Scholar] [CrossRef]
  59. National Academies of Sciences, Engineering, and Medicine. (2016). Promising practices for strengthening the regional STEM workforce development ecosystem. The National Academies Press. [Google Scholar] [CrossRef]
  60. Noam, G. G., Biancarosa, G., & Dechausay, N. (2002). Afterschool education: Approaches to an emerging field. Harvard Education Press. [Google Scholar]
  61. Noam, G. G., & Tillinger, J. R. (2004). After-school as intermediary space: Theory and typology of partnerships. New Directions for Youth Development, 2004(101), 75–113. [Google Scholar] [CrossRef]
  62. Penuel, W. R., Allen, A.-R., Coburn, C. E., & Farrell, C. (2015). Conceptualizing research–practice partnerships as joint work at boundaries. Journal of Education for Students Placed at Risk, 20(1–2), 182–197. [Google Scholar] [CrossRef]
  63. Sahin, A., Ayar, M. C., & Adiguzel, T. (2014). STEM related after-school program activities and associated outcomes on student learning. Educational Sciences: Theory & Practice, 14(1), 309–322. [Google Scholar] [CrossRef]
  64. Salame, A. H., Tengku Shahdan, T. S., Kayode, B. K., & Pek, L. S. (2025). Enhancing STEM education in rural schools through play activities: A scoping review. International Journal on Studies in Education, 7(1), 103–124. [Google Scholar] [CrossRef]
  65. Scott-Little, C., Hamann, M. S., & Jurs, S. G. (2002). Evaluations of after-school programs: A meta-evaluation of methodologies and narrative synthesis of findings. The American Journal of Evaluation, 23(4), 387–419. [Google Scholar] [CrossRef]
  66. So, W. W. M., Zhan, Y., Chow, S. C. F., & Leung, C. F. (2018). Analysis of STEM activities in primary students’ science projects in an informal learning environment. International Journal of Science and Mathematics Education, 16(6), 1003–1023. [Google Scholar] [CrossRef]
  67. Staus, N., Riedinger, K., & Storksdieck, M. (2023). Informal STEM learning. In R. J. Tierney, F. Rizvi, & K. Ercikan (Eds.), International Encyclopedia of Education (pp. 244–250). Elsevier. [Google Scholar] [CrossRef]
  68. Stoeger, H., Heilemann, M., Debatin, T., Hopp, M. D. S., Schirner, S., & Ziegler, A. (2021). Nine years of online mentoring for secondary school girls in STEM: An empirical comparison of three mentoring formats. Annals of the New York Academy of Sciences, 1483, 153–173. [Google Scholar] [CrossRef]
  69. Sutton, J., & Austin, Z. (2015). Qualitative research: Data collection, analysis, and management. The Canadian Journal of Hospital Pharmacy, 68(3), 226–231. [Google Scholar] [CrossRef]
  70. Torres, C. A., Arnove, R. F., & Misiaszek, L. (Eds.). (2022). Comparative education: The dialectic of the global and the local (5th ed.). Rowman & Littlefield. [Google Scholar]
  71. Tricco, A. C., Lillie, E., Zarin, W., O’Brien, K. K., Colquhoun, H., Levac, D., Moher, D., Peters, M. D. J., Horsley, T., Weeks, L., Hempel, S., Akl, E. A., Chang, C., McGowan, J., Stewart, L., Hartling, L., Aldcroft, A., Wilson, M. G., Garritty, C., … Straus, S. E. (2018). PRISMA extension for scoping reviews (PRISMA-ScR): Checklist and explanation. Annals of Internal Medicine, 169(7), 467–473. [Google Scholar] [CrossRef]
  72. UNESCO. (2021). Engineering for sustainable development: Delivering on the sustainable development goals. UNESCO. Available online: https://unesdoc.unesco.org/ark:/48223/pf0000375644/PDF/375644eng.pdf.multi (accessed on 14 January 2025).
  73. Vadeboncoeur, J. A. (2006). Engaging young people: Learning in informal contexts. Review of Research in Education, 30(1), 239–278. [Google Scholar] [CrossRef]
  74. Vangrieken, K., Dochy, F., Raes, E., & Kyndt, E. (2015). Teacher collaboration: A systematic review. Educational Research Review, 15, 17–40. [Google Scholar] [CrossRef]
  75. Walker, S., & Bond, C. (2025). Strength in partnership: A systematic review of key characteristics underpinning home-school collaboration. European Journal of Special Needs Education, 1–19. [Google Scholar] [CrossRef]
  76. Wang, M.-T., & Degol, J. L. (2017). Gender gap in science, technology, engineering, and mathematics (STEM): Current knowledge, implications for practice, policy, and future directions. Educational Psychology Review, 29(1), 119–140. [Google Scholar] [CrossRef] [PubMed]
  77. Watters, J. J., & Diezmann, C. M. (2016). Engaging elementary students in learning science: An analysis of classroom dialogue. Instructional Science, 44(1), 25–44. [Google Scholar] [CrossRef]
  78. Xia, X., Bentley, L. R., Fan, X., & Tai, R. H. (2025). STEM outside of school: A meta-analysis of the effects of informal science education on students’ interests and attitudes for STEM. International Journal of Science and Mathematics Education, 23(4), 1153–1181. [Google Scholar] [CrossRef]
Figure 1. PRISMA Flow Chart.
Table 1. Typology of In-school–Out-of-school Collaboration Categories.

| Collaboration Type | Description | Subcategories | Examples |
|---|---|---|---|
| Personnel | An exchange of personnel between in-school and out-of-school actors. | People; training, professional development | Teachers are involved in teaching in an informal learning space. Teachers receive professional development as part of the out-of-school program. |
| Infrastructural | Infrastructural resources are shared between in-school and out-of-school actors. | Location; transportation; equipment/material; scheduling; food | An out-of-school program takes place in the school building. An out-of-school initiative provides laptops for the school students who participate. An out-of-school program is scheduled to fit the school's calendar. |
| Curricular | The out-of-school curriculum is aligned with some in-school standards. | Curriculum | The summer camp addresses mathematical concepts aligned with the state's mathematics standards. |
| Recruitment | Recruitment of participants for the out-of-school program is done in collaboration with the in-school program. | School; school administration; teachers; school visit; flyer; community | Schools or the school administration are responsible for recommending participants to the out-of-school program providers. Teachers choose one student in their class to participate in an out-of-school program. Flyers are distributed at school to recruit participants. |
| Didactics | The didactical methods for the out-of-school program are developed in collaboration with the in-school program. | Didactics | The out-of-school intervention program was co-developed by a teacher. |
| Funding | Financial resources are shared between out-of-school and in-school actors. | General funding; compensation; scholarships | General funding for an out-of-school program came from a university grant. Teachers received compensation from the out-of-school program provider. |
| General collaboration | Collaborations and partnerships on a general level between in-school and out-of-school partners were mentioned. | Industry; university; researchers; various organizations | A cybersecurity company partnered with schools in the district to realize the out-of-school program. The university's physics faculty worked collaboratively with the school to conduct the out-of-school program. |
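For readers who want to reuse the typology in their own coding work, the following is a minimal sketch of how the categories and subcategories in Table 1 could be represented as a simple data structure; the dictionary layout and the validation helper are illustrative assumptions rather than part of the review's coding infrastructure.

```python
# Collaboration typology from Table 1, expressed as category -> subcategories.
COLLABORATION_TYPOLOGY = {
    "Personnel": ["people", "training/professional development"],
    "Infrastructural": ["location", "transportation", "equipment/material",
                        "scheduling", "food"],
    "Curricular": ["curriculum"],
    "Recruitment": ["school", "school administration", "teachers",
                    "school visit", "flyer", "community"],
    "Didactics": ["didactics"],
    "Funding": ["general funding", "compensation", "scholarships"],
    "General collaboration": ["industry", "university", "researchers",
                              "various organizations"],
}

def validate_codes(codes):
    """Return any assigned subcategory codes that are not part of the typology."""
    known = {sub for subs in COLLABORATION_TYPOLOGY.values() for sub in subs}
    return [c for c in codes if c not in known]

# Example: checking the codes assigned to one program description.
print(validate_codes(["location", "teachers", "mentoring"]))  # -> ['mentoring']
```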
Table 2. The Ten Most Frequently Reported STEM Program Types, Unified Across Labels (N = 469).

| Program Type | Frequency |
|---|---|
| After-school/extra-curricular/out-of-school program | 104 |
| Summer/vacation/holiday program | 80 |
| Camp | 44 |
| Workshop | 37 |
| (STEM) club | 27 |
| Makerspace, laboratory | 26 |
| Outreach program | 24 |
| STEM program | 14 |
| Enrichment program | 13 |
| Field trip | 12 |
| Other | 77 |

Program types were grouped based on semantic similarity across reported labels (e.g., "summer camp," "summer school," and "summer program" were merged).
Table 3. Most Frequently Reported Subject Areas in STEM Program Studies (N = 469).

| Subject Area | Frequency |
|---|---|
| STEM | 117 |
| Science | 108 |
| Mathematics | 64 |
| Computer science | 39 |
| Engineering | 30 |
| STEAM | 23 |
| Biology | 19 |
| Robotics | 16 |
| Technology | 13 |
| Chemistry | 11 |
| Physics | 10 |
| Other | 122 |

Multiple labels are possible per study.
Table 4. Location and Timing Characteristics in STEM Program Studies (N = 469).

| Location | Frequency | Timing | Frequency | School Days | Frequency |
|---|---|---|---|---|---|
| Out of school | 257 | After school | 263 | Yes | 148 |
| Within school | 75 | During school | 47 | No | 147 |
| Hybrid | 35 | Both | 17 | Both | 34 |
| Online | 11 | No information | 142 | No information | 140 |
| Online+ | 30 | | | | |
| No information | 61 | | | | |

“Hybrid” denotes programs delivered both on and outside of school premises. “Online+” refers to programs delivered online and either in school or outside of school premises.
Table 5. Most Frequently Reported Research Designs in STEM Program Studies (N = 398).

| Research Design | Frequency |
|---|---|
| Case study | 155 |
| Pre-post test without control group | 106 |
| Experimental design | 54 |
| Descriptive | 48 |
| Phenomenological design | 23 |
| Ethnographic design | 16 |
| Correlational study | 13 |
| Other | 40 |

Multiple designs are possible per study.
Table 6. Most Frequently Reported Measured Outcomes in STEM Program Studies (N = 398).

| Measured Outcome | Frequency |
|---|---|
| Engagement and interest | 197 |
| Knowledge and skills | 184 |
| Attitudes and perceptions | 179 |
| Identity and self-efficacy | 140 |
| Social–emotional factors | 71 |
| Other | 68 |

Outcome labels were consolidated across semantically equivalent or closely related terms (e.g., “student knowledge” and “knowledge”). Multiple measures are possible per study.
Table 7. Geographic Origins of Samples (N = 469) and Authors (N = 398) in STEM Program Studies.

| Sample | Frequency | Authors | Frequency |
|---|---|---|---|
| United States | 276 | United States | 235 |
| United Kingdom | 20 | United Kingdom | 18 |
| Turkey | 19 | Turkey | 17 |
| Australia | 9 | Spain | 8 |
| Germany | 9 | Australia | 7 |
| Spain | 8 | Canada | 7 |
| Israel | 7 | Germany | 6 |
| Canada | 6 | Israel | 6 |
| China | 6 | Italy | 5 |
| Other | 63 | Other | 50 |
| No information | 46 | No information | 39 |

Sample data are based on the number of programs (N = 469); author data are based on the number of studies (N = 398).
Table 8. Most Frequent Collaboration Types in STEM Program Studies Reporting on Collaboration (N = 354).

| Collaboration Type | Frequency |
|---|---|
| Personnel | 147 |
| Infrastructure | 104 |
| Recruitment | 87 |
| Curriculum | 71 |
| Funding | 27 |
| Didactics | 16 |
| General (not specified) | 31 |

Multiple collaboration types are possible per study.
Table 9. Selected Gaps in the Literature on In-school–Out-of-school STEM Collaboration.

| Gap Area | Observed Pattern | N | Educational Significance |
|---|---|---|---|
| Medical programs | No collaboration reported | 8 | Healthcare education typically involves cross-sector settings (e.g., hospitals and schools). A lack of collaboration suggests a missed opportunity for institutional integration. |
| Long-term (>1 year) programs | Collaboration inconsistently reported | 23 | Long-term collaboration is central to sustainability and system change, but it remains under-theorized and empirically underdeveloped. |
| Didactic collaboration in computer science | Rarely documented | <5 | As computer science rapidly expands in schools, co-designed didactics are expected under modern instructional design frameworks (e.g., CT integration). |
| Engineering with funding collaboration | Rarely described | <5 | Engineering programs frequently require external resources and industry input. The absence of funding partnerships highlights implementation blind spots. |
| Programs in low- and middle-income countries (LMICs) | Underrepresented in the dataset | <10 | STEM equity frameworks (e.g., UNESCO, 2021) emphasize global inclusion; however, most research is concentrated in high-income countries. |