Review
Peer-Review Record

Methodological Choices When Assessing Summer Bridge Programs in STEM Majors: A Scoping Review

Educ. Sci. 2026, 16(2), 220; https://doi.org/10.3390/educsci16020220
by Daniela Caballero Díaz *, Avani Amin, Pako Musa and Vincent Leung
Reviewer 1: Anonymous
Submission received: 5 December 2025 / Revised: 24 January 2026 / Accepted: 27 January 2026 / Published: 2 February 2026
(This article belongs to the Section Higher Education)

Round 1

Reviewer 1 Report

Comments and Suggestions for Authors

This manuscript provided a scoping review to map and describe the methodological features that have been reported in the literature on summer bridge programs (SBPs) for STEM majors. The authors were specifically interested in studies that used quantitative measures for academic performance, such as course performance evaluation, graduation and retention rates, and GPA. The study also analyzed the study design and participants in the SBP literature. This study has the potential to make important contributions to the SBP literature by providing a detailed and rigorous literature search, screening, and analysis approach to identify gaps in the current literature base and suggest areas for future research on SBPs. However, several areas require revision before the manuscript is ready for publication.

 

Strengths

  • The manuscript is well-structured and well-written.
  • Several of the tables and figures are clear and provide strong visual support for the results.
  • The methods are clear in terms of how the PRISMA and PICO frameworks were applied and adhered to.
  • The criteria for inclusion/exclusion are clear and logical.
  • The authors offer several relevant implications and directions for future work.

 

Areas for Revision

  1. Introduction
    1. A major area to be strengthened is the argument development and rationale for the study in the introduction. Currently, there is not a clear rationale for focusing exclusively on academic achievement in relation to SBPs. In addition, there needs to be a stronger argument for the need to focus on SBPs to improve STEM academic performance, specifically. The authors should provide more context related to academic persistence/performance challenges in STEM and STEM inequity challenges. As is, the argument is not clear or compelling for how/why SBPs could improve STEM student success outcomes.
    2. In the introduction, it is not clear that the review is focused on SBP studies that quantitatively measured SBP outcomes. This should be clearly stated upfront as part of the purpose statement and this decision should be explained.
    3. In the introduction, it is not clear whether the authors are talking about SBPs in general or STEM-specific SBPs. This should be clarified, and STEM-specific should be emphasized.
    4. I agree that it is helpful to understand and map the methodological design/considerations of the studies, as the authors did in this study. I understand that evaluating the effectiveness of the SBPs was not a goal of this scoping review. However, I believe the manuscript would benefit from evaluating the effectiveness of the SBPs. Without knowing the effectiveness of the SBPs that were reviewed, it is difficult to fully judge how future research can build upon the existing literature.
  2. Literature
    1. There are several areas that could use more (or more recent) literature citations. Enhancing the references will strengthen the study and better ground it in the literature.
    2. For instance, section 1.1 and its subsections contain a lot of literature that is older than 10 years. This should be reviewed and updated.
    3. The authors could add more references to Section 1.1.3 (Differences in academic content), as this section only contains one citation.
    4. The authors should cite articles more consistently throughout the Results section. There are some paragraphs and sections that have few/no references to articles and therefore lack supporting evidence for the results (e.g., section 3.3.2).
  3. Figures and Tables
    1. Figure 1 is a helpful visual and provides clear justification for the literature selection process. The box that shows “Reports Excluded” is cut off at the bottom and should be resized. Figure 1 could also be moved to the Methods section so that the process is more clearly visualized when it is being described in the text.
    2. Table 4 should be removed as the percentages do not add any new or essential information (the “reason” column repeats what is provided in Table 3, and the number of excluded studies is already included in Figure 1).
    3. Some program names are not provided in Table 5 (the list of all 33 SBPs). For consistency, it would be helpful to denote “not mentioned” or that a pseudonym was used (if applicable).
    4. I believe some of the Tables are mis-numbered in the text (e.g., Line 365, and Line 504).
  4. Writing, Flow, and Line Suggestions
    1. The title references STEM careers, and STEM careers are mentioned as one of the criteria as well. I wonder if the authors meant STEM majors…SBPs that were intended for STEM majors seems like a more accurate fit.
    2. The authors use the phrase “equity-deserving groups” a few times in the manuscript, which is not a typical word choice. The decision to use “equity-deserving groups” over other terms (e.g., underrepresented, marginalized, historically excluded) should be explained, or it should be replaced.
    3. When referring to a subsample of the articles, lowercase n should be used rather than uppercase N.
    4. In the Results section, authors should check that verbs are past tense rather than present tense.
  5. Discussion, Conclusions, and Implications
    1. A major area that could be strengthened is the discussion section, as it needs to more clearly convey the unique contributions of this study to the greater body of research. How does the study build on, compare to, and/or confirm prior research, which could include Ashley et al. (2017) and Bradford et al. (2021)? The discussion currently reads more like an implications section rather than a discussion and interpretation of the study’s results. The discussion of the results should be separate from the implications.
    2. The policy implications (starting on line 551) seem more like implications for practice or implications for higher education institutions. Policy implications give the impression that state or federal policy is involved, but that does not seem to be the authors’ intention when they refer to policy implications.

Author Response

Thank you very much for taking the time to review this manuscript. Please find the detailed responses below and the corresponding revisions/corrections highlighted/in track changes in the re-submitted files.

  • Comment 1: (Introduction) A major area to be strengthened is the argument development and rationale for the study in the introduction. Currently, there is not a clear rationale for focusing exclusively on academic achievement in relation to SBPs. In addition, there needs to be a stronger argument for the need to focus on SBPs to improve STEM academic performance, specifically. The authors should provide more context related to academic persistence/performance challenges in STEM and STEM inequity challenges. As is, the argument is not clear or compelling for how/why SBPs could improve STEM student success outcomes.
    Response 1: We agree with this comment. We have revised and added context in the introduction section. In the last paragraph, where we explain what this review does (and does not) aim to do, we added the following: “We focused solely on quantitatively measured academic outcomes for two main reasons. First, this provides a solid foundation for organizing existing research using comparable indicators relevant to SBP evaluation and HEI accountability (ref). Second, there is significant diversity and little consensus on SBP implementation and its effects on students, particularly regarding non-academic outcomes (e.g., programs with role-modeling components influencing STEM identity). By emphasizing academic results, we also align with a consensus established by prior systematic reviews of SBP effectiveness (Bradford et al., 2021; Barnett et al., 2012)”. [Line 216]

 

  • Comment 2: (Introduction) In the introduction, it is not clear that the review is focused on SBP studies that quantitatively measured SBP outcomes. This should be clearly stated upfront as part of the purpose statement and this decision should be explained.
    Response 2: Agree. We believe this comment is addressed by the revision described in Response 1.

  • Comment 3: (Introduction) In the introduction, it is not clear whether the authors are talking about SBPs in general or STEM-specific SBPs. This should be clarified, and STEM-specific should be emphasized.
    Response 3: Agree. All the examples of different implementations are for STEM-specific SBPs. We have added the following sentence in Section 1.1 when discussing different implementations: “Through examples of different **STEM-specific SBP implementations**, we show how different implementations translate into distinct methodological considerations when evaluating SBPs” [Line 96].

  • Comment 4: (Introduction) I agree that it is helpful to understand and map the methodological design/considerations of the studies, as the authors did in this study. I understand that evaluating the effectiveness of the SBPs was not a goal of this scoping review. However, I believe the manuscript would benefit from evaluating the effectiveness of the SBPs. Without knowing the effectiveness of the SBPs that were reviewed, it is difficult to fully judge how future research can build upon the existing literature.
    Response 4: We appreciate this comment. Since scoping reviews are not intended to evaluate the quality of evidence or the effectiveness of interventions, we cannot explicitly add an evaluation of effectiveness. Nevertheless, we wanted to incorporate the Reviewer’s comments, and we have therefore added the following to the discussion section to contrast the findings of the review with the existing literature, specifically with Bradford et al., 2021: “Although the aim of scoping reviews is to map the current body of evidence rather than to evaluate the quality of the studies or the effects of the interventions, we note that the five studies with the highest methodological rigor, the two experimental designs and three observational designs with matching techniques to balance the participants' background variables, are quite diverse in terms of implementation and effectiveness. For instance, two programs focused on Engineering design; one reported positive effects on retention and academic performance in courses, especially among women (Jackson et al., 2024), while the other found no impact on GPA (Kornreich-Leshem et al., 2015). Similarly, among two experimental studies on online SBPs, one reported no improvement in math skills (Chingos et al., 2017), whereas the other found positive effects on chemistry performance and persistence (Dockter et al., 2017). The meta-analysis conducted by Bradford et al. (2021) found that program participation had a medium-sized effect on first-year overall grade point average and first-year university retention. This review complements Bradford et al. (2021)’s work by including more studies (37 vs 16) and categorizing approaches, allowing us to map which SBPs have been studied in which ways and how their varied implementations may influence outcomes.” [Line 586]

  • Comments 5: (Literature) There are several areas that could use more (or more recent) literature citations. Enhancing the references will strengthen the study and better ground it in the literature.
    For instance, section 1.1 and its subsections contain a lot of literature that is older than 10 years. This should be reviewed and updated.
    The authors could add more references to Section 1.1.3 (Differences in academic content), as this section only contains one citation.
    Response 5: Thank you for pointing this out. We have added more than 10 references from the last 10 years to different sections, especially the introduction and discussion sections. In particular, we have added seven references to Section 1.1.3.
  • Comments 6: (Literature) The authors should cite articles more consistently throughout the Results section. There are some paragraphs and sections that have few/no references to articles and therefore lack supporting evidence for the results (e.g., section 3.3.2).
    Response 6: Agree. We have revised the reference citations included in the scoping review. We carefully checked that all the claims are supported by evidence.

  • Comments 7: (Figures and Tables) Figure 1 is a helpful visual and provides clear justification for the literature selection process. The box that shows “Reports Excluded” is cut off at the bottom and should be resized. Figure 1 could also be moved to the Methods section so that the process is more clearly visualized when it is being described in the text.
    Response 7: Agree. The “Reports Excluded” box was not clearly showing the information. Thank you for pointing this out. We have resized the “Reports Excluded” box to include all reasons for exclusion. However, we prefer to keep Figure 1 in the Results section, as the PRISMA checklist explicitly requires a “Selection of sources of evidence” subsection in the Results section, along with the corresponding PRISMA flowchart. If the Editor finds it is best to move it to the Methods section, we will happily do so.

  • Comment 8: (Figures and Tables) Table 4 should be removed as the percentages do not add any new or essential information (the “reason” column repeats what is provided in Table 3, and the number of excluded studies is already included in Figure 1).
    Response 8: Agree. We have deleted Table 4 and referred to Figure 1 in the text.

  • Comment 9: (Figures and Tables) Some program names are not provided in Table 5 (the list of all 33 SBPs). For consistency, it would be helpful to denote “not mentioned” or that a pseudonym was used (if applicable).
    Response 9: Thank you for pointing this out. We have added “Not mentioned” to each program row where the name or pseudonym was not available for retrieval.

  • Comment 10: (Figures and Tables) I believe some of the Tables are mis-numbered in the text (e.g., Line 365, and Line 504).
    Response 10: Agree. We have reviewed the references to the tables in the text (considering the deletion of Table 4 in response to Comment 8).

  • Comment 11: (Writing, Flow, and Line Suggestions) The title references STEM careers, and STEM careers are mentioned as one of the criteria as well. I wonder if the authors meant STEM majors…SBPs that were intended for STEM majors seems like a more accurate fit.
    Response 11: Agree. We have adapted the manuscript to refer to STEM majors (including in the title and figures).

  • Comment 12: (Writing, Flow, and Line Suggestions) The authors use the phrase “equity-deserving groups” a few times in the manuscript, which is not a typical word choice. The decision to use “equity-deserving groups” over other terms (e.g., underrepresented, marginalized, historically excluded) should be explained, or it should be replaced.
    Response 12: Agree. This term is widely used in the country where the authors work, and we were not aware that this term is not very typical. Thank you very much for raising this issue. We have adopted the terms “marginalized” and “historically excluded” groups.

  • Comment 13: (Writing, Flow, and Line Suggestions) When referring to a subsample of the articles, lowercase n should be used rather than uppercase N.
    Response 13: Agree. Thank you for pointing this out. We have changed the uppercase N to lowercase n when referring to a subsample of the articles. It is now also consistent across figures.

  • Comment 14: (Writing, Flow, and Line Suggestions) In the Results section, authors should check that verbs are past tense rather than present tense.
    Response 14: We have revised the Results section so that the verbs are in the past tense. Thank you for this comment.

  • Comment 15: (Discussion, Conclusions, and Implications) A major area that could be strengthened is the discussion section, as it needs to more clearly convey the unique contributions of this study to the greater body of research. How does the study build on, compare to, and/or confirm prior research, which could include Ashley et al. (2017) and Bradford et al. (2021)? The discussion currently reads more like an implications section rather than a discussion and interpretation of the study’s results. The discussion of the results should be separate from the implications. The policy implications (starting on line 551) seem more like implications for practice or implications for higher education institutions. Policy implications give the impression that state or federal policy is involved, but that does not seem to be the authors’ intention when they refer to policy implications.
    Response 15: Thank you for this thoughtful comment. We have re-organized the discussion section, also following Reviewer 2’s comments. We have separated the implications from the discussion and conclusions. In addition, we have revised the policy implications to refer to state or federal policy rather than HEI policies.
    Based on the reviewers’ suggestions and the PRISMA Extension for Scoping Reviews (PRISMA-ScR), we organized the Discussion section as follows:
    • Summary of evidence: Summarize the main results (including an overview of concepts, themes, and types of evidence available), link to the review questions and objectives, and consider the relevance to key groups.
    • Limitations: Discuss the limitations of the scoping review process.
    • Conclusions: Provide a general interpretation of the results with respect to the review questions and objectives, as well as potential implications and/or next steps

      The suggestions can be found in: https://www.acpjournals.org/doi/10.7326/M18-0850

Reviewer 2 Report

Comments and Suggestions for Authors

N/A

Comments for author File: Comments.pdf

Comments on the Quality of English Language

The English language needs minor revision to improve clarity. 

Author Response

  • Comment 1: (Abstract) Clarify the research methods used and the research results
    Response 1: Agree. We added to the abstract that we followed the PRISMA framework as our methodological approach and clarified that the review identified only quantitative outcomes. Due to the 200-word limit, we were unable to include additional methodological details.
  • Comment 2: Introduction to Discussion, use the latest and most credible references
    Response 2: Consistent with Reviewer 1’s comments, we have added more recent references. In total, we added 14 new references, most of them from the last 5 years, especially in the introduction and discussion sections.

  • Comment 3: (Line 135) GPA is not spelled out at its first occurrence
    Response 3: We have addressed this comment by spelling out ‘Grade Point Average (GPA)’ at its first occurrence in the manuscript.

  • Comment 4: (Line 135) SAT is not spelled out at its first occurrence
    Response 4: We have addressed this comment by spelling out ‘Scholastic Assessment Test (SAT)’ at its first occurrence in the manuscript.

  • Comment 5: (Materials and Methods section) Explain the stages of the research method used along with the reason
    Response 5: Thank you for this. We have added the following paragraph: “To conduct this review, we followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) guidelines to establish the search protocol and criteria for selecting the relevant studies (Tricco et al., 2018). Specifically, we organized the review into the standard stages of (i) search and identification of records, (ii) eligibility definition using the Population, Intervention, Comparison, and Outcome (PICO) framework from the National Institute for Health and Clinical Excellence (NICE, 2009) to define inclusion/exclusion criteria, (iii) title and abstract screening and duplicate removal, (iv) data extraction, and (v) data synthesis of the studies included in the review (Tricco et al., 2018).” [Line 234]

  • Comment 6: (Search subsection) Give reasons why you did not limit the year in using/collecting literature from leading journals and others for the study
    Response 6: Thank you, we have added the following explanation: “Because we wanted to map the full range of methodological approaches, we chose not to apply a time restriction to publication dates” [Line 248].

  • Comment 7, 9, 10, 11, 12, 16: Each title of a table/figure is not ended with a period or other punctuation mark
    Response 7, 9, 10, 11, 12, 16: Thank you for pointing this out. We have revised all table and figure titles to align with the Journal’s format.

  • Comment 8: (Line 233) PICO is not spelled out at its first occurrence
    Response 8: Thank you for this comment. We carefully reviewed the manuscript, and the first occurrence of PICO is above the selected location, where it is spelled out: “Furthermore, we followed the Population, Intervention, Comparison, and Outcome (PICO)”.

  • Comment 13: (Table 5) Table 5 should be further simplified
    Response 13: We appreciate this comment. We have simplified Table 5 by combining the participants’ major and subject taught columns to reduce redundancy. We also simplified the eligibility and selection criteria for the programs.

  • Comment 14: (Line 400) DFWI should be spelled out at its first occurrence
    Response 14: We have made the change accordingly.

  • Comment 15: (Line 465) ACT should be spelled out at its first occurrence
    Response 15: We have made the change accordingly.

  • Comment 17: (Conclusions). Conclusions contain summaries of the findings in accordance with the research objectives and their implications without any references
    Response 17: Thank you for this thoughtful comment. This comment is in alignment with Reviewer 1’s comments. Based on the reviewers’ suggestions and the PRISMA Extension for Scoping Reviews (PRISMA-ScR), we organized the Discussion section as follows:
    • Summary of evidence: Summarize the main results (including an overview of concepts, themes, and types of evidence available), link to the review questions and objectives, and consider the relevance to key groups.
    • Limitations: Discuss the limitations of the scoping review process.
    • Conclusions: Provide a general interpretation of the results with respect to the review questions and objectives, as well as potential implications and/or next steps

      The suggestions can be found in: https://www.acpjournals.org/doi/10.7326/M18-0850

 

Round 2

Reviewer 1 Report

Comments and Suggestions for Authors

I appreciate the authors' revisions and believe the manuscript is much stronger. I do not have any additional major revisions to offer but will note the following:

There is a typo in line 131. I suggest the authors conduct a final read-through to fix other minor typos in the manuscript.

There is a missing reference in line 220.

Author Response

Comment 1: There is a typo in line 131. I suggest the authors conduct a final read-through to fix other minor typos in the manuscript.

Response 1: Thank you for this thoughtful comment. We have reviewed the document and fixed some minor typos (e.g., "framwork" in the abstract). Also, we have reworded some sentences to improve clarity. For example, "Finally, little attention has been given to program duration and the time allocated to specific activities."   

Comment 2: There is a missing reference in line 220.

Response 2: Thank you for this comment. We have added the two references that were missing. We also reviewed the list of references in the final document.
