Article
Peer-Review Record

Usable STEM: Student Outcomes in Science and Engineering Associated with the Iterative Science and Engineering Instructional Model

Educ. Sci. 2024, 14(11), 1255; https://doi.org/10.3390/educsci14111255
by Nancy B. Songer *, Julia E. Calabrese, Holly Cordner and Daniel Aina
Reviewer 1: Anonymous
Reviewer 2: Anonymous
Submission received: 15 September 2024 / Revised: 18 October 2024 / Accepted: 13 November 2024 / Published: 16 November 2024
(This article belongs to the Special Issue Advancing Science Learning through Design-Based Learning)

Round 1

Reviewer 1 Report

Comments and Suggestions for Authors

Thank you for the opportunity to review this manuscript, which introduces a novel instructional approach for middle school students that integrates cycles of scientific investigation and engineering design as applied to a local environmental issue. The authors make a strong case for the importance of integrating science and engineering in real-world problem-solving and demonstrate the lack of attention to such integration in formal K-12 educational settings.

 

The authors' model and research question/hypothesis are clear and hold the potential for important contributions to the field. The intervention they present is an exciting innovation and is well-described. Overall, the manuscript is very well-written, and I enjoyed learning about the authors’ work.

 

I can offer a few suggestions to strengthen the manuscript, focusing on the study design, methods, and supporting citations to the literature.

 

The authors employ design-based research (DBR) as the methodology for their study, yet do not introduce or describe DBR for readers or explain the authors’ rationale for using it in this work. I encourage the authors to provide background about DBR and their decision to employ it, as well as the decisions that informed their study design—citing relevant literature to support their choices.

 

I would also encourage the authors to provide additional detail about the study design itself. A diagram or chart may be helpful to illustrate the DBR cycles and the samples and contexts for each cycle. I recommend the authors add information about how the schools/classes were recruited, selected, and assigned to a study condition. It might also be helpful to include gender information in the demographics provided. Finally, I would appreciate clarification about why the authors used a comparison group for Cycle 1 but did not appear to do so for Cycles 2 & 3.

 

In terms of methods, I encourage the authors to provide additional detail about the assessment that was used, including more information about the items and scoring and what literature informed its development and/or testing. It would also be helpful to include the assessment instrument and scoring rubric as figures (similar to Figure 3, which is very helpful) or in an appendix. Finally, while the authors provide clear, detailed descriptions of the modifications made to the instructional activity, there was minimal explanation of how data were gathered from teachers and students and how those data were analyzed to identify needed improvements. I encourage the authors to elaborate on their research methods and support their choices with relevant citations.

 

I would welcome the opportunity to review a revised version of this important manuscript.

Author Response

Responses to Reviewer 1 Comments are provided in red text.

 

Manuscript title: Usable STEM: Student Outcomes in Science and Engineering Associated with the Iterative Science and Engineering Instructional Model

 

Paper submitted to Education Sciences, Special Issue on Advancing Science Learning Through Design-Based Learning

 

 

The authors employ design-based research (DBR) as the methodology for their study, yet do not introduce or describe DBR for readers or explain the authors’ rationale for using it in this work. I encourage the authors to provide background about DBR and their decision to employ it, as well as the decisions that informed their study design—citing relevant literature to support their choices.

 

Agreed. This section of Materials and Methods has been expanded considerably (see lines 171-191 and Figure 3).

 

I would also encourage the authors to provide additional detail about the study design itself. A diagram or chart may be helpful to illustrate the DBR cycles and the samples and contexts for each cycle.

 

Agreed. We have added a new Figure 3 to help explain the steps of the DBR cycles.

 

I recommend the authors add information about how the schools/classes were recruited, selected, and assigned to a study condition. It might also be helpful to include gender information in the demographics provided. Finally, I would appreciate clarification about why the authors used a comparison group for Cycle 1 but did not appear to do so for Cycles 2 & 3.

 

Agreed. We expanded the description of recruitment and Cycle 1 ISE and Comparison groups (see lines 213-225).

 

In terms of methods, I encourage the authors to provide additional detail about the assessment that was used, including more information about the items and scoring and what literature informed its development and/or testing. It would also be helpful to include the assessment instrument and scoring rubric as figures (similar to Figure 3, which is very helpful) or in an appendix.

 

Agreed. We added more details about the assessment instrument, including information about the literature that informed its development and testing (see lines 188-191). We also included the pre-post assessment instrument in the appendix.

 

Finally, while the authors provide clear, detailed descriptions of the modifications made to the instructional activity, there was minimal explanation of how data were gathered from teachers and students and how those data were analyzed to identify needed improvements. I encourage the authors to elaborate on their research methods and support their choices with relevant citations.

 

Agreed. We have added more detail and descriptions of the modifications made to the instructional activity and data collected associated with Cycles 1, 2, and 3. In addition, we have modified Table 1 to show the changes to the curricular materials associated with each research cycle.

 

 

Reviewer 2 Report

Comments and Suggestions for Authors

This interesting manuscript discusses experiential iterative STEM learning in a middle school/junior high population over three successive offerings. Importantly, it offers a way to expand students' horizons with regard to the breadth of STEM careers available across interests in applications, as well as a chance to test solution designs and perform scientific experiments.

I commend the authors for a very well-written and focused manuscript that is also quite relevant for adaptation and implementation across age groups. The literature review is particularly well-written and demonstrates clear familiarity and scholarship within pedagogical theory. This manuscript would benefit from several revisions for clarity and methodological completeness, as well as a more thorough discussion of findings and recommendations for implementation to other researchers/educators. I believe that with these revisions it would be a useful and engaging contribution to the field.

 

Methods:

-Provide a statement of ethics review or lack thereof, as well as information on student/guardian opt-in or opt-out of educational activities, ethnic self-identification, and testing scores.

-Refer to the "Comparison" group as "Control", or more preferably "Didactic Control" or similar.

-I cannot find how Engineering Design was administered formally, nor how it was evaluated, aside from the new changes discussed in Cycle 3.

-I would suggest changing the "Cycle" nomenclature to "Offering" or similar. The use of "cycle" becomes less intuitive when including the iterative learning model, which I understand is iterative within a single "cycle".

-Describe how classes were chosen for control (see above) vs. interventional treatment arms. Were there any demographic differences between these populations?

-What aspects of the modules were instructor-dependent, if any? 

Discussion:

-Clearly, in each "cycle"/offering the researchers refined their curricula; however, the indications for these changes are not always clear to the reader.

-Similarly, several possibly unexpected findings are not discussed in context. Why do the authors think that Class 2B did not improve in Scientific Argument when 2A showed significant improvement? Was it purely because Class 2B had higher objective pre-test scores and it was difficult to improve? The 2-year discrepancy between classes does not immediately explain these differences, as classes 3A and 3B had a similar delta but showed near identical improvements in scores.

-Finally, as above, it is clear that the researchers refined their approach over their three offerings. It would be beneficial in the discussion/conclusion to offer final recommendations for other researchers/educators seeking to implement a high-value comparable module (e.g., what worked well and what did not work as well as hoped).

 

Other:

-I would recommend a single Results and Discussion section in this manuscript. There is interpretation spread throughout the Results section, and I think it would be counter-productive based on a largely subjective analysis (i.e., using a non-validated evaluation tool) to fully and properly separate these sections.

-Reword "Pre-post data" to be more specific.

-Include assessment tools (pre-/post-tests) as supplemental items so that readers can assess validity and reliability.

-A statement on conflict of interest would be necessary for this paper, as well as if researchers were involved directly in educational activities. This information may be provided already but not in my blinded version.

Author Response

Responses to Reviewer 2 Comments are provided in red text.

 

Manuscript title: Usable STEM: Student Outcomes in Science and Engineering Associated with the Iterative Science and Engineering Instructional Model

 

Paper submitted to Education Sciences, Special Issue on Advancing Science Learning Through Design-Based Learning

 

 

Methods:

-Provide a statement of ethics review or lack thereof, as well as information on student/guardian opt-in or opt-out of educational activities, ethnic self-identification, and testing scores.

Agreed. See lines 209-211 for this information.

-Refer to the "Comparison" group as "Control", or more preferably "Didactic Control" or similar.

Please see lines 201-208 where we provide more information on the comparison and ISE groups in Cycle 1. Please also see lines 153-164 where we provide more information on why a control group is not appropriate or possible in Design-Based Research.

-I cannot find how Engineering Design was administered formally, nor how it was evaluated, beside the new changes discussed in Cycle 3.

Agreed. We are very appreciative of this suggestion, as it has led to a much stronger explanation of how engineering was under-developed in Cycles 1 and 2. We have added more detail and descriptions of the modifications made to the instructional activity and the data collected associated with Cycles 1, 2, and 3, with particular emphasis on what we changed in the Cycle 3 version of the unit. In addition, we have modified Table 1 to show the changes to the curricular materials associated with each research cycle.

-I would suggest changing the "Cycle" nomenclature to "Offering" or similar. The use of "cycle" becomes less intuitive when including the iterative learning model, which I understand is iterative within a single "cycle".

As outlined in lines 153-164, in Design-Based Research the set of four steps associated with each offering of the research is called a Cycle. Therefore, we have kept this term the same.

-Describe how classes were chosen for control (see above) vs. interventional treatment arms. Were there any demographic differences between these populations?

Agreed. We expanded the description of Cycle 1 ISE and Comparison groups (see lines 228-230 and 248-256).

-What aspects of the modules were instructor-dependent, if any? 

Design-based research does not claim or intend to distinguish instructor-dependent aspects of the modules. As mentioned in the expanded sections on Design-Based Research (lines 153-164), this approach intentionally adjusts features of the learning environment, the work of students, and the roles of teachers in minor ways in each Cycle, depending on the information and principles drawn from the previous cycle.

Discussion:

-Clearly in each "cycle"/offering the researchers refined their curricula, however the indications for these changes are not always clear to the reader. 

Agreed. We have added more detail and descriptions of the modifications made to the instructional activity and data collected associated with Cycles 1, 2, and 3. In addition, we have modified Table 1 to show the changes to the curricular materials associated with each research cycle.

-Similarly, several possibly unexpected findings are not discussed in context. Why do the authors think that Class 2B did not improve in Scientific Argument when 2A showed significant improvement? Was it purely because Class 2B had higher objective pre-test scores and it was difficult to improve? The 2-year discrepancy between classes does not immediately explain these differences, as classes 3A and 3B had a similar delta but showed near identical improvements in scores.

In design-based research, when instructional materials are revised with each Cycle and the student populations are not constant, we are not comfortable making direct comparisons between students or outcomes across research cycles. We have provided more details about the different students and teachers in the Cycle 2 implementation. In addition, we conclude the paper with recommendations for other researchers.

-Finally, as above, it is clear that researchers refined their approach over their three offerings. It would be beneficial in the discussion/conclusion to offer final recommendations for other researchers/educators to implement a high-value comparable module (e.g. what worked well and what did not work as well as hoped)?

Agreed. We have expanded each of the Cycle 1, 2, and 3 sections and added Table 1 to help articulate the changes made in each Cycle. In addition, we added recommendations in the conclusion section that are intended to be usable by other researchers and educators.

Other:

-I would recommend a single Results and Discussion section in this manuscript. There is interpretation spread throughout the Results section, and I think it would be counter-productive based on a largely subjective analysis (i.e., using a non-validated evaluation tool) to fully and properly separate these sections.

Agreed. We have combined these sections.

-Reword "Pre-post data" to be more specific

We have clarified this in lines 187-195.

-Include assessment tools (pre-/post-tests) as supplemental items so that readers can assess validity and reliability.

We have provided the identical pretest and posttest in the appendix.

-A statement on conflict of interest would be necessary for this paper, as well as if researchers were involved directly in educational activities. This information may be provided already but not in my blinded version.

We added information in lines 209-211 indicating that all research studies were approved by the university and school district ethics and research committees, and that all participants provided written consent to participate in the research prior to each study.

Round 2

Reviewer 1 Report

Comments and Suggestions for Authors

Thank you for the opportunity to review this manuscript again. I previously found the manuscript made a strong case for the importance of integrating science and engineering in K-12 education and introduced a novel instructional approach for middle school students. I offered a few suggestions to strengthen the manuscript, focusing on the study design, methods, and supporting citations to the literature.

 

I appreciate the authors’ careful attention to addressing my feedback. The expanded Materials and Methods section does an excellent job introducing DBR and providing a rationale for its use in this study. The authors have added citations to relevant literature to support their choices. The authors have also added helpful detail about the assessment that was used and included it as an appendix. They also added detail about how the schools/classes were recruited and selected, clarified why a comparison group was used for Cycle 1, and elaborated on their data collection and analysis methods.

 

The authors have fully addressed my suggestions, and I believe the revised manuscript is ready for publication. I look forward to seeing this article in print and believe it will make important contributions to the field.
