Article
Peer-Review Record

An Investigation of Students’ Use of a Computational Science Simulation in an Online High School Physics Class

Educ. Sci. 2019, 9(1), 49; https://doi.org/10.3390/educsci9010049
by Joshua M. Rosenberg * and Michael A. Lawson
Reviewer 1: Anonymous
Reviewer 2:
Reviewer 3: Anonymous
Submission received: 1 February 2019 / Revised: 22 February 2019 / Accepted: 24 February 2019 / Published: 7 March 2019

Round 1

Reviewer 1 Report

This paper presents a qualitative study of an experience carried out with a simulation in a physics class. The approach is correct, and a logical structure is followed through the necessary steps of qualitative research. However, it is necessary to use qualitative data analysis software that provides rigorous analysis, as well as the necessary categorization of user responses.


Therefore, I recommend the authors:


1) Create categories of analysis, grouping the different categories related to each of the research questions.


2) Use qualitative data analysis software such as NVivo, ATLAS.ti, or MAXQDA, and generate a relationship map between the categories of analysis. This would lend rigor to the study and greatly improve its quality and presentation.


Author Response

In order to address both of your recommendations, we added a data analysis section. In this section, we describe in depth the coding frames that we used. We also provide details on how, after initially coding the data separately, the authors met to compare and discuss the data and then develop pattern codes, which allowed us to summarize the data into a smaller number of sets and themes. We believe this addition will help give the study a greater sense of rigor (and greater clarity in its presentation).

Reviewer 2 Report

The paper describes the application and preliminary analysis of a simulation-based lesson plan re: high school thermodynamics. The lesson plan and simulation appear to be well designed to elicit learning of the topic.

The abstract should be re-written. It doesn't frame the paper well, and it doesn't really include any of the key details a browsing reader will want to know re: scope, aims, methodology, key results, main conclusion.

There is a general over-use of punctuation and explanation, in lieu of writing straightforward sentences which are easy to parse. It is not that the usage is incorrect, but in many cases the sentences could be more direct and thus easier to read.

i.e. "These practices, which include developing and using models and analyzing and interpreting data, but also obtaining, evaluating, and communicating information and using mathematics and computational thinking, then, present a rare area of overlap across STEM." => "Skills such as data collection, modelling, communication, and analysis are universally valued across the STEM disciplines".

The literature review section should be re-examined and preferably shortened. There were parts of it that seemed irrelevant when considering the modest research aims of the paper.

The research questions could be simplified as a general "Aims" section. Formalising RQ1-4 is perhaps a useful step for you to undertake when designing the research, but it is not adding much value to a reader here.

The methodology is not laid out as cleanly as I would like. A more straightforward timeline explanation of what happened at each step would help. i.e. at what points did you ask the questions you asked? How did students provide these responses?

To summarise, the main issue I take with this paper is that it is overly long and over-complicates the explanation of things that don't need to be complicated. At the end of the day, the authors themselves acknowledge it to be a small and isolated study with few generalisable outcomes - so it shouldn't need so much window dressing. I'd aim to cut 2 pages to start, and would wager the paper would be improved by doing so. I'd also re-pitch it from the outset: using words like "preliminary analysis" or "pilot study" in the abstract and introduction would make the reader more receptive to what is ultimately a small qualitative analysis of 13 students who did 3 short-form exercises.


Minor details:

Reference error [3333]

The use of the word veridical is ... questionable. This word is not in common usage, and can easily be replaced with the common word "truthful". I would note that science communication in general needs to move away from "scholarly" jargon and work to make our research more accessible. I would also note that a better expression in this particular instance is actually "realistic".

209: two processes bi-directionally influence => two process WHICH bi-directionally influence

222: The goal of the study was twofold. Fist, => two-fold. First.

235: This paragraph doesn't appear to have a useful function. It should either be replaced or removed.

245: (not online) is redundant.

355: f you look

466: investigate a Some other

539: on their one => on their own


Author Response

Thank you for your feedback on our manuscript. Your comments helped us to revise and improve it. We have carefully considered and responded to each of your comments (below).


The abstract should be re-written. It doesn't frame the paper well, and it doesn't really include any of the key details a browsing reader will want to know re: scope, aims, methodology, key results, main conclusion.


We re-wrote the abstract in order to better describe the study in terms of its scope, aims, methodology, key results, and the main conclusion.


There is a general over-use of punctuation and explanation, in lieu of writing straightforward sentences which are easy to parse. It is not that the usage is incorrect, but in many cases the sentences could be more direct and thus easier to read.


We edited the entire manuscript in order to make the language more direct and easier to read.


The literature review section should be re-examined and preferably shortened. There were parts of it that seemed irrelevant when considering the modest research aims of the paper.


In order to address this helpful comment, we carefully revised the literature review in light of the aims of the paper. This led to significant reductions (and revisions) to this section. In particular, we deleted two paragraphs introducing simulations in the Models and Simulations in Science Education section, two paragraphs introducing work with data (in general) and some recent, but unrelated, studies from the Work with Data in the Context of Simulations section, and one paragraph introducing remote and virtual laboratories from the Using Simulations in Online Science Classes section.


The research questions could be simplified as a general "Aims" section. Formalising RQ1-4 is perhaps a useful step for you to undertake when designing the research, but it is not adding much value to a reader here.


In order to respond to this helpful comment, we revised the research questions. We deleted one question and made the first and second research questions more general. We also made the third research question more closely related to the need, identified in past research, for modeling variability in data.


The methodology is not laid out as cleanly as I would like. A more straightforward timeline explanation of what happened at each step would help. i.e. at what points did you ask the questions you asked? How did students provide these responses?


We added to Table 1 the lesson each embedded assessment question came from. This provides information on when students were asked these questions. We also added information about how students provided these responses: via short-response questions embedded in Google Docs that students downloaded, entered their responses into, and then submitted to the course instructor.


I'd aim to cut 2 pages to start, and would wager the paper would be improved by doing so. I'd also re-pitch it from the outset: using words like "preliminary analysis" or "pilot study" in the abstract and introduction would make the reader more receptive to what is ultimately a small qualitative analysis of 13 students who did 3 short-form exercises.


As noted earlier, we tightened the literature review to focus better on our aims. In order to re-pitch the paper with the aim of making readers more receptive to its claims, we added a statement that the paper reports on an initial attempt to use a simulation in an online science class. We also added this note to the purpose statement at the end of the introduction section.


Minor details (multiple suggested edits)


We addressed each of these comments in our revisions.

Reviewer 3 Report

Firstly, congratulations on a very well-written paper; it has been well edited and is easy to read. When I first started reading, I wondered if I had slipped back into the 1990s world of multimedia science simulations, but then I noted the online component. Even so, I have wondered how original the research is beyond its particular case study. The key to its originality would be the limitations of an online environment (such as Blackboard or Moodle) for creating simulations, rather than the use of simulations in science. Perhaps the authors could also provide pointers for future research in the area, such as emerging technologies like augmented reality or virtual environments, which are the technologies students are exposed to beyond the classroom.


Author Response

Firstly, congratulations on a very well-written paper; it has been well edited and is easy to read. When I first started reading, I wondered if I had slipped back into the 1990s world of multimedia science simulations, but then I noted the online component. Even so, I have wondered how original the research is beyond its particular case study. The key to its originality would be the limitations of an online environment (such as Blackboard or Moodle) for creating simulations, rather than the use of simulations in science.


We addressed this helpful comment through the addition of a paragraph (lines 548-554) on the challenges (and opportunities) of using simulations in online classes:


This past research suggests that there are some benefits and affordances to students' use of simulations in online post-secondary science courses. However, the body of research on simulations in K-12 science classes tends to focus on face-to-face learning environments rather than the growing field of online K-12 science classes. Particularly, as recent reform efforts emphasize K-12 students' engagement in STEM practices [XX], and given the challenge of creating, modifying, and analyzing data generated from simulations in online classes, it is important that we know more about how students use simulations and model the data they generate in online learning environments.


We also added a note on the Learning Management System in section 4.3 on the Participants and Context.


Perhaps the authors could also provide pointers for future research in the area, such as emerging technologies like augmented reality or virtual environments, which are the technologies students are exposed to beyond the classroom.


In order to address this comment, we added a paragraph (lines 1702-1707) on how virtual reality tools can be used to observe phenomena, and potentially as a context for creating and modeling data. We also added a note on tools for collaborating around data analysis (lines 1693-1694).
