Peer-Review Record

Integrating Ecological Forecasting into Undergraduate Ecology Curricula with an R Shiny Application-Based Teaching Module

Forecasting 2022, 4(3), 604-633; https://doi.org/10.3390/forecast4030033
by Tadhg N. Moore 1,2,*, R. Quinn Thomas 1,2, Whitney M. Woelmer 1 and Cayelan C. Carey 1
Reviewer 1: Anonymous
Reviewer 2:
Reviewer 3: Anonymous
Submission received: 26 April 2022 / Revised: 23 June 2022 / Accepted: 27 June 2022 / Published: 30 June 2022
(This article belongs to the Collection Near-Term Ecological Forecasting)

Round 1

Reviewer 1 Report

Moore et al. describe an impressively detailed teaching module on ecological forecasting, freely available to undergraduate and graduate ecology educators. For full disclosure, I was one of the instructors who piloted their module in the classroom, and I found their description accurate and useful for educators considering its use. They provide clear and compelling evidence of the module’s effectiveness at teaching the principles of ecological forecasting and complex modeling in an approachable, interactive format. I found the manuscript to be well-written and highly valuable for publication in Forecasting.

I have provided a PDF with minor wording and phrasing comments and broader comments below to further improve the manuscript.

Broad comments:

1. The framing of the research questions in the abstract and the introduction included phrasing that could suggest the researchers were testing the efficacy of the module delivery method (an R Shiny application or interactive software application). While providing this interface for public use is certainly notable, it is not a feature tested by their pre- and post-module assessment of ecological forecasting principles. I suggest separating the statements on the research questions related to student learning from the features of the R Shiny app. The collected data only speak to student learning of the concepts, not the merits of the platform. I do encourage the authors to highlight the features of the R Shiny app and, importantly, to include more details on the effort and maintenance required to build modules of this type.

2. You mention that the module applies the 4DEE framework, but I was unclear where this module would fit in the progression of a typical undergraduate ecology course. What fundamental concepts do students need to know first? Does your module provide a bridge between community and ecosystem ecology concepts?

3. A few more details on implementation in the introduction might be helpful to your readers. What are the recommended class times for using your module, and what are potential variations? Did instructors most commonly use a single 3-hour lab period or multiple shorter class periods to implement your module? This comes up in the discussion, but it would be better to describe the intent up front and then discuss how implementation worked in practice for the instructors who used your module.

4. Section 2.2 on the activities of the module is clearly written with great detail about the models that students explore. While Figure 2 is a nice preview of the Shiny App screens, I’d rather have a graphic that has an overview of the progression from Activity A -> B -> C, to provide your reader a higher-level visual view of the module before diving into the details.

5. Section 2.3 should be changed from Accessibility to Availability. In instructional design, accessibility refers to meeting usability standards for persons with disabilities (such as the Web Content Accessibility Guidelines (WCAG), developed by the World Wide Web Consortium (W3C) Web Accessibility Initiative: https://www.w3.org/WAI/standards-guidelines/wcag/). As some educators need to document the accessibility of their teaching materials, it could be helpful to comment on the accessibility of the content for students with disabilities.

6. Figure 4: I’m not convinced the lines between the vertical bars add any understanding to the trends in the stacked bar graphs. If the trends in individual responses are particularly important, a separate slope line graph could be added in the supplement. Figure 4B would be easier to read with the X and Y axes flipped so the category labels wouldn’t need to be on an angle.

7. Table 2 also might be better as a supplement, with the most relevant statistics included in the results text.

8. Figure 5: I would reorder the facets so that undergrad and grad responses are next to each other and the forecast topics are organized in rows. You have undergrad and grad next to each other in Figure 4, and it would help to parallel the same layout in Figure 5. The lines between bars are even less useful here given the reduced height-to-width ratio of these graphs. Same comments for Figure 6.

Comments for author File: Comments.pdf

Author Response

Please see the attachment.

Author Response File: Author Response.docx

Reviewer 2 Report

Integrating ecological forecasting into undergraduate ecology curricula with an R Shiny application-based teaching module


This study should be published; it will be great to have this tool widely available and understood. However, the manuscript needs some work before it is a publishable paper.

My recommendations:

1. The teaching module is based on an interactive online tool. There is a body of literature on the pedagogy and cognitive perspectives on using this type of tool for teaching. My most recent work in this field dates back to 2008, and there was already a lot of literature on the topic at that time, as interactive web tools were growing in use. There are different models of feedback from the system to the student, different models for classifying and understanding student errors, and some interesting comparisons to other modes of teaching. I'm not as up to speed on progress since 2008, but I'm sure there has been some great work done since then (I would think tools like CODAP and Tuva for statistical literacy, or NetLogo for teaching simulations, would be a starting point). This context is needed in order to understand what the ecological forecasting module does pedagogically, how it fits into this context, and where it will go in the future.

2. The study has some limitations; for example, learning outcomes couldn't be compared to a control, so we don't know whether students would have learned just as well from another teaching method. But given those limitations, there are still some things that could be learned from the data. For example, if I'm understanding Figure 4A correctly, there were some students who identified fewer INTEF steps in post versus pre. A similar effect appears in Figure 5. Digging into which steps were lost and why would give useful insights. Similarly, were there steps that students reported that are not part of the cycle, and were these steps lost through the module? The INTEF cycle is only one model, and one that emphasizes an academic-oriented approach. Looking at the responses in this way could either identify other models that students have for iterative forecasting, or at least identify unintended outcomes of the teaching module. In general, digging deeper into the responses, in lieu of a controlled experiment, is one way to get some insights out of this dataset. In addition to showing some interesting results, digging deeper into the data can help inform improvements of the teaching module.

3. There are some methodological explanations that are missing. This applies both to the design of the module (page 6 explained how it worked, but not why the various design decisions were made), and the design of the survey (explained on page 11, in terms of what the survey did, but not how each survey question was aimed toward a specific finding). These are also places that could benefit from context provided by the body of literature on designing educational tools.

4. One of the strengths of the interactive/online approach is the ability to iterate and quantify the effects of those iterations. What did you learn from this study that will help you improve the teaching module? What are the changes that you’ll make to the module to better teach the material as well as to better understand the module as a teaching tool? 

Finally, there are some minor recommendations:

Line 90 - To say that programming is rarely taught to undergraduates probably isn’t accurate any more, at least in the natural sciences.

Lines 122-124 (and elsewhere, including the title, line 537, ...) - The manuscript talks about using R Shiny in undergraduate curricula, etc. As far as I can tell, in this teaching module it is just used as an interactive web platform. There is nothing specific to R Shiny about it. This framing is a little misleading. There isn't really an assessment of R Shiny, but rather of an interactive web tool. The website could have been built in any language. This should be corrected throughout the manuscript.

Lines 381-385 These lines were confusing to me. I’m not sure what to advise, but a rewording would probably be helpful.

Line 497: Be specific about which “computational skills” are grasped.

Line 544: Be specific about which “pitfalls”


Author Response

Please see the attachment.

Author Response File: Author Response.docx

Reviewer 3 Report

Dear editor, thank you for the opportunity to review this paper. In general, the research is congruent with the journal mission and scope. The manuscript can be published. The overall recommendation is accept after minor revision. My only concern is that the article should be shortened and present the conducted research more clearly.

Author Response

Please see the attachment.

Author Response File: Author Response.docx

Round 2

Reviewer 2 Report

Dear Dr. Moore and coauthors,

The changes to the manuscript have addressed the comments in my review for the most part. I have some suggestions for you that I think will help pull the paper together given some of my earlier comments. The paper would be sufficient without these changes, but I think there's still a chance for some improvement.

First, we talked about how the paper doesn't really situate the study in the context of the education literature. It's missing a model of learning. It's hard to go back after the fact and put something like this in, but I have one idea that could help bring out something like a mental model at least. You have the forecast cycle, which is sort of like a mental model that you're trying to teach. From the student surveys, you should be able to construct what the mental model is before and after the module. Before the module, there are probably multiple clusters, with somewhat different versions of the cycle. Afterwards, maybe some convergence. (But also some divergence, which you now write about.) A figure diagramming the forecast cycles that students have in mind, with clustering, before and after the module could really crystallize this. In other words, what does a stick-diagram of Figure 1 look like for each student? You wouldn't plot all 100+ of them in the figure, but pull out the clusters, and show how these mental models converge after the learning process. As the paper is now, I'm left wondering what those forecasting cycles look like in the minds of students before and after the module.

And the second point is a minor one, but worth mentioning. You have said that "R Shiny applications provide an unprecedented level of customization". I agree that R Shiny is handy, but there's nothing new about it. It's a web design framework like a dozen others. The module you've designed is what's new. You could have written it in any web design language. Just as someone learning with NetLogo doesn't notice that there's a Java backend, someone using your module wouldn't notice that the backend programming language is R. I think your emphasis on R Shiny kind of undermines the cool work you've done with the module itself. At least, that is how it comes across to me, as someone who has worked with a dozen programming languages over the decades, including R Shiny.

I'm checking the "minor revisions" box, only because I feel the paper can still be much improved by these small steps, but I won't stand in the way of publication if you choose not to make changes.


Author Response

Please see the attachment.

Author Response File: Author Response.docx
