Article
Peer-Review Record

Learning through Collaborative Data Projects: Engaging Students and Building Rapport

Educ. Sci. 2022, 12(12), 897; https://doi.org/10.3390/educsci12120897
by Matthew T. Pietryka 1,* and Rebecca A. Glazier 2
Reviewer 1:
Reviewer 2: Anonymous
Submission received: 28 September 2022 / Revised: 12 November 2022 / Accepted: 17 November 2022 / Published: 7 December 2022

Round 1

Reviewer 1 Report

The article “Learning Through Collaborative Data Projects: Engaging Students and Building Rapport” presents a novel approach to teaching students research skills and analysis by giving them hands-on experience with data analysis. In contrast to existing approaches, the authors provide students individualized feedback about how their analyses compare to the class in aggregate. The article provides several exemplars and survey evidence of effectiveness.

Given the clever and novel pedagogical approach outlined, the compelling evidence of its success, and clear recommendations on implementation, I recommend this article be published.

There are two critical challenges in teaching that this article addresses effectively. The first is how to get students to meaningfully engage with course material and theories, particularly data-driven theories, without allowing the math to overwhelm them. The authors outline an effective method here, breaking down data analyses and theories into simplified pieces that students can interact with in a reasonable and straightforward way. Essentially, this approach gives students a chance for hands-on data analysis/coding without getting lost in the weeds and details that can keep them from understanding the larger theoretical point. By emphasizing enjoyment and its practical benefits, the paper provides very effective justification and a welcome reminder that what’s fun isn’t frivolous, but purposeful.

The second problem it addresses is scalability. Faculty are increasingly facing larger class sizes and more demands on their time with fewer resources. That means that methods of student engagement are increasingly difficult to implement as class sizes grow. By giving students personalized feedback efficiently, and showing them how they compare to their classmates, the approach allows for the kinds of connections that matter, even in a class of several hundred students. As noted by Ishiyama (2013) in European Political Science, much active learning research has targeted small classes under 40, and this has proved to be a critical limit on effectiveness. Developing a strategy for large classes (along with online classes) is deeply important. It’s worth noting that, in some ways, this article creates a low-resource-cost version of what Gordon, Barnes, and Martin (2009, Journal of Criminal Justice Education) call for: ensuring targeted student interactions in large courses.

I have one minor suggestion I would recommend the article consider. The article might want to address the ideas of undergraduate research and math anxiety a touch more. These projects feel like excellent gateways to undergraduate research, a high-impact practice (Ishiyama and Breunig 2008, Politics and Policy), and it might be useful to add a brief discussion through this lens. I suspect that this process might also be effective in helping social science students combat math anxiety by breaking things down into component parts. Attaching this literature to the enjoyment section might be useful, if space allows.

Minor note: there appears to be an issue with formatting, as figure references have the following error: “Error! Reference source not found”

Author Response

Reviewer 1 suggests a number of additional citations to engage a broader teaching and learning literature, including on scalability and research/math anxiety. We appreciate the opportunity to engage with the recommended pieces and have included them (and two additional citations) in the revised manuscript.

    1. On page 3: “Indeed, most of the studies on techniques that improve student engagement, like simulations and undergraduate research, are done on classes with 40 students or fewer (Ishiyama 2013). Here, we propose a resource-effective and scalable method for engaging students, with the added benefit of also building rapport with them.”
    2. On page 13: “The literature indicates that participating in research is a high-impact practice that can have long-lasting positive implications for students (Ishiyama and Breuning 2003). Contributing to these in-class data projects can introduce students to research in a supportive environment where research and math anxiety may be less likely to be triggered (Papanastasiou and Zembylas 2008, Leiter 2022).” 
    3. On page 14: “The usefulness of this approach for large classes is a real comparative advantage. Most engagement and rapport-building measures require a significant investment of personal time by the instructor and/or teaching assistants, making them difficult to apply to large classes (e.g., Gordon, Barnes, and Martin 2009). On the contrary, our approach works even better in larger classes, with no additional effort on the part of the instructor.”

 

Both Reviewers 1 and 2 noted a formatting error by which embedded code we had used to denote references to the figures had failed to transfer over to the Education Sciences article format. We have removed that code and hard coded the references to the figures in the revised manuscript.

Reviewer 2 Report

This article is focused on promoting engagement through ‘Collaborative Data Projects’ and rapport. The content of the article is really interesting. However, it is a little difficult to read. Simply changing the structure of the article and moving content to the proper sections would make it easier to read.

Format issues (structure of the article).- ‘Education Sciences’ does not enforce strict formatting requirements on articles. However, manuscripts should contain at least these sections: Introduction, Materials & Methods, Results, Conclusions. The article does not include an ‘Introduction’ section.

Comments and suggestions

Minor issues

Line 18.- ‘more invested in their learning’.

In various lines of the text (e.g. 412, 415, 419, …), ‘Error! Reference source not found’ appears. What does it mean? A reference not included in the text?

--

Introduction -> this section should be included … assuming that most of the first lines (from line 21 to line 108?) are going to be part of the ‘Introduction’ section.

·        Line 37 to line 49.- In this part of the introduction, all these lines reflect results and conclusions… Consider moving these lines to the ‘Discussion’ section (in case you decide to include this section, although it is not mandatory), to the ‘Results’ section (in case you combine results and discussion within a single section), or to the ‘Conclusions’ section.

·        Although the aim of the research can be understood by reading the article, the goal of the research should be clearly stated. The goal should also be clearly stated in the ‘Abstract’.

--

Materials and Methods -> this section should be included …

--

·        There is no information about the surveyed participants (number, gender…) in the text. Only the ‘Abstract’ mentions that 120 students were surveyed; this does not appear in the text of the article.

·        You have included in the article ‘Overcoming Barriers to the Implementation’, analyzing various issues related to the experience, but… What are the limitations of the research study that you have carried out? Are there any future lines of research?

--

References

·        References are not properly cited in the article, according to the instructions found on the ‘Education Sciences’ web page (“In the text, reference numbers should be placed in square brackets [ ], and placed before the punctuation”).

·        ‘Reference’ section: the format should be consistent with the previous paragraph.

·        Some DOIs are missing, e.g.

o   Carini, R.M., Kuh, G.D. & Klein, S.P. Student Engagement and Student Learning: Testing the Linkages. Res High Educ 47, 1–32 (2006). https://doi.org/10.1007/s11162-005-8150-9

o   What’s in a Name? The Importance of Students Perceiving That an Instructor Knows Their Names in a High-Enrollment Biology Classroom. CBE Life Sciences Education 16(1), ar8 (April 2017). https://doi.org/10.1187/cbe.16-08-0265

Author Response

Thank you for the opportunity to revise and resubmit our manuscript “Learning through Collaborative Data Projects: Engaging Students and Building Rapport” to your journal. We very much appreciate the thoughtful comments provided by the reviewers and the editor and we have taken care to address them in the revised version of the manuscript. Our changes are tracked throughout and our responses are detailed below.

 

Reviewer 2 made a number of suggestions about the structure and organization of the article, to make it easier to read and bring it in line with the formatting norms of Education Sciences. We have revised the manuscript in accordance with these suggestions.

    1. We have added a heading for the Introduction on page 1.
    2. We have renamed the Data and Methods section to Materials and Methods on page 8.
    3. We have removed the section previously titled “Barriers to Implementation” and incorporated the information from that section into the Discussion section.
    4. We have removed the word “more” from line 16 to make the sentence clearer, now reading “They also report that receiving individualized feedback increases their interest in the material and makes them feel like the instructor is invested in their learning.”
    5. Reviewer 2 suggests moving 12 lines in the introduction, in which we preview the findings and conclusions of the paper, to the discussion. We would prefer to keep that material in the introduction, so that the reader can get a full picture of the research in this first section, in case they are not able to fully read the whole article. Providing this kind of concise overview in the introduction is a regular practice in the field of political science.
    6. We have added text to the abstract and page 1 in response to Reviewer 2’s suggestion that the goal of the research be clearly stated. Specifically:
  • The revised abstract states, “To overcome these barriers, we designed a series of collaborative data projects to engage students, even in large online classes. Our goal is to describe and evaluate the efficacy of these projects.” (In order to fit this new text into the abstract while remaining under the 200-word limit, we edited the entire abstract for brevity and clarity.)
  • And we begin the third paragraph on page 1 as follows, “To address these challenges, the goal of this study is to describe and evaluate the effectiveness of a series of collaborative assignments.”
    7. In response to Reviewer 2’s request for additional information about the sample, we have revised the paragraph about response rates on page 9 (the Materials and Methods section) to explicitly state the number of respondents in each dataset: “Each dataset offers distinct advantages. The first dataset is more immediate and offers a higher response rate–virtually all students (N = 69) responded to these questions even though they could opt out of these items with no penalty. Since they are gathered during the data-entry stage of the assignments, however, these evaluations do not reflect students’ reactions to the subsequent discussions and personalized reports based on the aggregated data. The end-of-semester evaluations are more holistic—reflecting all portions of the assignments in relation to the rest of the course material. Yet the response rates to the end-of-semester evaluations tend to be lower, with roughly half of students responding in each class, some of whom have opted out of sharing their evaluations with instructors. The usable response rate to any item in the evaluations was 36% in the Media class (N = 14), 36% in the Social Influence class (N = 14), and 40% in the Research Methods class (N = 37).”
    8. We have also added a footnote (fn 3) on p. 9 that reports the descriptive data available in the evaluations data: “According to self-reports in the student evaluation data, 25% of the students were seniors, 40% were juniors, 29% were sophomores, and 5% were first years. Students tended to report high grade point averages (GPA), with 63% reporting a 3.5 average or better on the standard 4-point scale; 22% reporting a GPA from 3 to 3.49; and the rest reporting a GPA between 2 and 2.99.” Unfortunately, the data do not provide gender or other demographic information, presumably to inhibit instructors from discerning the students’ identities.
    9. We have also directly addressed limitations of the specific studies described on page 9: “Both datasets also share several limitations. In particular, the results come from a single instructor during a single calendar year at a single university. Therefore, we cannot examine how the student responses might vary in other settings. Nonetheless, the classes do vary considerably in size, format, and content, as discussed above. The effective number of responses is also limited because, unlike a national survey, the observations are not independent from one another—one student’s experience is likely to influence other students in the class. Finally, the analysis relies on students’ self-reported learning, rather than objective measures of student learning. Yet self-reported learning is still an essential element of students’ college experience. And our interests focus not only on learning, but also student interest and student-instructor rapport—two outcomes for which self-reports are intrinsically relevant. We therefore turn next to the results.”

 

Both Reviewers 1 and 2 noted a formatting error by which embedded code we had used to denote references to the figures had failed to transfer over to the Education Sciences article format. We have removed that code and hard coded the references to the figures in the revised manuscript.

    1. Reviewer 2 requested that the formatting of the references be changed to fit the journal requirements.
  • We have added DOI numbers to all references.
  • Per the editor’s request, we have checked all citations for relevance, and have added four new citations, in line with Reviewer 1’s recommendations.
  • The reference section is now in the same font and font size as the rest of the manuscript.
  • We submitted the article as “open format” per the journal instructions, but are happy to make changes, at the editor’s request, to the way that the citations are made. Currently, they are cited using in-text citations in author/date format, following the Chicago Manual of Style.

 


Round 2

Reviewer 2 Report

The article presents the use of collaborative data projects to engage students. The revised version has improved the structure of the article.

 

Comments and suggestions

·         Minor format issues

In various lines, ‘Error! Reference source not found’ should be deleted (lines 305, 369, 394, 436).

--

Keywords.- Perhaps too generic (collaboration; data)?

Instead… why not ‘Active learning’, ‘Assessment’, ‘Collaborative Project’, ‘Engagement’?

--

Additional considerations

Limitations of the data set have been included in the article (lines 350 to 360). Consider moving the limitations of the research to the end of the ‘Discussion’ section.

--

Funding Information, Author Contributions, Conflict of Interest and other Ethics Statements should be included in the article.

--

Author Response

We are grateful to Reviewer 2 for the careful attention to our manuscript and the attentive feedback offered in each review. We believe we have addressed each comment, as detailed below.

 

Point 1: In diverse lines … ‘Error! Reference Source not found’ should be deleted (lines 305, 369, 394, 436).

Response 1: We believe we have removed all of these errors; they were not present in the revised manuscript, except in the tracked changes deleting them. These errors occur after submission, and therefore we cannot be certain more will not arise. They seem to come from MS Word’s cross-reference feature, and we believe we have removed all cross-references.

 

Point 2: Keywords. - perhaps too generic (collaboration; data)?  Instead… Why not: ‘Active learning’; ‘Assessment’; ‘Collaborative Project’; ‘Engagement’?

Response 2: Great idea and thank you for the suggestion. We have revised the keywords to: “Active learning; student engagement; collaborative project; rapport-building; feedback”

 

Point 3: Limitations of the data set have been included in the article (lines 350 to 360). Consider moving the limitations of the research to the end of the section ‘Discussion’.

Response 3: We have moved this text to the Discussion section (p. 13) and revised the Discussion section slightly (e.g., adding transition sentences) to integrate it into the narrative.

 

Point 4: Funding Information, Author Contributions, Conflict of Interest and other Ethics Statements should be included in the article.

Response 4: We have added this information after the Conclusion section (p. 14).

 
