Article
Peer-Review Record

Does Previous Experience with Online Platforms Matter? A Survey about Online Learning across Study Programs

by Åse Nygren, Emil Alégroth, Anna Eriksson and Eva Pettersson
Reviewer 1:
Reviewer 2:
Reviewer 3: Anonymous
Educ. Sci. 2023, 13(2), 181; https://doi.org/10.3390/educsci13020181
Submission received: 29 December 2022 / Revised: 1 February 2023 / Accepted: 2 February 2023 / Published: 8 February 2023
(This article belongs to the Topic Advances in Online and Distance Learning)

Round 1

Reviewer 1 Report

The paper is well-written, clear and focused. I really enjoyed reading it.

I have just some minor comments that could improve the quality.

The introduction and the literature review are pertinent.

However, I would suggest supporting the first paragraph of the introduction with a reference. 

The methodology is clear, but the qualitative method used to analyze the students' comments is not well described. I would like to find not only references about open coding but also some insight into how the method was applied.

Please check the missing Appendix cross-reference at the end of Section 2.4 (page 6, line 265).

Moreover, I would like to read some examples of students' comments within Section 3 (page 11, lines 421-430).

The cluster analysis is straightforward and clear, even though it does not allow specific phenomena to be highlighted. This is not a negative result; it is just a result. Like the authors, I wondered why. I think the authors could consider not only the programs but also how the courses are taught. More precisely, I suspect the lack of difference stems from the fact that lessons are delivered in a teacher-centred way (e.g., frontal lessons) in both ERT and F2F formats. As a consequence, students do not perceive any difference, because the courses share these common characteristics.

I strongly invite the authors to address this aspect by describing the most used practice at BTH.

This leads to the discussion and conclusion, which are very strong when the authors claim that "This result provides valuable insights for teachers engaged in online learning and points towards the fact that there is no need to adjust forms of online teaching to specific groups or study programs. There is, then, no reason to adapt online course structure in relation to different student groups, study programs, educational backgrounds or whether the students study 1st or 2nd cycle." [page 14, lines 480-487].

In my view, this claim is quite dangerous, because it can be misinterpreted as "there is no need to design lessons for online delivery; it is sufficient to migrate what one does in presence".

I strongly invite the authors to rephrase such claims to avoid possible misunderstanding. One way to do that is to consider the nature of the delivered lessons (both online and F2F).

 
Author Response

REVIEWER 1

R1.1. The paper is well-written, clear and focused. I really enjoyed reading it. I have just some minor comments that could improve the quality.

Reply: We are glad to hear that the reviewer enjoyed reading about our work, and we would like to thank you for the excellent comments. We have done our utmost to address each one, as outlined below, and we hope you find the changes satisfactory. For your convenience, we have color-coded all changes pertaining to your comments in the manuscript in red.

 

R1.2. The introduction and the literature review are pertinent. However, I would suggest supporting the first paragraph of the introduction with a reference.

Reply: We agree with the reviewer that the first paragraph could use more references, as it lays the premise for much of the value of this work. Thank you for this observation.

Action: We have added three references to support our introductory paragraph. 

 

R1.3. The methodology is clear, but the qualitative method used to analyze the students' comments is not well described. I would like to find not only references about open coding but also some insight into how the method was applied.

Reply: We appreciate the feedback and agree that the qualitative method used in the study should be described in more detail.

Action: We have described the qualitative method further in Section 2.7, Phase 5: Analysis.

 

R1.4. Please check the missing Appendix cross-reference at the end of Section 2.4 (page 6, line 265).

Reply: We thank the reviewer for bringing this oversight to our attention. Prompted by this issue, we have also gone through all other cross-references in the manuscript (e.g., figures and tables) to ensure that all are correct and in place.

Action: We have added the Appendix, the questionnaire guide, at the end of the document.

 

R1.5. Moreover, I would like to read some examples of students' comments within Section 3 (page 11, lines 421-430).

Reply: We agree with the reviewer that including comments from the students gives a more in-depth understanding of the students' perceptions.

Action: We have added several quotes from students to illustrate the different thematic codes, as well as other student output; please see Section 3.

 

R1.6. The cluster analysis is straightforward and clear, even though it does not allow specific phenomena to be highlighted. This is not a negative result; it is just a result. Like the authors, I wondered why. I think the authors could consider not only the programs but also how the courses are taught. More precisely, I suspect the lack of difference stems from the fact that lessons are delivered in a teacher-centred way (e.g., frontal lessons) in both ERT and F2F formats. As a consequence, students do not perceive any difference, because the courses share these common characteristics. I strongly invite the authors to address this aspect by describing the most used practice at BTH.

Reply: The reviewer raises a relevant point. However, as the sample is taken from all programs at the university, which before, during, and after the pandemic have included a variety of teaching formats, including frontal lessons, PBL, and flipped classroom, we do not perceive this factor to be a leading cause of the result. Instead, we think the heterogeneity of the way teaching is conducted at the university is a more influential cause, coupled with individual student needs for learning, including group belonging and group feedback, as we hypothesize at the end of the manuscript.

Action: We have added a short explanation of the fact that a wide range of teaching methods is used at BTH, before, during, and after the pandemic, which is why such a conclusion cannot be drawn.

 

R1.7. This leads to the discussion and conclusion, which are very strong when the authors claim that "This result provides valuable insights for teachers engaged in online learning and points towards the fact that there is no need to adjust forms of online teaching to specific groups or study programs. There is, then, no reason to adapt online course structure in relation to different student groups, study programs, educational backgrounds or whether the students study 1st or 2nd cycle." [page 14, lines 480-487]. In my view, this claim is quite dangerous, because it can be misinterpreted as "there is no need to design lessons for online delivery; it is sufficient to migrate what one does in presence". I strongly invite the authors to rephrase such claims to avoid possible misunderstanding. One way to do that is to consider the nature of the delivered lessons (both online and F2F).

Reply: Thank you for pointing out that this may be misunderstood, as it is stated quite directly.

Action: We have softened the formulation somewhat to avoid misunderstanding. 



Reviewer 2 Report

Thank you for the opportunity to review this intriguing paper, "Does previous experience with online platforms matter?...". I found it to be a unique and exciting approach to this research question, and I commend the authors for their significant efforts. I found the manuscript to be mostly ready for publication aside from a few questions/suggestions that I will outline below:

 

1. My main confusion is with the way the data were presented on "previous experience with online learning" versus "previous experience with online platforms". I found myself confused throughout the manuscript about whether the authors were talking about student attitudes toward and experience with actual previous online learning, or about their previous experience with online platforms like Facebook and Instagram. Because I didn't have access to the survey items, it was difficult to know which they were referring to. I am going to point out several places where just a little clarification would be incredibly helpful:

a. P. 7, lines 312-314: "differences in experience of the online teaching" – is this referring to experiences with online learning or experiences with online platforms?

b. In Figure 7, the cluster analysis, a little more description in the figure legend would be helpful. I struggled to know whether the outcome variable was "experience and perception with online learning" or "experience with online platforms". But then, in lines 446-447 on p. 13, it is stated that it is based on the answers to "all questions from the questionnaire survey". So then I was really confused.

c. In the Discussion, the first sentence says the main conclusion is that "prior experience with online media platforms" has no impact. But is this really what you measured? I started wondering if you were using students' major or program as a proxy for experience with online platforms. I see in the methods that you did have some questions on the survey that asked about familiarity with online platforms, but I never saw those data specifically highlighted. Again, it probably just needs to be clarified. Perhaps you could include a graph that shows average familiarity with online platforms by program.

d. In the Discussion, lines 517 and 519, you use "previous experience" twice, but again, it isn't clear if this is previous experience with online classes or previous experience with online platforms.

2. Here are a handful of other questions/suggestions:

a. Throughout the paper, the authors include many acronyms in parentheses but then never use the acronyms in the paper. I would just leave these out, e.g., HE, ERT, SRSs, etc.

b. Figure 1 seems entirely unnecessary. Your process is pretty standard and is described in detail in words, so the figure seems superfluous.

c. Please define "first-cycle" and "second-cycle" for those of us who do not traditionally use these terms. Is this equivalent to Freshman and Sophomore or something different?

d. Line 224 – a random question mark is included in the text.

e. Under 2.5. Phase 3: Pilot Survey, please describe who these 226 students were. We need some context for validation.

f. Line 400, "The question are posed as a 10 point Likert-scale and was the last but one of the surveys total of 18 questions." – I have no idea what this sentence means.

g. I was a bit confused by Figure 6. What is the 45% and then the 28% by "Phone"? Your figures could all use a bit more description in the figure legends.

h. Line 436, I don't think I know what "theoretical knowledge transfer" and "more nuanced elements of teaching" mean. Can you provide context?

3. Also, the paper as a whole would benefit from English grammar editing. The Results section, in particular, was riddled with errors.

4. Lastly, a bit of praise:

a. I LOVE part 5 of your survey. What a fantastic idea!

b. The cluster analysis was brilliant!

c. Section 2.8, Threats to validity, is so well written and perfect to include. Well done!

 

Well done! I look forward to reading a revision of the paper.

Author Response

REVIEWER 2

Thank you for the opportunity to review this intriguing paper, "Does previous experience with online platforms matter?...". I found it to be a unique and exciting approach to this research question, and I commend the authors for their significant efforts. I found the manuscript to be mostly ready for publication aside from a few questions/suggestions that I will outline below:

Reply: We thank the reviewer for the excellent comments and are happy that you found the paper exciting and of interest. Below we have replied to the comments and explained the actions we have taken to improve the manuscript based on your feedback. For your convenience, we have color-coded all changes pertaining to your comments in the manuscript in teal.

 

My main confusion is with the way the data were presented on "previous experience with online learning" versus "previous experience with online platforms". I found myself confused throughout the manuscript about whether the authors were talking about student attitudes toward and experience with actual previous online learning, or about their previous experience with online platforms like Facebook and Instagram. Because I didn't have access to the survey items, it was difficult to know which they were referring to. I am going to point out several places where just a little clarification would be incredibly helpful:

Reply: We agree with the reviewer that this could have been clarified. As several of the following comments concern this same issue, we have chosen to take action on those comments rather than make an explicit change here. We hope you will find our changes acceptable.

R2.1. P. 7, lines 312-314: "differences in experience of the online teaching" – is this referring to experiences with online learning or experiences with online platforms?

Reply: Thank you for pointing out this possible confusion.

Action: We have clarified the formulation. The line numbers have shifted somewhat; the change is in Section 2.7, Phase 5: Analysis.

 

R2.2. In Figure 7, the cluster analysis, a little more description in the figure legend would be helpful. I struggled to know whether the outcome variable was "experience and perception with online learning" or "experience with online platforms". But then, in lines 446-447 on p. 13, it is stated that it is based on the answers to "all questions from the questionnaire survey". So then I was really confused.

Reply: We agree that there is some ambiguity to the research goal and thereby also the conclusion.

Action: We have added a clarifying statement about the core research gap in the introduction and a clarifying statement of the underlying hypothesis of the cluster analysis in Section 3.2.
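
For readers unfamiliar with this kind of analysis, clustering students by their full questionnaire responses can be sketched as below. This is a minimal illustration on synthetic data; k-means, the matrix dimensions, and all names are assumptions made for the sketch, not details taken from the manuscript.

```python
# Sketch of a cluster analysis over full questionnaire responses.
# k-means is an assumed choice here; the record does not state which
# clustering algorithm the study used.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Synthetic stand-in: 120 students x 18 Likert-scale (1-10) answers.
answers = rng.integers(1, 11, size=(120, 18)).astype(float)

# Standardize so that no single question dominates the distance metric.
X = StandardScaler().fit_transform(answers)

# Fit k-means; the number of clusters is a free parameter to explore.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print("cluster sizes:", np.bincount(kmeans.labels_))
```

If the resulting clusters do not align with study programs or educational backgrounds, that is consistent with the kind of result the manuscript reports.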

 

R2.3. In the Discussion, the first sentence says the main conclusion is that "prior experience with online media platforms" has no impact. But is this really what you measured? I started wondering if you were using students' major or program as a proxy for experience with online platforms. I see in the methods that you did have some questions on the survey that asked about familiarity with online platforms, but I never saw those data specifically highlighted. Again, it probably just needs to be clarified. Perhaps you could include a graph that shows average familiarity with online platforms by program.

Reply: The reviewer raises a valid point; it was unclear in the original version of the manuscript how this was measured. In Section 2.4 we outlined the survey instrument (the questionnaire guide) but, due to a mistake on our end, the actual guide was never appended to the manuscript. This has been rectified, and you can now see all the questions in Appendix A at the end of the manuscript. The measurement itself is a comparative analysis built on the hypothesis that experience with online platforms, and their content, would correlate with student perception of the platforms and content in teaching. However, as the study shows, no such correlations were found, implying that this factor does not affect the perceptions of the students toward, or against, online learning.

Action: The questionnaire guide has been added as an appendix to the manuscript, and a clarifying statement as to the premise of the analysis has been added in the analysis section; please see Section 2.7 of the manuscript.
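
As an illustration of the premise described in the reply, the hypothesized correlation could be checked roughly as follows. The data and column names here are synthetic placeholders, not the actual survey items.

```python
import pandas as pd
from scipy.stats import spearmanr

# Synthetic stand-in for Likert-scale (1-10) answers; the column names
# are hypothetical and do not come from the actual questionnaire.
responses = pd.DataFrame({
    "platform_familiarity":       [8, 6, 9, 7, 5, 8, 4, 10, 6, 7],
    "perception_online_learning": [7, 5, 6, 8, 6, 7, 5,  8, 6, 6],
})

# A rank correlation suits ordinal Likert data better than Pearson's r.
rho, p = spearmanr(responses["platform_familiarity"],
                   responses["perception_online_learning"])
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```

A near-zero, non-significant rho would match the reply's statement that no such correlations were found.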

R2.4. In the Discussion, lines 517 and 519, you use "previous experience" twice, but again, it isn't clear if this is previous experience with online classes or previous experience with online platforms.

Reply: Thank you for pointing out this possible confusion. 

Action: We have, for example, written “previous experience with online platforms” rather than only “previous experience.” 

 

Here are a handful of other questions/suggestions:

R2.5. Throughout the paper, the authors include many acronyms in parentheses but then never use the acronyms in the paper. I would just leave these out, e.g., HE, ERT, SRSs, etc.

Reply: Thank you for spotting this. 

Action: We have removed all the initialisms. 

 

R2.6. Figure 1 seems entirely unnecessary. Your process is pretty standard and is described in detail in words, so the figure seems superfluous.

Reply: We agree to some extent with the reviewer's comment. However, we believe that providing this graphical overview of the methodology, early in the Methodology section, helps the reader gain a better understanding of the steps we took. It can also serve as a quick reminder for readers who wish to scroll back and refresh their memory of the process steps.

Action: Because of the perceived benefits of the figure, we have chosen not to remove it. If the reviewer still insists that we should remove the figure, we will of course do so.

 

R2.7. Please define "first-cycle" and "second-cycle" for those of us who do not traditionally use these terms. Is this equivalent to Freshman and Sophomore or something different?

Reply: These expressions are commonly used for programs that follow the Bologna model and are also widely used in pedagogical research. We realize that the terminology may not be generally known and have therefore provided a clarifying statement. In short, 1st-cycle students are undergraduate (bachelor's level) students and 2nd-cycle students are graduate (master's level) students.

Action: We have provided definitions of 1st- and 2nd-cycle students in Section 2.4 of the manuscript, including a reference to the paper by M. G. Ash on the terms' use. Additionally, we have gone through the manuscript to ensure that the terms are used consistently throughout.

 

R2.8. Line 224 – a random question mark is included in the text.

Reply: We thank the reviewer for observing this issue; it was a remnant from working with the manuscript, indicating the need for a reference that we had missed.

Action: The question mark has been removed and a reference for Likert-scale questionnaires added.

 

R2.9. Under 2.5. Phase 3: Pilot Survey, please describe who these 226 students were. We need some context for validation.

Reply: We are unsure what additional information the reviewer is requesting with this comment. Section 2.5 describes that the students were taken from the industrial economy and master of business programs, and a motivation is given for this choice. For the reviewer's convenience, we have highlighted the existing text in the manuscript and welcome any further feedback on other information you believe we should add.

Action: No action taken.

 

R2.10. Line 400, "The question are posed as a 10 point Likert-scale and was the last but one of the surveys total of 18 questions." – I have no idea what this sentence means.

Reply: We agree with the reviewer’s comment.

Action: The sentence has been rephrased; please see Section 3 of the manuscript.

 

R2.11. I was a bit confused by Figure 6. What is the 45% and then the 28% by "Phone"? Your figures could all use a bit more description in the figure legends.

Reply: We understand the reviewer's comment and agree that the description of Figure 6 needed clarification. The Figure 6 bar chart has been updated.

Action: We have rephrased the description in the figure's caption.

 

R2.12. Line 436, I don't think I know what "theoretical knowledge transfer" and "more nuanced elements of teaching" mean. Can you provide context?

Reply: Thank you for pointing out this slightly unclear statement. 

Action: We have revised the sentence to make it clearer. It is rather about lecture-based teaching vs. student-active learning, which we have now pointed out (line 509).

 

R2.13. Also, the paper as a whole would benefit from English grammar editing. The Results section, in particular, was riddled with errors.

Reply: Thanks for spotting these grammatical errors. 

Action: We have thoroughly read the paper and updated the English to the best of our ability. 

 

Lastly, a bit of praise:

1. I LOVE part 5 of your survey. What a fantastic idea!
2. The cluster analysis was brilliant!
3. Section 2.8, Threats to validity, is so well written and perfect to include. Well done!

Well done! I look forward to reading a revision of the paper.

Reply: We thank the reviewer for these positive comments. 



Reviewer 3 Report

The study addresses a topic of interest: training in virtual spaces, specifically during the pandemic caused by COVID-19. Although numerous studies have addressed the issue, the study presented may increase knowledge about pedagogical challenges and possible ways to implement training that supports and guides students in their learning process.

Some issues observed to improve the work:

- The font used in the graphs is too small to be easily read.

- The manuscript explains the response options in the different sections of the questionnaire. However, for the third part (11 questions) it is not indicated whether the questions are open or whether response options are provided. Please explain whether students can mark more than one option or only one.

- The validity of the questionnaire is indicated, but the aspect related to reliability is not as clear.

Comments for author File: Comments.pdf

Author Response

REVIEWER 3

The study addresses a topic of interest: training in virtual spaces, specifically during the pandemic caused by COVID-19. Although numerous studies have addressed the issue, the study presented may increase knowledge about pedagogical challenges and possible ways to implement training that supports and guides students in their learning process.

Reply: We thank the reviewer for the excellent comments on our work. For your convenience, we have color-coded all changes pertaining to your comments in the manuscript in blue.

 

Some issues observed to improve the work:

 

R3.1. The font used in the graphs is too small to be easily read.

Reply: We agree with the reviewer that some of the figures could be a bit larger. 

Action: We’ve increased the size of the images slightly and also changed the layout of Figure 9 to improve their readability.

 

R3.2. The manuscript explains the response options in the different sections of the questionnaire. However, for the third part (11 questions) it is not indicated whether the questions are open or whether response options are provided. Please explain whether students can mark more than one option or only one.

Reply: We thank the reviewer for this observation; it was indeed an oversight on our part.

Action: We have updated the description of the third part of the questionnaire, please see Section 2.4. In addition, we have added the questionnaire as an Appendix to the manuscript.

 

R3.3. The validity of the questionnaire is indicated, but the aspect related to reliability is not as clear.

Reply: The reviewer raises a valid point, and it is an aspect that we considered during development of the questionnaire. The plan was to add multiple correlating questions for a consistency analysis, in order to detect intentionally or unintentionally faulty answers. However, due to restrictions on the size of the questionnaire, we could not include such a mechanism. Instead, we mitigated the threat through a sanity check of the results. However, neither this check nor the threat itself was described in the paper.

Action: We have updated the threats to validity section with a new paragraph that explains the threat as well as our practice of mitigation; please see Section 2.8.
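
For context, the consistency analysis mentioned in the reply is commonly operationalized as an internal-consistency coefficient such as Cronbach's alpha. The sketch below uses synthetic data and a hand-rolled helper; it is illustrative only, not a computation from the actual questionnaire.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Synthetic Likert answers: 6 respondents x 3 deliberately correlated items.
scores = np.array([
    [7, 8, 7],
    [5, 5, 6],
    [9, 8, 9],
    [4, 5, 4],
    [8, 7, 8],
    [6, 6, 5],
])
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```

By convention, alpha values above roughly 0.7 are taken to indicate acceptable internal consistency.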


