Peer-Review Record

Effectiveness of a Laboratory Course with Arduino and Smartphones

Educ. Sci. 2022, 12(12), 898; https://doi.org/10.3390/educsci12120898
by Giovanni Organtini and Eugenio Tufino
Submission received: 6 October 2022 / Revised: 18 November 2022 / Accepted: 29 November 2022 / Published: 8 December 2022
(This article belongs to the Special Issue Innovation in Teaching Science and Student Learning Analytics)

Round 1

Reviewer 1 Report

A good article: very interesting, clear, and presenting a relevant pedagogical methodology.

Author Response

Thank you very much. We are happy that you found the manuscript already in a good state.

Reviewer 2 Report

Thanks for letting me review this paper. It presents a report on innovative practice in an introductory university physics course.

While the paper reads reasonably well and presents an interesting case of practice, it fails to relate to the literature on learning analytics (presumably the core of the Special Issue) or to clarify what its contribution to the scholarship of learning and teaching is.

Here are some suggestions/comments:

- Contextualise the course: explain more about what it normally entails and what the key innovations are, especially in relation to existing innovative practice in similar courses abroad and/or in relation to online education, such as MOOCs. The reference to active learning is rather limiting without explaining the type of engagement expected.

- Explain in more detail the use of the survey; bouncing the reader off to the original paper does not help. What type of questions are asked? What are the key constructs measured? How did the authors carry out the translation and validate the instrument? There is a mention of an end-of-course evaluation; I would definitely frame the paper in terms of evaluation of teaching and the student experience in more detail.

- What kind of data is collected from either the experiments or the programming? These would be much more useful in terms of 'learning analytics' relevant to the Special Issue.

- What is the key contribution to the scholarship of learning and teaching?

- There are some references to expertise and the process of building expertise in the course and in comparison with 'experts': what are the key takeaway messages here?

- The conclusions are very thin. What are the implications for practice that can be derived from the evaluation? What can be generalised to other science courses?

 

Please note that the language needs to be checked; there are several colloquialisms and 'Italianisms' in the text.

I would certainly encourage the authors to revise the paper and submit it elsewhere, but I struggle to see the relevance for this Special Issue.

Author Response

Please find our response attached as a PDF file.

Author Response File: Author Response.pdf
