Closing the Gap: Potentials of ESE Distance Teaching
Thomas Cosgrove
Round 1
Reviewer 1 Report
Closing the gap: Potentials of ESE distance teaching
Summary
The study addresses an interesting and important issue: the relative utility of contrasting educational approaches in the study of STEM.
Three main areas are suggested for attention:
1) State explicitly the ontological assumptions underlying the control group/intervention group methodology. State clearly the assumed epistemology informing the study (e.g. social constructivist or other).
2) Provide more detail to allow the reader to understand much more clearly the similarities and differences between the control group(s) and intervention group(s) experiences.
3) Attend to language usage in the interests of clarity
The article seems to assume that the kind of ‘controlled experiment’ methodology such as might be found in the natural sciences is a valid approach in educational research. Many scholars disagree (Hyslop-Margison and Naseem 2007). Donald Schön explicitly challenges the assumption that “if you can't create control groups or manage random assignment of subjects to treatment and control groups, then … you're not doing rigorous research” (Schön 1995). Such approaches, sometimes called ‘technical rational’, assume that, apart from the ‘intervention’, ‘all other things are equal’, as in a physical/chemical system. Borrego has warned of the tendency for engineers and scientists to import the unwarranted assumptions and methods of empirical science into educational research (Borrego 2007). Social situations are constituted by dynamic sets of human relationships engaged in mutually interpretive dialogue with the materials of the situation (Schön 1983, pp.30,31). Dunne notes that judgements about contextual factors are massively underdetermined by any technical rationale (Dunne 1993, p.4). Educational life is messy and dynamic! This technical idea in turn is taken to imply that any differences in the responses between control and experimental groups may be attributed to the ‘intervention’. Such assumptions ‘step over’ ontological issues relating to the nature of human social life and are regarded as problematic by many scholars (Dunne 1993; Dunne 2005). Such scholars hold that teaching is a practice, preferably a reflective one, characterised more by practical artistry and dialogue informed by lived values than by technical rationality (Schön 1983, pp.30,31; Schön 1991). Winter agrees:
…professional work is essentially complex, consisting as it does of subtle interpersonal interactions requiring the continuous exercise of interpretive skill and flexibility (Winter and Burroughs 1989, p.35).
There is a well-established alternative tradition to technical rationality exemplified in the work of Carr, Kemmis and others (Carr and Kemmis 1986; Elliott 1991). In summary, there is ongoing controversy about these issues.
Furthermore, the authors should also make explicit their epistemological assumptions. Do they assume that they can operate as objective observers of a social context without influencing that context? Do they assume that social factors are central to learning? Are they influenced by thinkers such as Dewey, who privileged learning through experimentation?
One way or the other, it is important to clarify assumptions upfront. At the very least they should explicitly state basic assumptions and acknowledge that the validity of such experimental designs in education is controversial and contested. They should explicitly explain the many variations between control and experimental groups, which surely go well beyond the contrast between online and face-to-face teaching and learning processes.
Further down, the authors note that the control group had a one-day on-site experience and the experimental group had a two-week online experience! This suggests that there were major differences in process that go well beyond the medium of instruction. Did the control group have more ‘in-classroom’ activity apart from this one site-visit day? I cannot see how the 288 students were structured in groups/classes/schools. This number suggests multiple classes with multiple teachers in multiple schools. Please clarify who was where, with whom, doing what. Were students located in very different environmental contexts (e.g. big-city urban versus small-town or rural)? Was the gender mix similar in different classes? As the article stands, it seems that the two groups experienced radically different processes and quite different timelines. This must be spelt out more clearly so that the reader may judge the applicability of the research to her own context and make her own critical judgements.
In section 2.1 the research design is summarised in lines 218-245. As many as possible of the features, both the similarities and differences of the experimental and control situations, should be clarified here, noting clearly the study limitations, especially as far as the validity of statistical methods goes (e.g. sample size) since they employ these methods. Their study can only be enhanced from the reader’s point of view by doing this. The learning outcomes may have been the same (were they?) but so much else in educational life is variable. Presumably different teachers were involved in the groups (?) and certainly different students. Some kind of ‘thick description’ of educational process is appropriate since, as Elliott notes, for teachers interpretive or practical deliberation involves reflecting about means and ends together (Elliott 1991, p.138).
While the English is mostly satisfactory, in many places non-standard usage is apparent and the meaning of some phrases is not clear. The following edits are suggested based on assumptions about the authors’ intentions.
e.g. line 37-38: “In bottom-up initiatives, i.e. projects initiated by citizens, a lack of environmental protection knowledge has no such effect [12].”
Suggest (if I have the meaning right): “In bottom-up initiatives, i.e. projects initiated by citizens, a lack of environmental protection knowledge is no barrier to citizens’ engagement.”
Line 43: “as well as by addressing affective aspects.” The meaning of ‘affective aspects’ is unclear. Perhaps a phrase giving an example would clarify the meaning.
Line 50-52: “ there were large differences in the implementation of ESD in the curricula between school types and federal states”.
Suggest (if I have the meaning right): “…in the curricula between school types within one federal state and between schools in different states”. What kind of school ‘types’ are being referred to? Non-German or even non-Bavarian readers will need to understand this.
Line 61-62: “, it is likely that ESE has been able to make little progress in the past two years”.
While the English is not incorrect, in common usage ‘it is likely’ often refers to a future situation. The question arises for the reader: if the past is being considered, why not report the facts as known? The authors seem to be drawing an inference based on reasonable speculation about negative Covid impacts (“setbacks”).
Therefore I suggest: “it is reasonable to assume that ESE etc.”
Line 72: “Curricular requirements such as research character…”
Suggest (if I have the meaning right): “Curricular requirements such as the development of research skills…” or “Curricular requirements such as the development of an active research disposition”.
Line 80-82: “This problem was faced by many out-of-school learning opportunities: even when they offered a digital fallback for processing at home, it was questionable what advantages this should have.”
Suggest: “This problem was encountered by many out-of-school learning curricula: even when they offered a digital fallback for processing at home, it was questionable what advantages would accrue from this facility”.
The discussion on ICT skills (112-142 aprx.) is appropriately critical and interesting.
Line 103: “Whether digital nativity exists, and if so, what influence it has, is still debated”.
Good to see a critical tone on this point!
Line 133-34: “Most of them (studies) distinguish fundamentally between the use of digital methods inside and outside of school.”
Meaning is ambiguous: Implication: School is disconnected from the business of real living? Or do these studies separate online social platforms from learning?
Line: “Digital nativity deals with the extent to which such skills are inherent”.
It is not immediately apparent which skills exactly are encompassed in DNAS. Do they include how to use digital tools to interact and record discussions online when working in a group; how to manage time digitally; how to research well online; how to present online etc. etc.?
I confess I had not seen ‘fascination’ used as a formal criterion for assessing engagement. In normal usage it implies a very strong engagement, indeed an almost hypnotic fixation. Not a problem!
Line 165-168: “ The authors suggest that the more effort the actions in question take (e.g., going somewhere special or using technical devices), the higher fascination levels are if participants agree with this behavior. Thus, a person with deep fascination will agree to more statements from the test than a person with low fascination.”
Suggestion: “The authors suggest that when effortful actions are suggested in the test (e.g., going somewhere special or using technical devices), then if a participant agrees to make many of those efforts, that is evidence of deep fascination whereas a person with low fascination will not agree to as many of the suggested actions.”
Is this test validated? Will students tend to provide a ‘right answer’ rather than an honest answer?
Line 175: “5th graders are an excellent opportunity”
Suggest: “5th graders have an excellent opportunity…”
Line 177: “Since distance learning is not only formal but usually solitary,”
I don’t think so. There are many distance learning curricula that seek to leverage the social dividend for learning online. It is fair to say that it takes more effort on the part of the moderator/teacher to maintain social activity, since many normal social cues and affordances are absent.
Line 184: “without authentic learning environments”.
The word “authentic” has many uses and has normative overtones. I don’t think that a face-to-face situation is necessarily ‘authentic’.
Suggest “to be conducted remotely.”
Line 185-186: “To what extent this lack of authentic learning experiences has created an educational gap that needs to be bridged remains open.”
In light of previous comment I suggest: “To what extent this remote learning has led to a lack of authentic learning experiences and to an educational gap that needs to be bridged remains unclear.”
Lines 186-188: “In this present study, content-wise and methodological preferences of students and their influence on learning progress will be analyzed.”
Suggest: “This study will analyse the influence of students’ preferences in regard to content and learning process on their progress in learning.”
However this aim looks far too broad. I think it needs to be clarified and focussed with another sentence or two. Presumably you will not provide limitless freedom as far as content goes or provide multiple learning experiences for comparison.
Line 188: “This bears implications…”: presumably you mean “this has implications”. However, the meaning of “this” here is unclear. Does “this” refer to the gap that may need to be bridged?
Lines 191-198: [I will transcribe the text including suggested edits. ]
“The DNAS instrument may be suitable for school use due to its conciseness and comprehensibility. If it performs well in this age group, it may then be adopted for general use. The Fascination with Biology (FBio) Scale is used to measure individual interest levels. An initial validation regarding the target group will be applied here, too. Both tools are analyzed regarding their change [meaning unclear: change in what? It says further down that the FBio instrument was applied only once] throughout the delivery of the online ESE learning units. Finally, results on the connection between digital teaching, taught content and ESE learning progress are explored. Accordingly, the following research questions are addressed:”
Lines 202-203: “Does fascination with the topic have a greater influence on knowledge gain in a digital learning module than being comfortable with digital learning methods in general?”
Suggest something like: “Which influences learning progress more: student fascination as measured on the FBio scale or digital nativity as measured on the DNAS scale?”
But see my introductory comments: given the apparently very different experiences of the two student groups, it seems that many factors within the study and possibly many factors outside the study (such as prior familial and educational experiences and geographical context) can influence outcomes.
Page 5: reference is made to “The learning units”, “the intervention”, “the first module”, “the second module”, “The online module”, “the learning module”, “the intervention unit”. Do these phrases overlap, or which group did which modules/learning units? I think this page should lay out as clearly as possible the common features and contrasting features as far as content, process and timeline go. Process should include the teachers’ activities and the students’ activities as well as task deliverables and assessments: “the control group did XYZ”; “the experimental group did ABC”. Clarify under each heading timeline, content, process and supports so the reader has a clear working knowledge of what the students and teachers were doing throughout the study, both when mentored/taught and on their own initiative. Perhaps a table with two long columns side by side? Did the online (experimental) group receive any guidance online? When you say asynchronous, do you mean recorded material to be studied by students with no teacher input? How much (voluntary?) contacting of tutors went on? By which group? Mention is made of “DIY projects” (plural). How many? What did these look like? Who did them? Were they done at the students’ discretion? Did both groups do the same number of projects? How did local context give rise to different project outcomes? How was the connection made with the students’ living environment? What does “their direct environment” (line 242) mean?
What does “the students’ opinions were included” mean? Opinions about what, and how were they identified and used?
What do the acronyms ESS3C and LS2.C etc. mean? Are these modules? How e.g. do they relate to the first and second modules previously mentioned?
Line 264: how were the 24 students chosen for the pilot test of the DNAS from among the total of 288?
Line 282-3: “Because no significant behavioral changes in individual interest levels were to be expected, the FBio questionnaire was applied once”.
This remark is a little surprising, depending on what is meant by “behavioral changes”. Education is about change. Does it mean changes in attitude towards a body of knowledge such as the SDGs? Or if it literally means ‘behaviour’, what kind of behaviour?
Line 285: “A single choice test…etc”. Since there is mention of project work, did this test intend to measure the kinds of know-how (practices) associated with project work, or just knowing-that (factual and propositional knowledge)? e.g. doing self-guided research? Learning how to work in teams (if there was any)? Learning how to ask fruitful questions? Learning how to appreciate the importance of this or that knowledge form? Much positive learning may be unassessed when using traditional exam-type assessments.
I will not comment on the statistical methods employed, as that is not my area of expertise. However, the authors should reference criteria regarding sample type and size appropriate to these methods and note any strengths and limitations. For this reader, at any rate, why some values were ‘good values’ (e.g. line 339) was not clear.
The DNAS section on Multi-tasking seems to me to implicitly support practices which may impact negatively on learning e.g. doing work and chatting with friends at the same time. The authors should express some critical opinion about the DNAS scale. Do they agree that all the dimensions are educationally positive?
Also, since both the DNAS and FBio instruments are laid out as positive statements, surely there is a high tendency for students to answer with a bias towards ‘Yes’, since the statements are leading and may be taken by the students as right answers? Commonly in social research the researcher is counselled to avoid leading questions.
The study conclusions and import are clearly presented in the discussion section.
Borrego, M. (2007) 'Conceptual Difficulties Experienced by Trained Engineers Learning Educational Research Methods', Journal of Engineering Education, 96(2), 91-102.
Carr, W. and Kemmis, S. (1986) Becoming critical: education knowledge and action research, London: Falmer Press.
Dunne, J. (1993) Back to the rough ground: 'phronesis' and 'techne' in modern philosophy and in Aristotle, Notre Dame, London: University of Notre Dame Press.
Dunne, J. (2005) 'An intricate fabric: understanding the rationality of practice', Pedagogy, Culture & Society, 13(3), 367-390, available: http://dx.doi.org/10.1080/14681360500200234.
Elliott, J. (1991) Action research for educational change, Milton Keynes: Oxford University Press.
Hyslop-Margison, E.J. and Naseem, M.A. (2007) Scientism and Education: Empirical Research as Neo-Liberal Ideology, Dordrecht: Springer Science+Business Media B.V.
Schön, D.A. (1983) The reflective practitioner: how professionals think in action, New York: Basic Books.
Schön, D.A. (1991) The reflective practitioner: how professionals think in action, Aldershot: Arena.
Schön, D.A. (1995) 'Knowing-in-action: The new scholarship requires a new epistemology', Change: The Magazine of Higher Learning, 27(6), 27-34.
Winter, R. and Burroughs, S. (1989) Learning from experience: Principles and practice in action-research, London: Falmer.
Author Response
Manuscript ID: sustainability-1758000
Type of manuscript: Article
Title: Closing the gap: Potentials of ESE distance teaching
Authors: Sonja T. Fiedler *, Thomas Heyne, Franz X. Bogner
-------------------------------------------------------------------------------------------
Comments Reviewer #1
Thank you very much for the extensive review and comments. Please find attached the revised version, in which we incorporated your suggestions. The replies to editorial comments are marked in red below. Tracked changes were used in the manuscript document.
- State explicitly the ontological assumptions underlying the control group/intervention group methodology. State clearly the assumed epistemology informing the study (e.g. social constructivist or other).
A paragraph on epistemological orientation has been added to the introduction. We agree that controlled experiments in education must always be critically reflected upon and the results must be viewed in perspective.
- Provide more detail to allow the reader to understand much more clearly the similarities and differences between the control group(s) and intervention group(s) experiences.
Additional information on similarities and differences between control group and experimental group as well as teacher involvement is provided in chapter 2.1. More information is also provided in other publications related to this study (for example https://doi.org/10.1007/s43621-021-00041-y).
- Attend to language usage in the interests of clarity.
For the following lines, all suggestions made by the reviewer regarding language usage or ambiguities have been revised. For more detail, see the revised manuscript.
- Line 37-38
- Line 43
- Line 50-52
- Line 61-62
- Line 72
- Line 80-82
- Line 133-34
- Line 114
- Line 165-168
- Line 175
- Line 177
- Line 184
- Line 185-186
- Lines 186-188
- Line 188
- Lines 191-198
- Lines 202-203
- Line 264
- Line 282-3
- Line 285
- […] As the article stands it seems that the two groups experienced radically different processes and quite different timelines. This must be spelt out more clearly so that the reader may judge the applicability of the research to her own context and make her own critical judgements.
More information about the participants, the German school system and the intervention design was added. As you proposed, a short overview of the learning unit is provided in the form of a table.
- Is this test (fascination) validated? Will students tend to provide a ‘right answer’ rather than provide an honest answer.
The corresponding section has been adapted to indicate more clearly that the fascination scale has already been used successfully in other studies. Due to time limitations within the questionnaire, we had decided not to include a social desirability or lie scale like the RCMAS in this study. We agree that this would have given the opportunity to analyze the answering pattern of the students. In Line 476 it is recommended to use such a scale, though, in follow-up studies.
- What do the acronyms ESS3C and LS2.C etc. mean? Are these modules? How e.g. do they relate to the first and second modules previously mentioned?
Mentioning the learning goals according to NGSS (Next Generation Science Standards) is intended to make the learning unit more tangible, especially for the U.S. education community. The acronyms represent certain NGSS categories.
- […] However, the authors should reference criteria regarding sample type and size appropriate to these methods and note any strengths and limitations.
Some information on adequate sample size was added. Chapter 2.3 additionally provides benchmarks for CFA and SEM analysis, which are needed for data interpretation.
- […] The DNAS section on Multitasking seems to me to implicitly support practices which may impact negatively on learning e.g. doing work and chatting with friends at the same time. The authors should express some critical opinion about the DNAS scale.
The discussion was extended to show a more critical attitude toward the DNAS in its current form. The authors mentioned in several text passages that certain DNAS items need to be revised for use in state-of-the-art education settings.
Author Response File:
Author Response.pdf
Reviewer 2 Report
Dear Authors,
after reviewing the manuscript entitled "Closing the gap: Potentials of ESE distance teaching," I can state the following:
The topic, where you compare traditional with distance teaching of life science and environmental topics, gives a new insight into teaching and learning practices. At first glance, the finding that distance teaching can have a comparable effect on students' learning outcomes as teaching in an out-of-school ESE center was surprising. However, the individualized/student-centered approach with connectedness to the local environment in an online module proved successful. And as you mentioned, one should test these findings on a larger sample of participants.
The only unclear part of the manuscript I find is the research questions part, especially RQ3. Please see the comment in the manuscript.
You have also used two (or part of) questionnaires already used in previous studies, and you were able to connect the results of both with students' achievements.
The limitations note possible improvements to the research for anyone interested in reproducing it.
All in all, your research is of direct value for out-of-school centers to promote their work and educational activities to schools through online platforms and influence students' knowledge and possibly behavior (pro-environmental actions).
Some comments are included in the manuscript.
All the best,
Comments for author File:
Comments.pdf
Author Response
Comments Reviewer #2
Thank you very much for the extensive review and comments within the manuscript. Please find attached the revised version, in which we incorporated your suggestions. The replies to editorial comments are marked in red below. Tracked changes were used in the manuscript document.
- The only unclear part of the manuscript I find is the research questions part, especially RQ3. Please see the comment in the manuscript.
From manuscript: “please make this RQ more clear. Add separate RQ regarding teaching methods comparison as it is highlighted several times that on-line approach with active participation had same effect as out-of-school visit.”
The difference between on-site and online learning progress has been discussed in a separate study (https://doi.org/10.1007/s43621-021-00041-y). Because the results on knowledge gains seemed relevant to us for interpreting the other measurement instruments as well, we find that a brief summary leaves more room and emphasis for the findings regarding DNAS and FBio. We incorporated more information about the control group and the experimental group.
