Article

Evaluating Knowledge and Assessment-Centered Reflective-Based Learning Approaches

Jordi Colomer, Laura Serra, Dolors Cañabate and Teresa Serra

1 Department of Physics, University of Girona, 17003 Girona, Spain
2 Teaching Innovation Networks on Reflective and Cooperative Learning, Institute of Sciences Education, University of Girona, 17003 Girona, Spain
3 Department of Experimental and Health Sciences, University Pompeu Fabra, 08003 Barcelona, Spain
4 Department of Specific Didactics, University of Girona, 17004 Girona, Spain
* Author to whom correspondence should be addressed.
Sustainability 2018, 10(9), 3122; https://doi.org/10.3390/su10093122
Submission received: 7 August 2018 / Revised: 27 August 2018 / Accepted: 30 August 2018 / Published: 1 September 2018
(This article belongs to the Special Issue Pedagogical Innovations for Sustainable Development)

Abstract

This paper addresses the development of knowledge-centered and assessment-centered learning approaches within a reflective learning framework in a first-year physics class in a university faculty. The quality of students’ reflections was scored using a Self-reporting Reflective Learning Appraisal Questionnaire at the end of each learning approach. The results revealed differences between the approaches in how students control their learning through self-knowledge, by connecting experience and knowledge, and through self-reflection and self-regulation. Assessment-centered activities fundamentally help students identify aspects of their attitudes towards their sustainability education, as well as regulate their learning.

1. Introduction

A central characteristic of transformative learning is the process of reflection, which may be defined as the intellectual and affective activities that lead to exploration of experiences in order to develop understanding and appreciation [1,2]. Since Schön’s publication in 1983 [3], many authors have explored reflective practices in greater depth, mainly in initial and continuing training for students and professionals, particularly in the fields of sustainability teaching, health, and social education. On the basis of this rethinking, reflective skills can now be considered essential for professionals, and ways must therefore be found for them to be taught and learnt. It is not only a question of acquiring certain skills, but also of reformulating the relationship between knowledge, practice, and sustainability experience.
Dewey [4] established the pragmatic notion of reflection by distinguishing reflective action from routine action. Reflection has, from his point of view, the purpose of addressing consciousness and thoughtfulness about one’s actions. As noted by Dewey [4], the process of reflection consists of a linear model of successive phases, from an initial interpretation of experiences to defining hypotheses and testing or experimenting with them. Later on, cyclical reflection models for educational settings were suggested [5,6] postulating that reflection encourages learners to take an active role in finding solutions to complex problems. Kolb [5], and Kolb and Kolb [7], posited learning to be the creation of knowledge through the transformation of experience. According to them, learning is a dialectic and cyclical procedure consisting of four processes: concrete experience, reflective observation, abstract conceptualization, and active experimentation. Experience is the basis of learning, but learning cannot take place without reflection, and while reflection is essential to the process, it must be linked to action. Thus, Schön [3] described reflective practice as a dialogue between thinking and doing, via which the learner becomes more skilled. This involves integrating theory and practice, and thought and action [8,9,10]. The learning cycle provides feedback, which is then used as the basis for the new action and the subsequent evaluation of that action’s consequences. Learners ought to go through the cycle several times, so it may best be thought of as a spiral of cycles [11].
In science laboratories, students are mainly involved in the extension mode (active experimentation); in science, technology, engineering, and mathematics (STEM) classroom environments, however, knowledge transformation is based on the early stages of Kolb’s model, where information is grasped through experimental procedures and scientific results, as well as the theory behind them. Beatty and Gerace [12] argued that effective learning environments should be student-centered, knowledge-centered, assessment-centered, and community-centered, and that students should be treated as individuals with varied initial states and unique trajectories to learning. In particular, the knowledge-centered learning environment proposes knowledge as a rich, interconnected structure that must be organized and refined as it is expanded, and should help students become metacognitive by expecting new information to produce a gain in learning. An assessment-centered learning environment should weave formative assessment deeply into the fabric of instruction, providing continual, detailed feedback to guide students’ learning and instructors’ teaching in a learner-driven sustainability education [12,13,14,15,16].
Reflective learning is understood as a process that leads to reflection on all sources of knowledge, including any personal sources and experiences which may contribute to understanding a situation. Although the use of reflective sustainability education-focused activities is a significant contributing factor when optimizing the impact of teaching, the use of reflective activities has not yet been fully explored in STEM education, and their application should be reinforced [8,17]. As indicated by Songer and Ruiz-Primo [18], greater attention should be paid to STEM education practices based on instructional effectiveness and better assessment. Translating knowledge activities into a useful form for teaching is a way of emphasizing more pedagogical knowledge [19]. Abdulwahed and Nagy [20] noted that the impact of laboratory education on students’ learning is yet to be recognized, and that there should be a fundamental rethink of the role the laboratory plays in engineering and science education. These authors report that engineering and science students should experience constructivist pedagogy to gain autonomy in the learning process and, in return, this kind of education would serve as a motivating factor towards pursuing a sustainable engineering or scientific career. Kolb proposed that a way to promote sustainability learning would pass through a spiral of cycles of concrete experience, reflective observation, abstract conceptualization, and active experimentation. The knowledge-grasping dimension, or prehension dimension, is activated when knowledge is grasped through apprehension (the concrete experience dimension) or through comprehension (the abstract conceptualization dimension). The knowledge-construction dimension operates via either intention (the reflective observation dimension) or extension (active experimentation). As in Kolb’s experiential learning theory, our model allows a balanced interplay of three modes, apprehension, comprehension, and intention, the latter being fostered by both reflective observation and guided written reflective narratives. A hybrid combination of these elementary modes in the learning process produces a higher level of learning [20].
In keeping with McKenna et al. [17], knowledge-centered and assessment-centered approaches promote deeper and more meaningful student learning. Indeed, efficient learning appears when there is a concrete situation, i.e., situated learning emphasizes the idea that much of what is learned is specific to the situation in which it is learned [21,22]. Furthermore, as stated by Crouch and Mazur [23], students develop complex reasoning skills when they are more effectively engaged with the material they are studying; consequently, it is important to explore students’ reflection processes when confronted with dynamic methodologies in the science teaching environment.
The initial objective of this study was to investigate three approaches to sustainable physics teaching, namely, the knowledge-centered approach (KA), the assessment formative approach (AFA), and the structured formative approach (SFA). This study attempts to explore students’ reflection processes when subjected to dynamic methodologies in a first-year physics classroom. Little research has investigated the resultant faculty approaches to teaching in reflection contexts focusing on educational innovation and whether or not that innovation is sustainable in terms of application to a large group of students. For example, Peer Instruction [23], as discussed by Fraser et al. [24], could be seen as an attempt to introduce some (perhaps unrecognized) reflection or metacognition through peer dialogue about conceptual questions posed in large classes. Although research-based instructional strategies have been implemented in physics education [25], little research focuses on developing self-reflection skills and appropriate views about knowledge and learning related to improvements in conceptual understanding [26].
The study’s second objective was to obtain students’ views on the benefits, obstacles, and limitations of incorporating reflective learning methodologies into the class itself, and the design of the activities. This information was essential when evaluating the subsequent experiences. Specifically, we aimed to understand students’ views on reflective learning methodologies in relation to self-knowledge, the relationship between experience and knowledge, and self-reflection and self-regulation. Our study quantifies the students’ perceptions of the influence the three teaching approaches used in the physics class had. We consider individual aspects, such as knowledge about oneself, connecting the experience with prior knowledge, and learning self-reflection and self-regulation [27]. Quantification was based on the information obtained from a Self-reporting Reflective Learning Appraisal Questionnaire (RLQuest hereafter), which was completed by the students at the conclusion of the proposed approaches.

2. Method

2.1. Context

The experiment was carried out with a regular group of students in a first-year university physics classroom. The Spanish curriculum for science students is based on four years of study. This experiment was carried out during a 150 h module in the Bachelor of Environmental Sciences degree at the Faculty of Sciences of the University of Girona, Spain. The objective was to develop basic concepts of physics content.

2.2. Participants

The experiment group was asked to follow a sequential reflective methodology: initially viewing physics experiments, then taking part in open-class discussions, next writing a report on the experience, and finally reflecting on the experience. There were sixty-nine participants in total, with ages ranging from 18 to 23 years old, although the great majority (over 90%) were between 18 and 19 years old. In terms of gender distribution, the sample had a higher percentage of female (56.9%) than male students (43.1%).

2.3. Sequential Methodologies and Conceptual Framework

The student-centered activities consisted of three approaches that were sequenced in time, in an attempt to activate apprehension, intention, and comprehension [20]. In the class environment, the activity was organized around a set of physics experiments, which defined the taxonomy of concepts and topics to be covered. The basis of the approaches was either experiments taking place in the classroom or experiments watched on video. Each approach consisted of four activities (whether video cases or in-class experiments); therefore, the full proposal comprised 12 activities. Given that the effectiveness of reflection might strongly depend on the nature of the demonstration, each set of four activities was divided into two well-established prior-knowledge activity types: one to consolidate recently acquired knowledge, and the other to introduce new knowledge.
In the first set of activities, a knowledge-centered approach (KA) was established, in which conceptual understanding and organization of the knowledge was encouraged. That is, apprehension and intention via concrete experience and reflection were activated. The students were asked to write a report describing each of the physics experiments and to answer the RLQuest in relation to the reflective process (Table 1). The RLQuest was designed, implemented, and tested from the results of a comparative analysis in a project dealing with the introduction of reflective methodologies in four Bachelor’s degrees offered by the Environmental Sciences, Social Education, Psychology, and Nursing faculties at the University of Girona [27].
As highlighted by Hake [28], students involved in active learning outperform those who learn passively in a traditional lecture setting. When students are provided with opportunities to examine and reflect upon their practices, they are more likely to see themselves as active change agents [29]. In alignment with the conceptual change literature, the KA seeks to help students grow contextually robust, transferable conceptual ecologies that are thoroughly reconciled with their experiences, perceptions, and prior understandings. In addition, students are engaged in extensive dialogical discourse about scientific ideas and their applications, set within the context of rich and challenging questions and problems [12].
In the second set of activities, students were asked to describe four physics activities, but this time with an assessment-centered approach, as the teacher also suggested a series of formative questions (listed in Table 2) to promote positive feedback (AFA). Again, each set of four activities was divided into two well-established prior-knowledge activity types: one to consolidate recently acquired knowledge and the other to introduce new knowledge.
As for the KA approach, students were asked to answer the RLQuest at the end of the process (Table 1). The ten questions in Table 2 were chosen to align with the theoretical framework for measuring levels of reflective thinking. They follow the model of Kember et al. [30], which categorizes the level of reflection in written work into four categories, namely, habitual action/non-reflection, understanding, reflection, and critical reflection.
In the third set of activities, a contextualized approach was used in which the teacher contextualized knowledge instruction (Table 3) and reflective learning (Table 2) by referring to authentic practices encompassing the first and second approaches [31]. That is, the students wrote a report on each of the four activities based on the ten reflective questions from Table 2 and the six questions in Table 3. As in the KA and AFA approaches, the students were asked to answer the RLQuest at the end of the process (Table 1). The questions in Table 3 were designed to guide the students through the process of describing a scientific experiment. They are aligned with the early stages of Kolb’s model, where learning processes are particularly enhanced through abstract conceptualization.
At the core of the AFA and SFA methodologies lie the principles of technology-enhanced formative assessment, in which the intention is to motivate and focus student learning through question-driven instruction, in order to set up formative learning situations and to catalyze learning [12,32]. Questions occur throughout the instructional sequence of the students’ observation of the exploratory physics experiments. This direct instruction, given by the teacher, concurs or conflicts with the student’s interpretation of the task and the learning path. As pointed out by Nicol and Macfarlane-Dick [33], to produce an effect on internal processes or external outcomes, the student must actively engage with external inputs. Furthermore, Jacoby et al. [34] affirm that the framework should be implemented in a dynamic process in which monitoring is continuous. Both the AFA and SFA methodologies include feedback responses that have to be continually interpreted, constructed, and internalized by the student if they are to have a significant influence on subsequent learning. Nicol and Macfarlane-Dick [33] found that formative assessment and feedback produced significant benefits in terms of self-regulation of motivation and behavior, as well as cognition. These findings are consistent with Kolb’s argument that abstract conceptualization is fueled by both apprehension and intention, requiring the information to be grasped or depicted in a reflective and formative process that can lead to constructivist learning.

2.4. Instruments and Measures

2.4.1. The RLQuest

The students compiled a reflective portfolio that consisted of a collection of twelve texts, each one based on a different physics experiment. The portfolio provided quantitative information for the two primary axes of Kolb’s experiential learning cycle (the abstract–concrete dimension and the active–reflective dimension), where the first is how we perceive new information or experiences, and the second is how we process or transform what we perceive [35]. The portfolio also included the assessment carried out through the Self-reporting Reflective Learning Appraisal Questionnaire (Table 1), which was completed at the end of the KA, AFA, and SFA methodologies.
The student version of the Self-reporting Reflective Learning Appraisal Questionnaire was designed using four blocks of information, as shown in Table 1 [27]. First, the students were asked about their self-knowledge. Second, the questionnaire explored students’ perceptions as to what extent reflective learning (RL) had helped them to connect their experiences with prior knowledge. In the third block, information was obtained on the perceived effect of RL with regard to self-reflection on the learning process, and the fourth block included data on the effect of RL on self-regulation of learning. A five-point Likert scale (from 1 = strongly disagree to 5 = strongly agree) was designed to evaluate the statements related to these areas (see Table 1). The use of questionnaires and rubrics has been found to optimize the quantification of learning strategies, especially in multimethod research [36], metacognitive aspects [37], and methodological skills [38,39]. The validity of learners’ reports depends on the fact that mental episodes occurring while performing the tasks persist as objects of focal attention in short-term memory. According to Richardson [40], and Watts and Lawson [41], the use of questionnaires to complement tasks may be considered to give an accurate reflection of cognitive processing. However, their use in learner-centered STEM classroom environments to obtain information on students’ perception of learning approaches is still far from being fully validated.
In order to improve the reliability of the surveys, as well as clarity and instrument design, the questionnaire was refined following feedback from six peer assessors. To ensure scale reliability, a reliability analysis was conducted following the logic that good development procedures result in a reasonably reliable survey instrument [42,43]. For the first block (Table 1), in which students, using their own knowledge, responded to a daily situation, Cronbach’s coefficient alpha [44,45] was 0.84. For the second block (Table 1), which regarded primarily their levels of connecting the experience with prior knowledge, Cronbach’s coefficient alpha was 0.88. For the eight items that constituted self-reflection on the learning (Block 3, Table 1) Cronbach’s coefficient alpha was 0.78, while for the four items constituting self-regulation, the fourth block (Table 1), Cronbach’s coefficient alpha was 0.87. As such, all factors yielded sufficiently high reliability for research purposes.
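As an illustration of this reliability analysis, the following Python sketch computes Cronbach’s coefficient alpha per RLQuest block from a students × items matrix. The block-to-question mapping follows Table 1, but the responses are randomly generated placeholders (they will not reproduce the reported alphas), and the code is our reconstruction, not the instrument used in the study.

    import numpy as np

    def cronbach_alpha(item_scores):
        """item_scores: 2-D array, rows = students, columns = items (Likert 1-5)."""
        k = item_scores.shape[1]                         # number of items in the block
        item_vars = item_scores.var(axis=0, ddof=1)      # variance of each item
        total_var = item_scores.sum(axis=1).var(ddof=1)  # variance of block totals
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # RLQuest blocks (Table 1): 0-based column indices per block.
    blocks = {
        "1. Knowledge about oneself":     [0, 1],              # Q1-Q2
        "2. Connecting experience":       [2, 3, 4, 5],        # Q3-Q6
        "3. Self-reflection on learning": list(range(6, 14)),  # Q7-Q14
        "4. Self-regulating learning":    [14, 15, 16, 17],    # Q15-Q18
    }

    rng = np.random.default_rng(0)
    responses = rng.integers(1, 6, size=(69, 18))  # 69 students x 18 Likert items

    for name, cols in blocks.items():
        print(f"{name}: alpha = {cronbach_alpha(responses[:, cols]):.2f}")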

2.4.2. Assessment of the Students’ Reflections

To measure the level of reflection in the written portfolio (consisting of twelve activities, i.e., four for each methodological approach), the four scales from Kember et al. [46] were used, in which the level of reflection in writing falls into four categories: habitual action/non-reflection, understanding, reflection, and critical reflection [30,46]. Using such a model allows for consistency between the four-category scheme designed to determine the levels of reflection when describing the activities and the answers given to the RLQuest in relation to the reflective process.
In habitual action, the physics experiment was written up following a linear protocol without the student understanding the principles behind the experiment. Non-reflective writing includes partially or wholly plagiarized material, without any sense of meaning or real understanding of the underlying physics constructs. The second category, understanding, while not implying reflection, does imply an understanding of the concepts of physics, as well as the underlying theory, and is manifested in a reliance on what is described in textbooks or lecture notes. There is no consideration as to how the concepts relate to personal learning [46]. The third category, reflection, is assumed when the student makes contributions to improve the learning process, i.e., besides understanding, the student becomes involved and suggests learning improvements for the development of the activity [31]. In addition, the physics experiments are related to theory through a process of reflection in which the experiments are considered, and successfully discussed, in relation to what has been taught. There are personal insights that go beyond book theory [46]. The fourth category, critical reflection, implies undergoing a transformation of perspective. It implies a critical review of presuppositions (from conscious and unconscious prior learning) and their consequences. The student describes the physics experiments with their own arguments, uses significant motivating examples, formulates direct questions from the experiments, and creates original models to explain the experiments.
The first step in analyzing the work of each student was to identify complete units of information, understood as a simple idea/concept, or thought about as a particular learning process. Among the sixty-nine portfolios describing the twelve activities, in the knowledge-centered approach (KA) there were 608 units, in the formative assessment approach (AFA) there were 1834 units, and in the structured formative approach (SFA) there were 2557 units. Following the protocol described in Chamoso and Cáceres [31], the corresponding assessment was undertaken by two members of the research team, and they then agreed on the information units to code. The work was subsequently revised by the rest of the authors, who were in agreement on the approach analysis (96% for KA, 87% for AFA, and 90% for SFA).
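A minimal sketch of this coding-agreement step, assuming the simple percent-agreement measure implied above: two coders independently label each information unit with one of the four categories, and agreement is the share of matching labels. All unit labels here are hypothetical.

    CATEGORIES = ("non-reflection", "understanding", "reflection", "critical reflection")

    def percent_agreement(coder_a, coder_b):
        """Percentage of information units assigned the same category by both coders."""
        assert len(coder_a) == len(coder_b), "both coders must rate every unit"
        matches = sum(a == b for a, b in zip(coder_a, coder_b))
        return 100.0 * matches / len(coder_a)

    # Hypothetical codes for a handful of units:
    a = ["understanding", "reflection", "reflection", "critical reflection"]
    b = ["understanding", "reflection", "understanding", "critical reflection"]
    print(f"agreement = {percent_agreement(a, b):.0f}%")  # 75%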
The contents of the experiment descriptions each student kept in their portfolio were analyzed according to the categories being considered. Normality and homogeneity of variance were considered to be maintained, and descriptive statistics together with a repeated-measures analysis of variance (ANOVA) were used.
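The repeated-measures analysis could, for instance, be run with statsmodels’ AnovaRM, as in the sketch below. The long-format table (one row per student and approach) mirrors the design described above; the outcome variable and its values are illustrative placeholders, not the study’s data.

    import numpy as np
    import pandas as pd
    from statsmodels.stats.anova import AnovaRM

    rng = np.random.default_rng(1)
    approaches = ["KA", "AFA", "SFA"]
    # Hypothetical outcome: percentage of a student's units coded as "reflection".
    rows = [{"student": s, "approach": a,
             "score": rng.normal({"KA": 20, "AFA": 45, "SFA": 42}[a], 8)}
            for s in range(69) for a in approaches]
    df = pd.DataFrame(rows)

    # One within-subject factor (approach) with three levels, 69 subjects.
    result = AnovaRM(df, depvar="score", subject="student",
                     within=["approach"]).fit()
    print(result)  # F-statistic and p-value for the approach effect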

3. Results

According to the research objectives described in the introduction, the global objective of this study was to determine students’ appraisal of reflective learning (in regard to their reflective learning processes) on class-gained knowledge and assessment-centered activities. Quantified information from the RLQuest questionnaire was obtained, along with information on the foremost difficulties encountered by the students in integrating the reflective learning processes into the three learning methodologies (KA, AFA, and SFA). First of all, in order to compare students’ results for each methodology, equal (or at least approximately equal) population variances were required, so that the null hypothesis of the Levene test was not rejected. Hereafter, only statistically significant results will be discussed. The Levene test, based on the median, yielded a p-value greater than 0.05 (0.950) in the case of question Q2 (Table 1). This implies that the hypothesis H0: σ1 = σ2 is not rejected and, consequently, Fisher’s standard test could be used instead of the robust Brown and Forsythe test. On the other hand, a p-value of less than 0.05 was found for question Q1, so, in this case, the robust Brown and Forsythe test was applied. By applying both tests, the only questions with significant differences on the five-point Likert scale were questions Q1 and Q2. The statistical significance, assuming a 99% confidence level, is 0.000 < 0.01 for question Q1, and 0.008 < 0.01 for question Q2. Based on these results, we can infer that when analyzing both their own behavior and their emotions in terms of reflective practice, students are selective in the approach they use. This implies that the students are aware of which approach is best in order to gain knowledge about themselves.
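This testing sequence can be sketched with scipy: the median-centered Levene test (scipy’s center="median") checks H0: σ1 = σ2, and when it is not rejected, Fisher’s standard F-test is applied. The ratings below are hypothetical placeholders, and the robust Brown–Forsythe means test is only indicated, as scipy does not provide it directly.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    q2_ka  = rng.integers(2, 6, size=69)  # hypothetical Q2 ratings after KA
    q2_sfa = rng.integers(3, 6, size=69)  # hypothetical Q2 ratings after SFA

    # Median-based Levene test for H0: sigma_1 = sigma_2
    stat, p_levene = stats.levene(q2_ka, q2_sfa, center="median")
    print(f"Levene (median): p = {p_levene:.3f}")

    if p_levene > 0.05:
        # Equal variances plausible -> Fisher's standard F-test
        f_stat, p_value = stats.f_oneway(q2_ka, q2_sfa)
        print(f"one-way F-test: p = {p_value:.3f}")
    else:
        # Unequal variances -> a robust test (the paper uses Brown-Forsythe);
        # scipy has no built-in Brown-Forsythe ANOVA, so it is not shown here.
        print("apply a robust test such as Brown-Forsythe")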
Figure 1 shows the Reflective Learning Questionnaire scores for each of the eighteen questions composing the appraisal (Table 1) for the KA, AFA, and SFA methodologies. When the KA methodology was applied, lower scores were obtained compared to the scores given after implementing the AFA and SFA methodologies. The scores obtained when implementing the first methodology were lower than those obtained from the second, and these, in turn, were lower than those obtained from the third. The higher scores for the assessment-centered approaches may imply recognition of more frequent opportunities for formative feedback. Significant differences between the scores for the three methodologies were found for questions Q1, Q2, Q11, Q12, and Q17 (Figure 1). Given that the sample was not normally distributed, a non-parametric test was used; specifically, the Mann–Whitney U test for independent samples. Indeed, assuming a 5% risk, there were significant differences between the first and the third methodology in questions Q1, Q12, and Q17 and, assuming a 10% risk, there was significant variability in the scores of questions Q2 and Q11. In fact, questions Q11 and Q12 show that the SFA methodology helped students to identify the positive and negative aspects in the process of reflection. The SFA methodology was also graded highly for question Q17, indicating that the students recognize both formative feedback and structured reflection as aids to better regulating their learning.
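A sketch of the pairwise non-parametric comparison described above, using scipy’s Mann–Whitney U test on hypothetical Q17 ratings for two of the methodologies (placeholder values, not the study’s data):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    q17_ka  = rng.integers(2, 6, size=69)  # hypothetical Q17 ratings after KA
    q17_sfa = rng.integers(3, 6, size=69)  # hypothetical Q17 ratings after SFA

    u_stat, p_value = stats.mannwhitneyu(q17_ka, q17_sfa, alternative="two-sided")
    print(f"U = {u_stat:.0f}, p = {p_value:.4f}")
    print("significant at 5% risk:", p_value < 0.05)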
There were some questions, such as questions Q8, Q9, Q10, and Q14, which presented no variability between the three methodologies. These questions provide information on the implementation of the methodology itself: the combination of describing the class physics experiments and reflecting on the experience did not actually help students to produce any kind of self-reflection on the learning process in terms of improving communication skills or identifying positive and negative aspects of their knowledge and skills, nor did it make understanding what they learned and how they learned it meaningful for them.
Figure 2 shows the mean rankings of the four blocks of questions, and takes into account the three approaches. In all the blocks, scoring was higher for the SFA methodology and lower for the KA methodology. Major differences were attributed to scores from Block 1, especially for the KA and AFA methodologies, which refer to “knowledge about oneself”, and which include questions Q1 and Q2. The mean block scores were approximately equal for Blocks 2, 3, and 4 (Table 1).
Figure 3 shows scores from questions Q1 and Q2, which belong to Block 1. As in the previous test, these two questions presented different results depending on the approach applied. In particular, the median for each answer (represented by a thick dark horizontal line) increased for both the AFA and SFA approaches. Specifically, Figure 3a (left) shows a large deviation in the valuations of the AFA method, while the median increased significantly from the KA to the SFA method. In fact, 50% of the answers from the KA method are between 3 and 4 and, conversely, 50% of the answers from the SFA method are between 4 and 5. Figure 3b (right) shows less variability because of the larger deviations within the three methods, but even so, the KA methodology presents 50% of its valuations between 2 and 3, while 50% of the other two methodologies were between 3 and 4.
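For illustration, box plots of this kind could be generated with matplotlib as sketched below; the synthetic ratings stand in for the questionnaire data, and medianprops draws the thick dark median line described above.

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(4)
    # Hypothetical Q1 ratings (five-point Likert) for each methodology
    q1_scores = {"KA":  rng.integers(2, 6, size=69),
                 "AFA": rng.integers(2, 6, size=69),
                 "SFA": rng.integers(3, 6, size=69)}

    fig, ax = plt.subplots()
    ax.boxplot(list(q1_scores.values()), labels=list(q1_scores.keys()),
               medianprops={"color": "black", "linewidth": 2})
    ax.set_ylabel("Q1 rating (five-point Likert scale)")
    ax.set_title("Q1 scores by methodology (synthetic data)")
    plt.show()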
Figure 4 shows two of the four questions in the second block, in particular, questions Q3 and Q6. No differences were found between the three methodologies. In Figure 4a (left) and Figure 4b (right), the scores for questions Q3 and Q6, respectively, produced no differences in the median and very small differences in the deviation and the quartiles. This means that not all of the approaches helped students to connect knowledge with their own experiences, emotions, and attitudes (question Q3), or to provide reasoning for decisions taken when confronted with a certain situation (question Q6). Figure 5 shows five of the eight questions of the third block, in particular, questions Q7 (Figure 5a), Q10 (Figure 5b), Q11 (Figure 5c), Q13 (Figure 5d), and Q14 (Figure 5e). A similar pattern can be found in all of the figures, in which the AFA and SFA methodologies show the same distribution. For example, in Figure 5a,c–e, the AFA and SFA methodologies presented the same scores, while in Figure 5b, the scores for the KA and AFA methodologies were similar. In most of them, the median was equal for the three approaches, while the deviation varied with the method being used, generally being higher for the KA method. That is, both the AFA and SFA methodologies, which included a feedback process, uniformly helped the students in the process of self-reflection on the learning process.
Figure 6 shows two of the four questions from the fourth block, in particular, questions Q15 and Q18. In Figure 6a (left), the mean SFA score was slightly larger than the mean score for KA and, although the AFA method presented a somewhat different representation, the three box plots did not provide much relevant information. However, in Figure 6b (right), while the median is the same regardless of the method, the quartiles were in fact different. For the KA and the AFA methodologies, 50% of the rankings were between 3 and 4, and for the SFA method, they were between 4 and 5. This means that the SFA method helped students to evaluate their planning of learning, the outcome, and what they should do to improve it.
With respect to the outcomes of each student’s work in describing the activities for each of the approaches (KA, AFA, and SFA) under the previously established categorizations, the majority of the students’ responses were scored in the reflection and critical reflection categories for the AFA and SFA approaches (Table 4). In the KA approach, the students’ responses were scored highly in the understanding category. Repeated-measures ANOVA showed that there were significant differences among the outcomes obtained in the four categories (non-reflection, understanding, reflection, and critical reflection) for all three approaches (Table 4). The analysis also showed that there were no significant differences in the outcomes for the different activity types, i.e., prior knowledge, consolidating recently acquired knowledge, and introducing new knowledge activities.

4. Discussion and Conclusions

Step-by-step instructions, corresponding to student-centered sustainability activities, guided first-year students in an effort to improve their understanding of the fundamental concepts of physics. In addition to observing the experiments in class, the activity included student reflection with the objective of quantitatively producing knowledge of students’ reflective learning. The students’ reflection was based on three methodologies: the knowledge-centered approach, the assessment with feedback approach, and the structured formative approach. As described by Healey and Jenkins [11], it was important to systematically take the learner around each stage of Kolb’s cycle [5], and to go over each step several times. The three approaches focus on the apprehension, intention, and comprehension dimensions proposed by Kolb’s experiential learning theory. The sequential application of the three approaches intensifies, first, the connection between apprehension and comprehension, which are considered independent modes of grasping knowledge. The addition of the intention mode associated with reflective learning practice is demonstrated to produce higher levels of learning. Abdulwahed and Nagy [20] proposed that knowledge transformation can also be brought about by activating the extension mode via active experimentation during laboratory engineering sessions. These authors suggest that poor learning in laboratories is due to insufficient activation of the apprehension dimension; therefore, one way to facilitate learning is to better activate the first modes proposed by Kolb, especially intention (reflective observation), before proposing practices to develop active experimentation. Indeed, Fraser et al. [24] reported that in physics education, adopting research-based instructional strategies in the classroom and focusing on the apprehension and comprehension modes results in significantly more success than the traditional approach in improving students’ conceptual understanding, engagement, and retention. Henderson and Dancy [25], in an extensive survey of physics faculties across the United States, indicate that the approaches developed by physics education reformers, mostly those activating classroom apprehension, have made an impact on the knowledge and practice of many of the faculties, although there is still significant room for improvement.
The findings in this research suggest that higher levels of comprehensive physics learning can be achieved when students receive formative feedback in a system of guided instructions. Guided questions to assess student reflection were also reported by Black and Plowright [47] and Koole et al. [48], with the result that discriminative ability and reliability were attained when students practiced reflection in summative training situations. This outcome suggests that students’ attitudes, as well as the regulation of students’ learning, were a result of the methodology in combination with the process of reflection [49]. As noted by Hargreaves [50], Koole et al. [48], Konak et al. [51], and Heemsoth and Heinze [14], activities whose design is based on an aligned framework are more likely to increase interest and competency.
Indeed, our results show that student–teacher interaction is an important factor in determining student learning experiences. Students who participated in the reflection process rated the questions related to self-regulating learning highly. The structured formative approach produced maximum reflection in terms of planning, regulating, and evaluating students’ learning. Students demonstrated a greater degree of reflection when the teacher promoted the structured formative approach (including guided instructions) prior to the reflection process, along with the teacher’s assessments after the reflection process. This combination was rated higher than the knowledge-centered approach (i.e., without instructions or teacher feedback) or the formative assessment approach, which did include feedback from the teacher. Koole et al. [48] indicate that the process of reflection may be a strictly individual process or a thinking process that needs to be complemented with external feedback. These authors argued that feedback on reflection fueled recognition of reflective thoughts and increased individual frames of reference. In addition, it was argued that feedback may contribute to reinforcing a different type of knowledge, one which allows teachers to make connections between the contents and the ideas the students express [19,27,52].
The three approaches, that is, the knowledge-centered approach, the assessment formative approach, and the structured formative approach, were implemented within the frame of twelve activities on the learning of physics contents. Within this model, each student underwent twelve cycles of abstract conceptualization and reflective observation. Students reached higher levels of reflection during the last four cycles of the structured formative approach, in which both reflection and critical reflection were promoted. We can conclude that the increase in critical reflection might be attributed to the guided structure of the structured formative approach, but we also speculate that it might be attributed to the students’ continuous spirals of reflection, which induced higher levels of self-knowledge and self-regulation. The cyclical process applied in this research might suggest that reflection is a cumulative process in which experience is the foundation of learning, and that Kolb’s dimensions [5] may in fact enhance classroom activities. The application of Kolb’s experiential learning cycle has also been reported in chemical engineering [53] and industrial engineering classes [54], as well as engineering laboratories [20]. These studies claim that knowledge retention from engineering laboratory classes based on balanced learning experiences, including the stages of Kolb’s experiential learning cycle, led to deeper learning and longer retention of information. As highlighted by Vygotsky and Cole [55], the theoretical basis of both collaborative and reflective learning is constructivism, which states that learning is an active process of constructing knowledge, rather than acquiring it [47,51,56]. In this respect, reflective learning represents a process in which the learner’s paradigm is promoted, i.e., a process in which values, attitudes, and beliefs contribute to effective and grounded sustainability learning.

Author Contributions

Conceptualization, J.C.; Formal analysis, L.S-S.; Investigation, J.C., D.C. and T.S.; Methodology, T.S.; Supervision, J.C.; Validation, L.S-S.; Visualization, T.S.; Writing—original draft, J.C.; Writing—review & editing, J.C. and T.S.

Funding

This research was funded by the University of Girona funding MPCUdG2016 and by the Institute of Sciences Education of the University of Girona (ICE-UdG).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Boud, D.; Keogh, R.; Walter, D. Reflection: Turning Experience into Learning; Routledge & Kegan Paul: London, UK, 1985. [Google Scholar]
  2. Tomkins, A. “It was a great day when…”: An exploratory case study of reflective learning through storytelling. J. Hosp. Leis. Sports Tour. Educ. 2009, 8, 123–131. [Google Scholar] [CrossRef]
  3. Schön, D. The Reflective Practitioner: How Professionals Think in Action; Basic Books: New York, NY, USA, 1983. [Google Scholar]
  4. Dewey, J. How We Think; Prometheus Books: Buffalo, NY, USA, 1933. [Google Scholar]
  5. Kolb, D.A. Experiential Learning: Experience as the Source of Learning and Development; Prentice Hall: Englewood Cliffs, NJ, USA, 1984. [Google Scholar]
  6. Korthagen, F.A.J. Reflective teaching and preservice teacher education in the Netherlands. J. Teach. Educ. 1985, 36, 11–15. [Google Scholar] [CrossRef]
  7. Kolb, A.Y.; Kolb, D.A. Learning Styles and Learning Spaces: Enhancing Experiential Learning in Higher Education. Acad. Manag. Learn. Educ. 2005, 4, 193–212. [Google Scholar] [CrossRef]
  8. Felder, R.M. Learning and teaching styles in engineering education. Eng. Educ. 1988, 78, 674–681. [Google Scholar]
  9. Osterman, K.F.; Kottkamp, R.B. Reflective Practice for Educators: Improving Schooling through Professional Development; Corwin Press: Newbury Park, CA, USA, 1993. [Google Scholar]
  10. Scott, A.G. Enhancing reflection skills through learning portfolios: An empirical test. J. Manag. Educ. 2010, 34, 430–457. [Google Scholar] [CrossRef]
  11. Healey, M.; Jenkins, A. Kolb’s experiential learning theory and its application in geography in higher education. J. Geogr. 2000, 99, 185–195. [Google Scholar] [CrossRef]
  12. Beatty, I.D.; Gerace, W.J. Technology-enhanced formative assessment: A research-based pedagogy for teaching science with classroom response technology. J. Sci. Educ. Technol. 2009, 18, 146–162. [Google Scholar] [CrossRef]
  13. Cowie, B.; Bell, B. A model of formative assessment in science education. Assess. Educ. Princ. Policy Pract. 1999, 6, 101–116. [Google Scholar] [CrossRef]
  14. Heemsoth, T.; Heinze, A. Secondary school students learning from reflections on the rationale behind self-made errors: A field experiment. J. Exp. Educ. 2016, 84, 98–118. [Google Scholar] [CrossRef]
  15. Cañabate, D.; Martínez, G.; Rodríguez, D.; Colomer, J. Analysing emotions and social skills in physical education. Sustainability 2018, 10, 1585. [Google Scholar] [CrossRef]
  16. Herranen, J.; Vesterinen, V.-M.; Aksela, M. From learner-centered to learner-driven sustainability education. Sustainability 2018, 10, 2190. [Google Scholar] [CrossRef]
  17. McKenna, A.F.; Yalvac, B.; Light, G.J. The role of collaborative reflection on shaping engineering faculty teaching approaches. J. Eng. Educ. 2009, 98, 17–26. [Google Scholar] [CrossRef]
  18. Songer, N.B.; Ruiz-Primo, M.A. Assessment and science education: Our essential new priority? J. Res. Sci. Teach. 2012, 49, 683–690. [Google Scholar] [CrossRef] [Green Version]
  19. Alonzo, A.C.; Kobarg, M.; Seidel, T. Pedagogical content knowledge as reflected in teacher-student interactions: Analysis of two video cases. J. Res. Sci. Teach. 2012, 49, 1211–1239. [Google Scholar] [CrossRef]
  20. Abdulwahed, M.; Nagy, Z.K. Applying Kolb’s experiential learning cycle for laboratory education. J. Eng. Educ. 2009, 98, 283–294. [Google Scholar] [CrossRef]
  21. Anderson, J.R.; Reder, L.M.; Simon, H.A. Situated learning and education. Educ. Res. 1996, 25, 5–11. [Google Scholar] [CrossRef]
  22. Johri, A.; Olds, B.M. Situated engineering learning: Bridging engineering education research and the learning sciences. J. Eng. Educ. 2011, 100, 151–185. [Google Scholar] [CrossRef]
  23. Crouch, C.H.; Mazur, E. Peer instruction: Ten years of experience and results. Am. J. Phys. 2001, 69, 970–977. [Google Scholar] [CrossRef]
  24. Fraser, J.M.; Timan, A.L.; Miller, K.; Dowd, J.E.; Tucker, L.; Mazur, E. Teaching and physics education research: Bridging the gap. Rep. Prog. Phys. 2014, 77, 032401. [Google Scholar] [CrossRef] [PubMed]
  25. Henderson, C.; Dancy, M. Impact of physics education research on the teaching of introductory quantitative physics in the United States. Phys. Rev. Spec. Top. Phys. Educ. Res. 2009, 5, 020107. [Google Scholar] [CrossRef]
  26. May, D.B.; Etkina, E. College physics students’ epistemological self-reflection and its relationship to conceptual learning. Am. J. Phys. 2002, 70, 1249–1258. [Google Scholar] [CrossRef]
  27. Fullana, J.; Pallisera, M.; Colomer, J.; Fernández, R.; Pérez-Burriel, M. Reflective learning in higher education: A qualitative study on students’ perceptions. Stud. High. Educ. 2016, 41, 1008–1022. [Google Scholar] [CrossRef]
  28. Hake, R.R. Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses. Am. J. Phys. 1998, 66, 64–74. [Google Scholar] [CrossRef] [Green Version]
  29. Mezirow, J. An overview of transformative learning. In Lifelong Learning; Sutherland, P., Crowther, J., Eds.; Routledge: London, UK, 2006. [Google Scholar]
  30. Kember, D.; McKay, J.; Sinclair, K.; Wong, F.K.Y. A four-category scheme for coding and assessing the level of reflection in written work. Assess. Eval. High. Educ. 2008, 33, 369–379. [Google Scholar] [CrossRef]
  31. Chamoso, J.M.; Cáceres, M.J. Analysis of the reflections of student-teachers of mathematics when working with learning portfolios in Spanish university classrooms. Teach. Teach. Educ. 2009, 25, 198–206. [Google Scholar] [CrossRef]
  32. Prevost, L.B.; Haudek, K.C.; Henry, E.N.; Berry, M.C.; Urban-Lurain, M. Automated text analysis facilitates using written formative assessments for just-in-time teaching in large enrollment courses. In Proceedings of the ASEE Annual Conference and Exposition, Atlanta, GA, USA, 23–26 June 2013. [Google Scholar]
  33. Nicol, D.J.; Macfarlane-Dick, D. Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Stud. High. Educ. 2006, 31, 199–218. [Google Scholar] [CrossRef]
  34. Jacoby, J.C.; Heugh, S.; Bax, C.; Branford-White, C. Enhancing learning through formative assessment. Innov. Educ. Teach. Int. 2014, 51, 72–83. [Google Scholar] [CrossRef]
  35. Smith, D.M.; Kolb, D. Users’ Guide for the Learning Style Inventory: A Manual for Teachers and Trainers; McBer: Boston, MA, USA, 1986. [Google Scholar]
  36. Schellings, G. Applying learning strategy questionnaires: Problems and possibilities. Metacogn. Learn. 2011, 6, 91–109. [Google Scholar] [CrossRef]
  37. Thomas, G.P.; Anderson, D.; Nashon, S. Development of an instrument designed to investigate elements of science students’ metacognition, self-efficacy and learning processes: The SEMLI-S. Int. J. Sci. Educ. 2008, 30, 1701–1724. [Google Scholar] [CrossRef]
  38. Mokhtari, K.; Reichard, C. Assessing students’ metacognitive awareness of reading strategies. J. Educ. Psychol. 2002, 94, 249–259. [Google Scholar] [CrossRef]
  39. Feldon, D.F.; Peugh, J.; Timmerman, B.E.; Maher, M.A.; Hurst, M.; Strickland, D.; Gilmore, J.A.; Stiegelmeyer, C. Graduate students’ teaching experiences improve their methodological research skills. Science 2011, 333, 1037–1039. [Google Scholar] [CrossRef] [PubMed]
  40. Richardson, J.T.E. Methodological issues in questionnaire-based research on student learning in higher education. Educ. Psychol. Rev. 2004, 16, 347–358. [Google Scholar] [CrossRef]
  41. Watts, M.; Lawson, M. Using a meta-analysis activity to make critical reflection explicit in teacher education. Teach. Teach. Educ. 2009, 25, 609–616. [Google Scholar] [CrossRef]
  42. Creswell, J. Research Design: Qualitative, Quantitative and Mixed Methods Approaches, 2nd ed.; Psychological Association: Washington, DC, USA, 2003. [Google Scholar]
  43. Barrera, A.; Braley, R.T.; Slate, J.R. Beginning teacher success: An investigation into the feedback from mentors of formal mentoring programs. Mentor. Tutor. Partnersh. Learn. 2010, 18, 61–74. [Google Scholar] [CrossRef]
  44. Cronbach, L.J. Coefficient alpha and the internal structure of tests. Psychometrika 1951, 16, 297–334. [Google Scholar] [CrossRef] [Green Version]
  45. Gliem, J.A.; Gliem, R.R. Calculating, interpreting, and reporting Cronbach’s alpha reliability coefficient for Likert-type scales. In Proceedings of the Midwest Research-to-Practice Conference in Adult, Continuing, and Community Education, The Ohio State University, Columbus, OH, USA, 8–10 October 2003; pp. 82–88. [Google Scholar]
  46. Kember, D.; Leung, D.Y.P.; Jones, A.; Loke, A.Y.; McKay, J.; Sinclair, K.; Tse, H.; Webb, C.; Wong, F.K.Y.; Wong, M.W.L.; Yeung, E. Development of a questionnaire to measure the level of reflective thinking. Assess. Eval. High. Educ. 2000, 25, 381–395. [Google Scholar] [CrossRef]
  47. Black, P.E.; Plowright, D. A multi-dimensional model of reflective learning for professional development. Reflect. Pract. 2010, 11, 245–258. [Google Scholar] [CrossRef]
  48. Koole, S.; Dornan, T.; Aper, L.; De Wever, B.; Scherpbier, A.; Valcke, M.; Cohen-Schotanus, J.; Derese, A. Using video-cases to assess student reflection: Development and validation of an instrument. BMC Med. Educ. 2012, 12, 22. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  49. Wolters, C.A.; Benzon, M.B. Assessing and predicting college students’ use of strategies for the self-regulation of motivation. J. Exp. Educ. 2013, 81, 199–221. [Google Scholar] [CrossRef]
  50. Hargreaves, J. So how do you feel about that? Assessing reflective practice. Nurs. Educ. Today 2004, 24, 196–201. [Google Scholar] [CrossRef] [PubMed]
  51. Konak, A.; Clark, T.C.; Nasereddin, M. Using Kolb’s experimental learning cycle to improve student learning in virtual computer laboratories. Comput. Educ. 2014, 72, 11–22. [Google Scholar] [CrossRef]
  52. Leijen, Ä.; Valtna, K.; Leijen, D.A.J.; Pedaste, M. How to determine the quality of students’ reflection. Stud. High. Educ. 2012, 37, 203–217. [Google Scholar] [CrossRef]
  53. Stice, J.E. Using Kolb’s learning cycle to improve student learning. Eng. Educ. 1987, 77, 291–296. [Google Scholar]
  54. Wyrick, D.A.; Hilsen, L. Using Kolb’s cycle to round out learning. In Proceedings of the 2002 American Society for Engineering Education Annual Conference & Exposition, Montreal, QC, Canada, 16–19 June 2002. [Google Scholar]
  55. Vygotsky, L.S.; Cole, M. Mind in Society: The Development of Higher Psychological Processes; Harvard University Press: Cambridge, MA, USA, 1978. [Google Scholar]
  56. Colomer, J.; Pallisera, M.; Fullana, J.; Pérez-Burriel, M.; Fernández, R. Reflective learning in higher education: A comparative analysis. Procedia-Soc. Behav. Sci. 2013, 93, 364–370. [Google Scholar] [CrossRef]
Figure 1. Students’ ratings for each question in the RLQuest about the three methodologies (knowledge-centered approach (KA), assessment formative approach (AFA) and structured formative approach (SFA)).
Figure 2. Assessments of the four blocks of questions for the three methodologies.
Figure 3. Scores for questions from the first block: Knowledge about oneself.
Figure 4. Scores for questions from the second block: Connecting the experience with knowledge.
Figure 5. Scores for questions from the third block: self-reflection on the learning process.
Figure 6. Scores for questions from the fourth block: self-regulation of learning.
Table 1. Self-reporting reflective learning appraisal questionnaire questions from Q1 to Q18.
Questions Q1 to Q18 of the RLQuest
1. Knowledge about oneself
Q1. Analyze my behavior when confronting professional and daily situations.
Q2. Analyze what my emotions are when confronting professional and daily situations.
2. Connecting the experience with knowledge
Q3. Connect knowledge with my own experiences, emotions and attitudes.
Q4. Select relevant information and data in a given situation.
Q5. Formulate/contrast hypotheses in a given situation.
Q6. Provide reasons/arguments for decision-making in a given situation.
3. Self-reflection on the learning process
Q7. Improve my written communication skills.
Q8. Improve my oral communication skills.
Q9. Identify the positive aspects of my knowledge and skills.
Q10. Identify the negative aspects of my knowledge and skills.
Q11. Identify the positive aspects of my attitudes.
Q12. Identify the negative aspects of my attitudes.
Q13. Become aware of what I learn and how I learn it.
Q14. Understand that what I learn and how I learn it is meaningful for me.
4. Self-regulating learning
Q15. Plan my learning: the steps to follow, organize material and time.
Q16. Determine who or what I need to consult.
Q17. Regulate my learning, analyzing the difficulties I have and evaluating how to resolve the problems I encounter.
Q18. Evaluate how I plan my learning, the result, and what I need to do to improve.
Table 2. Reflective questions posed after the class activities to guide students through the reflection process in the assessment approaches, assessment formative approach (AFA) and structured formative approach (SFA).
Questions 1 to 10 for the AFA and SFA approaches
1. Formulate searching questions that help to analyze your own actions/thoughts during the whole process of viewing–analyzing–writing the experience.
2. What knowledge/feelings/values/former experiences do I use to formulate my answer(s).
3. What do I need to describe the experience? Do I identify the knowledge/skills that are necessary to describe the experiment?
4. Who or what do I need to consult to describe the experiment?
5. How do I organize myself to develop the experiment?
6. I identify the difficulties behind the definition of the understanding of the experiment.
7. I have planned the timing for the process of writing up the activities.
8. What should I do to improve the process of the activity?
9. What will the result of the process be?
10. What did I learn going through the understanding of the experience?
Table 3. Structured questions posed in the structured formative approach (SFA).
Questions 1 to 6 for the SFA
1. Description of the experiment. Summarize it by writing a short abstract describing the experiment.
2. Describe the methodology steps used in the experiment.
3. Write the hypothesis that was supported by the experiment.
4. Describe the analysis of the data obtained from the experiment.
5. Formulate questions derived from the experiment.
6. List the new knowledge that you have created from the experiment.
Table 4. Percentages of students’ outcomes in the categories of non-reflection, understanding, reflection, and critical reflection, for the twelve activities in the experiment, with four for each learning approach.
Approach | Activity | Non-reflection | Understanding | Reflection | Critical reflection
KA | Activity 1/2 | 24 | 57 | 12 | 7
KA | Activity 3 | 19 | 46 | 21 | 14
KA | Activity 4 | 11 | 33 | 39 | 17
AFA | Activity 5/6 | 5 | 32 | 43 | 20
AFA | Activity 7 | 4 | 25 | 48 | 23
AFA | Activity 8 | 4 | 19 | 49 | 28
SFA | Activity 9/10 | 3 | 27 | 43 | 27
SFA | Activity 11 | 4 | 31 | 36 | 29
SFA | Activity 12 | 3 | 29 | 42 | 26

