Article

Methodological Rigor in Laboratory Education Research

by
Hendra Y. Agustian
Department of Science Education, University of Copenhagen, 2100 Copenhagen, Denmark
Laboratories 2024, 1(1), 74-86; https://doi.org/10.3390/laboratories1010006
Submission received: 6 May 2024 / Revised: 8 June 2024 / Accepted: 11 June 2024 / Published: 17 June 2024

Abstract

Despite the growing number of published studies on student learning in the laboratory, there is a critical need to improve methodological rigor. Resonating with discussions on research methods, this paper outlines the importance of theory-informed research questions, the minimization of researcher and participant biases, and the use of triangulation and iteration in data collection to establish rigor. An illustrative case is presented within the context of a large interdisciplinary research project aimed at improving laboratory learning at the university level. The project incorporates two research avenues: one focusing on student and faculty perspectives, and the other on a comprehensive assessment of multidimensional learning in the laboratory. The project employs a mixed methods paradigm and is grounded in a conceptual framework that treats laboratory work as epistemic practice, requiring a holistic analysis of student learning. The article concludes by discussing the results and implications of the project’s findings, which are synthesized to highlight aspects of establishing methodological rigor. The overarching goal is to develop a comprehensive assessment instrument that captures the complexity and richness of the laboratory learning environment. The findings from this research are expected to contribute to the advancement of laboratory education research by providing a model for methodological rigor that can be applied across various scientific and interdisciplinary contexts.

1. Introduction

Research on student learning in the laboratory constitutes a rapidly growing corpus of knowledge. Using the Web of Science and ERIC databases, a recent systematic review of learning outcomes associated with laboratory instruction in university chemistry education shows how the number of publications in leading journals grew substantially over the past decades up to mid-2019, as shown in Figure 1 [1]. Included studies fulfilled six criteria for relevance and quality: they addressed a topic within education research, fell within the STEM fields, were supported by empirical data, focused on student learning outcomes, were explicitly about chemistry education, and were situated at the post-secondary level (for details, see Agustian et al. [1]). The review demonstrates that laboratory instruction lends itself to various learning outcomes and experiences that may differ from those of other settings such as lectures and tutorials. In higher education teaching practice, however, this claim is more often assumed than substantiated. Longstanding critics of the effectiveness of laboratory instruction reassert the need to improve research and practice in this field by focusing on evidence and rigor and revisiting the philosophical foundations for what constitutes learning [2,3,4].
The review above showed that researchers have used widely varied theories and methodologies. They espoused theoretical frameworks such as self-determination theory, social constructivism, active learning theory, and cognitive load theory; around three-quarters of the included studies described some form of theoretical framing. Methodologically, chemistry education researchers used approaches such as phenomenography, (quasi-)experimental design, learning analytics, action research, and ethnography. Notably, there were only 3 discourse studies among the 355 included in the systematic review. Some of the most recent debates in science education research point to the importance of examining social and interactional aspects of learning in naturalistic settings, which discourse analyses may afford [5,6,7,8]. The review incorporated critical appraisals of the quality of reviewed studies, according to the criteria proposed by Zawacki-Richter and colleagues [9]. Drawing on insights from these appraisals, it was recommended that future studies could benefit from greater methodological rigor, which mirrors Hofstein and Lunetta’s [3] recommendation in their major review of laboratory education research. While there have been advancements in the theories and methodologies used in this setting, some areas still necessitate improvement and refinement.
This methodology paper seeks to delineate rigor within laboratory education research and to illustrate an attempt to address it through a dedicated research project. The notions of “triangulation” and “multiperspectivity” will be used to characterize it. Relevant literature in laboratory education research will be consulted in light of debates on overarching methods and methodology. The systematic review above serves as a point of departure, and some of its data are revisited to support claims about rigor. While this paper is primarily situated in the chemistry education research literature, many elements are transferable to other science disciplines as well as interdisciplinary contexts.

2. Contextualizing Rigor

Rigor is a critical aspect of conducting science education research and can be understood in both theoretical and methodological terms. It refers to the extent to which research processes are controlled, systematic, and reliable. Although the ideals of precision and accuracy, as well as validity and reliability, have been debated among qualitative and quantitative researchers [10,11,12], it is generally agreed that for a piece of research to have scientific merit, it has to meet certain quality criteria. These criteria are often centered on clear and theoretically informed research questions, explicated and minimized researcher bias, and triangulated methods or iterative approaches within a single method.

2.1. Theory-Informed Research Questions

Underlying the rationale for choices made in an inquiry is the research question. Rigor is established when the question is researchable, meaningful, and informed by theory [13,14]. In chemistry education research, Bunce [14] particularly emphasizes the importance of theory-based questions by clearly situating them within the existing knowledge structure, such as in cognitive science, learning sciences, sociology, and so forth. While this may seem self-evident to seasoned science education researchers, it may not be for novices. To address this, existing theories and prior research should be critically reviewed. Next, it is useful to develop some form of conceptual framework, in which key concepts and terms are described and defined within the context of laboratory education [15]. Then, this framework can be used to formulate research questions that are directly derived from existing theories. As per general guidelines, these questions should be specific, empirically substantiated, and aligned with the theoretical underpinnings.
For example, Kiste and colleagues [16] use active learning theory to situate their study on the assessment of learning in an integrated chemistry laboratory–lecture experience. They build on the overwhelming evidence that active learning pedagogies increase student knowledge, positively influence affective strategies, and decrease failure rates. Informed by established theory and strong empirical evidence, they formulated research questions around “What is the effect of the integrated laboratory-lecture environment on student knowledge, course grades, failure rates and learning attitudes?” Likewise, Wei and colleagues [17] frame their investigation into student interactions in chemistry laboratories within social constructivism, particularly the so-called Model of Educational Reconstruction. They argue that the model highlights the importance of considering students’ perspectives on interactions in the laboratory to improve their understanding of the principles underlying laboratory work. As such, the authors pursue a research question such as “What do undergraduate students consider to be important interactions for effective learning in introductory chemistry laboratories?” Both are among the most highly regarded studies in the aforementioned systematic review, which shows how theory-informed research questions can enhance research quality and rigor.
In the illustrative case of the IQ-Lab project elaborated in the next section, the theory of competence development [18,19] is used to design interrelated lines of investigation into student learning in the laboratory. This theoretical framing dovetails with a mandate from the Danish Ministry of Education that altogether points to gaps in knowledge yet to be addressed. As such, the initial design of the project seeks to address a research question such as “Which factors influence pharmaceutical students’ acquisition of laboratory-related competencies, and how can such competencies be assessed?”.

2.2. Researcher and Participant Biases

Rigor is also established when biases, both those of the researchers and those of the participants, are explicated and minimized. Researcher bias is a persistent issue, particularly in qualitative research [12,20]. Even at the initial stage, the researcher’s background, beliefs, and experiences may bias the choices made in formulating research questions, which may influence which contexts and participants are involved. At any later stage, these biases could also influence data collection and interpretation. Johnson and colleagues [12] maintain that in order to establish rigor, these biases should be acknowledged and addressed. For novice researchers, the following tips may be useful. Firstly, research training is beneficial for increasing awareness of potential unconscious biases [21]. These biases may be related to positionality, but also to gender, race, institutional prestige, and so forth. Secondly, collaboration and peer review within the research group are essential. Even if a part of the work is associated with a sole researcher, it is always recommended to seek feedback from peers and, if possible, obtain external perspectives. Lastly, transparency is paramount: thoroughly documenting all research procedures, decisions, and potential limitations and biases allows others to critically evaluate the research.
An example from laboratory education research can be drawn from Markow and Lonning [22], who used a quasi-experimental research design to study the effect of the construction of pre-laboratory and post-laboratory concept maps on students’ conceptual understanding. They acknowledge that the researcher teaching both the experimental and control groups could exert researcher biases. To minimize these, two chemistry educators viewed videotapes of the pre-laboratory instruction for both groups and judged the nature of the instruction. Also, the pre-laboratory concept maps were graded separately prior to the post-laboratory concept maps, which was “purposefully done to avoid researcher bias” (p. 1021).
Since science education research is concerned with human participants, biases could also originate in the participants’ backgrounds, beliefs, and experiences. Winberg and Berg [23] speak of “prestige bias” in their interview study on the effects of pre-laboratory activities on cognitive focus during experiments, which refers to participants’ own perceptions of how a statement would make them appear. When students know they are being evaluated, they may focus on what the researchers may want to hear. The researchers focused their study on the spontaneous use of chemistry knowledge, minimizing participant bias by “avoiding explicitly examining chemistry questions” (p. 1119). Interview studies are particularly prone to both types of biases, and efforts should be made to explicate and reduce them. A way of resolving this problem is by operationalizing the notion of “multiperspectivity”.
Used to characterize methodological rigor, multiperspectivity in educational research can be defined as a narrative approach to investigating the lived experiences of research participants in which multiple, often discrepant viewpoints are employed for the substantiation and evaluation of a subject of interest [24,25]. The philosophical foundation for addressing multiperspectivity in research is twofold. On the one hand is the partiality of knowledge and experiences, which refers to the idea that our understanding of the world and reality is limited and incomplete [26]. On the other hand is intersubjectivity, which refers to the variety of possible relations between people’s perspectives [27]. In substantiating stakeholders’ perspectives on learning in the laboratory, multiple perspectives are aimed at resolving the limitations and presumed biases of any one viewpoint and set of lived experiences, by exploring alternative narratives from other stakeholders. It is also useful to establish the relations between these perspectives by identifying similarities and differences, verifying factual information, sketching nuances, and exploring the richness of the context. An example from laboratory education research where multiperspectivity was addressed can be drawn from the Bretz and Towns groups’ series of papers exploring faculty and student goals for learning in the laboratory [28,29,30,31]. Primarily contextualized within meaningful learning theory, their studies provide a comprehensive analysis of what the stakeholders of laboratory education think when it comes to laboratory learning that goes beyond the rote, isolated accumulation of knowledge and skills.

2.3. Iteration and Triangulation

The third cornerstone of methodological rigor concerns data collection. In a single-method approach, such as a survey or interview study, rigor is established through iteration [12]. For example, Santos-Diaz and colleagues [32] describe how they iteratively administered a questionnaire on undergraduate students’ laboratory goals through three stages of development. After a pilot implementation involving 904 participants, they modified the survey to ensure validity and reliability. In their attempt at establishing rigor, they acknowledge that surveys could cause response fatigue, so redundancy should be avoided. Likewise, Vitek and colleagues [33] used an iterative method in the design and development of a project-based rubric to assess clinical chemistry residents. Iteration is equally central in qualitative research: in their phenomenological study on student experiences of reform in a general chemistry laboratory, Chopra and colleagues [34] emphasized the inductive and iterative processes, involving five researchers, of reducing verbal data to the essence and meaning of experiences. This cyclical process is insightful, as it demonstrates how the researchers minimize biases through refinement, agreement, and contestation. In many qualitative studies where only one researcher is involved, this may not be possible; therefore, other measures should be taken to ensure that the findings are generated in a comparably rigorous fashion. One way of carrying this out is through triangulation.
In its simplest definition, triangulation refers to the combination of quantitative and qualitative approaches in the same study [35]. As such, it is predominantly associated with mixed methods research, although it has also been developed within the qualitative research tradition [36]. The core argument for triangulation in establishing rigor lies in the assertion that the use of multiple methods produces stronger inferences, addresses more diverse research questions, generates more diverse findings, increases data validity, and reduces bias [35,37]. In their study on argumentation in the laboratory, Walker and Sampson [38] used three different data sources to track students’ development of argumentative competence over time. Performance tasks evaluated at three points in time, video recordings of students engaging in argumentation in the laboratory, and their laboratory reports were used to make triangulated claims about how students learn to argue in the laboratory, and how engaging in argumentation also generates learning. By addressing methodological rigor, the study provides genuine evidence for learning outcomes associated with higher-order thinking skills.
Qualitative researchers have pursued triangulation in several ways [35,36,39]. One way is by combining various methods of qualitative data collection and analysis, such as interviews, observations, visual data, discourse analysis, and document analysis. Another is by meaningfully combining the theories that underpin these methods, such that different epistemological assumptions are made explicit and set in dialogue [39]. Lawrie and colleagues [40] combined wiki laboratory notebooks and open response items to investigate learning outcomes and processes in an inquiry-type nanoscience chemistry laboratory course. Their study shows that the wiki environment enhances student understanding of experimental processes and communication of results. As a comparison, Horowitz [41] incorporated a triangulation of theories in their study on intrinsic motivation in a project-based organic chemistry laboratory. By critically engaging with social constructivism, self-determination theory, and interest theory, they demonstrate how students’ responses to curriculum implementation are mediated by their achievement goal orientations, particularly among mastery-oriented students.

3. Illustrative Case

3.1. Context

The context I use to illustrate rigor in this paper is a large research project aimed at improving learning in the laboratory at the university level. An interdisciplinary, collaborative project between the Department of Science Education and the Department of Pharmacy at the University of Copenhagen, it involves several researchers at different career levels and from various scientific backgrounds, including chemistry, physics, biology, pharmaceutical sciences, mathematics, and philosophy. It has been running since 2019 and is nearing completion. As of 2024, some of the research findings have been synthesized [42]. The project is presented here to highlight aspects of establishing methodological rigor, as argued in the previous section. The project incorporates two research avenues, as shown in Figure 2. One is the initial design of the project, structured in several work packages (WPs): WP2 explores the literature and faculty perspectives, WP3 explores student perspectives, and WP4 explores the study program perspective. These are shown in blue in Figure 2, and most of the findings from this part have been published. The other avenue, shown in green, is a research program within the project focusing on a comprehensive assessment of multidimensional learning in the laboratory (henceforth referred to as CAMiLLa), which is grounded in a conceptual framework and working model for the integration of learning domains in the laboratory [15].
To address methodological rigor, each study within the project was designed with careful consideration of the aspects discussed in the previous section. WP2, WP3, and WP4 were primarily designed as qualitative studies; as such, multiperspectivity and a combination of qualitative data sources are central to them. In CAMiLLa, an attempt has been made to enhance rigor by means of triangulation of methods, so this part of the project is designed within a mixed methods paradigm. It is centered on a framework for comprehensively mapping domains of learning in the laboratory. Within this framing, laboratory work is conceptualized as epistemic practice [43,44], referring to processes related to the co-construction and evaluation of knowledge, in which students activate their cognition, affect, and conation, and coordinate their bodies with those faculties of mind in continual negotiation with their peers. The conceptual framework regards student learning as a multidimensional construct that requires a holistic and comprehensive analysis beyond a typical cognitive view. The overarching goal of these studies is to develop a comprehensive assessment instrument that does justice to the complexity and richness of learning environments in this challenging context.

3.2. Methods

In total, 212 students and 50 faculty members have participated in the project, spanning five cohorts of students enrolled in various laboratory courses. They volunteered through open calls. Ethical considerations were secured through approval from the Institutional Review Board (case number 514-0278/21-5000), as per the university’s general data protection regulation policy. The original WPs in the project primarily used transcripts from the interviews as data, with additional materials such as laboratory reports, curriculum documents, and laboratory manuals. The data were analyzed using thematic [45] and phenomenographic analyses [46]. In CAMiLLa, the data were generated from multimodal laboratory discourse transcripts, focus group interviews, laboratory reports, and surveys. Adhering to the principles of ethnography, students carrying out experimental work in the laboratories of analytical chemistry and physical chemistry were observed and recorded (both video and audio). Their interactions and conversations with each other, with instructors, and with laboratory technicians were converted into lines of utterances and non-verbal cues. These form a laboratory discourse [15,47] that differs slightly from classroom discourse [5] and incorporates more modalities [48], including gestural-kinesthetic and instrumental-operational. Immediately following the laboratory exercise, students were interviewed, guided by and contextualized in either snippets of the video recordings or excerpts from their laboratory reports. The interviews elicited their reasoning skills, argumentation, collaborative learning behavior, problem-solving, affective experiences, goal setting, motivational and volitional strategies, as well as broader epistemic understanding related to the uncertainty of measurements, limitations of knowledge, and scientific practices. 
These prompts also informed the design of a questionnaire focusing on the conative and affective domains of learning in the laboratory, which is grounded in previously validated instruments [49,50,51,52,53].
The laboratory discourse was analyzed using epistemic network analysis (ENA) [54,55] and microanalytic discourse analysis [56]. Both types of discourse analysis were needed because the laboratory discourse itself was characterized inductively at the outset. Pertinent themes in both verbal and non-verbal interactions were coded inductively as stanzas and, as such, were microanalyzed. This was not carried out on the entire corpus; once a degree of saturation was achieved (no new themes emerged), the data were analyzed using ENA, involving a second coder. The other data sources were analyzed differently: interview data were analyzed thematically [45], akin to the previous studies [57], whereas questionnaire data were analyzed with analysis of variance.
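To make the ENA step concrete, its core quantitative move (tallying how often pairs of codes co-occur within a stanza, prior to the normalization and dimensional reduction of the full method) can be sketched in a few lines of Python. The code labels and stanza data below are purely illustrative and do not come from the project's coding scheme.

```python
from itertools import combinations

# Hypothetical discourse codes; the project's actual scheme differs.
CODES = ["conceptual", "procedural", "instrumental", "epistemic"]

# Each stanza is represented by the set of codes applied to it
# (illustrative data, not from the study).
stanzas = [
    {"procedural", "instrumental"},
    {"conceptual", "epistemic"},
    {"procedural", "instrumental", "epistemic"},
    {"procedural", "instrumental"},
]

def cooccurrence(stanzas, codes):
    """Count, for every pair of codes, the number of stanzas in which
    both appear. This binary co-occurrence tally is the input that ENA
    normalizes and projects into a network space."""
    counts = {pair: 0 for pair in combinations(codes, 2)}
    for stanza in stanzas:
        for pair in combinations(codes, 2):
            if pair[0] in stanza and pair[1] in stanza:
                counts[pair] += 1
    return counts

edges = cooccurrence(stanzas, CODES)
# In this toy corpus, the procedural-instrumental edge is strongest,
# while conceptual talk connects weakly to everything else.
```

In practice, ENA also normalizes these tallies per unit of analysis and performs a dimensional reduction; dedicated tooling (e.g., the rENA package for R) is typically used for those steps.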

4. Results and Discussion

4.1. Multiperspectivity on Laboratory Learning across the Work Packages

The multiple perspectives on learning in the laboratory from the literature and faculty perspectives (WP2) are centered on the conceptualization of experimental competencies, as shown in Figure 3. Each cluster of learning outcomes in this graph constitutes a host of pedagogical constructs substantiated in the literature of laboratory education research in university chemistry. Experimental competencies are by and large the main learning outcomes associated with experimental work. They include basic practical skills, such as procedural skills and laboratory techniques, from pipetting to operating advanced analytical instruments, as well as more advanced experimental design. Within the cluster of “higher-order thinking skills”, constructs such as “reasoning” and “argumentation” have been shown to be associated with laboratory work. A faculty perspective on the contextualization of higher-order thinking skills in the laboratory is that students also develop these skills when they are given opportunities to design an experiment [57]. Likewise, “conceptual learning” in the laboratory concerns learning the scientific concepts underlying experimental work. Scholars have been critical of justifications for laboratory work in addressing conceptual learning goals [4,58]. Therefore, within this cluster, the focus was shifted towards establishing connections between the underlying scientific principles (loosely termed “theory”) and the experiment at hand (loosely termed “practice”), which faculty members also perceive to be a central goal of laboratory work [57].
To illustrate how multiperspectivity was used to establish rigor, the notion of “theory-practice connection” was investigated from the student perspective as part of WP3. Students’ conceptions of this notion fall into at least three categories: laboratory work as a visual representation of theory, as multimodal support for theory, and as a complementary perspective for understanding theory. Since the studies within WP2 and WP3 were conducted in the same context, it can be argued that together they address both the breadth and depth of understanding of what and how students learn in the laboratory, at least to a certain degree. While WP2 provides a broad overview of learning outcomes associated with laboratory work, WP3 delves deeper into one construct that originates in WP2. Some of the ideas related to conceptual learning were investigated further, as described in the following section, to provide more perspectives and forms of triangulation.

4.2. Triangulation and the Comprehensive Assessment

Triangulated findings from the comprehensive assessment provide meaningful insights into student learning processes in the laboratory, which complement the previous focus on learning outcomes. As mentioned above, these are currently divided into several parts. First, there is multimodal discourse in the laboratory. This part is primarily concerned with how laboratory discourse is characterized by multimodality [6,48] and multiple representations [59]. The use of gestures in explaining a chemical concept, for example, is evinced in a stanza where Felix, a participant, explains the concept of ‘tailing factor’ using his hands to Eliana, his lab partner (Table 1).
The tailing factor (Tf) is a concept widely used in the pharmaceutical industry to quantify peak tailing. As is well known in chromatographic science, an ideal chromatographic peak is sharp, symmetrical, and Gaussian in shape; in reality, it may look more like Figure 4.
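For readers unfamiliar with the metric, the tailing factor is conventionally computed from the peak profile at 5% of peak height (the USP convention): Tf = W / (2f), where W is the full peak width at that height and f is the distance from the peak’s leading edge to its maximum. A minimal sketch, with illustrative numbers rather than data from the study:

```python
def tailing_factor(front_half_width: float, total_width: float) -> float:
    """USP tailing factor Tf = W / (2 * f), where W is the full peak
    width at 5% of peak height and f is the distance from the leading
    edge to the peak maximum at that height. Tf = 1 for a perfectly
    symmetrical (Gaussian) peak; Tf > 1 indicates tailing."""
    if front_half_width <= 0 or total_width <= 0:
        raise ValueError("peak widths must be positive")
    return total_width / (2 * front_half_width)

# Illustrative widths (in minutes) for a mildly tailing peak:
tf = tailing_factor(front_half_width=0.10, total_width=0.28)  # Tf is approx. 1.4
```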
The stanza in Table 1 is an example of how students invoke a concept relevant to what they were about to do, i.e., analyzing tablets containing caffeine and acetaminophen using high-performance liquid chromatography (HPLC) in the laboratory of analytical chemistry. A typical analysis in the pharmaceutical sciences, it requires them to plan an inquiry into how to analyze the content of the tablets and to determine whether the content indicated on the label is true. The experiment required many aspects to be considered: technical skills in preparing solutions and operating the instrument, experimental design competence, reasoning skills, argumentative competence, safety and sustainability protocols, conceptual understanding, sensory-perceptual skills in reading scales and menisci, mind–body coordination during delicate procedures of transferring volatile (and poisonous!) liquids, group dynamics, negotiation of meaning and task distribution, time management, and emotional, motivational, and volitional control and regulation, all while trying to learn and somehow appreciate their time in the laboratory. A previous study shows that most students appreciate being able to be in the laboratory and perform laboratory exercises [60].
In the larger corpus of data, conceptual discourse may not be the most prominent aspect of student learning processes, as can be discerned from the structure of different archetypes of discourse in the laboratory. Shown in Figure 5 is a visualized structure of discourse drawn from a pair (Alice and Belle) conducting an experiment on electrochemistry in the laboratory of physical chemistry, where the curriculum and instruction are mainly expository [61]. It shows that the connections between conceptual discourse and the other discourse types (e.g., instrumental and epistemic) are the weakest; hence, the interactions between students in this laboratory are concerned less with concepts than with procedures and instruments. Such insight was previously attainable only through quantification of interaction types, as in an earlier discourse study on gas behavior [62]. Epistemic network analysis, however, captures how the discourse types connect to each other much better.
CAMiLLa is aimed at providing more comprehensive and holistic information about student learning from a multidimensional view. It has been a daunting task to capture the details of learning processes through such a theoretical lens, especially when the video corpus was transcribed and analyzed with multimodal discourse analysis. Nonetheless, snippets from the discourse so far are valuable in sketching nuances and characterizing the interior architecture of student learning. Instances of authentic conversations during various discursive moves in the laboratory, as well as during the focus group interviews, point to the evidence for epistemic affect [63], which is an emerging construct in the learning sciences referring to affective and emotional states associated with co-construction of (scientific) knowledge. Triangulation of methods allows for cross-checking, elaboration, and upscaling of evidentiary claims [64].

5. Conclusions

This paper explored how a pertinent concept in carrying out science education research, i.e., methodological rigor, is described in the laboratory education research literature, and illustrated it with a large research project aimed at improving learning in the laboratory at the university level. The first part of the paper discussed three cornerstones of methodological rigor. It was argued that to enhance rigor, laboratory education researchers should (1) formulate theory-informed research questions, (2) minimize researcher and participant biases, and (3) employ iteration and triangulation in data collection. The first cornerstone involves reviewing existing theories and empirical findings and developing a conceptual framework grounded in theory. To address the second, researchers are encouraged to increase awareness of biases through training, collaborate and seek peer review, maintain transparency by thoroughly documenting procedures and limitations, and explore multiple stakeholder viewpoints. Lastly, the third cornerstone can be addressed by engaging in cyclical refinement within a single method or by combining multiple qualitative and quantitative methods. The results demonstrate that multiperspectivity and triangulation of methods are key to improving rigor in this rapidly developing field. Detailed discursive moves provide a first-hand account of learning processes as they unfold in their authentic, naturalistic setting, complementing previously reconceptualized responses to prompts about learning generated through interviews and surveys.
Establishing methodological rigor is a prerequisite for high-quality empirical knowledge in science education research. However, it can be challenging, particularly when the research setting is a complex learning environment such as a teaching laboratory. These challenges can relate to limited resources, including the number of researchers involved in the various stages of a study. A dominant research paradigm within an institution can also act as a limiting factor, in that researchers may feel compelled to comply with the majority. For example, certain institutions may profile themselves more strongly within a qualitative research tradition, whereas others assign more value to large-scale quantitative research. Improving rigor may entail challenging such a paradigm. Indeed, this has been demonstrated in the paradigm wars of the past few decades [35,65]. Mixed methods research emerged against a backdrop of debates on incommensurability, but chief among the arguments for advancing this third paradigm is enhanced rigor [66]. Several aspects of rigor have been contextualized in laboratory education research. As illustrated in this paper, they are perhaps best addressed in a more substantial research project consisting of several lines of inquiry. Of course, any single study could and should strive for rigor, but larger projects such as this one allow for methodological and epistemological diversity.

Funding

The research presented in this article is supported by the Novo Nordisk Foundation, grant NNF 18SA0034990.

Institutional Review Board Statement

The study was approved by the Institutional Review Board (Case number 514-0278/21-5000).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data are available on reasonable request from the author.

Conflicts of Interest

There is no conflict of interest to declare.

References

  1. Agustian, H.Y.; Finne, L.T.; Jørgensen, J.T.; Pedersen, M.I.; Christiansen, F.V.; Gammelgaard, B.; Nielsen, J.A. Learning Outcomes of University Chemistry Teaching in Laboratories: A Systematic Review of Empirical Literature. Rev. Educ. 2022, 10, e3360. [Google Scholar] [CrossRef]
  2. Bretz, S.L. Evidence for the Importance of Laboratory Courses. J. Chem. Educ. 2019, 96, 193–195. [Google Scholar] [CrossRef]
  3. Hofstein, A.; Lunetta, V.N. The Laboratory in Science Education: Foundations for the Twenty-First Century. Sci. Educ. 2003, 88, 28–54. [Google Scholar] [CrossRef]
  4. Kirschner, P.A. Epistemology, Practical Work and Academic Skills in Science Education. Sci. Educ. 1992, 1, 273–299. [Google Scholar] [CrossRef]
  5. Sandoval, W.A.; Kawasaki, J.; Clark, H.F. Characterizing Science Classroom Discourse across Scales. Res. Sci. Educ. 2021, 51, 35–49. [Google Scholar] [CrossRef]
  6. Tang, K.S.; Park, J.; Chang, J. Multimodal Genre of Science Classroom Discourse: Mutual Contextualization between Genre and Representation Construction. Res. Sci. Educ. 2022, 52, 755–772. [Google Scholar] [CrossRef]
  7. Current, K.; Kowalske, M.G. The Effect of Instructional Method on Teaching Assistants’ Classroom Discourse. Chem. Educ. Res. Pract. 2016, 17, 590–603. [Google Scholar] [CrossRef]
  8. Wieselmann, J.R.; Keratithamkul, K.; Dare, E.A.; Ring-Whalen, E.A.; Roehrig, G.H. Discourse Analysis in Integrated STEM Activities: Methods for Examining Power and Positioning in Small Group Interactions. Res. Sci. Educ. 2021, 51, 113–133. [Google Scholar] [CrossRef]
  9. Zawacki-Richter, O.; Kerres, M.; Bedenlier, S.; Bond, M.; Buntins, K. Systematic Reviews in Educational Research: Methodology, Perspectives and Application; Springer VS: Wiesbaden, Germany, 2020. [Google Scholar]
  10. Coryn, C.L.S. The Holy Trinity of Methodological Rigor: A Skeptical View. J. MultiDiscip. Eval. 2007, 4, 26–31. [Google Scholar] [CrossRef]
  11. Erickson, F.; Gutierrez, K. Culture, Rigor, and Science in Educational Research. Educ. Res. 2002, 31, 21–24. [Google Scholar] [CrossRef]
  12. Johnson, J.L.; Adkins, D.; Chauvin, S. A Review of the Quality Indicators of Rigor in Qualitative Research. Am. J. Pharm. Educ. 2020, 84, 138–146. [Google Scholar] [CrossRef] [PubMed]
  13. Arfken, C.L.; MacKenzie, M.A. Achieving Methodological Rigor in Education Research. Acad. Psychiatry 2022, 46, 519–521. [Google Scholar] [CrossRef] [PubMed]
  14. Bunce, D.M. Constructing Good and Researchable Questions. In Nuts and Bolts of Chemical Education Research; ACS Publications: Washington, DC, USA, 2008; pp. 35–46. [Google Scholar]
  15. Agustian, H.Y. Considering the Hexad of Learning Domains in the Laboratory to Address the Overlooked Aspects of Chemistry Education and Fragmentary Approach to Assessment of Student Learning. Chem. Educ. Res. Pract. 2022, 23, 518–530. [Google Scholar] [CrossRef]
  16. Kiste, A.L.; Scott, G.E.; Bukenberger, J.; Markmann, M.; Moore, J. An Examination of Student Outcomes in Studio Chemistry. Chem. Educ. Res. Pract. 2017, 18, 233–249. [Google Scholar] [CrossRef]
  17. Wei, J.; Mocerino, M.; Treagust, D.F.; Lucey, A.D.; Zadnik, M.G.; Lindsay, E.D.; Carter, D.J. Developing an Understanding of Undergraduate Student Interactions in Chemistry Laboratories. Chem. Educ. Res. Pract. 2018, 19, 1186–1198. [Google Scholar] [CrossRef]
  18. Ellström, P.-E.; Kock, H. Competence Development in the Workplace: Concepts, Strategies and Effects. Asia Pac. Educ. Rev. 2008, 9, 5–20. [Google Scholar] [CrossRef]
  19. Sandberg, J. Understanding of Work: The Basis for Competence Development. In International Perspectives on Competence in the Workplace; Springer: Dordrecht, The Netherlands, 2009; pp. 3–20. [Google Scholar]
  20. Hamilton, J.B. Rigor in Qualitative Methods: An Evaluation of Strategies Among Underrepresented Rural Communities. Qual. Health Res. 2020, 30, 196–204. [Google Scholar] [CrossRef] [PubMed]
  21. Lalu, M.M.; Presseau, J.; Foster, M.K.; Hunniford, V.T.; Cobey, K.D.; Brehaut, J.C.; Ilkow, C.; Montroy, J.; Cardenas, A.; Sharif, A.; et al. Identifying Barriers and Enablers to Rigorous Conduct and Reporting of Preclinical Laboratory Studies. PLoS Biol. 2023, 21, e3001932. [Google Scholar] [CrossRef] [PubMed]
  22. Markow, P.G.; Lonning, R.A. Usefulness of Concept Maps in College Chemistry Laboratories: Students’ Perceptions and Effects on Achievement. J. Res. Sci. Teach. 1998, 35, 1015–1029. [Google Scholar] [CrossRef]
  23. Winberg, T.M.; Berg, C.A.R. Students’ Cognitive Focus during a Chemistry Laboratory Exercise: Effects of a Computer-Simulated Prelab. J. Res. Sci. Teach. 2007, 44, 1108–1133. [Google Scholar] [CrossRef]
  24. Hartner, M. Multiperspectivity. In Handbook of Narratology; Hühn, P., Meister, J.C., Pier, J., Schmid, W., Eds.; De Gruyter: Berlin, Germany, 2014; pp. 353–363. [Google Scholar]
  25. Kropman, M.; van Drie, J.; van Boxtel, C. The Influence of Multiperspectivity in History Texts on Students’ Representations of a Historical Event. Eur. J. Psychol. Educ. 2023, 38, 1295–1315. [Google Scholar] [CrossRef]
  26. Feyerabend, P.K. Knowledge, Science and Relativism; Preston, J., Ed.; Cambridge University Press: Cambridge, UK, 1999; Volume 3. [Google Scholar]
  27. Gillespie, A.; Cornish, F. Intersubjectivity: Towards a Dialogical Analysis. J. Theory Soc. Behav. 2010, 40, 19–46. [Google Scholar] [CrossRef]
  28. Bretz, S.L.; Fay, M.; Bruck, L.B.; Towns, M.H. What Faculty Interviews Reveal about Meaningful Learning in the Undergraduate Chemistry Laboratory. J. Chem. Educ. 2013, 90, 281–288. [Google Scholar] [CrossRef]
  29. Bruck, L.B.; Towns, M.H.; Bretz, S.L. Faculty Perspectives of Undergraduate Chemistry Laboratory: Goals and Obstacles to Success. J. Chem. Educ. 2010, 87, 1416–1424. [Google Scholar] [CrossRef]
  30. DeKorver, B.K.; Towns, M.H. General Chemistry Students’ Goals for Chemistry Laboratory Coursework. J. Chem. Educ. 2015, 92, 2031–2037. [Google Scholar] [CrossRef]
  31. Mack, M.R.; Towns, M.H. Faculty Beliefs about the Purposes for Teaching Undergraduate Physical Chemistry Courses. Chem. Educ. Res. Pract. 2016, 17, 80–99. [Google Scholar] [CrossRef]
  32. Santos-Díaz, S.; Hensiek, S.; Owings, T.; Towns, M.H. Survey of Undergraduate Students’ Goals and Achievement Strategies for Laboratory Coursework. J. Chem. Educ. 2019, 96, 850–856. [Google Scholar] [CrossRef]
  33. Vitek, C.R.; Dale, J.C.; Homburger, H.A.; Bryant, S.C.; Saenger, A.K.; Karon, B.S. Development and Initial Validation of a Project-Based Rubric to Assess the Systems-Based Practice Competency of Residents in the Clinical Chemistry Rotation of a Pathology Residency. Arch. Pathol. Lab. Med. 2014, 138, 809–813. [Google Scholar] [CrossRef]
  34. Chopra, I.; O’Connor, J.; Pancho, R.; Chrzanowski, M.; Sandi-Urena, S. Reform in a General Chemistry Laboratory: How Do Students Experience Change in the Instructional Approach? Chem. Educ. Res. Pract. 2017, 18, 113–126. [Google Scholar] [CrossRef]
  35. Denzin, N.K. Moments, Mixed Methods, and Paradigm Dialogs. Qual. Inq. 2010, 16, 419–427. [Google Scholar] [CrossRef]
  36. Gorard, S.; Taylor, C. Combining Methods in Educational and Social Research; Open University Press: Maidenhead, UK, 2004. [Google Scholar]
  37. Mayoh, J.; Onwuegbuzie, A.J. Toward a Conceptualisation of Mixed Methods Phenomenological Research. J. Mix. Methods Res. 2015, 9, 91–107. [Google Scholar] [CrossRef]
  38. Walker, J.P.; Sampson, V. Learning to Argue and Arguing to Learn: Argument-driven Inquiry as a Way to Help Undergraduate Chemistry Students Learn How to Construct Arguments and Engage in Argumentation during a Laboratory Course. J. Res. Sci. Teach. 2013, 50, 561–596. [Google Scholar] [CrossRef]
  39. Flick, U. Triangulation in Qualitative Research. In A Companion to Qualitative Research; SAGE Publications: London, UK, 2004. [Google Scholar]
  40. Lawrie, G.A.; Grøndahl, L.; Boman, S.; Andrews, T. Wiki Laboratory Notebooks: Supporting Student Learning in Collaborative Inquiry-Based Laboratory Experiments. J. Sci. Educ. Technol. 2016, 25, 394–409. [Google Scholar] [CrossRef]
  41. Horowitz, G. The Intrinsic Motivation of Students Exposed to a Project-Based Organic Chemistry Laboratory Curriculum. Master’s Thesis, Columbia University, New York, NY, USA, 2009. [Google Scholar]
  42. Seery, M.K.; Agustian, H.Y.; Christiansen, F.V.; Gammelgaard, B.; Malm, R.H. 10 Guiding Principles for Learning in the Laboratory. Chem. Educ. Res. Pract. 2024, 25, 383–402. [Google Scholar] [CrossRef]
  43. Agustian, H.Y. The Critical Role of Understanding Epistemic Practices in Science Teaching Using Wicked Problems. Sci. Educ. 2023. [Google Scholar] [CrossRef]
  44. Kelly, G.J.; Licona, P. Epistemic Practices and Science Education. In Science: Philosophy, History and Education; Springer Nature: Berlin, Germany, 2018; pp. 139–165. [Google Scholar]
  45. Braun, V.; Clarke, V. Using Thematic Analysis in Psychology. Qual. Res. Psychol. 2006, 3, 77–101. [Google Scholar] [CrossRef]
  46. Marton, F.; Pong, W.Y. On the Unit of Description in Phenomenography. High. Educ. Res. Dev. 2005, 24, 335–348. [Google Scholar] [CrossRef]
  47. Jiménez-Aleixandre, M.P.; Reigosa Castro, C.; Diaz de Bustamante, J. Discourse in the Laboratory: Quality in Argumentative and Epistemic Operations. In Science Education Research in the Knowledge-Based Society; Psillos, D., Kariotoglou, P., Tselfes, V., Hatzikraniotis, E., Fassoulopoulos, G., Kallery, M., Eds.; Springer: Dordrecht, The Netherlands, 2003; pp. 249–257. [Google Scholar]
  48. Van Rooy, W.S.; Chan, E. Multimodal Representations in Senior Biology Assessments: A Case Study of NSW Australia. Int. J. Sci. Math. Educ. 2017, 15, 1237–1256. [Google Scholar] [CrossRef]
  49. Agustian, H.Y. Students’ Learning Experience in the Chemistry Laboratory and Their Views of Science: In Defence of Pedagogical and Philosophical Validation of Undergraduate Chemistry Laboratory Education. Ph.D. Thesis, The University of Edinburgh, Edinburgh, UK, 2020. [Google Scholar]
  50. Galloway, K.R.; Bretz, S.L. Development of an Assessment Tool to Measure Students’ Meaningful Learning in the Undergraduate Chemistry Laboratory. J. Chem. Educ. 2015, 92, 1149–1158. [Google Scholar] [CrossRef]
  51. McCann, E.J.; Turner, J.E. Increasing Student Learning through Volitional Control. Teach. Coll. Rec. 2004, 106, 1695–1714. [Google Scholar] [CrossRef]
  52. Pintrich, P.R.; Smith, D.A.F.; Garcia, T.; McKeachie, W.J. Reliability and Predictive Validity of the Motivated Strategies for Learning Questionnaire (MSLQ). Educ. Psychol. Meas. 1993, 53, 801–813. [Google Scholar] [CrossRef]
  53. Yoon, H.; Woo, A.J.; Treagust, D.; Chandrasegaran, A.L. The Efficacy of Problem-Based Learning in an Analytical Laboratory Course for Pre-Service Chemistry Teachers. Int. J. Sci. Educ. 2014, 36, 79–102. [Google Scholar] [CrossRef]
  54. Cai, Z.; Siebert-Evenstone, A.; Eagan, B.; Shaffer, D.W.; Hu, X.; Graesser, A.C. nCoder+: A Semantic Tool for Improving Recall of nCoder Coding. In Communications in Computer and Information Science; Springer: Berlin/Heidelberg, Germany, 2019; Volume 1112, pp. 41–54. [Google Scholar]
  55. Shaffer, D.W.; Collier, W.; Ruis, A.R. A Tutorial on Epistemic Network Analysis: Analyzing the Structure of Connections in Cognitive, Social, and Interaction Data. J. Learn. Anal. 2016, 3, 9–45. [Google Scholar] [CrossRef]
  56. Green, J.L.; Dixon, C.N. Exploring Differences in Perspectives on Microanalysis of Classroom Discourse: Contributions and Concerns. Appl. Linguist. 2002, 23, 393–406. [Google Scholar] [CrossRef]
  57. Agustian, H.Y.; Pedersen, M.I.; Finne, L.T.; Jørgensen, J.T.; Nielsen, J.A.; Gammelgaard, B. Danish University Faculty Perspectives on Student Learning Outcomes in the Teaching Laboratories of a Pharmaceutical Sciences Education. J. Chem. Educ. 2022, 99, 3633–3643. [Google Scholar] [CrossRef]
  58. Hofstein, A.; Kind, P.M. Learning in and from Science Laboratories. In Second International Handbook of Science Education; Fraser, B.J., Tobin, K.G., McRobbie, C.J., Eds.; Springer Science & Business Media: Dordrecht, The Netherlands, 2012. [Google Scholar]
  59. Johnstone, A.H. Chemical Education Research in Glasgow in Perspective. Chem. Educ. Res. Pract. 2006, 7, 49–63. [Google Scholar] [CrossRef]
  60. Finne, L.T.; Gammelgaard, B.; Christiansen, F.V. When the Lab Work Disappears: Students’ Perception of Laboratory Teaching for Quality Learning. J. Chem. Educ. 2021, 99, 1766–1774. [Google Scholar] [CrossRef]
  61. Domin, D.S. A Review of Laboratory Instruction Styles. J. Chem. Educ. 1999, 76, 543. [Google Scholar] [CrossRef]
  62. Pabuccu, A.; Erduran, S. Investigating Students’ Engagement in Epistemic and Narrative Practices of Chemistry in the Context of a Story on Gas Behavior. Chem. Educ. Res. Pract. 2016, 17, 523–531. [Google Scholar] [CrossRef]
  63. Davidson, S.G.; Jaber, L.Z.; Southerland, S.A. Emotions in the Doing of Science: Exploring Epistemic Affect in Elementary Teachers’ Science Research Experiences. Sci. Educ. 2020, 104, 1008–1040. [Google Scholar] [CrossRef]
  64. Ton, G. The Mixing of Methods: A Three-Step Process for Improving Rigour in Impact Evaluations. Evaluation 2012, 18, 5–25. [Google Scholar] [CrossRef]
  65. Lederman, N.G.; Zeidler, D.L.; Lederman, J.S. Handbook of Research on Science Education: Volume III, 1st ed.; Routledge: New York, NY, USA, 2023. [Google Scholar]
  66. Johnson, R.B.; Onwuegbuzie, A.J. Mixed Methods Research: A Research Paradigm Whose Time Has Come. Educ. Res. 2004, 33, 14–26. [Google Scholar] [CrossRef]
Figure 1. Number of publications per year on laboratory learning outcomes in university chemistry education up to mid-2019 (n = 355) [1].
Figure 2. The project structure.
Figure 3. Learning outcomes of laboratory instruction [57].
Figure 4. Method to quantify US pharmacopeia tailing factor.
Figure 5. Structure of connections between discourse archetypes in the laboratory.
Table 1. A stanza on multimodal conceptual discourse.
| Speakers | Utterances | Non-Verbal Cues and Action |
|----------|------------|----------------------------|
| Eliana | And then there is the tailing factor? | Eliana invokes a concept specific to chromatography. |
| Felix | The tailing factor… | Felix intends to type, but discusses the concept instead. |
| Eliana | We have… | Eliana looks at Felix. |
| Felix | This is the extent to which it is loop-sided. | Felix looks at Eliana, explains the concept using hand gestures. |
| Felix | i.e. one of them runs down, and the other it is like that where there is a tail, is it not like that? | Felix looks at Eliana, explains the concept using hand gestures. |
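As context for the students' exchange above and for Figure 4, the quantity they are discussing has a standard definition: the USP tailing factor relates the full peak width at 5% of peak height to the distance from the peak's leading edge to its apex at that same height. A minimal sketch of the calculation, assuming the standard USP formula (function and variable names are illustrative, not from the project's materials):

```python
def usp_tailing_factor(w_005: float, f: float) -> float:
    """USP tailing factor T = W0.05 / (2 * f).

    w_005: full peak width measured at 5% of peak height.
    f: distance from the peak's leading edge to the peak apex,
       measured at the same 5% height, in the same units as w_005.
    """
    if w_005 <= 0 or f <= 0:
        raise ValueError("peak measurements must be positive")
    return w_005 / (2 * f)


# A perfectly symmetric peak gives T = 1.0; T > 1 indicates tailing,
# i.e., the "lopsided" shape the students describe.
symmetric = usp_tailing_factor(w_005=1.0, f=0.5)  # -> 1.0
tailing = usp_tailing_factor(w_005=1.2, f=0.4)    # -> 1.5
print(symmetric, tailing)
```

This is the quantification the laboratory exercise asks students to perform in software; the snippet only illustrates the underlying ratio.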
