Article

The Analytical Gaze of Operators and Facilitators in Healthcare Simulations: Technologies, Agency and the Evolution of Instructional Expertise

by Astrid Camilla Wiig 1,* and Roger Säljö 2

1 Faculty of Humanities, Sports and Educational Science, Department of Educational Science, University of South-Eastern Norway, 3199 Borre, Norway
2 Faculty of Education, Department of Education, Communications and Learning, University of Gothenburg, 405 30 Göteborg, Sweden
* Author to whom correspondence should be addressed.
Educ. Sci. 2025, 15(3), 347; https://doi.org/10.3390/educsci15030347
Submission received: 10 December 2024 / Revised: 4 March 2025 / Accepted: 6 March 2025 / Published: 11 March 2025

Abstract

This article analyses the coordination between professionals, students and technology in the communication and appropriation of know-how in healthcare simulations. To be successful, simulations require continuous interventions by professionals (in this case, operators and facilitators), who analyse, assess and reflect on the actions participants take as the simulation evolves. This study builds on interaction analysis of 30 video-recorded conversations (15 h in total) between operators and facilitators in post-simulation discussions of outcomes. The specific focus of the analysis is the nature of the work done by operators/facilitators as they analyse and evaluate simulations. The results show the multilayered nature of these analyses. The operators and facilitators display three prominent types of consideration. They (a) calibrate what they have observed, (b) monitor the progress of the scenario as an instructional event, and (c) comment on their own contributions as instructors/participants. All these considerations have evaluative elements, and the agentic nature of technologies, students and professionals is addressed. One general observation of interest is the way in which simulations provide access to student learning, and how these activities become accessible for professional scrutiny and judgement.

1. Introduction

During the past 100 years, simulators have come to play an increasingly important role in learning and instruction in professional training. Beginning in fields such as aviation and healthcare, simulators have spread to diverse settings, including vocational training (Braunstein et al., 2022), business education (Beranič & Heričko, 2022) and the training of mariners (Sellberg et al., 2022). In recent years, simulations have also made their way into regular educational systems through the use of virtual labs, virtual microscopes and similar resources (Lantz-Andersson et al., 2019).
In the present study, we examine the work of operators and facilitators when orchestrating learning and instruction in the context of full-scale simulators in healthcare training. Such environments are often well resourced and have access to qualified technical and healthcare staff with extensive and varied experiences of simulations. Successful execution of simulations requires continuous interventions by professionals, who monitor and evaluate what actions participants take as the simulation evolves (Wiig & Säljö, 2024). Our empirical focus is on the roles that operators and facilitators play in promoting meaningful and challenging learning opportunities in critical exercises in healthcare. More specifically, the analytical focus is on the joint agentic elements of participants and technologies in the co-creation, monitoring and development of a dynamic space for learning healthcare practices.
In research on healthcare simulations, the role of operators and facilitators is considered essential (see, e.g., Dieckmann et al., 2008; Husebø et al., 2024; Nordenström et al., 2023; Sawyer et al., 2016). In the literature, the terminology used for defining the roles and responsibilities of those who run simulations varies (cf. The Healthcare Simulation Standards of Best Practice™ (HSSOBP™) Operations, Decker et al., 2021). Usually, there is a division of labour in simulations, which implies that the operator is primarily responsible for “running the computer” (Seropian, 2003, p. 1698), while the facilitator is responsible for the exercises and for the instructional relevance of the debriefing. In practice, however, the roles are often not as clear-cut as this. A simulation is a dynamically evolving process, in which technology, pedagogy and healthcare work are intertwined and difficult to separate. This means that operators will often, in addition to their responsibility for the technology, contribute by playing different roles in the scenario (doctor, patient, relatives), while facilitators will intervene in the use of technologies by supplementing decisions on how to use them with input relevant to healthcare.
While simulator research describes the simulation process as a method most often organized as Briefing—Exercise—Debriefing, our interest lies in the contributions of operators and facilitators, often thought of as the “backstage” work of professional evaluation and analysis in the learning environment. In addition to the three stages conventionally focused on, there is a critical part (Phase 3 in Figure 1), during which teams of operators and facilitators analyse the simulation that has just taken place and prepare for the debrief with the students. The work they do here is both retrospective and prospective: they analyse and evaluate the simulation and, among other things, decide whether student performance lives up to expected standards.
Our research builds on the assumption that the teamwork of operators and facilitators, as well as the activities they engage in during simulations, should not be perceived as passive backstage activities of little relevance for the success of simulations as professionally relevant contexts for learning. On the contrary, their analytical gaze and their contributions to building and enacting know-how during simulation activities are decisive for the outcomes and for the learning experiences of students during all phases. An interesting feature of simulations is precisely that they provide rich opportunities for following a range of significant learner activities, including how learners physically engage in healthcare practices, how they document their conceptual understanding when diagnosing patients, and how they communicate with patients and amongst themselves. Thus, operators and facilitators are in a position where they can both follow the delivery of care in detail in real time and consider the solutions students suggest and the difficulties they run into. Consequently, over time, teams of operators and facilitators develop first-hand insights into how to guide and evaluate student activities during simulations.
Thus, this study aims to use video documentation of the concrete work of operators and facilitators during ongoing simulation training sessions to analyse the nature of their work and their contributions to the progress of simulations as environments for learning (Phase 3, Figure 1). Put differently, the novelty of this article is our focus on the analytical gaze that operators and facilitators develop as a team in order to support learning and the evolution of the simulation exercises. By examining the activities of these actors, our contribution to the field of educational sciences is that we can begin to better understand how the process of enacting professional know-how in healthcare is conducted and evaluated throughout the entire practice of simulations, and not only during debriefings. Analytically, we can investigate the agency that the different elements of the activity, i.e., students, instructors (operators/facilitators) and technologies, exert in the co-construction and development of learning experiences.
In addition, one interesting element of simulations as contexts for professional learning that should not be overlooked is that they increase our capacities for understanding the details of how students/learners appropriate and exercise their know-how in a performative sense. What students do and say in situ provides clues to their know-how in response to challenges in healthcare practices (Wiig & Säljö, 2024). Thus, from the perspective of evaluating learning, and in comparison to either paper-and-pencil tests or oral testing, simulators offer an environment for studying how participants enact their know-how in challenging situations, and this enactment can be documented, analysed and assessed in real time as well as retrospectively through video documentation (Chue et al., 2022; Sellberg et al., 2022). By analysing simulations, operators and facilitators (and researchers) gain access to the conceptual (explanations, diagnostic reasoning, etc.), physical (use of equipment, handling of patients, etc.) and collaborative actions that students engage in as they handle the challenges of the exercise.
Another significant feature of this mode of organizing learning and instruction is that it relies on intimate continuous coordination between human agents and technologies, digital as well as physical. The assumption of this analysis is that the technologies exert agency in the sense that they build on the use of representations (documentation) and physical artifacts (manikins, medical equipment for interventions, etc.) that guide the work performed. These resources have a long history and have been designed on the basis of institutional experiences of providing care. Through their design and use, these resources contribute to the progression of the simulation and to the agency of all those involved. From a sociomaterial perspective, the agentic elements of technologies in social practices must be recognized as essential for understanding the nature of professional practices and for learning such practices (Fenwick, 2014; Hawley, 2021; Hutchins, 1995; Mäkitalo, 2016).
In the following, three questions will be addressed: how do operators and facilitators (a) describe, analyse and assess features of the students’ delivery of care, (b) monitor the progress of the scenario, including the function of the technology in these activities, and (c) comment on their own contributions during simulations? Thus, our focus is on the gaze of operators and facilitators as professional teams during simulations, and on the ways in which they build knowledge about simulations and decide how these can be made more relevant as learning events. This multilayered nature of professional expertise implies that both the operators/facilitators (humans) and the technological resources serve as agents that co-determine the outcomes of simulations as contexts for learning.

2. Simulations, Learning and Feedback

2.1. Conceptual and Theoretical Background

The theoretical background of the research to be reported here is in a sociocultural and sociomaterial perspective, emphasizing the role of cultural tools/instruments—material and symbolic—in learning and development (Vygotsky, 1978). Cultural tools serve as “mediational means” (Wertsch, 2007) in human activities, and they play a crucial role in accumulating know-how and in building up a “cultural memory” (Donald, 2018) of solutions to problems previously encountered. By appropriating cultural tools, which may be intellectual (linguistic, conceptual, models), material (stethoscopes, reflex hammers) and/or both (heart rate monitors, digital surveillance instruments), novices appropriate professional know-how by means of which they will be able to exercise professional duties within an institution (Vygotsky, 1981). Simulations thus may be understood as resources for exposing novices to challenging situations mediated through technologies, and in which they should be able to appropriate know-how and learn to use the tools (physical/conceptual/procedural) of their profession.
In our analyses, we emphasize the value of attending to the participants’ perspective on how they engage in simulations. Thus, our aim is to unpack (see below) what operators and facilitators do as they analyse and intervene in simulated environments when scaffolding and evaluating student work, and when attempting to improve simulations and increase their relevance as a context for learning healthcare know-how.
Simulations, when well designed and realistic, represent an important mode of learning and documenting professional skills, as well as the evolution of such skills in situ. Such pedagogical activities are generally organized in a structure which includes (a) a written scenario, (b) an introduction to students of the agenda and learning goals (briefing), (c) an exercise (with, in our case, operators/facilitators present), and (d) a debriefing session. The latter is central from a learning point of view; it is “the heart and soul of simulator training”, as Rall et al. (2000, p. 517) put it. During debriefings, participants can share experiences of the session, and they can reflect on and learn from their own actions/mistakes as well as from those of their fellow students as these are discussed; video documentation may also be present during such sessions (Gormley & Fenwick, 2016). But, in our view, it is obvious that operators and facilitators also learn and develop professional expertise as they engage in simulations. Phase 3 (Figure 1) has both retrospective (analysing the simulation in question) and proactive (preparing for the debrief and for future improvements of the simulation) elements and is a central site of learning.

2.2. Simulations as Contexts for Learning in Healthcare: Contributions by Operators and Facilitators

There are few studies exploring how facilitators and operators contribute to simulations as teams and how their interventions guide student learning in this technology-saturated environment. While a growing number of studies investigate learning outcomes and debriefing effectiveness, as well as providing recommendations for how to conduct effective debriefings using quantitative post hoc methodologies, relatively little is known about the details of the so-called “backstage” teamwork of operators and facilitators during simulations (Wiig & Säljö, 2024). Most studies that include observations of how these actors contribute to simulations concern the debriefing phase (Phase 4) only.
In the simulation literature, there are meta-analytical studies mentioning the facilitator role during debriefings as a variable generating learning effects. Keiser and Arthur (2021), for instance, discuss how facilitation approaches contribute to the effectiveness of the debriefing in teams versus at the individual level, concluding that “the most effective combinations being the self-led facilitation approach coupled with a team-aligned AAR [after action review], and the self-led approach coupled with objective media” (p. 1007). However, these conclusions are rather abstract, and the authors do not go into details about how and in what ways operators and facilitators contribute in a functional sense. As Lymer and Sjöblom (2024) point out in their review of debriefings in various professions, “the precise nature of instructor behaviors that may influence learning outcomes is little understood” (p. 5).
The first comprehensive review of research that pays attention to the facilitator role was conducted by Tannenbaum and Cerasoli (2013). The authors, who focus on the debriefing phase, analysed 46 studies across professions where simulations are widely used, such as aviation, healthcare, the military and organizational work. When examining the factors that contribute to the effectiveness of debriefing, Tannenbaum and Cerasoli argue that facilitation, structure and multimedia are widely thought to improve the quality of debriefs: “we found that facilitated debriefs (d = 0.75, or 27%) were about 3 times as effective as nonfacilitated debriefs (d = 0.25, or 10%)” (p. 239). In the conclusion, they argue that “it appears that facilitation helps, but the sample size for unfacilitated debriefs was too small to fully remove ambiguity” (p. 240). Consequently, the focus on best practices and guidelines for practitioners points to a predominantly normative knowledge interest in the debriefing setting in these studies. This implies a lack of attention to the details of how and why facilitators and operators intervene, and how they use and develop their professional expertise as support for learning as simulations unfold. The work of these expert teams, which is essential for the quality of simulations as learning events, is “black-boxed” (Latour, 1999, p. 304) rather than unpacked in such research.
Focusing on research structured around the role of interactions in simulations, Lymer and Sjöblom (2024) reviewed studies across aviation, healthcare and maritime education which make use of video-based methodology and document interactions as they occur in simulations. In the studies reviewed, facilitator guidance was highlighted as a central part of simulation practices, and the authors also point to the variety of ways in which guidance is exercised. The conclusions show the significant role of facilitator contributions to debriefings. This includes issues of how facilitator questions guide and steer the students’ perceptions of their own performance during the post-exercise reflections (Johansson et al., 2017), how facilitators use storytelling about lived experiences in debriefing settings to relate simulator experiences to the realities of work in maritime education (Sellberg & Wiig, 2020) and, finally, how facilitator guidance focuses on embodied actions at a detailed level in aviation post-simulation debriefing (Roth, 2015). We argue that it is important to explore the teamwork of operator and facilitator activities during the simulation and during all phases (Figure 1), i.e., Brief—Exercise—Professional analysis—Debriefing, in order to understand how simulations unfold and support student learning.
With respect to the operator role, most studies in healthcare research mention the operator merely as a technical expert running the simulator, while the few studies that include these experts in their analysis focus mainly on the activities of the facilitator. The few studies which engage with the instructional significance of operator contributions point to how student performance during a simulated scenario is linked to the operator’s pedagogical competence and technical expertise. At the same time, they highlight the need for more research on the operator role as a critical factor for the success of simulations (see, e.g., Waxman et al., 2019; Tamilselvan et al., 2023). Researchers also call for additional studies on effects, best practices and evidence-based guidelines to develop effective operator tools conducive to supporting learning (Gantt, 2012). For instance, Reierson et al. (2024) explored nursing students’ perceptions of the operator’s interpretation of the patient role during simulations and underlined that students had a limited understanding of the operator’s pedagogical role.
In sum, the research reviewed above demonstrates differences in facilitators’ and operators’ roles in simulations and shows that there is a need for in-depth knowledge of the contributions of these expert roles and their teamwork during simulator-based instruction. An interesting observation is that, to the best of our knowledge, no studies investigate the professional conversations and collaborations between teams of operators and facilitators during nurse care simulation exercises. The novelty of this study thus lies in the assumption that qualitative studies complement quantitative studies by analysing and detailing the teamwork of operators and facilitators as they reflect on the progress of the simulation and its instructional relevance. This study shows that, during this process, operators and facilitators fine-tune their contributions to scenarios to support student learning while interacting with the manikin and the other technological artifacts present in simulator laboratories. The contribution of this study is to provide contextualized descriptions of the concrete work done by teams of operators and facilitators during ongoing pedagogical processes documented through video recordings. Thus, we seek to unpack the work done and its role in the unfolding of the simulation as a learning event.

3. Method and Data

The design of this study is inspired by interactional ethnography (Skukauskaite & Green, 2023), using video to document professional practices and discourses in healthcare. For this study, we observed two operator rooms in an advanced full-scale Nordic simulator laboratory to capture how the participants, operators (n = 6) and facilitators (n = 10), articulated their expertise as part of the simulator instruction. The data were generated in a compulsory simulation course in Basic Nursing. The recordings were made before the students’ first clinical placement in medical units at hospitals, i.e., in terms of clinical experience, the students were novices. Students were divided into groups of 6–10 participants, and they engaged in a total of six different scenarios over a two-day period. The scenarios focused on patients with deteriorating health conditions, such as hypoglycaemia, cardiac arrest, postoperative bleeding and angina pectoris. In each scenario, two to three students acted as nurses, while the other students in the group acted as family members or observers. The learning outcomes included assessing how students prioritized relevant nursing actions for clinical assessment, leadership and decision-making. They were required to implement care procedures such as the airway, breathing, circulation, disability and exposure (ABCDE) approach to determine the basic physiological values of a patient in pain, to exercise clear leadership by using closed-loop communication, and to manage the healthcare intervention using ISBAR (Identification, Situation, Background, Assessment and Recommendation) to ensure quality and patient safety. Thus, what they were intended to learn are the mediational means used by professionals in specific care situations. The data included below in this article are from two scenarios, Heart Stop and Postoperative Bleeding.
The simulation follows the pattern described in Figure 1. Each simulation lasted about 2 h and started with a briefing session in which the facilitator introduced the patient case, the equipment needed for the exercise and the learning goals. The simulated scenario lasted 15 min. After watching the video of their own performance (15 min), the student group engaged in a collective debriefing that lasted 50 min. This session was led by the facilitator following the PEARLS script.1
The data for this study capture the reflections and analyses by operators and facilitators (Phase 3) as they prepared for the debriefing (Phase 4). The object of inquiry of our research is the activities of these professional teams as they reflected on what they saw during Phase 2. The recordings were made in the control room and lasted approximately 15–30 min each.
In total, 30 conversations, comprising approximately 15 h of video recordings, were generated. In addition, observation logs, photos and other artifacts, such as assignments and students’ observation schemes, were collected. One camera with a wide-angle lens was placed in the corner of the operator’s control room to capture the digital equipment, such as screens and touch panels for steering the manikins, and the interactions between the operators and facilitators during their conversations. A separate microphone at the desk secured high-quality sound recordings. See Figure 2.
It should also be mentioned that each simulator suite is equipped with three wide-angle video cameras and one separate microphone that document student work. One of the cameras captured the whole room, another camera was placed over the hospital bed to capture the detailed interactions between manikins and students, and one wide-angle camera was turned towards the observing students. The operators and facilitators used the 360-degree camera installed in the simulator suites to follow student activities. These recordings were the video documentation that operators and facilitators accessed as they analysed the simulator sessions. We should also point out that the operators and facilitators in this centre had professional backgrounds as nurses and extensive experience of care and simulations.
The project was approved by the National Department of Ethics. All participants volunteered and signed an informed consent form. All personal information has been anonymized.

4. Results

4.1. Analytical Procedures

The material has been analysed through the lens of a sociocultural conceptual framework and by using interaction analysis (Jordan & Henderson, 1995). For this analysis, all the conversations among operators and facilitators were transcribed and subjected to collaborative analysis. Data were analysed in several data sessions involving the authors, as well as in data sessions involving a more comprehensive research group comprising a network of researchers from different disciplines working with interaction analysis of video-recorded data (cf. Heath et al., 2010). Preliminary findings were also presented at the 2024 annual meeting of the American Educational Research Association, where we received valuable recommendations to further elaborate and fine-tune the analyses (Wiig & Säljö, 2024). The three episodes selected for this article were transcribed at an intermediate level to allow researchers in professional learning and instruction environments in healthcare to follow the conversations and activities (Linell, 2011).
The video recordings, which documented the post-simulation analyses by operators and facilitators, were analysed, and three particularly interesting areas of participant concern emerged. These areas imply that the teams of operators and facilitators were:
(a)
calibrating observations and interpretations made during the simulation exercises.
Here, the operators and facilitators describe and analyse central features of the students’ delivery of care in order to calibrate their individual observations and interpretations. The work done here serves to reach an agreement on what has evolved during the simulation and to decide what needs to be communicated to students during the debrief. Assessing student performance and what has been demonstrated in terms of professional know-how are central features of this work.
(b)
monitoring the progress of the scenario as an instructional event.
This involves considering (1) the use of the technological tools available during the simulation exercise, (2) attending to student activities and (3) the work by the operator in the operator room when monitoring and guiding the ongoing simulation. Evaluative statements are part of this activity as well.
(c)
commenting on their own contributions.
Operators discuss their experiences of acting as patients, doctors, etc., during the simulation exercise. This discussion includes the normative ambitions of operators and facilitators to improve the simulations as contexts for learning and how well they succeed in supporting student learning.
The nature of these three types of professional analyses during Phase 3 will now be described in some detail.

4.2. Calibrating Observations and Interpretations Made During the Simulation Exercises

Delivery of care is a complex undertaking, and students engage in analytical, communicative and physical activities that operators and facilitators have to analyse and evaluate. The following excerpts have been collected from conversations between two experienced participants, one operator and one facilitator, during the Heart Stop Scenario, a common simulation exercise. This is a stressful exercise for the students, in which they have to rescue the manikin’s life, and the operator and facilitator follow student work and look at the digital monitors providing data on the medical condition of the manikin/patient. One of the central issues of providing care in this emergency situation is the depth and intensity of the pressure exerted on the manikin’s chest. The operator can monitor the frequency (1–100%) and depth (1–10 centimetres) of correct heart and lung compressions through digital monitors of vital measures while simultaneously following the video documentation of student performance. On the monitors, correct procedures with respect to pace and depth of compressions are displayed in green, and incorrect ones in red. In Excerpt 1, we see how students struggle during this exercise, and we also see the detailed observations that the operator and facilitator make as they review student performance (see Figure 3).
  • Excerpt 1. Heart Stop
[Transcript image: Excerpt 1, Heart Stop]
The nursing students struggled in different ways to give the manikin correct heart and lung treatment. In the first line, the operator says, it will be interesting to see if they say something about it themselves, thus anticipating the upcoming debrief session and wondering whether students are aware of their difficulties in performing CPR (cardiopulmonary resuscitation) correctly. According to the operator’s in situ observations of the monitor for vital measures, Stephen was maybe 3 max 4 cm, or only halfway down, but the frequency was ok (lines 2 and 3; see Figure 4).
The facilitator asks two questions to address concerns based on her observations during the exercise: Was the frequency satisfactory? (line 4) and, more precisely, was he a bit slow? (line 6). Noticing differences, the operator confirms her observation that Stephen was a little slow but underlines that the compressions scored almost 100 percent in depth, as was shown by the monitor (line 7). Somewhat surprised, the facilitator replies Oh really! At that level! (line 8), and she makes notes for the upcoming debriefing. The operator further explicates, by referring to the monitor: he was right on the border between 100, uhm on the green field and down towards what was the margins (lines 10 and 11). The analysis continues in lines 12–14, when they highlight how the other students act by saying, and Sarah? She wasn’t deep enough either, was she? No, none of them. These utterances demonstrate the ways in which the operator and the facilitator jointly negotiate their interpretations of the observations made. Together they calibrate and fill in missing information in their analyses of what is going on, and they have access to the technical mediation of the activities from the medical monitors. Thus, the situated calibration is tightly connected with their professional experience in nursing, their professional vision (Goodwin, 1994), and the simultaneous consultation of technical devices. In this sense, the assemblage of operator/facilitator and the technologies exerts agency in the situation, resulting in a detailed analysis of the progression of the provision of healthcare.

4.3. Monitoring the Progress of the Scenario as an Instructional Event

A simulation is a dynamic event. Even though there is a scenario based on a well-considered script, the activities of students are decisive for the progress and the relevance of the exercise. In response to the students’ initiatives and performance, operators and facilitators have to be creative to keep the exercise on track towards the intended learning goals. Excerpt 2 is from a conversation between a trainee operator (OPT), an experienced operator, and the facilitator in a scenario referred to as Postoperative Bleeding. The learning goals include engaging in systematic patient observation (ABCDE-procedure) to assess a deteriorating/critically ill patient with a bleeding wound, and to keep the patient alive.
The operators and the facilitator monitor the progression of the scenario as an instructional event and jointly reflect on its instructional relevance for the students. They discuss how to trigger students to perform expected procedures, such as checking blood pressure, in a post-operative scenario setting. They are well acquainted with the design of the scenario and the learning goals central to the exercise, but they are also continuously responsive to the nature and quality of the activities that the students engage in. In the following excerpt, the team discusses how they could manipulate the technical features of the manikin with a bleeding wound and intervene through triggers to support students who showed a lack of understanding of what to do in a critical postoperative care situation (cf. Figure 5).
  • Excerpt 2: Postoperative Bleeding
[Transcript of Excerpt 2 reproduced as an image in the original article.]
In this excerpt, the operators and the facilitator were concerned, even perplexed, that the students did not seem to notice the significance of blood pressure in a postoperative bleeding setting. Monitoring the students’ obvious lack of knowledge regarding the importance of examining the manikin’s circulation (to check for possible blood loss), they jointly reflect on why the students failed to attend to this. One explanation is a lack of knowledge on the part of the students; the alternative they consider is whether they should have manipulated the medical digital technologies to provide students with sharper clues to check vital measurements during the exercise. Thus, they turn their attention to the interrelationship between the simulation and their own contributions. The operator trainee calibrates her observation with the team: you adjusted the pulse down to emphasize the thing about painkillers (lines 2 and 4). The operator confirms, and the trainee explicates her reasoning by pointing to the monitor while saying then I thought, if one were to somehow highlight the issue of post-operative bleeding so maybe the heart rate could have been high and the blood pressure low to follow that track a bit more (lines 5, 7, and 9). Suggesting how to highlight the instructional goal of the scenario, the team members discuss how to fine-tune the technical indicators to steer students into understanding how to perform the correct procedures by checking the circulation of a bleeding patient. They realize that the students did not understand the point of the simulation script, underscoring that we can make slightly more visible triggers on that (line 12). Consequently, the professional team highlighted that, if the operator provides more obvious technical triggers on vital measures, the facilitator can bring them up in the post-simulation reflections; this would be something to elaborate on during the debriefing (line 13) in order to reach the learning goals.
Commenting on the fact that students do not follow the to-do list of the ABCDE-procedure, the operator admitted that the scenario and the instructional design do not properly support students’ performance because they simply do not relate to C (line 14), i.e., circulation.
The practical instructional problem for the operators and facilitator is how they can make the significant details of the scenario salient for the students through technical as well as human interventions. The suggestion they make, to provide a more relevant instructional event in the simulator suite, is to redesign the situation by (a) having the operator provide clearer triggers, and (b) changing the technical parameters visible on the monitors in the simulation suites. This kind of “backstage” work thus involves using experience and know-how from current simulations, the technical equipment and professional clinical expertise to improve the instructional relevance. In this sense, the team members recognize the problems of the agency of the various elements in the simulation. The scenario was not clear enough, and neither the technology nor their own interventions provided sufficient triggers. This is a comprehensive and very detailed analysis of the work carried out by these expert teams in an instructional situation and of its apparent failure to promote professional learning. The analysis by the experts, furthermore, considered the many levels involved (scenario, technology and human intervention).

4.4. Commenting on Their Own Contributions as Participants When Acting as Patients

The final transcript is also from the scenario Postoperative Bleeding, but a new student group was involved. During the scenario, the students noticed a bleeding wound, but they did not respond by performing the correct clinical procedures in the care situation (similar to what happened in Excerpt 2). By externalizing their observations, the operators and the facilitator shared their reflections on their own contributions acting as the patient in pain, as doctors, etc., during the simulation exercise, and they evaluated whether the students’ actions were satisfactory when responding to their prompts. We enter the conversation as the trainee operator, acting as the doctor in the scenario, shares her observations of her own contributions and reflects on how to improve the scenario and the instruction. She suggests allowing the doctor to use some additional prompts related to the bleeding wound earlier in the script/scenario to make it more obvious for students that they have to check the blood-soaked bandage and perform the expected clinical assessment in the care situation. This opens a normative evaluation of the instructional relevance of the entire scenario, discussing how to provide triggers, how to change their role-play and how to fine-tune their own work to improve the simulation as a context for learning (see Figure 6):
  • Excerpt 3. Postoperative Bleeding
[Transcript of Excerpt 3 reproduced as an image in the original article.]
Here, the operator trainee suggests that, in that doctor’s consultation, the operator, acting as the doctor, could have asked students if they should have examined the patient’s body a little earlier (line 1) during the simulation. The idea is to get students on to that track (line 3) of checking a surgical wound and start efficient treatment when blood soaked through the bandage, which was a central part of the scenario (line 5). The instructors realize that the students do not understand the point of the script/scenario/exercise; after all, they didn’t even think about it (line 7). The facilitator, observing the students during the exercise, confirms the evaluation and responds with a rhetorical question, they didn’t think of bleeding at all did they (line 9).
This led the team into discussions about their ambitions to improve the simulation as a context for learning these clinical skills. Consequently, they agreed that, to improve the scenario, we make a couple of such changes (line 10) to explore if we get some response on that (.) it will be interesting (line 11). They argue that moving back and forth to improve the simulation is part of the continuous professional process of fine-tuning it in relation to various student groups: that’s how learning is (line 12). Such a revision is also necessary in order to prepare students for situations they will encounter in their future working lives. Creating connections to a normal situation (line 15) in a hectic hospital intensive care department, the trainee operator suggests several revisions to the scenario to create realistic tasks and to trigger them a little more to intervene, since they should realize that they have to give some liquid, take another blood pressure and call the doctor a little sooner (line 15). The goal is to make the students aware of the urgency of this kind of situation. The facilitator agrees and underlines that, in the daily life of nurses, it is important to recognize that if they are concerned that there is bleeding (.) they [should] call [the doctor] promptly (line 16). Together, the team members agree on the necessity of making students realize the significance of immediate action in that type of situation. The professional agenda behind this argument is that the operators and facilitator are concerned that the students are acting as if nothing is very urgent (line 17) in the situation. The message they wanted to send is that, in situations of this kind, immediate action is expected.
The practical problem for the operators and the facilitator in this situation is similar to what we saw in Excerpt 2 above; however, here they are commenting on their own contributions when acting as doctor, patient, etc., and considering whether these interventions provided enough information for the students to act on. Thus, the transcript documents how the experts evaluate and calibrate their professional understanding of how their interventions co-determine several aspects of students’ performance. This concern matters in terms of how they, as operators/facilitators, initiate actions, notice variations in how individual students respond to them, and monitor the students’ actions to conclude whether they are too slow or inadequate. Thus, what they realize is that students’ performance of their know-how is, to some extent, relative to their own contributions during simulations.
The simulator technology allows a simulation to be continuously monitored by experts who watch how students engage in meaning making when confronted with challenges. When they realized that the students did not perform the core elements of the scenarios in the expected manner, they discussed possible interventions to remedy the problems preventing students from understanding the situation. Thus, the simulator team used their clinical experience and know-how from emergency hospital wards, the technological equipment available and their expertise in simulations to improve students’ understanding/awareness of the nature of the situation, i.e., they wanted to get the students on to that track (line 3). In this sense, the team members recognize the problem of the agency of their own contributions when they performed the roles of doctor, relatives and patient in pain. They reflect on the point that students might know the correct procedures of what to do, but the problem is the relevance, or lack thereof, of the operators’ interventions. Put differently, the members of the team are uncertain whether they appropriately triggered students’ understanding of how to make use of their experiences and possible know-how in the simulated situation.
Taken together, the empirical analyses show the significance of the role that operators and facilitators played in how the simulations unfolded. They were active at several levels: as evaluators of student performance, as analysts of the scenario and those responsible for the technology, and as evaluators of their own contributions to the learning event. The outcomes of this kind of work are part of what the simulation is all about as a learning environment.

5. Discussion

In this study, we have scrutinized the contributions by operators and facilitators as professional teams as they engage in and develop simulations. We have shown how operators and facilitators reflect on the progress of the scenario, its instructional relevance and their own contributions to further develop simulations to create even more powerful learning settings. The goal has been to show that the “backstage” team activities are significant instances of professional learning while simultaneously improving the simulations’ instructional value.
The operators and facilitators reflected upon how the simulation, including the scenario, technology, student work and their own contributions, functions as a coherent context for student learning of professional know-how. The problem they encountered was that the agentic elements of the scenario, the technology and their own contributions, all of which are intended to put the students on track, were not working as expected. One of the issues they struggled with was how the technologies and care situations they exposed students to sometimes seemed to lack clarity or transparency for the students. When attempting to understand the roots of these problems, two contrasting interpretations emerged among the operators and facilitators. One reflection made when the simulation did not unfold according to expectations was that the students did not understand what to do in the scenario, and that this could explain the difficulties observed. In that case, the conclusion was that the students were not familiar with the expected care procedures. The alternative interpretation was that the simulation technology was not transparent for the students; that is, they considered whether the problem lay in the simulation and their own contributions, rather than in student knowledge. Consequently, they considered the problems they observed from two perspectives: either the students’ knowledge was below expectations, or the simulation did not work as intended because of the way in which the care situations were rendered by the technology. These concerns are formulated as a series of reflections regarding the dilemma of how clear the technology must be when presenting challenges, what they have to do when they intervene, and what students have to know in order to be able to respond with relevant care procedures.
The ways in which the expert teams reflect on student learning illuminate the disparities in perception between a professional and a nursing student when faced with a specific care situation. For instance, a bloody and wet bandage serves as a stimulus that a professional, equipped with patient experience, clinical expertise, and know-how, would instinctively respond to in a predictable manner. However, a stimulus does not necessarily prompt one specific action; rather, the response and the resulting action are contingent upon what you are looking for and what you are able to see in the situation. Consequently, the operators and facilitators realized that the bloody bandage, although a potent stimulus for a professional, fails to trigger the anticipated responses from the novice students. These students de facto observed the bloody bandage but failed to grasp its significance within the scenario. In other words, their perception, understanding, reaction and enacted response differ markedly from those of experienced professionals when confronted with a situation of this kind. Through their analyses, the experts attempted to consider how to modify the situation and the technology in ways that could recalibrate how students perceived the situation. They sought to design the simulation in such a way that students saw the bloody bandage through a new lens as a sign of a severe situation that required an immediate response. This process underlines the profound transformative potential of simulations—reshaping not only what students know about healthcare in a general sense, but also how they will be able to see what they are facing through a professional lens. In other words, the concerns are how to improve students’ professional vision (Goodwin, 1994).
Operators and facilitators continuously discussed the effectiveness of scenarios and their relevance for professional learning, analysing student performance, their own contributions and the affordances of the technology in the learning situation. However, the agency of students in a situation relies on being trained to see and to understand the implications of what one sees. Therefore, the use of procedures and methodologies such as ISBAR, ABCDE or Closed Loop must be trained, not just as instrumental techniques, but as ways of seeing something from a professional perspective. When reflecting on how to make students relate to the C (Circulation) of the ABCDE-procedure (Excerpt 2), they wanted students to understand how and why using systematic ways of determining the basic physiological values of a patient in pain is vital and urgent. However, the simulation did not make students notice the significance of blood pressure in post-operative bleeding, and in response, the experts reflected on their contributions, contemplating various strategies of how to improve the scenario to get students on track. Thus, our findings suggest that professional judgement and learning accumulate as teams of operators and facilitators continuously attend to a range of issues, including clinical, pedagogical, technological and simulation-specific ones.

6. Conclusions

By analysing the gaze and the initiatives of operators and facilitators during simulations, our study makes an original contribution to the existing research on the “backstage work” of these teams. Our study provides a new understanding of the ways in which operators and facilitators build knowledge about simulations and how these can be made more relevant as learning events. The multilayered nature of professional expertise implies that both the operators/facilitators (humans) and the technological resources serve as agents that co-determine the outcome of the simulations. In line with research on interaction in simulator-based training, this study confirms that simulators offer an environment for studying how participants enact their know-how, and this enactment can be documented in its conceptual and material dimensions in ways which are not possible in most other settings.

7. Limitations

Our study has limitations that should be considered. First, the research was carried out in the “context of discovery” (Hanson, 1958), i.e., our knowledge interest was to unpack what occurs in simulations and how they are structured as learning experiences. Thus, our assumption was that what operators and facilitators contributed could not be considered “backstage activities”, as has been argued. Rather, the operators and facilitators exerted agency, and what they do, including the analyses they perform, was decisive for what students were able to learn. Second, we have chosen to focus our analysis on two scenarios in which the operators and facilitators work as teams, and do not explore the other four scenarios in this article. This is in line with the detailed level at which instruction and learning in simulations are analysed. Third, the simulator scenarios are regulated by certain structures and time frames that the participants might experience as restrictive. Fourth, we also acknowledge that our analysis is limited to two control rooms located within the formal institutional context of one higher education setting. Other professions and other institutions would provide additional information and other results. Therefore, we call for additional in-depth and longitudinal studies on how teams of operators and facilitators structure learning and instruction in other simulator-based professions and higher education institutions.

Author Contributions

Conceptualization, A.C.W. and R.S.; methodology, A.C.W. and R.S.; formal analysis, A.C.W. and R.S.; investigation, A.C.W.; resources, A.C.W.; data curation, A.C.W.; writing—original draft preparation, A.C.W. and R.S.; writing—review and editing, A.C.W. and R.S.; visualization, A.C.W.; supervision, not relevant; project administration, A.C.W.; funding acquisition, A.C.W. and collaborators in the PROSIM project. All authors have read and agreed to the published version of the manuscript.

Funding

The research was conducted in Norway as part of the project Professional Education and Simulation-Based Training (PROSIM, 2021–2025), funded by the Research Council of Norway, grant number 316212.

Institutional Review Board Statement

The project has been approved by the Norwegian Agency for Shared Services in Education and Research (SIKT).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data are unavailable due to privacy and ethical restrictions.

Acknowledgments

We thank the students, operators and facilitators who generously allowed us into their simulation suites to do this research. The views and opinions expressed in this manuscript are those of the authors, and do not necessarily represent the views or position of the participants.

Conflicts of Interest

The authors declare no conflicts of interest.

Note

1
The PEARLS debriefing framework (Eppich & Cheng, 2015) stands for Promoting Excellence And Reflective Learning in Simulation. The framework integrates three common educational strategies used during debriefing, namely, (1) learner self-assessment, (2) facilitating focused discussion, and (3) providing information in the form of directive feedback and/or teaching.

References

  1. Beranič, T., & Heričko, M. (2022). The impact of serious games in economic and business education: A case of ERP business simulation. Sustainability, 14(2), 683. [Google Scholar] [CrossRef]
  2. Braunstein, A., Deutscher, V., Seifried, J., Winther, E., & Rausch, A. (2022). A taxonomy of social embedding—A systematic review of virtual learning simulations in vocational and professional learning. Studies in Educational Evaluation, 72, 101098. [Google Scholar] [CrossRef]
  3. Chue, S., Säljö, R., Lee, Y. J., & Pang, E. L. W. (2022). Spills and thrills: Internship challenges for learning in epistemic spaces. Journal of Education and Work, 35(2), 167–180. [Google Scholar] [CrossRef]
  4. Decker, S., Alinier, G., Crawford, S. B., Gordon, R. M., Jenkins, D., & Wilson, C. (2021). Healthcare simulation standards of best practiceTM. The debriefing process. Clinical Simulation in Nursing, 58, 27–32. [Google Scholar] [CrossRef]
  5. Dieckmann, P., Reddersen, S., Zieger, J., & Rall, M. (2008). A structure for video-assisted debriefs in simulator-based training of crisis resource management. In R. Kyle, & B. W. Murray (Eds.), Clinical simulation: Operations, engineering, and management (pp. 667–676). Academic Press. [Google Scholar]
  6. Donald, M. (2018). The evolutionary origins of human cultural memory. In B. Wagoner (Ed.), Handbook of culture and memory (pp. 19–40). Oxford University Press. [Google Scholar]
  7. Eppich, W., & Cheng, A. (2015). Promoting excellence and reflective learning in simulation (PEARLS): Development and rationale for a blended approach to healthcare simulation debriefing. Simulation in Healthcare, 10(2), 106–115. [Google Scholar] [CrossRef]
  8. Fenwick, T. (2014). Sociomateriality in medical practice and learning: Attuning to what matters. Medical Education, 48(1), 44–52. [Google Scholar] [CrossRef]
  9. Gantt, L. (2012). Who’s driving? The role and training of the human patient simulation operator. CIN: Computers, Informatics, Nursing, 30(11), 579–586. [Google Scholar] [CrossRef]
  10. Goodwin, C. (1994). Professional vision. American Anthropologist, 96(3), 606–633. [Google Scholar] [CrossRef]
  11. Gormley, G. J., & Fenwick, T. (2016). Learning to manage complexity through simulation: Students’ challenges and possible strategies. Perspectives on Medical Education, 5(3), 138–146. [Google Scholar] [CrossRef]
  12. Hanson, N. R. (1958). Patterns of discovery. Cambridge University Press. [Google Scholar]
  13. Hawley, S. (2021). Doing sociomaterial studies: The circuit of agency. Learning, Media, and Technology, 47(4), 413–426. [Google Scholar] [CrossRef]
  14. Heath, C., Hindmarsh, J., & Luff, P. (2010). Video in qualitative research. SAGE Publications, Inc. [Google Scholar] [CrossRef]
  15. Husebø, S. E., Reierson, I. Å., Hansen, A., & Solli, H. (2024). Post-simulation debriefing as a stepping stone to self-reflection and increased awareness—A qualitative study. Advances in Simulation, 9(1), 33. [Google Scholar] [CrossRef] [PubMed]
  16. Hutchins, E. (1995). How a cockpit remembers its speed. Cognitive Science, 19, 246–288. [Google Scholar] [CrossRef]
  17. Johansson, E., Lindwall, O., & Rystedt, H. (2017). Experiences, appearances, and interprofessional training: The instructional use of video in post-simulation debriefings. International Journal of Computer-Supported Collaborative Learning, 12, 91–112. [Google Scholar] [CrossRef]
  18. Jordan, B., & Henderson, A. (1995). Interaction analysis: Foundations and practice. The Journal of the Learning Sciences, 4(1), 39–103. [Google Scholar] [CrossRef]
  19. Keiser, N. L., & Arthur, W., Jr. (2021). A meta-analysis of the effectiveness of the after-action review (or debrief) and factors that influence its effectiveness. Journal of Applied Psychology, 106(7), 1007–1032. [Google Scholar] [CrossRef]
  20. Lantz-Andersson, A., Fauville, G., Edstrand, E., & Säljö, R. (2019). Concepts, materiality and emerging cognitive habits: The case of calculating carbon footprints for understanding environmental impact. In Å. Mäkitalo, T. Nicewonger, & M. Elam (Eds.), Designs for experimentation and inquiry: Approaching learning and knowing in digital transformation (pp. 13–30). Routledge. [Google Scholar]
  21. Latour, B. (1999). Pandora’s hope: Essays on the reality of science studies. Harvard University Press. [Google Scholar]
  22. Linell, P. (2011). Samtalskulturer: Vol. 1 [Conversational cultures: Vol. 1]. In Studies in language and culture 18. Linköping University. [Google Scholar]
  23. Lymer, G., & Sjöblom, B. (2024). Interaction in post-simulation debriefing. Learning, Culture and Social Interaction, 48, 100855. [Google Scholar] [CrossRef]
  24. Mäkitalo, Å. (2016). Professional learning and the materiality of social practice. In T. Fenwick, M. Nerland, & K. Jensen (Eds.), Professional learning in changing contexts (pp. 39–78). Routledge. [Google Scholar]
  25. Nordenström, E., Lymer, G., & Lindwall, O. (2023). Socialization and accountability. Instructional responses to peer feedback in healthcare simulation debriefing. In S. Keel (Ed.), Medical and healthcare interactions. Members’ competence and socialization (pp. 239–258). Routledge. [Google Scholar] [CrossRef]
  26. Rall, M., Manser, T., & Howard, S. K. (2000). Key elements of debriefing for simulator training. European Journal of Anesthesiology, 17(8), 516–517. [Google Scholar] [CrossRef]
  27. Reierson, I. Å., Haukedal, T. A., Eikeland Husebø, S. I., & Solli, H. (2024). Nursing students’ perspectives on the operator portraying the patient in simulation. Teaching and Learning in Nursing, 19(3), 293–297. [Google Scholar] [CrossRef]
  28. Roth, W. M. (2015). Cultural practices and cognition in debriefing: The case of aviation. Journal of Cognitive Engineering and Decision Making, 9(3), 263–278. [Google Scholar] [CrossRef]
  29. Sawyer, T., Eppich, W., Brett-Fleegler, M., Grant, V., & Cheng, A. (2016). More than one way to debrief: A critical review of healthcare simulation debriefing methods. Simulation in Healthcare, 11(3), 209–217. [Google Scholar] [CrossRef]
  30. Sellberg, C., & Wiig, A. C. (2020). Telling stories from the sea: Facilitating professional learning in maritime post-simulation debriefings. Vocations and Learning, 13, 527–550. [Google Scholar] [CrossRef]
  31. Sellberg, C., Wiig, A. C., & Säljö, R. (2022). Mastering the artful practice of navigation: The situated endorsement of professional competence in post-simulation evaluations. Studies in Educational Evaluation, 72, 101111. [Google Scholar] [CrossRef]
  32. Seropian, M. A. (2003). General concepts in full scale simulation: Getting started. Anesthesia & Analgesia, 97(6), 1695–1705. [Google Scholar] [CrossRef]
  33. Skukauskaite, A., & Green, J. (2023). Ethnographic spaces of possibilities. Interactional ethnography in focus. In A. Skukauskaitê, & J. Green (Eds.), Interactional ethnography. Designing and conducting discourse-based ethnographic research. Routledge. [Google Scholar]
  34. Tamilselvan, C., Chua, S. M., Chew, H. S. J., & Devi, M. K. (2023). Experiences of simulation-based learning among undergraduate nursing students: A systematic review and meta-synthesis. Nurse Education Today, 121, 105711. [Google Scholar] [CrossRef]
  35. Tannenbaum, S. I., & Cerasoli, C. P. (2013). Do team and individual debriefs enhance performance? A meta-analysis. Human Factors, 55(1), 231–245. [Google Scholar] [CrossRef]
  36. Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Harvard University Press. [Google Scholar]
  37. Vygotsky, L. S. (1981). The instrumental method in psychology. In J. Wertsch (Ed.), The concept of activity in Soviet psychology (pp. 134–143). Sharpe. [Google Scholar]
  38. Waxman, K. T., Bowler, F., Forneris, S. G., Kardong-Edgren, S., & Rizzolo, M. A. (2019). Simulation as a nursing education disrupter. Nursing Administration Quarterly, 43(4), 300–305. [Google Scholar] [CrossRef]
  39. Wertsch, J. V. (2007). Mediation. In H. Daniels, M. Cole, & J. Wertsch (Eds.), The Cambridge companion to Vygotsky (pp. 178–192). Cambridge University Press. [Google Scholar]
  40. Wiig, A. C., & Säljö, R. (2024, April 11–14). Professional expertise as articulated through reflective commentaries by operators and facilitators during simulator-based learning activities. American Educational Research Association Annual Meeting 2024, Philadelphia, PA, USA. [Google Scholar]
Figure 1. Phases of simulator instruction. The focus of this article is the work carried out by operators and facilitators during Phase 3.
Figure 2. Illustration of the video- and audio recording of the operator room.
Figure 3. Illustration of operator and facilitator reflecting over observations from the Heart Stop Scenario.
Figure 4. Illustration of the monitor used by operators and facilitators.
Figure 5. Illustration of operators and facilitator reflecting over the Postoperative Bleeding Scenario.
Figure 6. Illustration of operator and facilitator reflecting over the documentation of Postoperative Bleeding Scenario.