Simulated Fieldwork: A Virtual Approach to Clinical Education

The purpose of this study was to investigate student satisfaction and perceived clinical reasoning and learning using a computer-based simulation platform that incorporates case-based learning principles. The simulation was used to replace a previously scheduled face-to-face clinical rotation that was cancelled due to COVID-19. A descriptive design was used to administer the Satisfaction with Simulation Experience Scale (SSES) to students (n = 27) following both a low-fidelity (paper cases) and a high-fidelity (Simucase™) simulation. A comparison of the SSES data following paper cases and simulation scenarios indicated statistically significant increases in Debrief and Reflection (p = 0.008) and Clinical Reasoning (p = 0.043), suggesting that students develop in-depth reflection, reasoning, and clinical abilities as they progress through their simulated experience.


Introduction
Due to the current landscape of our communities and educational system, faculty across the country are seeking out evidence-based, practical ways to meet the demands of the curriculum. COVID-19 has affected the education system at multiple levels and has had a significant impact on students' ability to meet requirements in professional programs, such as clinical education. The use of simulation and simulated environments is one strategy to respond to the lack of clinical training sites.

Simulation as a Model of Practice
Simulation can be described as a teaching-learning modality that replaces and strengthens real experiences with guided ones that evoke clinical reasoning and reproduce aspects of real scenarios using an interactive approach [1]. Simulation can be categorized as low or high fidelity, which refers to the "degree of realism associated with a particular simulation activity" [2] (p. 11). While the literature varies in how it categorizes fidelity, for the purposes of this model the authors define peer practice, paper-case studies, and role play as lower-fidelity simulations, whereas standardized patients, human patient simulators, and simulation labs that can mimic physiological responses fall closer to the high-fidelity range. Evidence indicates that simulation provides varied opportunities for learners to develop competence and confidence [3]. In addition, many clinicians, educators, and healthcare leaders believe simulation promotes patient safety and raises the quality of patient care [4]. The literature indicates that when simulation is combined with faculty engagement through debriefing, students achieve better outcomes [5,6]. Most debriefing models follow a pre-brief, scenario, and debrief structure. The pre-brief is often used by the instructor to consider the learner's prior experience and their own experience with debriefing; it is also important in establishing learning objectives and can provide context about the client or case, the experience, or key points to consider in advance of the scenario. Sawyer and colleagues [7] suggest a Gather, Analyze, and Summarize structure for post-simulation debriefing. In occupational therapy education, level I fieldwork (FW) is designed to include "participation in selected aspects" of occupational therapy practice [17] (p. 41). Current accreditation standards indicate that level I FW can be met in a variety of ways, including simulated environments, standardized patients, faculty practice, faculty-led visits, and supervision by a fieldwork educator in a practice setting [17].
A simulated environment is further defined in [17] as "a setting that provides an experience similar to a real-world setting in order to allow clients to practice specific occupations" (p. 54). The researchers believe that simulation can be an ideal way to deliver level I fieldwork experiences, as it provides students with the opportunity to repeatedly practice skills in a low-stakes environment. Although simulation education is an acceptable method to accomplish level I fieldwork, there is limited research in this area, particularly for occupational therapy.
This research study examined student perceptions in using Simucase™ to satisfy a one-week, level I fieldwork experience. Simucase™ is a computer-based simulation platform that provides students with experiences designed to teach complete processes using video recordings of client scenarios. Currently, this technology is available for the audiology, occupational therapy, physical therapy, and speech-language pathology professions. Differing from other virtual learning resources, Simucase™ offers a comprehensive platform including simulations, part-task trainers (short scenarios focusing on one skill), and an observation video library [19]. Another benefit of Simucase™ is that it is designed to measure student skills and enhance clinical competency. These skills include honing clinical observations, interviewing clients and families, collaborating with other disciplines, administering and interpreting assessments, designing intervention plans, and implementing interventions, all of which are conventional skill sets to be developed and built during level I FW. The simulation scoring provided within Simucase™ is based upon the strength of the student's clinical decision making. Virtual scenarios vary in age, diagnosis, and practice setting, which can expose students to more diverse experiences that may not be available or accessible during traditional in-person fieldwork. For example, virtual patient simulations include patients across the life span (ages 2-80 years old) and within various practice settings such as community-based practice, home health, early intervention, school systems, and more. Diagnoses of the virtual patients include orthopedic, neurological, developmental, and mental health conditions. Each simulation is authentic, created from an actual client case and submitted by a practicing clinician.
Using the Simucase™ platform, the researchers created a robust one-week curriculum to align with the traditional learning objectives within the academic program and best practice for simulation education. This included a structured debriefing to stimulate self-reflection and a rubric to evaluate student engagement and clinical performance. Ahead of the one-week simulation, students were provided an overview of the experience (see Appendix A) and were able to self-select particular simulations to engage with, which determined the smaller debriefing sessions. The researchers created structured learning activities to be completed by the students daily. Depending on the day, students were required to submit a deliverable ahead of their scheduled debrief using the educational institution's learning management platform, Blackboard™. Deliverables included a journal reflection, a written sample of documentation based upon a simulation, a recorded video clip in which the student demonstrated a relevant intervention, or the competency report provided by Simucase™ after activities such as the Part-Task Trainer were completed. Four faculty members facilitated the simulation fieldwork experience, which reduced the instructor-to-student ratio for debriefing sessions. Students met one to two times per day within their small groups via web conferencing to process their learning and engage in structured discussion around various aspects of each simulation. Faculty used the suggested debriefing prompts provided within the Simucase™ faculty platform to facilitate small group discussion and feedback sessions. Following each debrief session, faculty completed the rubric in Appendix B. The daily rubrics were averaged at the end of the week, with comprehensive summative feedback provided on overall engagement across the entire one-week experience.

Instruments, Data Collection, and Analysis
Following Institutional Review Board (IRB) approval, data were collected through the Qualtrics platform, using the Satisfaction with Simulation Experience Scale (SSES) [20] and a final student evaluation and self-reflection questionnaire to evaluate the overall experience. The SSES is an 18-item self-rated scale that assesses student satisfaction with simulation in three areas: Debrief and Reflection, Clinical Reasoning, and Clinical Learning. The scale has well-documented reliability and validity; Williams and Dousek [21] found that the SSES has adequate internal consistency and construct validity. Using the SSES, participants rated each statement on a five-point Likert scale (1 = Strongly Disagree to 5 = Strongly Agree). Data collection for the study occurred at two points. The SSES was first administered at the start of the 3rd semester, prior to any simulation education, asking students to reflect on their experiences with paper-case scenarios and in-class discussions thus far in the curriculum. It was administered a second time following the one-week simulation experience. In addition to the post-SSES, students also completed a final evaluation at the end of the week (Appendix C), which stimulated self-reflection on their overall experience with the Simucase™ platform.
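To make the scoring step concrete, the sketch below shows how 18-item Likert responses could be aggregated into the three SSES subscale means in Python. The item-to-subscale mapping shown (nine Debrief and Reflection items, five Clinical Reasoning items, four Clinical Learning items) is an assumption for illustration; the actual assignments are defined by the published SSES [20], and the ratings are randomly generated, not study data.

```python
import numpy as np

# Hypothetical item-to-subscale mapping (0-based item indices).
# The true assignments come from the published SSES [20].
SUBSCALES = {
    "Debrief and Reflection": list(range(0, 9)),   # 9 items
    "Clinical Reasoning":     list(range(9, 14)),  # 5 items
    "Clinical Learning":      list(range(14, 18)), # 4 items
}

def subscale_means(responses):
    """Mean rating per subscale; responses is (n_students, 18) of 1-5 Likert scores."""
    responses = np.asarray(responses, dtype=float)
    return {name: responses[:, idx].mean() for name, idx in SUBSCALES.items()}

# Example: randomly generated ratings for 27 students.
rng = np.random.default_rng(42)
ratings = rng.integers(1, 6, size=(27, 18))  # integer ratings 1..5
means = subscale_means(ratings)
for name, m in means.items():
    print(f"{name}: {m:.2f}")
```

Averaging within subscales like this is what allows a pre/post comparison per area rather than per item.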
Descriptive statistics for scores on the SSES pre- and post-simulation were analyzed using the Statistical Package for the Social Sciences, version 25 [22]. A Wilcoxon signed-rank test was used due to the small sample size and the inability to assume a normal distribution; it revealed a significant difference between the post-paper case and post-simulation means. In addition to the quantitative data, to understand the full experience of students following the simulation, qualitative data were reviewed and analyzed through concept-driven coding. The qualitative data were gathered through the final evaluation (n = 29), where students had the chance to reflect on their experience. The three authors began the qualitative analysis with initial coding of the responses, then organized the categories, and finally provided a structure for the overall supporting quotes [23]. For the purposes of this manuscript, only the most salient quotes from each category were used to describe the findings, aligned with the quantitative data below.
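The same analysis can be reproduced outside SPSS. The sketch below, in Python with SciPy, runs a Wilcoxon signed-rank test on paired ratings and derives the effect size r = |z| / sqrt(n) from the normal approximation of the signed-rank distribution; the paired scores are hypothetical stand-ins for illustration, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical paired subscale means for 10 students (not study data):
post_paper = np.array([3.2, 3.5, 4.0, 3.8, 3.0, 3.6, 4.1, 3.4, 3.7, 3.3])
post_sim   = np.array([4.0, 3.9, 4.2, 4.5, 3.9, 4.1, 4.4, 4.0, 4.35, 3.65])

# Non-parametric paired comparison: appropriate for small samples
# where normality cannot be assumed.
res = stats.wilcoxon(post_paper, post_sim)

# z statistic under the normal approximation of the signed-rank
# distribution, then effect size r = |z| / sqrt(n).
n = len(post_paper)
W = res.statistic                     # smaller of the two signed-rank sums
mu = n * (n + 1) / 4                  # mean of W under H0
sigma = np.sqrt(n * (n + 1) * (2 * n + 1) / 24)
z = (W - mu) / sigma
r = abs(z) / np.sqrt(n)

print(f"W = {W}, p = {res.pvalue:.4f}, z = {z:.2f}, r = {r:.2f}")
```

By convention, r values around 0.5 or above are read as large effects, which is how the r values in the Results section are interpreted.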

Results
A Wilcoxon signed-rank test indicated a statistically significant increase in two of the three sections of the SSES post-simulation: Debrief and Reflection (z = −2.67, p = 0.008) with a large effect size (r = 0.63), and Clinical Reasoning (z = −2.023, p = 0.043) with a large effect size (r = 0.64). Clinical Learning did not show a statistically significant difference (z = −1.826, p = 0.068, r = 0.65); however, individual statements in this section were shown to have significance. These findings suggest that students develop in-depth reflection, reasoning, and clinical abilities as they progress through their simulated experience. Supporting quotes from the final student evaluation and self-reflection questionnaire are also shared within these results. Each of these sections, plus the students' perception of the overall experience, is further elaborated on below.

Perceived Value of Debriefing and Reflection
As the literature suggests, debriefing and reflection are at the core of any simulation experience [5,6]. For these 29 students, both made a significant impact on their learning and on the perceived value of the program. Table 1 presents the correlating statements of the SSES in this section, mean scores, and the significance following the Simucase™ experience. In addition to the SSES scores, students often commented on the debriefing process throughout the post-fieldwork survey. One student stated: "While this fieldwork experience was different than we expected, I feel that I have learned so much from it. I felt that the daily debriefs were very helpful for me to view the cases and situations from points of view that I had not considered [...]" Other students discussed how the debriefing specifically impacted their learning in a way that was unique to this simulated experience. For example, one student commented: "The debriefing sessions were really helpful for me to articulate my clinical reasoning and any questions I had for the professor. These virtual debrief sessions allowed me to have the chance to communicate what I thought about certain cases/assessments and helped me improve my communication skills." Similarly, another student stated, "The most helpful aspect was the discussions and debriefs as that is not an aspect that we would have gotten to do as much of in a hands-on experience." The results of the SSES and the statements made by students indicate the impact that debriefing and reflection during simulation have on learning.

Perceived Use of Clinical Reasoning
The second focus area of the SSES, which was of particular importance to the researchers, was the perceived use of clinical reasoning. In occupational therapy, clinical reasoning is defined as "the thinking that guides practice . . . [and that] cognitive activity constitutes the heart of the clinical enterprise" [24] (p. 601). In this section, as presented in Table 2, students had statistically significant growth in all statements, with particular emphasis on the opportunity to reflect on clinical ability, which supports the data from the first section as well.
In the survey, students were asked to reflect on their overall learning through this experience. As with the SSES data, the concept of clinical reasoning was discussed in some capacity by the majority of participants. One student discussed her use of diagnostic reasoning, an aspect of clinical reasoning often discussed in occupational therapy. She stated: "This was my first experience working on a case that involved children with ASD. I knew the signs and symptoms of ASD [autism spectrum disorder], but every child presents differently. So being able to work with three different cases involving ASD and seeing the differences helped expand my knowledge and experience with working with ASD but different children." Overall, 24 out of the 29 students referenced clinical reasoning skills and growth in some way. One student who captured this finding stated, "I believe [this experience] helped my clinical reasoning skills because I was able to share my thinking in response to the faculty's questions, as well as be challenged to think from the different perspective of my peers."

Table 2. Perceived use of clinical reasoning during the simulation process.
(Columns: SSES Statement | Post-Paper Case | Post-Simulation.)

Perceived Use of Clinical Learning
In the final section of the SSES, students assessed their perceived connections to clinical learning. In this scale, clinical learning asks the students to reflect on their overall abilities and skills to apply knowledge to practice. Table 3 presents the statements associated with this sub-section following the paper case and Simucase experiences. Again, student responses correlated with the findings of this SSES section. In terms of the overall experience, one student stated: "I felt that this virtual fieldwork experience has enhanced my skill set and comfortability in working with different populations and diagnosis. The Simucase format of watching videos of the patient during therapy sessions, reading through patient charts, intake forms, and case histories and then applying that information by making informed decisions on their plan of care through answering of questions was very helpful." Another student reflected on their own self-regulated learning and efficacy, stating: "I think this experience allowed me to be more comfortable with being wrong and trying new approaches than I would be in a facility with a new [Fieldwork Educator] and unfamiliar clients. It gave me the opportunity to experiment without any real implications if I chose something incorrectly."

Perception of Overall Experience and Preparedness
Since this experience was the final level I fieldwork experience prior to the students going out into their full time clinicals (Level II Fieldwork), it was important to also understand how they perceived the value of the experience in preparing them for Level IIs. One aspect that was continually referenced in the students' final assessment and post-survey was confidence. One student stated: "My confidence going into Level II fieldwork has increased knowing that I have practiced administering assessments, developing treatment plans for several different types of clients, writing SOAP notes, and discussing other considerations such as billing, ethical dilemmas, and safety issues. It helped immensely to be able to compare my experience with others and know that I was on the right track, as well as listen to advice from the professionals that are in our own department. I felt that the assignments were well-timed and appropriate to the types of things I would be practicing in a real fieldwork setting, and I'm glad I got to use Simucase, which makes you use clinical reasoning on real-life patients. I believe FWI was as good as it could get despite the circumstances!" Another student similarly discussed their preparedness for full time clinicals, stating: "Virtual fieldwork experience enhanced my skill set and confidence for level II fieldwork because it allowed me to use clinical reasoning through gathering data on each of the clients and using it to make ethical decisions for evaluations and interventions. I like how Simucase gave us feedback on our answers to help self-reflect on what I put and what to change. I also liked how we worked with multiple clients with different diagnoses, making it more realistic to in-person fieldwork. Finally, I liked how we were given opportunities to self-reflect throughout the week and share these reflections during the discussion." 
Students were also asked to provide honest feedback on what challenged them or did not go as anticipated in the experience. Multiple students suggested having the opportunity to process more cases, to better replicate the actual clinic setting. One student stated: "I think one way this fieldwork could be enhanced next year, is to add in more stimulations or videos on different interventions with various populations. During fieldwork in December we are seeing multiple sessions per day which isn't quite possible in a virtual setting but maybe adding in a way for us to get more exposure to how different sessions would run could be helpful." Eight of the students also stated in some way they would like a greater variety of pediatric diagnoses. One stated, "I did learn from those cases which helped to expand my experiences, but I would have liked working with other diagnoses such as CP [cerebral palsy] or spina bifida just to name a few."

Discussion
The results of this study identify students' self-reported satisfaction related to in-depth reflection, reasoning, clinical abilities, and preparedness upon completion of the simulated fieldwork I experience.
The findings in this study support the growing evidence that simulation can enhance student competence, confidence, and perceived readiness for clinical education [3,4,14].

Perceived Value of Debriefing and Reflection
Simulation provides the opportunity to fully review and discuss a case and to seek clarification in a less time-pressured setting than the clinic. The structured debriefing sessions provided time for students to discuss performance, ask questions, and reflect on each case and assignment. Faculty were able to summarize important issues and provide constructive feedback that furthered student understanding and learning. In this study, the students reported the positive impact of debriefings and reflections on their understanding and learning, which adds support to prior studies that identify debriefing as an integral part of simulation-based learning [25][26][27]. Debriefing and reflecting on a regular basis provide students with constant, real-time feedback, which can differ from a traditional clinical setting, where a fast-paced environment and productivity demands often limit such opportunities. Decreased feedback, high workload, and time constraints have been identified as barriers to learning in clinical settings [28][29][30].

Perceived Use of Clinical Reasoning
Student learning and understanding can be negatively impacted by limited exposure to varied client populations at a clinical site [29]. The simulated environment provides the opportunity to engage with a variety of cases that vary in diagnosis and age. In this study, the students' ability to independently engage and make clinical decisions with a variety of clients was enhanced in the simulated environment. The students identified how this variety expanded their clinical reasoning beyond a particular setting, diagnosis, or age. The students were required to make clinical decisions on their own. This varied exposure and independent thinking improved clinical reasoning, as each case presented a new challenge and students drew on different diagnostic reasoning and clinical decision-making skills [26,29]. These skills are essential for future clinical rotation success. James and Mussleman [30] found that failure in later, more advanced fieldwork experiences often resulted from a lack of clinical reasoning and problem-solving skills. Participating in group discussions, which required the students to reflect on and explain their thinking and to listen to the thoughts of others, supported student growth in clinical reasoning [26]. Students valued hearing different perspectives and reasons why different methods of intervention may be suitable for the same case [25].

Perceived Use of Clinical Learning
Students reported an overall improvement in their clinical learning after the simulated experience. Clinical learning asks the students to reflect on their overall abilities and skills to apply knowledge to practice. The simulations tested their clinical ability by challenging them to apply what they have learned to each step of the case study and make clinical decisions. The safe environment of the simulation enabled students to try new approaches and make mistakes without risk. Debriefing and reflection, working with different populations/diagnoses, and the safe environment for learning and making mistakes have been supported in the literature as enhancing the students' perceived use of clinical learning through simulation [25,31].

Perception of Overall Experience and Preparedness
Discussion of confidence was found in the students' final evaluations and self-reflections. Students reported that the overall simulation experience enabled them to practice clinical skills, including assessing, identifying interventions, documenting, and clinical reasoning. The simulations and learning activities designed by the researchers also engaged the students in meaningful discussion and learning around important concepts such as ethical reasoning, billing and reimbursement, and safety issues. Through practice, reflection, and discussion, students reported an increase in self-confidence as they transitioned into their level II fieldwork rotation [3,14]. Studies about preparedness for level II fieldwork and future practice identify confidence as a factor for success in both transitions [29,32].

Limitations and Future Research
The study design identified students' perceptions of and satisfaction with a computer-based simulation platform. Clinical competence was not measured through an outcome assessment. The necessary first step was to understand student perception of the aspects of simulation related to debriefing and reflection, clinical reasoning, clinical learning, and overall experience and preparedness. The limitations of this study stem from the purposive sample and the lack of diversity in gender, age, and race. However, students who participated in this study were representative of the student population in similar-sized occupational therapy educational programs [33]. The study also reflects the findings of a single cohort of students from a private, Catholic academic institution in the northeast, and may not reflect the behavior and attitudes of occupational therapy students in other geographical areas or types of institutions. We would recommend repeating this study with a larger, more diverse group of students of different genders, ages, and races across various occupational therapy programs at different academic institutions. In addition, further exploration is needed of self-perceived clinical competence and reasoning versus actual assessment of specific outcomes of these skills.

Conclusions
Simulation is currently being explored as a model of practice to meet the increasing demands of clinical education within the health sciences. Through debriefing and reflection, simulation can enable students to experience clinical learning, and to develop clinical reasoning, confidence, and perceived readiness for level II fieldwork clinical education. Simulation also provides a way to increase the exposure of patient encounters across the lifespan and within various clinical arenas. Given the current landscape, with a shortage of fieldwork placements across the US and the impact of COVID-19 on site availability, understanding the student experience with these teaching modalities will help shape and further define the future use of simulation for level I FW. This study's findings suggest that simulated case scenarios enable students to experience aspects of clinical learning and clinical reasoning through guided inquiry, integration of knowledge, and reflection/feedback.

Acknowledgments: The authors would like to acknowledge Emily Casile and Alexandria Raymond for their assistance in manuscript preparation. In addition, we would like to thank Maura Lavelle and Wendy Brzozowski from Simucase for their support in curriculum development, manuscript preparation, and student support.

Conflicts of Interest:
The authors declare no conflict of interest.

Appendix A
Productivity is a measure of output (work).

Using the AOTA Toolkit (https://www.aota.org/Practice/Ethics/Tools-for-Productivity-Requirements.aspx), consider the following:
- Determine how productivity might be measured at this site.
- What strategies might you use to effectively meet productivity standards?
- Come up with a potential unethical situation related to productivity with your Simucase client/practice. Determine any potential repercussions for this situation.

Comments are uninformative, lacking in appropriate terminology; heavy reliance on opinion and personal taste, e.g., "I agree", "I disagree", "Me too", "Yes", "No", etc.

Information Seeking
- Assertively seeks information to plan; carefully collects useful data from observing and interacting with the case; effective use of evidence.
- Actively seeks information to support planning; occasionally does not pursue important leads.
- Makes limited efforts to seek additional information from the patient; often seems not to know what information to seek and/or pursues unrelated or outdated information.
- Is ineffective in seeking information; relies mostly on objective data; fails to collect relevant evidence.

Prioritizing Data
- Focuses on the most relevant and important data useful for explaining the case.
- Generally focuses on the most important data and seeks further relevant information, but may also try to attend to less pertinent data.
- Makes an effort to prioritize data and focus on the most important, but also attends to less relevant or useful data.

Appendix C
Virtual Level I Fieldwork Evaluation/Reflection
Each student will complete this Level I fieldwork evaluation at the conclusion of the experience. Be honest! This is for you and your faculty to continue working on your professional development. Please rate yourself as you really felt you performed. Although this fieldwork was not completed as we intended, we still want to learn about what worked in this experience and what did not. Carefully respond to the reflective questions posed at the bottom of the evaluation. Thank you.
Part 1: Professional Behaviors. Please comment on how well prepared you feel for level II fieldwork, not that you have mastered all content. In one paragraph (less than 300 words) summarize your performance.
Part 2: Professional Skills. Please comment on how well prepared you feel for level II fieldwork, not that you have mastered all content. In one paragraph (less than 300 words) summarize your performance.
General Reflection on the Experience
1. Tell us how this virtual fieldwork experience enhanced your skill set and confidence for Level II fieldwork. Please be specific with features of the experience that were helpful.
2. Tell us how this virtual fieldwork experience could be modified to enhance your skill set and build your confidence for Level II fieldwork. Please be specific with suggestions.
3. Each of you has received feedback from previous level I fieldwork educators, faculty, and your peers in various ways. You also shared a goal for the week in OTH512 through Flipgrid. Please make a statement on the progress you have made in the goal areas you set for yourself based on this overall process.