Fire in the Operating Room: Use of Mixed Reality Simulation with Nurse Anesthesia Students

Background: The occurrence of a fire during the delivery of anesthesia is a high-risk, low-frequency event. The operating room is a high-stakes environment with no room for error. Mixed reality simulation may be a solution to better prepare healthcare professionals. The purpose of this quantitative, descriptive study was to evaluate the technical and non-technical skills of student registered nurse anesthetists (SRNAs) who participated in a mixed reality simulation of an operating room fire. Methods: Magic Leap One™ augmented reality headsets were used by 32 student registered nurse anesthetists to simulate an emergent fire during a simulated tracheostomy procedure. Both technical and non-technical skills were evaluated by faculty members using a checklist. Results: The SRNAs' overall mean technical skill performance was 18.16 ± 1.44 out of a maximum score of 20, and the mean non-technical skill performance was 91.25% out of 100%. Conclusions: This study demonstrated the utility and limitations of applying novel technology in simulation. Participants demonstrated strong performance of technical and non-technical skills in the management of a simulated operating room fire. Recommendations for future applications include incorporating multiple sensory inputs into the scenario design and including all core team members in the immersive mixed reality environment.


Introduction
The occurrence of a fire in the operating room (OR) is a high-risk, low-frequency event. Fire may readily occur in the OR due to the presence of highly concentrated oxygen used in the anesthetizing process. Such an incident can lead to potentially life-threatening injuries for the patient and healthcare providers [1]. Thus, proper education on the prevention of OR fires is a priority in nurse anesthesia education.

Use of Simulation
The OR is a high-stakes environment that has no room for error. To better prepare nurse anesthetists, the use of a simulated environment is now the growing standard for healthcare education [2]. The real-life application of what is practiced in a simulated environment permits awareness and practice of routine to urgent skills, enhancing the acquisition of knowledge that facilitates appropriate patient care [3]. Advanced simulation scenarios that exemplify high-acuity, low-frequency events include cardiac arrest, shock, and fire in the OR [4].
Simulation provides an environment for experiential learning to take place. According to Kolb, experiential learning is "the process whereby knowledge is created through the transformation of experience. Knowledge results from the combination of grasping and transforming the experience." [5] (p. 38). Experiential learning theory includes two ways of grasping the experience: abstract conceptualization (concluding/learning from the experience) and concrete experience (doing/having an experience). Also included are two ways of transforming the experience: active experimentation (planning/trying out what you have learned) and reflective observation (reviewing and reflecting on the experience) [6]. Participation in simulation-based education supports experiential learning theory whereby the student registered nurse anesthetist can participate in managing a simulated emergency situation such as a fire in the OR. As a member of the healthcare provider team in a student role, this acquired knowledge can later be translated into the clinical role of a certified registered nurse anesthetist. Thus, simulation is a valued teaching method for both technical and non-technical skills in healthcare.

Non-Technical Skills
Human factors are responsible for 80% of all accidents across high-risk organizations. The seminal report To Err is Human began a trajectory toward investigating not only technical skills but also non-technical skills in industrial training, including healthcare [7,8]. Flin, an industrial psychologist, examined tragic industrial events and reported that human error, or non-technical components, contributed to these tragic or near-miss events. Pilot training in the management of human error was initiated as crew resource management, with the development of behavior markers for non-technical skills known as NOTECHS [9]. Ultimately, the continued emphasis on investigating non-technical, or soft, skills led a team of researchers in Scotland to develop a taxonomy of behavior markers to use as identifiers for non-technical evaluation specifically in anesthesia practice [10].
One tool used to evaluate an anesthetist's performance in non-technical skills is the anesthetist non-technical skills (ANTS) system [10]. This tool consists of four categories and 15 elements of anesthetist non-technical skills, rated on a 1-4 scale. The categories are situation awareness, decision-making, task management, and team working [10]. The ability to evaluate an anesthetist's non-technical skills enhances a culture of safety for the anesthetized patient. The OR is a dynamic environment with complex systems, including the facility, perioperative workflow, medical delivery, and medical and non-medical personnel. Further, the surgical patient is multifaceted, with considerations such as the surgical procedure, comorbidities, and individual differences. This environment takes a network of moving parts to deliver a unified package of care to the patient. Anesthesia providers in this high-stakes environment need technical and non-technical skills to safely care for the perioperative patient [11]. The ability to provide anesthesia training in technical and non-technical skills in a simulated environment enhances the student's success in obtaining these skills. One study of first-year student registered nurse anesthetists (SRNAs) revealed that non-technical skills can be taught and are not simply acquired through years of working in an intensive care unit [12].

Use of Checklists
Checklists have been demonstrated not only to decrease rates of infection but also to increase efficiency and improve patient outcomes in the perioperative setting [13][14][15][16][17][18]. In the perioperative setting, OR crisis checklists have been instituted to provide a logical and decisive approach to managing crises [15][16][17][18]. Utilizing a checklist composed of both technical and non-technical skills can enhance teaching and evaluation in simulation.

Mixed Reality in Nursing Education
The use of augmented and virtual reality in nursing education is rare but growing [19]. Even more unique is the application of mixed reality in the niche field of nurse anesthesia. OR/airway fires occurring during a surgical procedure, particularly under anesthesia, can be extremely devastating. Although considered a rare event, a fire is a high-acuity, low-frequency situation that every anesthesia provider should be prepared for in case it occurs. Effective simulation training is necessary in OR fire prevention, as is awareness of early warning signs and effective treatment [1,20]. The ability to provide simulation-based education with visual mixed reality input can provide robust educational experiences that are normally difficult to mimic. However, research is lacking about the use of mixed reality training in nurse anesthesia.

Research Aims
The purpose of the study was to evaluate the technical and non-technical skills of student registered nurse anesthetists using mixed reality to simulate a fire in the OR setting. The research aims were to: (1) assess SRNAs' technical skills while managing a simulated OR fire using mixed reality; and (2) assess SRNAs' non-technical skills while managing a simulated OR fire using mixed reality.

Methods
After receiving institutional review board (IRB) approval (IRB # 20190980), this quantitative, descriptive study was conducted during the third semester of the Bachelor of Science in Nursing-Doctor of Nursing Practice (BSN-DNP) Nurse Anesthesia Program in Fall 2019. Thirty-four student registered nurse anesthetists (SRNAs) participated during their regularly scheduled three-hour simulation class concomitant with their didactic course, Advanced Concepts in Anesthesia Nursing I. There were three sections of the simulation class with 11-12 SRNAs per session. The student learning objectives for the simulation were the following:

•	Student understands fire risks in the OR and the fire triangle.
•	Student identifies fuel, ignition, and oxidizer sources in the perioperative setting.
•	Student is able to utilize new innovation to appropriately treat an OR/airway fire.
Each session began with a pre-brief that covered the objectives of the simulation, brief content on the prevention of fire in the OR, and an introduction of the Magic Leap One™ (Plantation, FL, USA) augmented reality device; informed consent was then obtained. Risks, benefits, and alternatives were also discussed, including options if an SRNA decided against participating.
The International Nursing Association for Clinical Simulation and Learning (INACSL) Standards of Best Practice: Simulation℠ (Simulation Design Standard) were adhered to during scenario creation [21,22]. All eleven criteria within the Standard were met during the design of the OR/airway fire simulation. The mixed reality script and objectives were provided to the faculty raters, and some faculty also participated as confederates in the scenario.

Instrument
Expert faculty members developed the OR Fire Checklist to score SRNA performance during the simulation. The checklist was based on the American Society of Anesthesiologists Task Force on Operating Room Fires' Practice Advisory for the Prevention and Management of Operating Room Fires, with the addition of the non-technical categorical elements from the ANTS tool [10,20]. The Practice Advisory includes a stepwise Operating Room Fires Algorithm for the prevention and management of fires by OR personnel [20]. The checklist format and objectives were reviewed and discussed by the faculty in a meeting to establish consensus. Several discussions were held with the expert raters regarding the adaptation of the American Society of Anesthesiologists (ASA) OR Fires Algorithm, what constituted an SRNA's correct action, and the verbalization/recognition of events. A Delphi method was also utilized to designate the technical and non-technical skills in the OR/Airway Fire Checklist. The checklist involved 20 items, with one point for each item that was performed correctly and zero points if the item was not performed correctly. The minimum score possible was zero, and the maximum possible score was 20. Both technical and non-technical skills were incorporated within the checklist. As part of their rater training, all faculty raters viewed and scored one video-recorded student performance using the checklist. Once faculty raters achieved 80% agreement on the checklist, raters moved forward to score the SRNAs' simulation performances.
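As a minimal illustration of the scoring and rater-training logic described above, the 20-item, one-point-per-item checklist and the 80% agreement threshold can be sketched as follows (the item results and ratings here are hypothetical examples, not the actual OR Fire Checklist data):

```python
# Illustrative sketch of the 20-item, 1-point-per-item checklist scoring
# and the 80% inter-rater agreement threshold used during rater training.
# All data below are hypothetical.

def checklist_score(item_results):
    """Sum one point per correctly performed item (min 0, max 20)."""
    assert len(item_results) == 20
    return sum(1 for performed_correctly in item_results if performed_correctly)

def percent_agreement(rater_a, rater_b):
    """Proportion of checklist items on which two raters gave the same score."""
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return matches / len(rater_a)

# Hypothetical training video scored by two raters (True = performed correctly).
rater_a = [True] * 18 + [False] * 2
rater_b = [True] * 17 + [False] * 3

print(checklist_score(rater_a))                       # 18 out of 20
print(percent_agreement(rater_a, rater_b) >= 0.80)    # raters proceed once agreement reaches 80%
```

In the study, raters scored a pilot recording and moved on to live scoring only once this agreement threshold was met.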

Augmented Reality Headset
The augmented reality headset selected for this study was Magic Leap One™. Using the augmented reality headset would allow the student to carry out physical tasks in the simulated OR setting due to the see-through visual elements of the glasses. The virtual elements of smoke, fire, and water were displayed in the glasses' field-of-view at specific times in the simulation. Because the SRNAs interacted and responded to the virtual objects seen in the glasses as if they were real, the experience was considered a mixed reality experience. The setup included two headsets sharing the same augmented reality experience (Figure 1).

The student was assigned the first headset, without a controller, such that he/she would have their hands free to attend to the patient. The second headset, with a controller, was assigned to the instructor. The instructor would be the one to trigger the smoke, fire, and water based on the progression of the simulation scenario.
Prior to the start of the simulation, the instructor would wear Magic Leap One™ and scan the room, creating a mesh of the environment, which included walls and equipment. This mesh was stored in the headset, allowing Magic Leap One™ to display virtual elements in places that made sense, considering the equipment in the room and wall constraints (Figure 2).

Next, an orientation was provided to the SRNAs in a separate debriefing room by the School of Engineering faculty/team in order for each SRNA to view, fit, and orient the headset. If the SRNA wore corrective glasses, the headset was adjusted to fit the vision of the student. Once the OR/airway fire simulation team was ready, the SRNA was taken to the OR to receive a report on the patient in the scenario. Each scenario was approximately 5 to 14 minutes long and was followed by a faculty-led debriefing using the Promoting Excellence and Reflective Learning in Simulation (PEARLS) method [23]. Each student was then asked to complete a brief anonymous survey via Qualtrics™ (Provo, UT, USA) that requested demographic information and asked about their experience with the augmented reality headset.

Airway Fire Application
The airway fire application specifically built for this project allowed the instructor to identify two spawning points for the augmented reality elements. The first was the mouth of the patient (Figures 3 and 4), and the second was on the floor at the patient's bedside. The first point was the location of the airway fire, and the second point was meant to hold a secondary, but no less dangerous, bedside fire resulting from the staff removing and throwing surgical blankets on the floor in an effort to disencumber the burning patient (Figure 5).

From a nearby control room, the instructor would share the augmented reality experience with the student via a Wi-Fi connection and trigger the smoke mid-way through the scenario (Figure 6). At that point, the application would start spawning the smoke out of the patient's mouth. After the student noticed the smoke and acted (or not), the instructor would trigger the next augmented reality element, the fire, out of the airway location. An assistant would then pull the surgical blanket and dispose of it on the floor, at which time the instructor would switch modes with the controller and trigger the appearance of a large quantity of smoke from the blanket. Because the augmented reality headset blocked some peripheral vision, the smoke on the floor could go unnoticed by the student. Additionally, the student could be focused on the airway fire and miss the second fire. If the student poured saline or water in an attempt to extinguish the airway fire, the instructor had the option to trigger virtual water poured on the patient's mouth. If no action was taken to manage the bedside fire, the instructor could trigger a virtual simulated flame three feet wide by five feet high.
According to the scenario, the assistant would call for support, and when support arrived, brandishing a fire extinguisher over the secondary fire, the instructor would trigger a large virtual stream from the extinguisher to the floor. The last option in the smoke-fire-water cycle would trigger the disappearance of the virtual elements, effectively terminating the simulation and resetting the application for the next participant.
The simulation made creative use of augmented reality technology, adapting it to the needs of the experiment by sharing the experience between headsets and separating the controller from the student's headset so the instructor could control the simulation based on the script and the evolution of the scenario. The setup afforded considerable advantages over scenarios traditionally carried out with real smoke machines: it was easily reset between participants, offered a mess-free experience, and gave the instructor total control over the triggers.
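The instructor-controlled trigger sequence described above behaves like a simple state machine over each spawn point: each controller press advances the virtual effect, and the final step clears all effects and resets the application. The sketch below is a hypothetical illustration of that smoke-fire-water cycle; the class and method names are assumptions for illustration, not the actual application code:

```python
# Hypothetical sketch of the instructor's trigger cycle for one spawn point.
# Each trigger advances the virtual effect; the final state clears the scene
# and the reset prepares the application for the next participant.

CYCLE = ["idle", "smoke", "fire", "water", "cleared"]

class SpawnPoint:
    """One augmented reality spawn point (e.g., the airway or the bedside fire)."""

    def __init__(self, name):
        self.name = name
        self.state = "idle"

    def trigger_next(self):
        """Advance to the next virtual effect in the smoke-fire-water cycle."""
        i = CYCLE.index(self.state)
        self.state = CYCLE[min(i + 1, len(CYCLE) - 1)]  # stay at "cleared" once done
        return self.state

    def reset(self):
        """Return the spawn point to its initial state for the next student."""
        self.state = "idle"

airway = SpawnPoint("patient_mouth")
airway.trigger_next()   # smoke spawns from the patient's mouth
airway.trigger_next()   # fire appears at the airway location
airway.trigger_next()   # virtual water is poured on the patient's mouth
airway.trigger_next()   # virtual elements disappear; scenario ends
airway.reset()          # application reset for the next participant
print(airway.state)     # idle
```

A second `SpawnPoint` instance would model the bedside blanket fire, with the instructor switching between the two via the controller modes described above.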

Data Collection
All student simulation performances were video recorded (Figure 7). Six nurse anesthesia faculty experts viewed the recordings retrospectively, and two raters scored each performance using the OR Fire Checklist. Average performance scores were entered into Qualtrics by the faculty members. Data were separated by type of skill (technical vs. non-technical) for analysis.

Data Analysis
All data were de-identified and aggregated. Data were analyzed via Qualtrics and presented descriptively. Group means and percentages were reported.
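As an illustration, the descriptive statistics reported in this study (group means, standard deviations, and percentages of the 20-point checklist maximum) can be reproduced with standard library tools; the scores below are hypothetical examples, not the study data:

```python
import statistics

# Hypothetical checklist totals (out of 20) for a small group of participants;
# the study's actual cohort had a mean technical score of 18.16 +/- 1.44.
scores = [18, 19, 17, 20, 18, 16, 19, 18]

mean = statistics.mean(scores)
sd = statistics.stdev(scores)    # sample standard deviation
pct = mean / 20 * 100            # mean expressed as a percentage of the maximum

print(f"{mean:.2f} +/- {sd:.2f} ({pct:.1f}%)")
```

Percentages for categorical demographic items (e.g., sex, race, prior headset use) follow the same pattern of count divided by group size.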


Sample
Thirty-three (first-year, third-semester) SRNAs were enrolled in this study. However, one student's video was only partially recorded and therefore was not included. The study was conducted within the typical delivery of a course towards the end of the Fall 2019 semester. Of the 32 participants with usable video performance data, 28 reported demographic data (87.50%). The mean age was 26 years (range 22-35 years, SD ± 3.51). The majority, 23/28 (82.14%), were female, while 5/28 (17.86%) were male. Race was reported as White 70.97% (22/28), Asian 12.90% (4/28), Native Hawaiian or Pacific Islander 3.23% (1/28), and Other (reported as Latino or Hispanic) 12.90% (4/28). This cohort reported a mean of 3.11 (±0.82) years of nursing experience, with a median in the 3-5-year bracket and a range of 2 to 6 years. Thirteen of the 28 SRNAs (46.43%) had used mixed reality headsets in the past, while the rest had not; those with prior use reported a mean of 1.92 (±2.37) years of experience with such headsets (range 1-10 years).

Students' Simulation Performance
Overall, the SRNAs' technical and non-technical skill performance scores had means ranging from 80% to 97%. These high overall mean scores reflect the SRNAs' effective management of the simulated airway fire. In addition, the study provides evidence to support the use of mixed reality as an effective added layer for simulation-based education. An OR fire with visual effects such as smoke, flames, and water poured over a high-fidelity mannequin cannot be effectively simulated without technological elements such as augmented reality headsets. The SRNAs' experiences and anecdotal comments effectively captured the usefulness of this technology. Despite their limited clinical experience, the SRNAs were able to demonstrate high proficiency in the management of the OR fire using both technical and non-technical skills in this mixed reality environment.
SRNAs obtained their highest technical skill scores in fire prevention (5.81/6, 97%) and their highest non-technical skill scores in teamwork (3.80/4, 95%). The lowest technical skill scores were in fire management (mean 3.2, 80%), and the lowest non-technical skill scores were in decision-making (mean 8.94, 89%). On the checklist, there were four items that the raters scored differently for the same student: TM: check for early signs of fire; SA: recognition of early signs of fire; DM/TM: immediately stop the flow of all gases; and SA/DM/TM: examine the endotracheal tube (ETT) for missing fragments in the airway (consider bronchoscopy). These differences can be attributed to differences in the cues the confederates gave the SRNAs, to SRNAs' incomplete or assumed actions, or simply to rater variability. Inconsistent interpretations of the scenario, such as whether gases were considered stopped at the time of ETT removal versus turning off inhaled gases prior to removal of the ETT, were scored differently by the raters despite their achieving greater than 80% agreement on the pilot simulation. Another TM item that showed inconsistency among the raters was "check for early signs of airway fire." When using augmented reality headsets, some of the SRNAs' peripheral vision was lost and not captured by the video recordings unless the student was looking directly at the source of smoke. Sensory signs present in a real-life situation, such as the smell of smoke or the sound of a spark from the electrocautery, were absent in the simulated OR fire. Even though the augmented reality headset displayed details such as smoke and fire, the addition of sensory stimulation such as sound and smell should be considered in the future when assessing early recognition of fire.
Another item (TM: check for early signs of fire) may need to be revised in checklists used during simulation scenarios, as it is difficult to define what constitutes an "early sign of fire" in such cases. This interpretation may be difficult even in a real-life scenario when following the OR fire algorithm. The fourth item that produced inconsistent ratings of SRNAs' performance was SA/DM/TM: examine the ETT for missing fragments in the airway (consider bronchoscopy). The authors agree that when assessing this skill in future simulations, it may be beneficial to discolor or disfigure the tip of the ETT to simulate a burned tip, thereby enhancing fidelity and prompting further examination of the airway by the SRNAs. Despite this disagreement in skill rating, the majority of the SRNAs acted appropriately and used a video laryngoscope to re-establish the airway, which is consistent with past findings that early introduction of SRNAs to OR fire checklists for prevention and treatment provides the simulation training necessary for effective OR fire treatment [20].
Previous research supports the necessity and use of virtual environment platforms for OR fire training of health professionals [24,25]. The addition of a mixed reality platform in this study provided an additional layer of realism to the simulation scenario. The smoke and fire were highly realistic. However, auditory and olfactory stimuli would have increased the fidelity and added to the situational awareness of the overall experience. Additional stimuli, such as the crackling of fire or the smell of smoke, may have improved the students' response times and ability to manage the situation at hand. These enhancements may be used in future OR fire simulations.
Over the past twenty-five years, hospitals and universities have invested millions of dollars into facilities, equipment, and personnel for the development of simulation-based education centers for healthcare professions' students and clinicians. With the growth of virtual, augmented, and mixed reality simulations, centers are now seeing a push to adopt these new technologies. The cost to outfit one learner and three to four role-player participants for a mixed reality simulation scenario may be approximately $12,500 for augmented reality headsets such as the Magic Leap One™, plus the development or purchase cost of the scenario software. By comparison, when the same scenario is run with theatrical fog as a high-fidelity immersive simulation, the cost of the fog machine and set-up is approximately $75. However, because of the time required to turn over the fog simulation, including exhausting the accumulated fog from the room, learners have to rotate through the experience in small groups to complete the training session in the scheduled block of time. With the technology-based mixed reality simulation, the turnover time was under five minutes, and learners could be scheduled for individual experiences within the scheduled block of time. Beyond this specific use, the headsets are available for other training simulations, whereas the fog machine has limited application. Future adoption of technology-based simulations should focus on the best attainment of learner outcomes and not solely on hardware and software costs.

Limitations
There were several limitations to this study. In addition to the small sample size, there was some inconsistency among the actors in following the scenario scripts. Clear script guidelines and instruction for the scenario actors are essential for future simulated scenario success. Additionally, having six different faculty raters may have introduced variability into the scoring. Because the pre-briefing discussed elements of an OR fire, SRNAs may have scored high because they expected a fire to occur; scores may have been different had the simulated fire been more of a surprise. Further, only one faculty member at a time was able to view the immersive mixed reality environment.

Conclusions
Professionals in high-stakes environments must be proficient in both the technical and non-technical skills of their discipline. Within the healthcare professions, non-technical skills, more accurately referred to as behavioral skills, particularly in the context of teamwork, are essential to patient outcomes and safety. Simulation-based education combined with advances in mixed reality technologies holds promise to enable healthcare professionals to learn, practice, and improve their performance of technical and behavioral skills, particularly in high-risk, low-frequency situations such as OR fires. This study demonstrated both the utility and the limitations of our current ability to merge the instructional and technological educational methodologies of mixed reality simulation. Participants demonstrated high proficiency in the technical and behavioral skills needed to manage a simulated OR fire. Future studies and applications should consider how to adapt multiple sensory inputs, including sound and smell, into the scenario design, as well as how to include all core team members in the immersive mixed reality environment.

Conflicts of Interest:
The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.