Teaching with an EEG simulator: evaluating the learning process of biomedical laboratory science students

Methods based on simulation pedagogy are widely used to practice hands-on skills in a safe environment. We evaluated the usability of an EEG simulator in a clinical neurophysiology course. Second-year biomedical laboratory science students (N = 35) on this course were included in the study and divided into three groups. Two groups used the EEG simulator with different feedback modes, and one group worked without the simulator. The results were expected to reveal a correlation between user experience and learning outcomes. The study used a mixed-method design. Students were asked to keep a learning diary of their experiences throughout the course. The diaries were analyzed qualitatively using content analysis. The quantitative analysis was based on a user experience (UX) questionnaire that measures classical usability aspects (efficiency, perspicuity, dependability), user experience aspects (novelty, stimulation), and the students' feelings about using the simulator; the quantitative data were analyzed with SPSS software. Both analyses showed that the EEG simulator, used to evaluate the teaching-learning process, provides an additional benefit in clinical neurophysiology education, and the students felt that the simulator was useful for learning. The simulation debriefing session should be followed by a full theoretical and practical session. Students compare what they learned from the simulator with actual electrode placement, which fosters reflective learning practice and deepens their understanding of EEG electrode placement and different wave patterns.

interact in a virtual environment, increasing contact, for example. Simulations especially support learning of hands-on skills, but also theoretical knowledge as shown in previous studies [4,5]. Simulations provide a safe environment for students to practice the role of a registered health professional [6]. Simulation supports collaborative learning among health professionals from different disciplines and fosters multi-disciplinary work [7].
Effective simulation learning, however, requires a positive user experience (UX). UX consists of subjective cognitions of situational learning with complex and dynamic concepts. It is a consequence of the user's internal state (motivation, mood, expectations), the characteristics of how the system should be used (usability, functionality), and the context in which the interaction takes place (organizational or social environment, voluntary use) [8]. To minimize the risk of missed learning outcomes through a misconception of the virtual simulator space, it is paramount to gauge UX before implementing simulators in a classroom environment [9].

Background
With this study, we continue our work with a PC-based EEG simulator we developed to assist Biomedical Laboratory Science (BLS) students in practicing clinical neurophysiology skills [10]. The purpose of the simulator was to enhance the students' learning experience with EEG and provide more opportunities to practice the placement of EEG sensors.
This study aims to examine suitable teaching practice with such a simulator. We attempted to extend our findings to simulators in general for teaching biomedical laboratory sciences. To ensure the validity of our experiment and to identify potential participant bias in evaluating the simulator teaching practices, this study includes a comprehensive UX questionnaire. We obtained the qualitative data for evaluating the teaching practice by requesting each participant to diarize their activities and sentiments toward learning with the simulator.

Study design
Thirty-five (35) second-year BLS students (aged 24 ± 3 years) were included in the study. All students participated in the clinical neurophysiology course introduction and lectures. A partially mixed sequential dominant status design, across several phases, was used. Each phase took place in succession, with more emphasis on the qualitative phases for the objective of this study [11].
Quantitatively, this study compares the learning of EEG registration between two BLS student groups who used the simulator and a control group without simulator exposure. Over the course of the experiment, all students received an opportunity to qualitatively express their opinions on the best way to use the simulator. The two groups who used the simulator kept learning diaries of their experiences. We compared the qualitative learning diary entries with the quantitative User Experience Questionnaire (UEQ) results to eliminate potentially negative comments on the teaching practice caused by undesirable UX and to enrich our insight into the experience diaries. The UEQ and diaries collected information related to the novelty, stimulation, dependability, efficiency, perspicuity, and attractiveness of the simulator.

Study participants
After the initial lectures and course demonstration, the students were divided randomly into three groups: Group 1 (n=11), Group 2 (n=12), and Group 3 (n=12). All three groups had an edX-platform lecture about EEG placement, theory, and interpretation. After the lecture, groups 1 and 2 continued with three simulator sessions (one per week), while group 3 followed the existing practice of completing self-preparation tasks for the practical session to follow. After the three weeks of preparation, all three groups completed a practical session.

Measuring instruments and process
The quantitative study used the UEQ [12] to survey the user experience. The students were required to answer the UEQ after the three simulator sessions. The simulator group was divided into two groups according to the feedback type from the simulator. Group 1 had fuzzy feedback and group 2 had precise feedback regarding their virtual sensor placement. Although we investigated the effects of the two feedback systems in earlier work [4], we kept this natural division in this experiment for the purpose of reliability by cross-analyzing all results of groups 1 and 2.
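For context, UEQ responses are typically scored by mapping each seven-point item to the range -3 to +3 and averaging the items belonging to each scale. The sketch below illustrates that convention only; the item-to-scale assignment and response values are hypothetical, and the official UEQ analysis tool additionally reverses negatively keyed items, which is omitted here.

```python
def transform(raw):
    """Map a raw 1..7 semantic-differential response to the -3..+3 UEQ range."""
    return raw - 4

def scale_scores(responses, scale_items):
    """Compute the mean score per UEQ scale.
    responses:   dict mapping item id -> raw 1..7 answer
    scale_items: dict mapping scale name -> list of item ids"""
    scores = {}
    for scale, items in scale_items.items():
        values = [transform(responses[i]) for i in items]
        scores[scale] = sum(values) / len(values)
    return scores

# Illustrative (not official) assignment of four items to one scale
scale_items = {"perspicuity": [2, 4, 13, 21]}
responses = {2: 6, 4: 5, 13: 7, 21: 6}
print(scale_scores(responses, scale_items))  # {'perspicuity': 2.0}
```

Scores near +2 or above are conventionally read as a clearly positive evaluation on that scale.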
We did not give the simulator groups elaborate guidance on how to use the simulator; students received some technical information and were left to learn with the simulator once per week for three weeks following the edX-platform lecture. After each session, the simulator students were asked to reflect on: content (familiar, unfamiliar); difficulty level; usability (ease of use, guidelines, look and feel, and value); experience/emotion (enjoyed, hated, frustrated, fun, useful); and process (learning vs. playing, feeling secure before going into practice, value, and the feedback provided on learning progress) (Figure 1). The students completed the diaries in their mother tongue (Finnish), which we later had translated into English for analysis purposes. The students were allowed to complete the diaries in any of the most recognizable formats (slide sets, spreadsheets, or word-processor files). We collected the diaries before the students participated in the practical (clinical practice) session.

Figure 1: Experiment design

After the clinical practice training ended, all three groups could use the simulator to ensure the same exposure and equal opportunity for all students before the examination. Finally, reflective conversations were held between the course lecturer and all three experiment groups. These conversations served to obtain thoughts, in a less formal (non-intimidating) setting, about how best to incorporate the simulator into the current teaching environment.

Quantitative results
There was no significant difference between the UEQ results of the two simulator student groups. Hence, we pooled the UEQ measurements into a single dataset (n=21). The Statistical Package for the Social Sciences (SPSS) Student Version 25 for Windows was used to calculate, analyse, and present the descriptive and inferential UEQ statistics. UEQ scores for the six dimensions used to measure the level of the students' EEG simulation experience were reported using the mean, median, and standard deviation (SD). The internal consistency of responses for each UEQ factor was determined with Cronbach's alpha, and the underlying variance in student responses across the 26 measured items was examined. Although it was not the intention of this study, we noticed some patterns along gender lines and set out to investigate them further. Admittedly, we cannot read too much into these results because of the low number of males in the study (indicative of the course gender profile), but we discuss some key points later in the article. The distribution of the data was checked for normality, and means were compared using the Mann-Whitney U test. Statistical significance was set at 0.05. Table 1 summarizes the pooled UEQ results.

Table 1: Quantitative results from the user experience questionnaire
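As an aside, the Cronbach's alpha that SPSS reports for each UEQ factor follows directly from its definition, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch with illustrative data (not the study's actual responses):

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for internal consistency.
    item_scores: one inner list per questionnaire item, each holding
    that item's score for every respondent (same respondent order)."""
    k = len(item_scores)        # number of items
    n = len(item_scores[0])     # number of respondents
    # Total score per respondent across all items
    totals = [sum(item[j] for item in item_scores) for j in range(n)]
    # Population variance is fine here: the n/(n-1) correction factors
    # cancel in the ratio of item variances to total variance.
    item_var = sum(pvariance(item) for item in item_scores)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Two perfectly correlated items yield the maximum alpha
print(cronbach_alpha([[1, 2, 3], [1, 2, 3]]))  # 1.0
```

Values above roughly 0.7 are conventionally taken to indicate acceptable reliability, which is why the 0.462 reported later for dependability is treated with caution.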

Qualitative results
The results from the diaries were analysed using Atlas.ti 8. A content analysis approach was used, applying inductive coding to construct a theory and deductive coding to test it [13]. Themes and codes were created on the following topics, linking them back to the UEQ: attractiveness, perspicuity, efficiency, dependability, stimulation, and novelty. The diaries also mentioned aspects related to learning with the simulator. This last topic includes codes about how to use the simulator in the course context and what subject matter would be good to know before using the simulator. Responses were independently reviewed and keywords colour-coded by one member of the research team (MB). Coding discrepancies were discussed with two more of the study researchers, and agreement was reached by consensus. The codes were placed into their respective categories and examined for the emergence of common themes. Table 2 lists the categories, themes, and codes with their relative groundedness from the qualitative analysis.

Table 2: Qualitative results from the user diaries

The students were most outspoken about learning with the simulator (groundedness = 86) and the perspicuity (groundedness = 79), or intuitiveness, of the simulator. The learning-with-simulator category contained an array of codes that all pointed to a positive experience and gave us cause for a deeper exploration into the student perspective on best practices for implementing such a simulator in a course. Perspicuity received much attention because there were some initial struggles to learn how to use the simulator. The discussion of this falls outside the scope of this study, and we will address simulator onboarding in the next development iteration of the simulator.
We transcribed and quantitized the reflective conversations held with all the students after the clinical practice. These conversations revealed that 17 of the 21 simulator students felt that the simulation supported the learned theory of EEG sensor placement, while 4 of the 12 control group students expressed that self-exploration had helped them. Moreover, 15 of the 21 simulator group students and 4 of the control group students expressed self-efficacy in setting up the electrodes during the practical session.

Combined findings analysis
Although we analyzed some of the themes and codes from the diaries and the respective UEQ categories along gender lines, we once again note the limitation posed by the low number of males in the study. The percentages given throughout our discussion were derived from Table 2 and indicate the proportions of male (out of 5) and female (out of 13) responses. Table 3 shows a summary of the response proportions per code.

Table 3: Male and female response proportions

After the first simulation, students found the use of the simulator challenging. By the end of the second and third sessions, they found the simulator familiar, and their degree of frustration had diminished enough for them to achieve a sense of flow [14]. Students thought that the simulator helped them learn the skill of placing electrodes on a skull and recognizing EEG waves. Students also felt that the simulator helped them remember the names of the electrodes, identify their placement, and measure the nasion-inion and pre-auricular reference points, guiding them in working with real patients in the practical session. The control group mentioned that they had difficulty remembering electrode names and skull electrode placement.
While some students rated the simulator as comparable to a real clinical situation, others felt that it did not offer enough realism.
"However, the human/patient's head cannot be turned in the same way, making a more limited turn more convenient." (S13)

"Still, turning the head was difficult and nerve-wracking." (S18)

The simulator nevertheless supported their learning of the EEG set-up theory and hands-on work. Initially, students felt that the simulator was complicated, but it became easier after the second session. Both males (60%) and females (69%) became familiar with the simulator, and both females (54%) and males (50%) appreciated the simulation positively. Females got more frustrated (92%) than males (60%).

"The feeling was that I could put electrodes on the head of a real person with a few games." (S7)
Male students rated the perspicuity (intuitiveness) of the simulator lower (2.50) than female students (3.97). When we asked the students about this, we learned that the males were less inclined to read the on-screen usage prompts during their initial simulator attempts. After several attempts, though, both women (69%) and men (60%) became familiar with the simulator.
"Today I played the electrode layout twice and it already went well." (S18)

Both male (80%) and female (92%) students thought that the simulator included clear guidelines. In the UEQ, dependability was significantly different between males (2.38) and females (3.25). However, the Cronbach's alpha (0.462) indicates that this measure is less reliable, possibly because of the small number of participants. Once the guidelines were acknowledged, students felt that they were in control of the simulated electrode placement, giving the simulator high dependability.
Regardless of gender, students thought the simulator was an excellent way to learn (value). Observation during the practical session and reflection conversations clearly indicated that the simulator groups more easily understood the practical and laboratory protocol and its relationship in clinical situations. Students who did not use the simulator needed more teacher guidance in clinical situations.
"I need teacher guidance a lot." (S25)

There was a significant (p = .031) difference between male and female students in how stimulating they found the simulator to use. 92% of women, compared to 60% of men, were not motivated to use the simulator. This was confirmed by the results of the UEQ questionnaire (Table 2). Once again, we attribute these negative sentiments to a first impression, because after the second and third simulator experiences, both male (40%) and female (77%) students felt motivated to reach the learning outcomes with the simulator.
After the first simulation session, half of the students thought that learning with the simulator was tremendous and that it added much value to the course.

"I felt that simulation helped to set up electrodes on the skull and gave the idea of it." (S4)

"The simulator taught a surprising amount about interpreting the curves, and in the end, it felt like the game was going well." (S7)
However, the work leading up to using the simulator came under criticism. Female students (54%) felt they needed theoretical knowledge (e.g., EEG wave recognition) before learning by simulation, compared to 20% of the males.
"Without an adequate theoretical basis, it was difficult to identify artifacts and waves from EEG curves." (S11)

During the reflective discussions, students also expressed that it would be useful to see an EEG registration in a clinical setting before learning with a simulator. The simulator was perceived as important before EEG registration in a clinical situation by 40% of male and 23% of female students, but it cannot substitute for hands-on training and actual clinical situations.
"Of course, the simulator is no substitute for hands-on training, but it could be a good addition to studying." (S12)

Discussion
This study intended to answer the question of how to best use an EEG simulator in BLS teaching practice. However, before recommending a suitable simulator teaching strategy, we wanted to be sure that the simulator in question was well-received by the student population of a BLS course on EEG knowledge and practical skills. We attempted to gain this understanding by quantitative and qualitative analyses of the student experiences with the simulator and how these shaped their perceived self-efficacy during a subsequent practical session. Our investigation showed various levels of appreciation for the simulator design and raised the notion that male and female students differed in their learning experiences with the simulator. Although gender-related differences in accepting a simulator teaching strategy were not the primary focus of this study, the evidence thereof compelled us to discuss some of these findings in more detail.

Simulator efficacy
For digital learning tools to be effective, they need to exhibit what is known as a state of flow for the learner. Flow theory states that learners are optimally immersed in their activity when the activity in question is challenging enough to not cause a constant sense of anxiety (frustration at not being able to complete a task or set of tasks) or boredom (irritation at continuously being able to complete the activity without some sense of challenge) [15].
After the first simulator lesson, students mentioned that using the simulator was challenging. In some cases, this resulted in students not enjoying the simulator. However, after the second and third simulations, the students were committed to the point where they immersed themselves in the simulator and got the most out of the learning experience. In the study, students were encouraged to revisit the simulator after the first session, but we understand that some students may have rejected the simulator experience because of their initially frustrating experience. This is part of consistent challenge identification and good simulator design. Johnson and Wiles (2003) explain that simulators should be challenging enough to match a learner's skill level and offer varying difficulty levels; the concept of challenge is again related to flow theory [16]. Care must be taken to avoid a steep learning curve when starting to use simulators in a teaching and learning environment. The reason for this caution is that students are not only learning a new subject or topic but are now expected to also learn to use a simulator, creating a double cognitive burden that can lead to overload and consequent frustration. Based on our experience, we conclude that the initial adverse effects of simulator-supplemented instruction can be reduced in two ways: (a) simplify the introduction of the simulator by gradually rolling out its features; or (b) prepare students sufficiently for the experience with theory lessons that support the operation of the simulator's virtual environment. Vorderer et al. (2003) similarly state that a lack of previous simulator or learning experience is likely to lead to feelings of frustration [17].
The qualitative results showed that clear guidelines had the third highest count in diaries, supporting that establishing prior knowledge or scaffolding the initial simulator encounter would lead to a more pleasant simulator introduction. The request for guidelines supports EEG-simulator dependability, which was evaluated significantly more positively by male students (Table 5c). Men and women have significantly different brain patterns behind the production of visually controlled movements, although both groups are equally adept at the task [18]. Our qualitative findings, likewise, indicate that the trustworthiness of a simulator among female students largely depends on understanding how to use the simulator and what a simulator is used for. Without trustworthiness (or dependability), students are unlikely to show a continued interest in using a simulator [19].
In addition to dependability, the hedonic quality (or stimulation) of the simulator was also evaluated significantly more positively by male students in the quantitative measurement (Table 5d) than by female students. This may be because men process visual stimuli differently and have a more acute ability to recognize their intended implication in training simulators, thereby more quickly acknowledging the value of the visuals and being stimulated by them [20]. Given that more than half of BLS students are female, the requirement for a softer introduction to the simulator is reiterated. Notwithstanding the gender difference in hedonic quality, the UEQ questionnaire showed that the average student did like the simulator's look and feel (Table 2), and this is further reflected in the qualitative results, where only one student indicated that the application/simulator is ugly. The fact that the simulator proved to be attractive goes a long way toward securing its value as a teaching tool. Students are less likely to be immersed by unappealing teaching aids [21].
Another factor that influences the appeal of a simulator is shown by the "Novelty" quality. The novelty measure of our EEG simulator showed that almost one-third of all students were not pleased while using the simulator. From the diary entries, we found that the primary reason students felt discontent with the simulator was based on the technical problems during the first simulation session relating to unrealistic virtual head movement and confusing feedback regarding the virtual electrode placement. Students expect simulators to have a high functional fidelity and if such expectations are not met, the overall experience (or novelty) of the simulator is lost [22]. With lecturer presence to explain and assist in the head movement and feedback, we were able to alleviate the initial resistance to the simulator enough that significant learning still took place [4].

Learning by simulation
All students said that they would prefer to learn some theory before using the simulator, as it would make them feel more prepared. This factor could also contribute to the frustration reported by the students during the first simulation session. Prior learning and knowledge are essential for simulation to support and teach skills [23], but so is enjoyability: for individuals to enjoy the simulation process, they need to feel capable [24]. Students enjoyed the realism of the EEG simulator scenario because they were able to apply the knowledge acquired in the course to clinical decision situations. They also benefited from the reflection they experienced. In a previous study, we found that the EEG simulator gave the greatest improvement in learning as assessed by pre- and post-test knowledge [4]. This met our expectation that the simulation targets Bloom's higher learning goals [25].

Influence of gender
Current research reports distinct gender differences in how males and females recognize objects and horizontal and vertical axes, which is reflected in this study in how the students perceived the simulator. Combined, the findings suggest that there are gender differences in observation simulators as well [26]. In the EEG simulator, students needed to place electrodes using x, y, and z coordinates; this may be why males perceived the simulator's measurement and electrode placement more positively than females did. Both males and females thought the simulator was challenging to use. Although the learning situation is complicated, effective debriefing is essential to guide learners as part of the learning process. Debriefing is never straightforward because it is a sensitive part of a simulation session: during that time, learners are taught to understand their weaknesses and what skills they need to develop, but also what aspects of professional practice they have learned appropriately. Not all instructors have this skill, regardless of their clinical expertise or knowledge of simulation techniques and processes [27].
An interesting point from the diaries after the study is that many of the students not only commented on the simulator, but also offered opinions on the sequence of our study process (Figure 2).

To improve learning with an EEG simulator, we propose to start with a high-level theory lesson as the preamble, where the use of the simulator as well as the theory related to the different electrodes, their placement, and EEG waveforms are explained. The simulator should then be available to students so they may learn the practical application of electrodes and reinforce their knowledge of EEG. After this first simulator session, students and teacher should engage in a debriefing session with a focused discussion on the simulator experience. The debriefing should be followed by an in-depth theory and practical session, which includes actual electrode placement. At this point, students should be tasked with comparing learning-with-simulator to actual placement. This promotes reflective learning practice to deepen understanding [28] of EEG electrode placement and different wave patterns. Students can now actively pursue simulator self-study to hone their skills before evaluation of their hands-on practice and identification of EEG patterns.

Conclusions
This study of students' experiences of the simulation learning process supports our hypothesis that students achieve better learning experiences with the simulator.
Gender only played a role in the experience of the look and feel of the simulation, but the results are inconclusive due to the unequal group sizes. This could be considered when developing a simulation. The outcome of the study is the proposed process (Figure 4), which is feasible for both genders. The created process improves the successful implementation of the simulation by providing a clear path, which additionally allows various learning styles and teaching approaches to be implemented.
This study also led to major changes in our pedagogical plan for implementing the simulation in the classroom.
The new pedagogical plan must be tested and further developed based on feedback once automated, immediate theory-based feedback is included in the simulator. A well-designed feedback system engages students more comprehensively with the content.

Author Contributions: Conceptualization, M.B. and W.R.; methodology, C.B.-R.; software, W.R.; validation, M.B. and W.R.; formal analysis, M.B.; investigation, M.B.; resources, M.B.; data curation, M.B.; writing-original draft preparation, M.B.; writing-review and editing, J.L.; visualization, C.B.-R.; supervision, T.K.; project administration, M.B.; funding acquisition, W.R. All authors have read and agreed to the published version of the manuscript.
Funding: There was no external funding for this research.