Article

Simulation Needs Assessment Project (SNAP): Use of the Borich Model in Undergraduate Medical Education

Samantha Wong, Bradson Serikawa, Meliza Roman, Nicole Hada, Jannet Lee-Jayaram and Benjamin W. Berg

1 John A. Burns School of Medicine, University of Hawai’i, Honolulu, HI 96813, USA
2 JABSOM Biostatistics Core Facility, John A. Burns School of Medicine, University of Hawai’i, Honolulu, HI 96813, USA
3 SimTiki Simulation Center, John A. Burns School of Medicine, University of Hawai’i, Honolulu, HI 96813, USA
* Authors to whom correspondence should be addressed.
Int. Med. Educ. 2025, 4(4), 42; https://doi.org/10.3390/ime4040042
Submission received: 30 August 2025 / Revised: 7 October 2025 / Accepted: 10 October 2025 / Published: 20 October 2025

Abstract

Manikin-based simulation is widely used in undergraduate medical education to develop clinical reasoning and communication skills. The Borich Needs Assessment Model has been applied in fields such as nursing and global health to identify gaps between perceived importance and performance, but it has not been used to evaluate simulation-based learning in undergraduate medical education. We applied the Borich model to assess student perceptions of competencies developed in an established simulation curriculum and to inform future simulation curriculum development. A cross-sectional survey was administered to first-, second-, and fourth-year medical students at the University of Hawaii John A. Burns School of Medicine. Students rated eight SNAP competencies for importance, self-reported performance, and perceived influence of simulation. Weighted discrepancy scores were calculated using the Borich model. Faculty completed a parallel survey to compare competency prioritization. Among 164 student respondents, all competencies were rated as highly important. The greatest performance and influence gaps were reported for “Apply knowledge covered in the unit or rotation to simulation cases” (MWDS = 1.37 and 1.61, respectively). Priorities varied by student year, and agreement between faculty and student rankings was limited. The findings highlight a perceived gap between simulation curriculum and knowledge application. The Borich model effectively identified performance gaps and can support targeted simulation curriculum refinement.

1. Introduction

Manikin-based simulation is a cornerstone of clinical skills training for healthcare professionals worldwide. Its effectiveness for enhancing clinical competence, procedural proficiency, teamwork, and communication has driven widespread adoption in undergraduate medical education (UME) [1,2,3,4,5,6,7,8]. A 2011 Association of American Medical Colleges (AAMC) survey found that roughly 84% of US medical schools had integrated simulation experiences into their curricula [9]. Since then, UME simulation integration has expanded, with up to 94% of interns reporting simulation during medical school [10]. However, the extent of simulation use, its educational objectives, and placement within the UME curriculum vary significantly across institutions [6,7,10,11]. While simulation is commonly used to enhance clinical skill development with experiential learning, little research has explored medical student (MS) priorities and preferences regarding simulation-based learning. Needs assessments in medical education help identify and prioritize instructional requirements, including the integration of e-learning [12,13,14]. While existing studies highlight the effectiveness of simulation in meeting the perceived learning needs of nursing students, research specific to UME students remains limited [7,15]. The aim of this study was to address this gap by investigating UME student-reported learning needs, self-reported key competency performance, and perceptions of the influence of the institution’s integrated simulation-based education on competency development. In addition, we sought to compare student-identified needs assessment priorities with simulation faculty-reported prioritization of educational needs and competency development.
At the University of Hawaii John A. Burns School of Medicine (JABSOM), manikin-based simulation is integrated throughout the four-year curriculum. First-year (MS1) and second-year medical students (MS2) participate in 3–4 manikin simulations annually, aligned with concurrent body-system-based problem-based learning (PBL) curricular coursework. For example, MS1s engage in traumatic pneumothorax and atrial fibrillation case management scenarios during the cardiac and pulmonary unit. Third-year (MS3) and fourth-year medical students (MS4) participate in scenario-based manikin simulations and skills training with task trainers 2–3 times per year, tailored to specialty clerkship content. For example, third-year pediatric clerkship students manage simulation scenarios involving febrile seizures and child-abuse-related head trauma. A comprehensive primary needs assessment of the overall UME simulation curriculum has not been conducted.
A curricular needs assessment systematically identifies gaps between current and desired educational outcomes and aids in focusing curriculum development and resource allocation to target these gaps [16,17]. The Borich Needs Assessment Model, initially developed for needs assessment in teacher training programs, has since been applied in other educational settings [17,18,19,20,21,22]. The Borich model’s core methodology is a discrepancy analysis, comparing students’ perceived importance of competencies with self-reported performance [17]. In this context, performance is defined as “the ability to accurately execute the behavior in a real or simulated environment in the presence of an observer” [17]. Competencies encompass “the knowledge, skills and attitudes needed to be successful” [17,23,24,25]. By ranking discrepancies between perceived importance and performance, the Borich model provides a data-driven framework for curriculum refinement.
The Borich model incorporates respondent perceptions of the importance and effectiveness of current curricular elements to derive a prioritized educational needs assessment list. It has been applied in healthcare education, almost exclusively in Asian countries [19,26]. Common UME needs assessment methods include the Delphi method, Harden’s 10-question framework, surveys, and focus groups [14,27]. The Borich model’s application to UME manikin-based simulation curricula has not been previously reported. This study reports the application of the Borich Needs Assessment Model to assess UME student-perceived learning needs and the influence of manikin-based simulation exercises on competency development, with the goal of identifying priorities for simulation curriculum improvement at JABSOM, a four-year Liaison Committee on Medical Education accredited medical school.

2. Materials and Methods

A single-institution curriculum Simulation Needs Assessment Project (SNAP) for manikin-based simulation in UME was conducted at JABSOM using the Borich model. This survey was originally implemented as part of internal program evaluation and quality improvement at the SimTiki Simulation Center. The data were collected anonymously, with no identifiable private information or linkage codes, and no interaction or intervention conducted for research purposes. This work was determined to be Not Human Subjects Research by the University of Hawaii Human Studies Program (Protocol 2025-00725). Participation was voluntary and anonymous.
The simulation curriculum competencies listed below were determined a priori by the institutional simulation center directors. These were derived as part of a pragmatic program evaluation and were intentionally aligned with both JABSOM graduation objectives and the AAMC Entrustable Professional Activities (EPAs), which provide a nationally recognized framework for UME competencies [28,29]. While no standardized list of competencies for manikin-based simulation curricula exists, this mapping ensured that competencies selected were educationally relevant, feasible to teach and assess in simulation, and consistent with broader professional expectations. The competencies were not externally validated by expert consensus, and we recognize this as a limitation.
SNAP Competencies and Framework Mapping:
  • Application of knowledge covered in the unit/rotation to simulation cases—an overarching simulation curricular goal.
  • Interpretation of clinically relevant information—EPA 3, EPA 5, EPA 10; JABSOM objective: interpret diagnostic tests.
  • Formulation of a prioritized differential diagnosis—EPA 2; JABSOM objective: develop differential diagnoses.
  • Identification of the different roles and responsibilities in a healthcare team—EPA 9; JABSOM objective: apply principles of interprofessional team-based care.
  • Effective communication with healthcare team members—EPA 9; JABSOM objective: communicate with and educate peers.
  • Application of technical skills to simulation cases—EPA 12; JABSOM objective: perform routine procedural skills.
  • Medical management decision-making—EPA 5; JABSOM objective: develop and implement treatment plans.
  • Functioning effectively in an acute clinical setting—EPA 10; JABSOM objective: recognize and manage patients requiring urgent/emergent care.
SNAP competencies were presented in an online 32-item REDCap™ survey [30,31]. Students rated each competency on a 5-point scale (1 = low to 5 = high) in three domains: perceived importance rating (ImR), perceived current self-performance rating (PR), and influence rating (InR), capturing the perceived influence of the curricular simulation experience on competency performance.
The Borich model was used to prioritize competencies for curricular needs assessment based upon student rankings. Each competency was rated individually by students; the scores were then totaled, and a mean rating value was determined for each competency within a domain. Performance discrepancy scores (PDSs) for each competency were calculated for each participant by subtracting the PR from the ImR. Similarly, influence discrepancy scores (InDSs) for each competency were calculated for each student by subtracting the InR from the ImR. Weighted discrepancy scores (WDSs) for each competency were calculated for each respondent by multiplying the PDS by the mean ImR. Mean weighted discrepancy scores (MWDSs) for each competency were derived by averaging the WDSs across all respondents. Borich model assessments that use a five-point scale typically yield MWDSs ranging from −20 to +20 [25,32]. Competencies were ranked from highest to lowest MWDS to determine priority areas for simulation curriculum improvement.
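For concreteness, the scoring arithmetic above can be expressed in a few lines of R, the language used for this study's analyses. This is a minimal sketch, not the authors' code; the data frame and its column names (competency, ImR, PR) are hypothetical.

```r
library(dplyr)

# Hypothetical long-format responses: one row per respondent x competency,
# with 1-5 ratings for perceived importance (ImR) and self-performance (PR).
responses <- data.frame(
  competency = rep(c("Apply knowledge", "Interpret information"), each = 3),
  ImR        = c(5, 4, 5, 4, 5, 4),
  PR         = c(3, 4, 4, 4, 4, 3)
)

priorities <- responses |>
  group_by(competency) |>
  mutate(
    PDS = ImR - PR,               # performance discrepancy per respondent
    WDS = PDS * mean(ImR)         # weighted by the competency's mean ImR
  ) |>
  summarise(MWDS = mean(WDS)) |>  # mean weighted discrepancy score
  arrange(desc(MWDS))             # highest MWDS = highest curricular priority

print(priorities)
```

With five-point scales, PDS ranges from −4 to +4 and the mean ImR from 1 to 5, so WDS and MWDS are bounded by ±20, consistent with the range noted above.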
Survey data was collected anonymously, targeting a total of 233 undergraduate MS1s, MS2s, and MS4s between March 2024 and June 2024. A convenience sampling method was used, which excluded MS3s due to logistical factors. The students completed the survey before entering the final manikin simulation session of the academic year. Faculty who served as pre-clerkship and clerkship course directors within the Office of Medical Education and assigned learning community mentors were invited by email to complete an anonymous survey, ranking the importance of the eight SNAP competencies for MS1s, MS2s, and MS4s. As the purpose of this parallel faculty survey was to compare faculty and student perceptions and rankings of competency importance, faculty not involved in simulation were excluded due to potential limitations in their firsthand knowledge regarding the competencies that simulation is designed to address.
Statistical analyses were performed using R software (version 4.4.0) and the R packages “dplyr,” “gtsummary,” “kableExtra,” “ltm,” “psych,” and “stats.” Descriptive statistics, including the mean, standard deviation, minimum, and maximum, were calculated for InR, PR, and ImR. Paired t-tests were used to analyze discrepancies between InR and both PR and ImR. Kendall’s tau coefficient was used to assess agreement between the faculty and student rankings of competencies for each MS class. A Cronbach’s alpha coefficient of 0.97 indicated high internal consistency of the student questionnaire. This high value is consistent with the instrument’s design, as the eight competencies are intended to measure aspects of a single construct (overall medical student competence in the simulation curriculum) rather than distinct constructs.
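A minimal sketch of how these tests can be run in R follows; the rating vectors, rank orders, and item matrix are invented for illustration and are not the study data.

```r
# Paired t-test on hypothetical importance vs. performance ratings
# for one competency (same respondents in both vectors).
ImR <- c(5, 4, 5, 4, 5, 4, 3, 5)
PR  <- c(4, 4, 4, 3, 4, 4, 3, 4)
t.test(ImR, PR, paired = TRUE)

# Kendall's tau between hypothetical faculty and student rank orders
# of the eight SNAP competencies.
faculty_rank <- c(1, 4, 5, 2, 3, 7, 8, 6)
student_rank <- c(1, 2, 3, 8, 5, 6, 7, 4)
cor.test(faculty_rank, student_rank, method = "kendall")

# Cronbach's alpha across the eight competency items;
# 'items' is a hypothetical respondents x items matrix of ratings.
set.seed(1)
items <- as.data.frame(matrix(sample(3:5, 160, replace = TRUE), ncol = 8))
psych::alpha(items)   # ltm::cronbach.alpha(items) is an alternative
```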

3. Results

Demographics

Data was collected from 164 out of the total 233 MSs: 37.8% (n = 62) MS4s, 35.4% (n = 58) MS2s, and 26.8% (n = 44) MS1s. The respective response rates were 81.6%, 69.9%, and 55.7%. Of the 164 student participants, three had incomplete responses for the PR questions and were excluded from those specific analyses, resulting in a sample size of n = 161 for analyses related to perceived performance. The faculty survey response rate was 44% (31/70). Five faculty who reported not teaching with simulation were excluded from the analysis.
Students across all year groups rated all eight manikin-based simulation SNAP competencies as highly important. The mean ImR ranged narrowly from 4.26 to 4.41. The standard deviation for all competencies was <1. The students ranked “Apply knowledge covered in the unit/rotation to simulation cases” as the most important competency (mean ImR = 4.41), while “Apply technical skills to simulation cases (e.g., intubation, chest compressions, defibrillation, airway management)” was ranked as the least important competency (mean ImR = 4.26). The mean PR ranged from 3.98 to 4.20, indicating the students’ self-rated competency performance in the medium to high range. Significant differences were found between students’ perceived importance and self-reported performance ratings for all eight competencies (p < 0.001). The performance MWDSs ranged from 0.75 to 1.37. The Borich model MWDS results indicated that the largest gap between students’ self-rated competency performance and perceived competency importance was for SNAP competency #1, “Apply knowledge covered in the unit/rotation to simulation cases” (MWDS = 1.37; 95% CI of the ImR-PR difference, 0.21–0.41). The smallest gap was reported for SNAP competency #4, “Identify the different roles and responsibilities in a healthcare team” (MWDS = 0.75; 95% CI, 0.07–0.27). These results are summarized in Table 1.
There were significant differences between students’ ImR and their InR, the rated influence of manikin-based simulation on competency performance, for all eight competencies (p ≤ 0.014). The influence MWDSs ranged from 0.55 to 1.61. The mean InR ranged from 4.04 to 4.18. Students reported the greatest influence of simulation on “Make medical management decisions (e.g., administering medications, starting CPR, etc.)” (influence MWDS = 0.55; 95% CI, 0.03–0.23) and the lowest influence on “Apply knowledge covered in the unit/rotation to simulation cases” (influence MWDS = 1.61; 95% CI, 0.25–0.48); a smaller influence MWDS indicates a smaller gap between importance and perceived influence. These results are summarized in Table 2.
Faculty (n = 26) ranked “Applying knowledge covered in the unit/rotation to simulation cases” as the most important competency for the MS1s and MS2s, while “Apply technical skills to simulation cases” was the lowest priority competency. For MS4s, “Interpret clinically relevant information” was rated the most important and “Identify the different roles and responsibilities in a healthcare team” was rated the least important. Faculty consistently ranked “Apply technical skills to simulation cases” among the lowest priorities for all years. These results are summarized in Table 3. For the MS1 class, a Kendall’s tau of 0.500 (p = 0.109) indicated a positive, moderate association between faculty and student rankings. However, this association was not statistically significant. For the MS2 class, a Kendall’s tau of 0.143 (p = 0.719) showed a weak, positive association that was not statistically significant. A Kendall’s tau of −0.214 (p = 0.548) indicated a weak, negative association for the MS4 class, suggesting a low level of disagreement that was also not statistically significant.
MS1s reported the largest performance MWDSs for “Apply knowledge covered in the unit/rotation to simulation cases”, “Interpret clinically relevant information”, and “Form a prioritized differential diagnosis.” MS2s and MS4s reported their largest performance gaps for different competencies, including “Apply technical skills to simulation cases” and “Ability to function effectively in an acute clinical setting.” MS4s ranked “Make medical management decisions” more highly than MS1s and MS2s. Class rankings of performance MWDSs are summarized in Table 4.
MS Class Rankings of InDS by MWDS
Across all years, students reported the two largest influence MWDS rankings for “Apply knowledge covered in the unit/rotation to simulation cases,” “Communicate effectively with healthcare team members,” and “Interpret clinically relevant information.” Class rankings for influence MWDSs are summarized in Table 5. MS1s had similar rankings for both performance and influence MWDS.

4. Discussion

The Borich model needs assessment of the simulation-based UME curriculum in this study identified the ability to “Apply knowledge covered in the unit/rotation to simulation cases” as a high priority for improvement. Overall, the students indicated there was a gap between their performance in this competency and its perceived importance. The students also reported that their experience in the simulation curriculum had the least influence on their performance in this competency. Notably, all eight competencies had a mean ImR > 4, suggesting that JABSOM students perceived all eight competencies to be of relatively high importance. These findings support the general applicability and relevance of our simulation-based curriculum and offer a focus for revision and improvement. The highest-priority gaps identified here suggest that simulation program modifications should include a careful review of individual scenario design, with attention to improving alignment with unit-based content, including PBL cases.
Despite highly valuing the ability to apply learned knowledge to simulation cases, the students still felt unable to do so, and believed that the manikin simulation curriculum had a relatively small impact on this skill. The original Borich paper suggests that “when a competency is highly valued but poorly performed, training may have been insufficient rather than ineffective” [17].
Considering the potential for “insufficient” training as a cause of the identified gaps, we note that JABSOM students engage in manikin-based experiences only 2–4 times per year, which may be too little exposure for students to perceive any influence on their performance. Previous studies indicate that students desire more simulation-based experiences (SBEs) in medical school [33,34,35,36,37]. Takayesu et al. describe a simulation curriculum in which 200 students at Harvard Medical School engaged in high-fidelity simulations with five simulators in five integrated teaching labs over the course of 4 h [37]. Frequency is not mentioned, and detailed descriptions of manikin-based simulation curricula at other undergraduate medical institutions are lacking. Given the constraints of faculty availability and a full UME curriculum, increasing simulation exposure may not be feasible [38]. Instead, faculty should clarify how simulation integrates and aligns with other learning modalities to manage student expectations. Additionally, communicating explicit connections from classroom and PBL curricula to the simulation lab may enhance students’ perception of its influence. These previously identified barriers and strategies are relevant to our setting and offer methods for revising the curriculum based on the findings of our Borich analysis.
A subgroup analysis revealed that students’ highest-priority competencies varied by class year, reflecting differences in training focus. MS1s ranked “Applying knowledge covered in the unit/rotation to simulation cases” as the highest-priority competency, likely due to the MS1 curricular preclinical focus on didactics, PBL, and preparation for the United States Medical Licensing Exam Step 1. MS2s prioritized “Ability to function effectively in an acute clinical setting,” possibly reflecting a focus on and concerns regarding the approaching transition to MS3 clinical clerkships [39]. MS4s ranked “Apply technical skills to simulation cases” as their top priority, likely due to the impending transition to residency with skills competencies required for intern year success [40]. Although not stratified by year, previous studies confirm that the elements of SBE most valued by medical students are “learning a new skill under supervision,” “applying prior knowledge to a clinical scenario,” and “identifying gaps in knowledge/skill” [33]. Previous studies have found significant differences between preclinical and clinical student satisfaction levels in SBE; preclinical-phase students reported more favorable perceptions of SBE than clinical-phase students [35,41]. While differences in undergraduate MS SBE satisfaction ratings between preclinical and clinical years have been reported, the evolution of MS preferences over time has not [34]. Paskins et al. surveyed 246 MSs and found that year level and experiences during medical school did not affect students’ perceived utility of SBE [34]. Understanding these shifting priorities can help educators to tailor simulation curricula to align with students’ developmental stages.
In this study, the faculty and students differed in their rankings of competency priorities. Faculty rankings tended to agree with those of MS1s and MS2s but diverged from those of MS4s. While we did not explore the reasons underlying these differences, we hypothesize that faculty may have prioritized competencies based on the RIME framework, which emphasizes student progression from “reporters” to “interpreters” [42]. In contrast, students may have ranked competencies based on personal interests or specialty-specific skills relevant to the intern year [43] and chosen residency specialty [40]. Disparities between faculty and student views of simulation curricula have been reported previously: Nuzhat et al. found that 54.7% of students but 100% of faculty agreed that simulation’s role in the undergraduate curriculum was the integration of basic sciences and clinical application [44]. Addressing these discrepancies by aligning faculty and student expectations and priorities could enhance the educational effectiveness of simulation-based sessions [45].
The narrow range of mean ImR and MWDS values suggests that students generally perceive JABSOM’s manikin-based simulations as addressing core competencies effectively. Even within this narrow range, the Borich model detected small but statistically significant differences between competencies, demonstrating the utility of the model in identifying curricular priorities.
The Borich model has previously been applied in healthcare education needs assessment, predominantly in Asian countries. Ziwei et al. utilized the model to identify priority areas for improving undergraduate nursing student palliative care competencies at a single university in China [19]. The highest priority competency identified was interdisciplinary team cooperation [19]. A similar approach was applied in South Korea to assess core nursing skill competency, comparing the future importance of virtual reality training with their current competency [46]. The study identified invasive nursing procedures, such as enema administration and intramuscular injection, as key areas for virtual-reality-based training. The Borich model has also been used to assess broader and more current medical education needs. A study of 678 MSs across South Korean institutions applied the model to evaluate post-COVID-19-era global health competencies [47]. Other applications of the Borich model in South Korea include needs assessments for dental hygienists, community visiting nurses, new intensive care unit nurses, and nurse educators [18,26,48]. These studies demonstrate the versatility of the Borich model in evaluating existing curricula, guiding the development of new educational programs and assessing evolving competency needs in healthcare education. Its straightforward methodology makes it a valuable tool for identifying priorities in a broad range of healthcare training contexts.
A limitation of this study is its generalizability, as it was conducted at a single institution with a relatively small sample size, exploring SNAP competencies developed locally by simulation center faculty and aligned with institutional objectives and AAMC EPAs, but not externally validated through expert consensus. However, as a pragmatic program evaluation, the use of locally derived competencies was appropriate for identifying institution-specific priorities, and the lack of external validation should be considered in interpreting the findings. A convenience sampling method was used with the goal of surveying the entire population of accessible students in the institution. The absence of MS3s limits insights into competency priorities during early clerkships. Although response and self-selection bias are present in survey data, this method allowed us to gain insight into the perceived needs of our target population [49]. To mitigate the influence of any single simulation on responses, data were collected immediately before students’ last simulation of the academic year, weeks or months after each student’s most recent simulation experience. We did not collect data that would allow a deeper understanding of the reasons underlying students’ perceived needs. The narrow range of mean ImR and MWDS values could also indicate the presence of acquiescence bias, straight-lining, response style, or social desirability bias in our study, which are common issues with self-reported measures [50]. Other potential unmeasured influences include lack of engagement or attention during survey completion or a survey design resulting in repetitive responses. Despite these concerns, previous studies similarly found that students rate SBE experiences highly and express high satisfaction with simulation curricula [33,35,37,51,52]. The Borich methodology has an inherent risk of individual bias, as it relies only on subjective, self-reported data. Our study modified the Borich model by assessing students’ perception of curriculum influence on their performance of the competencies. InR, by our definition, is not synonymous with “Perception of Skill Attainment” as described in the original Borich model. However, we used it in this context to derive a perceived InDS, which could affect the validity of our needs assessment model. The survey maintained excellent internal consistency and reliability despite these modifications. Lastly, drawing comparative conclusions from our data is limited due to the sparsity of data on the structure of manikin-based simulation curricula at other institutions and because the study relied on subjective self-reported student ratings without corroboration against objective performance measures.

5. Conclusions

This is, to our knowledge, the first report of the Borich model’s use in UME manikin-based simulation curriculum assessment. Follow-up studies with more diverse cohorts of MSs could address whether this model provides a reliable framework for assessing SBE more broadly in UME. Comparative studies between UME institutions could help identify program models and strategies that may be most effective for student learning. Future studies may consider incorporating open-ended responses to questionnaires inspired by the Borich model to provide insight into the rationale behind student ratings and to guide faculty curriculum evaluation and revision. Based on our single-site, self-reported data, the Borich model appears to be a practical and easily applied approach for identifying relative priorities in manikin-based simulation curricula, though its effectiveness should be interpreted cautiously until validated against objective performance measures.

Author Contributions

Conceptualization: S.W., M.R., J.L.-J. and B.W.B.; methodology: S.W., M.R., J.L.-J. and B.W.B.; software: M.R.; validation: M.R.; formal analysis: M.R.; investigation: S.W., B.S., N.H., J.L.-J. and B.W.B.; data curation: M.R.; writing—original draft preparation: S.W., M.R. and N.H.; writing—review and editing: B.S., J.L.-J. and B.W.B.; visualization: B.S. and M.R.; supervision: J.L.-J. and B.W.B.; project administration: J.L.-J. and B.W.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

This study did not meet institutional definitions of Human Research. Participation was voluntary and consent was not required.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author(s).

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
AAMC: Association of American Medical Colleges
ImR: Perceived Importance Rating
InDS: Influence Discrepancy Score
InR: Perceived Influence Rating
JABSOM: John A. Burns School of Medicine
MImR: Mean Importance Rating
MS: Medical Student
MS1: First-year Medical Student
MS2: Second-year Medical Student
MS3: Third-year Medical Student
MS4: Fourth-year Medical Student
MWDS: Mean Weighted Discrepancy Score
PDS: Performance Discrepancy Score
PR: Perceived Performance Rating
SBE: Simulation-based Education
SNAP: Simulation Needs Assessment Project
UME: Undergraduate Medical Education
WDS: Weighted Discrepancy Score

References

  1. Okuda, Y.; Bryson, E.O.; DeMaria, S.; Jacobson, L.; Quinones, J.; Shen, B.; Levine, A.I. The Utility of Simulation in Medical Education: What Is the Evidence? Mt. Sinai J. Med. J. Transl. Pers. Med. 2009, 76, 330–343. [Google Scholar] [CrossRef] [PubMed]
  2. McGaghie, W.C.; Issenberg, S.B.; Petrusa, E.R.; Scalese, R.J. A critical review of simulation-based medical education research: 2003–2009: Simulation-based medical education research 2003–2009. Med. Educ. 2010, 44, 50–63. [Google Scholar] [CrossRef] [PubMed]
  3. McGaghie, W.C.; Issenberg, S.B.; Cohen, E.R.; Barsuk, J.H.; Wayne, D.B. Does Simulation-Based Medical Education with Deliberate Practice Yield Better Results Than Traditional Clinical Education? A Meta-Analytic Comparative Review of the Evidence. Acad. Med. 2011, 86, 706–711. [Google Scholar] [CrossRef]
  4. McGaghie, W.C.; Issenberg, S.B.; Petrusa, E.R.; Scalese, R.J. Revisiting ‘A critical review of simulation-based medical education research: 2003–2009’. Med. Educ. 2016, 50, 986–991. [Google Scholar] [CrossRef]
  5. Nara, N.; Beppu, M.; Tohda, S.; Suzuki, T. The Introduction and Effectiveness of Simulation-based Learning in Medical Education. Intern. Med. 2009, 48, 1515–1519. [Google Scholar] [CrossRef]
  6. Diaz-Navarro, C.; Armstrong, R.; Charnetski, M.; Freeman, K.J.; Koh, S.; Reedy, G.; Smitten, J.; Ingrassia, P.L.; Matos, F.M.; Issenberg, B. Global consensus statement on simulation-based practice in healthcare. Adv. Simul. 2024, 9, 19. [Google Scholar] [CrossRef]
  7. Scerbo, M.W. Healthcare Simulation Training Guidelines and Literature Reviews from the Third Society for Simulation in Healthcare Research Summit. Simul. Healthc. 2024, 19, S1–S3. [Google Scholar] [CrossRef]
  8. Issenberg, S.B.; McGaghie, W.C.; Petrusa, E.R.; Lee Gordon, D.; Scalese, R.J. Features and uses of high-fidelity medical simulations that lead to effective learning: A BEME systematic review. Med. Teach. 2005, 27, 10–28. [Google Scholar] [CrossRef]
  9. Passiment, M.; Sacks, H.; Huang, G. Medical Simulation in Medical Education: Results of an AAMC Survey; Association of American Medical Colleges: Washington, DC, USA, 2011; Available online: https://www.aamc.org/media/22586/download (accessed on 5 January 2025).
  10. Campbell, K.K.; Wong, K.E.; Kerchberger, A.M.; Lysikowski, J.; Scott, D.J.; Sulistio, M.S. Simulation-Based Education in US Undergraduate Medical Education: A Descriptive Study. Simul. Healthc. 2023, 18, 359–366. [Google Scholar] [CrossRef]
  11. Huang, G.C.; Sacks, H.; DeVita, M.; Reynolds, R.M.; Gammon, W.M.; Saleh, M.; Gliva-McConvey, G.; Owens, T.M.; Anderson, J.P.; Stillsmoking, K.P.; et al. Characteristics of Simulation Activities at North American Medical Schools and Teaching Hospitals: An AAMC-SSH-ASPE-AACN Collaboration. Simul. Healthc. 2012, 7, 329–333. [Google Scholar] [CrossRef] [PubMed]
  12. Elendu, C.; Amaechi, D.C.; Okatta, A.U.; Amaechi, E.C.M.; Elendu, T.C.B.; Ezeh, C.P.M.; Elendu, I.D.B. The impact of simulation-based training in medical education: A review. Medicine 2024, 103, e38813. [Google Scholar] [CrossRef]
  13. Gordon, C.J.; Ryall, T.; Judd, B. Simulation-based assessments in health professional education: A systematic review. J. Multidiscip. Healthc. 2016, 9, 69–82. [Google Scholar] [CrossRef]
  14. Farhadi, Z.; Rezaei, E.; Bazrafkan, L.; Amini, M.; Sanaiey, N.Z.; Barati-Boldaji, R.; Mehrabi, M. Need assessment of medical school curriculum for MOOCs: Perspectives of instructors and students of Shiraz University of Medical Sciences. BMC Med. Educ. 2024, 24, 141. [Google Scholar] [CrossRef]
  15. Badowski, D.; Rossler, K.L.; Reiland, N. Exploring student perceptions of virtual simulation versus traditional clinical and manikin-based simulation. J. Prof. Nurs. 2021, 37, 683–689. [Google Scholar] [CrossRef]
  16. Cuiccio, C.; Husby-Slater, M. Needs Assessment Guidebook: Supporting the Development of District and School Needs Assessments. Published Online May 2018. Available online: https://eric.ed.gov/?id=ED606124 (accessed on 5 January 2025).
  17. Borich, G.D. A Needs Assessment Model for Conducting Follow-Up Studies. J. Teach. Educ. 1980, 31, 39–42. [Google Scholar] [CrossRef]
  18. Kim, D.; Kim, H.; Ko, Y. Analysis of Educational Needs of Home Care Nurses: Utilizing Borich’s Needs Assessment and the Locus for Focus Model. Res. Community Public Health Nurs. 2024, 35, 240. [Google Scholar] [CrossRef]
  19. Ziwei, K.; Mengjiao, C.; Yongjie, Z.; Mengqi, Z.; Yeqin, Y. Optimizing palliative care education through undergraduate nursing students’ perceptions: Application of importance-performance analysis and Borich needs assessment model. Nurse Educ. Today 2023, 122, 105719. [Google Scholar] [CrossRef] [PubMed]
  20. Park, K.H.; Song, M.K. Priority Analysis of Educational Needs of Forest Healing Instructors Related to Programs for Cancer Survivors: Using Borich Needs Assessment and the Locus for Focus Model. Int. J. Environ. Res. Public Health 2022, 19, 5376. [Google Scholar] [CrossRef]
  21. Chen, Y.; Wang, Y.; Yi, T.; Hu, Y.; Qi, Y.; Xie, Z.; Xia, L.; Dong, C. Priority analysis of educational needs related to geriatric nursing competence among Chinese undergraduate nursing students: Application of Borich needs assessment, importance-performance analysis and locus for focus model. Nurse Educ. Pract. 2025, 83, 104253. [Google Scholar] [CrossRef]
  22. Lee, Y.; Jeong, S.; Cho, D. Assessing adult and continuing education needs in South Korea metropolitan areas using Borich’s needs assessment model. Eur. J. Train. Dev. 2021, 45, 832–844. [Google Scholar] [CrossRef]
  23. Zahid Iqbal, M.; Alam Malik, S.; Khan, R.A. Answering the journalistic six on the training needs assessment of pharmaceutical sales representatives: Comparative perspectives of trainers and trainees. Int. J. Pharm. Healthc. Mark. 2012, 6, 71–96. [Google Scholar] [CrossRef]
  24. McClelland, D.C. Testing for competence rather than for “intelligence”. Am. Psychol. 1973, 28, 1. [Google Scholar] [CrossRef]
  25. Caillouet, O.; Harder, A. Conducting the Needs Assessment #8: The Borich Model: WC415/AEC754, 4/2022. EDIS 2022. [Google Scholar] [CrossRef]
  26. Shin, S.; Hong, E.; Do, J.; Lee, M. An analysis of the educational needs priorities for clinical nurse educators: Utilizing the Borich needs assessment and the locus for focus model. J. Korean Acad. Soc. Nurs. Educ. 2023, 29, 405–414. [Google Scholar] [CrossRef]
  27. Ahmed, Y.A.; Alneel, S. Analyzing the Curriculum of the Faculty of Medicine, University of Gezira using Harden’s 10 questions framework. J. Adv. Med. Educ. Prof. 2017, 5, 60–66. [Google Scholar] [PubMed]
  28. Obeso, V.; Brown, D.; Phillipi, C. Core Entrustable Professional Activities for Entering Residency; Association of American Medical Colleges: Washington, DC, USA, 2017. [Google Scholar]
  29. Menon, V.; Bhoja, R.; Reisch, J.; Kosemund, M.; Hogg, D.; Ambardekar, A. Acquisition of Teamwork and Communication Skills Using High-Technology Simulation for Preclerkship Medical Students. Simul. Healthc. 2021, 16, e181–e187. [Google Scholar] [CrossRef]
  30. Harris, P.A.; Taylor, R.; Minor, B.L.; Elliott, V.; Fernandez, M.; O’Neal, L.; McLeod, L.; Delacqua, G.; Delacqua, F.; Kirby, J.; et al. The REDCap consortium: Building an international community of software platform partners. J. Biomed. Inform. 2019, 95, 103208. [Google Scholar] [CrossRef]
  31. Harris, P.A.; Taylor, R.; Thielke, R.; Payne, J.; Gonzalez, N.; Conde, J.G. Research electronic data capture (REDCap)—A metadata-driven methodology and workflow process for providing translational research informatics support. J. Biomed. Inform. 2009, 42, 377–381. [Google Scholar] [CrossRef]
  32. Narine, L.; Harder, A. Comparing the Borich model with the Ranked Discrepancy Model for competency assessment: A novel approach. Adv. Agric. Dev. 2021, 2, 96–111. [Google Scholar] [CrossRef]
  33. Ensor, N.; Sivasubramaniam, M.; Laird, A.J.; Roddis, B.; Qin, K.R.; Pacilli, M.; Nestel, D.; Nataraja, R.M. Medical students’ experiences and perspectives on simulation-based education. Int. J. Healthc. Simul. 2024, 1–9. [Google Scholar] [CrossRef]
  34. Paskins, Z.; Peile, E. Final year medical students’ views on simulation-based teaching: A comparison with the Best Evidence Medical Education Systematic Review. Med. Teach. 2010, 32, 569–577. [Google Scholar] [CrossRef]
  35. Naggar, M.; Almaeen, A. Student’s perception towards medical-simulation training as a method for clinical teaching. J. Pak. Med. Assoc. 2020, 70, 618–623. [Google Scholar] [CrossRef]
  36. Agha, S.; Alhamrani, A.Y.; Khan, M.A. Satisfaction of medical students with simulation based learning. Saudi Med. J. 2015, 36, 731–736. [Google Scholar] [CrossRef] [PubMed]
  37. Takayesu, J.K.; Farrell, S.E.; Evans, A.J.; Sullivan, J.E.; Pawlowski, J.B.; Gordon, J.A. How Do Clinical Clerkship Students Experience Simulator-Based Teaching? A Qualitative Analysis. Simul. Healthc. 2006, 1, 215–219. [Google Scholar] [CrossRef] [PubMed]
  38. Heitz, C.; Eyck, R.; Smith, M.; Fitch, M. Simulation in Medical Student Education: Survey of Clerkship Directors in Emergency Medicine. West. J. Emerg. Med. 2011, 12, 455–460. [Google Scholar] [CrossRef] [PubMed]
  39. Sarikaya, O.; Civaner, M.; Kalaca, S. The anxieties of medical students related to clinical training: Anxieties of Medical Students. Int. J. Clin. Pract. 2006, 60, 1414–1418. [Google Scholar] [CrossRef]
  40. Andrews, M.A.; Paolino, N.D.; DeZee, K.J.; Hemann, B. Perspective of the Graduating Medical Student: The Ideal Curriculum for the Fourth Year of Undergraduate Medical Education. Mil. Med. 2016, 181, e1455–e1463. [Google Scholar] [CrossRef]
  41. Mujamammi, A.H.; Alqahtani, S.A.; Alqaidi, F.A.; A Alharbi, B.; Alsubaie, K.M.; E Alhaisoni, F.; Sabi, E.M. Evaluation of Medical Students’ Satisfaction With Using a Simulation-Based Learning Program as a Method for Clinical Teaching. Cureus 2024, 16, e59364. [Google Scholar] [CrossRef]
  42. Pangaro, L. A new vocabulary and other innovations for improving descriptive in-training evaluations. Acad. Med. 1999, 74, 1203–1207. [Google Scholar] [CrossRef]
  43. Eva, K.W.; Munoz, J.; Hanson, M.D.; Walsh, A.; Wakefield, J. Which Factors, Personal or External, Most Influence Students’ Generation of Learning Goals? Acad. Med. 2010, 85, S102–S105. [Google Scholar] [CrossRef]
  44. Nuzhat, A.; Salem, R.O.; Al Shehri, F.N.; Al Hamdan, N. Role and challenges of simulation in undergraduate curriculum. Med. Teach. 2014, 36 (Suppl. 1), S69–S73. [Google Scholar] [CrossRef]
  45. Larsen, D.P.; Wesevich, A.; Lichtenfeld, J.; Artino, A.R.; Brydges, R.; Varpio, L. Tying knots: An activity theory analysis of student learning goals in clinical education. Med. Educ. 2017, 51, 687–698. [Google Scholar] [CrossRef] [PubMed]
  46. Jeong, E.; Lim, J. An Analysis of Priorities in Developing Virtual Reality Programs for Core Nursing Skills: Cross-sectional Descriptive Study Using the Borich Needs Assessment Model and Locus for Focus Model. JMIR Serious Games 2022, 10, e38988. [Google Scholar] [CrossRef]
  47. Kim, S.; Kyung, S.Y.; Park, I.B.; Yune, S.J.; Park, K.H. Analysis of the perceptions, competencies, and educational needs for global health among Korean medical students. Korean J. Med. Educ. 2024, 36, 1. [Google Scholar] [CrossRef] [PubMed]
  48. Han, Y.K.; Yeo, A.N. Analysis of Needs for Clinical Dental Hygienist’s Performances Using Borich Needs Assessment and the Locus for Focus Model. J. Dent. Hyg. Sci. 2023, 23, 1–12. [Google Scholar] [CrossRef]
  49. McCawley, P. Methods for Conducting an Educational Needs Assessment: Guidelines for Cooperative Extension System Professionals; University of Idaho Extension: Moscow, ID, USA, 2009. [Google Scholar]
  50. Paap, K.R.; Anders-Jefferson, R.T.; Balakrishnan, N.; Majoubi, J.B. The many foibles of Likert scales challenge claims that self-report measures of self-control are better than performance-based measures. Behav. Res. Methods 2023, 56, 908–933. [Google Scholar] [CrossRef]
  51. Kodikara, K.G.; Karunaratne, W.C.D.; Chandratilake, M.N. High fidelity simulation in undergraduate medical curricula: Experience of fourth year medical students. South-East Asian J. Med. Educ. 2020, 13, 25. [Google Scholar] [CrossRef]
  52. Weller, J.M. Simulation in undergraduate medical education: Bridging the gap between theory and practice. Med. Educ. 2004, 38, 32–38. [Google Scholar] [CrossRef]
Table 1. Student ImR and PR ratings of SNAP competencies.

| SNAP Competency | Mean Student ImR (SD) 1 | Mean Student PR (SD) 1 | t | p-Value 2 | 95% CI Lower | 95% CI Upper | Borich MWDS 3 |
|---|---|---|---|---|---|---|---|
| #1 Apply knowledge covered in the unit/rotation to simulation cases | 4.41 (0.71) | 4.10 (0.79) | 5.95 | <0.001 | 0.21 | 0.41 | 1.37 |
| #8 Ability to function effectively in an acute clinical setting | 4.35 (0.75) | 4.05 (0.83) | 4.98 | <0.001 | 0.18 | 0.42 | 1.30 |
| #6 Apply technical skills to simulation cases (intubation, chest compressions, defibrillation, airway management, etc.) | 4.26 (0.85) | 3.98 (0.90) | 4.65 | <0.001 | 0.16 | 0.40 | 1.19 |
| #5 Communicate effectively with healthcare team members (not including patient) | 4.41 (0.76) | 4.17 (0.77) | 5.32 | <0.001 | 0.15 | 0.34 | 1.08 |
| #7 Make medical management decisions (administering medications, starting CPR, etc.) | 4.30 (0.77) | 4.06 (0.79) | 4.11 | <0.001 | 0.13 | 0.36 | 1.05 |
| #2 Interpret clinically relevant information (history, physical exam, vitals, labs, imaging, etc.) | 4.33 (0.74) | 4.10 (0.76) | 4.55 | <0.001 | 0.13 | 0.33 | 1.00 |
| #3 Formulate a prioritized differential diagnosis | 4.29 (0.80) | 4.08 (0.77) | 3.91 | <0.001 | 0.11 | 0.32 | 0.92 |
| #4 Identify the different roles and responsibilities in a healthcare team | 4.36 (0.77) | 4.20 (0.78) | 3.36 | <0.001 | 0.07 | 0.27 | 0.75 |

Borich model priorities for ImR and PR (n = 161). Note: The sample size for perceived performance questions is n = 161 due to incomplete responses from three participants. 1 Standard deviation. 2 Paired t-test. 3 Ranked Borich MWDS for each SNAP competency.
Table 2. Student ImR and InR ratings of SNAP competencies.

| SNAP Competency | Mean Student ImR (SD) 1 | Mean Student InR (SD) 1 | t | p-Value 2 | 95% CI Lower | 95% CI Upper | Borich MWDS |
|---|---|---|---|---|---|---|---|
| #1 Apply knowledge covered in the unit/rotation to simulation cases | 4.41 (0.71) | 4.04 (0.82) | 6.45 | <0.001 | 0.25 | 0.48 | 1.61 |
| #5 Communicate effectively with healthcare team members | 4.41 (0.76) | 4.16 (0.79) | 4.67 | <0.001 | 0.14 | 0.36 | 1.10 |
| #2 Interpret clinically relevant information | 4.33 (0.74) | 4.12 (0.90) | 3.96 | <0.001 | 0.11 | 0.32 | 0.92 |
| #8 Ability to function effectively in an acute clinical setting | 4.35 (0.75) | 4.14 (0.80) | 4.34 | <0.001 | 0.11 | 0.30 | 0.90 |
| #4 Identify the different roles and responsibilities in a healthcare team | 4.36 (0.77) | 4.16 (0.77) | 3.71 | <0.001 | 0.09 | 0.30 | 0.85 |
| #3 Formulate a prioritized differential diagnosis | 4.29 (0.80) | 4.10 (0.82) | 3.67 | <0.001 | 0.09 | 0.30 | 0.84 |
| #6 Apply technical skills to simulation cases | 4.26 (0.85) | 4.09 (0.80) | 3.32 | 0.00111 | 0.07 | 0.28 | 0.75 |
| #7 Make medical management decisions | 4.30 (0.77) | 4.18 (0.85) | 2.50 | 0.01353 | 0.03 | 0.23 | 0.55 |

Borich model priorities for ImR and InR (n = 164). 1 Standard deviation. 2 Paired t-test.
Table 3. Faculty importance ranking of SNAP competencies for MS classes.

| Faculty Importance Ranking | MS1 Class | MS2 Class | MS4 Class |
|---|---|---|---|
| 1 | #1 Apply knowledge covered in the unit/rotation to simulation cases | #1 Apply knowledge covered in the unit/rotation to simulation cases | #2 Interpret clinically relevant information |
| 2 | #4 Identify different roles and responsibilities in a healthcare team | #2 Interpret clinically relevant information | #5 Communicate effectively with healthcare team members |
| 3 | #5 Communicate effectively with healthcare team members | #5 Communicate effectively with healthcare team members | #8 Ability to function effectively in an acute clinical setting |
| 4 | #2 Interpret clinically relevant information | #3 Formulate a prioritized differential diagnosis | #7 Make medical management decisions |
| 5 | #3 Formulate a prioritized differential diagnosis | #4 Identify different roles and responsibilities in a healthcare team | #3 Formulate a prioritized differential diagnosis |
| 6 | #7 Make medical management decisions | #8 Ability to function effectively in an acute clinical setting | #1 Apply knowledge covered in the unit/rotation to simulation cases |
| 7 | #8 Ability to function effectively in an acute clinical setting | #7 Make medical management decisions | #6 Apply technical skills to simulation cases |
| 8 | #6 Apply technical skills to simulation cases | #6 Apply technical skills to simulation cases | #4 Identify different roles and responsibilities in a healthcare team |
Table 4. Medical student class rankings of performance discrepancy score (PDS) by MWDS.

| MWDS Ranking | MS1 PDS Rankings (n = 44) | MS2 PDS Rankings (n = 56) | MS4 PDS Rankings (n = 61) |
|---|---|---|---|
| 1 | #1 Apply knowledge covered in the unit/rotation to simulation cases | #8 Ability to function effectively in an acute clinical setting | #6 Apply technical skills to simulation cases |
| 2 | #2 Interpret clinically relevant information | #1 Apply knowledge covered in the unit/rotation to simulation cases | #8 Ability to function effectively in an acute clinical setting |
| 3 | #3 Formulate a prioritized differential diagnosis | #6 Apply technical skills to simulation cases | #7 Make medical management decisions |
| 4 | #8 Ability to function effectively in an acute clinical setting | #2 Interpret clinically relevant information | #1 Apply knowledge covered in the unit/rotation to simulation cases |
| 5 | #5 Communicate effectively with healthcare team members | #5 Communicate effectively with healthcare team members | #5 Communicate effectively with healthcare team members |
| 6 | #6 Apply technical skills to simulation cases | #7 Make medical management decisions | #4 Identify different roles and responsibilities in a healthcare team |
| 7 | #7 Make medical management decisions | #4 Identify different roles and responsibilities in a healthcare team | #3 Formulate a prioritized differential diagnosis |
| 8 | #4 Identify different roles and responsibilities in a healthcare team | #3 Formulate a prioritized differential diagnosis | #2 Interpret clinically relevant information |

MS rankings of PDS by MWDS. Note: The sample size for perceived performance questions is n = 161 due to incomplete responses from three participants.
Table 5. Medical student class rankings of InDS by MWDS.

| MWDS Ranking | MS1 InDS Rankings (n = 44) | MS2 InDS Rankings (n = 58) | MS4 InDS Rankings (n = 62) |
|---|---|---|---|
| 1 | #1 Apply knowledge covered in the unit/rotation to simulation cases | #2 Interpret clinically relevant information | #1 Apply knowledge covered in the unit/rotation to simulation cases |
| 2 | #2 Interpret clinically relevant information | #1 Apply knowledge covered in the unit/rotation to simulation cases | #5 Communicate effectively with healthcare team members |
| 3 | #5 Communicate effectively with healthcare team members | #5 Communicate effectively with healthcare team members | #6 Apply technical skills to simulation cases |
| 4 | #3 Formulate a prioritized differential diagnosis | #4 Identify different roles and responsibilities in a healthcare team | #3 Formulate a prioritized differential diagnosis |
| 5 | #8 Ability to function effectively in an acute clinical setting | #8 Ability to function effectively in an acute clinical setting | #8 Ability to function effectively in an acute clinical setting |
| 6 | #4 Identify different roles and responsibilities in a healthcare team | #3 Formulate a prioritized differential diagnosis | #7 Make medical management decisions |
| 7 | #6 Apply technical skills to simulation cases | #6 Apply technical skills to simulation cases | #4 Identify different roles and responsibilities in a healthcare team |
| 8 | #7 Make medical management decisions | #7 Make medical management decisions | #2 Interpret clinically relevant information |
