Article

Innovative Hands-On Approach for Magnetic Resonance Imaging Education of an Undergraduate Medical Radiation Science Course in Australia: A Feasibility Study

1 Curtin Medical School, Curtin University, GPO Box U1987, Perth, WA 6845, Australia
2 Curtin Medical Research Institute (Curtin MRI), Faculty of Health Sciences, Curtin University, GPO Box U1987, Perth, WA 6845, Australia
3 Western Australia National Imaging Facility Node, The University of Western Australia (UWA), 35 Stirling Highway, Crawley, WA 6009, Australia
4 Siemens Healthcare Pty Ltd., 355 Scarborough Beach Road, Osborne Park, WA 6017, Australia
5 UWA Medical School, The University of Western Australia (UWA), 35 Stirling Highway, Crawley, WA 6009, Australia
6 Department of Radiology, Royal Perth Hospital, Victoria Square, Perth, WA 6000, Australia
* Author to whom correspondence should be addressed.
Educ. Sci. 2025, 15(7), 930; https://doi.org/10.3390/educsci15070930
Submission received: 6 June 2025 / Revised: 11 July 2025 / Accepted: 19 July 2025 / Published: 21 July 2025
(This article belongs to the Special Issue Technology-Enhanced Nursing and Health Education)

Abstract

As yet, no study has investigated the use of a research magnetic resonance imaging (MRI) scanner to support undergraduate medical radiation science (MRS) students in developing their MRI knowledge and practical skills (competences). The purpose of this study was to test an innovative program for a total of 10 second- and third-year students of an MRS course to enhance their MRI competences. The study involved an experimental, two-week MRI learning program which focused on practical MRI scanning of phantoms and healthy volunteers. Pre- and post-program questionnaires and tests were used to evaluate the participants' competence development as well as the program's educational quality. Descriptive statistics, along with Wilcoxon signed-rank and paired t-tests, were used for statistical analysis. The program significantly improved the participants' self-perceived and actual MRI competences (from an average of 2.80 to 3.20 out of 5.00, p = 0.046; and from an average of 34.87% to 62.72%, Cohen's d effect size: 2.53, p < 0.001, respectively). Furthermore, they rated all aspects of the program's educational quality highly (mean: 3.90–4.80 out of 5.00) and indicated that the program was extremely valuable, very effective, and practical. Nonetheless, given the study's small sample size and participant selection bias, further evaluation should be conducted in a broader setting with a larger sample to validate the findings of this feasibility study.

1. Introduction

Magnetic resonance imaging (MRI) is one of the most advanced medical imaging (MI) modalities (Robinson et al., 2022). Unlike most other MI modalities, it does not pose any risk of ionizing radiation while still providing excellent performance in soft tissue visualization, comprehensive evaluation of the physiological status of body structures, and functional imaging (Baker et al., 2024; Imura et al., 2021; Mittendorff et al., 2022; Ng, 2022b; Rocca et al., 2024). Over the last decade, the use of MRI has increased significantly (Martella et al., 2023). This growth can be attributed to the increasing recognition of MRI's clinical and diagnostic advantages among patients, healthcare professionals, and researchers (Bor et al., 2021).
In Australia, MRI examinations have increased by 112% in the last decade, highlighting the importance of MRI education for medical radiation practitioners (MRPs) (Martella et al., 2023; Mittendorff et al., 2022). However, a review paper published in 2017 demonstrated that the traditional approach to MRI education within undergraduate medical radiation science (MRS) courses, with its focus on MRI theory, is insufficient for future MRPs to become competent in MRI scanning, even with substantial experiential learning within an MRI environment after graduation (Westbrook, 2017). In 2019, the Australian registering body, the Medical Radiation Practice Board of Australia (MRPBA), recognized this issue and changed its MRI-related professional capability requirements for Australian MRPs from demonstrating understanding of MRI principles and clinical applications to actually performing MRI where it forms part of an MRP's practice (MRPBA, 2013, 2020). This change is of paramount importance, given the inherent safety risks to patients and staff, such as hearing loss, thermal burn, and projectile injury caused by the three electromagnetic fields used for image acquisition, namely the time-varying gradient field, the radiofrequency field, and the strong static magnetic field, respectively (Baker et al., 2024; Dewland et al., 2013; Fortier et al., 2025; Mittendorff et al., 2022). A recent review article also underscored the importance of improving MRI education for MRPs, as the number of MRI safety incidents continues to rise (Mittendorff et al., 2022).
Nonetheless, placement opportunities for pre-registration MRS students to obtain hands-on experience are limited (Chaka & Hardy, 2021; Hazell et al., 2020; Vestbøstad et al., 2020). To address this issue, the use of computer-based simulation (CBS) has become popular in MRS education, especially since the COVID-19 pandemic (Chaka & Hardy, 2021; Chau & Arruzza, 2023; Chau et al., 2022; Hasoomi et al., 2024; Jimenez et al., 2023; Ng, 2022a; Vestbøstad et al., 2020). Although several recent literature reviews acknowledge the benefits of CBS, such as supporting students' pre-clinical skills development and potentially reducing clinical placement hours, CBS should be considered a complement to, rather than a replacement for, clinical placements (Chaka & Hardy, 2021; Chau et al., 2022; Jimenez et al., 2023; Ng, 2022a). The findings of these review articles are in line with the recommendation of the World Health Organization (WHO) that both simulation-based learning and hands-on experience from clinical placements are essential for healthcare professional education (WHO, 2011). However, the level of fidelity in simulation-based learning influences students' learning outcomes: multiple meta-analyses of nursing studies involving 10 to 352 participants have demonstrated that high-fidelity simulation has large positive effects across all domains of competence development (psychomotor, cognitive, and affective) (Kim et al., 2016; La Cerra et al., 2019).
As yet, only one literature review has specifically covered CBS for MRI education in MRS (Cataldo et al., 2023; Chaka & Hardy, 2021). The review by Chaka and Hardy (2021) indicated that CBS was little used in MRI education to meet the increasing demand for MRI examinations and improve the standard of practice, as only one original study was identified and included in that review (Elshami & Abuzaid, 2017). The MRI simulator used in that original study only simulated an MRI system console, allowing undergraduate students to set scanning protocols and initiate scans; it did not simulate other important tasks in MRI examinations, such as patient preparation and positioning (Elshami & Abuzaid, 2017). In 2020, Cooley et al. (2020) developed a low-cost (<USD 10,000) tabletop MRI scanner with a 10 mm field of view to provide a high-fidelity simulated learning experience for undergraduate MRI education. However, the tabletop scanner could not fully simulate all aspects of MRI examinations, e.g., patient safety check, preparation, and positioning.
In 2023, Colucci et al. (2023) demonstrated a novel hands-on approach to training radiology residents in MRI physics and scanning, with statistically significant improvement of residents' MRI competences upon training completion. Their findings can be explained by Knowles' (1990) adult learning theory, which suggests that learning is more effective for adults when they recognize the relevance of knowledge and skills to real-world situations and perceive them as directly applicable to their professional duties. In the same year, the imaging platform of the Western Australia (WA) National Imaging Facility (NIF) Node was expanded with a high-specification, research-only MAGNETOM Vida 3.0 T scanner (Siemens Healthineers, Forchheim, Germany) (NIF, 2025c). As the only research-dedicated MRI scanner in WA, this equipment created a unique opportunity to pilot Colucci et al.'s (2023) novel hands-on approach at the WA NIF for Curtin University's Bachelor of Science (BSc) (MRS) students, after they had completed the MRI theoretical component of the four-year course, to further develop their MRI competences, including knowledge and practical skills (NIF, 2025c). According to the International Nursing Association for Clinical Simulation and Learning (INACSL) Standards of Best Practice, simulation-based learning experiences should be pilot tested with measurable objectives prior to full implementation (INACSL Standards Committee, 2016). The purpose of this study was therefore to pilot an innovative program for second- and third-year students of the MRS course to enhance their MRI competences. It was hypothesized that the program would improve the undergraduate students' MRI competences, with positive student perception of this initiative.

2. Materials and Methods

This was an experimental study based upon the methodology developed by Colucci et al. (2023). Ten second- and third-year MRS students who had completed the MRI theoretical component of Curtin University's BSc (MRS) course were recruited to participate in the hands-on MRI learning activities through a two-week, full-day summer vacation scholarship program administered by Curtin Medical School (scholarship amount: ~USD 900/student). Students who had failed the instrumentation units (subjects) covering the MRI theoretical component, or who had a ferromagnetic implant, were pregnant or breastfeeding, or had undergone prior surgery, were excluded. These exclusion criteria were established to ensure that the participants had sufficient MRI knowledge and could be considered healthy volunteers who were safe to scan (Medicines and Healthcare Products Regulatory Agency, 2021). The required sample size was calculated using Equation (1) (Kadam & Bhalerao, 2010):
$$ n = \frac{2\left(Z_{\alpha} + Z_{1-\beta}\right)^{2}\sigma^{2}}{\Delta^{2}} \quad (1) $$
where Zα is 1.96 for a two-tailed test with a significance level of 0.05; Z1−β is 0.8416 for a power of 80%; estimated σ is 0.15; and the estimated effect size Δ was 0.22, based on the similar study by Colucci et al. (2023).
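As a quick check of this arithmetic, the following minimal Python sketch (an illustration only, not part of the study's materials) reproduces Equation (1) with the parameter values stated above:

```python
from scipy.stats import norm

# Parameter values as stated in the text (Kadam & Bhalerao, 2010)
alpha = 0.05   # two-tailed significance level
power = 0.80   # desired statistical power (1 - beta)
sigma = 0.15   # estimated standard deviation
delta = 0.22   # estimated effect size (Colucci et al., 2023)

z_alpha = norm.ppf(1 - alpha / 2)  # 1.96 for a two-tailed test
z_beta = norm.ppf(power)           # 0.8416 for 80% power

n = 2 * (z_alpha + z_beta) ** 2 * sigma ** 2 / delta ** 2
print(f"Required sample size: n = {n:.1f}")  # n = 7.3, rounded up to 8
```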
Based on this calculation, it was determined that at least 8 students (n = 7.3, rounded up) were required for the pre- and post-program evaluation. This study was conducted in accordance with the Declaration of Helsinki and approved by the Curtin University Human Research Ethics Committee (approval number: HRE2024-0674; date of approval: 21 November 2024). The following information was provided in the participant information statement to inform potential participants of the risks associated with participating in the program. Written informed consent was obtained from all participants.
“There are no foreseeable physical risks (including projectile injury, implant displacement, acoustic trauma and magnet quench causing asphyxiation) from this research project (as all MRI scans will be conducted as per the WA NIF Node’s standard operating procedures and you have already completed MRI safety training as part of the MRI theoretical component of Curtin University’s Bachelor of Science (MRS) program, enabling you to observe all safety requirements at all times). You will be asked to complete the attached WA NIF Node MRI safety questionnaire which the WA NIF Node MRI personnel will thoroughly check to ensure you do not have any metal on you before you enter the scanning room.
There might be 2 potential psychological risks associated with the MRI scanning of yourself and your peers.
  • Incidental findings observed during the MRI scanning. This might cause potential stress to respective participants due to health and privacy concerns. You should see your GPs [general practitioners] if you suspect having any medical diseases or conditions. In case of any potentially significant findings observed during the MRI scanning, we will advise you to approach your GPs according to Ahpra’s [Australian Health Practitioner Regulation Agency’s] requirements. When you are scanned by your peers, they will be able to see MRI images of your body, resulting in privacy concern. However, all participants are registered as students with MRPBA and required to observe its privacy and confidentiality requirements during the MRI scanning. It is not uncommon for healthcare professionals to provide clinical services to those who they know in clinical practice, e.g., healthcare professionals visiting their staff clinics run by their colleagues for medical services, etc. Participant numbers (codes) instead of names will be entered into the MRI scanner for initiating the scans to further address any privacy concerns.
  • Triggering claustrophobia when being scanned by your peers. However, you can use the MRI call button provided to terminate the scan at any point.”

2.1. MRI Learning Program

The two-week program, which took place from 3 to 14 February 2025, comprised two parts. At the beginning of Part 1, a briefing about the program activities was provided to all participants to meet INACSL Standards of Best Practice Criterion 7, "Begin simulation-based experiences with a prebriefing" (INACSL Standards Committee, 2016). They were then required to complete a pre-program survey regarding their background and self-perceived MRI competences, as well as a one-hour MRI knowledge test. As per Colucci et al.'s (2023) study setting, they were not informed about the pre-program knowledge test before it was administered. After the pre-program test, they completed two prescribed readings (self-study) within the remaining time of the first week of the program.
These two readings combined text, graphical illustrations, and MRI system snapshots, which could promote more effective learning as per the multimedia principle of Mayer's (2002) theory of multimedia learning. Also, all participants were encouraged to approach the research team member (a physicist responsible for teaching the MRI theoretical component of the BSc (MRS) course) for any MRI learning support during that week. These arrangements supported the participants in preparing effectively for the second week's activities and maximizing their outcomes, in line with the INACSL Standards of Best Practice for simulation-based learning (INACSL Standards Committee, 2016). No further test was administered after the participants completed the self-study in the first week.
Table 1 shows the timetable for Part 2 of the program, which took place during the second week at the WA NIF. Part 2 included multiple sessions to review the content covered in the prescribed readings, as well as the participants' prior learning within the BSc (MRS) course. However, its emphasis was on hands-on MRI scanning of phantom objects (fruits and vegetables) and of participants by their peers (other participants) under the guidance of an MRI radiologist, radiographers, and scientists, giving a complete experience of MRI scanning from both the MRP and patient perspectives. The research-only MAGNETOM Vida 3.0 T scanner used in the program had a 70 cm bore, XT gradients (60 mT/m amplitude and 200 T/m/s slew rate per axis), and XA50 software. The MRI scans covered the abdomen, ankle, heart, hip, knee, liver, lumbosacral spine, and vascular system. No contrast agent (dye) was used for any human MRI scan. These scanning activities allowed every participant to gain practical experience in undertaking all steps required to complete MRI examinations (such as patient safety check, preparation, positioning, and discharge, as well as scanning protocol selection, initiation, and image processing) and to fully understand the situations faced by MRI patients, which should lead to participants providing better care for patients in the future. On the final day of the program, the participants were asked to complete the post-program survey and test (duration: 1 h) to evaluate the efficacy of the training (Colucci et al., 2023). These arrangements were made based on the INACSL Standards of Best Practice for simulation-based learning (INACSL Standards Committee, 2016).

2.2. Pre- and Post-Program Questionnaires and Tests

The pre- and post-program questionnaires and tests were developed based on the respective data collection instruments of Colucci et al. (2023) to evaluate the program in accordance with the INACSL Standards of Best Practice for simulation-based learning (INACSL Standards Committee, 2016). The pre-program questionnaire had four questions: two multiple choice questions (MCQs) collecting participants' background information; one 5-point scale question about their self-perceived MRI competences (1 = nearly nothing; 2 = a few concepts but still very confusing; 3 = understand the basics but more needs to be learnt; 4 = enough understanding for undertaking routine MRI scans; 5 = advanced understanding for managing all MRI scans including troubleshooting); and one open question about their needs and expectations of the program as well as any comments/suggestions. The post-program questionnaire had 10 questions: nine 5-point scale questions evaluating the educational quality of the program and self-perceived MRI competences, and one open question for further comments/suggestions. Appendix A.1 and Appendix A.2 show sample questions from the pre- and post-program questionnaires, respectively.
The pre- and post-program test questions were identical (38 questions in total: 35 MCQs and 3 short answer questions). They were developed to assess the participants' actual competences rather than self-perception. Nine of the questions were adapted from Colucci et al.'s (2023) tests (which comprised 25 pre- and 40 post-program questions, with 25 questions common to both tests). Unlike Colucci et al.'s (2023) tests, which predominantly focused on MRI physics, nearly half (17) of this study's test questions emphasized the clinical practice of MRI, and our tests contained 13 more questions than their pre-program test. Appendix B.1 and Appendix B.2 show sample physics and instrumentation questions and clinical practice questions from our tests, respectively. Each test question was worth one mark. Prior to administration, our pre- and post-program questionnaires and tests were piloted with eight experts (three MRI radiographers from different clinical centers, including public and private settings, and five MRI scientists from the Siemens Healthineers (Australia and New Zealand) Scientific Team) who were not directly involved in this study. They were asked to assess the content validity of the instruments, resulting in several revisions to ensure that the tools were capable of measuring participants' competences and perceptions of the program (Ng et al., 2024).

2.3. Data Analysis

SPSS Statistics 30 (International Business Machines Corporation, New York, NY, USA) was used for statistical analysis. Descriptive statistics (percentage frequency for MCQs; and mean, median, minimum, maximum, and standard deviation (SD) for 5-point scale questions) were employed to analyze the questionnaire and test data, to evaluate the educational quality of the program as well as the participants' MRI competences. Median values of the self-perceived MRI competences from the pre- and post-program questionnaires were compared using Wilcoxon signed-rank tests. Average overall marks of the pre- and post-program tests were compared using a paired t-test. In addition, subgroup analyses for second- and third-year students, for those with and without an MRI/physics background, and for physics and instrumentation versus clinical practice questions were conducted with Holm–Bonferroni correction. These median and mean comparisons were used to identify any statistically significant improvement of the participants' MRI competences after the program. A p-value < 0.05 represented statistical significance. The effect size of the MRI competence change (pre- and post-program test mark difference) was determined based on Cohen's d, with >0.2, >0.5, >0.8, and >1.3 indicating small, medium, large, and very large effects (i.e., magnitudes of change), respectively (Colucci et al., 2023; Sullivan & Feinn, 2012). The open questions of the questionnaires were analyzed using content analysis, with quasi-statistics as an accounting system, to illustrate any additional insights (Ng et al., 2024).
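For illustration, the sketch below reproduces this analysis sequence in Python with SciPy, using hypothetical score vectors, since the study's per-participant data are not publicly available; the holm_bonferroni helper is an illustrative implementation of the correction, not the SPSS routine actually used:

```python
import numpy as np
from scipy.stats import wilcoxon, ttest_rel

# Hypothetical data for ten participants (not the study's actual scores)
pre_scale = np.array([3, 2, 3, 3, 2, 3, 3, 3, 3, 3])   # 5-point self-ratings
post_scale = np.array([3, 3, 4, 4, 3, 3, 3, 4, 3, 2])
pre_test = np.array([30.0, 28, 40, 35, 33, 38, 31, 36, 42, 36])   # marks (%)
post_test = np.array([60.0, 55, 68, 63, 58, 66, 59, 64, 70, 64])

# Wilcoxon signed-rank test for the ordinal 5-point scale responses
w_stat, w_p = wilcoxon(pre_scale, post_scale)

# Paired t-test for the overall test marks
t_stat, t_p = ttest_rel(post_test, pre_test)

# Cohen's d for paired data: mean difference / SD of the differences
diff = post_test - pre_test
cohens_d = diff.mean() / diff.std(ddof=1)

def holm_bonferroni(p_values):
    """Holm-Bonferroni step-down adjusted p-values, in the original order."""
    p = np.asarray(p_values, dtype=float)
    order = np.argsort(p)
    m = len(p)
    adjusted = np.empty(m)
    running_max = 0.0
    for rank, idx in enumerate(order):
        running_max = max(running_max, (m - rank) * p[idx])
        adjusted[idx] = min(running_max, 1.0)
    return adjusted

# e.g., correcting a family of three subgroup p-values
print(w_p, t_p, cohens_d, holm_bonferroni([0.010, 0.030, 0.040]))
```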

3. Results

3.1. Participants' Background, Needs, and Expectations

Twenty-three expressions of interest in joining the hands-on MRI learning program were received from the BSc (MRS) students (second-year: n = 12, course-weighted average (CWA) mean: 78.2%, range: 63.5–83.5%; third-year: n = 11, CWA mean: 79.1%, range: 72.6–85.7%). Ten students (second-year: 40%; third-year: 60%) were selected for participation. All but one were high-achieving students (CWA > 80%). Seventy percent of participants did not have any MRI/physics background before studying the BSc (MRS) course. The other three (30%) had worked as MRI radiographer assistants in clinical centers during their undergraduate studies. Table 2 shows the participants' perceived needs and expectations of the program prior to their participation. At least 50% of the participants indicated that they expected to obtain hands-on experience in MRI scanning parameter selection and performing scans, as well as image interpretation. These were their top three needs as well.

3.2. Participants’ Competence Improvement

In this program, each participant performed between five and nine MRI scans and was scanned by their peers a similar number of times. Table 3 illustrates the participants' self-perceived MRI competences and overall test marks (actual competences) before and after the program. Both the participants' self-perceived and actual competences (overall test marks) improved significantly after the program. The average self-perceived competence increased by 0.40 (from 2.80 to 3.20, i.e., from approaching the level of "understand the basics but more needs to be learnt" to slightly beyond that level, towards "enough understanding for undertaking routine MRI scans"). The mean overall test mark increased from 34.87% (fail) to 62.72% (pass), a mean difference of 27.85% with a very large Cohen's d effect size of 2.53. Although none of the subgroup analyses of self-perceived competences showed a significant increase, all subgroup analyses of test marks showed significant increases, with greater actual competence improvements for second-year participants and for those with MRI assistant work experience (mean differences: 32.24% and 28.22%; Cohen's d effect sizes: 3.47 and 4.73, respectively). Figure 1 shows an interval plot of the test mark mean differences (actual competence improvements) for all participants and the various subgroups.
Table A1 and Table A2 show the participants' statistically significant test mark improvements after the program for both the physics and instrumentation and the clinical practice sections (p < 0.001–0.037), except for the physics and instrumentation section for the participants with MRI radiographer assistant work experience (p = 0.076). The clinical practice section mark increases (32.68–40.44%; Cohen's d effect sizes: 2.29–5.53) were greater than those for the physics and instrumentation questions (17.86–27.98%; Cohen's d effect sizes: 1.15–2.57). Table A1 demonstrates that the second-year participants had the greatest physics and instrumentation section improvement (27.98%; Cohen's d effect size: 2.57). Table A2 illustrates that the program had the greatest positive impact, in terms of Cohen's d effect size for clinical practice question mark improvement (5.53), on the participants with MRI assistant work experience.

3.3. Program’s Educational Quality Evaluation

Table 4 illustrates the participants' post-program responses to the 5-point scale questions regarding the program's educational quality. They had very positive perceptions of all aspects of the program. Apart from question 5 of the post-program questionnaire, the mean scores of all questions were 3.90–4.80 out of 5.00. For question 5, every participant indicated that the level of supervision of the program was perfect (mean: 3.00; SD: 0.00).
The participants' open comments on the program were positive (Table 5). Half of the participants indicated that their needs and expectations were at least met, and that the program was extremely valuable, very effective, and practical. Nonetheless, the majority (70%) suggested that more time should be allocated for learning from the MRI radiographers. Several (30%) recommended consolidating the existing program activities in order to include more in the future. Figure 2 and Figure 3 show example phantom and peer scanning sessions of the MRI learning program, respectively.

4. Discussion

To the best of our knowledge, this was the first study to use a research MRI scanner to provide undergraduate MRS students with hands-on MRI experience for further developing their competences, including knowledge and practical skills. Although Colucci et al. (2023) indicated that institutions might face challenges in accessing research MRI facilities for implementing such a novel hands-on MRI learning program, our pilot experience has demonstrated the feasibility of such access through the WA NIF (Department of Education, Australian Government, 2024; NIF, 2025a). In Australia, pre-registration MRS education is available in the Australian Capital Territory (ACT), New South Wales, Queensland, South Australia, Victoria, and Western Australia (Australian Health Practitioner Regulation Agency, 2024). This highlights the potential for other Australian undergraduate MRS course providers, except the one in the ACT, to follow our approach in implementing similar hands-on MRI learning programs utilizing the NIF nodes in their states (Australian Health Practitioner Regulation Agency, 2024; NIF, 2025b).
Our study design resembled that of Colucci et al. (2023) in terms of its small sample size, range of MRI learning activities, use of a research MRI scanner, data collection tools, and analysis strategies. Comparing this study's findings with theirs reveals very similar outcomes. For example, their residents' mean confidence (self-perceived competence) improved from 2.86 to 3.33, while ours increased from 2.80 to 3.20. Also, their average overall test mark (actual competence) increased from 50.00% to 72.00% (difference: 22.00%; Cohen's d effect size: 1.29), while ours improved from 34.87% to 62.72% (difference: 27.85%; Cohen's d effect size: 2.53). The confidence and competence improvements demonstrated in the two studies were all statistically significant (Colucci et al. (2023): p = 0.01 and <0.001, respectively; ours: p = 0.046 and <0.001, respectively).
The main difference between this study and Colucci et al.'s (2023) novel MRI learning program was that theirs was designed for postgraduate senior resident training in radiology, whereas ours was aimed at senior undergraduate MRS students. Another major difference was that about half of our test questions covered clinical practice, whereas theirs focused on physics, as clinical MRI practice was already part of their formal residency training. According to the Medical Radiation Practice Accreditation Committee (2019), clinical competence assessment should include direct observation of performance, which was absent from our program. However, our pre- and post-program test design, including the question types, aligned with the American Board of Radiology's (2018) qualifying (core) exam requirements for clinical competence assessment.
Although Colucci et al. (2023) did not conduct any subgroup analysis of their third- and fourth-year residents' test marks, they anticipated that their third-year residents would show greater improvement, as their fourth-year residents had more practice experience. Our findings (Table 3, Table A1 and Table A2, and Figure 1) support this expectation, with second-year students' mean overall test, physics and instrumentation, and clinical practice question mark increases of 32.24%, 27.98%, and 40.44%, with Cohen's d effect sizes of 3.47, 2.57, and 2.30, respectively. Additionally, Table A2 illustrates that our program had the greatest positive impact on the participants with MRI work experience for handling the clinical practice questions (Cohen's d effect size: 5.53). This finding aligns with Knowles' (1990) adult learning theory, in which adults are more motivated to learn when factors such as career alignment and prior experiences support their learning (Ng et al., 2008). Furthermore, our participants demonstrated greater mark improvements in clinical practice questions than in physics and instrumentation questions (Table A1 and Table A2). This highlights the strength of our program's emphasis on clinical practice. Moreover, this was expected, as even radiology residents perceived physics and instrumentation as challenging (Colucci et al., 2023; Day et al., 2022).
Our program's educational quality evaluation results are very similar to those of Colucci et al. (2023), with only two items having mean scores below 4.00 (Table 4). Participants in both studies indicated that the level of supervision in the programs was perfect (rating: 3.00). The perceived value of the phantom scanning session, as rated by our participants, had a mean score of 3.90. This may reflect their desire for the program activities to be consolidated, allowing coverage of all routine clinical scans, as shown in Table 5. Nonetheless, the results in Table 2 and Table 5, as well as Figure 2 and Figure 3, clearly indicate that the participants' needs and expectations were met.
Although our results were highly positive, consistent with the findings of Colucci et al.'s (2023) study, two challenges are anticipated in implementing the program as a formal course-wide module for practical MRI competence development. First, our program lasted two weeks, whereas Colucci et al.'s (2023) arrangement spanned only one week. To avoid overloading our participants, a week was allocated for them to complete the required pre-readings before the practical activities. Colucci et al. (2023) suggested that self-study was effective, but they did not allocate a dedicated week for it; instead, their participants were expected to complete the pre-readings after hours, which were then reviewed by their program staff the following day. Our participants' feedback appears to support this arrangement (Table 5). Thus, shortening the MRI learning program to one week could be a more pragmatic approach if implemented as a formal program: given an average year-group size of 50 and 10 students accommodated per week, five weeks would be sufficient for all students to gain hands-on MRI scanning experience. Second, at any given time, two program staff members supervised our participants during the practical activities. According to Colucci et al.'s (2023) study, one MRI radiographer should be sufficient for supervision, and their participants also found this arrangement appropriate, aligning with our participants' suggestion (Table 5). This arrangement could enhance the financial and practical viability of the MRI learning program.
This study had three major limitations. First, only ten students were selected from the second- and third-year cohorts, which represent half of the year cohorts in the four-year BSc (MRS) course. While the sample size was small, it was sufficient for the program evaluation, as determined using Equation (1) (Colucci et al., 2023; Kadam & Bhalerao, 2010). Although the small sample size might have affected the subgroup analyses and introduced potential type I errors (false positives), the Holm–Bonferroni method was used to correct their p-values. Furthermore, large Cohen's d effect sizes (1.15–5.53) were observed in the subgroup analyses of the test data. It is noted that subgroup statistical power could drop to 20–30% without adjustments for multiple comparisons; hence, at least small effects (Cohen's d > 0.2) were expected in these subgroup analyses (Burke et al., 2015; Sullivan & Feinn, 2012). Second, all but one of the participants were high-achieving (CWA > 80%). High-achieving students were selected because Colucci et al.'s (2023) hands-on MRI learning program was originally designed for postgraduate resident training in radiology, whereas in this study it was adapted for undergraduate students' MRI competence development. Inevitably, this introduced selection bias, and our results may not be generalizable to a course-wide implementation setting. However, given that the mean CWAs of the second- and third-year students (78.2% and 79.1%, respectively) were close to those of our study participants, these concerns are unlikely to be significant. Third, although only the content validity of the questionnaires and tests was assessed by the eight experts, our findings regarding participants' self-perceived and actual competences appear to be consistent. Additional psychometric validation of the data collection instruments should be conducted in the future.
Based on the findings and limitations of our pilot study, we propose that, in the short term, this MRI learning program should be implemented as a one-week co-curricular program through the University initiative “Curtin Extra” during the academic year break (January to mid-February). It will target students who have just completed the BSc (MRS) MRI theoretical component (second-year study) and will not be subject to a quota. This approach may help mitigate selection bias and the ceiling effect (the inability to yield significant gains) because our findings indicate that second-year students demonstrated greater improvements than their third-year counterparts (Staus et al., 2021). However, all future participants must meet the established MRI safety requirements and provide written consent to participate. This co-curricular program will offer an opportunity to further validate our data collection instruments, assess the generalizability of this pilot study’s findings, and evaluate the financial and practical feasibility of the program (Ng, 2020). If positive outcomes are observed during this extended implementation, the program will subsequently be incorporated into the clinical practice (placement) unit of the BSc (MRS) course, immediately prior to third-year study, during the academic break. In that context, standard clinical placement assessments (including learning contract, direct observation, and reflective writing) would be used to evaluate students’ MRI competence development. This approach also supports the implementation of Kolb’s (1984) experiential learning cycle, encompassing the stages of concrete experience, reflective observation, abstract conceptualization, and active experimentation. As another future research direction, we aim to investigate the long-term impact of this MRI learning program by assessing participants’ MRI competence once they begin professional practice after graduation.

5. Conclusions

This was the first study to use a research MRI scanner to provide undergraduate MRS students with hands-on MRI experience in a two-week program. Our results show that participants’ MRI confidence and competence improved significantly after completing the program. They demonstrated greater improvements in clinical practice areas compared to physics and instrumentation, highlighting our program’s strong focus on clinical practice. Additionally, our program appears to be particularly beneficial for second-year students and those with prior MRI work experience. Participants also rated the program’s educational quality highly, indicating that it was extremely valuable, highly effective and practical. Nonetheless, further evaluation of the program’s effectiveness should be conducted in a broader setting with a larger sample size to validate the findings of this feasibility study, given the small sample size and participant selection bias.

Author Contributions

Conceptualization, C.K.C.N., Z.S. and P.M.P.; methodology, C.K.C.N., S.V., H.M., P.F., R.D. and P.M.P.; software, C.K.C.N., S.V., H.M. and P.M.P.; validation, C.K.C.N., S.V., H.M., P.F., R.D. and P.M.P.; formal analysis, C.K.C.N.; investigation, C.K.C.N., S.V., H.M., P.F., R.D. and P.M.P.; resources, C.K.C.N., S.V., H.M., P.F., Z.S., R.D. and P.M.P.; data curation, C.K.C.N. and P.F.; writing—original draft preparation, C.K.C.N., S.V., H.M., P.F., Z.S., R.D. and P.M.P.; writing—review and editing, C.K.C.N., S.V., H.M., P.F., Z.S., R.D. and P.M.P.; visualization, C.K.C.N.; supervision, C.K.C.N. and P.M.P.; project administration, C.K.C.N., R.D. and P.M.P.; funding acquisition, C.K.C.N., S.V., H.M., P.F., Z.S., R.D. and P.M.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research and the APC were funded by Royal Perth Hospital Imaging Research Grant 2024, grant number SGA0260924.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by Curtin University Human Research Ethics Committee (approval number: HRE2024-0674 and date of approval: 21 November 2024).

Informed Consent Statement

Written informed consent was obtained from all participants involved in the study, as well as for the publication of this paper.

Data Availability Statement

The data are not publicly available due to ethical restrictions.

Acknowledgments

We would like to thank the WA Country Health Service Deputy Chief Medical Imaging Technologist, MRI radiographers from Perth Children's Hospital and Perth Radiological Clinic (PRC), and MRI scientists of the Siemens Healthineers (Australia and New Zealand) Scientific Team for their support in piloting the pre- and post-program questionnaires and tests. Additionally, we extend our gratitude to the PRC MRI radiographers for supervising the participants' MRI scanning and to the students who participated in this study for their valuable contributions of time and effort. We acknowledge the facilities, and scientific and technical assistance of the Western Australia National Imaging Facility (WA NIF) Node, a National Collaborative Research Infrastructure Strategy (NCRIS) capability, at the Centre for Microscopy, Characterisation, and Analysis, The University of Western Australia (UWA).

Conflicts of Interest

Author Hamed Moradi was employed by the company Siemens Healthcare Pty Ltd. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
ACT: Australian Capital Territory
Ahpra: Australian Health Practitioner Regulation Agency
BSc: Bachelor of Science
CBS: Computer-based simulation
CI: Confidence interval
Curtin MRI: Curtin Medical Research Institute
FOV: Field of view
Gd: Gadolinium
GPs: General practitioners
INACSL: International Nursing Association for Clinical Simulation and Learning
MCQs: Multiple choice questions
MI: Medical imaging
MRI: Magnetic resonance imaging
MRPBA: Medical Radiation Practice Board of Australia
MRPs: Medical radiation practitioners
MRS: Medical radiation science
NCRIS: National Collaborative Research Infrastructure Strategy
SD: Standard deviation
SNR: Signal-to-noise ratio
UWA: The University of Western Australia
WA: Western Australia
WHO: World Health Organization

Appendix A. Sample Pre- and Post-Program Questionnaire Questions

Appendix A.1. Pre-Program Question

  • Do you have any MRI and/or physics background prior to studying Bachelor of Science (Medical Radiation Science) units? (select all that apply)
  • None
  • An additional undergraduate physics/engineering/mathematics course/more
  • An additional undergraduate physics unit/more
  • An additional undergraduate engineering unit/more
  • An additional undergraduate mathematics unit/more
  • A physics/engineering/mathematics related postgraduate course/more
  • Previous experience as an MRI assistant
  • Previous research experience in MRI
  • Previous experience in using MRI modality console

Appendix A.2. Post-Program Question

  • Please rate the effectiveness of this program for learning MRI using the following 5-point scale.
  • Very ineffective
  • Ineffective
  • Neutral
  • Effective
  • Very effective

Appendix B. Sample Pre- and Post-Program Test Questions

Appendix B.1. Physics and Instrumentation Questions

  • Time-of-flight (TOF) MR angiography is able to distinguish blood from surrounding tissue. Which of the following is correct?
    (a) A rapid succession of pulses saturates the tissue, causing an inability to produce a strong signal. When blood flows into the slice, it is not saturated; therefore, it is able to generate a strong MR signal in response to subsequent pulses.
    (b) The TOF method is an extension of the diffusion-weighted imaging where flowing liquids are detected in the same manner that diffusing molecules are detected. The fast-flowing blood produces a very bright signal.
    (c) The TOF method relies on distinguishing between the time for the blood versus the tissue spins to align with a rapid succession of gradient pulses.
    (d) TOF MRA is a 3D reconstruction algorithm that filters noise within each slice in parallel to applying enhancement filters for blood. This advanced processing algorithm is all applied in real time and able to produce a seamless 3D rendering akin to fluoroscopy techniques.
    (e) A series of spin echoes are collected at increasing time steps whereby the time spacing is akin to a parabolic "time of flight" equation.
  • Diffusion weighted imaging is typically used to measure the diffusion of:
    ___________________

Appendix B.2. Clinical Practice Questions

  • You are setting up to perform a scan using a spin echo sequence and wish to acquire 25 slices. Once you have entered your scan parameters, the system determines the Max # slices to be 24. What is a quick way to avoid subjecting the patient to a second scan? Select all that are correct.
    (a) Increase TR slightly to allow more time to collect more slices.
    (b) Decrease the flip angle slightly to make acquisition times shorter.
    (c) Change the number of slices to 24 and accept that you will not have 25.
    (d) Decrease TE slightly to decrease time per slice.
    (e) There is no need to avoid a second scan as the extra time is insignificant.
  • MFGRE (multiecho fast gradient-echo) sequences are analyzed to measure cardiac iron content. A set of three gated images are analyzed by tracing small regions around the interventricular septum on each image. The system displays a graph and a table of signal times (Figure A1). What is represented by the curves?
    _______________________________________________________________________
Figure A1. Signal times graph and table.

Appendix C. Participants’ Pre- and Post-Program Test Marks for Physics and Instrumentation, and Clinical Practice Questions

Table A1. Participants' pre- and post-program test marks (%) for physics and instrumentation questions only.
All participants: before: mean 35.48, minimum 19.05, median 34.52, maximum 57.14, SD 11.69; after: mean 57.38, minimum 33.33, median 58.33, maximum 85.71, SD 15.16; p = 0.001; mean difference: 21.90 (95% CI: 11.75–32.05); Cohen's d effect size: 1.54 (95% CI: 0.59–2.46).
Second-year participants: before: mean 28.57, minimum 23.81, median 28.57, maximum 33.33, SD 4.35; after: mean 56.55, minimum 42.86, median 59.52, maximum 64.29, SD 9.40; p = 0.028; mean difference: 27.98 (95% CI: 10.65–45.30); Cohen's d effect size: 2.57 (95% CI: 0.39–4.73).
Third-year participants: before: mean 40.08, minimum 19.05, median 40.48, maximum 57.14, SD 13.08; after: mean 57.94, minimum 33.33, median 53.57, maximum 85.71, SD 18.97; p = 0.037; mean difference: 17.86 (95% CI: 1.53–34.18); Cohen's d effect size: 1.15 (95% CI: 0.06–2.17).
Participants without MRI/physics background: before: mean 35.37, minimum 23.81, median 35.71, maximum 47.62, SD 8.85; after: mean 55.44, minimum 33.33, median 59.52, maximum 73.81, SD 13.09; p = 0.026; mean difference: 20.07 (95% CI: 6.03–34.10); Cohen's d effect size: 1.32 (95% CI: 0.26–2.34).
Participants with MRI assistant work experience: before: mean 35.71, minimum 19.05, median 30.95, maximum 57.14, SD 19.49; after: mean 61.90, minimum 42.86, median 57.14, maximum 85.71, SD 21.82; p = 0.076; mean difference: 26.19 (95% CI: −6.73–59.11); Cohen's d effect size: 1.98 (95% CI: −0.15–4.06).
CI, confidence interval; SD, standard deviation.
Table A2. Participants' pre- and post-program test marks (%) for clinical practice questions only.
All participants: before: mean 38.24, minimum 25.49, median 36.27, maximum 58.82, SD 12.01; after: mean 75.19, minimum 55.88, median 77.45, maximum 97.06, SD 14.90; p < 0.001; mean difference: 36.96 (95% CI: 26.63–47.29); Cohen's d effect size: 2.56 (95% CI: 1.23–3.86).
Second-year participants: before: mean 35.78, minimum 25.49, median 29.41, maximum 58.82, SD 15.66; after: mean 76.23, minimum 58.82, median 77.45, maximum 91.18, SD 13.88; p = 0.019; mean difference: 40.44 (95% CI: 12.49–68.39); Cohen's d effect size: 2.30 (95% CI: 0.29–4.28).
Third-year participants: before: mean 39.87, minimum 26.47, median 37.25, maximum 55.88, SD 10.22; after: mean 74.51, minimum 55.88, median 73.53, maximum 97.06, SD 16.81; p = 0.002; mean difference: 34.64 (95% CI: 20.79–48.48); Cohen's d effect size: 2.63 (95% CI: 0.84–4.37).
Participants without MRI/physics background: before: mean 39.22, minimum 26.47, median 37.25, maximum 58.82, SD 13.18; after: mean 78.01, minimum 55.88, median 82.35, maximum 97.06, SD 15.23; p = 0.002; mean difference: 38.79 (95% CI: 23.10–54.49); Cohen's d effect size: 2.29 (95% CI: 0.81–3.73).
Participants with MRI assistant work experience: before: mean 35.95, minimum 25.49, median 35.29, maximum 47.06, SD 10.80; after: mean 68.62, minimum 58.82, median 61.76, maximum 85.29, SD 14.51; p = 0.011; mean difference: 32.68 (95% CI: 17.99–47.36); Cohen's d effect size: 5.53 (95% CI: 0.69–10.72).
CI, confidence interval; SD, standard deviation.

References

  1. American Board of Radiology. (2018). Item writers' guide. Available online: https://www.theabr.org/wp-content/uploads/2020/09/Item-Writers-Guide-2018.pdf (accessed on 11 May 2025).
  2. Australian Health Practitioner Regulation Agency. (2024). Accredited programs of study. Available online: https://www.medicalradiationpracticeboard.gov.au/Accreditation/Accredited-programs-of-study.aspx (accessed on 12 May 2025).
  3. Baker, C., Nugent, B., Grainger, D., Hewis, J., & Malamateniou, C. (2024). Systematic review of MRI safety literature in relation to radiofrequency thermal injury prevention. Journal of Medical Radiation Sciences, 71(3), 445–460.
  4. Bor, D. S., Sharpe, R. E., Bode, E. K., Hunt, K., & Gozansky, W. S. (2021). Increasing patient access to MRI examinations in an integrated multispecialty practice. Radiographics, 41(1), E1–E8.
  5. Burke, J. F., Sussman, J. B., Kent, D. M., & Hayward, R. A. (2015). Three simple rules to ensure reasonably credible subgroup analyses. BMJ, 351, h5651.
  6. Cataldo, J., Collins, S., Walker, J., & Shaw, T. (2023). Use of virtual reality for MRI preparation and technologist education: A scoping review. Journal of Medical Imaging and Radiation Sciences, 54(1), 195–205.
  7. Chaka, B., & Hardy, M. (2021). Computer based simulation in CT and MRI radiography education: Current role and future opportunities. Radiography, 27(2), 733–739.
  8. Chau, M., Arruzza, E., & Johnson, N. (2022). Simulation-based education for medical radiation students: A scoping review. Journal of Medical Radiation Sciences, 69(3), 367–381.
  9. Chau, M., & Arruzza, E. S. (2023). Maximising undergraduate medical radiation students' learning experiences using cloud-based computed tomography (CT) software. Simulation & Gaming, 54(4), 447–460.
  10. Colucci, P. G., Gao, M. A., Schweitzer, A. D., Chang, E. W., Riyahi, S., Taya, M., Lu, C., Ballon, D., Min, R. J., & Prince, M. R. (2023). A novel hands-on approach towards teaching diagnostic radiology residents MRI scanning and physics. Academic Radiology, 30(5), 998–1004.
  11. Cooley, C. Z., Stockmann, J. P., Witzel, T., LaPierre, C., Mareyam, A., Jia, F., Zaitsev, M., Wenhui, Y., Zheng, W., Stang, P., Scott, G., Adalsteinsson, E., White, J. K., & Wald, L. L. (2020). Design and implementation of a low-cost, tabletop MRI scanner for education and research prototyping. Journal of Magnetic Resonance, 310, 106625.
  12. Day, J., Devers, C. J., Wu, E., Devers, E. E., & Gomez, E. (2022). Development of educational media for medical trainees studying MRI physics: Effect of media format on learning and engagement. Journal of the American College of Radiology, 19(6), 711–721.
  13. Department of Education, Australian Government. (2024). National collaborative research infrastructure strategy (NCRIS). Available online: https://www.education.gov.au/ncris (accessed on 12 May 2025).
  14. Dewland, T. A., Hancock, L. N., Sargeant, S., Bailey, R. K., Sarginson, R. A., & Ng, C. K. C. (2013). Study of lone working magnetic resonance technologists in Western Australia. International Journal of Occupational Medicine and Environmental Health, 26(6), 837–845.
  15. Elshami, W., & Abuzaid, M. (2017). Transforming magnetic resonance imaging education through simulation-based training. Journal of Medical Imaging and Radiation Sciences, 48(2), 151–158.
  16. Fortier, E., Bellec, P., Boyle, J. A., & Fuente, A. (2025). MRI noise and auditory health: Can one hundred scans be linked to hearing loss? The case of the Courtois NeuroMod project. PLoS ONE, 20(1), e0309513.
  17. Hasoomi, N., Fujibuchi, T., & Arakawa, H. (2024). Developing simulation-based learning application for radiation therapy students at pre-clinical stage. Journal of Medical Imaging and Radiation Sciences, 55(3), 101412.
  18. Hazell, L., Lawrence, H., & Friedrich-Nel, H. (2020). Simulation based learning to facilitate clinical readiness in diagnostic radiography. A meta-synthesis. Radiography, 26(4), e238–e245.
  19. Imura, T., Mitsutake, T., Iwamoto, Y., & Tanaka, R. (2021). A systematic review of the usefulness of magnetic resonance imaging in predicting the gait ability of stroke patients. Scientific Reports, 11(1), 14338.
  20. INACSL Standards Committee. (2016). INACSL standards of best practice: SimulationSM Simulation design. Clinical Simulation in Nursing, 12, S5–S12.
  21. Jimenez, Y. A., Gray, F., Di Michele, L., Said, S., Reed, W., & Kench, P. (2023). Can simulation-based education or other education interventions replace clinical placement in medical radiation sciences? A narrative review. Radiography, 29(2), 421–427.
  22. Kadam, P., & Bhalerao, S. (2010). Sample size calculation. International Journal of Ayurveda Research, 1(1), 55–57.
  23. Kim, J., Park, J. H., & Shin, S. (2016). Effectiveness of simulation-based nursing education depending on fidelity: A meta-analysis. BMC Medical Education, 16, 152.
  24. Knowles, M. (1990). The adult learner: A neglected species (4th ed.). Gulf Publishing Company.
  25. Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development. Prentice Hall.
  26. La Cerra, C., Dante, A., Caponnetto, V., Franconi, I., Gaxhja, E., Petrucci, C., Alfes, C. M., & Lancia, L. (2019). Effects of high-fidelity simulation based on life-threatening clinical condition scenarios on learning outcomes of undergraduate and postgraduate nursing students: A systematic review and meta-analysis. BMJ Open, 9(2), e025306.
  27. Martella, M., Lenzi, J., & Gianino, M. M. (2023). Diagnostic technology: Trends of use and availability in a 10-year period (2011–2020) among sixteen OECD countries. Healthcare, 11, 2078.
  28. Mayer, R. E. (2002). Multimedia learning. Psychology of Learning and Motivation, 41, 85–139.
  29. McRobbie, D. W., Moore, E. A., & Graves, M. J. (2017). MRI from picture to proton (3rd ed.). Cambridge University Press.
  30. Medical Radiation Practice Accreditation Committee. (2019). Accreditation standards: Medical radiation practice 2019. Available online: https://www.ahpra.gov.au/documents/default.aspx?record=WD21/30784&dbid=AP&chksum=p31uaKToamWisUOvmFKtUg%3d%3d&_gl=1*18fusvm*_ga*MjAxNjg2NDg1LjE3MjUzMzM3MTY.*_ga_F1G6LRCHZB*czE3NDcwMzE1ODckbzI1JGcxJHQxNzQ3MDMxNjEzJGowJGwwJGgw (accessed on 11 May 2025).
  31. Medical Radiation Practice Board of Australia (MRPBA). (2013). Professional capabilities for medical radiation practice. Available online: https://www.medicalradiationpracticeboard.gov.au/documents/default.aspx?record=WD13%2f12534&dbid=AP&chksum=OIuB81d6eQCqo%2bewP9PHOA%3d%3d (accessed on 11 May 2025).
  32. Medical Radiation Practice Board of Australia (MRPBA). (2020). Professional capabilities for medical radiation practice. Available online: https://www.medicalradiationpracticeboard.gov.au/documents/default.aspx?record=WD19%2f29238&dbid=AP&chksum=qSaH9FIsI%2ble99APBZNqIQ%3d%3d (accessed on 11 May 2025).
  33. Medicines and Healthcare Products Regulatory Agency. (2021). Safety guidelines for magnetic resonance imaging equipment in clinical use. Available online: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/958486/MRI_guidance_2021-4-03c.pdf (accessed on 11 May 2025).
  34. Mittendorff, L., Young, A., & Sim, J. (2022). A narrative review of current and emerging MRI safety issues: What every MRI technologist (radiographer) needs to know. Journal of Medical Radiation Sciences, 69(2), 250–260.
  35. National Imaging Facility (NIF). (2025a). Funders and partners. Available online: https://anif.org.au/funders-and-partners/ (accessed on 12 May 2025).
  36. National Imaging Facility (NIF). (2025b). Instruments and infrastructure. Available online: https://anif.org.au/what-we-do/our-capabilities/capabilities/ (accessed on 12 May 2025).
  37. National Imaging Facility (NIF). (2025c). Western Australia NIF node. Available online: https://anif.org.au/uwa/ (accessed on 5 May 2025).
  38. Ng, C. K. C. (2020). Evaluation of academic integrity of online open book assessments implemented in an undergraduate medical radiation science course during COVID-19 pandemic. Journal of Medical Imaging and Radiation Sciences, 51(4), 610–616.
  39. Ng, C. K. C. (2022a). A review of the impact of the COVID-19 pandemic on pre-registration medical radiation science education. Radiography, 28(1), 222–231.
  40. Ng, C. K. C. (2022b). Artificial intelligence for radiation dose optimization in pediatric radiology: A systematic review. Children, 9(7), 1044.
  41. Ng, C. K. C., Baldock, M., & Newman, S. (2024). Use of smart glasses (assisted reality) for Western Australian X-ray operators' continuing professional development: A pilot study. Healthcare, 12(13), 1253.
  42. Ng, C. K. C., White, P., & McKay, J. C. (2008). Establishing a method to support academic and professional competence throughout an undergraduate radiography programme. Radiography, 14, 255–264.
  43. Robinson, N. B., Gao, M., Patel, P. A., Davidson, K. W., Peacock, J., Herron, C. R., Baker, A. C., Hentel, K. A., & Oh, P. S. (2022). Secondary review reduced inpatient MRI orders and avoidable hospital days. Clinical Imaging, 82, 156–160.
  44. Rocca, M. A., Preziosa, P., Barkhof, F., Brownlee, W., Calabrese, M., De Stefano, N., Granziera, C., Ropele, S., Toosy, A. T., Vidal-Jordana, À., Di Filippo, M., & Filippi, M. (2024). Current and future role of MRI in the diagnosis and prognosis of multiple sclerosis. Lancet Regional Health-Europe, 44, 100978.
  45. Staus, N. L., O'Connell, K., & Storksdieck, M. (2021). Addressing the ceiling effect when assessing STEM out-of-school time experiences. Frontiers in Education, 6, 690431.
  46. Sullivan, G. M., & Feinn, R. (2012). Using effect size—Or why the p value is not enough. Journal of Graduate Medical Education, 4(3), 279–282.
  47. Vestbøstad, M., Karlgren, K., & Olsen, N. R. (2020). Research on simulation in radiography education: A scoping review protocol. Systematic Reviews, 9(1), 263.
  48. Westbrook, C. (2017). Opening the debate on MRI practitioner education—Is there a need for change? Radiography, 23, S70–S74.
  49. World Health Organization (WHO). (2011). Patient safety curriculum guide: Multi-professional edition. Available online: https://iris.who.int/bitstream/handle/10665/44641/9789241501958_eng.pdf?sequence=1 (accessed on 11 May 2025).
Figure 1. Interval plot of test mark mean differences (actual magnetic resonance imaging (MRI) competence improvements), shown as means with 95% confidence intervals, for all participants (n = 10) and for the subgroups: second-year (n = 4), third-year (n = 6), with MRI/physics background (n = 3), and without MRI/physics background (n = 7).
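Since the figure image itself is not reproduced here, the following is a minimal matplotlib sketch (our illustration, not the authors' plotting code) of how such an interval plot can be rebuilt from the subgroup mean differences and 95% confidence intervals reported in Table 3; the group labels and styling are assumptions.

```python
import matplotlib.pyplot as plt

# Mean test-mark differences (percentage points) and 95% CIs from Table 3.
groups = ["All\n(n=10)", "Second-year\n(n=4)", "Third-year\n(n=6)",
          "Without MRI/physics\n(n=7)", "With MRI/physics\n(n=3)"]
means = [27.85, 32.24, 24.93, 27.69, 28.22]
ci_low = [19.98, 17.44, 12.50, 15.65, 13.38]
ci_high = [35.72, 47.04, 37.36, 39.74, 43.05]

# Asymmetric error bars: distance from each mean to its CI bounds.
yerr = [[m - lo for m, lo in zip(means, ci_low)],
        [hi - m for m, hi in zip(means, ci_high)]]

fig, ax = plt.subplots(figsize=(7, 4))
ax.errorbar(range(len(groups)), means, yerr=yerr, fmt="o", capsize=5)
ax.set_xticks(range(len(groups)))
ax.set_xticklabels(groups)
ax.set_ylabel("Mean test mark difference (%)")
ax.set_title("Actual MRI competence improvement (mean and 95% CI)")
plt.tight_layout()
plt.show()
```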
Figure 2. Phantom scanning session example: (a) phantom (watermelon) positioning, (b) scanning protocol selection, initiation, and image processing, and (c) image analysis under program staff (research team) supervision.
Figure 3. Peer scanning session example: (a) patient safety check and preparation, (b) positioning, and (c) scanning protocol selection, initiation, and image processing under program staff (research team) supervision.
Table 1. Timetable for Part 2 of the magnetic resonance imaging (MRI) learning program.

|  | Monday | Tuesday | Wednesday | Thursday | Friday |
|---|---|---|---|---|---|
| Morning | MRI technology review and safety ¹; first contact with scanner: entering patient data and parameter cards ² | Introduction to Gd-based contrast agents ¹; effect of Gd in dilution series ² | MRI scan of ankle and knee: scanning protocol and sequences, and anatomy ² | MRI scan of lumbosacral spine: scanning protocol and sequences, and anatomy ² | Cardiac MRI and angiography: scanning protocol and sequences, and anatomy ² |
| Afternoon | Influence of acquisition parameters on phantom studies: matrix, FOV, phase/frequency, different sequences, and T1-T2-T2* ²; comparing SNR, contrast, and blurriness from acquisitions ² | Chemical shift artifacts: phase/frequency, receiver bandwidth (Hz/pixel), FOV, and matrix size ²; second-order chemical shift ² | Liver MRI scan: scanning protocol and sequences, and anatomy ² | MRI scan of abdomen and hip: scanning protocol and sequences, and anatomy ² | Musculoskeletal anatomy ¹; post-program survey and test |

FOV, field of view; Gd, gadolinium; SNR, signal-to-noise ratio. ¹ Knowledge review. ² Hands-on.
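As context for the chemical shift artifact sessions above, a standard worked example (our illustration, assuming a 1.5 T field strength, which the timetable does not specify): water and fat protons resonate about 3.5 ppm apart, and the resulting displacement in the frequency-encoding direction scales inversely with the per-pixel receiver bandwidth:

$$\Delta f \approx 3.5 \times 10^{-6} \times 63.87~\text{MHz} \approx 224~\text{Hz}, \qquad \text{shift (pixels)} = \frac{\Delta f}{\text{receiver bandwidth (Hz/pixel)}} = \frac{224~\text{Hz}}{220~\text{Hz/pixel}} \approx 1~\text{pixel}.$$

Halving the per-pixel bandwidth doubles the apparent fat-water displacement, which is why the session varies bandwidth together with FOV and matrix size.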
Table 2. Participants’ needs and expectations before magnetic resonance imaging learning program.

| Needs and Expectations | % |
|---|---|
| Scanning parameter selection | 60 |
| Performing scans | 50 |
| Image interpretation | 50 |
| Scanning room preparation | 30 |
| Patient preparation | 30 |
| Patient positioning | 30 |
| Patient care | 30 |
| Physics and instrumentation | 10 |
| Understanding their contribution to this research study | 10 |
Table 3. Participants’ self-perceived magnetic resonance imaging (MRI) competences and overall test marks before and after the program.

| Group | Before: Mean | Min | Median | Max | SD | After: Mean | Min | Median | Max | SD | p-Value | Mean Difference (95% CI) | Cohen’s d Effect Size (95% CI) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| **Self-perceived competences (out of 5.00)** | | | | | | | | | | | | | |
| All participants | 2.80 | 2.00 | 3.00 | 3.00 | 0.42 | 3.20 | 3.00 | 3.00 | 4.00 | 0.42 | 0.046 | – | – |
| Second-year participants | 2.75 | 2.00 | 3.00 | 3.00 | 0.50 | 3.25 | 3.00 | 3.00 | 4.00 | 0.50 | 0.157 | – | – |
| Third-year participants | 2.83 | 2.00 | 3.00 | 3.00 | 0.41 | 3.17 | 3.00 | 3.00 | 4.00 | 0.41 | 0.157 | – | – |
| Participants without MRI/physics background | 2.86 | 2.00 | 3.00 | 3.00 | 0.38 | 3.14 | 3.00 | 3.00 | 4.00 | 0.38 | 0.157 | – | – |
| Participants with MRI assistant work experience | 2.67 | 2.00 | 3.00 | 3.00 | 0.58 | 3.33 | 3.00 | 3.00 | 4.00 | 0.58 | 0.157 | – | – |
| **Test mark (%)** | | | | | | | | | | | | | |
| All participants | 34.87 | 25.88 | 32.02 | 50.00 | 8.76 | 62.72 | 43.42 | 62.94 | 82.89 | 13.93 | <0.001 | 27.85 (19.98–35.72) | 2.53 (1.21–3.82) |
| Second-year participants | 30.48 | 25.88 | 28.95 | 38.16 | 5.43 | 62.72 | 47.37 | 64.91 | 73.68 | 11.18 | 0.006 | 32.24 (17.44–47.04) | 3.47 (0.70–6.27) |
| Third-year participants | 37.79 | 26.32 | 37.06 | 50.00 | 9.73 | 62.72 | 43.42 | 59.87 | 82.89 | 16.55 | 0.008 | 24.93 (12.50–37.36) | 2.10 (0.59–3.58) |
| Participants without MRI/physics background | 35.21 | 27.63 | 33.77 | 47.37 | 7.14 | 62.91 | 43.42 | 63.16 | 81.58 | 13.32 | 0.002 | 27.69 (15.65–39.74) | 2.13 (0.72–3.49) |
| Participants with MRI assistant work experience | 34.06 | 25.88 | 26.32 | 50.00 | 13.80 | 62.28 | 47.37 | 56.58 | 82.89 | 18.44 | 0.015 | 28.22 (13.38–43.05) | 4.73 (0.53–9.19) |

CI, confidence interval; SD, standard deviation.
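As a companion to this table, the sketch below shows how the reported statistics (paired t-test for test marks, Wilcoxon signed-rank test for the ordinal self-perceived ratings, and a paired Cohen’s d) can be computed with SciPy. The pre/post arrays are hypothetical placeholders, since individual participants’ marks are not published, and the Cohen’s d formula shown (mean difference divided by the SD of the differences) is one common convention for paired designs; the paper does not state which variant was applied.

```python
import numpy as np
from scipy import stats

# Hypothetical paired pre-/post-program test marks (%) for n = 10 participants.
# Placeholders only: the study's individual marks are not published.
pre = np.array([25.88, 28.9, 32.0, 33.8, 34.2, 35.1, 36.4, 38.2, 44.1, 50.0])
post = np.array([43.42, 55.3, 58.9, 61.2, 62.9, 64.5, 66.7, 70.1, 81.0, 82.89])

diff = post - pre

# Paired t-test (reported for test marks) and Wilcoxon signed-rank test
# (reported for the ordinal self-perceived competence ratings).
t_stat, t_p = stats.ttest_rel(post, pre)
w_stat, w_p = stats.wilcoxon(post, pre)

# Paired Cohen's d (one common convention): mean difference / SD of differences.
d = diff.mean() / diff.std(ddof=1)

print(f"mean difference = {diff.mean():.2f} percentage points")
print(f"paired t-test: t = {t_stat:.2f}, p = {t_p:.4f}")
print(f"Wilcoxon signed-rank: W = {w_stat:.1f}, p = {w_p:.4f}")
print(f"paired Cohen's d = {d:.2f}")
```

With n = 10, the wide confidence intervals around d in the table (e.g., 1.21–3.82 for all participants) are expected; reporting effect sizes alongside p values follows the rationale of Sullivan and Feinn (2012), cited above.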
Table 4. Participants’ perception of the magnetic resonance imaging (MRI) program’s educational quality.

| Question | Mean | Minimum | Median | Maximum | SD |
|---|---|---|---|---|---|
| 1. Program effectiveness for learning ¹ | 4.20 | 4.00 | 4.00 | 5.00 | 0.42 |
| 2. Program engagement ² | 4.50 | 4.00 | 4.50 | 5.00 | 0.53 |
| 3. Adequacy of peer contribution ³ | 4.40 | 3.00 | 4.50 | 5.00 | 0.70 |
| 4. Adequacy of facilitator (program staff) contribution ³ | 4.40 | 3.00 | 4.50 | 5.00 | 0.70 |
| 5. Adequacy of supervision ⁴ | 3.00 | 3.00 | 3.00 | 3.00 | 0.00 |
| 6. Value of MRI knowledge review session ⁵ | 4.20 | 3.00 | 4.00 | 5.00 | 0.63 |
| 7. Value of phantom scanning session ⁵ | 3.90 | 3.00 | 4.00 | 5.00 | 0.74 |
| 8. Value of peer scanning session ⁵ | 4.80 | 4.00 | 5.00 | 5.00 | 0.42 |

SD, standard deviation. ¹ 1 = very ineffective; 2 = ineffective; 3 = neutral; 4 = effective; 5 = very effective. ² 1 = very unengaging; 2 = unengaging; 3 = neutral; 4 = engaging; 5 = very engaging. ³ 1 = very inadequate; 2 = inadequate; 3 = adequate; 4 = very adequate; 5 = well beyond all expectations. ⁴ 1 = very inadequate; 2 = inadequate; 3 = perfect setting; 4 = a little bit over-supervision; 5 = over-supervision. ⁵ 1 = no value; 2 = little value; 3 = some value; 4 = good value; 5 = exceptional value.
Table 5. Participants’ open comments following the program.

| Strengths | % |
|---|---|
| Needs and expectations met/exceeding expectations | 50 |
| Extremely valuable/very effective/very practical | 50 |
| Enhanced understanding of physics and instrumentation | 20 |
| Learnt from a range of professionals (radiologist, radiographers, and scientists) | 10 |
| Perfect number of participants | 10 |
| Subdividing participants into 2 groups to run parallel activities for efficiency | 10 |
| Test focused on practical aspects | 10 |

| Suggestions | % |
|---|---|
| More time to learn from radiographers | 70 |
| Consolidating program activities to include more | 30 |
| Subdividing participants into 2 groups to run all activities in parallel (to include more activities) | 20 |
| Assigning dedicated pre-readings a day before respective practical activities rather than to week 1 | 10 |
| Covering all routine scans | 10 |
| Simplifying pre-readings with dedicated support sessions in week 1 | 10 |