Innovative Hands-On Approach for Magnetic Resonance Imaging Education of an Undergraduate Medical Radiation Science Course in Australia: A Feasibility Study
Round 1
Reviewer 1 Report
Comments and Suggestions for Authors
The paper introduces a novel approach involving hands-on MRI education for an undergraduate medical radiation science course, presenting a feasibility study on a group of ten 2nd and 3rd year students in order to improve theoretical knowledge and to gain practical understanding / familiarity with MRI, at a level above the traditional approach. The initiative of organizing a 2-week summer school for this learning purpose is to be commended.
The paper is generally well written and the background / premise justifying this study is clear. The design of the questionnaire and the data analysis are adequate.
Below are a few comments to be addressed by the authors:
- Was there any theoretical assessment undertaken in the first week of the program on the knowledge gained by the students after studying the two textbooks? (unless the aim was to gain better understanding on the practical applications during their training).
- Please add a few sentences on the technical/physical parameters of the MR machine used within this training course.
- What was the maximum number of scans one student was exposed to (in case there were multiple scans per individual)?
- An additional bonus of this study is the fact that students got the opportunity to be in a patient’s position; this is an important aspect, particularly in MR imaging, as for many patients this experience can be stressful, claustrophobic and generally negative. Through this training, students can learn to show more empathy towards patients and learn how to deal with various scenarios while also improving their communication skills.
Author Response
The paper introduces a novel approach involving hands-on MRI education for an undergraduate medical radiation science course, presenting a feasibility study on a group of ten 2nd and 3rd year students in order to improve theoretical knowledge and to gain practical understanding / familiarity with MRI, at a level above the traditional approach. The initiative of organizing a 2-week summer school for this learning purpose is to be commended.
The paper is generally well written and the background / premise justifying this study is clear. The design of the questionnaire and the data analysis are adequate.
Response: Thank you for your comment.
Below are a few comments to be addressed by the authors:
1. Was there any theoretical assessment undertaken in the first week of the program on the knowledge gained by the students after studying the two textbooks? (unless the aim was to gain better understanding on the practical applications during their training).
Response: Thank you for your comment. No further test was administered after the participants completed the self-study in the first week as the focus of the program was the practical applications. To address this comment, the following sentence “No further test was administered after the participants completed the self-study in the first week.” has been added to the second last paragraph of Section 2.1 for clarification.
2. Please add a few sentences on the technical/physical parameters of the MR machine used within this training course.
Response: Thank you for your comment. To address this comment, the following technical/physical parameters, “The research-only MAGNETOM Vida 3.0 T scanner used in the program had a 70 cm bore, XT gradients (60 mT/m amplitude, 200 T/m/s slew rate per axis), and XA50 software.” have been added to the last paragraph of Section 2.1.
3. What was the maximum number of scans one student was exposed to (in case there were multiple scans per individual)?
Response: Thank you for your comment. To address this comment, the following sentence, “In this program, each participant performed between five and nine MRI scans and was scanned by their peers a similar number of times.” has been added to Section 3.2.
4. An additional bonus of this study is the fact that students got the opportunity to be in a patient’s position; this is an important aspect, particularly in MR imaging, as for many patients this experience can be stressful, claustrophobic and generally negative. Through this training, students can learn to show more empathy towards patients and learn how to deal with various scenarios while also improving their communication skills.
Response: Thank you for your comment.
Reviewer 2 Report
Comments and Suggestions for Authors
1. Theoretical framework and context
The manuscript provides a concise overview of the state of the art, yet it would benefit from a more extensive engagement with the international literature on clinical simulation—particularly studies involving larger cohorts. A broader contextualization would sharpen the positioning and relevance of the present contribution.
2. Design and methods
- Please elaborate on the development of the pre-/post-test questionnaire, including evidence of content validity, total number of items (with at least one illustrative example), and the pilot-testing procedure.
- Refine the reporting of effect sizes by specifying the statistic employed (e.g., Cohen’s d, Hedges’ g) and providing an educational or clinical rationale for what constitutes a “very large” effect.
- Multiple t-tests were conducted without adjustment for multiplicity; discuss the potential inflation of Type I error or apply a correction method such as Holm-Bonferroni.
3. Results
The tables are comprehensive but could be streamlined—e.g., by merging certain sub-cohorts or moving detailed breakdowns to the supplementary material. A succinct figure depicting mean changes with 95 % confidence intervals would also enhance clarity.
4. Discussion and conclusions
While the discussion is generally robust, the generalizability of the findings warrants further nuance given that the sample comprised predominantly high-achieving students. Please comment on how the programme could be scaled to less selective cohorts and specify the human and financial resources that such expansion would entail.
5. Limitations
The limitations section could be strengthened by addressing two additional aspects:
- Statistical power: Was the target power achieved for the subgroup analyses?
- Long-term effects: Is there any evidence (or a plan to gather it) regarding whether the observed benefits persist at 6–12 months?
Author Response
Comments and Suggestions for Authors
1. Theoretical framework and context
The manuscript provides a concise overview of the state of the art, yet it would benefit from a more extensive engagement with the international literature on clinical simulation—particularly studies involving larger cohorts. A broader contextualization would sharpen the positioning and relevance of the present contribution.
Response: Thank you for your comment. To address this comment, the following sentences, “However, it is noted that the level of fidelity in simulation-based learning influences students’ learning outcomes. Multiple meta-analyses of nursing studies involving 10 to 352 participants have demonstrated that high-fidelity simulation has large positive effects across all (psychomotor, cognitive, and affective) domains of competence development (Kim et al., 2016; La Cerra et al., 2019).” have been added to the third paragraph of the Introduction section.
2. Design and methods
Please elaborate on the development of the pre-/post-test questionnaire, including evidence of content validity, total number of items (with at least one illustrative example), and the pilot testing procedure.
Response: Thank you for your comment. To address this comment, the last sentence of Section 2.2 has been changed to “Prior to administration, our pre- and post-program questionnaires and tests were piloted with eight experts (three MRI radiographers from different clinical centers including public and private settings and five MRI scientists from the Siemens Healthineers (Australian and New Zealand) Scientific Team) who were not directly involved in this study. They were asked to assess the content validity of the instruments, resulting in several revisions to ensure that the tools were capable of measuring participants’ competences and perceptions of the program (Ng et al., 2024).” Also, Appendix A. Sample Pre- and Post-Program Questionnaire Questions has been added. The total numbers of items in the pre- and post-program questionnaires are stated in Section 2.2: 4 and 10, respectively.
Refine the reporting of effect sizes by specifying the statistic employed (e.g., Cohen’s d, Hedges’ g) and providing an educational or clinical rationale for what constitutes a “very large” effect.
Response: Thank you for your comment. To address this comment, the term “Cohen’s d” has been added immediately before the words “effect size” when reporting the respective findings in the Results and Discussion sections. In addition, the second last sentence of Section 2.3 has been changed to “Effect size of MRI competence change (pre- and post-program test mark difference) was determined based on Cohen’s d score with >0.2, >0.5, >0.8 and >1.3 indicating small, medium, large and very large effects (i.e., magnitudes of change), respectively (Colucci et al., 2023; Sullivan & Feinn, 2012).” with an additional medical education paper cited.
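For readers unfamiliar with the thresholds cited above, the paired-design Cohen’s d calculation and the Sullivan & Feinn-style labels can be sketched in a few lines. The mark data below are invented for illustration; the study’s actual scores are not reproduced here.

```python
from statistics import mean, stdev

def cohens_d_paired(pre, post):
    """Cohen's d for a paired pre/post design:
    mean of paired differences / SD of paired differences."""
    diffs = [b - a for a, b in zip(pre, post)]
    return mean(diffs) / stdev(diffs)

def label(d):
    """Map |d| to the thresholds quoted in Section 2.3:
    >0.2 small, >0.5 medium, >0.8 large, >1.3 very large."""
    d = abs(d)
    if d > 1.3:
        return "very large"
    if d > 0.8:
        return "large"
    if d > 0.5:
        return "medium"
    if d > 0.2:
        return "small"
    return "negligible"

# Hypothetical pre- and post-program test marks for ten participants
pre = [40, 45, 50, 38, 55, 42, 48, 44, 52, 47]
post = [68, 70, 75, 60, 80, 66, 72, 69, 78, 71]
d = cohens_d_paired(pre, post)
print(round(d, 2), label(d))
```

Note that for paired data d is computed from the differences, so a very consistent improvement across participants yields a large d even with n = 10.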
Multiple t-tests were conducted without adjustment for multiplicity; discuss the potential inflation of Type I error or apply a correction method such as Holm-Bonferroni.
Response: Thank you for your comment. To address this comment, the following sentences, “Although the small sample size in this study might have impacted the subgroup analyses and introduced potential Type I errors (false positives), such errors do not appear to be present in the subgroup analyses of the questionnaire data, as all p-values exceeded 0.05. Furthermore, large Cohen’s d effect sizes (1.15-5.53) were observed in the subgroup analyses of the test data. It is noted that subgroup statistical power could drop to 20-30% without adjustments for multiple comparisons. Hence, at least small effects (Cohen’s d > 0.2) were expected in these subgroup analyses (Burke et al., 2015; Sullivan & Feinn, 2012).” have been added to the second last paragraph of the Discussion section.
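The Holm-Bonferroni step-down procedure the reviewer suggests is simple to apply and could complement the discussion above; a minimal sketch (with hypothetical p-values, not the study’s) follows.

```python
def holm_bonferroni(p_values, alpha=0.05):
    """Holm-Bonferroni step-down correction.
    Returns a list parallel to p_values (True = reject H0)."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    reject = [False] * m
    for rank, i in enumerate(order):
        # Compare the (rank+1)-th smallest p-value against alpha / (m - rank)
        if p_values[i] <= alpha / (m - rank):
            reject[i] = True
        else:
            break  # step-down: once one test fails, all larger p-values fail
    return reject

p = [0.004, 0.030, 0.020, 0.500]  # hypothetical raw p-values from four t-tests
print(holm_bonferroni(p))  # → [True, False, False, False]
```

Unlike plain Bonferroni, Holm’s method controls the family-wise error rate while being uniformly more powerful, which matters with a small cohort.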
3. Results
The tables are comprehensive but could be streamlined—e.g., by merging certain sub-cohorts or moving detailed breakdowns to the supplementary material. A succinct figure depicting mean changes with 95 % confidence intervals would also enhance clarity.
Response: Thank you for your comment. To address this comment, Table 4 (participants’ pre- and post-program test marks for physics and instrumentation questions only) and Table 5 (participants’ pre- and post-program test marks for clinical practice questions only) have been moved to Appendix C. In addition, an interval plot of test mark mean differences (actual magnetic resonance imaging (MRI) competence improvements) with 95 % confidence intervals for all participants and various subgroups has been added (Figure 1).
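The quantity such an interval plot displays, the mean paired difference with its 95 % confidence interval, can be computed as below. This is a sketch with invented marks; the default t critical value assumes df = 9 (n = 10, as in this study) and should be replaced for other sample sizes.

```python
from math import sqrt
from statistics import mean, stdev

def mean_diff_ci(pre, post, t_crit=2.262):
    """Mean paired (post - pre) difference and its 95% CI.

    t_crit defaults to the two-tailed t critical value for df = 9
    (n = 10); supply the appropriate value for other cohort sizes.
    """
    diffs = [b - a for a, b in zip(pre, post)]
    m = mean(diffs)
    se = stdev(diffs) / sqrt(len(diffs))  # standard error of the mean difference
    return m, (m - t_crit * se, m + t_crit * se)

# Hypothetical marks for ten participants
pre = [40, 45, 50, 38, 55, 42, 48, 44, 52, 47]
post = [68, 70, 75, 60, 80, 66, 72, 69, 78, 71]
m, (lo, hi) = mean_diff_ci(pre, post)
print(round(m, 1), (round(lo, 1), round(hi, 1)))
```

A CI that excludes zero conveys the same information as the paired t-test while showing the magnitude of improvement directly, which is why such plots aid clarity.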
4. Discussion and conclusions
While the discussion is generally robust, the generalizability of the findings warrants further nuance given that the sample comprised predominantly high-achieving students. Please comment on how the programme could be scaled to less selective cohorts and specify the human and financial resources that such expansion would entail.
Response: Thank you for your comment. To address this comment, the following sentences “Based on the findings and limitations of our pilot study, we propose that, in the short term, this MRI learning program be implemented as a one-week co-curricular program through the University initiative, “Curtin Extra” during the academic year break (January to mid-February). It will target students who have just completed the BSc (MRS) MRI theoretical component (second-year study) and will not be subject to a quota. However, all future participants must meet the established MRI safety requirements and provide written consent to participate. This co-curricular program will offer an opportunity to further validate our data collection instruments, assess the generalizability of this pilot study’s findings, and evaluate the financial and practical feasibility of the program. If positive outcomes are observed during this extended implementation, the program will subsequently be incorporated into the clinical practice (placement) unit of the BSc (MRS) course, immediately prior to third-year study, during the academic break. In that context, standard clinical placement assessments (including learning contract, direct observation, and reflective writing) would be used to evaluate students’ MRI competence development.” have been added as the last paragraph of the Discussion section.
5. Limitations
The limitations section could be strengthened by addressing two additional aspects:
Statistical power: Was the target power achieved for the subgroup analyses?
Response: Thank you for your comment. To address this comment, the following sentences “Although the small sample size in this study might have impacted the subgroup analyses and introduced potential Type I errors (false positives), such errors do not appear to be present in the subgroup analyses of the questionnaire data, as all p-values exceeded 0.05. Furthermore, large Cohen’s d effect sizes (1.15-5.53) were observed in the subgroup analyses of the test data. It is noted that subgroup statistical power could drop to 20-30% without adjustments for multiple comparisons. Hence, at least small effects (Cohen’s d > 0.2) were expected in these subgroup analyses (Burke et al., 2015; Sullivan & Feinn, 2012).” have been added to the second last paragraph of the Discussion section.
Long-term effects: Is there any evidence (or a plan to gather it) regarding whether the observed benefits persist at 6–12 months?
Response: Thank you for your comment. To address this comment, the following sentence, “As another future research direction, we aim to investigate the long-term impact of this MRI learning program by assessing participants’ MRI competence once they begin professional practice after graduation.” has been added to the last paragraph of the Discussion section.
Reviewer 3 Report
Comments and Suggestions for Authors
Review
Title: Innovative Hands-on Approach for Magnetic Resonance Imaging Education of an Undergraduate Medical Radiation Science Course in Australia: A Feasibility Study
Manuscript Type: Original Article
Thematic Area: Radiologic Science Education / Medical Education
Relevance of the article
- The study addresses a relevant and timely gap in MRS education, grounded in recent changes to professional guidelines in Australia.
- The use of a research-only MRI scanner for educational purposes is an original and valuable contribution.
- The experimental design, although based on a small sample, is well thought out, with appropriate statistical analysis and clear interpretation of results.
- The evaluation of the program’s educational quality is comprehensive, combining quantitative metrics and qualitative content analysis.
- The discussion is critical and well contextualized with previous studies, particularly that of Colucci et al. (2023).
3. Points Requiring Revision
3.1. Validation of Instruments
- The assessment instruments (tests and questionnaires) were based on a previous study, but the manuscript lacks information on their psychometric validation. It is recommended to include details on content validity and reliability (e.g., expert review, Cronbach’s alpha if applicable).
3.2. Discussion on Selection Bias
- The selected sample predominantly includes high-achieving students (9 out of 10). This selection may limit the generalizability of the results. It is advisable to further emphasize this limitation in the discussion.
3.3. Safety of Practical Procedures
- The peer MRI scanning practice (without contrast agents) raises some ethical and deontological concerns. While its implementation is acceptable, the description of safety measures (participant screening, eligibility, medical supervision) is superficial. A more detailed section is recommended, especially considering potential risks associated with high magnetic fields.
3.4. Curricular Applicability
- The discussion could be strengthened with explicit recommendations on how to integrate this program into formal undergraduate MRS curricula. For example: proposed workload, curricular units, assessment criteria.
3.5. Presentation of Results
- The tabular presentation is clear but quite extensive. Including comparative (pre/post) visual graphs would improve the visual accessibility of the key findings.
4. Conclusion and Recommendation
Recommendation: Minor Revisions
The manuscript is recommended for acceptance after incorporation of the following improvements:
- Complete and review any missing information (e.g., funding, ethics approval).
- Include details regarding the validation of the assessment instruments used.
- Expand the discussion on limitations related to sample selection bias.
- Provide a more detailed description of the ethical and safety safeguards related to peer scanning.
- Explicitly propose how this practical activity could be integrated into the formal curriculum.
- Consider including graphical representations to better summarize the results.
- From an ethical standpoint, the manuscript lacks a description of the procedures to follow in case of incidental findings, since the study was also conducted on supposedly healthy individuals. Volunteers should be given the opportunity to decide what the investigators should do if such findings occur, in full respect of their autonomy.
Author Response
Review
Title: Innovative Hands-on Approach for Magnetic Resonance Imaging Education of an Undergraduate Medical Radiation Science Course in Australia: A Feasibility Study
Manuscript Type: Original Article
Thematic Area: Radiologic Science Education / Medical Education
Relevance of the article
The study addresses a relevant and timely gap in MRS education, grounded in recent changes to professional guidelines in Australia.
The use of a research-only MRI scanner for educational purposes is an original and valuable contribution.
The experimental design, although based on a small sample, is well thought out, with appropriate statistical analysis and clear interpretation of results.
The evaluation of the program’s educational quality is comprehensive, combining quantitative metrics and qualitative content analysis.
The discussion is critical and well contextualized with previous studies, particularly that of Colucci et al. (2023).
Response: Thank you for your comment.
3. Points Requiring Revision
3.1. Validation of Instruments
The assessment instruments (tests and questionnaires) were based on a previous study, but the manuscript lacks information on their psychometric validation. It is recommended to include details on content validity and reliability (e.g., expert review, Cronbach’s alpha if applicable).
Response: Thank you for your comment. To address this comment, the last sentence of Section 2.2 has been changed to “Prior to administration, our pre- and post-program questionnaires and tests were piloted with eight experts (three MRI radiographers from different clinical centers including public and private settings and five MRI scientists from the Siemens Healthineers (Australian and New Zealand) Scientific Team) who were not directly involved in this study. They were asked to assess the content validity of the instruments, resulting in several revisions to ensure that the tools were capable of measuring participants’ competences and perceptions of the program (Ng et al., 2024).” Also, the following sentences, “Moreover, only content validity of the questionnaires and tests was assessed by eight experts. Additional psychometric validation of the data collection instruments should be conducted in the future.” have been added to the second last paragraph of the Discussion section.
3.2. Discussion on Selection Bias
The selected sample predominantly includes high-achieving students (9 out of 10). This selection may limit the generalizability of the results. It is advisable to further emphasize this limitation in the discussion.
Response: Thank you for your comment. To address this comment, the following content has been added to the Discussion section as the last paragraph.
“Based on the findings and limitations of our pilot study, we propose that, in the short term, this MRI learning program be implemented as a one-week co-curricular program through the University initiative, “Curtin Extra” during the academic year break (January to mid-February). It will target students who have just completed the BSc (MRS) MRI theoretical component (second-year study) and will not be subject to a quota. However, all future participants must meet the established MRI safety requirements and provide written consent to participate. This co-curricular program will offer an opportunity to further validate our data collection instruments, assess the generalizability of this pilot study’s findings, and evaluate the financial and practical feasibility of the program. If positive outcomes are observed during this extended implementation, the program will subsequently be incorporated into the clinical practice (placement) unit of the BSc (MRS) course, immediately prior to third-year study, during the academic break. In that context, standard clinical placement assessments (including learning contract, direct observation, and reflective writing) would be used to evaluate students’ MRI competence development. As another future research direction, we aim to investigate the long-term impact of this MRI learning program by assessing participants’ MRI competence once they begin professional practice after graduation. (Ng, 2020).”
3.3. Safety of Practical Procedures
The peer MRI scanning practice (without contrast agents) raises some ethical and deontological concerns. While its implementation is acceptable, the description of safety measures (participant screening, eligibility, medical supervision) is superficial. A more detailed section is recommended, especially considering potential risks associated with high magnetic fields.
Response: Thank you for your comment. To address this comment, the following information has been added to Section 2.
“The following information was provided in the participant information statement to inform potential participants of the risks associated with participating in the program.
“There are no foreseeable physical risks (including projectile injury, implant displacement, acoustic trauma and magnet quench causing asphyxiation) from this research project (as all MRI scans will be conducted as per the WA NIF Node’s standard operating procedures and you have already completed MRI safety training as part of the MRI theoretical component of Curtin University’s Bachelor of Science (MRS) program, enabling you to observe all safety requirements at all times). You will be asked to complete the attached WA NIF Node MRI safety questionnaire which the WA NIF Node MRI personnel will thoroughly check to ensure you do not have any metal on you before you enter the scanning room.
There might be 2 potential psychological risks associated with the MRI scanning of yourself and your peers.
1. Incidental findings observed during the MRI scanning. This might cause potential stress to respective participants due to health and privacy concerns. You should see your GPs [general practitioners] if you suspect having any medical diseases or conditions. In case of any potentially significant findings observed during the MRI scanning, we will advise you to approach your GPs according to Ahpra’s [Australian Health Practitioner Regulation Agency’s] requirements. When you are scanned by your peers, they will be able to see MRI images of your body, resulting in privacy concern. However, all participants are registered as students with MRPBA and required to observe its privacy and confidentiality requirements during the MRI scanning. It is not uncommon for healthcare professionals to provide clinical services to those who they know in clinical practice, e.g. healthcare professionals visiting their staff clinics run by their colleagues for medical services, etc. Participant numbers (codes) instead of names will be entered into the MRI scanner for initiating the scans to further address any privacy concerns.
2. Triggering claustrophobia when being scanned by your peers. However, you can use the MRI call button provided to terminate the scan at any point.”
3.4. Curricular Applicability
The discussion could be strengthened with explicit recommendations on how to integrate this program into formal undergraduate MRS curricula. For example: proposed workload, curricular units, assessment criteria.
Response: Thank you for your comment. To address this comment, the following content has been added to the Discussion section as the last paragraph.
“Based on the findings and limitations of our pilot study, we propose that, in the short term, this MRI learning program be implemented as a one-week co-curricular program through the University initiative, “Curtin Extra” during the academic year break (January to mid-February). It will target students who have just completed the BSc (MRS) MRI theoretical component (second-year study) and will not be subject to a quota. However, all future participants must meet the established MRI safety requirements and provide written consent to participate. This co-curricular program will offer an opportunity to further validate our data collection instruments, assess the generalizability of this pilot study’s findings, and evaluate the financial and practical feasibility of the program. If positive outcomes are observed during this extended implementation, the program will subsequently be incorporated into the clinical practice (placement) unit of the BSc (MRS) course, immediately prior to third-year study, during the academic break. In that context, standard clinical placement assessments (including learning contract, direct observation, and reflective writing) would be used to evaluate students’ MRI competence development. As another future research direction, we aim to investigate the long-term impact of this MRI learning program by assessing participants’ MRI competence once they begin professional practice after graduation. (Ng, 2020).”
3.5. Presentation of Results
The tabular presentation is clear but quite extensive. Including comparative (pre/post) visual graphs would improve the visual accessibility of the key findings.
Response: Thank you for your comment. To address this comment, an interval plot of test mark mean differences (actual magnetic resonance imaging (MRI) competence improvements) with 95 % confidence intervals for all participants and various subgroups has been added (Figure 1).
4. Conclusion and Recommendation
Recommendation: Minor Revisions
The manuscript is recommended for acceptance after incorporation of the following improvements:
1. Complete and review any missing information (e.g., funding, ethics approval).
Response: Thank you for your comment. The funding, ethics approval and other necessary information was stated in the original manuscript submitted. Please see below for the information.
“Funding: This research and the APC were funded by Royal Perth Hospital Imaging Research Grant 2024, grant number SGA0260924.
Institutional Review Board Statement: The study was conducted in accordance with the Declaration of Helsinki, and approved by Curtin University Human Research Ethics Committee (approval number: HRE2024-0674 and date of approval: 21 November 2024).
Acknowledgments: We would like to thank the WA Country Health Service Deputy Chief Medical Imaging Technologist, MRI radiographers from Perth Children’s Hospital and Perth Radiological Clinic (PRC) and MRI scientists of the Siemens Healthineers (Australia and New Zealand) Scientific Team, for their support in piloting the pre- and post-program questionnaires and tests. Additionally, we extend our gratitude to the PRC MRI radiographers for supervising the participants’ MRI scanning and to the students who participated in this study for their valuable contributions of time and effort. We acknowledge the facilities, and scientific and technical assistance of the NIF, a National Collaborative Research Infrastructure Strategy (NCRIS) capability, at the Centre for Microscopy, Characterisation, and Analysis, the UWA.”
However, as per the Education Sciences’ Instructions for Authors (https://www.mdpi.com/journal/education/instructions), “A double-blind peer review process is applied, where authors’ identities are not known to reviewers”. Hence, information such as funding, ethics approval and acknowledgements details was removed by the in-house editorial staff for the double-blind peer review, and the manuscript provided to the reviewers did not have this information. We will ensure that this information is stated in the published paper.
2. Include details regarding the validation of the assessment instruments used.
Response: Thank you for your comment. To address this comment, the last sentence of Section 2.2 has been changed to “Prior to administration, our pre- and post-program questionnaires and tests were piloted with eight experts (three MRI radiographers from different clinical centers including public and private settings and five MRI scientists from the Siemens Healthineers (Australian and New Zealand) Scientific Team) who were not directly involved in this study. They were asked to assess the content validity of the instruments, resulting in several revisions to ensure that the tools were capable of measuring participants’ competences and perceptions of the program (Ng et al., 2024).” Also, the following sentences, “Moreover, only content validity of the questionnaires and tests was assessed by eight experts. Additional psychometric validation of the data collection instruments should be conducted in the future.” have been added to the second last paragraph of the Discussion section.
3. Expand the discussion on limitations related to sample selection bias.
Response: Thank you for your comment. To address this comment, the following content has been added to the Discussion section as the last paragraph.
“Based on the findings and limitations of our pilot study, we propose that, in the short term, this MRI learning program be implemented as a one-week co-curricular program through the University initiative, “Curtin Extra” during the academic year break (January to mid-February). It will target students who have just completed the BSc (MRS) MRI theoretical component (second-year study) and will not be subject to a quota. However, all future participants must meet the established MRI safety requirements and provide written consent to participate. This co-curricular program will offer an opportunity to further validate our data collection instruments, assess the generalizability of this pilot study’s findings, and evaluate the financial and practical feasibility of the program. If positive outcomes are observed during this extended implementation, the program will subsequently be incorporated into the clinical practice (placement) unit of the BSc (MRS) course, immediately prior to third-year study, during the academic break. In that context, standard clinical placement assessments (including learning contract, direct observation, and reflective writing) would be used to evaluate students’ MRI competence development. As another future research direction, we aim to investigate the long-term impact of this MRI learning program by assessing participants’ MRI competence once they begin professional practice after graduation. (Ng, 2020).”
4. Provide a more detailed description of the ethical and safety safeguards related to peer scanning.
Response: Thank you for your comment. To address this comment, the following information has been added to Section 2.
“The following information was provided in the participant information statement to inform potential participants of the risks associated with participating in the program.
“There are no foreseeable physical risks (including projectile injury, implant displacement, acoustic trauma and magnet quench causing asphyxiation) from this research project (as all MRI scans will be conducted as per the WA NIF Node’s standard operating procedures and you have already completed MRI safety training as part of the MRI theoretical component of Curtin University’s Bachelor of Science (MRS) program, enabling you to observe all safety requirements at all times). You will be asked to complete the attached WA NIF Node MRI safety questionnaire which the WA NIF Node MRI personnel will thoroughly check to ensure you do not have any metal on you before you enter the scanning room.
There may be two potential psychological risks associated with the MRI scanning of yourself and your peers.
1. Incidental findings observed during the MRI scanning. These might cause stress to the respective participants due to health and privacy concerns. You should see your GP [general practitioner] if you suspect you have any medical diseases or conditions. In case of any potentially significant findings observed during the MRI scanning, we will advise you to approach your GP in accordance with Ahpra’s [Australian Health Practitioner Regulation Agency’s] requirements. When you are scanned by your peers, they will be able to see MRI images of your body, raising privacy concerns. However, all participants are registered as students with the MRPBA and are required to observe its privacy and confidentiality requirements during the MRI scanning. It is not uncommon for healthcare professionals to provide clinical services to people they know in clinical practice, e.g., healthcare professionals visiting staff clinics run by their colleagues for medical services. Participant numbers (codes), instead of names, will be entered into the MRI scanner when initiating the scans to further address any privacy concerns.
2. Triggering claustrophobia when being scanned by your peers. However, you can use the MRI call button provided to terminate the scan at any point.”
5. Explicitly propose how this practical activity could be integrated into the formal curriculum.
Response: Thank you for your comment. To address this comment, the same paragraph quoted in full in our response to Comment 3 above (beginning “Based on the findings and limitations of our pilot study, …”) has been added to the Discussion section as the last paragraph.
6. Consider including graphical representations to better summarize the results.
Response: Thank you for your comment. To address this comment, an interval plot of test mark mean differences (actual magnetic resonance imaging (MRI) competence improvements), with 95% confidence intervals, for all participants and various subgroups has been added (Figure 1).
7. From an ethical standpoint, the manuscript lacks a description of the procedures to follow in case of incidental findings, since the study was also conducted on supposedly healthy individuals. Volunteers should be given the opportunity to decide what the investigators should do if such findings occur, in full respect of their autonomy.
Response: Thank you for your comment. To address this comment, the participant information statement risk disclosure quoted in full in our response to Comment 4 above has been added to Section 2. In particular, that statement describes the procedure for incidental findings: participants are advised to approach their GPs in accordance with Ahpra’s requirements if any potentially significant findings are observed during the MRI scanning.
Round 2
Reviewer 2 Report
Comments and Suggestions for Authors
The revised manuscript addresses several points raised in the initial round—most notably, the authors now (i) disclose Cohen’s d as their effect-size metric, (ii) relocate two extensive tables to an appendix, and (iii) supply sample questionnaire items. While these steps represent genuine progress, a series of substantive shortcomings still prevent the study from meeting the standards of scientific soundness expected for publication.
1. Theoretical foundation
The introduction remains largely descriptive and fails to integrate established learning frameworks with the intervention at hand. Merely adding two meta-analytic citations on simulation fidelity does not suffice. The authors should articulate how experiential learning (Kolb), adult-learning principles (Knowles), multimedia learning theory (Mayer), and INACSL/Jeffries standards explicitly underpin each programme component (phantom scanning, peer scanning, preparatory readings). Ideally, at least one testable hypothesis should be derived from each theoretical construct.
2. Psychometric transparency
Although eight domain experts reviewed the survey instruments, no quantitative evidence of content validity (e.g., I-CVI, Aiken’s V) or internal consistency (Cronbach’s α or KR-20) is provided. Because self-assessed competence constitutes a primary outcome, rigorous psychometric documentation is essential; otherwise, interpretation of the observed gains remains speculative.
3. Statistical robustness
The manuscript still reports a large number of subgroup t-tests without any multiplicity control. A narrative statement that “type-I error does not appear to be present” is not adequate. The authors must apply a recognised correction procedure (Holm–Bonferroni, Benjamini–Hochberg) or, preferably, employ multivariate models (e.g., mixed ANOVA with time × group) and report adjusted p-values alongside effect sizes and 95 % confidence intervals.
4. Presentation of figures and tables
The newly added interval plot is helpful; however, figure numbering is now out of sequence, captions omit sample sizes, and several screenshots disrupt the flow of results. All figures should be renumbered consecutively, include n and error bars, and—if space is limited—move illustrative photographs to supplementary material. Table and appendix numbering should likewise be harmonised.
5. External validity and selection bias
The sample comprises exclusively high-achieving students (course-weighted average > 80 %), yet the manuscript offers no quantitative comparison with the wider cohort. Please present demographic and academic characteristics of the entire second- and third-year population, discuss potential ceiling effects, and outline recruitment strategies to include more diverse performers in future iterations.
Author Response
The revised manuscript addresses several points raised in the initial round—most notably, the authors now (i) disclose Cohen’s d as their effect-size metric, (ii) relocate two extensive tables to an appendix, and (iii) supply sample questionnaire items. While these steps represent genuine progress, a series of substantive shortcomings still prevent the study from meeting the standards of scientific soundness expected for publication.
Response: Thank you for your comment.
1. Theoretical foundation
The introduction remains largely descriptive and fails to integrate established learning frameworks with the intervention at hand. Merely adding two meta-analytic citations on simulation fidelity does not suffice. The authors should articulate how experiential learning (Kolb), adult-learning principles (Knowles), multimedia learning theory (Mayer), and INACSL/Jeffries standards explicitly underpin each programme component (phantom scanning, peer scanning, preparatory readings). Ideally, at least one testable hypothesis should be derived from each theoretical construct.
Response: Thank you for your comment. To address this comment, the following changes have been made to various sections to show the theoretical foundations of our pilot study.
The following sentences, “Their findings could be explained by Knowles’ (1990) adult learning theory, which suggests that learning is more effective for adults when they recognize the relevance of knowledge and skills to real-world situations and perceive them as directly applicable to their professional duties.” and “According to the International Nursing Association for Clinical Simulation and Learning (INACSL) Standards of Best Practice, simulation-based learning experiences should be pilot tested with measurable objectives prior to full implementation (INACSL Standards Committee, 2016).” have been added to the last paragraph of the Introduction section.
The second sentence of Section 2.1 has been changed to “At the beginning of Part 1, a briefing about the program activities was provided to all participants to meet the INACSL Standards of Best Practice Criterion 7, “Begin simulation-based experiences with a prebriefing” (INACSL Standards Committee, 2016).” Also, the following sentences have been added to Section 2.1: “These two readings combined text, graphical illustrations and MRI system snapshots, which could promote more effective learning as per Mayer’s (2002) multimedia principle of multimedia learning.”; “These arrangements supported the participants in preparing effectively for the second-week activities and maximizing their outcomes, in line with the INACSL Standards of Best Practice for simulation-based learning (INACSL Standards Committee, 2016).”; and “These arrangements were made based on the INACSL Standards of Best Practice for simulation-based learning (INACSL Standards Committee, 2016).”
The first sentence of Section 2.2 has been changed to “The pre- and post-program questionnaires and tests were developed based on the respective data collection instruments by Colucci et al. (2023) to evaluate the program in accordance with the INACSL Standards of Best Practice for simulation-based learning (INACSL Standards Committee, 2016).”
The following sentence, “This approach also supports the implementation of Kolb’s (1984) Experiential Learning Cycle, encompassing the stages of concrete experience, reflective observation, abstract conceptualization, and active experimentation.” has been added to the last paragraph of the Discussion section.
2. Psychometric transparency
Although eight domain experts reviewed the survey instruments, no quantitative evidence of content validity (e.g., I-CVI, Aiken’s V) or internal consistency (Cronbach’s α or KR-20) is provided. Because self-assessed competence constitutes a primary outcome, rigorous psychometric documentation is essential; otherwise, interpretation of the observed gains remains speculative.
Response: Thank you for your comment. To address this comment, the second last sentence of the second last paragraph of the Discussion section has been changed to “Moreover, although only the content validity of the questionnaires and tests was assessed by eight experts, our findings regarding participants’ self-perceived and actual competences appear to be consistent.”
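If quantitative content-validity evidence were added in a future revision, the item-level content validity index (I-CVI) the reviewer mentions is straightforward to compute from the experts' relevance ratings. The sketch below is illustrative only: the function name and the eight ratings are hypothetical, not data from this study, and it assumes the conventional 4-point relevance scale on which ratings of 3 or 4 count as "relevant".

```python
def i_cvi(ratings):
    """Item-level content validity index: the proportion of experts
    who rate the item as relevant (3 or 4 on a 4-point scale)."""
    relevant = sum(1 for r in ratings if r >= 3)
    return relevant / len(ratings)

# Hypothetical ratings from eight experts for one questionnaire item.
example_ratings = [4, 3, 4, 4, 2, 3, 4, 3]
print(round(i_cvi(example_ratings), 3))  # → 0.875 (7 of 8 experts rate it relevant)
```

With eight experts, commonly cited guidance treats an I-CVI of roughly 0.78 or higher as acceptable, so 7 of 8 agreeing (0.875) would pass that threshold.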
3. Statistical robustness
The manuscript still reports a large number of subgroup t-tests without any multiplicity control. A narrative statement that “type-I error does not appear to be present” is not adequate. The authors must apply a recognised correction procedure (Holm–Bonferroni, Benjamini–Hochberg) or, preferably, employ multivariate models (e.g., mixed ANOVA with time × group) and report adjusted p-values alongside effect sizes and 95 % confidence intervals.
Response: Thank you for your comment. To address this comment, the fifth sentence of Section 2.3 has been changed to “In addition, subgroup analyses for second- and third-year students, those with and without an MRI / physics background, and the physics and instrumentation and clinical practice questions were conducted with Holm-Bonferroni correction.” Furthermore, the third sentence of the second last paragraph of the Discussion section has been changed to “Although the small sample size in this study might have impacted the subgroup analyses and introduced potential type I errors (false positives), the Holm-Bonferroni method was used to correct their p-values.” The respective p-values in Tables 3, C1 and C2 have been adjusted accordingly.
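For reference, the Holm-Bonferroni adjustment described above can be sketched as follows. The p-values used here are placeholders for illustration, not the study's actual subgroup results.

```python
def holm_bonferroni(p_values):
    """Return Holm-Bonferroni adjusted p-values in the original order.

    The k-th smallest p-value is multiplied by (m - k + 1), capped at 1,
    and monotonicity is enforced over the sorted sequence.
    """
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    adjusted = [0.0] * m
    running_max = 0.0
    for rank, idx in enumerate(order):
        adj = min(1.0, (m - rank) * p_values[idx])
        running_max = max(running_max, adj)  # keep adjusted values monotone
        adjusted[idx] = running_max
    return adjusted

# Placeholder p-values for four hypothetical subgroup t-tests.
raw = [0.010, 0.040, 0.030, 0.005]
print([round(p, 3) for p in holm_bonferroni(raw)])  # → [0.03, 0.06, 0.06, 0.02]
```

Note that an unadjusted p = 0.04 can become non-significant (0.06) after adjustment, which is exactly the behaviour the reviewer's multiplicity concern targets.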
4. Presentation of figures and tables
The newly added interval plot is helpful; however, figure numbering is now out of sequence, captions omit sample sizes, and several screenshots disrupt the flow of results. All figures should be renumbered consecutively, include n and error bars, and—if space is limited—move illustrative photographs to supplementary material. Table and appendix numbering should likewise be harmonised.
Response: Thank you for your comment. To address this comment, the figure, table and appendix numbering, captions and screenshots have been reviewed, and the necessary changes have been made. For example, the Figure 1 caption has been changed to: “Interval plot of test mark mean differences (actual magnetic resonance imaging (MRI) competence improvements) for all participants (n=10) and various subgroups: second-year (n=4); third-year (n=6); and with (n=3) and without MRI / physics background (n=7), depicted as means and 95% confidence intervals.”
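For readers reproducing such an interval plot, the quantities in the caption (mean of paired test-mark differences with a 95% confidence interval, plus the paired-samples Cohen's d mentioned earlier in the review) can be computed as sketched below. The pre/post marks are hypothetical placeholders, not the study's data, and the two-sided 95% t critical value for n = 10 paired differences (df = 9) is hardcoded.

```python
import math
import statistics

def paired_summary(pre, post, t_crit):
    """Mean of paired differences, its confidence interval, and Cohen's d
    for paired samples (mean difference / SD of differences)."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)           # sample SD of the differences
    half_width = t_crit * sd_d / math.sqrt(n)
    cohens_d = mean_d / sd_d
    return mean_d, (mean_d - half_width, mean_d + half_width), cohens_d

# Hypothetical pre/post test marks (%) for ten participants.
pre  = [55, 60, 48, 62, 70, 58, 65, 50, 61, 57]
post = [60, 75, 55, 66, 84, 60, 72, 68, 64, 70]
T_CRIT_DF9 = 2.262                           # two-sided 95% t value, df = 9
mean_d, ci, d = paired_summary(pre, post, T_CRIT_DF9)
print(round(mean_d, 1), tuple(round(x, 1) for x in ci), round(d, 2))
# → 8.8 (4.7, 12.9) 1.55
```

Plotting one such mean with its interval per subgroup, as in Figure 1, then only requires an error-bar chart of these (mean, half-width) pairs.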
5. External validity and selection bias
The sample comprises exclusively high-achieving students (course-weighted average > 80 %), yet the manuscript offers no quantitative comparison with the wider cohort. Please present demographic and academic characteristics of the entire second- and third-year population, discuss potential ceiling effects, and outline recruitment strategies to include more diverse performers in future iterations.
Response: Thank you for your comment. To address this comment, the first sentence of Section 3.1 has been changed to “Twenty-three expressions of interest were received from the BSc (MRS) students (second-year: n=12; course weighted average (CWA): 78.2% (mean) and 63.5-83.5% (range); third-year: n=11; CWA: 79.1% (mean) and 72.6-85.7% (range)) to join the hands-on MRI learning program.” The following sentence has been added to the second last paragraph of the Discussion section: “Given that the mean CWAs of the second- and third-year students (78.2% and 79.1%, respectively) were close to those of our study participants, concerns about selection bias are unlikely to be significant.” In addition, immediately after the sentence in the last paragraph of the Discussion section, “It will target students who have just completed the BSc (MRS) MRI theoretical component (second-year study) and will not be subject to a quota.”, the following sentence has been added: “This approach may help mitigate selection bias and the ceiling effect (inability to yield significant gains), because our findings indicate that second-year students demonstrated greater improvements than their third-year counterparts (Staus et al., 2021).”