Article
Peer-Review Record

No Learner Left Behind: How Medical Students’ Background Characteristics and Psychomotor/Visual–Spatial Abilities Correspond to Aptitude in Learning How to Perform Clinical Ultrasounds†

Emerg. Care Med. 2025, 2(3), 31; https://doi.org/10.3390/ecm2030031
by Samuel Ayala 1,*, Eric R. Abrams 2, Lawrence A. Melniker 1, Laura D. Melville 1 and Gerardo C. Chiricolo 3
Reviewer 1:
Reviewer 2: Anonymous
Reviewer 3:
Reviewer 4:
Submission received: 12 April 2025 / Revised: 11 June 2025 / Accepted: 18 June 2025 / Published: 25 June 2025

Round 1

Reviewer 1 Report

Comments and Suggestions for Authors

This manuscript investigates whether psychomotor and visual-spatial abilities, along with background characteristics, can predict aptitude in clinical ultrasound among third-year medical students. The topic is timely and relevant, especially with the growing integration of point-of-care ultrasound in medical curricula. This study has practical implications for personalized educational strategies in clinical training.

Major Comments:

Page 2, Lines 77–83: While psychomotor and visual-spatial abilities are defined, the connection between these constructs and specific ultrasound skills could be strengthened. To ground the study theoretically, consider referencing specific cognitive load theories or skill acquisition frameworks.

Page 7, Lines 233–236: The study focuses solely on the subxiphoid cardiac view, which limits generalizability. It is suggested that this be acknowledged more prominently in the discussion and that multi-view assessments be recommended in future studies.

Page 7, Lines 237–239: The ultrasound task was scored by a single rater, which may introduce subjective bias. The use of multiple blinded raters or inter-rater reliability data would enhance the rigor.

Page 6, Table 4: While p-values indicate significance, effect sizes or confidence intervals are not reported. Including these would help interpret the practical significance of findings.

Minor Comments:

Page 1, Lines 16–18: Consider rephrasing “lag in assessing this skill in trainees” to “a gap in assessing this skill among trainees” for improved clarity.

Page 6–7, Lines 205–211: The discussion reiterates definitions and skill overlaps with limited synthesis. Streamlining this with a more precise comparison to prior work (e.g., Nicholls et al.) would improve readability.

Overall, this is a well-executed, innovative study that addresses an educational gap in ultrasound training. With some revisions to strengthen methodological transparency and generalizability, this manuscript will make a valuable contribution to medical education literature.

Author Response

Reviewer 1

This manuscript investigates whether psychomotor and visual-spatial abilities, along with background characteristics, can predict aptitude in clinical ultrasound among third-year medical students. The topic is timely and relevant, especially with the growing integration of point-of-care ultrasound in medical curricula. This study has practical implications for personalized educational strategies in clinical training.

Major Comments:

Page 2, Lines 77–83: While psychomotor and visual-spatial abilities are defined, the connection between these constructs and specific ultrasound skills could be strengthened. To ground the study theoretically, consider referencing specific cognitive load theories or skill acquisition frameworks. Thank you for this advice. I have added substantially more information, references, and connections to the study in the Introduction section. This is a major addition, and I hope it provides a stronger theoretical groundwork.

Page 7, Lines 233–236: The study focuses solely on the subxiphoid cardiac view, which limits generalizability. It is suggested that this be acknowledged more prominently in the discussion and that multi-view assessments be recommended in future studies. Really great point. We struggled as authors to select a view that was both anatomically accessible and not overly complex for novice sonographers to visualize and interpret. A simple application such as soft-tissue ultrasound would miss the complex spatial relationships, while a challenging one such as locating the gallbladder (which at times requires advanced psychomotor maneuvers) might have been met with failure and frustration. We agreed on the subxiphoid view of the heart, obtained in a single healthy model, because understanding the chamber locations and surrounding organs exercises the unique spatial relationships while keeping the psychomotor movements within the reach of novice sonographers. The points above have been added to the Discussion section as suggested.

Page 7, Lines 237–239: The ultrasound task was scored by a single rater, which may introduce subjective bias. The use of multiple blinded raters or inter-rater reliability data would enhance the rigor. Another great point, and one we struggled with as authors: the possibility of having multiple raters score the students on the ultrasound task of finding the subxiphoid view. Because the students were accessible only during a 6-week rotation, we were able to organize only one ultrasound task session. These sessions took approximately 12–15 minutes on average (including initial training and rating by a single rater) and had to be coordinated with the same single rater and the same healthy volunteer to allow for reproducibility, though this leaves the scoring liable to bias. In addition, at a separate session the students completed a survey, a visual–spatial test (Purdue Spatial Relations Test), and a psychomotor task (Purdue Pegboard Test), which required considerable coordination of student volunteers. The study certainly could have benefited from multiple raters, given more ample coordination. The points above have also been added to the Discussion section.

Page 6, Table 4: While p-values indicate significance, effect sizes or confidence intervals are not reported. Including these would help interpret the practical significance of findings. I have now included confidence intervals in the data section.

Minor Comments:

Page 1, Lines 16–18: Consider rephrasing “lag in assessing this skill in trainees” to “a gap in assessing this skill among trainees” for improved clarity. Changed, thanks.

Page 6–7, Lines 205–211: The discussion reiterates definitions and skill overlaps with limited synthesis. Streamlining this with a more precise comparison to prior work (e.g., Nicholls et al.) would improve readability. Again, thank you. I have added material and tried to make the connections and comparisons to prior work clearer in the discussion section.

Overall, this is a well-executed, innovative study that addresses an educational gap in ultrasound training. With some revisions to strengthen methodological transparency and generalizability, this manuscript will make a valuable contribution to medical education literature. Thank you!

Reviewer 2 Report

Comments and Suggestions for Authors

Why was there so little participation from students?

Only 97 in a two-year period? Please explain.

Kindly improve the discussion.


Author Response

Reviewer 2

Why was there so little participation from students?

Only 97 in a two-year period? Please explain. Very keen of you to notice. In the department where I work, third-year medical students rotate every 6 weeks, and the number of students arriving for each rotation varies widely (sometimes 4 to 10). There are also multiple breaks throughout the year. Over the two-year period, I was able to enroll 97 students, at which point recruitment stopped. Looking at previous studies from my reference section, many had fewer than 100 participants. I have added a brief statement to the Results section. Thank you for asking.

Kindly improve the discussion. Absolutely; I have added more comments and references to the Discussion section, along with another limitation noted by a different reviewer. Thank you.

Reviewer 3 Report

Comments and Suggestions for Authors

Hello, dear colleagues!

On the one hand, your research is original and may be of interest to readers. However, I have to admit that the manuscript contains a number of shortcomings that significantly reduce its scientific and practical value.

First of all, the writing style does not meet academic standards: it is overloaded with repetitions ("This begs the question...", "the objective was to determine whether these characteristics...are predictive... of aptitude...for learning ultrasound"), the phrases are often redundant and vague. The descriptions of the methodology are not detailed: the details of the procedure, the criteria for inclusion/exclusion of students are not specified, there is no information on the validity of the tests used.

Unfortunately, the relevance and discussion sections are based on a limited reference list (12 sources), more than half of which are older than 10 years. From these references it is quite difficult to assess whether your research is really relevant today, when the growth of digitalization is so intense.

In general, the text needs significant revision and in-depth analysis.

Author Response

Reviewer 3

On the one hand, your research is original and may be of interest to readers. However, I have to admit that the manuscript contains a number of shortcomings that significantly reduce its scientific and practical value.

First of all, the writing style does not meet academic standards: it is overloaded with repetitions ("This begs the question...", "the objective was to determine whether these characteristics...are predictive... of aptitude...for learning ultrasound"), the phrases are often redundant and vague. The descriptions of the methodology are not detailed: the details of the procedure, the criteria for inclusion/exclusion of students are not specified, there is no information on the validity of the tests used.

-Thank you for the recommendation on the repetitive statements. I have rephrased these and added detail where possible.

-Concerning the methodology, I am not certain what additional detail is needed. I described the procedure as thoroughly as possible and included specifics for many of the steps in the Appendix sections, which may help clarify things.

-Inclusion/exclusion criteria are specified on lines 92–97; I have added a little more information there as well.

-Information on the validity of the tests is provided in the Appendix sections and in the Discussion.

Unfortunately, the relevance and discussion sections are based on a limited reference list (12 sources), more than half of which are older than 10 years. It is quite difficult to assess from these references whether your research is really relevant today, when the growth of digitalization is so intense. Thank you for noticing. I had approximately 6–8 additional articles that I used as background material but did not include in the Introduction or Discussion because I did not quote them. I have now revisited this and made appropriate additions to those sections, with the corresponding references listed. In addition, with the help of an academic librarian, I have searched the literature again for more recent articles. I should mention that this is an innovative study of visual–spatial and psychomotor abilities in medicine, an area in which few studies have been done recently; I am trying to understand and address this educational gap.

In general, the text needs significant revision and in-depth analysis. I am working hard on it, with excellent recommendations from the reviewers.

Reviewer 4 Report

Comments and Suggestions for Authors

Here are my comments:

  1. In this manuscript, the participants are identified as third-year MD students. However, their undergraduate academic backgrounds are not specified. It would be valuable to include this information, particularly to explore whether students with prior studies in basic sciences or pharmaceutical sciences demonstrate any differences in performance or learning outcomes.
  2. Learning efficacy is often closely linked to the language of instruction, especially when it aligns with the learner’s mother tongue. Therefore, it is recommended to include data on the students’ native languages to assess whether language congruence has any impact on their learning experience or academic performance.
  3. In the References, please use a consistent capitalization format; for example, refs. 1 and 2 are in different styles.

Author Response

Reviewer 4

  1. In this manuscript, the participants are identified as third-year MD students. However, their undergraduate academic backgrounds are not specified. It would be valuable to include this information, particularly to explore whether students with prior studies in basic sciences or pharmaceutical sciences demonstrate any differences in performance or learning outcomes. This is a very valid point; undergraduate academic backgrounds would be interesting, and knowing the associated performance/learning outcomes could be of benefit. Although the Appendix A survey asked for some basic background information, I did not gather undergraduate backgrounds from the students. I will note this missing information in the Limitations section.
  2. Learning efficacy is often closely linked to the language of instruction, especially when it aligns with the learner’s mother tongue. Therefore, it is recommended to include data on the students’ native languages to assess whether language congruence has any impact on their learning experience or academic performance. This would likewise be a valuable correlation to examine, and I appreciate the recommendation. Unfortunately, I did not gather that information from the students; I will note this missing information in the Limitations section.
  3. In the References, please use a consistent capitalization format; for example, refs. 1 and 2 are in different styles. Noted and changed; thank you for catching that.


Round 2

Reviewer 3 Report

Comments and Suggestions for Authors

Dear colleagues, hello!

Thank you for working on the manuscript; it is now clearer to me what you have researched. Nevertheless, there are still several points that need clarification.

1. Sample size: how was the calculation carried out, according to what formula, and with what inputs?
2. How were the specialists involved in the included methods, for example ultrasound, validated?
3. The list of references contains many articles that are 15 years old or more, which casts doubt on their relevance.

Author Response

Dear colleagues, hello!

Thank you for working on the manuscript; it is now clearer to me what you have researched. Nevertheless, there are still several points that need clarification. I will do my best.

  1. Sample size: how was the calculation carried out, according to what formula, and with what inputs? Thank you for this question. Our hospital statistician, Mr. Matt Briggs, assisted us with the statistics. We reviewed the available literature: of the approximately 10 studies reviewed in endoscopy, anesthesia, histopathology, etc. (some of which I used in my references), many had an N < 50, and several only 10–20 participants. We therefore determined that approximately 100 would be adequate. Unfortunately, I am not aware of a formal calculation having been carried out by the statistician. Note that we kept the students' constraints in mind: we assumed practically all of them would be available for all testing, giving a good catchment, but each 6-week cohort had a variable number of students (4–10) who were available only for that period. With his guidance, we felt that two years and approximately 100 students would be adequate.
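For reference, a conventional a priori estimate for a correlational design of this kind can be sketched with the Fisher z approximation; the target correlation of r = 0.3 below is purely illustrative and was not part of the study's actual planning, which relied on comparable published studies as described above:

```python
from math import atanh, ceil
from statistics import NormalDist

def correlation_sample_size(r: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate N needed to detect a population correlation r with a
    two-sided test, via Fisher's z: n = ((z_alpha + z_beta) / atanh(r))^2 + 3."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # two-sided critical value
    z_beta = z.inv_cdf(power)           # power term
    return ceil(((z_alpha + z_beta) / atanh(r)) ** 2) + 3

# A modest correlation of r = 0.3 at 80% power requires about 85 participants,
# close to the 97 enrolled here; stronger effects (e.g., r = 0.5) require far fewer.
```

This is only a sketch of how such estimates are typically reported, not the method used in the study.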

  2. How did you validate specialists for the included methods, for example, ultrasound? I am sorry; I should have added this previously. The specialists used in the Methods section were Emergency Medicine physicians with a minimum of 4 years' experience in the field and a rigorous one-year Ultrasound Fellowship. They all worked in an academic institution where they used ultrasound clinically and taught it to students, residents, and ultrasound fellows. I have clarified this in more detail beginning at line 152.

  3. The list of references contains many references to articles that are 15 years old or more, which casts doubt on the relevance. I do notice that there is a paucity of articles on this subject. As a clinical sonographer and educator myself, I see the importance of assessing a learner's skills when I observe them performing ultrasound. Although I do not administer standardized tests (the Purdue Pegboard Test or the Purdue Spatial Relations Test) before a rotation starts, I constantly and consistently use the same ultrasound task rubric in my head to assess skills. As I work with learners, I also mentally assess their ability to make fine psychomotor adjustments so that the organ in question can be imaged on the machine. I acknowledge that the references are older, and I do include a systematic review published in 2023 that also notes this paucity of information. Personally, as an educator of medical students and ultrasound, I feel it is only recently (within the past 10 years) that medical schools have begun to include formal ultrasound education and training in their curricula. Some have longitudinal programs spanning years 1–4; unfortunately, many still have no formal ultrasound training and rely on a sparse introduction during the clerkship (3rd year) and specialty rotations (4th year). Until more robust ultrasound education is incorporated into medical schools, this aptitude approach will not be fully appreciated. This information has been added at lines 333–346. Thank you.

Round 3

Reviewer 3 Report

Comments and Suggestions for Authors

Dear authors!
Thank you; everything is now clear regarding your manuscript.
