Communication

Structured Reporting in Radiology Residency: A Standardized Approach to Assessing Interpretation Skills and Competence

Denise Castro, Siddharth Mishra, Benjamin Y. M. Kwan, Muhammad Umer Nasir, Alan Daneman and Donald Soboleski

1 Department of Diagnostic Radiology, Faculty of Health Sciences, Queen’s University, 99 University Ave, Kingston, ON K7L 3N6, Canada
2 Department of Medical Imaging, University of Toronto, 2109 Medical Sciences Building, 263 McCaul St, 4th Floor, Toronto, ON M5T 1W7, Canada
* Author to whom correspondence should be addressed.
Int. Med. Educ. 2025, 4(1), 2; https://doi.org/10.3390/ime4010002
Submission received: 29 December 2024 / Revised: 13 February 2025 / Accepted: 16 February 2025 / Published: 18 February 2025

Abstract

The field of radiology heavily relies on image interpretation and reporting. Radiology residents undergo evaluations primarily based on their interpretation skills, often encountering varied cases with differing complexities. Assessing resident performance in such a diverse setting poses challenges due to variability in judgment among assessors. One aspect of training that can be standardized is the reporting process. Developing a structured reporting system could aid in evaluating resident milestones and achievement of Entrustable Professional Activities (EPAs), facilitating standardized assessment and comparison among peers. From our experiences, we describe a logical reasoning pathway followed by residents in their training, progressing from recognizing abnormalities to describing findings, identifying associated positive and negative findings, and recommending appropriate management. Each step provides evidence of milestone achievement and can be assessed through structured reporting. We propose that a grading system can be applied to assess perception skills, description accuracy, recognition of associated findings, formulation of differential diagnoses, recommendations, and consultation with clinicians. Comparison between junior and senior resident reports allows for monitoring progression and identifying areas for improvement. Although implementing this grading system poses challenges, it offers potential benefits in providing standardized assessment and guiding individualized learning curves for residents. Despite its limitations, once established, the system could enhance residency training in diagnostic imaging.

The profession of radiology has traditionally consisted in large part of the interpretation of images and generation of a report, with little direct consulting physician or patient exposure [1]. Radiology residents are exposed to, and often evaluated on, their assessment of imaging, with little control over the type or severity of the presenting disease process or the subtlety or complexity of the associated imaging findings [2]. During their block rotations, it is possible that a junior resident (Transition to Discipline or Foundations Resident) may be exposed to difficult, rare cases with subtle findings requiring complex management, while a senior resident (Core or Transition to Practice Resident) may be exposed to straightforward, common cases with obvious imaging findings and management. Attempting to evaluate resident performance and progression in this inherently unstandardized setting is difficult, and variability in assessor judgment of entrustment across participants and assessment settings is a concern [3]. Ongoing reforms in medical residency training and education, such as Competency-Based Education (CBE), mastery learning, Entrustable Professional Activities (EPAs), and adaptive learning workplaces, have emphasized the need for more valid assessments of learning outcomes [4].
What can be standardized and structured, however, is the imaging report. Residents are taught to follow a reporting format during their training, which is often specific to a modality or, in some cases, to a particular body part [5]. The development of a structured report to aid in the assessment of resident milestone and EPA achievement would be valuable and could provide a standardized method by which to evaluate resident progression and allow comparison with their peers. Importantly, the structured report may allow earlier identification of a specific weakness that limits the resident’s ability to achieve specialty status. The purpose of this study was to develop a standardized process to document and assess resident milestone and EPA achievement throughout their residency and maximize individualized learning curve development.
Residents complete several years of training to develop the expertise required for interpreting radiological images, gradually acquiring the necessary skills [6]. From our experiences as academic radiologists at a large tertiary care center and teaching institution, we have found that the ‘thought and reasoning’ process that radiology residents go through to successfully achieve this subspecialty expertise is consistent and follows an expected path. We propose that this process can be explained in the following steps:
  • Similar to the first-year intern learning to recognize who is sick and requires urgent treatment versus who may be sick but can be investigated in due time, the radiology resident must learn early in training to recognize which images are normal and which are abnormal. This forms the foundation for ‘on-call’ responsibility and relies on knowledge gained from medical school, early resident teaching/boot-camp experience and, to some degree, the resident’s inherent perceptual skills.
  • After acquiring that initial ability to recognize that there is an imaging abnormality, the resident begins to progressively learn how to describe the finding using the expected nomenclature. This knowledge is gained through work with the attending radiologist during block rotations that continue throughout residency and through available learning resources.
  • As the resident acquires further imaging knowledge and experience, the thought and reasoning process focuses on advancing their perceptual and descriptive skills to include significant associated ‘positive’ imaging findings that support the diagnosis now in their mental knowledge bank and under consideration in that particular case.
  • With ongoing exposure to varying disease processes, associated imaging and an advancing imaging knowledge base, the progressing resident learns to recognize the value of significant ‘negative’ findings on the imaging study, which can help lead to a more useful consultative report and accurate diagnosis, in order to better guide patient management.
  • As the resident progresses through residency and this initial learning pathway of recognizing, describing, and evaluating for significant positive and negative findings, they will become increasingly confident in their suspected diagnosis and, when appropriate, potential differential diagnoses.
  • Continuing along their learning path, the resident gains familiarity and experience with the different available imaging modalities, becoming knowledgeable about their advantages and disadvantages through further exposure during block rotations and competent in recommending appropriate further imaging, as needed, to guide patient management.
  • Finally, nearing the end of training, the resident will have had exposure to and teaching on most imaging diagnoses and the available imaging modalities. Combined with their specialty rounds/consultations and growing responsibility in patient management during training, the resident will have gained the experience to recommend appropriate consultation to a subspecialty service, together with the knowledge of what needs to come next for the patient and with what urgency.
Each of the steps along this logical reasoning pathway can provide evidence for residents’ achievement of milestones and EPA fulfillment, from the Transition to Discipline (TTD) phase through to the Transition to Practice (TTP) phase, and can be assessed in a resident’s adequately structured report [7]. Evaluation and grading (awarding of points) of each step along the path can be utilized to provide a more standardized assessment of residents, allowing better comparison with peers and monitoring of overall progression through residency, as illustrated with the following case (Figure 1) and the associated senior and junior residents’ reports (Figure 2 and Figure 3).
Step 1—Initially, the resident is evaluated on recognizing the abnormality. This can be determined from the resident’s report, including their PACS note. At the beginning of residency, this will be heavily weighted towards ER (on-call) exposure and common conditions (e.g., pneumothorax). We expect the resident, early on, to recognize an obvious or emergent finding, in order to show competence for on-call duties. The perception of the abnormality can be allocated points in the grading of the report (i.e., 3 points for identification of the primary abnormality versus 0 points for missing it).
Step 2—After evaluating the resident’s ability to perceive the abnormality, we can grade how the resident described the abnormal finding. We expect the senior’s description to be more precise and to use standard nomenclature, compared with the junior’s. This allows for variability in point allocation and provides a means to stratify residents’ abilities (i.e., 3 points for a description of an aggressive ill-marginated mixed lytic and sclerotic tibial metaphyseal lesion versus 1 point for a description of a tibial radiolucency).
Step 3—After perception and description of the primary abnormality, we can then grade the identification and description of the associated positive findings (i.e., 2 points for a description of an interrupted periosteal reaction and another 2 points for an associated soft tissue mass, as in this case).
Step 4—Knowing and reporting the significant associated negative findings provides evidence of an expanded knowledge base and further resident advancement (i.e., 2 points for a description of no involvement of the epiphysis and joint space and another 2 points for stating that there is no further lesion in the tibia).
Step 5—Further confidence and understanding of the positive and negative findings by the resident is reflected in the impression stating the suspected diagnosis and differential diagnoses, with points awarded or potentially subtracted. It would be expected that the senior’s report includes a more thorough differential and/or accurate diagnosis, as compared to the junior’s.
Step 6—Recommending appropriate follow-up imaging and how it should be performed should be part of the standardized report and can be awarded points (i.e., in this case, 2 points for a recommendation of follow-up MR to include both proximal and distal joints).
Step 7—Finally, the advanced resident report should include a recommendation for appropriate consultative services and with what urgency (i.e., in this case, 2 points for a recommendation of urgent oncologic and orthopedic consultations).
When we compare the junior and senior residents’ reports and allot points for each step, we appreciate that the junior resident had the perceptual skill to identify the abnormality, which we would expect early on in residency (Figure 3). If the resident did not notice the abnormality, concern for a possible perception deficiency would be raised, and a decision would need to be made on whether the resident is ready to transition to the next level and to on-call duties, which is a key milestone development goal [8].
The senior resident more fully and accurately describes the primary abnormality, in this case an aggressive bone tumor, obtaining more points than the junior, and also demonstrates an appreciation of the significance of associated positive and negative findings (i.e., interrupted periosteal reaction, soft tissue mass, no other lesion, etc.). The senior resident also accumulates more points by demonstrating knowledge of what follow-up imaging and protocol are needed, as compared to the junior, who may not have had exposure to MRI at their stage of training. Finally, the more advanced resident’s report illustrates a subspecialty knowledge of what is needed next for this patient and how urgently it is needed. Overall, the total points for the senior resident would be expected to be higher than the junior’s and to increase at each step of the pathway over the course of residency. A similar grading system was utilized in a pilot study as part of a previously published paper on competency-based medical education [9].
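To make the point allocation concrete, the following is a minimal sketch, in Python, of how the step-wise rubric could be encoded and tallied for a graded report. The step maxima follow the example point values quoted in Steps 1 through 7 for the sample tibial lesion case; the Step 5 weighting, the per-report allocations, and all identifiers are illustrative assumptions rather than the published grading scheme.

```python
# Minimal sketch of the step-wise report grading described above, applied to the
# sample tibial lesion case. The step maxima follow the point values quoted in
# Steps 1-7; the Step 5 weighting and all identifiers are illustrative assumptions.

RUBRIC_MAX = {
    "step1_perception": 3,          # identification of the primary abnormality
    "step2_description": 3,         # precise description using standard nomenclature
    "step3_positive_findings": 4,   # 2 for interrupted periosteal reaction + 2 for soft tissue mass
    "step4_negative_findings": 4,   # 2 for no epiphyseal/joint involvement + 2 for no further tibial lesion
    "step5_diagnosis": 3,           # suspected diagnosis/differential (weighting assumed, not given in the text)
    "step6_follow_up_imaging": 2,   # follow-up MR covering both proximal and distal joints
    "step7_consultation": 2,        # urgent oncologic and orthopedic consultations
}


def total_score(awarded: dict) -> int:
    """Sum the points awarded per step, capped at each step's maximum."""
    return sum(min(awarded.get(step, 0), cap) for step, cap in RUBRIC_MAX.items())


# Hypothetical allocations for mock senior and junior reports (in the spirit of Figure 3).
senior = {"step1_perception": 3, "step2_description": 3, "step3_positive_findings": 4,
          "step4_negative_findings": 4, "step5_diagnosis": 3, "step6_follow_up_imaging": 2,
          "step7_consultation": 2}
junior = {"step1_perception": 3, "step2_description": 1, "step3_positive_findings": 2}

max_total = sum(RUBRIC_MAX.values())
print(f"Senior: {total_score(senior)}/{max_total}")  # 21/21 with these illustrative numbers
print(f"Junior: {total_score(junior)}/{max_total}")  # 6/21 with these illustrative numbers
```

Recorded per step rather than only as a total, the same record would also show which element of the pathway (perception, description, differential, and so on) accounts for a low score.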
Mastery learning and testing are essential to the achievement and assessment of EPAs and resident milestones [8]. Each step in our described learning path, whether assessing perceptual skills, differential diagnoses, imaging modality knowledge, or medical consultative knowledge, could be evaluated and graded to determine specific areas of weakness and help demonstrate improvement and accomplishment of milestones over time. Comparison with peers could provide valuable insight and guide training. This grading may help in the development of individualized learning curves for the residents. Assessment utilizing learning curves that detail the results of deliberate practice offers advantages over traditional methods, as educators may draw conclusions about the rate and effectiveness of learning and determine whether competence has been achieved during training. Although the rate of learning may differ among residents, most individuals should eventually achieve mastery. Low or falling learning curves could help identify individuals who require intervention and need tailored learning plan development [10].
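To illustrate how repeated report grades might be turned into an individualized learning curve and screened for a low or falling trend, here is a brief sketch; the example scores, smoothing window, and threshold are hypothetical assumptions and would need local validation.

```python
# Sketch: build a simple learning curve from successive report grades (expressed as
# fractions of the rubric maximum) and flag a low or falling trend. The window size
# and the 0.5 floor are assumed values for illustration only.
from statistics import mean


def learning_curve(scores: list, window: int = 3) -> list:
    """Rolling mean of per-report scores."""
    return [mean(scores[max(0, i - window + 1): i + 1]) for i in range(len(scores))]


def needs_intervention(scores: list, floor: float = 0.5) -> bool:
    """Flag a curve that is currently low or has fallen over the last few assessments."""
    curve = learning_curve(scores)
    low = curve[-1] < floor
    falling = len(curve) >= 3 and curve[-1] < curve[-3]
    return low or falling


# Example: one resident's graded reports over successive block rotations.
scores = [0.33, 0.43, 0.52, 0.48, 0.62, 0.71]
print([round(x, 2) for x in learning_curve(scores)])                  # a rising curve
print("Tailored learning plan needed:", needs_intervention(scores))  # False for this example
```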
Apart from the image report grading allowing for assessment of related milestones, the full patient report can also contain information demonstrating further competence in achievement of EPAs and milestones, as depicted in Figure 4.
The implementation of such a grading system needs further assessment. Our goal is to embed particular cases (for which mock reports and grading are already available) into the resident’s rotation work folder, to best standardize assessment of residents over time. The grading could then allow for comparison with their peers and be used to monitor progression. A further possibility would be to grade several randomly selected cases that were previously dictated by the resident during their block rotations. Criterion-based standard/cut scores could ultimately be derived, once enough residents have been assessed and their grades correlated with outcomes [11].
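As one illustration of how criterion-based cut scores could eventually be derived from accumulated grades, the sketch below applies a contrasting-groups standard-setting approach, placing the cut score between the score distributions of residents judged competent and not yet competent on an external outcome; the choice of method and every number shown are assumptions for illustration, not data from this article.

```python
# Sketch of a contrasting-groups cut score: the midpoint between the mean total report
# scores of residents judged competent versus not yet competent on an external outcome.
# All scores and groupings below are invented for illustration.
from statistics import mean


def contrasting_groups_cut(competent: list, not_yet_competent: list) -> float:
    """Midpoint between the two groups' mean scores."""
    return (mean(competent) + mean(not_yet_competent)) / 2


competent_scores = [18, 19, 16, 21, 17]   # hypothetical total report scores (out of 21)
not_yet_scores = [9, 12, 11, 8, 13]

print("Suggested cut score:", contrasting_groups_cut(competent_scores, not_yet_scores))  # 14.4
```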
However, there are several apparent limitations to the suggested resident assessment. Although many reporting templates are available, these are generally not disease-specific, and more dedicated templates would first need to be developed to allow for appropriate standardized reporting. Residents would need to be taught and exposed to the appropriate templates during their training. Non-template reporting would be challenging to grade, due to the time required to filter out the ‘steps’ needed to assess achievement of milestones. Additionally, ‘mock’ reports would need to be developed to cover the many pathologies seen in radiology, to allow for grading. These adjustments would likely require extensive time and collaboration initially; over time, as cases accumulate, the endeavor should become less onerous. The department would also need substantial additional IT support to allow cases to be embedded into the PACS for resident dictation during routine block reporting, and a hospital commitment to optimizing resident training would be needed. Another limitation is that not all skills required to obtain mastery in diagnostic imaging lend themselves to this sort of analysis [10].
Despite the limitations, once the system is established, the process could provide a more standardized assessment of residents and their progression through residency, allowing for documentation of milestone/EPA accomplishment. Individual learning curves could be established to focus training on specific areas of concern, and standard scores could be developed to document mastery in diagnostic imaging.

Funding

This research received no external funding.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Rockall, A.G.; Justich, C.; Helbich, T.; Vilgrain, V. Patient communication in radiology: Moving up the agenda. Eur. J. Radiol. 2022, 155, 110464.
  2. Kwan, B.Y.M.; Mussari, B.; Moore, P.; Meilleur, L.; Islam, O.; Menard, A.; Soboleski, D.; Cofie, N. A pilot study on diagnostic radiology residency case volumes from a Canadian perspective: A marker of resident knowledge. Can. Assoc. Radiol. J. 2020, 71, 448–452.
  3. Wildenberg, J.C.; Chen, P.-H.; Scanlon, M.H.; Cook, T.S. Attending radiologist variability and its effect on radiology resident discrepancy rates. Acad. Radiol. 2017, 24, 694–699.
  4. Jeyalingam, T.; Walsh, C.M.; Tavares, W.; Mylopoulos, M.; Hodwitz, K.; Liu, L.W.; Heitman, S.J.; Brydges, R. Variable or fixed? Exploring entrustment decision making in workplace- and simulation-based assessments. Acad. Med. 2022, 97, 1057–1064.
  5. Burns, J.; Catanzano, T.M.; Schaefer, P.W.; Agarwal, V.; Kim, D.; Goiffon, R.J.; Jordan, S.G. Structured reports and radiology residents: Friends or foes? Acad. Radiol. 2022, 29, S43–S47.
  6. Rutgers, D.R.; Van Raamt, F.; Ten Cate, T.J. Development of competence in volumetric image interpretation in radiology residents. BMC Med. Educ. 2019, 19, 122.
  7. The Royal College of Physicians and Surgeons of Canada. Entrustable Professional Activities for Diagnostic Radiology. Available online: https://www.royalcollege.ca/rcsite/documents/ibd/diagnostic-radiology-str-e.pdf (accessed on 9 August 2024).
  8. Yudkowsky, R.; Park, Y.S.; Lineberry, M.; Knox, A.; Ritter, E.M. Setting mastery learning standards. Acad. Med. 2015, 90, 1495–1500.
  9. Castro, D.; Yang, J.; Greer, M.L.; Kwan, B.; Sauerbrei, E.; Hopman, W.; Soboleski, D. Competency-based medical education—Towards the development of a standardized pediatric radiology testing module. Acad. Radiol. 2020, 27, 1622–1632.
  10. Pusic, M.; Pecaric, M.; Boutis, K. How much practice is enough? Using learning curves to assess the deliberate practice of radiograph interpretation. Acad. Med. 2011, 86, 731–736.
  11. Hejri, S.M.; Jalili, M. Standard setting in medical education: Fundamental concepts and emerging challenges. Med. J. Islam. Repub. Iran 2014, 28, 34.
Figure 1. Radiographs of the tibia provided for sample case of aggressive osseous tibial lesion including (A) frontal and (B) lateral projections.
Figure 2. Sample senior resident report illustrating the application of the multi-step reasoning pathway.
Figure 3. Graded mock (A) senior and (B) junior resident reports. Points allocated to each element of the report have been circled in red.
Figure 4. EPAs assessed within mock (A) senior and (B) junior resident reports. Any EPA which has not been satisfied in the reports has been marked with a red cross.
