Feedback in Medical Education and Clinical Practice: What’s Next?

A special issue of Healthcare (ISSN 2227-9032). This special issue belongs to the section "Nursing".

Deadline for manuscript submissions: 31 December 2024 | Viewed by 1112

Special Issue Editors


Guest Editor
Unit of Development and Research in Medical Education (UDREM), Faculty of Medicine, University of Geneva, 1211 Geneva, Switzerland
Interests: supervision; clinical reasoning; collaborative reasoning between healthcare professionals; faculty development

Guest Editor
Unit of Development and Research in Medical Education (UDREM), Faculty of Medicine, University of Geneva, 1211 Geneva, Switzerland
Interests: training in medical education (pedagogical training workshops for faculty members; course on medical education (ESME); MOOC on clinical reasoning supervision); small group learning; feedback in medical education; academic medicine; research methodology

Special Issue Information

Dear Colleagues,

Feedback is considered an essential element of the educational process that can help trainees develop their skills, grow professionally, and reach their maximum potential. While clinical educators and trainees agree on the importance of feedback during supervision in clinical settings, its provision remains challenging in the clinical context, and its impact on learning and longitudinal professional growth remains difficult to measure.

Over the past decades, the literature on feedback has focused largely on the feedback provider and on feedback techniques. However, even optimally delivered feedback on clinical practice may have little impact if the learner does not perceive it as intended. Different factors, such as the relationship between supervisors and trainees and the institutional feedback culture, influence how a trainee perceives feedback, which in turn determines the impact this feedback has on their learning and progression. In this Special Issue, we thus aim to go beyond the principles of feedback delivery and bring together commentaries, original research, short reports, and reviews that explore and document elements pertaining to feedback in the clinical setting which determine the way feedback may be perceived by trainees.

Dr. Marie Claude Audétat
Dr. Sophie Wurth
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for the submission of manuscripts are available on the Instructions for Authors page. Healthcare is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2700 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • feedback
  • clinical training
  • supervisor-trainee relationship
  • institutional feedback culture
  • longitudinal professional growth

Published Papers (1 paper)


Research

12 pages, 1433 KiB  
Article
Quality Assessment and Modulating Factors on Self-Regulatory Behavior in Peer-Assisted Medical Learning
by Jannis Achenbach, Laura Nockelmann, Michaela Thomas and Thorsten Schäfer
Healthcare 2023, 11(15), 2223; https://doi.org/10.3390/healthcare11152223 - 7 Aug 2023
Viewed by 729
Abstract
Objectives: Standardized extracurricular skills labs courses have been developed in recent decades and are important approaches in peer-assisted medical learning (PAL). To provide high-quality training and achieve effective learning strategies, continuous evaluations and quality assessments are essential. This research aims to evaluate quality data from medical students participating in extracurricular skills labs courses at Ruhr-University Bochum to prospectively optimize concepts and didactical training and standardize processes. Additionally, we set out to assess and quantify factors that influence the self-reflection of competencies. Methods: The analysis was based on a routine assessment of n = 503 attendees of the PAL courses in the skills labs in three consecutive semesters, who voluntarily participated in the evaluation. We analyzed the effects of age, semester, and their interaction on the self-reflection of competencies in technical skills courses using moderated regression and simple slope analyses, as previously published. A univariate analysis of variance (ANOVA) with post hoc Tukey HSD testing was used to compare group means in estimated competencies using IBM SPSS Statistics V.28. Results: The analysis of variance revealed a significant increase in self-assessed competencies between pre- and post-course evaluations in all 35 items (all p < 0.001). A total of 65.5% of the items were adjusted significantly, revealing modified self-reflected pre-course levels compared to those stated before the course. The moderated regression analysis revealed that age (R2 = 0.001, F(1;2347) = 1.88, p = 0.665), semester of study (∆R2 = 0.001, ∆F(1;2346) = 0.012, p = 0.912), and their interaction (∆R2 = 0.001, ∆F(1;2345) = 10.72, p = 0.227) did not explain a significant amount of the variance in self-reflection. A simple slope analysis of earlier (b = 0.07, t = 0.29, p = 0.773) and later (b = 0.06, t = 0.07, p = 0.941) semesters of study yielded slopes that did not differ from zero. Conclusions: The presented evaluation paradigm proved to be a useful tool for encouraging students to initiate self-regulatory and self-reflective behavior. The cohesive evaluation of the large cohort of attendees in extracurricular, facultative skills labs courses was helpful in terms of quality assessment and future adaptations. Further evaluation paradigms should be implemented to assess other potential influencing factors on self-reflection, such as gender, since age and semester did not explain significant differences in the model. Full article
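The moderated-regression design described in the Methods (age, semester, and their interaction as predictors of self-reflected competence, followed by simple slope probing) can be sketched as below. This is a minimal illustration on synthetic data, not the authors' actual SPSS analysis; all variable names and values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the study variables (purely illustrative).
n = 200
age = rng.normal(24.0, 3.0, n)                 # years
semester = rng.integers(1, 11, n).astype(float)  # semester of study
score = 3.5 + rng.normal(0.0, 0.5, n)          # self-reflection rating

# Mean-center predictors before forming the interaction term,
# the usual practice in moderated regression.
age_c = age - age.mean()
sem_c = semester - semester.mean()

# Design matrix: intercept, age, semester, age x semester.
X = np.column_stack([np.ones(n), age_c, sem_c, age_c * sem_c])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)

# Simple slopes of age at -1 SD and +1 SD of semester:
# slope(age | sem) = b_age + b_interaction * sem
for sem_val in (-sem_c.std(), sem_c.std()):
    slope = beta[1] + beta[3] * sem_val
    print(f"simple slope of age at semester {sem_val:+.2f} SD: {slope:.3f}")
```

A significant interaction coefficient (beta[3]) would indicate that the effect of age on self-reflection depends on the semester of study; the simple slopes then show that effect at low and high values of the moderator.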
(This article belongs to the Special Issue Feedback in Medical Education and Clinical Practice: What’s Next?)
