From Written Tests to OSCE: A Study on the Perceptions of Assessment Reform by Students and Faculty in the French Dental Curriculum
Abstract
1. Introduction
2. Materials and Methods
2.1. Analyzed Intervention: OSCE Description
2.2. Aims
2.3. Study Design and Duration
2.4. Questionnaire
2.5. Study Subjects
2.6. Collection of Data
2.7. Data Analysis
3. Results
3.1. Students’ Perception of OSCE (Table 1)
| This evaluation format | Students: Disagree N (%) | Students: Neutral N (%) | Students: Agree N (%) | Assessors: Disagree N (%) | Assessors: Neutral N (%) | Assessors: Agree N (%) |
|---|---|---|---|---|---|---|
| Exam was fair | 9 (11%) | 33 (39%) | 42 (50%) | 0 (0%) | 0 (0%) | 8 (100%) |
| Wide knowledge area covered | 5 (6%) | 12 (14%) | 67 (80%) | 1 (13%) | 2 (25%) | 5 (63%) |
| Needed generally more time | 35 (42%) | 20 (24%) | 29 (35%) | 4 (50%) | 1 (13%) | 3 (38%) |
| Exams well administered | 1 (1%) | 15 (18%) | 68 (81%) | 0 (0%) | 2 (25%) | 6 (75%) |
| Exams very stressful | 5 (6%) | 15 (18%) | 64 (76%) | 0 (0%) | 3 (38%) | 5 (63%) |
| Exams well structured & sequenced | 2 (2%) | 13 (15%) | 69 (82%) | 0 (0%) | 0 (0%) | 8 (100%) |
| Exam minimized chance of failing | 39 (46%) | 38 (45%) | 7 (8%) | 3 (38%) | 4 (50%) | 1 (13%) |
| OSCE less stressful than other exams | 57 (68%) | 23 (27%) | 4 (5%) | 7 (88%) | 1 (13%) | 0 (0%) |
| Allowed student to compensate in some areas | 28 (33%) | 25 (30%) | 31 (37%) | 2 (25%) | 1 (13%) | 5 (63%) |
| Highlighted areas of weakness | 6 (7%) | 19 (23%) | 59 (70%) | 0 (0%) | 0 (0%) | 8 (100%) |
| Exam intimidating | 3 (4%) | 7 (8%) | 74 (88%) | 0 (0%) | 0 (0%) | 8 (100%) |
| Student aware of level of information needed | 12 (14%) | 19 (23%) | 53 (63%) | 0 (0%) | 4 (50%) | 4 (50%) |
| Wide range of clinical skills covered | 0 (0%) | 12 (14%) | 72 (86%) | 0 (0%) | 2 (25%) | 6 (75%) |
| Quality of performance testing | | | | | | |
| Fully aware of nature of exam | 12 (14%) | 21 (25%) | 51 (61%) | 0 (0%) | 4 (50%) | 4 (50%) |
| Tasks reflected those taught | 2 (2%) | 14 (17%) | 68 (81%) | 0 (0%) | 0 (0%) | 8 (100%) |
| Time at each station was adequate | 26 (31%) | 21 (25%) | 37 (44%) | 0 (0%) | 2 (25%) | 6 (75%) |
| Setting and context felt authentic | 4 (5%) | 10 (12%) | 70 (83%) | 0 (0%) | 2 (25%) | 6 (75%) |
| Instructions were clear and unambiguous | 25 (30%) | 28 (33%) | 31 (37%) | 0 (0%) | 4 (50%) | 4 (50%) |
| Tasks asked to perform were fair | 1 (1%) | 17 (20%) | 66 (79%) | 0 (0%) | 1 (13%) | 7 (88%) |
| Sequence of questions logical and appropriate | 1 (1%) | 11 (13%) | 72 (86%) | 0 (0%) | 0 (0%) | 8 (100%) |
| Exam provided opportunities to learn | 10 (12%) | 17 (20%) | 57 (68%) | 1 (13%) | 2 (25%) | 5 (63%) |
| Perception of validity and reliability | | | | | | |
| Exam scores provide true measure of clinical skills | 19 (23%) | 37 (44%) | 28 (33%) | 0 (0%) | 1 (13%) | 7 (88%) |
| Exam scores are standardized | 10 (12%) | 57 (68%) | 17 (20%) | 0 (0%) | 6 (75%) | 2 (25%) |
| This evaluation format is a useful experience for future practice | 4 (5%) | 10 (12%) | 70 (83%) | 0 (0%) | 0 (0%) | 8 (100%) |
| Personality, ethnicity and gender will not affect scores | 25 (30%) | 33 (39%) | 26 (31%) | 2 (25%) | 2 (25%) | 4 (50%) |
3.2. Comparison of OSCE with MCQs and WTs (Table 2 and Table 3a,b)
| Rating of evaluation formats | Students | Students | Students | Assessors | Assessors | Assessors |
|---|---|---|---|---|---|---|
| Which of the following formats is easiest? | Difficult | Undecided | Easy | Difficult | Undecided | Easy |
| OSCE | 27 (33%) | 47 (57%) | 9 (11%) | 2 (25%) | 2 (25%) | 4 (50%) |
| MCQ | 33 (39%) | 29 (35%) | 22 (26%) | 1 (13%) | 4 (50%) | 3 (38%) |
| WT | 21 (25%) | 33 (40%) | 29 (35%) | 3 (38%) | 3 (38%) | 2 (25%) |
| Which of the following formats is fairest? | Unfair | Undecided | Fair | Unfair | Undecided | Fair |
| OSCE | 8 (10%) | 25 (30%) | 50 (60%) | 0 (0%) | 2 (25%) | 6 (75%) |
| MCQ | 32 (38%) | 26 (31%) | 26 (31%) | 2 (25%) | 1 (13%) | 5 (63%) |
| WT | 11 (13%) | 38 (46%) | 34 (41%) | 4 (50%) | 2 (25%) | 2 (25%) |
| From which of the following formats do you learn most? | Learn very little | Undecided | Learn a lot | Learn very little | Undecided | Learn a lot |
| OSCE | 1 (1%) | 18 (22%) | 64 (77%) | 1 (13%) | 1 (13%) | 6 (75%) |
| MCQ | 45 (54%) | 24 (29%) | 14 (17%) | 6 (75%) | 2 (25%) | 0 (0%) |
| WT | 17 (21%) | 40 (48%) | 26 (31%) | 4 (50%) | 3 (38%) | 1 (13%) |
| Which of the following formats should be used more often in the clinical years of the programme? | Used much less | Undecided | Used much more | Used much less | Undecided | Used much more |
| OSCE | 2 (2%) | 12 (14%) | 70 (83%) | 1 (13%) | 0 (0%) | 7 (88%) |
| MCQ | 40 (48%) | 23 (27%) | 21 (25%) | 2 (25%) | 4 (50%) | 2 (25%) |
| WT | 13 (15%) | 38 (45%) | 33 (39%) | 2 (25%) | 4 (50%) | 2 (25%) |
(a)

| This evaluation format | OSCE: Disagree N (%) | OSCE: Neutral N (%) | OSCE: Agree N (%) | MCQ: Disagree N (%) | MCQ: Neutral N (%) | MCQ: Agree N (%) | p-value (OSCE vs. MCQ) | WT: Disagree N (%) | WT: Neutral N (%) | WT: Agree N (%) | p-value (OSCE vs. WT) |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Exam was fair | 9 (11%) | 33 (39%) | 42 (50%) | 35 (42%) | 20 (24%) | 29 (35%) | <0.001 | 14 (17%) | 28 (33%) | 42 (50%) | 0.56 |
| Wide knowledge area covered | 5 (6%) | 12 (14%) | 67 (80%) | 7 (8%) | 9 (11%) | 68 (81%) | 0.92 | 30 (36%) | 17 (20%) | 37 (44%) | >0.99 |
| Needed generally more time | 35 (42%) | 20 (24%) | 29 (35%) | 41 (49%) | 19 (23%) | 24 (29%) | 0.23 | 27 (32%) | 29 (35%) | 28 (33%) | 0.53 |
| Exams well administered | 1 (1%) | 15 (18%) | 68 (81%) | 40 (48%) | 17 (20%) | 27 (32%) | >0.99 | 11 (13%) | 28 (33%) | 45 (54%) | >0.99 |
| Exams very stressful | 5 (6%) | 15 (18%) | 64 (76%) | 19 (23%) | 34 (40%) | 31 (37%) | >0.99 | 18 (21%) | 38 (45%) | 28 (33%) | >0.99 |
| Exams well structured & sequenced | 2 (2%) | 13 (15%) | 69 (82%) | 23 (27%) | 31 (37%) | 30 (36%) | >0.99 | 13 (15%) | 32 (38%) | 39 (46%) | >0.99 |
| Exam minimized chance of failing | 39 (46%) | 38 (45%) | 7 (8%) | 49 (58%) | 31 (37%) | 4 (5%) | 0.09 | 30 (36%) | 28 (33%) | 26 (31%) | 0.005 |
| OSCE less stressful than other exams | 57 (68%) | 23 (27%) | 4 (5%) | 46 (55%) | 33 (39%) | 5 (6%) | 0.01 | 45 (54%) | 26 (31%) | 13 (15%) | <0.001 |
| Allowed student to compensate in some areas | 28 (33%) | 25 (30%) | 31 (37%) | 47 (56%) | 20 (24%) | 17 (20%) | 0.001 | 13 (15%) | 22 (26%) | 49 (58%) | 0.001 |
| Highlighted areas of weakness | 6 (7%) | 19 (23%) | 59 (70%) | 35 (42%) | 22 (26%) | 27 (32%) | >0.99 | 25 (30%) | 26 (31%) | 33 (39%) | >0.99 |
| Exam intimidating | 3 (4%) | 7 (8%) | 74 (88%) | 36 (43%) | 22 (26%) | 26 (31%) | >0.99 | 37 (44%) | 22 (26%) | 25 (30%) | >0.99 |
| Student aware of level of information needed | 12 (14%) | 19 (23%) | 53 (63%) | 32 (38%) | 20 (24%) | 32 (38%) | >0.99 | 9 (11%) | 25 (30%) | 50 (60%) | 0.97 |
| Wide range of clinical skills covered | 0 (0%) | 12 (14%) | 72 (86%) | 30 (36%) | 20 (24%) | 34 (40%) | >0.99 | 29 (35%) | 28 (33%) | 27 (32%) | >0.99 |
(b)

| Quality of performance testing | OSCE: Disagree N (%) | OSCE: Neutral N (%) | OSCE: Agree N (%) | MCQ: Disagree N (%) | MCQ: Neutral N (%) | MCQ: Agree N (%) | p-value (OSCE vs. MCQ) | WT: Disagree N (%) | WT: Neutral N (%) | WT: Agree N (%) | p-value (OSCE vs. WT) |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Fully aware of nature of exam | 12 (14%) | 21 (25%) | 51 (61%) | 19 (23%) | 17 (20%) | 48 (57%) | 0.27 | 12 (14%) | 19 (23%) | 53 (63%) | 0.81 |
| Tasks reflected those taught | 2 (2%) | 14 (17%) | 68 (81%) | 18 (21%) | 30 (36%) | 36 (43%) | >0.99 | 4 (5%) | 19 (23%) | 61 (73%) | 0.19 |
| Time at each station was adequate | 26 (31%) | 21 (25%) | 37 (44%) | 14 (17%) | 16 (19%) | 54 (64%) | 0.002 | 20 (24%) | 21 (25%) | 43 (51%) | 0.17 |
| Setting and context felt authentic | 4 (5%) | 10 (12%) | 70 (83%) | 54 (64%) | 17 (20%) | 13 (15%) | >0.99 | 35 (42%) | 22 (26%) | 27 (32%) | >0.99 |
| Instructions were clear and unambiguous | 25 (30%) | 28 (33%) | 31 (37%) | 52 (62%) | 21 (25%) | 11 (13%) | >0.99 | 16 (19%) | 29 (35%) | 39 (46%) | 0.06 |
| Tasks asked to perform were fair | 1 (1%) | 17 (20%) | 66 (79%) | 22 (26%) | 32 (38%) | 30 (36%) | >0.99 | 2 (2%) | 32 (38%) | 50 (60%) | 0.002 |
| Sequence of questions logical and appropriate | 1 (1%) | 11 (13%) | 72 (86%) | 31 (37%) | 31 (37%) | 22 (26%) | >0.99 | 5 (6%) | 30 (36%) | 49 (58%) | >0.99 |
| Exam provided opportunities to learn | 10 (12%) | 17 (20%) | 57 (68%) | 35 (42%) | 30 (36%) | 19 (23%) | >0.99 | 24 (29%) | 29 (35%) | 31 (37%) | >0.99 |
| Perception of validity and reliability | | | | | | | | | | | |
| Exam scores provide true measure of clinical skills | 19 (23%) | 37 (44%) | 28 (33%) | 66 (79%) | 13 (15%) | 5 (6%) | >0.99 | 44 (52%) | 25 (30%) | 15 (18%) | <0.001 |
| Exam scores are standardized | 10 (12%) | 57 (68%) | 17 (20%) | 20 (24%) | 32 (38%) | 32 (38%) | 0.56 | 31 (37%) | 41 (49%) | 12 (14%) | 0.001 |
| This evaluation format is a useful experience for future practice | 4 (5%) | 10 (12%) | 70 (83%) | 47 (56%) | 28 (33%) | 9 (11%) | >0.99 | 14 (17%) | 42 (50%) | 28 (33%) | >0.99 |
| Personality, ethnicity and gender will not affect scores | 25 (30%) | 33 (39%) | 26 (31%) | 3 (4%) | 8 (10%) | 73 (87%) | >0.99 | 7 (8%) | 23 (27%) | 54 (64%) | >0.99 |
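The p-values in Table 3a,b compare the OSCE response distribution with that of the MCQ or WT format on each item. The Methods text is not reproduced in this excerpt, so the exact test is an assumption; a plausible reading is a Pearson chi-square on the 2 × 3 contingency table of Disagree/Neutral/Agree counts, sketched below for the "Exam was fair" row (OSCE vs. MCQ, reported as p < 0.001):

```python
# Hedged sketch: re-checking one Table 3a comparison with a Pearson
# chi-square test (the article's actual statistical method is assumed
# here, not confirmed by the excerpt above).

def chi_square_stat(table):
    """Pearson chi-square statistic for an r x c table of counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# "Exam was fair": students' Disagree / Neutral / Agree counts
osce = [9, 33, 42]   # OSCE column of Table 3a
mcq = [35, 20, 29]   # MCQ column of Table 3a
stat = chi_square_stat([osce, mcq])

# df = (2 - 1) * (3 - 1) = 2; the chi-square critical value at
# alpha = 0.001 with 2 degrees of freedom is about 13.82, so a
# statistic above it is consistent with the reported p < 0.001.
print(f"chi-square = {stat:.2f}, exceeds 13.82: {stat > 13.82}")
```

For this row the statistic comes out near 20.9 on 2 degrees of freedom, which agrees with the table's p < 0.001; rows reported as p > 0.99 would be consistent with a multiplicity correction (e.g. Bonferroni) capping adjusted values at 1, but that too is an assumption.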
3.3. Students vs. Assessors (Table 1 and Table 2)
4. Discussion
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
| Abbreviation | Meaning |
|---|---|
| WT | Written Test |
| MCQs | Multiple Choice Questions |
| OSCE | Objective Structured Clinical Examination |
| CTCS | Clinical and Therapeutic Synthesis Certificate |
References
- Müller, S.; Settmacher, U.; Koch, I.; Dahmen, U. A pilot survey of student perceptions on the benefit of the OSCE and MCQ modalities. GMS J. Med. Educ. 2018, 35, Doc51.
- Rushforth, H.E. Objective structured clinical examination (OSCE): Review of literature and implications for nursing education. Nurse Educ. Today 2007, 27, 481–490.
- Sung, H.; Kim, M.; Park, J.; Shin, N.; Han, Y. Effectiveness of Virtual Reality in Healthcare Education: Systematic Review and Meta-Analysis. Sustainability 2024, 16, 8520.
- Harden, R.M.; Stevenson, M.; Downie, W.W.; Wilson, G.M. Assessment of clinical competence using objective structured examination. Br. Med. J. 1975, 1, 447–451.
- Harden, R.M.; Gleeson, F.A. Assessment of clinical competence using an objective structured clinical examination (OSCE). Med. Educ. 1979, 13, 39–54.
- Alaidarous, S.; Mohamed, T.A.; Masuadi, E.; Wali, S.; Almalki, A. Saudi Internal Medicine Residents' Perceptions of the Objective Structured Clinical Examination as a Formative Assessment Tool. Health Prof. Educ. 2016, 2, 121–129.
- Bertrand, C.; Hodges, B.; Segouin, C.; Gagnayre, R.; Ammirati, C.; Marty, J.; Farcet, J.P. Les examens cliniques par objectifs structurés [Objective structured clinical examinations]. Prat. Anesth. Réanim. 2008, 12, 212–217.
- Foley, T.; McLoughlin, K.; Walsh, E.K.; Leggett, P.; O'Reilly, M.; Owens, M.; Jennings, A.A. The candidate perspective of the clinical competency test (CCT) of the MICGP examination: A mixed-methods study. BJGP Open 2018, 2, bjgpopen18X101605.
- Brand, H.; Schoonheim-Klein, M. Is the OSCE more stressful? Examination anxiety and its consequences in different assessment methods in dental education. Eur. J. Dent. Educ. 2009, 13, 147–153.
- Schoonheim-Klein, M.E.; Habets, L.L.M.H.; Aartman, I.H.A.; van der Vleuten, C.P.; Hoogstraten, J.; van der Velden, U. Implementing an Objective Structured Clinical Examination (OSCE) in dental education: Effects on students' learning strategies. Eur. J. Dent. Educ. 2006, 10, 226–235.
- Schoonheim-Klein, M.; Hoogstraten, J.; Habets, L.; Aartman, I.; Van der Vleuten, C.; Manogue, M.; Van der Velden, U. Language background and OSCE performance: A study of potential bias. Eur. J. Dent. Educ. 2007, 11, 222–229.
- Schoonheim-Klein, M.; Muijtens, A.; Habets, L.; Manogue, M.; Van der Vleuten, C.; Hoogstraten, J.; Van der Velden, U. On the reliability of a dental OSCE, using SEM: Effect of different days. Eur. J. Dent. Educ. 2008, 12, 131–137.
- Eberhard, L.; Hassel, A.; Bäumer, A.; Becker, F.; Beck-Mußotter, J.; Bömicke, W.; Corcodel, N.; Cosgarea, R.; Eiffler, C.; Giannakopoulos, N.N.; et al. Analysis of quality and feasibility of an objective structured clinical examination (OSCE) in preclinical dental education: A preclinical dental OSCE. Eur. J. Dent. Educ. 2011, 15, 172–178.
- Mühling, T.; Schreiner, V.; Appel, M.; Leutritz, T.; König, S. Comparing Virtual Reality–Based and Traditional Physical Objective Structured Clinical Examination (OSCE) Stations for Clinical Competency Assessments: Randomized Controlled Trial. J. Med. Internet Res. 2025, 27, e55066.
- UNECD. Le Mal-Être des Étudiants en Odontologie: Parlons-en et Agissons! [Dental students' distress: let's talk about it and act!] National Survey of the French National Union of Dental Students. 2018. Available online: https://www.unecd.com/dossier_presse/enquete-votre-bien-etre-parlons-en/ (accessed on 15 December 2025).
- Pierre, R.; Wierenga, A.; Barton, M.; Branday, J.; Christie, C. Student evaluation of an OSCE in paediatrics at the University of the West Indies, Jamaica. BMC Med. Educ. 2004, 4, 22.
- Yeates, P.; Sebok-Syer, S.S. Hawks, Doves and Rasch decisions: Understanding the influence of different cycles of an OSCE on students' scores using Many Facet Rasch Modeling. Med. Teach. 2016, 39, 92–99.
- Tamblyn, R.; Abrahamowicz, M.; Brailovsky, C.; Grand'Maison, P.; Lescop, J.; Norcini, J.; Girard, N.; Haggerty, J. Association Between Licensing Examination Scores and Resource Use and Quality of Care in Primary Care Practice. JAMA 1998, 280, 989–996.
- Tamblyn, R.; Abrahamowicz, M.; Dauphinee, W.D.; Hanley, J.A.; Norcini, J.; Girard, N.; Grand'Maison, P.; Brailovsky, C. Association Between Licensure Examination Scores and Practice in Primary Care. JAMA 2002, 288, 3019–3026.
© 2026 by the authors. Published by MDPI on behalf of the Academic Society for International Medical Education. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Share and Cite
Prosper, A.; Broutin, A.; Lê, S.; Cecchin-Albertoni, C.; Monsarrat, P.; Thomas, C.; Laurencin, S.; Cousty, S.; Gendron, B.; Destruhaut, F.; et al. From Written Tests to OSCE: A Study on the Perceptions of Assessment Reform by Students and Faculty in the French Dental Curriculum. Int. Med. Educ. 2026, 5, 7. https://doi.org/10.3390/ime5010007

