Article

From Written Tests to OSCE: A Study on the Perceptions of Assessment Reform by Students and Faculty in the French Dental Curriculum

1 Department of Dentistry, Faculty of Health, Université de Toulouse, 3 Chemin des Maraîchers, CHU de Toulouse, 31400 Toulouse, France
2 InCOMM (Intestine ClinicOmics Metabolism & Microbiota), Institut des Maladies Métaboliques et Cardiovasculaires (I2MC), 31432 Toulouse, France
3 Association pour la Sauvegarde des Enfants Invalides (ASEI), Centre Paul Dottin, 31520 Ramonville, France
4 Inserm RESTORE Team Research Center, Université de Toulouse, 31400 Toulouse, France
5 SPHERE Team, Inserm UMR 1295, Centre d’Epidémiologie et de Recherche en Santé des Populations (CERPOP), Université de Toulouse, 31400 Toulouse, France
6 LAPLACE Laboratory, Unité Mixte de Recherche Centre National de la Recherche Scientifique–Institut National Polytechnique, Université de Toulouse, 31400 Toulouse, France
7 Laboratoire Interdisciplinaire de Recherche en Didactique, Education et Formation (EA 3749), Département des Sciences de l’Éducation, Université Montpellier Paul Valéry, 34199 Montpellier, France
8 BIOETHICS Team, Inserm UMR 1295, Centre d’Epidémiologie et de Recherche en Santé des Populations (CERPOP), Université de Toulouse, 31400 Toulouse, France
9 U1048 Inserm, Institut des Maladies Métaboliques et Cardiovasculaires, Université de Toulouse, 31400 Toulouse, France
10 Unité Mixte de Recherche Education Formation Travail Savoir, Université Toulouse II Jean Jaurès, 31058 Toulouse, France
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Int. Med. Educ. 2026, 5(1), 7; https://doi.org/10.3390/ime5010007
Submission received: 28 November 2025 / Revised: 19 December 2025 / Accepted: 23 December 2025 / Published: 6 January 2026

Abstract

Traditional assessment methods in dental education, such as written tests and multiple-choice questions, primarily measure theoretical knowledge but inadequately evaluate clinical and interpersonal competencies. The Objective Structured Clinical Examination (OSCE), recognized globally for its validity and reliability, addresses these limitations and is widely adopted in medical curricula; however, its implementation in dental education remains poorly documented. This study explored perceptions of OSCE compared to traditional formats within the Clinical and Therapeutic Synthesis Certificate (CTSC) at Toulouse Faculty of Health during its first OSCE-based session in January 2019. Eighty-four fifth-year students and eight faculty assessors completed a validated questionnaire assessing fairness, educational value, and stress levels. Results indicated that OSCE was perceived as covering diverse clinical skills (86%) and offering authentic scenarios (83%). Despite being stressful (76%), OSCE was considered the fairest (60% vs. MCQ 31%, WT 41%; p < 0.001) and most educational (77% vs. MCQ 17%, WT 31%). Eighty-three percent of students recommended its broader use, while assessors unanimously endorsed its fairness and utility. Both groups highlighted its formative potential. These findings support OSCE’s integration into French dental curricula to strengthen competency-based assessment and enhance clinical skill evaluation.

1. Introduction

Traditional assessment methods in health education, such as written tests (WTs) and multiple-choice questions (MCQs), primarily measure theoretical knowledge but fail to adequately evaluate practical and relational skills essential for clinical practice [1,2]. In parallel, immersive educational technologies, such as virtual reality (VR), are increasingly explored to support simulation-based learning and clinical skills training in health professions education [3].
To address these limitations, innovative approaches have emerged in medical education, notably the Objective Structured Clinical Examination (OSCE) introduced by Harden et al. in 1975 [4,5]. OSCE is a performance-based assessment consisting of multiple stations where candidates demonstrate clinical skills, knowledge, and professional behavior in standardized, simulated scenarios. Its validity, reliability, and objectivity have been widely documented [6,7,8], and its implementation in dental curricula has been reported in countries such as The Netherlands [9,10,11,12] and Germany [13]. More recently, immersive technologies such as VR have also been investigated within OSCE settings, with randomized studies reporting feasibility and assessment performance comparable to traditional physical stations [14]. However, no published study has described its integration into French dental education.
In 2018, a national survey conducted by the French National Union of Dental Students (UNECD) highlighted students’ concerns regarding their well-being and their desire for pedagogical evolution [15]. Among the recommendations were the development of innovative assessment methods and improved preparation for patient–practitioner interactions. Responding to these findings, the Faculty of Dentistry at Toulouse initiated a reform of the Clinical and Therapeutic Synthesis Certificate (CTSC), traditionally assessed through a four-hour written test. A dedicated working group concluded that OSCE was the most appropriate format to meet these needs, given its ability to evaluate a broad range of clinical skills and its strong docimological qualities [3,5,6]. Accordingly, OSCE was implemented as a replacement for the traditional written assessment (and related formats such as MCQ/WT) within the CTSC, rather than as an additional complementary method. Supplementary Material provides detailed information on the OSCE station design, checklists, and implementation procedures at the Faculty of Dentistry, Toulouse.
Objective: This study aims to assess students’ and teachers’ perceptions of OSCE as an assessment method within the CTSC and to compare these perceptions with those of traditional formats (MCQ and WT).

2. Materials and Methods

2.1. Analyzed Intervention: OSCE Description

This OSCE comprised a circuit of five stations assessing the following clinical skills: medical history taking, the diagnostic process, drug prescription, communication with the patient, and description of the clinical steps of an operative procedure.
The disciplines evaluated included prevention, conservative dentistry, endodontics, periodontology, oral surgery, prosthodontics, dental imaging, and pediatric dentistry.
This evaluation format allows standardized exposure of students to a wide variety of clinical skills within a relatively short time period.
Each station lasted 6 min. A 2 min interval between stations allowed the student to move from one room to the next and the assessor to complete the scoring grid. Each student completed the circuit over a 37 min period. The assessment took place on a single day. Eighty-five students of the fifth-year cohort were evaluated, and eleven teachers/assessors were involved in running two parallel circuits.
A standardized marking technique was used: students’ performances were assessed against criteria defined in advance and listed as a checklist of skill-based items for each station. Criterion-based scoring was applied, with each checklist item scored 0 (omitted), 0.5 (incomplete), or 1 (complete). Some items of crucial importance for the patient could, if omitted, reduce the station mark to 0, for example, in cases where the patient’s health would be endangered, such as prescribing a medication to which the patient is allergic.
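As an illustration, the station-level scoring rule described above can be sketched as follows. This is a minimal sketch, not the faculty's actual grading software; the function name and checklist items are hypothetical.

```python
def score_station(checklist: dict, critical_items: set) -> float:
    """Score one OSCE station from its checklist.

    checklist maps each item to 0 (omitted), 0.5 (incomplete), or 1 (complete).
    Omitting any critical item (an item whose omission endangers the patient,
    e.g. not asking about allergies before prescribing) zeroes the station.
    """
    for item, score in checklist.items():
        if item in critical_items and score == 0:
            return 0.0  # critical omission: whole station falls to 0
    return sum(checklist.values())  # otherwise, sum the criterion scores

# Hypothetical prescription-station checklist:
checklist = {"asks about allergies": 1, "correct dosage": 0.5, "explains risks": 1}
print(score_station(checklist, {"asks about allergies"}))  # 2.5
checklist["asks about allergies"] = 0
print(score_station(checklist, {"asks about allergies"}))  # 0.0
```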
The form of the exam, its construction, and its content validity [8] had been established beforehand by a core group comprising a teaching hospital assistant and a resident in general dentistry. Stations were selected to represent the knowledge and skills necessary for daily dental practice and to reflect authentic clinical situations, including emergency management of dental trauma, pain management, and communication skills in sensitive situations. Checklists were designed to include the features the development committee considered most important, and consensus on the checklist items and structure was reached through discussion.
Following this exam, a feedback and correction session was organized at Toulouse’s Faculty of Health, Department of Odontology.

2.2. Aims

Primary Objective:
To evaluate students’ and teachers’ perceptions of the implementation of the Objective Structured Clinical Examination (OSCE) as an assessment method within the Clinical and Therapeutic Synthesis Certificate (CTSC).
Secondary Objectives:
To compare these perceptions with those of traditional assessment formats (multiple-choice questions [MCQs] and written tests [WTs]).
To identify convergences and divergences between students and teachers regarding fairness, validity, reliability, and educational impact.
To explore the feasibility and acceptability of OSCE as an innovative assessment tool in a French dental curriculum.

2.3. Study Design and Duration

In this monocentric observational study, we used a validated self-administered questionnaire adapted from Pierre et al. [16], which students and teachers completed after the OSCE. The analytic approach was quantitative.

2.4. Questionnaire

The validated questionnaire developed and published by Pierre et al. in 2004 [16] to evaluate a pediatric OSCE at the University of the West Indies, Jamaica, and reused by Alaidarous et al. in 2016 [6], seemed to us the most pertinent for evaluating perceptions of our OSCE. Drawing on Müller et al.’s comparative study of OSCE and MCQ perceptions [1], Pierre’s questionnaire items were translated and adapted to address the OSCE, MCQ, and WT formats. The translation was reviewed and approved by a native English speaker (English instructor, Université Toulouse III—Paul Sabatier, Toulouse, France). The questionnaire’s suitability for the MCQ and WT formats was approved by the OSCE development committee.

2.5. Study Subjects

Based on Alaidarous et al.’s observation that excluding assessors can be overly restrictive [6], we chose to include both students and assessors in our study. The inclusion criteria were as follows: students had to belong to the fifth-year cohort (85 students in total); assessors were eligible if they were members of the CTSC jury and had contributed to the development of at least one examination item or to the grading of at least one MCQ and one WT assessment. In total, 12 assessors were included.
Thus, 97 participants were included. Each of these participants was asked to complete Pierre et al.’s translated questionnaire about the three evaluation formats.
This study was conducted in accordance with the principles outlined in the Declaration of Helsinki. All of the participants were informed about the study’s progress and gave their oral consent to participate. This research was approved by the Ethical Committee of the Department of Dentistry under number RnIPH 2024-119.

2.6. Collection of Data

Six assessors received their questionnaires in paper format; the other six received them electronically via Google Forms® (Google, Mountain View, CA, USA). The questionnaires for the different evaluation formats were administered in a random order. One assessor was excluded for not having participated in the development or correction of MCQ and WT assessments. The 85 students received a link by email to complete the questionnaire on Google Forms®. They were distributed into 5 groups of 17, within which the questionnaires for the different evaluation formats were administered in a random order.
The data were anonymized before analysis.

2.7. Data Analysis

For the descriptive analysis, data are presented as frequencies and means.
Descriptive and non-parametric tests were applied using R software (R Foundation for Statistical Computing, Vienna, Austria) to compare students’ perceptions of the different evaluation formats. We used Wilcoxon rank-sum and signed-rank tests. The threshold of significance was set at 5% (p ≤ 0.05).
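For readers unfamiliar with the procedure, the paired comparisons above can be illustrated with a small exact Wilcoxon signed-rank implementation. This is a didactic sketch in Python, not the R workflow actually used in the study, and exact enumeration is only practical for small samples.

```python
from itertools import product

def wilcoxon_signed_rank(x, y):
    """Exact two-sided Wilcoxon signed-rank test for small paired samples.

    Zero differences are dropped; tied |differences| receive midranks.
    Returns (W+, p-value), where W+ is the sum of ranks of positive
    differences and the p-value is computed by enumerating all 2^n sign
    patterns under the null hypothesis (each sign equally likely).
    """
    d = [a - b for a, b in zip(x, y) if a != b]
    n = len(d)
    # Rank the absolute differences, assigning midranks to ties.
    order = sorted(range(n), key=lambda i: abs(d[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(d[order[j + 1]]) == abs(d[order[i]]):
            j += 1
        midrank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = midrank
        i = j + 1
    w_plus = sum(r for r, v in zip(ranks, d) if v > 0)
    # Two-sided p-value: proportion of sign patterns at least as far
    # from the null mean of W+ as the observed statistic.
    mean_w = sum(ranks) / 2
    extreme = 0
    for signs in product((0, 1), repeat=n):
        w = sum(r for s, r in zip(signs, ranks) if s)
        if abs(w - mean_w) >= abs(w_plus - mean_w) - 1e-9:
            extreme += 1
    return w_plus, extreme / 2 ** n

# Five paired observations, all shifted upward: the most extreme outcome.
w, p = wilcoxon_signed_rank([1, 2, 3, 4, 5], [0, 0, 0, 0, 0])
print(w, p)  # 15.0 0.0625
```

With only n = 5 informative pairs the smallest attainable two-sided p-value is 1/16 = 0.0625, which is why rank-based tests need a reasonable sample size to reach the 5% threshold used in the study.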

3. Results

A total of 84 students (99% of those assessed) and 8 assessors (73% of those invited) completed the questionnaire. Students completed the questionnaire after receiving their individual OSCE results.

3.1. Students’ Perception of OSCE (Table 1)

Most students expressed a positive opinion about the structure and administration of OSCE. Specifically, 82% agreed that the exam was well structured and sequenced, and 81% considered it well administered. Regarding coverage, 86% reported that OSCE assessed a wide range of clinical skills, while 80% felt it encompassed a broad knowledge base. In terms of authenticity, 83% agreed that the setting and context felt realistic, and 81% confirmed that the tasks reflected what had been taught. However, OSCE was perceived as intimidating by 88% of students and stressful by 76%, with 68% stating that it was more stressful than other examinations. From an educational perspective, 68% indicated that OSCE provided opportunities for learning, and 83% considered it a practical and useful experience for future professional practice. Nevertheless, only 33% believed that OSCE scores truly measured essential clinical skills, and just 20% trusted the standardization of the assessment.
Table 1. Students’ vs. assessors’ perceptions of the OSCE.

| This evaluation format | Students: Disagree N (%) | Students: Neutral N (%) | Students: Agree N (%) | Assessors: Disagree N (%) | Assessors: Neutral N (%) | Assessors: Agree N (%) |
|---|---|---|---|---|---|---|
| Exam was fair | 9 (11%) | 33 (39%) | 42 (50%) | 0 (0%) | 0 (0%) | 8 (100%) |
| Wide knowledge area covered | 5 (6%) | 12 (14%) | 67 (80%) | 1 (13%) | 2 (25%) | 5 (63%) |
| Needed generally more time | 35 (42%) | 20 (24%) | 29 (35%) | 4 (50%) | 1 (13%) | 3 (38%) |
| Exams well administered | 1 (1%) | 15 (18%) | 68 (81%) | 0 (0%) | 2 (25%) | 6 (75%) |
| Exams very stressful | 5 (6%) | 15 (18%) | 64 (76%) | 0 (0%) | 3 (38%) | 5 (63%) |
| Exams well structured & sequenced | 2 (2%) | 13 (15%) | 69 (82%) | 0 (0%) | 0 (0%) | 8 (100%) |
| Exam minimized chance of failing | 39 (46%) | 38 (45%) | 7 (8%) | 3 (38%) | 4 (50%) | 1 (13%) |
| OSCE less stressful than other exams | 57 (68%) | 23 (27%) | 4 (5%) | 7 (88%) | 1 (13%) | 0 (0%) |
| Allowed student to compensate in some areas | 28 (33%) | 25 (30%) | 31 (37%) | 2 (25%) | 1 (13%) | 5 (63%) |
| Highlighted areas of weakness | 6 (7%) | 19 (23%) | 59 (70%) | 0 (0%) | 0 (0%) | 8 (100%) |
| Exam intimidating | 3 (4%) | 7 (8%) | 74 (88%) | 0 (0%) | 0 (0%) | 8 (100%) |
| Student aware of level of information needed | 12 (14%) | 19 (23%) | 53 (63%) | 0 (0%) | 4 (50%) | 4 (50%) |
| Wide range of clinical skills covered | 0 (0%) | 12 (14%) | 72 (86%) | 0 (0%) | 2 (25%) | 6 (75%) |
| Quality of performance testing | | | | | | |
| Fully aware of nature of exam | 12 (14%) | 21 (25%) | 51 (61%) | 0 (0%) | 4 (50%) | 4 (50%) |
| Tasks reflected those taught | 2 (2%) | 14 (17%) | 68 (81%) | 0 (0%) | 0 (0%) | 8 (100%) |
| Time at each station was adequate | 26 (31%) | 21 (25%) | 37 (44%) | 0 (0%) | 2 (25%) | 6 (75%) |
| Setting and context felt authentic | 4 (5%) | 10 (12%) | 70 (83%) | 0 (0%) | 2 (25%) | 6 (75%) |
| Instructions were clear and unambiguous | 25 (30%) | 28 (33%) | 31 (37%) | 0 (0%) | 4 (50%) | 4 (50%) |
| Tasks asked to perform were fair | 1 (1%) | 17 (20%) | 66 (79%) | 0 (0%) | 1 (13%) | 7 (88%) |
| Sequence of questions logical and appropriate | 1 (1%) | 11 (13%) | 72 (86%) | 0 (0%) | 0 (0%) | 8 (100%) |
| Exam provided opportunities to learn | 10 (12%) | 17 (20%) | 57 (68%) | 1 (13%) | 2 (25%) | 5 (63%) |
| Perception of validity and reliability | | | | | | |
| Exam scores provide true measure of clinical skills | 19 (23%) | 37 (44%) | 28 (33%) | 0 (0%) | 1 (13%) | 7 (88%) |
| Exam scores are standardized | 10 (12%) | 57 (68%) | 17 (20%) | 0 (0%) | 6 (75%) | 2 (25%) |
| This evaluation format is a useful experience for future practice | 4 (5%) | 10 (12%) | 70 (83%) | 0 (0%) | 0 (0%) | 8 (100%) |
| Personality, ethnicity and gender will not affect scores | 25 (30%) | 33 (39%) | 26 (31%) | 2 (25%) | 2 (25%) | 4 (50%) |

3.2. Comparison of OSCE with MCQs and WTs (Table 2 and Table 3a,b)

When asked to rank the three assessment formats, students rated OSCE as the fairest option, with 60% selecting it compared to 31% for MCQs and 41% for written tests. Statistical analysis confirmed significant differences between OSCE and MCQs (p < 0.001) and between OSCE and written tests (p = 0.002). In terms of learning impact, 77% of students stated that OSCE provided the greatest learning opportunity, whereas only 17% said the same for MCQs and 31% for written tests. Regarding preferred frequency, 83% of students wanted OSCE to be used more often during clinical years, compared to 25% for MCQ and 39% for written tests. However, OSCE was consistently perceived as more stressful than MCQs (p = 0.01) and written tests (p < 0.001). Students also considered OSCE scores to better reflect essential clinical skills than written tests (33% vs. 18%, p < 0.001). Finally, time allocation was viewed as more adequate for MCQs (64%) than for OSCE (44%, p = 0.002).
Table 2. Students’ vs. assessors’ ratings of the evaluation formats.

Which of the following formats is easiest?

| Format | Students: Difficult | Students: Undecided | Students: Easy | Assessors: Difficult | Assessors: Undecided | Assessors: Easy |
|---|---|---|---|---|---|---|
| OSCE | 27 (33%) | 47 (57%) | 9 (11%) | 2 (25%) | 2 (25%) | 4 (50%) |
| MCQ | 33 (39%) | 29 (35%) | 22 (26%) | 1 (13%) | 4 (50%) | 3 (38%) |
| WT | 21 (25%) | 33 (40%) | 29 (35%) | 3 (38%) | 3 (38%) | 2 (25%) |

Which of the following formats is fairest?

| Format | Students: Unfair | Students: Undecided | Students: Fair | Assessors: Unfair | Assessors: Undecided | Assessors: Fair |
|---|---|---|---|---|---|---|
| OSCE | 8 (10%) | 25 (30%) | 50 (60%) | 0 (0%) | 2 (25%) | 6 (75%) |
| MCQ | 32 (38%) | 26 (31%) | 26 (31%) | 2 (25%) | 1 (13%) | 5 (63%) |
| WT | 11 (13%) | 38 (46%) | 34 (41%) | 4 (50%) | 2 (25%) | 2 (25%) |

From which of the following formats do you learn most?

| Format | Students: Learn very little | Students: Undecided | Students: Learn a lot | Assessors: Learn very little | Assessors: Undecided | Assessors: Learn a lot |
|---|---|---|---|---|---|---|
| OSCE | 1 (1%) | 18 (22%) | 64 (77%) | 1 (13%) | 1 (13%) | 6 (75%) |
| MCQ | 45 (54%) | 24 (29%) | 14 (17%) | 6 (75%) | 2 (25%) | 0 (0%) |
| WT | 17 (21%) | 40 (48%) | 26 (31%) | 4 (50%) | 3 (38%) | 1 (13%) |

Which of the following formats should be used more often in the clinical years of the programme?

| Format | Students: Used much less | Students: Undecided | Students: Used much more | Assessors: Used much less | Assessors: Undecided | Assessors: Used much more |
|---|---|---|---|---|---|---|
| OSCE | 2 (2%) | 12 (14%) | 70 (83%) | 1 (13%) | 0 (0%) | 7 (88%) |
| MCQ | 40 (48%) | 23 (27%) | 21 (25%) | 2 (25%) | 4 (50%) | 2 (25%) |
| WT | 13 (15%) | 38 (45%) | 33 (39%) | 2 (25%) | 4 (50%) | 2 (25%) |
Table 3. (a) Students’ perception of OSCE vs. their perception of MCQs and WTs. (b) Students’ perception of OSCE vs. their perception of MCQs and WTs (continued).

(a)

| This evaluation format | OSCE: Disagree N (%) | OSCE: Neutral N (%) | OSCE: Agree N (%) | MCQ: Disagree N (%) | MCQ: Neutral N (%) | MCQ: Agree N (%) | p (OSCE vs. MCQ) | WT: Disagree N (%) | WT: Neutral N (%) | WT: Agree N (%) | p (OSCE vs. WT) |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Exam was fair | 9 (11%) | 33 (39%) | 42 (50%) | 35 (42%) | 20 (24%) | 29 (35%) | <0.001 | 14 (17%) | 28 (33%) | 42 (50%) | 0.56 |
| Wide knowledge area covered | 5 (6%) | 12 (14%) | 67 (80%) | 7 (8%) | 9 (11%) | 68 (81%) | 0.92 | 30 (36%) | 17 (20%) | 37 (44%) | >0.99 |
| Needed generally more time | 35 (42%) | 20 (24%) | 29 (35%) | 41 (49%) | 19 (23%) | 24 (29%) | 0.23 | 27 (32%) | 29 (35%) | 28 (33%) | 0.53 |
| Exams well administered | 1 (1%) | 15 (18%) | 68 (81%) | 40 (48%) | 17 (20%) | 27 (32%) | >0.99 | 11 (13%) | 28 (33%) | 45 (54%) | >0.99 |
| Exams very stressful | 5 (6%) | 15 (18%) | 64 (76%) | 19 (23%) | 34 (40%) | 31 (37%) | >0.99 | 18 (21%) | 38 (45%) | 28 (33%) | >0.99 |
| Exams well structured & sequenced | 2 (2%) | 13 (15%) | 69 (82%) | 23 (27%) | 31 (37%) | 30 (36%) | >0.99 | 13 (15%) | 32 (38%) | 39 (46%) | >0.99 |
| Exam minimized chance of failing | 39 (46%) | 38 (45%) | 7 (8%) | 49 (58%) | 31 (37%) | 4 (5%) | 0.09 | 30 (36%) | 28 (33%) | 26 (31%) | 0.005 |
| OSCE less stressful than other exams | 57 (68%) | 23 (27%) | 4 (5%) | 46 (55%) | 33 (39%) | 5 (6%) | 0.01 | 45 (54%) | 26 (31%) | 13 (15%) | <0.001 |
| Allowed student to compensate in some areas | 28 (33%) | 25 (30%) | 31 (37%) | 47 (56%) | 20 (24%) | 17 (20%) | 0.001 | 13 (15%) | 22 (26%) | 49 (58%) | 0.001 |
| Highlighted areas of weakness | 6 (7%) | 19 (23%) | 59 (70%) | 35 (42%) | 22 (26%) | 27 (32%) | >0.99 | 25 (30%) | 26 (31%) | 33 (39%) | >0.99 |
| Exam intimidating | 3 (4%) | 7 (8%) | 74 (88%) | 36 (43%) | 22 (26%) | 26 (31%) | >0.99 | 37 (44%) | 22 (26%) | 25 (30%) | >0.99 |
| Student aware of level of information needed | 12 (14%) | 19 (23%) | 53 (63%) | 32 (38%) | 20 (24%) | 32 (38%) | >0.99 | 9 (11%) | 25 (30%) | 50 (60%) | 0.97 |
| Wide range of clinical skills covered | 0 (0%) | 12 (14%) | 72 (86%) | 30 (36%) | 20 (24%) | 34 (40%) | >0.99 | 29 (35%) | 28 (33%) | 27 (32%) | >0.99 |

(b)

| Quality of performance testing | OSCE: Disagree N (%) | OSCE: Neutral N (%) | OSCE: Agree N (%) | MCQ: Disagree N (%) | MCQ: Neutral N (%) | MCQ: Agree N (%) | p (OSCE vs. MCQ) | WT: Disagree N (%) | WT: Neutral N (%) | WT: Agree N (%) | p (OSCE vs. WT) |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Fully aware of nature of exam | 12 (14%) | 21 (25%) | 51 (61%) | 19 (23%) | 17 (20%) | 48 (57%) | 0.27 | 12 (14%) | 19 (23%) | 53 (63%) | 0.81 |
| Tasks reflected those taught | 2 (2%) | 14 (17%) | 68 (81%) | 18 (21%) | 30 (36%) | 36 (43%) | >0.99 | 4 (5%) | 19 (23%) | 61 (73%) | 0.19 |
| Time at each station was adequate | 26 (31%) | 21 (25%) | 37 (44%) | 14 (17%) | 16 (19%) | 54 (64%) | 0.002 | 20 (24%) | 21 (25%) | 43 (51%) | 0.17 |
| Setting and context felt authentic | 4 (5%) | 10 (12%) | 70 (83%) | 54 (64%) | 17 (20%) | 13 (15%) | >0.99 | 35 (42%) | 22 (26%) | 27 (32%) | >0.99 |
| Instructions were clear and unambiguous | 25 (30%) | 28 (33%) | 31 (37%) | 52 (62%) | 21 (25%) | 11 (13%) | >0.99 | 16 (19%) | 29 (35%) | 39 (46%) | 0.06 |
| Tasks asked to perform were fair | 1 (1%) | 17 (20%) | 66 (79%) | 22 (26%) | 32 (38%) | 30 (36%) | >0.99 | 2 (2%) | 32 (38%) | 50 (60%) | 0.002 |
| Sequence of questions logical and appropriate | 1 (1%) | 11 (13%) | 72 (86%) | 31 (37%) | 31 (37%) | 22 (26%) | >0.99 | 5 (6%) | 30 (36%) | 49 (58%) | >0.99 |
| Exam provided opportunities to learn | 10 (12%) | 17 (20%) | 57 (68%) | 35 (42%) | 30 (36%) | 19 (23%) | >0.99 | 24 (29%) | 29 (35%) | 31 (37%) | >0.99 |
| Perception of validity and reliability | | | | | | | | | | | |
| Exam scores provide true measure of clinical skills | 19 (23%) | 37 (44%) | 28 (33%) | 66 (79%) | 13 (15%) | 5 (6%) | >0.99 | 44 (52%) | 25 (30%) | 15 (18%) | <0.001 |
| Exam scores are standardized | 10 (12%) | 57 (68%) | 17 (20%) | 20 (24%) | 32 (38%) | 32 (38%) | 0.56 | 31 (37%) | 41 (49%) | 12 (14%) | 0.001 |
| This evaluation format is a useful experience for future practice | 4 (5%) | 10 (12%) | 70 (83%) | 47 (56%) | 28 (33%) | 9 (11%) | >0.99 | 14 (17%) | 42 (50%) | 28 (33%) | >0.99 |
| Personality, ethnicity and gender will not affect scores | 25 (30%) | 33 (39%) | 26 (31%) | 3 (4%) | 8 (10%) | 73 (87%) | >0.99 | 7 (8%) | 23 (27%) | 54 (64%) | >0.99 |

3.3. Students vs. Assessors (Table 1 and Table 2)

Assessors’ responses were generally aligned with students’ perceptions, although some differences emerged. Both groups agreed that the OSCE was well structured, with 100% of assessors and 82% of students sharing this view. They also concurred that the exam covered a wide range of skills (75% vs. 86%) and provided authentic scenarios (75% vs. 83%). Assessors unanimously considered the OSCE fair and useful (100% for both), whereas only 50% of students perceived it as fair and 83% as useful. However, divergences were noted regarding the ability of OSCE to measure clinical skills: 88% of assessors expressed confidence in this aspect compared to just 33% of students. Assessors were also more satisfied with time allocation (75% vs. 44%), while students remained more uncertain about fairness and standardization.

4. Discussion

This study provides evidence that the Objective Structured Clinical Examination (OSCE) is perceived as a fair, well-structured, and educational assessment method by both students and assessors. Students particularly appreciated OSCE’s ability to cover a wide range of clinical skills (86%) and simulate authentic clinical scenarios (83%). These findings confirm the docimological qualities of OSCE—validity, reliability, and objectivity—previously reported in the literature [1,2,5,6].
However, OSCE was also considered stressful and intimidating by the majority of participants (76% and 88%, respectively), which is consistent with previous studies in medical and dental education [9,10]. Stress may be explained by the novelty of the format and the high stakes of the CTSC exam, which determines students’ eligibility to practice as substitute dentists. Despite this, students recognized OSCE’s formative value, with 68% reporting learning opportunities during the exam and 83% considering it useful for future practice.
When compared to traditional formats, OSCE was rated as the fairest assessment method (60% vs. 31% for MCQs and 41% for WTs; p < 0.001) and the one that promotes the most learning (77% vs. 17% for MCQs and 31% for WTs). Furthermore, 83% of students expressed that OSCE should be used more frequently in clinical years, confirming its perceived relevance for skill-based evaluation.
Our findings align with international studies highlighting OSCE’s advantages over traditional assessment formats. Pierre et al. [16] and Müller et al. [1] reported similar enthusiasm among students after their first OSCE experience, despite its stressful nature. Brand and Schoonheim-Klein [9] also noted that OSCE induces higher anxiety compared to MCQs, which our results corroborate.
The novelty of this study lies in its context: to our knowledge, this is the first documented implementation of OSCE in a French dental curriculum. While OSCE has been successfully integrated into medical education in France [17] and dental programs in Europe [9], no prior publication has addressed its feasibility and acceptability in French dental education. This originality reinforces the importance of our findings for faculties seeking to modernize assessment strategies.
The adoption of OSCE responds directly to the recommendations of the UNECD survey [15], which called for fairer evaluations and improved preparation for clinical and relational practice. Although some stations targeted communication in sensitive contexts, the scoring approach may have emphasized observable clinical actions over broader interpersonal competencies, which are also essential in daily practice. Beyond its certification role, OSCE demonstrates strong formative qualities, offering students opportunities for realistic self-assessment and skill development. The literature confirms that OSCE stimulates learning and enhances achievement of specific clinical competencies [10].
For educators, OSCE provides a structured and objective framework that minimizes bias and supports standardization through predefined checklists [9]. However, criterion-based scoring alone may not guarantee perceived standardization or score meaning. Even with checklists, variability in assessors’ interpretation and application of criteria, especially during a first implementation, may have contributed to the limited confidence reported by students (20% for standardization; 33% for essential clinical skills). In addition, its stressful nature must be addressed through preparatory sessions and supportive feedback to optimize student experience. The positive perception among assessors further validates OSCE’s integration into the CTSC and encourages its extension to other clinical assessments within the curriculum. Assessors may also have perceived the OSCE as intimidating due to the responsibility associated with real-time evaluation of high-stakes performance, the need to apply standardized criteria consistently across stations, and the novelty of the format during its first implementation.
This study has several limitations. The assessor sample was small (n = 8), and the experience was limited to a single faculty and a first implementation, which may affect generalizability. In addition, students completed the questionnaire after receiving their individual OSCE results, which may have influenced their perceptions of the assessment. Finally, while the questionnaire was validated and translated, back-translation was not performed, which could introduce minor linguistic bias.
Future research should include multiple faculties, larger assessor samples, and longitudinal studies to evaluate OSCE’s impact on clinical competence and student performance over time. It would also be relevant to explore strategies to reduce stress, such as mock OSCE sessions or structured feedback, and to assess whether OSCE improves long-term clinical outcomes, as suggested by Canadian studies linking OSCE scores to practice quality [18,19].
In summary, the OSCE was positively received by students and assessors and perceived as fairer and more educational than traditional formats, despite its stressful nature. However, these findings should be interpreted with caution, as only 33% believed OSCE scores measured essential clinical skills and only 20% trusted the assessment’s standardization. This suggests that, beyond its summative role, the OSCE format may be particularly valuable as a structured teaching and feedback approach, while further efforts should focus on strengthening standardization and assessor calibration.

5. Conclusions

The implementation of the Objective Structured Clinical Examination (OSCE) within the Clinical and Therapeutic Synthesis Certificate (CTSC) at Toulouse Faculty of Health, Department of Dentistry represents a significant step toward modernizing assessment methods in French dental education. This study shows that OSCE is perceived by both students and assessors as a fair, structured, and educational format that better evaluates essential clinical skills compared to traditional methods such as multiple-choice questions and written tests. Despite its stressful nature, OSCE was widely accepted and recognized for its formative value, offering opportunities for authentic clinical simulation and skill development. These findings validate the integration of OSCE into the CTSC and support its broader adoption across clinical years. Future research should focus on multi-center studies, strategies to reduce stress, and longitudinal evaluation of OSCE’s impact on clinical competence and professional practice. By addressing these aspects, OSCE can become a cornerstone of competency-based assessment in dental education in France.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/ime5010007/s1, File S1: Toulouse Dental School’s New Pedagogical Method for Assessing Clinical Skills: Students’ and Evaluators’ Perspectives.

Author Contributions

Conceptualization, A.B., V.B.-B. and M.M. (Mathieu Marty); methodology, A.P., B.G. and M.M. (Mathieu Marty); software, A.P., T.C. and J.D.; validation, S.C. and S.L. (Sylvie Lê); formal analysis, A.B., P.M. and M.M. (Mathieu Marty); investigation, P.M., M.M. (Matthieu Minty), F.D. (Florent Destruhaut), A.B., A.P., F.D. (Franck Diemer), C.T. and C.C.-A.; writing—original draft preparation, A.B., A.P., M.-C.V. and M.M. (Mathieu Marty); writing—review and editing, A.B., M.M. (Matthieu Minty), S.L. (Sara Laurencin) and M.-C.V.; supervision, V.B.-B., M.M. (Mathieu Marty) and B.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

This study was conducted in accordance with the principles outlined in the Declaration of Helsinki. This research was approved by the Ethical Committee of the Department of Dentistry under number RnIPH 2024-119.

Informed Consent Statement

Verbal informed consent was obtained after providing participants with comprehensive information about the study objectives, procedures, and confidentiality measures. This approach was deemed ethically appropriate given the nature of the research, the use of anonymized questionnaire data, and the absence of clinical intervention and was approved by the institutional review board.

Data Availability Statement

The data are available upon request from the corresponding author.

Acknowledgments

The authors acknowledge the CTSC consortium: Philippe Kemoun, Jean-Noel Vergnes, Emmanuelle Noirrit, Rémi Esclassan, Coralie Bataille, Géromine Fournier, Cathy Nabet, Antoine Galibourg, Luc Raynaldy, Alexia Vinel, and Isabelle Bailleul-Forestier.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
WT: Written Test
MCQs: Multiple Choice Questions
OSCE: Objective Structured Clinical Examination
CTCS: Clinical and Therapeutic Synthesis Certificate


Share and Cite

MDPI and ACS Style

Prosper, A.; Broutin, A.; Lê, S.; Cecchin-Albertoni, C.; Monsarrat, P.; Thomas, C.; Laurencin, S.; Cousty, S.; Gendron, B.; Destruhaut, F.; et al. From Written Tests to OSCE: A Study on the Perceptions of Assessment Reform by Students and Faculty in the French Dental Curriculum. Int. Med. Educ. 2026, 5, 7. https://doi.org/10.3390/ime5010007

