Article

A Transparent Curriculum Design and Capability-Based Assessment Portfolio Facilitates Self-Directed Learning

Faculty of Medicine, Health and Human Sciences, Macquarie University, Wallumattagal Campus, Macquarie Park, NSW 2109, Australia
* Author to whom correspondence should be addressed.
Educ. Sci. 2024, 14(1), 29; https://doi.org/10.3390/educsci14010029
Submission received: 22 November 2023 / Revised: 17 December 2023 / Accepted: 22 December 2023 / Published: 26 December 2023
(This article belongs to the Section Higher Education)

Abstract
This paper describes the design, implementation, and evaluation of a medical degree which uses a capability framework, programmatic assessment, and an assessment portfolio to facilitate feedback literacy and self-directed learning. The Macquarie MD outcomes are expressed through four graduate capabilities, each having two aspects and defined expectation standards for the two two-year stages. Assessments of capability development and Entrustable Professional Activities (EPAs) are embedded in each stage. Assessment data are stored in the student’s online portfolio with data summarised on dashboards presented chronologically and mapped to capability aspects and EPAs. A retrospective audit of curricula and graduate portfolios (n = 104) was undertaken and analysed descriptively. All students met the expectations for capability aspects and EPAs. All students voluntarily undertook additional assessments to strengthen their evidence for capability development and entrustment. On average, students completed 119% (SD = 6) of the required number of assessments: 107% (SD = 3) and 130% (SD = 11) for Stages 1 and 2, respectively. Using a capability framework to explicitly communicate the constructive alignment between capability and EPA outcomes at the degree, stage, and assessment level, as well as student access to future-focused performance standards and all their assessment data, is a powerful way to facilitate self-directed learning.

1. Introduction

The design of professional university degrees can be complex. Best practice is an outcomes-based degree with the constructive alignment of assessments to assure degree-level outcomes [1]. This can be achieved by “flipping” the curriculum design process and commencing with degree-level outcomes, with subsequent backward mapping [2]. This meticulous mapping of the degree, unit (subject), and assessment components to each other and to other regulatory, university, and professional requirements is one way that educators can address this complexity. However, because most assessment and university processes focus on the components of a degree—individual units of study—this mapping is often invisible to students. This paper describes the development, implementation, and evaluation of a medical degree (the Macquarie MD) that uses a capability framework to facilitate continuous, explicit communication of the design principles upon which this degree was developed. A bespoke online assessment portfolio is used to house all assessment data to facilitate the development of feedback literacy and evaluative judgement with the aim of driving self-directed and lifelong learning.
Capability as a framework for enhancing quality in higher education was originally defined by Stephenson [3] (p. 2) as “an integration of knowledge, skills, personal qualities and understandings used appropriately and effectively—not just in familiar and highly focused specialist contexts but in response to new and changing circumstances”. Stephenson [3] also highlighted that capability can be observed in people with confidence in their ability to (1) take effective and appropriate action; (2) explain what they are about; (3) live and work effectively with others; and (4) continue to learn from experience as an individual and in association with others, in a diverse and changing society. Importantly, capability is not just about knowledge and skills; rather it involves taking effective action in unfamiliar or challenging circumstances and requires an ethical approach, judgement, risk-taking, and a commitment to ongoing learning. In professional degrees, capability is often assessed in the work-integrated learning component using authentic work-based tasks with deliberate reflection and identification of areas for development.
Competency-based frameworks are a common outcomes-based design for many professional health degrees that meet the regulatory requirements of the Australian Qualifications Framework [4], the Higher Education Standards Framework [5], and relevant professional standards. While competency approaches have brought about some benefits in assuring standards are met, they have limitations. The main limitation is that competence is focused on the ability to perform effectively in the present [3], whereas capability embraces competence while also being forward-looking and concerned with the future realisation of potential. Capability extends to self-development, with individuals needing to develop their evaluative judgement to formulate their own learning needs in the context of the work and life that they seek. The use of a capability approach in medical education is not new. In 2006, McNeil and colleagues [6] embedded graduate capabilities, including generic outcomes (e.g., critical evaluation, reflection, communication, and teamwork), in an undergraduate medical degree (see also [7]). Similarly, in 2015, Sandars and Hart [8] advocated for the implementation of a capability approach to medical education as a means of improving social justice as well as enhancing the ability of an individual to drive their learning and fulfil their own aspirations to lead a valued life. Importantly, they advocated avoiding a separate capability unit and instead ensuring that the capability approach is integrated across all areas of the curriculum [8].
In addition to the need to develop capabilities, medical students must master professional activities like taking blood, gathering information, and developing a diagnosis. As such, many post-qualification medical education training programs now include a set of Entrustable Professional Activities, or EPAs [9]. An EPA is a discrete professional activity which integrates relevant capabilities. Assessment of an EPA is through a holistic judgement of required supervision support [10], which, in harmony with the capability framework, is future-focused.
Programmatic assessment as an instructional design tool and a means of optimising the learning and decision function of assessment has been advocated for some time [11]. It incorporates a shift from assessment of learning to assessment for learning [12]. In a programmatic model, assessment information and feedback—from multiple data points in a variety of assessment formats—are aggregated and used for learning and high-stakes decisions such as promotion to the next year or certification [12,13]. Programmatic assessment fits well with a capability framework for education as it is a learner-centred, constructivist view of education focused on longitudinal development and with an emphasis on lifelong learning and self-directed learning [14].
There is some evidence that the use of online portfolios is effective in promoting self-directed learning [15]. Knowles [16], a pioneer in the field, originally defined self-directed learning as “a process in which individuals take the initiative, with or without the help of others, in diagnosing their learning needs, formulating learning goals, identifying human and material resources for learning, choosing and implementing appropriate learning strategies, and evaluating learning outcomes” (p. 15). One of the challenges in the research evaluating the effectiveness of online portfolios in driving self-directed learning is that there are many different types of portfolios [15]. Carless and Boud [17] see self-direction as being underpinned by feedback literacy, which denotes the understandings, capacities, and dispositions needed to make sense of information and use it to enhance work or learning strategies. Feedback literacy involves interrelated components including appreciating feedback, making judgements, managing affect, and taking action [17]. Information provided in an online portfolio, which houses all assessment data, has the potential to drive self-directed learning and feedback literacy, as it allows learners to reflect on their progress, diagnose learning needs, and create learning plans [15,18].

1.1. The Macquarie MD Overview: Structure and Capability Framework

The establishment of the Macquarie MD was a deliberate strategy in the university’s expansion in medicine and health. The university established an academic health sciences centre known as MQ Health in 2015, building on the foundation of Macquarie University Hospital (MUH), which opened in 2010, and academic initiatives beginning in the early 2000s. MQ Health is the first university-led integrated health campus in Australia. It brings together clinical care, teaching, and research to provide an integrated approach to holistic patient treatment, discovery, and learning, encapsulated in three words: heal, learn, discover.
Given that higher education needs to promote lifelong learning and be future-focused, a capability framework and programmatic assessment were adopted when designing the Macquarie MD, which is an Australian Qualification Framework Level 9 Masters (Extended) degree [4]. The four-year degree is organised into two stages: Stage 1 encompasses years 1 and 2, and Stage 2 encompasses years 3 and 4. The focus of each year is outlined in Figure 1, and further detail is available in Supplementary Data Table S1. The outcomes for the Macquarie MD are articulated using a capability framework with four capabilities: 1. Scientist and Scholar, 2. Clinical Practitioner, 3. Engaged Global Citizen, and 4. Professional. Each capability has two aspects. In addition, there are three expectations that align with higher education studies generally, and all written submissions are assessed against them (Figure 1). Each of the capability aspects has clearly defined expectations statements for the end of Stage 1 and Stage 2 designed to ensure graduates are well prepared to work effectively as interns in Australia and New Zealand.

1.1.1. Entrustable Professional Activities

To further strengthen the course outcomes, a set of EPAs was developed for each stage (five for Stage 1 and nine for Stage 2) and incorporated into the assessment framework (Table 1). The EPAs for Stage 1 are designed to ensure students are fully prepared to engage in, and learn from, the clinical placements which form the primary mode of teaching in Stage 2. The Stage 2 (graduate) EPAs address the practice of internship in an Australian hospital, ensuring graduating students are ready to contribute constructively and confidently as hospital interns. The rating of entrustment for an EPA is quantified by the level of supervision required for safety, as outlined in Table 1. The ‘React’ rating is the expected EPA standard for students because interns are supervised in practice. The capability expectation statements, and the associated EPAs, guided the design of learning and teaching activities, experiences, and assessments in all years, ensuring the constructive alignment that is necessary for successful professional education.

1.1.2. Assessment Framework and Portfolio

The Macquarie MD also implements programmatic assessment and an online assessment portfolio to develop students’ feedback literacy, evaluative judgement, reflection, and ability to direct their own learning. The assessment framework utilises a variety of assessment types tailored to the capability aspects and EPA being assessed. Some assessments are summative and contribute to unit grades; however, many are low-stakes formative or self-assessments, designed to allow students to complete additional assessment tasks to support their learning if desired or required. These additional opportunities allow for eventual achievement even if there are individual differences in the pace of learning. Detailed assessment guides and rubrics define the required standard to meet end-of-stage expectations for each capability aspect and/or the react level of supervision for the EPAs, with grading typically on a four-point letter scale anchored to these end-of-stage expectations. In Stage 1, an F grade recognises a performance well below end-of-stage expectations, and P− a performance below end-of-stage expectations, while a P or P+ recognises performances that meet or exceed end-of-stage expectations. In Stage 2, the P grade is replaced by an ‘I’ (for “commencing Intern”) so that an I− recognises a performance below expectation for a commencing intern, and an I or I+ recognises a performance that meets or exceeds the expectation of a commencing intern. The use of the ‘I’ grading communicates the future-focused performance expectations to both students and assessors. Numerical grading is limited to multiple-choice examinations.
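As a concrete illustration of these scales, the following minimal sketch (illustrative names and structure only, not the MD's actual software; ASCII hyphens stand in for the minus signs used above) shows how the grade anchors differ between stages:

```python
# Illustrative sketch of the four-point letter scales described above.
# Stage 1 grades are anchored to end-of-stage expectations; Stage 2
# replaces 'P' with 'I' (commencing Intern) to signal the
# future-focused performance standard.

STAGE_1_SCALE = {
    "F":  "well below end-of-stage expectations",
    "P-": "below end-of-stage expectations",
    "P":  "meets end-of-stage expectations",
    "P+": "exceeds end-of-stage expectations",
}

STAGE_2_SCALE = {
    "F":  "well below the expectation of a commencing intern",
    "I-": "below the expectation of a commencing intern",
    "I":  "meets the expectation of a commencing intern",
    "I+": "exceeds the expectation of a commencing intern",
}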
Each assessment task requires a student to be assigned an overall grade as well as grades for capability aspects and relevant EPAs. For example, a common work-based assessment such as a mini-clinical evaluation exercise would provide an overall grade as well as a grade for capability aspects: capability aspect 2.1, an effective digital and personal communicator; capability aspect 2.2, a patient-centred and safe clinician; and capability aspect 4.2, an ethical and reflective practitioner. The same assessment would also yield a level of supervision rating for graduate EPA 1, gather a history and perform a physical examination, and graduate EPA 2, synthesise available information to prioritise a differential diagnosis and develop a management plan that includes appropriate medication and/or other therapies.
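A hypothetical record for such a mini-clinical evaluation exercise makes this one-task-to-many-outcomes mapping explicit; the field names below are our illustration, not the portfolio's actual schema:

```python
# Hypothetical record for a single mini-CEX, showing how one assessment
# task yields an overall grade, capability-aspect grades, and EPA
# supervision ratings. Field names and values are illustrative only.
mini_cex_record = {
    "task": "Mini-CEX",
    "overall_grade": "I",
    "capability_aspect_grades": {
        "2.1 effective digital and personal communicator": "I+",
        "2.2 patient-centred and safe clinician": "I",
        "4.2 ethical and reflective practitioner": "I",
    },
    "epa_supervision_ratings": {
        "G EPA 1 gather a history and perform a physical examination": "React",
        "G EPA 2 synthesise information into a diagnosis and plan": "Available",
    },
}
```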
After several iterations of our portfolio using a variety of tools, the current portfolio was built using the Microsoft Office 365 suite. All assessment data are captured online using customised forms and stored in the student’s personal portfolio, which is accessible to students and mentors 24/7. In the student’s portfolio, data are summarised on dashboards built in Power BI. The two main dashboards reinforce the assessment framework: an overall capability dashboard (Figure 2) and an EPA dashboard. The assessment tasks are mapped to capability aspects and EPAs, and the relevant data are presented chronologically regardless of the unit or subject. A traffic light system helps students and mentors obtain an overview of capability and entrustment development across the stage. Self-assessments and supplementary sources of evidence submitted by a student are coloured grey in the portfolio to facilitate review. The overall performance grade for a task is used in automated unit-grade calculations if it is specified as a summative assessment in a specific unit. In addition to the main dashboards, the portfolio allows students to click on assessment tasks and drill down to see individual capability aspect dashboards and associated written feedback.
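The traffic light logic can be sketched as follows; the function name is an assumption, and the colours follow the Figure 2 caption:

```python
# Sketch of the dashboard's traffic-light colour coding for Stage 1
# grades, per the Figure 2 caption. Self-assessments and supplementary
# evidence are shown in grey. Function name is illustrative.
def traffic_light(grade: str, self_assessed: bool = False) -> str:
    if self_assessed:
        return "grey"
    return {
        "P+": "bright green",  # exceeds end-of-Stage-1 expectations
        "P":  "green",         # meets expectations
        "P-": "orange",        # below expectations
        "F":  "red",           # well below expectations
    }.get(grade, "uncoloured")
```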
Throughout each stage of the Macquarie MD, students are presented with multiple opportunities to achieve the expectations of each capability aspect through a variety of unit assessments and stage assessments as well as the opportunity to undertake additional assessments to support development and strengthen evidence for capability and entrustment development. Stage assessments such as integrated written examinations, objective structured clinical examinations (OSCEs), and portfolio examinations are high-stakes assessments where decisions regarding progression are made. Progression rules are clearly articulated, and remediation and an opportunity for reassessment are provided. The portfolio examinations consider all the data across the stage and require a final grade for each of the eight capability aspects and the specific stage EPAs. In Stage 1, the portfolio examination is a desktop review of each student’s portfolio; in Stage 2, however, it is combined with a panel interview to provide more robust assessment prior to graduation. Interview questions are drawn from a prepared question bank with reference to the data from the Australian Medical Council on intern readiness [19].

1.2. Evaluation of Macquarie MD

Given the explicit and repeated communication of the capability framework, which underpins the Macquarie MD design, as well as the use of an assessment portfolio, we hypothesised that students would use the available tools to prepare themselves for self-directed and lifelong learning: clear expectations, the ability to monitor and reflect on progress, and the ability to direct their learning, including the completion of additional assessment tasks and submission of supplementary evidence for feedback. For this research, submissions of additional assessments and evidence are considered an indicator of self-directed learning. Specifically, the research questions were as follows:
  • What is the final performance for the eight capability aspects in Stages 1 and 2, as well as the supervision level for the EPAs?
  • What are the types and number of assessment tasks?
  • What is the extent of self-directed learning overall and in Stages 1 and 2?

2. Materials and Methods

2.1. Design

A retrospective audit of curriculum materials and individual assessment portfolio data for the first two cohorts (2018–2021 and 2019–2022) of the Macquarie MD was undertaken. Students were eligible to participate if they completed the degree in 2021 or 2022 and provided written consent for their deidentified data to be used. Demographic characteristics of age at entry, gender, country of birth, and language spoken at home were collected for eligible participants, as was information on prior study characteristics (student type, postgraduate study, time since undergraduate study).

2.2. Outcome Measures

2.2.1. Final Performance

Two measures of final performance were extracted from the portfolio. The first measure was the number and percentage of students awarded an F, P−/I−, P/I, or P+/I+ grade for the 8 capability aspects at the end of Stage 1 and Stage 2 in the portfolio examination. The second measure was the number and percentage of students awarded an explain, direct, react, or available supervision rating for the five Stage 1 and nine Stage 2 (graduate) EPAs in the portfolio examination.

2.2.2. Types of Assessment Tasks

Types of assessment tasks were classified using two systems: grade category and assessment format. There were two grade categories: formative and summative. Formative assessments were tasks which did not contribute to the unit mark (i.e., 0% weighting), whereas summative assessments were tasks which contributed to the unit mark. We acknowledge that this distinction is somewhat artificial because, although many (formative) assessments did not contribute to the unit mark, they were considered in the examination of the assessment portfolio. There were five assessment formats: examinations, participation, oral presentations, work-based assessments, and written assignments. Examinations included both written examinations and OSCEs. Participation involved the completion of a mandatory wellbeing course. Oral presentations included clinical vivas, interviews, research presentations, and seminars. Work-based assessments were tasks typically associated with the clinical setting and included bedside tutor and mentor reports, direct observation of procedural skills (DOPS), in-training assessments (ITA), mini-clinical evaluation exercises (Mini-CEX), and teamwork mini-clinical evaluation exercises (T-MEX) [20]. Written assignments included case reports, research manuscripts, and ethics and reflective reports.
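In code form, the two-way classification amounts to the following sketch; the function and names are ours, for illustration only:

```python
# Sketch of the two-way classification of assessment tasks:
# grade category (formative vs. summative, by unit-mark weighting)
# crossed with assessment format. Names are illustrative.
FORMATS = {"examination", "participation", "oral presentation",
           "work-based assessment", "written assignment"}

def classify(unit_mark_weighting: float, fmt: str) -> tuple[str, str]:
    assert fmt in FORMATS
    category = "formative" if unit_mark_weighting == 0 else "summative"
    return category, fmt

# e.g., a Mini-CEX with 0% unit weighting:
print(classify(0.0, "work-based assessment"))  # ('formative', 'work-based assessment')
```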

2.2.3. Number of Assessment Tasks

The minimum assessment task requirements, overall and by stage, were extracted from curriculum materials across the degree. The number of assessment tasks completed by each student was quantified overall (across the entire degree) and by stage. The number of individual assessment grades for the eight capability aspects and fourteen EPAs was extracted for individual students from the portfolio, overall and by stage.

2.2.4. Extent of Self-Directed Learning

The extent of self-directed learning was quantified by comparing the number of assessment tasks completed by students with the minimum requirement and expressed as a percentage of the minimum requirement. The percentage above 100 was considered a measure of the extent of self-directed learning and was evaluated overall and by stage.
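In effect, the measure reduces to the following computation (a sketch with an illustrative function name; the worked example uses the mean values reported in the Results):

```python
# Extent of self-directed learning: the percentage of the minimum
# requirement completed, with the excess over 100% taken as the
# self-directed component.
def sdl_extent(completed: int, minimum_required: int) -> float:
    return 100 * completed / minimum_required - 100

# Worked example with the reported overall means: 187 completed tasks
# against a minimum of 157 is about 19% above the requirement.
print(round(sdl_extent(187, 157)))  # 19
```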

2.3. Data Analysis

For each student, assessment data were extracted from the portfolio, merged with demographic characteristics, and deidentified by a data analyst. The deidentified dataset was provided in Excel format, which was also used for analyses. Descriptive statistics were used to describe the characteristics of the sample; measures of the type and amount of assessment data; and final performance overall and for Stages 1 and 2.
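The analyses were run in Excel; an equivalent descriptive analysis in Python/pandas might look like the following sketch, where the file and column names are assumptions about the deidentified dataset:

```python
# Equivalent descriptive analysis in Python/pandas (the study itself
# used Excel). File and column names are hypothetical; the dataset's
# schema is not published.
import pandas as pd

df = pd.read_excel("deidentified_portfolio_data.xlsx")  # hypothetical file

# Sample characteristics and assessment volume per student.
print(df["age_at_entry"].agg(["mean", "std", "min", "max"]))
print(df[["tasks_completed_overall",
          "tasks_completed_stage1",
          "tasks_completed_stage2"]].agg(["mean", "std", "min", "max"]))
```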

3. Results

3.1. Flow of Participants through Study

Overall, 104 students graduated from the Macquarie MD across 2021 (n = 46) and 2022 (n = 58). All students consented to be included in this study, and their demographic characteristics are in Table 2. The mean age was 22 (SD = 2.2) years, the majority were female (59%), and although 38% were born overseas, 90% were domestic students. One student entered the degree late in year 2; therefore, insufficient data were available to include this student in the overall and Stage 1 capability aspect counts and Stage 1 EPA counts.

3.2. Final Performance

The portfolio examination demonstrated positive outcomes for capability aspects (Figure 3), particularly growth, with a greater percentage of students exceeding the expectations at the end of Stage 2 (M = 35, SD = 10) compared to Stage 1 (M = 18, SD = 7) despite the higher expectations for Stage 2. Similarly, in the EPA outcomes (Table 3), a greater percentage of students exceeded the react level of supervision and achieved the available level of supervision for the nine graduate EPAs. Specifically, the available rating was achieved on average by 64% (SD = 12) of participants for Stage 2 EPAs compared with an average of 16% (SD = 10) for Stage 1. In order to graduate, students could have at most one I− grade across the eight capability aspects and at most one ‘direct’ rating across the nine graduate EPAs. Sixteen students (15%) received an I− for a capability aspect, and seven (7%) a direct supervision rating for one of the nine graduate EPAs. Insights into longitudinal growth in performance for the eight capability aspects can be gleaned from Figure 3. While most students’ progress was at expected levels (i.e., matched the increased expectations for Stage 2), some students progressed at increased rates, while a small proportion had a decrement in rate of expected progress.
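For illustration, the two graduation thresholds described above can be expressed as a simple check; this is a sketch with illustrative names, and the full progression rules include further requirements not modelled here:

```python
# Sketch of the two graduation thresholds stated above: at most one I-
# across the eight capability aspects and at most one 'Direct' rating
# across the nine graduate EPAs. Names and encoding are illustrative.
def meets_graduation_thresholds(aspect_grades: list[str],
                                epa_ratings: list[str]) -> bool:
    assert len(aspect_grades) == 8 and len(epa_ratings) == 9
    return (aspect_grades.count("I-") <= 1
            and epa_ratings.count("Direct") <= 1)
```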

3.3. Number and Types of Assessment Data

Overall, the minimum required number of assessment tasks was 157, with 73 (46%) tasks required in Stage 1 and 84 (54%) in Stage 2 (Table 4). While the majority (55%) of assessment tasks were classified as summative, over 40% of assessment tasks overall and for each stage were at least initially formative. Not surprisingly for a medical degree with considerable work-integrated learning in the form of clinical simulation and immersion, work-based assessments were the most frequent type of assessment task, accounting for over 54% of the total minimum requirement.
Across the Macquarie MD, the number of data points per student for each capability aspect ranged from a mean of 40 (SD = 5) for capability aspect 3.2, a public health and systems aware practitioner, to 162 (SD = 9) for capability aspect 2.1, an effective personal and digital communicator. Both capability aspects 2.2, a patient-centred and safe clinician, and 4.2, an ethical and reflective practitioner, also had over 100 data points or observations for all students. Although less frequent compared with capability aspects, there were numerous supervision ratings for the EPAs, which ranged on average from 7 (SD = 2) for Stage 1 EPA 4, provide the healthcare team with resources to improve an individual patient’s care or collective patient care, to 62 (SD = 6) for graduate EPA 2, synthesise available information to prioritise a differential diagnosis and develop a management plan that includes appropriate medication and/or other therapies. More detailed analyses of the type and amount of assessment data, as well as the counts of grades for each capability aspect and supervision ratings for each EPA, are available in Supplementary Data Tables S2–S4.

3.4. Extent of Self-Directed Learning

There was clear evidence of self-directed learning, as all students voluntarily completed additional assessment tasks in both Stage 1 and Stage 2. Overall, students completed 19% (SD = 6, range 6–32) more than the minimum assessment requirements, and not surprisingly, given the expectation of greater independence, more self-directed learning occurred in Stage 2 with a mean of 30% (SD = 11) than in Stage 1 with a mean of 7% (SD = 3) (Table 4). Most additional assessment was in the form of work-based assessment tasks; however, there was also evidence of completion of a wide variety of tasks. For example, there was submission of completion certificates of online modules to strengthen evidence of development for 1.1, an applied medical scientist, and narrated PowerPoint presentations for 2.1, an effective digital and personal communicator.

4. Discussion

4.1. Implications for the Future

We contend that our experience in using a capability framework to communicate the outcome design explicitly and repeatedly, together with the implementation of a programmatic assessment system with clear capability and EPA expectations and an assessment portfolio, is an effective way to assure degree-level outcomes and overcome the challenges associated with the dominance of components (units/subjects) in the delivery of higher education. Moreover, we contend that such an approach can facilitate the development of feedback literacy and evaluative judgement and promote agency and lifelong learning. The extent of self-directed learning undertaken and the final graduation outcomes demonstrate that our students were successful in driving their own learning, with all students taking the opportunity to submit additional assessments to support their learning and strengthen evidence for capability and entrustment development. Our implementation of programmatic assessment and the portfolio aligned well with the principles published in the Ottawa 2020 consensus statement for programmatic assessment [14]. There were multiple data points available for high-stakes decisions regarding progression to Stage 2 and graduation.
Assuring degree-level outcomes when higher education focusses on the delivery of components remains a challenge. While the use of the capability-based assessment portfolio appeared effective in reinforcing the degree design and prioritising the graduate outcomes over component (unit/subject) outcomes, there are likely other strategies which we implemented that also contributed to effective assurance of degree-level outcomes. These include the fully prescribed structure of the degree, which is typical of professional entry degrees, as well as curriculum committees which take ownership of planning and delivery to ensure coherence. Nonetheless, the advantage of using the capability framework to underpin degree design, a capability-based assessment portfolio, and reinforcement in all curriculum materials is that the connection between the degree and its components is transparent, as it is explicitly and repeatedly communicated to students.
A capability approach and a programmatic online assessment portfolio yield significant benefits as they are future-focused and are a powerful way to put students in the driver’s seat to take their career and life wherever they choose. These benefits may be mediated through the student’s development of feedback literacy [17], which includes evaluative judgement, defined as the capability to make decisions about the quality of the work of oneself and others [21], which is critical for learners to develop autonomy and take responsibility for self-directed and lifelong learning. Using the capability framework and mapping feedback to the framework in the assessment portfolio, as well as including regular self- and mentor assessments and reflections, appears to have been an effective way to develop feedback literacy and drive self-directed learning. Previous research suggests further benefits in self-directed learning could be realised through the explicit development of mentors [18]. A limitation of this research is that it was a retrospective evaluation of two cohorts within a single degree with descriptive analyses. Future research investigating the relationship between graduate outcomes, self-directed learning, and feedback literacy is warranted. Similarly, a qualitative study exploring the motivations behind students engaging in self-directed learning beyond the minimum required assessment tasks would be useful. Use of validated measures such as the Self-Directed Learning Instrument [22] and the new feedback literacy behaviour scale [23], which measures overall literacy as well as its critical components, together with longitudinal measurement of feedback literacy, would enhance such research.

5. Conclusions

In summary, we demonstrated that explicit and repeated communication of degree design principles, using a capability framework to define graduate outcomes and implementing programmatic assessment with fit-for-purpose assessments which reinforce and assure the required capability development, results in all students driving their own learning. The combined capability and programmatic assessment approach used in the Macquarie MD aligns with the recent World Economic Forum white paper “Defining Education 4.0: A Taxonomy for the Future of Learning” [24]. Based on our positive findings, we are developing an institutional capability framework which can be used to describe graduate outcomes and explicitly communicate the design principles, with all curriculum materials reinforcing the capability framework so that the constructive alignment between degree and unit/subject outcomes and assessments is obvious. This capability framework has four broad capabilities: Scholar—the development of course-specific capabilities; Practitioner—the application of these capabilities; Citizen—the contextual adaptation of capabilities for diverse settings and positive impact; and Professional—the alignment of capabilities to personal and organisational purpose, values, and growth. Adoption of the capability and programmatic assessment approach may yield benefits for the improvement of educational outcomes across many disciplines.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/educsci14010029/s1, Table S1. Brief description of content of each year of the Macquarie MD; Table S2. Overall, Stage 1, and Stage 2 minimum number of assessment tasks in each assessment grade category (formative, summative) and format (examination, oral presentation, participation, work-based, written); Table S3. Overall, Stage 1, and Stage 2 amount (counts per participant) of assessment data for each of the eight capability aspects; Table S4. Amount of assessment data (counts per participant) for each of the Entrustable Professional Activities across Stage 1 (n = 103) and Stage 2 (n = 104).

Author Contributions

Conceptualization, C.M.D., H.H., H.P.M. and C.H.; methodology, C.M.D., H.H., H.P.M. and C.H.; investigation, C.M.D.; resources, C.M.D., H.H., H.P.M. and C.H.; formal analysis, C.M.D.; data curation, C.M.D.; writing—original draft preparation, C.M.D. and H.H.; writing—review and editing, C.M.D., H.H., H.P.M. and C.H.; funding acquisition, C.M.D. and H.H. All authors have read and agreed to the published version of the manuscript.

Funding

The initial development of the assessment portfolio was funded by a Macquarie University Strategic Learning and Teaching grant.

Institutional Review Board Statement

This study was conducted in accordance with the Declaration of Helsinki and approved by the Macquarie University Human Research Ethics Committee (reference number 520211058432493 approved 23 September 2021).

Informed Consent Statement

Informed consent was obtained from all participants involved in this study.

Data Availability Statement

The participants of this study did not give written consent for their data to be shared publicly. There are additional cohort data in Supplementary Data Tables; however, individual supporting data are not available.

Acknowledgments

The researchers would like to acknowledge education designers Matthew Robson and Sherrie Love and data analyst Michael Herbert who assisted in the creation of the portfolio. We would also like to acknowledge Domenico Garzo who assisted with analyses of the first cohort.

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Biggs, J. What the Student Does: Teaching for enhanced learning. High. Educ. Res. Dev. 1999, 18, 57–75. [Google Scholar] [CrossRef]
  2. Scott, G. Transforming Graduate Capabilities & Achievement Standards for a Sustainable Future. Key Insights from a 2014–16 Office for Learning & Teaching National Senior Teaching Fellowship; Australian Government Office of Learning and Teaching: Canberra, Australia, 2016. Available online: https://www.education.gov.au/system/files/documents/submission-file/2023-02/AUA_priorities_Emeritus%20Professor%20Geoff%20Scott_1.pdf (accessed on 24 August 2023).
  3. Stephenson, J. The concept of capability and its importance in higher education. In Capability and Quality in Higher Education; Stephenson, J., Yorke, M., Eds.; Kogan Page Limited: London, UK, 1998; pp. 1–14. [Google Scholar] [CrossRef]
  4. Australian Qualifications Framework Council. Australian Qualifications Framework, 2nd ed.; Australian Qualifications Framework Council: Adelaide, Australia, 2013. [Google Scholar]
  5. Australian Government. Higher Education Standards Framework (Threshold Standards) 2021, made under the Tertiary Education Quality and Standards Agency Act 2011. Available online: https://www.legislation.gov.au/Details/F2022C00105 (accessed on 24 August 2023).
  6. McNeil, H.P.; Hughes, C.S.; Toohey, S.M.; Dowton, S.B. An innovative outcomes-based medical education program built on adult learning principles. Med. Teach. 2006, 28, 527–534. [Google Scholar] [CrossRef] [PubMed]
  7. Toohey, S.; Kumar, R. A New Program of Assessment for a New Medical Program. Focus Health Prof. Educ. A Multi-Discip. J. 2003, 5, 23–33. [Google Scholar]
  8. Sandars, J.; Sarojini Hart, C. The capability approach for medical education: AMEE Guide No. 97. Med. Teach. 2015, 37, 510–520. [Google Scholar] [CrossRef] [PubMed]
  9. ten Cate, O. Entrustability of professional activities and competency-based training. Med. Educ. 2005, 39, 1176–1177. [Google Scholar] [CrossRef] [PubMed]
  10. ten Cate, O. A primer on entrustable professional activities. Korean J. Med. Educ. 2018, 30, 1–10. [Google Scholar] [CrossRef] [PubMed]
  11. van der Vleuten, C.P.; Schuwirth, L.W. Assessing professional competence: From methods to programmes. Med. Educ. 2005, 39, 309–317. [Google Scholar] [CrossRef] [PubMed]
  12. Schuwirth, L.W.T.; Van der Vleuten, C.P.M. Programmatic assessment: From assessment of learning to assessment for learning. Med. Teach. 2011, 33, 478–485. [Google Scholar] [CrossRef] [PubMed]
  13. Van Der Vleuten, C.P.M.; Schuwirth, L.W.T.; Driessen, E.W.; Govaerts, M.J.B.; Heeneman, S. Twelve Tips for programmatic assessment. Med. Teach. 2015, 37, 641–646. [Google Scholar] [CrossRef] [PubMed]
  14. Heeneman, S.; de Jong, L.H.; Dawson, L.J.; Wilkinson, T.J.; Ryan, A.; Tait, G.R.; Rice, N.; Torre, D.; Freeman, A.; van der Vleuten, C.P.M. Ottawa 2020 consensus statement for programmatic assessment—1. Agreement on the principles. Med. Teach. 2021, 43, 1139–1148. [Google Scholar] [CrossRef] [PubMed]
  15. Beckers, J.; Dolmans, D.H.J.M.; van Merriënboer, J.J.G. e-Portfolios enhancing students’ self-directed learning: A systematic review of influencing factors. Australas. J. Educ. Technol. 2016, 32, 32–46. [Google Scholar] [CrossRef]
  16. Knowles, M.S. Self-Directed Learning: A Guide for Learners and Teachers; Cambridge Adult Education: Englewood Cliffs, NJ, USA, 1975. [Google Scholar]
  17. Carless, D.; Boud, D. The development of student feedback literacy: Enabling uptake of feedback. Assess. Eval. High. Educ. 2018, 43, 1315–1325. [Google Scholar] [CrossRef]
  18. van Schaik, S.; Plant, J.; O’Sullivan, P. Promoting self-directed learning through portfolios in undergraduate medical education: The mentors’ perspective. Med. Teach. 2013, 35, 139–144. [Google Scholar] [CrossRef] [PubMed]
  19. Australian Medical Council 2021 Evaluation Report. National Preparedness for Internship Survey 2017–2019. Available online: https://www.amc.org.au/wp-content/uploads/2021/12/Intern-Survey-evaluation-report-FINAL-for-website.pdf (accessed on 24 August 2023).
  20. Olupeliyawa, A.M.; O’Sullivan, A.J.; Hughes, C.; Balasooriya, C.D. The Teamwork Mini-Clinical Evaluation Exercise (T-MEX): A workplace-based assessment focusing on collaborative competencies in health care. Acad. Med. 2014, 89, 359–365. [Google Scholar] [CrossRef] [PubMed]
  21. Tai, J.; Ajjawi, R.; Boud, D.; Dawson, P.; Panadero, E. Developing evaluative judgement: Enabling students to make decisions about the quality of work. High. Educ. 2018, 76, 467–481. [Google Scholar] [CrossRef]
  22. Shen, W.Q.; Chen, H.L.; Hu, Y. The validity and reliability of the self-directed learning instrument (SDLI) in mainland Chinese nursing students. BMC Med. Educ. 2014, 14, 108. [Google Scholar] [CrossRef] [PubMed]
  23. Dawson, P.; Yan, Z.; Lipnevich, A.; Tai, J.; Boud, D.; Mahoney, P. Measuring what learners do in feedback: The feedback literacy behaviour scale. Assess. Eval. High. Educ. 2023, 1–15. [Google Scholar] [CrossRef]
  24. World Economic Forum. Defining Education 4.0: A Taxonomy for the Future of Learning White Paper. 2023. Available online: https://www3.weforum.org/docs/WEF_Defining_Education_4.0_2023.pdf (accessed on 24 August 2023).
Figure 1. Macquarie MD overview: capability framework and course structure. The Macquarie MD has 4 graduate capabilities, each with two aspects, and 3 aligned generic higher education expectations: G1.2 a scholar: search and citation standard; G2.1 a communicator: academic writing standard; and G4.2 a professional: meets expectations and obligations. The 4-year structure is divided into 2 two-year stages, each year with a specific focus.
Figure 2. Example of Stage 1 capability dashboard from the personalised assessment portfolio for a fictitious student (Sue Perb). Each row of data corresponds to an assessment task with grades mapped to the relevant capability aspects and generic higher education expectations. Data are presented chronologically regardless of unit. Data are colour coded using a traffic light system, with bright green indicative of exceeding expectations (P+), green indicative of meeting expectations (P), orange below expectations (P−), and red well below expectations (F) for end of Stage 1. Grey indicates supplementary evidence which has been mapped to capability aspects and assessed by the student, or a required self-assessment task. Note that the quantity of data in this example is less than required for an MD student.
Figure 3. Matrix of the number of students awarded the respective final grades for the 8 capability aspects in Stage 1 and Stage 2. Within the central matrix, light-green shading is indicative of increased rate of progress, pink indicative of a decrement in progress, and white indicative of progress at expected levels in Stage 2 compared with Stage 1. Overall stage totals for each grade category are traffic-lighted, with red indicating well below expectations, orange below expectations, green meets expectations, and darker green exceeds expectations.
Table 1. Macquarie MD Stage 1 and Stage 2 (graduate) Entrustable Professional Activities (EPAs) and Entrustment Rating Scale.
Stage 1 Entrustable Professional Activities
S1 EPA 1: Gather information from a medically stable patient with a common clinical presentation.
S1 EPA 2: Integrate information gathered from a patient to construct a reasoned and prioritised differential diagnosis as well as a preliminary plan for common clinical presentations and diagnoses.
S1 EPA 3: Communicate information relevant to the patient’s care with other members of the healthcare team.
S1 EPA 4: Provide the healthcare team with resources to improve an individual patient’s care or collective patient care.
S1 EPA 5: Perform required procedures.
Stage 2 Entrustable Professional Activities
G EPA 1: Gather a history and perform a physical examination.
G EPA 2: Synthesise available information to prioritise a differential diagnosis and develop a management plan that includes appropriate medication and/or other therapies.
G EPA 3: Form clinical questions and use the medical literature and research methodologies to retrieve information and resources to advance patient care.
G EPA 4: Recognise a patient requiring urgent or emergent care and initiate evaluation and management.
G EPA 5: Obtain informed consent for clinical encounters, tests, and procedures, and perform common procedures for an intern.
G EPA 6: Recommend and interpret common diagnostic and screening tests.
G EPA 7: Report a clinical encounter orally, and document patient assessment and management (e.g., findings, orders, prescriptions, and adverse incidents).
G EPA 8: Collaborate as an intern in an interprofessional team by giving or receiving handovers, making referrals, and requesting expert consultations.
G EPA 9: Share information about the patient’s care, including diagnosis and management plan, with a patient.
Entrustment Rating Scale—level of supervision required for safety
Explain (E): The supervisor will need to explain what is involved in the EPA.
Direct (D): The supervisor will need to direct the performance of the EPA closely.
React (R): The supervisor will need to be close by and ready to react during the performance of the EPA if needed.
Available (A): The supervisor needs only to be available to assist if required.
Table 2. Participants’ demographic and study characteristics.
Participants: Overall (n = 104)
Age at entry (yr), mean (SD), range: 22 (2), 19–33
Gender, n identifying as female (%): 61 (59)
Country of birth, n other than Australia (%): 40 (38)
Language spoken at home, n not English (%): 32 (31)
Student type, n domestic (%): 94 (90)
Postgraduate study, n prior postgraduate (%): 18 (17)
Time since undergraduate studies (yr), mean (SD): 1.0 (1.4)
Table 3. Final level of supervision rating, number of students, and (%) for Stage 1 and Stage 2 Entrustable Professional Activities.
Entrustable Professional Activities: final level of supervision, n students (%) of n = 104, shown as Explain (E) | Direct (D) | React (R) | Available (A).

Stage 1
S1 EPA 1, Gather information from a medically stable patient with a common clinical presentation: E 0 (0) | D 1 (1) | R 72 (69) | A 31 (30)
S1 EPA 2, Integrate information gathered from a patient to construct a reasoned and prioritised differential diagnosis as well as a preliminary plan for common clinical presentations and diagnoses: E 0 (0) | D 4 (4) | R 91 (88) | A 9 (9)
S1 EPA 3, Communicate information relevant to the patient’s care with other members of the healthcare team: E 0 (0) | D 0 (0) | R 79 (76) | A 25 (24)
S1 EPA 4, Provide the healthcare team with resources to improve an individual patient’s care or collective patient care: E 0 (0) | D 5 (5) | R 86 (83) | A 13 (13)
S1 EPA 5, Perform required procedures: E 0 (0) | D 5 (5) | R 93 (89) | A 6 (6)

Stage 2
G EPA 1, Gather a history and perform a physical examination: E 0 (0) | D 0 (0) | R 16 (15) | A 88 (85)
G EPA 2, Synthesise available information to prioritise a differential diagnosis and develop a management plan that includes appropriate medication and/or other therapies: E 0 (0) | D 4 (4) | R 52 (50) | A 48 (46)
G EPA 3, Form clinical questions and use the medical literature and research methodologies to retrieve information and resources to advance patient care: E 0 (0) | D 0 (0) | R 40 (38) | A 64 (62)
G EPA 4, Recognise a patient requiring urgent or emergent care and initiate evaluation and management: E 0 (0) | D 1 (1) | R 47 (45) | A 56 (54)
G EPA 5, Obtain informed consent for clinical encounters, tests, and procedures, and perform common procedures for an intern: E 0 (0) | D 0 (0) | R 35 (34) | A 69 (66)
G EPA 6, Recommend and interpret common diagnostic and screening tests: E 0 (0) | D 0 (0) | R 49 (47) | A 55 (53)
G EPA 7, Report a clinical encounter orally, and document patient assessment and management (e.g., findings, orders, prescriptions, and adverse incidents): E 0 (0) | D 0 (0) | R 30 (29) | A 74 (71)
G EPA 8, Collaborate as an intern in an interprofessional team by giving or receiving handovers, making referrals, and requesting expert consultations: E 0 (0) | D 2 (2) | R 25 (24) | A 77 (74)
G EPA 9, Share information about the patient’s care, including diagnosis and management plan, with a patient: E 0 (0) | D 0 (0) | R 34 (33) | A 70 (67)
Table 4. Overall, Stage 1, and Stage 2 minimum required number of assessment tasks in each assessment grade category (formative, summative) and format (examination, oral presentation, participation, work-based, written), and total number of assessment tasks completed by students.
Assessment task grade category and format: minimum required number of assessment tasks, shown as Overall (n = 157) | Stage 1 (n = 73) | Stage 2 (n = 84).

Minimum required formative assessment tasks, n: 70 | 35 | 35
  Examination, n (%): 2 (3) | 1 (3) | 1 (3)
  Participation, n (%): 1 (1) | 1 (3) | 0
  Work-based assessment, n (%): 62 (89) | 31 (89) | 31 (89)
  Written assignment, n (%): 5 (7) | 2 (6) | 3 (9)
Minimum required summative assessment tasks, n: 87 | 38 | 49
  Examination, n (%): 23 (26) | 15 (39) | 8 (16)
  Oral presentation, n (%): 9 (10) | 5 (13) | 4 (8)
  Work-based assessment, n (%): 23 (26) | 7 (18) | 16 (33)
  Written assignment, n (%): 32 (37) | 11 (29) | 21 (43)

Total completed assessment tasks, shown as Overall | Stage 1 | Stage 2:
  † Completed, number/student, mean (SD), range: 187 (9), 167–208 | 78 (2), 75–83 | 109 (9), 88–130
  Completed, % of minimum required, mean (SD), range: 119 (6), 106–132 | 107 (3), 103–114 | 130 (11), 105–155
† 103 participants for amount of completed assessment, which excluded 1 participant who entered the MD in Year 2 from the overall and Stage 1 counts.
