Article

Harness a Simple Design to Make Authentic Learning Moments Visible: A Design-Based Research Study in Clinical Reasoning

Kelly Galvin * and Louise Townsin *
1 Education Innovation Exchange (EIX), Swinburne University of Technology, Hawthorn, VIC 3122, Australia
2 Research and Innovation Office, Torrens University Australia, Adelaide, SA 5000, Australia
* Authors to whom correspondence should be addressed.
Educ. Sci. 2025, 15(12), 1679; https://doi.org/10.3390/educsci15121679
Submission received: 24 October 2025 / Revised: 24 November 2025 / Accepted: 2 December 2025 / Published: 12 December 2025

Abstract

There is a growing demand for digital innovation to facilitate authentic communication during the learning experience at Australian universities. Students' communication is considered 'authentic' in various ways, from using discipline-specific professional language to expressing personal values through honest self-reflection. Enhancing authentic rational decision-making during online social learning is one priority area for students developing clinical reasoning skills. Using a Design-Based Research (DBR) methodological framework, 34 students, 26 educators, and 5 learning designers from Torrens University Australia provided iterative feedback on the development and implementation of a simple digital decision wheel tool aimed at supporting independent and collaborative decision-making. Three DBR phases were implemented, encompassing an initial pilot and development stage with 3 subjects and two subsequent phases in which an additional 17 subjects incorporated the decision wheel tool for independent and problem-based learning. Data were generated through 44 semi-structured interviews and 20 focus groups across twenty undergraduate subjects delivered in various learning modes across five 12-week DBR action cycles. Reflexive thematic analysis and bounded rationality theory guided analysis. Outputs reveal that a simple digital tool contributed positively to making authentic learning moments visible and promoted inclusive and formative dialogue. Benefits included the development of psychological authenticity when preparing to make authentic industry decisions. The initiative aligns with broader educational goals of resourcing and developing tools that scaffold a 'critical pause' before articulating authentic thinking when engaging with humans and machines.

1. Introduction

Technology is considered one important factor influencing education design in Australia, with a growing demand for digital innovation to enhance communication during the learning experience (Mandala Partners, 2023). Interest has emerged in clinical education in developing tools that support independent rational decision-making before and after collaborative learning with peers, a phenomenon observed with the shift from traditional Face-to-Face (F2F) problem-based learning (PBL) to online modes of delivery (Dolmans, 2019; Hoadley & Campos, 2022). The question of how to ensure authentic learning has occurred for individuals and groups has led to efforts to define what proficient authenticity in learning looks like and how it can be made transparent across a student's learning journey. Authentic learning is generally associated with students having 'real-world' experiences preparing for industry (Chambers & Broadbent, 2024; Fawns et al., 2024; Herrington et al., 2006). This perspective of authentic learning is evolving to encompass students being able to showcase trustworthy opinion beyond just completing a real-life task (Ajjawi et al., 2024).
Evidencing that ‘psychological authenticity’ has occurred requires a deliberate scaffold to inspire students to make decisions that are consistent with their personal values (Ajjawi et al., 2024; Chambers & Broadbent, 2024). To support this scaffold, digital learning tools have potential to guide self-reflection and to endorse decision-making in ways that feel safe and genuine to the individual. As Maloney et al. (2013) explain, understanding how students arrive at critical reflective decisions requires visibility to ascertain the truthfulness of thinking and to promote deeper discussion. The possibility for both psychological barriers and physical influences to affect authentic communication is a learning problem to be solved.
The need to 'see' whether students' decision-making is 'authentic' when collaborating is now a broader issue being interrogated collectively at a time when learning with generative artificial intelligence (genAI) has accelerated in education (Chan & Tsi, 2024; Lee et al., 2024; Lodge et al., 2023; Liu & Bates, 2025). The current state of higher education is driving innovative ways to showcase authenticity in collaborative decision-making and to mitigate the risk of genAI compromising the evaluation of individual contributions. Daily social media posts appeal for educators to look for evidence of learning rather than of cheating (Ellis & Lodge, 2024), to ensure actual capability is visible during assessment (Dawson et al., 2024), and to reinforce that demonstrating authentic critical thinking remains vital. The reality in higher education is that any assessment task can be completed by a student with genAI, much like groupwork (Lodge et al., 2024). As these tensions increase, there is an opportunity to revisit outputs from research projects that predate advanced genAI uptake, to explore transferable solutions. The simple decision wheel tool produced in this study, designed to facilitate moments of 'critical pause' that enhance authentic decision-making during clinical reasoning development, is now potentially transferable beyond that original context.
To generate ethical educational solutions, it is imperative to adopt a research strategy that seeks to understand 'what works' in practice (Bakker, 2018; Galvin & Cochrane, 2023). In this study, an innovative learning approach was developed during a longitudinal Design-Based Research (DBR) project (Galvin, 2023) to enhance how health science students at Torrens University Australia perform clinical decision-making. Outputs were generated as original contributions in both tangible form (practical artefacts, e.g., the decision wheel tool) and non-tangible form (six final themes and design principles, and scholarship learnings) (Galvin, 2023). The final outputs were informed by iterative consultation with students, teachers, and learning designers during five DBR action cycle stages. One final design principle, DP3: Harness a simple design, was inspired by the effectiveness of a simple decision wheel tool when learning to make decisions on clinical cases and negotiating these ideas with others. The notion of using a simple tool to create a 'critical pause' for students to realise and externalise their authentic reasoning is discussed in this paper. The outputs of this research project may therefore be more broadly relevant to the assurance of authentic learning in higher education.

2. Research Objectives

The original research objective was to inform clinical learning design prior to students entering workplace settings, student clinics, and industry working environments (Galvin, 2023). During 2018–2021, online opportunities for clinical education expanded at Torrens University Australia (TUA), with 80 new subjects (also termed 'units' of study) developed in health sciences and nursing undergraduate courses (also termed 'programs' of study). This presented an ideal time to develop and test a unique approach to learning clinical sciences and to explore the main research question, with key sub-questions mapped as advisory or descriptive, to inform improvements to clinical reasoning development (Bakker, 2014; Braun & Clarke, 2022; Galvin, 2023).
Main Research Question (Advisory): How can combining independent online clinical reasoning analysis with group work support undergraduate health science students learning to perform rational decision making?
The skill of clinical reasoning involves gathering and evaluating information to rationally decide on a clinical intervention. To explore the problem of how to better support TUA health science students in learning clinical reasoning, a digital decision wheel tool (Galvin, 2021) was embedded into subjects across undergraduate courses to be used independently, when engaged in group work, and when being guided by an expert teacher. Importantly, this project did not aim to find 'the ultimate truth', but rather to provide a theoretically informed rationale for a contextual examination of a learning problem that could have possible transferability to similar contexts (Galvin, 2023). The design principle of 'harness a simple design' offers both naturalistic generalisability, if it resonates with others because outcomes are familiar to alternative settings, and inferential transferability, when others in different contexts decide they could adopt and adapt the design principle and the decision wheel for their own settings (Bakker, 2018; Smith, 2018).

3. Materials and Methods

Design-Based Research (DBR) is a pragmatic methodological framework for solving pedagogical problems through the implementation of iterative design interventions across four phases: problem analysis, development of solutions to the identified problem, exploration of the design in practice, and evaluation of its impact in real learning environments (Cochrane et al., 2023; McKenney & Reeves, 2019). The methodology of this longitudinal project had a dual focus: to be instrumental, by manipulating the environment through task-oriented problem-solving, and to observe and provoke the naturalistic ways learners communicated personal thoughts and values when making decisions (Bakker, 2018). The reflexive thematic analysis (RTA) method (Braun & Clarke, 2022) and bounded rationality theory (Simon, 1972) were applied when testing, refining, and retesting an online decision wheel tool for independent and group work. This combination provided a lens to observe and reflect on how individuals' rationality was limited by three elements when making decisions independently and in groups: (1) imperfect information, (2) cognitive limitations of minds, and (3) the amount of time learners had to make decisions. Figure 1 outlines the conceptual research model that underpinned this study, combining a theoretical framework, a methodological framework, and a philosophical framework to solve a learning problem (Galvin, 2023).
In practice, the experience of learning clinical reasoning was explored by embedding a decision wheel tool in twenty problem-based learning undergraduate health science and nursing subjects across Face-to-Face (F2F), Blended Learning (BL), and Fully Online Learning (FOL) platforms (Galvin, 2023) with a variety of learning design features:
  • Structured PBL design using the decision wheel for weekly non-graded group learning activities (6 subjects).
  • Using the decision wheel tool for non-graded learning activities independently and for informal group work (12 subjects).
  • Group-based learning design (not structured PBL) using decision wheel for weekly non-graded learning activities (2 subjects).
  • During later DBR action cycles, the decision wheel was used in selected subjects for assessments as a non-graded hurdle, albeit not mentioned in the rubric for grading (5 subjects).
The decision wheel tool (Galvin, 2021) is simple in both design and function and was designed to catalogue and prioritise thoughts, rate choices in response to a question, and provide a summary of thinking that could easily be made visible to the self and others. To practice 'real-world' clinical decision-making, learning activities involved making decisions for case scenarios of varying complexity. Tasks included deciding on the most important clinical questions to ask a patient in a specific context, identifying key red flag symptoms for further investigation, listing a sequence of physical examination steps, and developing a treatment management plan. Notably, teachers and students had agency to decide on alternative ways to use the decision wheel tool throughout the DBR cycles. This evolved to include activities that challenged students, independently and collaboratively, to decide how to mitigate challenges commonly experienced in group work, to manage time and plan learning tasks, and to identify and reflect on learning needs.
The first landing page of the wheel tool design provides space to write a specific question and identify and add categories that link with the question. Once categories are decided in relation to a specific question, the second page of the digital tool offers an interactive means to rate the level of importance of each chosen category. Figure 2 demonstrates the layout of the first (a) and second (b) landing page of the decision wheel tool, providing space to input a specific question and related category responses, with an interactive feature to rate chosen categories (1 being low importance and 10 being highest importance).
Individual students or groups completed wheel tool attempts as many times as required. The attempts were downloaded, saved, and uploaded into individual or group forums to share and discuss outcomes. Students used the decision wheel tool predominantly when engaging in formative weekly non-graded learning activities for independent and group tasks (Galvin, 2023). These learning design approaches facilitated a stepping stone experience for learners in practising clinical reasoning choices without penalty, and promoted formative feedback during the learning process (Gamage et al., 2020).
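To make the structure of a wheel attempt concrete, the data it captures can be pictured as a question, a set of up to eight named categories, and an importance rating from 1 to 10 for each category, which can then be summarised, shared, and discussed. The short Python sketch below is purely illustrative of that structure under the constraints described above; the class, field, and function names are hypothetical and do not reflect the actual implementation of the decision wheel tool.

from dataclasses import dataclass, field

# Constraints described in this study: up to 8 categories per wheel,
# each rated from 1 (low importance) to 10 (highest importance).
MAX_CATEGORIES = 8
RATING_MIN, RATING_MAX = 1, 10


@dataclass
class WheelAttempt:
    """One decision wheel attempt: a question plus rated categories (illustrative only)."""
    author: str
    question: str
    ratings: dict[str, int] = field(default_factory=dict)  # category -> importance rating

    def add_category(self, category: str, rating: int) -> None:
        """Add a category and its importance rating, enforcing the assumed limits."""
        if len(self.ratings) >= MAX_CATEGORIES:
            raise ValueError(f"A wheel holds at most {MAX_CATEGORIES} categories.")
        if not RATING_MIN <= rating <= RATING_MAX:
            raise ValueError(f"Ratings must be between {RATING_MIN} and {RATING_MAX}.")
        self.ratings[category] = rating

    def summary(self) -> list[tuple[str, int]]:
        """Return categories ordered from highest to lowest importance, ready to share and discuss."""
        return sorted(self.ratings.items(), key=lambda item: item[1], reverse=True)


# Example: a student prioritising clinical questions for a fictional case scenario.
attempt = WheelAttempt(author="Student A", question="Which clinical questions matter most for this case?")
attempt.add_category("Screen for red flag symptoms", 10)
attempt.add_category("Clarify pain history", 8)
attempt.add_category("Review current medications", 6)
print(attempt.summary())  # highest-rated category first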
Participants included 34 students, 26 teachers, and 5 learning designers over five DBR action cycles (12-week Trimesters). Students, teachers, and learning designers associated with newly developed health science subjects were invited to join this study during iterative DBR action cycles over an 18-month period. To meet the collaborative aim of DBR, it was imperative for participants to be involved during the development and delivery phases of subjects that had both a problem-based learning (PBL) design and could have the decision wheel tool embedded for learning critical, rational decision-making. Hence, an opportunity/convenience sampling of participants using voluntary selection was used for generating a data set and applying qualitative analysis (Braun & Clarke, 2012; Campbell et al., 2021).
Once ethics approval was obtained from Torrens University Australia (H2/19) and a suitable subject was identified as coming on scope for new development, an 'Authority to design for subject delivery' ethics form was emailed to the subject development course leader to gain permission to use the digital analysis tool in the subject. Participant Information and Consent forms were distributed to students, teachers, and learning designers associated with the new subject developments. It was important that participants did not feel pressured to be part of the research project. Data generated encompassed 44 semi-structured interviews, 20 focus groups, 10 participant reflective journal entries, 65 researcher reflective journal entries, and 40 learner decision wheel attempts. Each DBR cycle presented an opportunity to gather feedback and re-engage with the literature to inform future improvements of the learning design and decision wheel tool to enhance clinical reasoning development. As the central researcher, the first author's aim was to oversee the DBR process, invite stakeholder feedback through consultation phases, and oversee modification and evaluation of the decision wheel design intervention (Galvin, 2023). The process of generating data from focus groups, interviews, and journal entries resulted in many audio/text/visual items for analysis. All interview recordings were transcribed verbatim during DBR phases by the central researcher, with analysis guided by the stages of reflexive TA.
The purpose of reflexive thematic analysis is to generate patterns of meaning across a data set and, for this reason, it could include a larger qualitative participant group (Braun & Clarke, 2021, 2022). RTA is a 'Big Q' (fully qualitative) analysis method that aligns with the beliefs, values, and assumptions of a qualitative paradigm; it can both be descriptive (giving credence to participants' voices in context) and build interpretive insights into how and why patterns of meaning impacted lives and experiences (Braun & Clarke, 2022, 2025). During Phases 2 and 3, semantic coding representing participant voices was generated by the researcher during data analysis. In Phase 4 of the project, the entire data set was revisited to engage in all six stages of RTA: re-familiarisation with the data set, development of a latent/critical orientation to coding, generating initial themes, reviewing themes, defining and naming themes, and final write-up.

4. Results

To inform how to solve a learning problem using design-based research, draft design principles guide improvements throughout iterative cycles of testing and retesting an artefact, and represent collaborative voices. A key output of this study is a final set of design principles for enhancing the learning of clinical reasoning skills before performing authentically in workplace environments. Among the set of six final design principles (DPs) (see Appendix A) was DP3: Harness a simple design, informed by feedback and analysis during DBR cycles. Appreciation for the simplicity of the decision wheel tool design was evident in how it assisted learners in being 'discerning in deciding' and 'deliberate in expression' while learning how to perform clinical reasoning across all undergraduate year levels. The simple design of the decision wheel was described as follows:
  • Being a useful and versatile learning tool for decision-making to use as individuals and in groups.
  • Enhancing inclusivity and collaboration for learning.
  • Encouraging learning attempts to show authentic progression of thinking.
  • Assisting transparency of learning and reducing academic integrity risk.
  • Providing better opportunities for efficient formative feedback dialogue between peers and key teachers.
The central idea discussed in this paper is that a deliberately simple scaffold facilitated a critical pause for students to externalise strengths and limitations in their reasoning. Notably, decision wheel tool attempts made 'cognitive limitations' visible for students when independently deciding how to prioritise case scenario needs. Using bounded rationality theory (Simon, 1991) to guide research analysis, the category of 'information imperfection' was more acutely evident in a real-world reasoning task when students faced 'time constraints' in using the rating scale of the decision wheel or did not have enough time to discuss attempts with peers or their teacher (Galvin, 2023). Figure 3 illustrates a student decision wheel attempt in which every category was given the same rating of importance, compared to a second wheel attempt for the same case scenario that was worked through as a group with a teacher.
The three elements of bounded rationality theory were mapped to student and teacher experience of using the decision wheel tool for decision-making (Galvin, 2023). Imperfect information was experienced by students when first practicing how to use discipline knowledge and skills for clinical decision-making. Likewise, cognitive limitations were felt when processing information independently became a struggle, often leading to simplifying the decision-making outcome. Problem-solving was more rushed when there were imposed time constraints to make optimal decisions. Table 1 reveals common learning and teaching experiences mapped to the three elements of bounded rationality theory when students and teachers across undergraduate year levels engaged with the decision wheel tool.
The experience of using the decision wheel tool revealed benefits beyond making specific decisions to prepare for industry. During individual interviews and focus groups, both students and teachers across undergraduate health science levels shared insights aligned with the development of psychological authenticity by using the tool. Table 2 outlines operational indicators of psychological authenticity that were expressed through the opinions of students and teachers when using the simple decision wheel tool for increasing a feeling of safety to negotiate authentic thinking with a group of peers.
The majority of students and educators did not want the tool's simple design to change, with feedback strongly focused on maintaining the usability of the tool. Main improvement requests included enhancing editing functions during wheel attempts, improving the questions the tool was used for, and having more platforms to which wheel attempts could be uploaded when working online with groups (Galvin, 2023). Table 3 outlines the improvements and evolution of DP3: Harness a simple design during the implementation of five 12-week DBR action cycles.
For developing real-world and psychological authenticity during decision-making, design strengths included maintaining the simplicity of the tool, having no instant feedback feature for choices being made, and keeping a limited number of categories (up to 8) in which to add ideas for rating in response to a question. These design features were expressed as important for the tool's versatility across diverse student cohorts, for encouraging (and challenging) synthesis of ideas, and for providing pause for personal reflection on ideas in visible form before external input.

5. Discussion

Clinical education has traditionally promoted collaborative inquiry to encourage agile and critical thinkers to decide on case studies and treatment options (Barber et al., 2015; Burgess et al., 2018; Blackburn, 2015; Christensen & Jensen, 2019; Galvin, 2023). This learning approach was applied in this study to explore real-world, complex clinical problems and solutions. Students were required to demonstrate decision-making moments independently and in groups and, in doing so, needed to trust their own judgement and to distinguish between reliable and unreliable information (Lombardi, 2007). Hence, in this context, learning was 'authentic' when knowledge was situated in a real-world activity and when independent decision-making was articulated with others with personal meaning (Herrington et al., 2006). Previous research on clinical education has focused on student perceptions comparing the traditional Face-to-Face (F2F) lecture style with a group learning experience to stimulate critical thinking (Al-Azri & Ratnapalan, 2014; Burgess et al., 2018; Fan et al., 2018; Galvao et al., 2014; Gholami et al., 2016). In comparison, by inviting students, teachers, and learning designers to collaborate iteratively on learning design options, this study's examination of the three key elements of bounded rationality, discussed in more detail in the next paragraphs, provided valuable insight into how a simple decision wheel tool positively influenced making a clinical decision (Galvin, 2023).

5.1. Time Constraints

The impact of 'time constraints' (Simon, 1972) during problem-solving activities proved to be an important design consideration for helping students become aware of using heuristics (mental shortcuts) when making quick clinical decisions, a necessary skill for practitioners upon graduation to choose real-world solutions confidently (Bolton, 2018; Higgs et al., 2019; Marewski & Gigerenzer, 2012; Moore, 2020; Young et al., 2018). Building this real-world skill was not achieved solely by shortening the time to make decisions, or by simulating an 'authentic industry experience' for early undergraduate health science students. Instead, developing the patience to explore and develop longer arguments in thinking was critical for students learning how to perform real-world decision-making in the longer term (Ajjawi et al., 2024; Lombardi, 2007). This was affirmed when students expressed having more confidence to identify and share opinions on clinical case scenarios after having time to use the decision wheel tool before and during group work, aligning with building psychological authenticity as a first step towards quick real-world clinical reasoning, and even imagining their future selves being skilled enough to enact these tasks in professional settings (Ajjawi et al., 2024; Chambers & Broadbent, 2024; Galvin, 2023).

5.2. Information Imperfection

As Higgs et al. (2019) attest, the benefits of articulating clinical reasoning include simplifying complex thinking processes to make ideas explicit to the self and others. Students and teachers described being able to improve 'information imperfection' when they had equal opportunity to freely defend, explain, assess, and challenge heuristic thinking in groups by using the decision wheel tool (Galvin, 2023).

5.3. Cognitive Limitations

Notably, developing confidence in articulating authentic thinking required teachers to deliberately facilitate psychological authenticity by encouraging students to feel comfortable with potentially making mistakes and revealing 'cognitive limitations'. Tautz and Steen (2021) reiterate that we can only be mistaken in a choice when we are able to choose between at least two alternatives. Hence, learning tools that make authentic critical thinking choices 'visible' have been credited with helping students talk, write, and demonstrate knowledge and skills, encouraging rational discernment and confidence to share authentic ideas (Medina, 2017).

5.4. Psychological Authenticity in Decision-Making

The experience of using the decision wheel demonstrated for students how a simple tool facilitated feelings of confidence to promote self-expression and to challenge assumptions. Ajjawi et al. (2024) explain that authenticity is in a constant state of 'flux', which is perfectly natural for students who are in stages of becoming. Having the opportunity to practice being courageous and critical towards the self and others by using a non-intimidating, simple decision wheel tool was a positive physical influence on learning, leaving time to work through any psychological barriers experienced when articulating honest decisions in a pre-clinical learning context (Galvin, 2023; Maloney et al., 2013).

5.5. The Benefit of a “Critical Pause”

The central idea of providing students with a deliberate and simple scaffold to create a critical pause to externalise reasoning was facilitated by using the decision wheel tool design for clinical reasoning development in this DBR project. Markedly, the use of a simple learning tool that did not produce instant and reactive external commentary on opinions enabled this ‘critical pause’ during decision-making. Time for a deliberate pause during the learning process had a positive influence on clarifying an honest personal stance before sharing it with others, hence assisting co-regulation of progressive thinking.
Institutions in higher education globally are now tasked with developing and socialising policies and procedures for incorporating the use of generative AI in learning to ensure students are future-ready for employment (Benton, 2025; Fergusson et al., 2022; Falkner & Stålbrandt, 2023), while maintaining the validity and reliability of assessment design (Bearman et al., 2024; Dawson et al., 2024). Moreover, co-regulation of learning between humans and machines is now argued to be a 'new norm' in education and has triggered an urgency for students to progressively 'show' the development of independent and authentic critical thinking skills alongside using generative AI (Lodge et al., 2023; Lodge et al., 2024; Liu & Bates, 2025). The principle of 'Harness a simple design' is now potentially transferable to broader learning contexts at a time in higher education when students have access to generative artificial intelligence for brainstorming ideas, much as they rely on peers or teachers to discuss thinking (Lodge et al., 2023). Exploring ways to incorporate simple tools like the decision wheel design when engaging with machines may increase both psychological and practice-based authentic skill development. The authors acknowledge that the absence of genAI exposure is a limitation of this study, which cannot move beyond speculation about how DP3: Harness a simple design could be used to enhance authentic learning when students are collaborating with machines. We recommend future research explores how a simple learning design can facilitate visibility of authentic decision-making when using generative artificial intelligence for independent and group learning across varying disciplines and modes of delivery.
In summary, design principles from DBR projects can often outlast the design artefact itself (Bakker, 2018). This project offers a message that by harnessing a simple design principle, digital learning tools can potentially be designed to be versatile enough to be used across disciplines for facilitating a critical pause to externalise stages of independent and authentic critical thinking. Furthermore, simple tools like the decision wheel helped develop psychological authenticity in decision-making prior to entering varied real-world clinical workplace settings. Keeping designs simple is worthy of further exploration to solve a complex, shared problem of finding inclusive ways for students to articulate authentic learning moments during a learning journey.

6. Conclusions

The simplicity of the tool facilitated the generation of psychological authenticity while learning to make authentic real-world decisions for clinical case scenarios. These insights highlight the importance of designing simple tools that make independent and group authentic learning moments visible and that encourage confidence in decision-making before entering workplace environments. Further evaluation is recommended to explore whether the principle of 'Harness a simple design' can support the broad requirement in higher education for students to practice a 'critical pause' and show steps of co-regulation of learning between humans and machines.

Author Contributions

K.G. conceptualised the study and led the development of the decision wheel artefact, as well as the methodology, validation, formal analysis, data curation, writing of the original draft, and review and editing. L.T. supervised the original study and provided critical feedback on the structure and clarity of the argument. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was approved by the Torrens University Australia (TUA) Human Research Ethics Committee (HREC) (H2/19, approved 5 March 2019).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The high-level data that support the findings of this study are available from the corresponding author, Kelly Galvin, upon reasonable request.

Acknowledgments

The authors would like to acknowledge the contribution of Timothy Moss for invaluable HDR supervisory support for the original project. We would like to thank all the students, teachers, leadership staff, and learning designers from Torrens University Australia for their time and collaboration during the initial project. We are grateful for assistance with formatting the original thesis by Jessica O'Donnell. We appreciate the constructive comments from the anonymous reviewers, which helped improve the clarity and quality of this manuscript significantly. The authors have reviewed and edited the output and take full responsibility for the content of this publication. The authors would like to acknowledge that no GenAI has been used for generating text, data, or graphics, or for study design, data collection, analysis, or interpretation of data for the creation of this manuscript.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
TUA    Torrens University Australia
DBR    Design-Based Research
DPs    Design Principles
F2F    Face-to-Face
BL    Blended Learning
FOL    Fully Online Learning
PBL    Problem-Based Learning
GenAI    Generative Artificial Intelligence
Chat GPT    Chat Generative Pre-Trained Transformer

Appendix A

A final set of Design Principles (DPs) was generated by the researcher after full data analysis as a key results output of this study.
Table A1. Final Six Design Principles (DPs).
Final Theme and Design Principle | Introduction to DP Meaning
DP1: Be the guide in the hive | Highlights a concept expressed across various points throughout the data for how the teacher role is a vital component of the learning experience.
DP2: Engage Coaching | Explores the concept that ongoing support is key for both teachers and students.
DP3: Harness a simple design | Outlines the central notion that simplicity is key for learning. Particularly evident when using digital tools for problem-based learning and in relation to both content and resources for subject designs involving decision-making.
DP4: Integrate time for reflexivity | Brings to the fore the concept that time for a quality reflexive process in learning and teaching is key.
DP5: Vocalise and collaborate | Promotes the concept that accessible spaces for communication are required to expand perspectives for quality decision-making.
DP6: Encourage stakeholders as partners in curriculum | Indicates quality of learning design evolves when choices for curriculum are ‘conscious & considered’. Extending to the need for co-creation & consultation with stakeholders using the learning design and tools for clinical reasoning development.

References

  1. Ajjawi, R., Tai, J., Dollinger, M., Dawson, P., Boud, D., & Bearman, M. (2024). From authentic assessment to authenticity in assessment: Broadening perspectives. Assessment & Evaluation in Higher Education, 49(4), 499–510. [Google Scholar] [CrossRef]
  2. Al-Azri, H., & Ratnapalan, S. (2014). Problem-based learning in continuing medical education: Review of randomized controlled trials. Canadian Family Physician, 60(2), 157–165. Available online: http://www.cfp.ca/content/60/2/157 (accessed on 20 February 2018). [PubMed]
  3. Bakker, A. (2014). Research questions in design-based research. Utrecht University, Freudenthal Institute. [Google Scholar]
  4. Bakker, A. (2018). Design research in education: A practical guide for early career researchers (1st ed.). Routledge. [Google Scholar]
  5. Barber, W., King, S., & Buchanan, S. (2015). Problem based learning and authentic assessment in digital pedagogy: Embracing the role of collaborative communities. Electronic Journal of E-Learning, 13(2), 59–67. [Google Scholar]
  6. Bearman, M., Tai, J., Dawson, P., Boud, D., & Ajjawi, R. (2024). Developing evaluative judgement for a time of generative artificial intelligence. Assessment and Evaluation in Higher Education, 49(6), 893–905. [Google Scholar] [CrossRef]
  7. Benton, P. (2025). Career development learning in higher education: How authentic work experiences and opportunities for career exploration can increase self-efficacy and inform career identity. Journal of the National Institute for Career Education and Counselling, 34(1), 40–47. [Google Scholar] [CrossRef]
  8. Blackburn, G. (2015). Innovative eLearning: Technology shaping contemporary problem based learning: A cross-case analysis. Journal of University Teaching & Learning Practice, 12(2), 1–19. [Google Scholar]
  9. Bolton, G. E. J. (2018). Reflection and reflexivity: What and why. In G. E. J. Bolton, & R. Delderfield (Eds.), Reflective practice. SAGE Publications Ltd. [Google Scholar]
  10. Braun, V., & Clarke, V. (2012). Thematic analysis. In H. Cooper, P. M. Camic, D. L. Long, A. T. Panter, D. Rindskopf, & K. J. Sher (Eds.), APA handbook of research methods in psychology, Vol 2: Research designs: Quantitative, qualitative, neuropsychological, and biological (pp. 57–71). American Psychological Association. [Google Scholar] [CrossRef]
  11. Braun, V., & Clarke, V. (2021). To saturate or not to saturate? Questioning data saturation as a useful concept for thematic analysis and sample-size rationales. Qualitative Research in Sport, Exercise and Health, 13(2), 201–216. [Google Scholar] [CrossRef]
  12. Braun, V., & Clarke, V. (2022). Thematic analysis. A practical guide. Sage Publications. [Google Scholar]
  13. Braun, V., & Clarke, V. (2025). Reporting guidelines for qualitative research: A values-based approach. Qualitative Research in Psychology, 22(2), 399–438. [Google Scholar] [CrossRef]
  14. Burgess, A., Roberts, C., Ayton, T., & Mellis, C. (2018). Implementation of modified team-based learning within a problem based learning medical curriculum: A focus group study. BMC Medical Education, 18(1), 74. [Google Scholar] [CrossRef] [PubMed]
  15. Campbell, K., Orr, E., Durepos, P., Nguyen, L., Li, L., Whitmore, C., Gehrke, P., Graham, L., & Jack, S. (2021). Reflexive thematic analysis for applied qualitative health research. The Qualitative Report, 26(6), 2011–2028. [Google Scholar] [CrossRef]
  16. Chambers, T. P., & Broadbent, J. (2024). Why authenticity matters: A qualitative investigation of students’ perceptions of personal values in first-year assessments. Teaching in Higher Education, 30(3), 623–639. [Google Scholar] [CrossRef]
  17. Chan, C. K. Y., & Tsi, L. H. Y. (2024). Will generative AI replace teachers in higher education? A study of teacher and student perceptions. Studies in Educational Evaluation, 83, 101395. [Google Scholar] [CrossRef]
  18. Christensen, N., & Jensen, G. M. (2019). Developing clinical reasoning capability. In J. Higgs, G. M. Jensen, S. Loftus, & N. Christensen (Eds.), Clinical reasoning in the health professions (4th ed.). Elsevier. [Google Scholar]
  19. Cochrane, T., Galvin, K., Buskes, G., Lam, L., Rajagopal, V., Glasser, S., Osborne, M. S., Loveridge, B., Davey, C., John, S., Townsin, L., & Moss, T. (2023). Design-based research: Enhancing pedagogical design. In T. Cochrane, V. Narayan, C. Brown, K. MacCallum, E. Bone, C. Denneen, R. Vanderburg, & B. Hurren (Eds.), ASCILITE 2023 conference proceedings: People, partnerships and pedagogies (pp. 351–356). ASCILITE Publications. [Google Scholar] [CrossRef]
  20. Dawson, P., Bearman, M., Dollinger, M., & Boud, D. (2024). Validity matters more than cheating. Assessment & Evaluation in Higher Education, 49(7), 1005–1016. [Google Scholar] [CrossRef]
  21. Dolmans, D. (2019). How theory and design-based research can mature PBL practice and research. Advances in Health Sciences Education, 24(5), 879–891. [Google Scholar] [CrossRef]
  22. Ellis, C., & Lodge, J. (2024, July). Stop looking for evidence of cheating with AI and start looking for evidence of learning [Post]. LinkedIn. Available online: https://www.linkedin.com/posts/cath-ellis-8162581b_education-highered-learning-activity-7216169175851892737-3uyr?utm_source=share&utm_medium=member_desktop&rcm=ACoAABjUX2AB_0FjvoyIXvgA9sQXlXcqGtGWxT0 (accessed on 10 June 2025).
  23. Falkner, K., & Stålbrandt, E. E. (2023). Meanings of authentic learning scenarios: A study of the interplay between higher education and employability of higher education graduates. International Journal of Teaching and Learning in Higher Education, 35(2), 171–183. [Google Scholar]
  24. Fan, C., Jiang, B., Shi, X., Wang, E., & Li, Q. (2018). Update on research and application of problem-based learning in medical science education. Biochemistry and Molecular Biology Education, 46(2), 186–194. [Google Scholar] [CrossRef] [PubMed]
  25. Fawns, T., Bearman, M., Dawson, P., Nieminen, J. H., Ashford-Rowe, K., Willey, K., Jensen, L. X., Damşa, C., & Press, N. (2024). Authentic assessment: From panacea to criticality. Assessment & Evaluation in Higher Education, 50(3), 396–408. [Google Scholar] [CrossRef]
  26. Fergusson, L., van der Laan, L., Imran, S., & Danaher, P. A. (2022). Authentic assessment and work-based learning: The case of professional studies in a post-COVID Australia. Higher Education, Skills and Work-Based Learning, 12(6), 1189–1210. [Google Scholar] [CrossRef]
  27. Galvao, T. F., Silva, M. T., Neiva, C. S., Ribeiro, L. M., & Pereira, M. G. (2014). Problem-based learning in pharmaceutical education: A systematic review and meta-analysis. The Scientific World Journal, 2014, 578382. [Google Scholar] [CrossRef]
  28. Galvin, K. (2021). Decision wheel tool [artefact]. Available online: http://x16.space/dw/ (accessed on 10 June 2025).
  29. Galvin, K. (2023). Clinical reasoning development: Enhancing independent and group rational decision making [Doctoral thesis, Torrens University of Australia]. [Google Scholar] [CrossRef]
  30. Galvin, K., & Cochrane, T. (2023). Design-based Research: An ethical framework to address pedagogical problems and innovation [Poster]. AARE2023. University of Melbourne. [Google Scholar] [CrossRef]
  31. Gamage, K. A. A., de Silva, E. K., & Gunawardhana, N. (2020). Online delivery and assessment during COVID-19: Safeguarding academic integrity. Education Sciences, 10(11), 11. [Google Scholar] [CrossRef]
  32. Gholami, M., Moghadam, P. K., Mohammadipoor, F., Tarahi, M. J., Sak, M., Toulabi, T., & Pour, A. H. H. (2016). Comparing the effects of problem-based learning and the traditional lecture method on critical thinking skills and metacognitive awareness in nursing students in a critical care nursing course. Nurse Education Today, 45, 16–21. [Google Scholar] [CrossRef]
  33. Herrington, J., Reeves, T. C., & Oliver, R. (2006). Authentic tasks online: A synergy among learner, task, and technology. Distance Education, 27(2), 233–247. [Google Scholar] [CrossRef]
  34. Higgs, J., Jensen, G. M., Loftus, S., & Christensen, N. (Eds.). (2019). Clinical reasoning in the health professions (4th ed.). Elsevier. [Google Scholar]
  35. Hoadley, C., & Campos, F. C. (2022). Design-based research: What it is and why it matters to studying online learning. Educational Psychologist, 57(3), 207–220. [Google Scholar] [CrossRef]
  36. Lee, D., Arnold, M., Srivastava, A., Plastow, K., Strelan, P., Ploeckl, F., Lekkas, D., & Palmer, E. (2024). The impact of generative AI on higher education learning and teaching: A study of educators’ perspectives. Computers and Education. Artificial Intelligence, 6, 100221. [Google Scholar] [CrossRef]
  37. Liu, D., & Bates, S. (2025). Generative AI in higher education: Current practices and ways forward. In [Whitepaper] Generative AI in higher education: Current practices and ways forward—APRU. Association of Pacific Rim Universities. [Google Scholar]
  38. Lodge, J. M., De Barba, P., & Broadbent, J. (2024). Learning with generative artificial intelligence within a network of co-regulation. Journal of University Teaching & Learning Practice, 20(7), 1–10. [Google Scholar] [CrossRef]
  39. Lodge, J. M., Howard, S., & Bearman, M. (2023). Assessment reform for the age of artificial intelligence. Available online: https://www.teqsa.gov.au/sites/default/files/2023-09/assessment-reform-age-artificial-intelligence-discussion-paper.pdf (accessed on 15 February 2025).
  40. Lombardi, M. (2007). Authentic learning for the 21st century: An overview. Educause Learning Initiative, 1(2007), 1–12. [Google Scholar]
  41. Maloney, S., Tai, J. H. M., & Lo, K. (2013). Honesty in critically reflective essays: An analysis of student practice. Advances in Health Sciences Education, 18, 617–626. [Google Scholar] [CrossRef]
  42. Mandala Partners. (2023). Technology and the universities of tomorrow. Available online: https://mandalapartners.com/reports/technology-and-the-universities-of-tomorrow (accessed on 3 June 2023).
  43. Marewski, J. N., & Gigerenzer, G. (2012). Heuristic decision making in medicine. Dialogues in Clinical Neuroscience, 14(1), 77–89. Available online: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3341653/ (accessed on 8 September 2019). [CrossRef]
  44. McKenney, S., & Reeves, T. (2019). Conducting educational design research (2nd ed.). Routledge. [Google Scholar]
  45. Medina, M. S. (2017). Making students’ thinking visible during active learning. American Journal of Pharmaceutical Education, 81(3), 41. [Google Scholar] [CrossRef]
  46. Moore, R. L. (2020). Developing lifelong learning with heutagogy: Contexts, critiques, and challenges. Distance Education, 41(3), 381–401. [Google Scholar] [CrossRef]
  47. Simon, H. A. (1972). Theories of bounded rationality. In C. B. McGuire, & R. Radner (Eds.), Decision and organisation (pp. 161–176). North-Holland Publishing Company. [Google Scholar]
  48. Simon, H. A. (1991). Bounded rationality and organizational learning. Organization Science, 2(1), 125–134. [Google Scholar] [CrossRef]
  49. Smith, B. (2018). Generalizability in qualitative research: Misunderstandings, opportunities and recommendations for the sport and exercise sciences. Qualitative Research in Sport, Exercise and Health, 10(1), 137–149. [Google Scholar] [CrossRef]
  50. Tautz, J., & Steen, D. (2021). The honey factory: Inside the ingenious world of bees (2nd ed.). Black Inc. [Google Scholar]
  51. Young, M., Dory, V., Lubarsky, S., & Thomas, A. (2018). How different theories of clinical reasoning influence teaching and assessment. Academic Medicine, 93(9), 1415. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Conceptual research model (Author).
Figure 2. Decision wheel attempt example: (a) Image of first landing page of the digital decision wheel tool showing space to add name, a question, and up to 8 categories; (b) second landing page of the decision wheel tool showing that each category can be rated using a drag feature for each section of the wheel.
Figure 3. Decision wheel attempt for case scenarios by undergraduate health science students: (a) Image of decision wheel attempt by 2nd year bachelor student deciding on physical examination priorities for a case scenario; (b) decision wheel attempt completed by a group of 2nd year bachelor students with a teacher negotiating and discussing with the student group about physical examination priorities for the same fictional case scenario.
Table 1. The student experience of using a decision wheel tool in undergraduate health science subjects, mapped to elements of Bounded Rationality Theory (Galvin, 2023).
Bounded Rationality Theory Elements | Learning and Teaching Experience | Yr Level
Imperfect Information | “The act of filling in the wheel made me question why chose the ratings at a deeper level” (p. 307). | 2nd Yr Students
Imperfect Information | “Positive experience using wheel as helped to break down information and come to a conclusion. Found it helped to differentiate what my preferences were for the case study” (p. 309). | 2nd Yr Students
Imperfect Information | “I do think it’s a good fit to use in an assessment [as non-graded appendix] and I think it’s supported where the students might have been a little bit scared to write a reflective piece. I think just being able to deduce it down and keep them focused on those particular points that they want to discuss. I think it was really good” (p. 210). | 2nd Yr Teacher
Cognitive Limitations | “I would have had a completely different outcome for my assessment if had not used the wheel tool. It helped a lot to come to best outcome. Helped me to notice ratings, priorities and things to consider. If no wheel, the considerations to reflect on would not have been as broad” (p. 311). | 1st Yr Student
Cognitive Limitations | “The wheel did change what I would have prescribed, and it was easier to discuss appropriate choice” (p. 308). | 2nd Yr Student
Cognitive Limitations | “I can compare previous trimesters to this trimester, where we had similar questions, but we didn’t have the wheel. Having the wheel there as a facilitator, really substantiates what the students are saying, or not saying” (p. 311). | 1st Yr Teacher
Time Constraints | “It actually makes the students stop and think about their own approach or their own critical reasoning process or their own decision-making process. I think just that pause, it creates then sparks the, ‘Oh, hang on. I’ve actually got to make decisions myself. How would I go about making those decisions what’s important in that decision? What are the priorities in that decision?’” (p. 320). | 1st Yr Teacher
Time Constraints | “I think it is a great tool when you have a group working and then everyone can have a buy in and then can compare. So it is a bit of a that time-saver in that it allows us to put our thoughts down and then you can combine everybody views for example, then we can see if we all agree or not, where we have differing opinions and see where the differences are. Whereas if just talk as a group, it takes forever” (p. 323). | 2nd Yr Student
Table 2. Indicators of psychological authenticity observed and experienced when using a decision wheel tool for undergraduate health science subjects (Ajjawi et al., 2024; Chambers & Broadbent, 2024; Galvin, 2023).
Psychological Authenticity Indicators | Learning and Teaching Experience | Yr Level
Promote self-expression | “I suffer with social anxiety and to have a tool to go, ‘yep, I’m really confident in this area’, will make it easier for someone like me to speak up and felt like I’m not just being pushed aside by the rest of the group […]. I also feel confident with other dominant personalities to have my say” (p. 326). | 1st Yr Student
Understand own opinions and values | “Sometimes people in a group can dominate and then quieter people retreat. But if everyone has done an individual wheel before coming to the group, then everyone participates better” (p. 326). | 2nd Yr Student
Able to have honest dialogue and challenge assumptions | “We had a little bit of an argument in an academic forum, kind of feel like a real discussion, about why we thought that was a different priority [..]” (p. 325). | 3rd Yr Student
Communication is consistent with own values | “Because the wheel was the non-assessable aspect of the assessment, as facilitator could see all the attempts were their own and not done by someone else” (p. 210). | 1st Yr Teacher
Feel safe to articulate opinions | “It was more egalitarian in the sense that it allowed students to not be dominated so much, but a single voice that might happen in a discussion that didn’t include this visual tool [..]” (p. 326). | 1st Yr Teacher
Table 3. Decision wheel improvements and DP 3 evolution during DBR action cycles.
DBR Phases | Improvements | DP 3 Evolution
Phase 2—Cycle 1 (1 focus group/15 interviews across 3 subject cohorts/24 weeks) | Customise title and notes section of the wheel tool. Create instructional videos and improve exemplars. | Provide safe practice for learning (simplicity is key for learning tools)
Phase 3—Cycle 2 (13 focus groups/18 interviews across 17 subject cohorts/24 weeks) | Creation of autonomous online learning spaces and improving how groups can use the wheel synchronously. | Autonomy can inspire learning (versatility is key for learning tools)
Phase 3—Cycle 3 (6 focus groups/11 interviews across 16 subject cohorts/12 weeks) | Wheel tool added into both non-graded learning activities and as assessment hurdles. | Simple designs bridge learners (visual tools benefit justification of ideas)
Phase 4—Revisiting all data set using 6 stages of RTA | Creation of more ways to use the decision wheel to practice decision-making without penalty. | Harness a simple design
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

