Article

Recruitment Rush: A Boardgame to Teach Students About Recruiting Participants for a User Experience Study

by Melissa J. Rogerson 1,*, Benjamin McKenzie 1 and Elisa K. Bone 2
1 School of Computing and Information Systems, The University of Melbourne, Parkville, VIC 3010, Australia
2 Melbourne Centre for the Study of Higher Education, The University of Melbourne, Parkville, VIC 3010, Australia
* Author to whom correspondence should be addressed.
Educ. Sci. 2026, 16(2), 282; https://doi.org/10.3390/educsci16020282
Submission received: 30 November 2025 / Revised: 19 January 2026 / Accepted: 21 January 2026 / Published: 10 February 2026

Abstract

Authentic student projects in higher education reflect plausible real-world scenarios, connecting the curriculum to students’ future careers. In Human–Computer Interaction, this is realised through a focus on real-world design and evaluation problems, which offer similar challenges to those found in industry settings. The teaching of participant recruitment, however, remains a limitation: students typically choose convenience samples of friends and family as their research participants rather than developing a balanced and budget-conscious recruitment strategy. This paper presents Recruitment Rush, a boardgame designed specifically to meet this challenge, presenting a meaningful real-world recruitment scenario that can be played in under an hour to build understanding of how to recruit participants for a study. Evaluation of the game with 95 students and academics shows that it is engaging and invites conversation and reflection on the nature of participant recruitment, even beyond the direct HCI context.

1. Introduction

A critical concern for user researchers in Human–Computer Interaction (HCI) is how to recruit participants for studies—how many people to recruit, how much detail to specify about them during the study design phase, and how to find them. Even where authentic project work is built into curricula, there is limited opportunity for students to practise recruitment. Due to constraints of time and budget, they typically recruit a convenience sample of friends and family (Bernstein et al., 2011; Caine, 2016). This means that students do not have the opportunity to develop and implement a recruitment strategy, an essential skill for user researchers.
By situating learning tasks in contexts that reflect “the way the knowledge will be used in real life” (Herrington et al., 2014, p. 403), authentic learning confronts students with “the same problem-solving challenges in the curriculum, as they [experience] in their daily endeavours” (Herrington et al., 2014, p. 402). Games can be an accessible, low-risk, engaging and authentic tool for teaching (see, for example, (Khaldi et al., 2023; Meriläinen & Piispanen, 2022)) when they are carefully designed to model specific skills for a specific context. Recognising the gap in educational games for HCI, and specifically in games that teach about participant recruitment, we developed, produced and evaluated a card-based game for use in small group classes to teach students about recruiting study participants. The game introduces students to participant recruitment and provides an opportunity to participate in a simulated recruitment activity. It prepares students to design and implement participant recruitment plans that address participant sampling, budget, and the practical constraints of time and access.
Our Research Question asks whether a tabletop boardgame can authentically support HCI education about participant recruitment. This paper contributes the Recruitment Rush game as an artefact, together with reflections on the process and value of using a game for instruction. These are based on three key activities: (1) reflections by the design team; (2) evaluation by educators and experienced HCI researchers, comprising graduate students and academics; and (3) evaluation by currently enrolled students in graduate and undergraduate subjects that focus on evaluation methods. These highlight, in particular, the value that tabletop games can offer to HCI education in delivering authentic, game-based learning that simulates real-world methods.
In the following sections, we outline the state of research across three key areas: the importance of authenticity and accuracy in games for education; the use of games for education; and the use of games in HCI education.

1.1. Authenticity and Accuracy in Games for Education

Our work on Recruitment Rush reflects the authenticity offered by introducing ‘real-world’ scenarios that students will need to navigate in their professional work, within the safe (and engaging) space of a game-based environment. Such authenticity is particularly important, note Ney et al. (2014, p. 132), “in fields that are difficult to teach because students do not relate the learning goals to their personal experience or learning project.” They present a three-dimensional model of authenticity in simulation games which juxtaposes External or Real-World Relevance, Internal or Gameplay Relevance, and Learning Relevance. Here, external relevance relates to how the model reflects reality, and whether learners feel that they are being “prepared to react adequately in real professional situations” (Ney et al., 2014, p. 134). Rogers et al. (2022, p. 2) define realism in this context as occurring in settings where “the thing being signified in the reproduction process is the real world”. This dedication to realism contrasts with the approach taken by Mochocki (2021), who notes that it may be more important that a game feel authentic (“authenticity-of-feelings”) than that it actually be accurate in its depiction of events or processes (“authenticity-of-facts”) (p. 953). Together, these approaches suggest that an authentic simulation must at least feel real or realistic to learners—through what Petraglia (1998, p. 11) describes as “preauthentication”. Gameplay relevance requires that the game represent “a logical sequence of events” and provide a consistent and coherent experience (Ney et al., 2014, p. 133). Learning relevance requires that learners take on the setting of the game “with the feeling that it is relevant, or meaningful” in the learning context; the lessons must be transferable to similar problems and settings (Ney et al., 2014, p. 133).
In practical implementations, an Extended Authentic Learning Framework (EALF) has been applied in the design of educational games. The EALF can be read against broad definitions of authenticity (providing meaningful opportunities for students that connect with their prior experiences or with ‘real-world’ problems) and against authenticity in terms of educational game development. Safiena and Goh (2022) used the nine principles of authentic learning outlined by Herrington (Herrington, 2006; Herrington et al., 2014) as design guidelines for a game about identifying hazards. They found that students identified game interaction and guidance as the most important game design and authentic learning factors when evaluating their game, suggesting that these elements are key to the delivery of authentic learning. Even when “actual history doesn’t take place” (Stirling & Wood, 2021), games can authentically represent the setting of an historical event and promote students’ engagement with history. Table 1 summarises and illustrates the connections between Ney’s model, the EALF, and Herrington’s principles of authentic learning.
More recently, Levin et al. (2025) examined how authenticity can be created in simulations used in teacher education. They highlight the value of physical (including setting, facilities, and props), contextual (representativeness), and experiential (structure, flexibility, reactions elicited) elements of a scenario developed for a form of role-play.

1.2. Games for Education

Games and other playful artefacts have been used in education settings for hundreds of years. As early as the 1700s, engravers like John Spilsbury were using jigsaws to “dissect” maps of the world which were then used as geography teaching tools, possibly based on early designs by French educator Madame de Beaumont (Historic Royal Palaces, 2015; Norgate, 2007). Similarly, wargames have a more than hundred-year history as tools for teaching military strategy (Enstad, 2022; Hagen, 2022; Smith, 2010), and more recently computing topics (Haggman, 2019; López-Fernández et al., 2021; North, 2016; Papadakis, 2020) and modern history (MacDougall & Faden, 2016; Reynaud & Northcote, 2015).
A diverse range of games have been used in preschool, primary, secondary and tertiary education. Game-based learning can immerse students in authentic scenarios, developing knowledge and skills to support real-world activities (Herrington et al., 2014). Researchers have examined the alignment between games and educational settings, exploring what makes them effective and how to enhance these effects (Plass et al., 2015). A systematic review of the literature on the use of computer games in primary education, for example, found that game-based learning approaches were widely used across the curriculum (Hainey et al., 2016). In secondary education, computer games are used across a diverse range of subjects including mathematics (Fadda et al., 2022), history (McCall, 2022), science (Kara, 2021), and foreign language acquisition (Acquah & Katz, 2020; Meriläinen & Piispanen, 2022; Peterson et al., 2022). These continue to be primarily subject-specific and focused on imparting explicit knowledge. A further application of games, however, is to develop skills such as computational thinking (Sun et al., 2023) and problem-solving (Adachi & Willoughby, 2013; Araiza-Alba et al., 2021), as well as build student motivation both in specific subjects and more broadly (Fadda et al., 2022; Malouf, 1988). Many of these claimed skills align with graduate attributes, “an orienting statement of education outcomes used to inform curriculum design and the provision of learning experiences at a university” (Barrie et al., 2009). This provides a meaningful connection to higher education policy as a justification for the use of these games in universities, even beyond the subject-specific pedagogical benefits (Hill et al., 2016).
Definitions of game-based learning frequently explicitly mention “computer games” (e.g., (Hainey et al., 2016; Tang et al., 2009)), excluding games that do not involve computers or other devices: “Usually it is assumed that the game is a digital game, but this is not always the case.” (Plass et al., 2015). There has, however, also been significant work on the use of boardgames and tabletop role-playing games in education settings. In HCI, boardgame research has focused on the development of specific games to teach topic knowledge (e.g., robotics design (Collins & Šabanović, 2021), gut health (Pasumarthy et al., 2021) or quantum computing (Weisz et al., 2018)) or for specific instrumental purposes (e.g., a boardgame for use in co-design workshops with children (Gennari et al., 2019)), as well as on the user experience and cognitive work of boardgame play with or without hybrid components (Farkas et al., 2020; Rogerson et al., 2018; Sidji et al., 2023). A systematic review on the use of boardgames as interventions in medical disciplines found that they can improve understanding (“educational knowledge”) when used for health education, improve cognitive function, enhance interpersonal interactions, and may contribute to improving players’ mental health by increasing motivation and moderating symptoms of depression and anxiety (Noda et al., 2019). Another systematic literature review examines the use of simulation games, which model authentic situations and approaches, for education. To be effective, these must be matched to target groups in difficulty, familiarity with the topic, and cultural and linguistic understanding (Alf, 2022). They should be structured to be easy to learn but offer unpredictability and complexity to a level that suits the learners, and are—essentially—moderated by teaching staff. Meaningful learning, as well as other positive benefits, is found in many different types of game.

1.3. Human–Computer Interaction Education

Games are well-explored in HCI, but an early paper found that only 40% of surveyed research on games and play used games operatively—that is, “as an instrument or tool for achieving external (i.e., non play or non-fun) goals” (Carter et al., 2014). This included 20 papers, or around 11% of the reviewed corpus of 178 papers, which used games for ‘Education and Learning’. Although there has been considerable research in this area since then, much of it has focused on the user experience of games in non-HCI education settings (Zuo et al., 2020), on the use of game design features or gamified learning platforms to motivate student learning (Silveira, 2020; van Roy et al., 2018), on game design education (Wyeth et al., 2018), and on games as the subject of project study or teaching (Roldan et al., 2020; Santana-Mancilla et al., 2019).
Despite this interest in games as pedagogical tools, little work has addressed the use of games for HCI education. Researchers have examined the use and design of a videogame (Santana-Mancilla et al., 2019) as the subject of a usability engineering process (Greenberg, 1996) and as the subject of a co-design project (Roldan et al., 2020), and the use of badges to motivate students in an HCI course (Silveira, 2020), but there is limited work that explicitly presents an educational game as a pedagogical tool to teach concepts specific to the HCI curriculum (de Souza Lima & Benitti, 2019). We have identified only three prior games that have been developed specifically to teach concepts from HCI, all of which primarily focus on heuristic evaluation. The Usability Game (F. B. V. Benitti & Sommariva, 2015) is a computer game which tells the story of a software development company and its attempts to improve the usability of its products. Student players are ‘employed’ as usability engineers and work through a series of tasks including requirements analysis, the design of a software prototype, and heuristic evaluation. The game was shown to improve student performance in requirement analysis and heuristic evaluation. The same authors also developed UsabiliCity (F. Benitti & Sommariva, 2012), a game which appears to focus on the usability life cycle.1
Like The Usability Game, Heureka focuses on the teaching of usability heuristics (Sobrino-Duque et al., 2022). It appears to use a multiple-choice, quiz-like function where the user is invited to select an image that best represents a given heuristic. Unlike The Usability Game, however, Heureka showed no measurable effect on students’ learning. These three games focus on a very narrow aspect of the HCI curriculum, suggesting a significant gap for developing games that teach foundational HCI skills and concepts. While The Usability Game appears to adopt an authentic learning approach, Heureka is strictly a quiz game.

2. Materials and Methods

In this section, we contextualise our approach to game design and outline the design of Recruitment Rush, before we discuss the methods used to evaluate it.

2.1. Game Design

There are a number of models and methodologies for game design which, typically, can be applied to both boardgames and computer games. One that has been widely used is the MDA framework (Hunicke et al., 2004). This short paper proposes that games can be interpreted formally across three dimensions: Mechanics—the algorithms and data that construct the game; Dynamics—the interactions between players and mechanics; and Aesthetics—the emotional responses that the game seeks to evoke in a player. One of the authors of the MDA framework, Robert Zubek, subsequently presented a revised and more detailed framework which reframes these concepts as Mechanics, Gameplay, and Player Experience (Zubek, 2020). In both of these models, the different elements are closely imbricated.
While these models are useful for interpreting games, we find Jesse Schell’s Elemental Tetrad a more practical model to use in designing games (Schell, 2015). This comprises four key elements: Mechanics, Story, Aesthetics, and Technology. In this model, Mechanics comprises both the Mechanics and Dynamics/Gameplay elements of the MDA model and Zubek’s revised model, while the Aesthetics/Player Experience element of those models is not explicitly represented in the Tetrad.
To Schell, the Mechanics are the procedures and rules that comprise the game, aligning with the concept of Gameplay relevance discussed by Ney et al. (2014) and with the game-related criteria in the EALF (Safiena & Goh, 2022). At a more granular level, Engelstein and Shalev have catalogued over 200 distinct game mechanisms or “building blocks” for tabletop games (Engelstein & Shalev, 2022). Schell’s Story element reflects the theme or sequence of unfolding events, which delivers external relevance (Ney et al., 2014) and authenticity (Safiena & Goh, 2022). Aesthetics describes the ways that the game appeals to the player’s senses, aligning with the user-friendliness element of the EALF (Safiena & Goh, 2022), and Technology comprises the materials that facilitate the game (Schell, 2015). Each of these elements is essential and connected to each of the others, and the connections between these elements inform and drive the design of a game.

2.2. Design of Recruitment Rush

Before describing our study, we outline the game Recruitment Rush2 and its connection to core HCI curriculum and skills.
In Recruitment Rush, each player is tasked with recruiting participants for a usability study, to be conducted in one week’s time. A personal client card (see Section 2.2.2) sets out the parameters for the study that each player is to conduct—the website to be studied, required number of participants, a budget for the study, and a profile for their ideal user.
The game is played over seven “days”, each comprising one round. The central board features a day/round tracker, draw and discard piles for Participant cards (see Section 2.2.3), a central “Agency” of four face-up Participant cards, and a summary of available actions and possible attributes.
In each round, each player undertakes one recruitment activity, choosing from four options of differing cost and efficacy (see Table 2).
Players recruit participants for their study over the course of the game, depending on the actions they choose. Recruited participants form a face-up tableau in front of the player. We designed the recruitment actions to reflect common activities during participant recruitment—the use of a convenience sample, often of family and friends (“Ask Around”); using local channels (posters, newsletters) (“Local Advertising”); advertising on social media and similar channels (“Online Advertising”); and commissioning an external recruiter to provide pre-qualified participants (“Use the Agency”). After seven rounds, players pay the indicated incentive to their participants and the game is scored, with players earning bonus points for matching participants to recruitment criteria.
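To make the turn structure concrete, the loop below sketches one player’s seven rounds in Python. This is an illustrative model, not the authors’ implementation: the starting budget, action costs, and draw counts are placeholders (Table 2 is not reproduced here), apart from Local Advertising, which the text later describes as a base cost of 3 to draw 5 cards.

```python
import random

# Recruitment actions from the game. Costs and draw counts are
# illustrative placeholders, except Local Advertising (from the text).
ACTIONS = {
    "Ask Around":         {"cost": 1, "draw": 2},  # placeholder
    "Local Advertising":  {"cost": 3, "draw": 5},  # base cost from the text
    "Online Advertising": {"cost": 5, "draw": 7},  # placeholder
    "Use the Agency":     {"cost": 7, "draw": 1},  # placeholder
}

def play_round(budget, deck, action):
    """Pay the action's cost and draw its cards from the deck."""
    spec = ACTIONS[action]
    if spec["cost"] > budget:
        raise ValueError("insufficient budget for this action")
    drawn = [deck.pop() for _ in range(min(spec["draw"], len(deck)))]
    return budget - spec["cost"], drawn

budget = 25                   # placeholder starting budget
deck = list(range(100))       # stand-in for the 100 Participant cards
random.shuffle(deck)
tableau = []
for day in range(7):          # the game is played over seven rounds
    budget, drawn = play_round(budget, deck, "Local Advertising")
    tableau.append(drawn[0])  # keep one recruit per round (simplified)
```

In the real game, players may keep several of the drawn participants, and must pay each recruit’s incentive before scoring at the end of round seven; the sketch omits those steps.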

2.2.1. Design Brief and Constraints

As the game was developed on a limited budget and to be used in a (time-limited) tutorial class, there were some specific constraints on the design.
Our first, and significant, design constraint was the short timeframe in which the game was to be taught and played—a one-hour tutorial class, typically with 5 min to set up and pack up, leaving 50 min for class activities. This contributed to several design decisions, including the use of the game board to provide rules reminders. Research has consistently shown that people are not good at reading game rules; rule interpretation is error-prone and disorderly, with players finding rules to be “unintelligible” (Liberman, 2011). Rather than relying on students reading the game rules in advance of the session, or reading and interpreting them during the class, therefore, we chose to introduce and teach the game to the whole group at the start of the tutorial class, leaving about 40 min for the gameplay itself.
The second design constraint was the need for simplicity, to cater to players with different experience of and interest in games. This aligns with the user-friendliness criterion from the EALF (Safiena & Goh, 2022). As we will discuss in Section 3, there is wide variation in game interest and experience among our students. Accordingly, while we drew design inspiration from hobby games rather than mass-market games, we ensured that the mechanisms chosen were simple and easy to explain in the context of the game’s theme. For example, at the end of each player’s turn a participant was moved out of the Agency and replaced with a new one; this was explained as their being recruited by other organisations. This also informed later design decisions, which we will discuss in the context of the playtest sessions in Section 3. Moreover, it precluded some of our initial design ideas, including the use of variable player powers or abilities, which would have added complexity and planning overheads.
Finally, the game needed to be portable and storable. This limited the size of the participant deck and of other components in the game.

2.2.2. Client Cards

We developed 24 Client cards. Each represents a potential (imagined) Client for a usability or user experience (UX) study (see, for example, Figure 1). Each Client card begins with the name and brief description of the client.
Additionally, each card describes a preferred study size, representing the number of participants to be recruited, a budget, and a profile of three “desired” participant attributes (see Section 2.2.3), comprising a personal (purple) and a socio-economic (blue) trait as well as personal interests (green). The budget amount reflects, to some degree, how commonly those attributes appear in the deck.

2.2.3. Participant Cards

We designed 100 Participant cards (see Figure 2), representing the people who participate in the study. In designing these cards, we considered the dimensions that are often used to describe participants in a study, with a focus on the industry setting of the majority of our students’ future UX careers.
Each Participant card has a name and an associated incentive cost. Cards with ‘scarcer’ attributes have a higher incentive cost, as do those from demographics that tend to be harder to recruit (e.g., high-income adult professionals). Additionally, each Participant card has three sets of attributes: personal attributes (shown in purple text), socio-economic attributes (shown in blue text) and hobbies (shown in green text). Table 3 shows the different options available for each of these attributes.
While we aimed to make attributes broadly representative of the general population, and used a spread of personal and socio-economic attributes that align to those frequently used by practising UX specialists, we did not explicitly match them to Australian Bureau of Statistics data. Recruitment Rush is an educational game rather than a ‘true’ simulation, and interpreting population-level demographic data was beyond the scope of the project.
Recognising that gender is a complex and “messy” construct (Taylor et al., 2024) but wanting to explicitly include non-binary people in our participant pool, we chose to use pronouns in place of a more overt statement of gender. We recognise that our use of just three pronoun pairs—she/her, they/them, and he/him—is also an oversimplification of this complex issue. Similarly, we simplified age to three categories—young (notionally, up to about 35), adult (to retirement age), and mature—and household status to either single or partnered.
In addition to a personal identity, we provided each participant with three socio-economic attributes. We classified professions as domestic, essential, professional, retired, student, or trade; income as low, middle, high, or wealthy; and English proficiency as beginner, competent, or proficient. We initially used IELTS levels as the measure of proficiency, but found that these levels were not familiar to some participants, so switched to the more descriptive terms. The use of these terms reflects that different projects may target different levels of English language comfort. For example, a local organisation in an area with high numbers of community language speakers might explicitly seek to connect with people with beginner English proficiency. Categories were loosely connected—for example, a mature participant might be retired and a young participant might be a student, although not exclusively. Finally, we developed a list of 12 hobbies and allocated three to each participant, allowing each participant card a unique combination of hobbies.
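The card design above can be summarised as a small data model. The attribute vocabularies below are taken directly from the text; the class itself, the example card, and its values are illustrative, not the authors’ materials.

```python
from dataclasses import dataclass

# Attribute vocabularies as described in the card design.
PRONOUNS = {"she/her", "they/them", "he/him"}
AGES = {"young", "adult", "mature"}
HOUSEHOLDS = {"single", "partnered"}
PROFESSIONS = {"domestic", "essential", "professional",
               "retired", "student", "trade"}
INCOMES = {"low", "middle", "high", "wealthy"}
ENGLISH = {"beginner", "competent", "proficient"}

@dataclass(frozen=True)
class ParticipantCard:
    name: str
    incentive: int      # scarcer attributes carry a higher incentive cost
    pronouns: str
    age: str
    household: str
    profession: str
    income: str
    english: str
    hobbies: frozenset  # three of the game's twelve hobbies

    def __post_init__(self):
        # Validate every attribute against its fixed vocabulary.
        assert self.pronouns in PRONOUNS
        assert self.age in AGES
        assert self.household in HOUSEHOLDS
        assert self.profession in PROFESSIONS
        assert self.income in INCOMES
        assert self.english in ENGLISH
        assert len(self.hobbies) == 3

# A hypothetical card in the style of Figure 2.
card = ParticipantCard(
    name="Sam", incentive=15, pronouns="they/them", age="young",
    household="single", profession="student", income="low",
    english="proficient",
    hobbies=frozenset({"gaming", "cooking", "cycling"}),
)
```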

2.2.4. Scoring

We developed a scoring system (see Figure 3) that rewarded players for recruiting to the desired profile and also for recruiting a diverse sample of participants.
As we discuss below, players found the scoring system to be fiddly. Additionally, and highlighting the value of playtesting with diverse groups, one group with very disparate play styles observed that the developed scoresheet strongly favoured a desire for larger participant numbers, rather than a small but carefully selected sample. To remediate this, we suggest increasing the score for Desired Attributes from one point to three points “for each Participant matching each of your Project’s desired attributes”. Another option would be to reduce a player’s score by two for each Participant who does not match any of the desired attributes; however, we do not recommend this negative scoring as it can be experienced as hostile or punitive (Engelstein, 2020).
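The revised rule can be sketched as a scoring function: three points for each participant matching each of the client’s desired attributes. The diversity term below is a hypothetical stand-in for the actual scoresheet, which is not reproduced in the text.

```python
def score_tableau(participants, desired):
    """Score a tableau of recruits against a Client card's profile.

    participants: one set of attribute strings per recruited card
    desired: the Client's three desired attributes
    """
    # Revised rule from the text: 3 points for each Participant
    # matching each of the Project's desired attributes.
    match_points = sum(3 * len(p & desired) for p in participants)
    # Hypothetical diversity bonus (stand-in for the real scoresheet):
    # one point per distinct attribute represented across the sample.
    distinct = set().union(*participants) if participants else set()
    return match_points + len(distinct)

# Hypothetical example: two recruits scored against three desired attributes.
tableau = [{"young", "student", "gaming"}, {"adult", "trade", "gaming"}]
desired = {"young", "gaming", "trade"}
```

Under this sketch, a small, well-matched sample can outscore a large, unmatched one, which is the imbalance the revised rule aims to correct.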

2.2.5. Connection to Game Design Literature

In designing Recruitment Rush, we followed the four key elements of Schell’s Tetrad (Schell, 2015). We began with the Story of recruiting participants for a usability study, which represents a common, authentic task for user researchers. From the design brief, we knew that the Technology we used for the game would primarily be cards. Connecting Story and Technology led us to the dual decks of Client and Participant cards, which are at the heart of the game. Similarly, the connection between cards and Mechanics, or the goals and procedures of the game (building a set of participants, in the form of a tableau), led us to the understanding that players would draw and select cards to form their tableau, and the connection between Mechanics and Story informed the various recruitment actions that are available in the game.
Although the game’s graphic design aesthetic remains very simple, it connects to the Mechanics in two ways: the board provides information about how to play the game, including a representation of the different actions that are available, and the coloured text on the cards links the different types of attributes. Images and text are used to enhance the story, for example through the introductions to the different client cards, and the layout of the cards is similar to that of other card games.
In the same way as the game connects to Schell’s Elemental Tetrad, it also incorporates specific design principles for educational games, which support the authenticity of the player experience (Laine & Lindberg, 2020). While these are too numerous to discuss in detail, the game pays particular attention to notions of Challenge (“DP2: Favor simple challenges over complex challenges”), Goals (“DP25: Create clear, meaningful, and achievable goals”), Relevance and Relatedness (“DP36: Relate gameplay to real-world contexts”), Storytelling and fantasy (“DP49: Create a meaningful story that the player can relate to”) and importantly, Learning (“DP28: Provide relevant and pedagogically grounded learning content and activities”).

2.3. Method

This project sought to address two core Research Questions. First, was it possible to design a game that taught participant recruitment within the boundaries of our design brief and constraints? Second, what were students’ attitudes towards the designed game, and did it make a valuable contribution to our HCI curriculum? The game’s learning objective was to teach students about the process of recruiting participants for a study, paying special attention to how to recruit participants, the types of information to include in a recruitment plan, the match between participants and a study’s audience, budgets, different recruitment methods, and the pragmatic trade-offs that may be made to achieve a target number of participants.
Although data were generated through an online survey, our methods are mixed: we use descriptive statistics to report on surveyed measures and qualitative coding to analyse free-text comments. The first author grouped comments into broad themes for analysis and reporting; given the small size and brevity of our dataset, we did not conduct a full thematic analysis.

2.3.1. Playtest Cycles

We playtested Recruitment Rush at three stages, which roughly correspond to the Concept, Preproduction, and Production stages of game design and development (Fullerton, 2014). Figure 4 shows the three cycles of playtesting.
Firstly, we playtested an early concept version of the game through an informal early playtest within the development team. This playtest focused on the “formal elements” of the game including the player actions, the game procedures, and the turn sequence or “core loop” (Fullerton, 2014). Secondly, we playtested a working preproduction version of the game with three teaching staff from a subject focused on UX evaluation and with a group of teaching specialists from across the University. Finally, we implemented an early production version of the game in tutorials for our UX evaluation subject and simultaneously ran a session where two groups of HCI researchers, comprising academic staff and graduate researchers, played the game and engaged in a focus group discussion. Both tutorial participants and expert reviewers were invited to complete a survey about their experience of the game. Ethics approval was obtained from The University of Melbourne. Information about the participants and key outcomes of each stage of playtesting is included below, in Section 2.4.

2.3.2. Survey

Building on prior research in HCI on player experience, we developed a survey to evaluate Recruitment Rush. The survey comprised a consent form; demographic questions including birth year, gender (using the response options presented by Spiel et al. (2019)), profession, and gaming experience; 16 Likert-scale questions, rated on a 7-point scale from “Strongly disagree” to “Strongly agree”; and three free-response questions exploring what the participant liked about the game, what they would change, and any additional information they wished to provide.
The table of Likert-scale questions included the 11 verbatim statements that comprise the mini-Player Experience Inventory (PXI) (Haider et al., 2022) as well as five statements that related more specifically to the educational goals and context of the game. This is congruent with the use of the PXI and variants; the developers of the PXI anticipated that researchers might create additional modules to extend the base PXI in particular directions (Abeele et al., 2020). These 16 statements were shown in random order (see Table 4).
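As a sketch of how such 7-point Likert items might be summarised with descriptive statistics (the reporting approach described in Section 2.3), the snippet below uses hypothetical response data; the values and the “top-2 box” measure are illustrative, not the paper’s results.

```python
from statistics import mean, median

def describe(item):
    """Descriptive statistics for one 7-point Likert item."""
    assert all(1 <= r <= 7 for r in item)
    return {
        "n": len(item),
        "mean": round(mean(item), 2),
        "median": median(item),
        "agree": sum(r >= 6 for r in item) / len(item),  # top-2 box share
    }

# Hypothetical responses, coded 1 ("Strongly disagree") to 7
# ("Strongly agree") as in the survey described above.
responses = [6, 7, 5, 6, 4, 7, 6, 5]
stats = describe(responses)
```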
The lead researcher on this project was also the subject coordinator responsible for teaching this Evaluation subject. She did not attend the classes, to ensure that students did not feel pressured or judged in participating. Because we did not know which students had chosen to complete the survey, we pre-incentivised survey participation (Müller et al., 2014) by providing cupcakes to all students who participated in the class.

2.4. Playtests

2.4.1. Concept Playtest

We used this playtest as part of the game development cycle, noting ideas for different elements of the game as well as for the flow of the game. This informal session was run at the home of one of the researchers; a family member with considerable experience playing boardgames spontaneously sat down to join the session and contributed her feedback and ideas as part of our discussion3. We used a small number of prepared, handwritten Client and Participant cards (see Figure 5 and Figure 6), which we added to as play continued.
Outcomes: A key outcome from this session was the need for a central board to focus attention on a common playing environment. This was also the stage where we developed and explored the concept of different ways to recruit participants and simplified some early ideas. We explored a variety of mechanisms including card drafting, where Participant cards were passed from player to player, but settled on each player having access to a semi-independent pool of participants. Additionally, we considered whether there should be action selection restrictions, with each action available to only one player per round, but discarded this idea as it added unnecessary complexity to the design.
At this stage, we also refined the implementation of the budget—not only would participants need to be paid an incentive, but there would also be costs to recruit in different ways. We used poker chips to represent the budget. Finally, we limited players’ options for recruitment. Our original design allowed players to draw extra cards when taking a recruitment action, at a cost of 1 per card. In this model, when taking the Local Advertising action (for example), which has a base cost of 3 to draw 5 cards, the player could choose to pay 4 to draw 6 cards, or 5 to draw 7 cards. Playing simulated turns with this option showed us that this was error-prone and fiddly, and made the game unnecessarily complex.
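For illustration, the discarded extra-draw rule can be sketched in code; the function name and the starting budget are hypothetical, while the costs (3 for 5 cards, 4 for 6, 5 for 7) match the example above.

```python
# Sketch of the discarded "pay 1 per extra card" rule for the Local
# Advertising action. Base cost 3 draws 5 Participant cards; each extra
# card costs 1 more. The starting budget of 20 is a hypothetical value.
def take_local_advertising(budget: int, extra_cards: int = 0) -> tuple[int, int]:
    """Return (remaining_budget, cards_drawn) after taking the action."""
    cost = 3 + extra_cards              # base cost 3, +1 per additional card
    if cost > budget:
        raise ValueError("cannot afford this action")
    return budget - cost, 5 + extra_cards

# Paying 4 to draw 6 cards, or 5 to draw 7, from a budget of 20:
option_a = take_local_advertising(20, extra_cards=1)   # (16, 6)
option_b = take_local_advertising(20, extra_cards=2)   # (15, 7)
```

Even in this simplified form, each recruitment choice requires tracking two coupled quantities (cost and cards), which illustrates the per-turn bookkeeping that proved fiddly at the table.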

2.4.2. Preproduction Playtests

Playtesting with Subject Staff: We ran a playtest with three members of our Evaluation teaching team. Two are graduate researchers who have tutored UX Evaluation subjects over several semesters; one is a lecturer who was assisting with the subject’s delivery. The game was run with a handwritten set of cards and board. In the absence of a supply of “play money”, we created a grid that acted as a budget tracker; players moved a counter to indicate how much money they had left. Although we had intended this as a short-term stop-gap, player feedback was positive: players liked seeing their budget being “crossed off”, so the stop-gap became a core component of the design.
Playtesting with Education Specialists: At this stage of the project, we also ran a playtest with four education specialist staff members from different areas of the University. These participants enjoyed the game, and felt that it connected well to real-world settings and pedagogical goals. They were concerned, however, that the game might be over-long and suggested reframing it as a co-operative game where players work together to recruit participants for a single study. We considered this option but decided that this would conflict with our design constraint of simplicity (see Section 2.2.1). Casual players may be unfamiliar with the notion of a cooperative game, as these tend to be aimed at hobbyists rather than the mass market (Woods, 2012).
Outcomes: Key outcomes from these playtests were that the game was playable but that it might need scaffolding in the first couple of rounds, particularly for people with less exposure to boardgames. It was after this stage that we increased the number of hobbies or interests on each participant card from two to three, to increase the likelihood of a match without increasing the number of cards in the deck. This was driven by the design constraint of portability (see Section 2.2.1).
An unexpected observation during these playtests was that the process of checking the cards drawn when taking the ‘Local’ or ‘Online’ advertising options gave a sense of qualifying participants through a screener survey, adding to the feeling of a “real” simulation.

2.4.3. Production Playtests

Tutorial Classes: We created multiple copies of the game and dedicated one week of tutorial classes (1 h of class time, up to 25 students in each of four classes) to playing it. Students were expected to play the game as part of their studies, and were invited to complete a survey at the conclusion of the class. A single reminder message to complete the survey was sent through the subject LMS after the end of the week. Students who chose not to complete the survey were invited to spend that time reflecting on what they had learned from playing the game. Of 87 students who attended the tutorial classes, 52 completed the survey. Figure 7 shows an image from a tutorial class. All groups completed the game within the set time, although in a few cases tutors advised groups to end the game early, after five or six “days” rather than the full seven. We see this as a strength of the design as it allows flexibility even when play proceeds more slowly than anticipated.
Expert Review: Our final playtest, run during the same week as the tutorial classes, was an expert evaluation with eight HCI researchers split into two tables for play. Three were working academic staff at our university and the other five were current graduate researchers in the HCI research group.
Like the students, this group of participants also completed the survey. Additionally, several of them chose to stay and participate in a discussion after the game had finished. One suggested that he had been able to “maths out” a winning strategy, and that the game might be too easy. Comparing his table’s result with the second table’s, however, we noted that his raw score was considerably lower than that of the student who won at the other table. The random nature of drawing cards mitigates the impact of such calculations.
Outcomes: A particular focus of the discussion was the scoring system for the game and the way that it incentivised broad recruitment rather than narrow and highly specific recruitment, even where the characteristics of a player’s selected Participants bore little resemblance to the desired characteristics. This was an unintended outcome of our scoring system that we had overlooked in earlier playtests. The discussion raised the possibility of having multiple scoring systems which the players could select at the start of the session, prioritising either reaching the required number of participants with some diversity in the population, or reaching a well-targeted but smaller group of participants with the potential to provide rich and detailed insights into the studied website. In response to this issue, we added conversation about the value of different types of recruitment to the post-play tutorial discussion, addressing the tension between quantity of participants and quality or fit to a recruitment profile. Further, the discussion identified the potential to use the game not only to teach about recruiting for a usability or UX study but also to teach graduate researchers about recruiting for their own projects, and to inspire similar conversations about recruitment priorities.
Further outcomes from these playtests are discussed in the following section.

3. Results

We received 60 responses to the survey: 52 from students and 8 from HCI researchers. An additional four participants consented to participate but did not respond to any of the extended PXI Likert scale statements or free-text response questions.
In total, of the 60 participants, 32 identified as women and 25 as men. One identified as non-binary, one preferred not to disclose, and one self-described as “He/They”. Fifty-two were current students in the Evaluation subject, one was an HCI/UX practitioner, four were academic HCI staff, and four were graduate researchers in HCI (participants were able to select more than one response option for this question). The median age of participants was 21 (year of birth 2002; IQR 2001–2003), which is congruent with a second-year subject in an undergraduate degree.
We also asked participants about their experience with games. All had at least some experience, although the time they spent playing games each week varied. Nearly a third (19 participants) spent less than an hour a week playing games, and only 12 reported playing for more than ten hours. Nevertheless, the majority of participants felt that they were non-novice game players: on a scale from 1 (novice) to 7 (expert), the average rating was 4.28. This mattered because the game needed to be straightforward enough for inexperienced players, reflecting our design goal of simplicity, while offering enough interest and complexity for more experienced groups.

3.1. PXI and Custom Module Findings

Table 5 shows the ratings from the mini PXI, which uses a 7-point Likert response scale, as well as from the five unvalidated survey items. In these scales, a response of −3 shows extreme disagreement, and 3 shows extreme agreement. The positive means all indicate some level of agreement with the statements. Full statements are shown in Table 4.
Overall, the mean responses to the Functional (1.446) and Psychosocial (1.454) elements measured by the mini PXI were remarkably similar. The game rated most highly for Enjoyment (participants strongly agreed that “I had a good time playing this game”; x̄ = 2.13, stdev = 0.77) and Immersion (“I was fully focused on the game”; x̄ = 1.97, stdev = 0.76). The lowest ratings from the mini PXI were for Autonomy (x̄ = 1.27, stdev = 1.35), Mastery (x̄ = 1.13, stdev = 1.37), Progress Feedback (x̄ = 1.18, stdev = 1.24) and Audiovisual Appeal (x̄ = 1.23, stdev = 1.29).
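As a concrete note on how these figures are derived, each Likert response is coded from −3 (“Strongly disagree”) to +3 (“Strongly agree”) and summarised per item as a mean and standard deviation. A minimal sketch, using hypothetical response values rather than our study data:

```python
# Per-item summary of Likert responses coded -3..+3.
# The response lists below are hypothetical examples, not the study data.
from statistics import mean, stdev

responses = {
    "I had a good time playing this game": [2, 3, 2, 1, 3, 2],
    "I was fully focused on the game": [2, 2, 1, 3, 2, 2],
}

# Mean and sample standard deviation per statement, rounded for reporting.
summary = {item: (round(mean(s), 2), round(stdev(s), 2))
           for item, s in responses.items()}
```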
The five custom questions that we appended to the mini PXI included three that related to the perceived educational value of the game, one to its value as a tutorial activity, and one to the transferability of this educational game to a social setting.
In our measures of the game’s educational value, respondents felt that playing the game had helped them to understand both the different ways to recruit people (x̄ = 1.8, stdev = 0.86) and how to recruit to a specific user brief (x̄ = 1.47, stdev = 1.05), and had improved their ability to create a recruitment plan (x̄ = 1.25, stdev = 0.99). These questions relate specifically to the learning goals for the tutorial activity, presented in Section 2.3. Overall, the average for these three questions was 1.5, which is broadly similar to the averages for the validated Functional and Psychosocial constructs. Participants felt strongly that playing the game would be an “effective” use of a tutorial (x̄ = 1.9, stdev = 0.92)—given that the subject’s explicit teaching comprises 12 two-hour lectures and 11 one-hour tutorials, this is a significant share of face-to-face teaching time. Although we did not explicitly measure the effectiveness of learning, these results suggest that playing the game improved students’ broad understanding of the recruitment process and their confidence in applying the in-game concepts to real-world scenarios. They were less certain that they would like to play the game with friends or family (x̄ = 0.9, stdev = 1.43), although we were surprised that this response was as positive as it was, given the specific and educative nature of the game.

3.2. What People Liked About the Game

Fifty-three participants added free-text information about what they liked about the game. Many commented that they found the game “fun” or enjoyable to play, and that they liked the variety of recruitment options (actions) and Participant cards. “I was really surprised by how fun and well-design the game was. It was really fun seeing how my real-world experience in recruiting was reflected in the game—like the expensiveness of recruiting through an agency, and the hit and miss nature of online advertising.” Others enjoyed the element of luck in the game—“It felt kinda like gambling in a fun way” even when it did not always go in their favour. They found the game “easy to follow” and noted that the highly structured turns made it easy to understand what to do during their turn. Several participants commented on enjoying the need to make decisions and think about their actions—“Thinking about the characters in turn and building a story about your project”—and others complimented the game’s design quality.
Free comments echoed these emotions, with several participants reiterating the fun that they had experienced playing the game—“I was pleasantly surprised by how fun it was”. One participant contrasted it with their broader experience of their studies—“Thanks for the fun class, it was a nice change lol”.

3.3. Opportunities to Improve the Game’s Rules and Mechanisms

Forty-five respondents provided suggestions as to how they would improve the game. Several participants responded “no” or even “no, I think it is perfect” to this question; we did not count these responses as suggestions.

3.3.1. Clarify the Rules

Sixteen participants commented on the rules of the game. They felt that elements of the rules could have been clearer, and that a printed rulebook would be preferable to the group teaching and illustrative round, which we presented at the start of the session to introduce the activity. Two participants felt that there was too much luck in the game, in the form of the Participant deck, one felt that there were too many elements to the game, and one that it should be more complex. One commented that it should have been a digital game, but did not provide any supporting information to explain this comment.

3.3.2. Refine and Simplify Scoring

One participant—who identified as a “novice” to games and who played in the session with the graduate researcher who felt that they had “solved” the game—commented that to improve the game we should “Make it clearer than this is just about probability and chance and not about engaging with human-centred design”. We feel that this reflects a misunderstanding of the relationship between a game’s theme and the mechanics or mechanisms that support gameplay: while the mechanisms reflect elements of probability and chance, the theme connects closely to human-centred design. We recognise, however, that this player’s experience was unsatisfactory due to poor luck in card draws and their desire to exactly match the preferred characteristics on the Client card; this reflects their recruitment practices as a qualitative researcher rather than the pragmatic considerations of an industry-based role. Providing more information about the game’s scoring would have alleviated this issue in this session; we reflect further on this issue in Section 4.
Further, ten participants expressed concerns about the budget tracker and twelve about the scoresheet. Although our internal playtest group had liked the paper budget tracker sheet, these participants found it anywhere from “a little unclear” to “very confusing”. Several participants confused or conflated the budget tracker and scoring, and ten raised the complexity of the scoring as an issue.

3.3.3. Reflecting English Proficiency in the Game

The English proficiency attribute was the most problematic, and was discussed at all stages of playtesting. Participants questioned why proficiency was not presented as a spectrum on which a minimum—but not maximum—English proficiency could be specified. We feel that this could be addressed through teaching and by providing relevant and sensitive examples in rules explanation, for example:
Some clients are interested in knowing whether their website or app is easy to understand. They specifically want people who are fairly new to speaking English to make sure that they can understand the content.

3.4. Opportunities to Improve the Game’s Graphic Design

Twelve respondents commented on the graphic design of the game—“The aesthetics of the board were quite black and white and a bit text heavy so might be interesting to make a bit more colourful.” Although we explained that the game was a prototype, participants felt that the simple and functional graphics—particularly on the game board—detracted from their gameplay experience. A redevelopment of the game will focus on enhancing the visual elements and strengthening the connection to the attributes on the Participant cards.

4. Discussion

In this project, we sought to explore the use of a tabletop game for teaching the core Human–Computer Interaction skill of recruiting participants for an evaluation study. This authentic learning approach provides students with the opportunity to explore scenarios typical of the evaluation recruitment process within a game environment. Recruitment Rush is a novel and engaging teaching tool for use in face-to-face teaching, although its value extends beyond the specific HCI context. Our discussion reflects primarily on the value of the game and, more broadly, on the applicability of tabletop play to HCI education. In addition, we discuss the value of the interdisciplinary collaboration on this project, as well as opportunities to extend the game with hybrid tools that streamline the scoring process and to extend the evaluation framework that we used to understand the player experience.
Our study supports prior research that demonstrates the benefit of the use of computer and tabletop games for education (Araiza-Alba et al., 2021; Gennari et al., 2019), and contributes to the limited work to date on games for HCI education (de Souza Lima & Benitti, 2019). It highlights opportunities to use a game to authentically simulate processes that would otherwise be too lengthy or complex to undertake in the context of an HCI subject. Moreover, it suggests further opportunities to use games to strengthen research teaching to other groups including graduate researchers. While we did not explicitly measure learning outcomes, both survey outcomes and open feedback demonstrate that students perceived the game as helpful for learning about different dimensions of recruitment, and as a valuable use of class time.
Recruitment Rush thus delivers against Ney’s relevance model of authenticity in simulation games (Ney et al., 2014) as well as the EALF (Safiena & Goh, 2022), incorporating Herrington’s (2006) nine Authentic Learning Principles, as illustrated in Table 1. The game provides external relevance through its careful design, incorporating the use of realistic client scenarios and clear game objectives, varied participant cards, diverse recruitment strategies, and budgets for studies. These address the EALF requirements for both authentic activities and authentic context “that reflects the way the knowledge will be useful in real life” (Herrington, 2006, p. 1). Recruitment Rush provides gameplay relevance through the sequencing of player actions and use of meaningful in-game interactions with relevant in-game consequences, coaching and scaffolding students in a low-risk environment, although we acknowledge that the prototype game could be more attractive and user-friendly. Finally, the game provides learning relevance in its close alignment with the subject curriculum and learning outcomes, which is further demonstrated through the student feedback. By articulating and explaining their choices and manipulating cards and budget to reflect those choices, students realise and model their decisions and their ramifications in the game setting. This allows students, in play groups of 4–5, to collaboratively construct their knowledge and understanding of recruitment strategies through gameplay, and to reflect on this through subsequent structured and guided discussion. The different client scenarios ensure that the recruitment process is understood through multiple roles and perspectives4. Our own input into the game’s design and the expert review sessions ensure that the game reflects access to experts.
Our first Research Question was whether we could design a game to be played within a one-hour tutorial class, which would successfully mimic the challenges of recruiting participants for studies, within the design constraints we had identified. Recruitment Rush achieves these goals and was successfully implemented in a tutorial class. Additionally, we aimed to understand whether the game made a valuable contribution to our HCI curriculum. Specifically, we aimed to teach students about the benefits and disadvantages of different methods of participant recruitment, the costs of both recruitment and incentivising participants, and the importance of matching participants’ attributes with the client’s ideal user. Although we did not use explicit measures to assess student learning in this area, the very positive responses to our survey showed that students felt that the game had been valuable in teaching them about recruitment. Moreover, comments on the survey showed that the game inspired students to think about recruitment in different contexts.
This reflective response to the game was also seen among the “expert” group of staff and graduate researchers from our HCI research group, who spent over an hour discussing the game after playtesting it. This group highlighted the potential value of the game for teaching about recruitment for research studies to graduate researchers. They felt that there was value not only in playing the game but in discussing the values and philosophies that it embodied, particularly in the scoring system which inadvertently privileged those who recruited a large number of participants, even with poor matches to the target attributes, over those who preferred a close fit to recruitment profiles. As a result of these discussions, we extended the reflective post-gameplay discussion to incorporate these ideas.
Interestingly, during our “expert” playtest session, two colleagues from a non-HCI computing discipline walked past and stopped to ask what it was about. After observing the play, they enthusiastically concurred that it would be valuable for their own graduate researchers to deepen their understanding of recruitment for research studies.
We designed Recruitment Rush as a tabletop boardgame, because we wanted it to be simple to implement in tutorial classes, without the need for potentially complex underlying technical infrastructure to support multiplayer play. It was interesting to us that only one respondent suggested that a digital design would be preferable, and they did not provide any detail or suggestion as to how this might be accomplished. Nevertheless, our observations and reading of the issues raised by participants suggest that there are meaningful opportunities to use hybrid technologies to expedite the use of the game, particularly in the domains of housekeeping and calculating (Rogerson et al., 2021b).
A final reflection, which echoes work by Crocco et al. (2016), is on the value of the interdisciplinary team for this project. Where Crocco et al. offered faculty members a scaffolded approach to game design that combined an introductory workshop on Game-Based Learning with a hands-on game design workshop, prototyping session, and playtesting, our approach paired an HCI academic with some prior game design experience with a working game designer, with further contributions from a specialist student learning team. The first author developed the initial game concept in response to an identified gap in the teaching model and applied for funding to develop it. The second author took that game concept and led development of the game and components, and the third author provided advice and insight on student learning goals and outcomes. Regular meetings and playtests throughout the project ensured that ideas could be raised and challenged within the design team before being playtested with external participants.

Limitations and Future Work

Despite the positive response to Recruitment Rush, we note several key limitations of this work which point to future research opportunities. Firstly, although we developed Recruitment Rush as a tabletop boardgame, we see potential in redeveloping it to leverage hybrid components to expedite play. In particular, we noted that players found scoring to be a lengthy and error-prone process, and we propose a simple, web-based scoring tool to replace the paper scoresheets. This is a simple and light-weight hybrid solution that avoids the costly and complex need for a central multiplayer-equipped server (Rogerson et al., 2021a). Such a scoring tool would expedite the lengthy scoring process, allow teaching staff to collect additional data about game outcomes, and support the development of further refinements to the game’s scoring system.
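As a minimal sketch of the core of such a tool, the snippet below automates a per-player score calculation; the fields and scoring weights are hypothetical placeholders, not the game’s actual (and still-evolving) scoring system.

```python
# Minimal sketch of a scoring helper of the kind proposed above. The scoring
# rules are hypothetical placeholders; the point is automating the
# error-prone paper calculation and recording outcomes for teaching staff.
from dataclasses import dataclass

@dataclass
class PlayerResult:
    name: str
    participants_recruited: int
    attribute_matches: int      # matches to the Client card's target profile
    budget_remaining: int

def score(result: PlayerResult,
          per_participant: int = 2,
          per_match: int = 3,
          per_budget: int = 1) -> int:
    """Hypothetical scoring: reward recruitment volume, profile fit, and thrift."""
    return (result.participants_recruited * per_participant
            + result.attribute_matches * per_match
            + result.budget_remaining * per_budget)

results = [
    PlayerResult("Alice", participants_recruited=6, attribute_matches=2, budget_remaining=1),
    PlayerResult("Bob", participants_recruited=4, attribute_matches=5, budget_remaining=3),
]
ranking = sorted(results, key=score, reverse=True)
```

A tool of this shape would also let tutors adjust the relative weights of participants recruited versus attribute matches, surfacing the quantity-versus-fit tension raised in the Expert Review session.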
Secondly, we note the potential to develop and validate a further set of questions which could be used to extend the PXI instrument. The education-focused questions we used are not validated, yet they provided valuable insights into the student experience. A validated set of education-focused questions would present a valuable extension to the PXI. This would improve the transferability of knowledge across different projects and allow for comparison of different sets of game data. We believe that this is congruent with the PXI’s explicit mention of potential modules that extend it for specific settings (Abeele et al., 2020).
Thirdly, we note limitations relating to the context in which Recruitment Rush was developed and tested for integration with our teaching: our evaluation is limited to a single institution and student cohort, and only 52 of the 87 participating students completed the survey, which may have introduced a form of response bias. Future evaluation could extend to implementing the game at other institutions or in other programs. While we recognise that introducing a game to a classroom may have a novelty effect, we ultimately see that as a positive, applicable to each cohort of students who encounter the game. Future work should examine how to measure the value that the game contributes to student learning.
Finally, we note the tension (discussed in the Expert Review sessions) between the desire to recruit the required number of participants and the desire to recruit participants that fit the target user profile. While this presents a valuable opportunity for discussion and reflection in tutorials, we acknowledge that the score system presented here does not reflect the value of appropriately qualifying and targeting participants. We are undertaking further work to refine and simplify the scoring system and better reflect the value we place on the selection of suitable participants for any study. The alternative scoring approaches outlined in Section 2.2.4 represent an interim solution for this issue.

5. Conclusions

This paper presents the design and evaluation of Recruitment Rush, a boardgame to teach HCI students about recruiting participants for an HCI study. The game was a popular addition to the subject pedagogy. Its high scores against the PXI’s Functional and Psychosocial constructs demonstrate that it was successful as a game and as a teaching activity, although further work is required to accurately measure its contribution to student learning. Moreover, its high scores against custom measures developed for this study demonstrate the perceived educational value of the game and its transferability to social settings. Following the Extended Authentic Learning Framework (Safiena & Goh, 2022), the game presents students with an authentic learning experience through ensuring its external, gameplay, and learning relevance (Ney et al., 2014). The game was developed through a three-stage prototyping and playtesting process, and evaluated using the Player Experience Inventory (PXI), supplemented with custom questions relevant to the learning environment. Evaluation of the game demonstrates its warm reception and perceived efficacy as a teaching tool and its value beyond the specific HCI setting. This study supports the use of custom tabletop games as a teaching tool for authentic learning in HCI classrooms and also points to the potential use of hybrid game technologies to streamline complex calculations during play. It demonstrates the value of simulation-based games that model authentic tasks to support subject-specific learning in higher education.

Author Contributions

Conceptualisation, M.J.R.; methodology, M.J.R.; formal analysis, M.J.R.; investigation, M.J.R. and B.M.; resources, E.K.B.; data curation, M.J.R.; game design—concept, M.J.R.; game design—implementation, B.M.; writing—original draft preparation, M.J.R.; writing—review and editing, B.M. and E.K.B.; supervision, M.J.R. and E.K.B.; project administration, M.J.R.; funding acquisition, M.J.R. and E.K.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by a Teaching and Learning grant from The University of Melbourne, grant number: FlexAp000354.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by University of Melbourne Human Research Ethics Committee, Office of Research Ethics and Integrity, The University of Melbourne (protocol code 2023-27640-44925-4 and date of approval: 12 September 2023).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The datasets presented in this article are not readily available due to institutional Human Research Ethics conditions. The Recruitment Rush game is available from the corresponding author.

Acknowledgments

We are especially grateful to Claudia McHarg, Arzoo Atiq, Wei Zhao and Yushan Xing for early formative feedback, to the students of INFO20004 in 2023 and beyond, and to our colleagues from the HCI group at The University of Melbourne for their feedback on the game.

Conflicts of Interest

The authors declare no conflicts of interest.

Notes

1. We were unable to review this paper in detail as the work appears to only have been published in Portuguese and was beyond the scope of our language abilities; the online translation tools we tried did not produce an understandable translation. Nevertheless, we mention it here for the sake of completeness.
2. A copy of the game—currently as printable files—is available from the first author on request.
3. This person has relevant expertise; she has playtested games rated in Boardgamegeek’s top 100 over more than 15 years, and has played and demonstrated well over 500 different boardgames. Following this design session, she was inspired to enrol in a university game design subject to explore her interest in game design.
4. In a separate class activity, students recruit other students to act as participants in another student group’s evaluation project and themselves act as participants; however, this is not itself part of the Recruitment Rush activity.

References

  1. Abeele, V. V., Spiel, K., Nacke, L., Johnson, D., & Gerling, K. (2020). Development and validation of the player experience inventory: A scale to measure player experiences at the level of functional and psychosocial consequences. International Journal of Human-Computer Studies, 135, 102370. [Google Scholar] [CrossRef]
  2. Acquah, E. O., & Katz, H. T. (2020). Digital game-based L2 learning outcomes for primary through high-school students: A systematic literature review. Computers & Education, 143, 103667. [Google Scholar] [CrossRef]
  3. Adachi, P. J., & Willoughby, T. (2013). More than just fun and games: The longitudinal relationships between strategic video games, self-reported problem solving skills, and academic grades. Journal of Youth and Adolescence, 42, 1041–1052. [Google Scholar] [CrossRef] [PubMed]
  4. Alf, T. (2022). Conditions for successful teaching with simulation games—A systematic literature review. Die Hochschullehre, 8(1), 467–480. [Google Scholar] [CrossRef]
  5. Araiza-Alba, P., Keane, T., Chen, W. S., & Kaufman, J. (2021). Immersive virtual reality as a tool to learn problem-solving skills. Computers & Education, 164, 104121. [Google Scholar] [CrossRef]
  6. Barrie, S., Hughes, C., & Smith, C. (2009). The national graduate attributes project: Integration and assessment of graduate attributes in curriculum. Australian Learning and Teaching Council. Available online: https://www.scirp.org/reference/referencespapers?referenceid=3812230 (accessed on 20 February 2021).
  7. Benitti, F., & Sommariva, L. (2012). Investigando o ensino de IHC no contexto da computação: O que e como é ensinado. In Workshop sobre ensino de ihc (Vol. 967, pp. 33–38). Ceur-ws.org. Available online: https://ceur-ws.org/Vol-967/paper5.pdf (accessed on 16 February 2024).
  8. Benitti, F. B. V., & Sommariva, L. (2015). Evaluation of a game used to teach usability to undergraduate students in computer science. Journal of Usability Studies, 11(1), 21–39. Available online: https://uxpajournal.org/usability-game-computer-science-students/ (accessed on 18 January 2026).
  9. Bernstein, M. S., Ackerman, M. S., Chi, E. H., & Miller, R. C. (2011). The trouble with social computing systems research. In CHI’11 extended abstracts on human factors in computing systems (pp. 389–398). Association for Computing Machinery. [Google Scholar] [CrossRef]
  10. Caine, K. (2016). Local standards for sample size at CHI. In CHI ’16: Proceedings of the 2016 CHI conference on human factors in computing systems (pp. 981–992). Association for Computing Machinery. [Google Scholar] [CrossRef]
  11. Carter, M., Downs, J., Nansen, B., Harrop, M., & Gibbs, M. (2014). Paradigms of games research in HCI: A review of 10 years of research at CHI. In Proceedings of the first ACM SIGCHI annual symposium on computer-human interaction in play (pp. 27–36). Association for Computing Machinery. [Google Scholar] [CrossRef]
  12. Collins, S., & Šabanović, S. (2021). “What does your robot do?” A tabletop role-playing game to support robot design. In 2021 30th IEEE international conference on robot & human interactive communication (RO-MAN) (pp. 1097–1102). IEEE. [Google Scholar] [CrossRef]
  13. Crocco, F., Offenholley, K., & Hernandez, C. (2016). A proof-of-concept study of game-based learning in higher education. Simulation & Gaming, 47(4), 403–422. [Google Scholar] [CrossRef]
  14. de Souza Lima, A. L., & Benitti, F. B. V. (2019). Let’s talk about tools and approaches for teaching HCI. In P. Zaphiris, & A. Ioannou (Eds.), Learning and collaboration technologies. Designing learning experiences (pp. 155–170). Springer International Publishing. [Google Scholar] [CrossRef]
  15. Engelstein, G. (2020). Achievement relocked: Loss aversion and game design. MIT Press. [Google Scholar] [CrossRef]
  16. Engelstein, G., & Shalev, I. (2022). Building blocks of tabletop game design: An encyclopedia of mechanisms. CRC Press. [Google Scholar]
  17. Enstad, K. (2022). Professional knowledge through wargames and exercises. Scandinavian Journal of Military Studies, 5(1), 233–243. [Google Scholar] [CrossRef]
  18. Fadda, D., Pellegrini, M., Vivanet, G., & Zandonella Callegher, C. (2022). Effects of digital games on student motivation in mathematics: A meta-analysis in K-12. Journal of Computer Assisted Learning, 38(1), 304–325. [Google Scholar] [CrossRef]
  19. Farkas, T., Wiseman, S., Cairns, P., & Fiebrink, R. (2020). A grounded analysis of player-described board game immersion. In CHI PLAY ’20: Proceedings of the annual symposium on computer-human interaction in play (pp. 427–437). Association for Computing Machinery. [Google Scholar] [CrossRef]
  20. Fullerton, T. (2014). Game design workshop: A playcentric approach to creating innovative games. CRC Press. [Google Scholar]
  21. Gennari, R., Matera, M., Melonio, A., & Roumelioti, E. (2019). A board-game for co-designing smart nature environments in workshops with children. In A. Malizia, S. Valtolina, A. Morch, A. Serrano, & A. Stratton (Eds.), End-user development: 7th international symposium, IS-EUD 2019, Hatfield, UK, 10–12 July 2019 (pp. 132–148). Springer International Publishing. [Google Scholar] [CrossRef]
  22. Greenberg, S. (1996). Teaching human computer interaction to programmers. Interactions, 3(4), 62–76. [Google Scholar] [CrossRef]
  23. Hagen, A. M. (2022). Learning (better) from stories: Wargames, narratives, and rhetoric in military education. Scandinavian Journal of Military Studies, 5(1), 282–296. [Google Scholar] [CrossRef]
  24. Haggman, A. (2019). Cyber wargaming: Finding, designing, and playing wargames for cyber security education [Doctoral dissertation, Royal Holloway, University of London]. Available online: https://pure.royalholloway.ac.uk/ws/files/33911603/2019haggmanaphd.pdf (accessed on 20 February 2024).
  25. Haider, A., Harteveld, C., Johnson, D., Birk, M. V., Mandryk, R. L., Seif El-Nasr, M., Nacke, L. E., Gerling, K., & Vanden Abeele, V. (2022). miniPXI: Development and validation of an eleven-item measure of the player experience inventory. In Proceedings of the ACM on human-computer interaction (Vol. 6, p. 244). Association for Computing Machinery. [Google Scholar] [CrossRef]
  26. Hainey, T., Connolly, T. M., Boyle, E. A., Wilson, A., & Razak, A. (2016). A systematic literature review of games-based learning empirical evidence in primary education. Computers & Education, 102, 202–223. [Google Scholar] [CrossRef]
  27. Herrington, J. (2006). Authentic e-learning in higher education: Design principles for authentic learning environments and tasks. In elearn: World conference on edtech (pp. 3164–3173). Association for the Advancement of Computing in Education (AACE). Available online: https://www.learntechlib.org/primary/p/24193/ (accessed on 23 October 2025).
  28. Herrington, J., Reeves, T. C., & Oliver, R. (2014). Authentic learning environments. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of research on educational communications and technology (pp. 401–412). Springer. [Google Scholar] [CrossRef]
  29. Hill, J., Walkington, H., & France, D. (2016). Graduate attributes: Implications for higher education practice and policy: Introduction. Journal of Geography in Higher Education, 40(2), 155–163. [Google Scholar] [CrossRef]
  30. Historic Royal Palaces. (2015). Jigsaw cabinet (archived web page). Available online: https://web.archive.org/web/20150613212211/http://www.hrp.org.uk/NewsAndMedia/TheArtFundKewPalacejigsaw_ (accessed on 20 February 2024).
  31. Hunicke, R., LeBlanc, M., & Zubek, R. (2004, July 25–26). MDA: A formal approach to game design and game research. AAAI Workshop on Challenges in Game AI (Vol. 4, p. 1722), San Jose, CA, USA. Available online: https://cdn.aaai.org/Workshops/2004/WS-04-04/WS04-04-001.pdf (accessed on 26 May 2022).
  32. Kara, N. (2021). A systematic review of the use of serious games in science education. Contemporary Educational Technology, 13(2), ep295. [Google Scholar] [CrossRef]
  33. Khaldi, A., Bouzidi, R., & Nader, F. (2023). Gamification of e-learning in higher education: A systematic literature review. Smart Learning Environments, 10(1), 10. [Google Scholar] [CrossRef]
  34. Laine, T. H., & Lindberg, R. S. (2020). Designing engaging games for education: A systematic literature review on game motivators and design principles. IEEE Transactions on Learning Technologies, 13(4), 804–821. [Google Scholar] [CrossRef]
  35. Levin, O., Frei-Landau, R., Flavian, H., & Miller, E. C. (2025). Creating authenticity in simulation-based learning scenarios in teacher education. European Journal of Teacher Education, 48(2), 291–312. [Google Scholar] [CrossRef]
  36. Liberman, K. (2011). The reflexive intelligibility of affairs: Ethnomethodological perspectives on communicating sense. Cahiers Ferdinand de Saussure, 64, 73–99. [Google Scholar]
  37. López-Fernández, D., Gordillo, A., Alarcón, P. P., & Tovar, E. (2021). Comparing traditional teaching and game-based learning using teacher-authored games on computer science education. IEEE Transactions on Education, 64(4), 367–373. [Google Scholar] [CrossRef]
  38. MacDougall, R., & Faden, L. (2016). Simulation literacy: The case for wargames in the history classroom. In P. Harrigan, & M. Kirschenbaum (Eds.), Zones of control: Perspectives on wargaming. MIT Press. [Google Scholar] [CrossRef]
  39. Malouf, D. B. (1988). The effect of instructional computer games on continuing student motivation. The Journal of Special Education, 21(4), 27–38. [Google Scholar] [CrossRef]
  40. McCall, J. (2022). Gaming the past: Using video games to teach secondary history. Taylor & Francis. [Google Scholar]
  41. Meriläinen, M., & Piispanen, M. (2022). The early bird gets the word games and play: Creating a context for Authentic Language learning. International Electronic Journal of Elementary Education, 14(4), 501–507. [Google Scholar] [CrossRef]
  42. Mochocki, M. (2021). Heritage sites and video games: Questions of authenticity and immersion. Games and Culture, 16(8), 951–977. [Google Scholar] [CrossRef]
  43. Müller, H., Sedley, A., & Ferrall-Nunge, E. (2014). Survey research in HCI. In J. Olson, & W. Kellogg (Eds.), Ways of knowing in HCI (pp. 229–266). Springer. [Google Scholar] [CrossRef]
  44. Ney, M., Goncalves, C., & Balacheff, N. (2014). Design heuristics for authentic simulation-based learning games. IEEE Transactions on Learning Technologies, 7(2), 132–141. [Google Scholar] [CrossRef]
  45. Noda, S., Shirotsuki, K., & Nakao, M. (2019). The effectiveness of intervention with board games: A systematic review. BioPsychoSocial Medicine, 13(1), 22. [Google Scholar] [CrossRef]
  46. Norgate, M. (2007). Cutting borders: Dissected maps and the origins of the jigsaw puzzle. The Cartographic Journal, 44(4), 342–350. [Google Scholar] [CrossRef]
  47. North, M. (2016). War games: Simulation vs. virtual machines in cybersecurity education. Issues in Information Systems, 17(4), 120–126. [Google Scholar] [CrossRef]
  48. Papadakis, S. (2020). Evaluating a game-development approach to teach introductory programming concepts in secondary education. International Journal of Technology Enhanced Learning, 12(2), 127–145. [Google Scholar] [CrossRef]
  49. Pasumarthy, N., Tai, Y. L. E., Khot, R. A., & Danaher, J. (2021). Gooey gut trail: Demystifying human gut health through a board game. In Proceedings of the 13th conference on creativity and cognition. Association for Computing Machinery. [Google Scholar] [CrossRef]
  50. Peterson, M., White, J., Mirzaei, M. S., & Wang, Q. (2022). A review of research on the application of digital games in foreign language education. In M. Kruk, & M. Peterson (Eds.), New technological applications for foreign and second language learning and teaching (pp. 1948–1971). IGI Global Scientific Publishing. [Google Scholar] [CrossRef]
  51. Petraglia, J. (1998). Reality by design: The rhetoric and technology of authenticity in education. Routledge. [Google Scholar] [CrossRef]
  52. Plass, J. L., Homer, B. D., & Kinzer, C. K. (2015). Foundations of game-based learning. Educational Psychologist, 50(4), 258–283. [Google Scholar] [CrossRef]
  53. Reynaud, D., & Northcote, M. (2015). The World Wars through tabletop wargaming: An innovative approach to university history teaching. Arts and Humanities in Higher Education, 14(4), 349–367. [Google Scholar] [CrossRef]
  54. Rogers, K., Karaosmanoglu, S., Altmeyer, M., Suarez, A., & Nacke, L. E. (2022). Much realistic, such wow! A systematic literature review of realism in digital games. In CHI ’22: Proceedings of the 2022 CHI conference on human factors in computing systems (pp. 1–21). Association for Computing Machinery. [Google Scholar] [CrossRef]
  55. Rogerson, M. J., Gibbs, M. R., & Smith, W. (2018). Cooperating to compete: The mutuality of cooperation and competition in boardgame play. In Proceedings of the 2018 CHI conference on human factors in computing systems (pp. 1–13). Association for Computing Machinery. [Google Scholar] [CrossRef]
  56. Rogerson, M. J., Sparrow, L. A., & Gibbs, M. R. (2021a). More than a gimmick-digital tools for boardgame play. In Proceedings of the ACM on human-computer interaction (Vol. 5, p. 261). Association for Computing Machinery. [Google Scholar] [CrossRef]
  57. Rogerson, M. J., Sparrow, L. A., & Gibbs, M. R. (2021b). Unpacking “boardgames with apps”: The hybrid digital boardgame model. In CHI ’21: Proceedings of the 2021 CHI conference on human factors in computing systems (pp. 1–17). Association for Computing Machinery. [Google Scholar] [CrossRef]
  58. Roldan, W., Gao, X., Hishikawa, A. M., Ku, T., Li, Z., Zhang, E., Froehlich, J. E., & Yip, J. (2020). Opportunities and challenges in involving users in project-based HCI education. In CHI ’20: Proceedings of the 2020 CHI conference on human factors in computing systems (pp. 1–15). Association for Computing Machinery. [Google Scholar] [CrossRef]
  59. Safiena, S., & Goh, Y. M. (2022). A hazard identification digital simulation game developed based on the extended authentic learning framework. Journal of Engineering Education, 111(3), 642–664. [Google Scholar] [CrossRef]
  60. Santana-Mancilla, P. C., Rodriguez-Ortiz, M. A., Garcia-Ruiz, M. A., Gaytan-Lugo, L. S., Fajardo-Flores, S. B., & Contreras-Castillo, J. (2019). Teaching HCI skills in higher education through game design: A study of students’ perceptions. Informatics, 6(2), 22. [Google Scholar] [CrossRef]
  61. Schell, J. (2015). The art of game design: A book of lenses (2nd ed.). CRC Press. [Google Scholar]
  62. Sidji, M., Smith, W., & Rogerson, M. J. (2023). The hidden rules of hanabi: How humans outperform AI agents. In Proceedings of the 2023 CHI conference on human factors in computing systems (pp. 1–16). Association for Computing Machinery. [Google Scholar] [CrossRef]
  63. Silveira, M. S. (2020). Badges for all: Using gamification to engage HCI students. In Proceedings of the 19th Brazilian symposium on human factors in computing systems. Association for Computing Machinery. [Google Scholar] [CrossRef]
  64. Smith, R. (2010). The long history of gaming in military training. Simulation & Gaming, 41(1), 6–19. [Google Scholar] [CrossRef]
  65. Sobrino-Duque, R., Martínez-Rojo, N., de Gea, J. M. C., López-Jiménez, J. J., Nicolás, J., & Fernández-Alemán, J. L. (2022). Evaluating a gamification proposal for learning usability heuristics: Heureka. International Journal of Human-Computer Studies, 161, 102774. [Google Scholar] [CrossRef]
  66. Spiel, K., Haimson, O. L., & Lottridge, D. (2019). How to do better with gender on surveys: A guide for HCI researchers. Interactions, 26(4), 62–65. [Google Scholar] [CrossRef]
  67. Stirling, E., & Wood, J. (2021). Actual history doesn’t take place: Digital gaming, accuracy and authenticity. Game Studies, 21(1). Available online: http://gamestudies.org/2101/articles/stirling_wood (accessed on 16 September 2025).
  68. Sun, L., Guo, Z., & Hu, L. (2023). Educational games promote the development of students’ computational thinking: A meta-analytic review. Interactive Learning Environments, 31(6), 3476–3490. [Google Scholar] [CrossRef]
  69. Tang, S., Hanneghan, M., & El Rhalibi, A. (2009). Introduction to games-based learning. In Games-based learning advancements for multi-sensory human computer interfaces: Techniques and effective practices (pp. 1–17). IGI Global Scientific Publishing. [Google Scholar]
  70. Taylor, J., Simpson, E., Tran, A.-T., Brubaker, J., Fox, S., & Zhu, H. (2024). Cruising queer HCI on the DL: A literature review of LGBTQ+ people in HCI. In CHI ’24: Proceedings of the 2024 CHI conference on human factors in computing systems (Vol. 507, pp. 1–21). Association for Computing Machinery. [Google Scholar] [CrossRef]
  71. van Roy, R., Deterding, S., & Zaman, B. (2018). Uses and gratifications of initiating use of gamified learning platforms. In CHI EA ’18: Extended abstracts of the 2018 CHI conference on human factors in computing systems (LBW 5656, pp. 1–6). Association for Computing Machinery. [Google Scholar] [CrossRef]
  72. Weisz, J. D., Ashoori, M., & Ashktorab, Z. (2018). Entanglion: A board game for teaching the principles of quantum computing. In CHI Play ’18: Proceedings of the 2018 annual symposium on computer-human interaction in play (pp. 523–534). Association for Computing Machinery. [Google Scholar] [CrossRef]
  73. Woods, S. (2012). Eurogames: The design, culture and play of modern European board games. McFarland & Company. [Google Scholar]
  74. Wyeth, P., Hall, J., Carter, M., Tyack, A., & Altizer, R. (2018). New research perspectives on game design and development education. In CHI PLAY ’18: Proceedings of the 2018 annual symposium on computer-human interaction in play companion extended abstracts (pp. 703–708). Association for Computing Machinery. [Google Scholar] [CrossRef]
  75. Zubek, R. (2020). Elements of game design. MIT Press. [Google Scholar]
  76. Zuo, T., Birk, M. V., Van der Spek, E. D., & Hu, J. (2020). Exploring fantasy play in mathmythos AR. In CHI PLAY ’20: Extended abstracts of the 2020 annual symposium on computer-human interaction in play (pp. 413–417). Association for Computing Machinery. [Google Scholar] [CrossRef]
Figure 1. The Client cards for “Readr, A dating app for people who love reading” and “Doggy DIY, an app with plans for pet houses, ramps, beds, etc., that scale to the size of your pet”. The cards show the desired study size and starting budget, together with three desired attributes: personal trait (purple), socio-economic trait (blue) and personal interest (green).
Figure 2. Example Participant cards for Costa, Alex, and Zhu Li. Each participant card has an incentive cost, three personal attributes (purple), three socioeconomic attributes (blue) and three personal interests (green).
Figure 3. The scoresheet used in tutorial classes.
Figure 4. The three stages of playtesting, showing playtest focus and participant numbers. The final row (Production playtests) shows where the survey responses were collected.
Figure 5. “Bookish” image from the concept playtest. The Budget is 50 and Study Size is 12. The study requires IELTS 6, Middle Income, Books. The player has recruited two participants who are both Middle Income; however, neither has IELTS 6 or Books as a hobby. Participant cards only show two hobbies at this stage.
Figure 6. “Grandma’s Hands” image from the concept playtest. The Budget is 50 and Study Size is 12. The study requires Young, IELTS 6, Crafts. The player has recruited five participants who are all Young. None of the visible cards has IELTS 6; two have Crafts as a hobby.
Figure 7. Production playtest in a tutorial class. There are four players at the table; one is recruiting a participant from the Agency by drawing the card from the tableau.
Table 1. Elements of models of authentic learning and simulation game design.
| Relevance Model of Authenticity in Simulation Games (Ney et al., 2014) | EALF (Safiena & Goh, 2022) | Authentic Learning Principles (Herrington, 2006) |
|---|---|---|
| External Relevance | Authenticity | Authentic context; authentic activities; authentic assessment |
| Gameplay Relevance | Game objectives; user-friendliness; game interaction | |
| Learning Relevance | Group work | Multiple roles and perspectives; collaboration; articulation |
| | Guidance | Access to experts; reflection; coaching and scaffolding |
Table 2. Recruitment actions which the player can choose, their costs, and effects.
| Action Type | Cost | Effect |
|---|---|---|
| Ask Around | 0 | Draw 3 participants and Recruit one of them. |
| Local Advertising | 3 | Name one interest attribute (green), Draw 5 participants and Recruit any with that attribute. |
| Online Advertising | 5 | Name any 3 attributes, Draw 8 participants and Recruit any with at least 2 of the named attributes. |
| Use the Agency | (face value) | Recruit any participants from the (face up) “Agency” display, paying the cost shown on the cards. |
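For readers who want to model the rules, the three fixed-cost actions in Table 2 can be captured in a small data structure; the following is a minimal sketch in Python, where the class and field names are illustrative assumptions rather than part of the published game (the Agency is omitted because its cost is each card's printed face value rather than a flat fee):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Action:
    """One recruitment action from Table 2."""
    cost: int              # budget spent to take the action
    draw: int              # Participant cards drawn from the deck
    named_attributes: int  # attributes the player must name before drawing

# Hypothetical encoding of the fixed-cost actions in Table 2
ACTIONS = {
    "Ask Around":         Action(cost=0, draw=3, named_attributes=0),
    "Local Advertising":  Action(cost=3, draw=5, named_attributes=1),
    "Online Advertising": Action(cost=5, draw=8, named_attributes=3),
}
```

Encoded this way, the cost/draw trade-off at the heart of the game is visible at a glance: paying more lets the player see more cards, but only if they commit to named attributes in advance.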
Table 3. Participant attributes.
| Attribute Type | Classification | Options |
|---|---|---|
| Personal (purple) | Pronouns | She/Her, They/Them, He/Him |
| | Age | Young, Adult, Mature |
| | Household status | Single, Partnered |
| Socio-economic (blue) | Profession | Domestic, Essential, Professional, Retired, Trade, Student |
| | Income | Low, Middle, High, Wealthy |
| | English proficiency | Beginner, Competent, Proficient |
| Hobbies (green) | | Books, Crafts, Drink, Fitness, Food, Games, Music, Pets, Plants, Sport, Tech, Travel |
Table 4. Likert scale response statements and source.
| Statement | Construct |
|---|---|
| I liked the look and feel of the game | Functional: Audiovisual Appeal |
| The game was not too easy and not too hard to play | Functional: Challenge |
| It was easy to know how to perform actions in the game | Functional: Ease of Control |
| The goals of the game were clear to me | Functional: Clarity of Goals |
| The game gave clear feedback on my progress towards the goals | Functional: Progress Feedback |
| I felt free to play the game in my own way | Psychosocial: Autonomy |
| I wanted to explore how the game evolved | Psychosocial: Curiosity |
| I was fully focused on the game | Psychosocial: Immersion |
| I felt I was good at playing this | Psychosocial: Mastery |
| Playing the game was meaningful to me | Psychosocial: Meaning |
| I had a good time playing this game | Enjoyment |
| This game helped me to understand the different ways to recruit people. | (unvalidated): Educational value |
| I feel more confident about my ability to create a recruitment plan after playing this game. | (unvalidated): Educational value |
| This game helped me to understand how to recruit to a specific user brief. | (unvalidated): Educational value |
| This was/would be an effective use of a tutorial. | (unvalidated): Effectiveness |
| I would like to play this game with my friends or family. | (unvalidated): Transferability |
Table 5. Survey results, shown as mean, median, standard deviation, and inter-quartile range.
| Construct | Mean | Median | SD | IQR |
|---|---|---|---|---|
| PXI: Functional Construct | | | | |
| Audiovisual Appeal | 1.23 | 1.5 | 1.29 | (1, 2) |
| Challenge | 1.52 | 2 | 1.27 | (1, 2) |
| Ease of Control | 1.63 | 2 | 1.15 | (1, 2) |
| Clarity of Goals | 1.67 | 2 | 1.14 | (1, 2) |
| Progress Feedback | 1.18 | 2 | 1.24 | (1, 2) |
| PXI: Psychosocial Construct | | | | |
| Autonomy | 1.27 | 2 | 1.35 | (0, 2) |
| Curiosity | 1.53 | 2 | 1.07 | (1, 2) |
| Immersion | 1.97 | 2 | 0.76 | (2, 2) |
| Mastery | 1.13 | 1 | 1.37 | (0, 2) |
| Meaning | 1.37 | 2 | 0.96 | (1, 2) |
| PXI: Enjoyment | | | | |
| Enjoyment | 2.13 | 2 | 0.77 | (1, 2) |
| Unvalidated survey items | | | | |
| Different ways to recruit | 1.8 | 2 | 0.86 | (1, 2) |
| Create a recruitment plan | 1.25 | 1 | 0.99 | (1, 2) |
| Recruit to a specific user brief | 1.47 | 2 | 1.05 | (1, 2) |
| Effectiveness in tutorial | 1.9 | 2 | 0.92 | (1, 2) |
| Transferability to social setting | 0.9 | 1 | 1.43 | (0, 2) |
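For readers reproducing the summary statistics reported in Table 5, the following is a minimal sketch, using only Python's standard library, of how mean, median, sample standard deviation, and inter-quartile range can be computed from raw Likert responses. The function name and the sample data are hypothetical; they do not come from the study's actual dataset.

```python
import statistics

def summarise(responses):
    """Return mean, median, sample SD, and IQR (25th and 75th percentiles)
    for a list of numerically coded Likert responses."""
    q = statistics.quantiles(responses, n=4, method="inclusive")
    return {
        "mean": round(statistics.mean(responses), 2),
        "median": statistics.median(responses),
        "sd": round(statistics.stdev(responses), 2),
        "iqr": (q[0], q[2]),  # 1st and 3rd quartiles
    }

# Hypothetical responses coded on an agreement scale
sample = [2, 2, 1, 3, 0, 2, 1, 2, -1, 2]
print(summarise(sample))
# → {'mean': 1.4, 'median': 2.0, 'sd': 1.17, 'iqr': (1.0, 2.0)}
```

Note that `method="inclusive"` treats the data as the whole population of responses when interpolating quartiles; other quantile conventions can shift the IQR endpoints slightly for small samples.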
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Rogerson, M.J.; McKenzie, B.; Bone, E.K. Recruitment Rush: A Boardgame to Teach Students About Recruiting Participants for a User Experience Study. Educ. Sci. 2026, 16, 282. https://doi.org/10.3390/educsci16020282

