Article

Strategies for Increasing Youth Participation in Longitudinal Survey Research: Lessons from a Pilot Study

by Valentina Castillo Cifuentes *, Ana Ferrer, Mike Ronchka, Ilona Dougherty, Amelia Clarke, Sana Khaliq, Eki Okungbowa, Ian Korovinsky and Mishika Khurana
School of Environment, Enterprise and Development, Faculty of Environment, University of Waterloo, Waterloo, ON N2L 3G1, Canada
* Author to whom correspondence should be addressed.
World 2025, 6(2), 73; https://doi.org/10.3390/world6020073
Submission received: 11 March 2025 / Revised: 25 April 2025 / Accepted: 19 May 2025 / Published: 1 June 2025

Abstract: The pilot phase of a research study is essential for refining methodological and theoretical aspects before a full-scale launch. Using participatory action research with youth and sector partners, this study tested the design and implementation of a longitudinal research project, focusing on four key areas: recruitment strategies, survey design, incentive strategies, and participant engagement and retention. The study compared two recruitment messages, assessed survey clarity and completion rates, tested financial and non-financial incentives, and evaluated participants’ willingness to share contact information and LinkedIn profiles. Data were collected through surveys (n = 91) and focus groups (n = 11) with young people aged 15–29 from across Canada who completed an RBC Future Launch-funded program. Findings indicated that branding and messaging in recruitment emails influenced response rates. Despite concerns about survey length, 97% of participants completed it, with most finishing within 15 min. Among the incentives offered, a CAD 10 payment resulted in the highest response rate. Additionally, both the CAD 10 incentive and the LinkedIn Learning licenses increased participants’ willingness to share LinkedIn profiles. The pilot study provided valuable insights into optimizing recruitment, survey design, and incentive structures for a longitudinal study. These findings offer guidance for improving participant engagement and retention in research studies, as well as for a co-creation approach to research design.

1. Introduction

The pilot phase of a research study plays a critical role in building a stronger research design before launching a main study, as both methodological and theoretical aspects of research are systematically tested [1]. Pilot studies also help determine the scope, direction and feasibility of research [2], which is particularly relevant for longitudinal studies, as one of the main challenges associated with their extended duration is retaining participants over the years [3]. Therefore, testing different strategies to ensure that participants are effectively reached and engaged in the study is essential.

1.1. Theoretical Framework

This study is guided by the Fogg Behavior Model (FBM) [4], which states that behavior occurs when motivation, ability and a trigger are present simultaneously. In this model, motivation refers to the internal drive to take action, ability refers to how easy or difficult it is to perform the action, and the trigger is the prompt that initiates the behavior. Applied to youth participation in research studies, high engagement is more likely to happen when young people are motivated (e.g., through meaningful or guaranteed incentives), when the process is easy (e.g., an accessible survey), and when they receive a strong and timely prompt (e.g., a youth-friendly email invitation). If any of these three elements are missing, the behavior, in this case participating in a research study, is less likely to occur.

1.1.1. Recruitment Messages

Recruitment emails serve as the trigger in the FBM, acting as the external prompt that encourages youth to participate in the study. Research emphasizes that effective recruitment of young participants via email requires clear, engaging and memorable communication [4]. Research highlights the importance of ensuring consistency in the use of logos, fonts, colors, images, and messaging across all study materials [5]. The social marketing literature emphasizes that specific groups are often reached by customizing messages to align with the shared needs of a particular population [6]. Drawing on these findings, by ensuring clear, consistent, and tailored messaging that resonates with the specific needs of young participants, email recruitment efforts can become more engaging and effective.

1.1.2. Survey Design, Completion Time and Rates

In FBM, survey design influences ability, referring to how easy or difficult the task is to perform. If a task is too complex or time-consuming, even motivated participants with a clear prompt may fail to act [4,7]. Research shows that the ideal survey length is between 10 and 20 min [7,8,9,10]. After the 20 min mark, participants will likely stop responding to a survey, which is known as participant retention loss or survey fatigue [7]. A study conducted about survey completion rates for online surveys indicates that longer surveys generally result in lower completion rates [8]. In a study involving graduate students in education-related fields, 91.1% reported that they would be willing to answer a survey if it took less than 15 min to complete [9]. Therefore, limiting survey duration to the 10 to 20 min range is essential for maximizing both retention and completion rates.

1.1.3. Incentives and Survey Responses

Incentives represent the motivation component of FBM, as they offer participants a reason to act. Although offering incentives to young people for participating in research studies is currently a common practice, research is still lacking on whether incentives are mainly intended to compensate participants for their time or merely to encourage their participation in research studies [11]. In a literature review conducted by [11], the authors identified several studies indicating that a wage-payment model is the most appropriate form of compensating research participation among adolescents and older children [12,13,14]. Yet there are challenges in determining the appropriate amount of compensation [11,15].
Another common approach to incentivizing participation is through cash draws [16]. A study conducted in Scotland, however, found no difference in response rates between participants offered a GBP 500 cash draw and participants who were offered no incentives [3]. Similarly, a group offered 1 of 25 gift cards valued at GBP 20 had a similar response rate (25%) to the group offered no incentive, suggesting that the size of the cash draw or lottery does not necessarily influence participation [3]. The same study indicated that what guaranteed participation was a GBP 10 gift certificate [3]. Overall, research supports the conclusion that incentives, particularly guaranteed rewards, increase response rates for both cross-sectional and longitudinal studies [16].

1.1.4. Use of Social Media for Participant Recruitment and Engagement

Researchers are increasingly exploring ways to integrate the use of social media platforms into identifying and recruiting research participants [17]. Even though there is limited research on how effective this is, several studies in the clinical trial research field show the potential of using social media platforms for recruitment [18,19,20].
A persistent challenge in longitudinal studies is participant attrition due to changes in their contact information over time [18]. To limit this, it is recommended that researchers maintain several ways to communicate with participants [21]. A study that recruited 2SLGBTQIA+ participants through social media found that using platforms like Instagram combined with strong retention strategies, such as incentives and sustained positive relationships with participants, helped them achieve low attrition rates [18]. These findings underline the effectiveness of social media as a dual-purpose tool for both recruitment and sustained participant engagement. While not directly linked to immediate study behavior, the use of social media was explored in the pilot study as a future channel for sustaining participants’ motivation and ability over the course of a longitudinal study.
This paper presents the findings from a pilot study developed by the Youth & Innovation Project from the University of Waterloo in Canada. The aim of the study was to test the research design and implementation of a longitudinal study prior to its full-scale launch. Drawing on behavioral theory and social marketing literature, this paper analyzes how recruitment messaging, incentive models, and survey co-design increase youth participation in research studies.
The pilot study focused on four objectives: (1) It tested the effectiveness of two recruitment messages, comparing different strategies to determine which approach was most successful in encouraging participation. (2) It collaboratively designed the survey instrument to ensure clarity, optimize length, and enhance the overall user experience to improve response and completion rates. (3) It examined the effectiveness of different incentives, comparing financial and non-financial options to determine which led to the highest response rates. (4) It tested participants’ willingness to share their contact information and LinkedIn profile URLs, exploring alternative ways to maintain engagement over the multi-year longitudinal study.

1.2. Study Context

Young people (age 15–30) in Canada face challenges when entering and staying in the job market as they are more likely to work part-time, in entry-level positions, or in jobs that pay minimum wage [22]. Some groups of young people also face additional barriers that affect their inclusion in the labor market, such as housing and food insecurity, racism, caregiving responsibilities, and physical and mental disabilities [23,24]. Indigenous youth also experience the long-lasting effects of colonization and systemic discrimination [23]. In response, the RBC Foundation committed in 2017 to invest CAD 500 million over the course of a decade to support organizations developing initiatives and programs aimed at preparing young people ages 15 to 30 for the future of work.

1.2.1. Introduction to the Longitudinal Study

To measure the impact of this investment, the Youth & Innovation Project at the University of Waterloo in partnership with the RBC Foundation developed the RBC Young People & Economic Inclusion Longitudinal Study. This social impact measurement [25] is a six-year study (2022–2028) aiming to (1) understand and measure the outcomes of youth who have taken part in an RBC-funded program and (2) to determine how these outcomes vary over time. As of July 2024, over 25,000 young participants opted to participate in the study, highlighting both the potential and challenges associated with large-scale longitudinal research.
Longitudinal studies present unique challenges, including participant retention over time, changes in contact information, engagement drop-off and survey fatigue [3]. Budget constraints further complicate the sustainability of a six-year study. To address these complexities, a pilot study was conducted prior to the full launch of the longitudinal study. The pilot aimed to refine the research design, logistics and implementation, while testing engagement strategies to optimize participant retention and data quality.

1.2.2. Participatory Action Research (PAR) Design

The design of the longitudinal study was developed in collaboration with (1) the Youth & Innovation Project’s Youth Advisory Council (YAC), composed of nine young people from across Canada who advise the Youth & Innovation Project on its work, and (2) the RBC Partner Advisory Council (PAC), composed of 15 to 20 representatives of RBC-funded partners, ensuring that the perspectives of youth-serving organizations were incorporated into the design and implementation of the longitudinal study. PAR practices were followed; PAR is a collaborative process that involves young people throughout the research through youth–adult partnerships [26]. PAR with youth not only centers youth voices in research but also provides value to young people, such as leadership development and networking opportunities from participating in YAC meetings with other youth leaders and experienced professionals, along with academic, career, social, interpersonal, and cognitive outcomes through the YAC feedback sessions, where members advise on the research being developed by the Youth & Innovation Project [27].
The YAC is engaged in the Youth & Innovation Project through monthly feedback sessions, where they provide advice on research projects and participate in training sessions on topics they previously identified as needed for their personal and professional development. Two weeks before the March 2022 session, the YAC received a package containing email template scripts for inviting pilot study participants, ethics information letters for the survey and focus groups, and survey and focus group questionnaires. They were specifically asked to provide feedback on the messaging used for participant invitations, as well as on the survey and focus group questionnaires. Once the survey questions were added into Qualtrics, the same platform used to conduct the survey, the YAC was sent a link to test the survey and provide feedback on the user experience. Its members’ feedback primarily focused on ensuring that the email invitations, survey, and focus group questions were accessible and youth-friendly. They also provided advice on the potential incentives for participants.
Similarly, the PAC is involved with the Youth & Innovation Project to provide advice on research projects related to the funding from the RBC Foundation for youth employment programs. PAC members were engaged on two occasions. First, they received a package containing the email invitation scripts and the survey to provide feedback on the messaging and questions. Two meetings were held in November 2021 with different groups of the PAC to give them an opportunity to share their feedback. A second meeting was held with all PAC members in February 2022 to discuss the methodology of the longitudinal study, including the potential incentives for participants.

1.2.3. Recruitment Messages

The first set of messages, recommended by the University of Waterloo’s Ethics Board, used academic language and in-depth explanations of the study. The incentive offered to participants for answering the survey was placed near the end of the email. These emails were text-heavy, did not contain any branding from the University of Waterloo, the Youth & Innovation Project, or the RBC Foundation, and did not include an official signature from any of the Youth & Innovation Project team members. This formal, academic set of messages included an invitation to participate in the survey followed by two reminders sent one week apart. These emails were sent through Qualtrics from a generic no-reply email address to 400 participants.
The second set was created by a marketing consultant who developed the emails using a youth-friendly approach, which included a visual design concept with the branding of the Youth & Innovation Project, the University of Waterloo, and the RBC Foundation. As such, the message shared with participants included an explanation of why they were being invited to the study, the importance of their participation, a clear call to action and a sense of urgency at the beginning of the email. The messaging emphasized that participants would be making a difference by sharing their perspectives to support other young people like themselves by answering the survey, but they would also be benefiting by receiving an incentive. This incentive message was placed earlier in the email. At the end of the email, the signature of the Managing Director of the Youth & Innovation Project was included.
A pre-invitation email informing the other 400 participants that the survey was coming was sent first; then, a survey invitation email with the survey link was sent, followed by three reminder emails. This was carried out over a period of five weeks, with reminder emails sent at one-week intervals after the survey invitation. These emails were sent through Mailchimp as, unlike Qualtrics, this platform allows for branded emails and is designed to prevent bulk emails from being marked as junk mail. The email was sent from the Youth & Innovation Project organizational email address instead of a generic no-reply address.

1.2.4. Survey Completion Time and Rates

The longitudinal study survey had 65 questions and was estimated to take 20–25 min to complete. The completion time varies based on participants’ responses (e.g., participants who are not currently employed do not receive employment-related questions). Given the longitudinal study survey’s length and existing research on online survey completion time and rates, we anticipated a low completion rate. However, since all the questions were essential to the research, it was decided to include them all and test them in the pilot study. If the results showed low completion rates, the survey would be adjusted for the longitudinal study. To mitigate potential drop-off, the most important questions were placed at the beginning and the least important questions at the end. This way, if participants did not complete the survey, critical information would still be captured and included in the data analysis.
The YAC played a crucial role in designing the survey questions. YAC members ensured that the survey was accessible and easy to understand by using youth-friendly language. They also recommended including an email contact for the Youth & Innovation Project throughout the survey platform so participants could easily reach out with any questions. Additionally, they provided guidance on the font, text size, color scheme, and layout of the survey including the number of questions per page to prevent participant fatigue. All of the YAC’s recommendations were implemented, and a new version of the survey was tested with them to ensure it aligned with their feedback.

1.2.5. Incentives and Survey Responses

One of the core values at the Youth & Innovation Project is engaging young people as equal and active contributors, ensuring they benefit from their involvement, including by receiving compensation for their contributions. Additionally, the goal of achieving higher response rates influenced the decision to offer incentives. Therefore, it was decided that compensating participants was essential for the longitudinal study, aligning with [28], as they emphasize that participants’ time and contributions should be valued.
During discussions on choosing incentives for the longitudinal study, the YAC and the PAC agreed that every participant should be compensated for the time they spent responding to the survey. This method of compensating participants is known as a wage-payment model, in which participants are paid as if the survey were a job.
When discussing the method of distributing incentives, the YAC recommended sending e-transfers rather than offering gift cards to participants, despite the additional labor involved. This includes setting up a specific bank account, manually sending e-transfers to hundreds of participants, determining if passwords are needed or if participants have automatic deposits, and tracking whether they are accepted, among other steps needed. The YAC reasoned that a monetary payment via an e-transfer would be more meaningful, as it reflects valuing participants’ time, and they did not want the Youth & Innovation Project to be perceived by research participants as affiliated with brands like Amazon, Starbucks or Visa, which are commonly used for gift cards. Similar advice was recommended by the University of Waterloo’s Office of Research and the Work-Learn Institute.
Based on the recommendations of both councils, it was determined that the wage-payment model compensation should be based on living wage rates in Canada. The living wage in the province of Ontario at the time was CAD 20 per hour [29] and it was estimated that the survey would take 20–25 min to answer; therefore, it was recommended that participants receive CAD 10 for completing the survey. The average living wage in Canada was not available at the time the research design was developed, and therefore the Ontario living wage was used as a proxy, as it has one of the highest living wages in Canada [29]. However, as in any research study, this longitudinal study faced budget constraints. After the completion of the pilot study, it was unclear whether funding would be available to implement and sustain the wage-payment model for the entire study. Therefore, it was essential to explore less-costly incentive options.
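The CAD 10 figure follows from simple pro-rating of the living wage over the survey’s estimated duration; a minimal sketch of the arithmetic (variable names are illustrative):

```python
LIVING_WAGE_CAD_PER_HOUR = 20  # Ontario living wage at the time [29]
survey_minutes = (20, 25)      # estimated survey completion time range

# Pro-rate the hourly wage over the survey's estimated duration
pay_range_cad = [LIVING_WAGE_CAD_PER_HOUR * m / 60 for m in survey_minutes]
# roughly CAD 6.67 to CAD 8.33, rounded up to the CAD 10 payment
```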
It was calculated that the Youth & Innovation Project would offer a CAD 50 cash draw for each of the 12 cohorts of the longitudinal study (the budget calculation was 12 cohorts per year, for 6 years over three survey responses); however, the PAC recommended offering CAD 150 cash draws, as it was likely that participants would be more interested in a higher amount for a cash draw. Despite this belief, a research study in Scotland [3] found no difference in response rates between participants offered a GBP 500 gift certificate and participants who were offered no incentives. Additionally, a group offered 1 of 25 gift cards valued at GBP 20 had a similar response rate (25%) to the group offered no incentive, suggesting that the size of the cash draw or lottery might not actually make a difference in response rates [3]. However, the same study indicated that what guaranteed participation was a GBP 10 gift certificate [3]. Table 1 shows a summary of the different incentive strategies and their rationale.
Additionally, the RBC Foundation also offered to provide LinkedIn Learning licenses as an incentive valued at CAD 324 per year to all participants. LinkedIn Learning is an online educational platform that helps individuals develop skills and access certifications including business, technology, creative skills and personal development, through industry expert-led videos [30]. Since the longitudinal study is employment-related, it was decided by the PAC and the Youth & Innovation Project’s research staff that offering a LinkedIn Learning license would be a meaningful incentive in this context as it would give participants access to an educational employment-related platform.
Given the recommendations by the YAC, PAC, the University of Waterloo’s Office of Research, and the Work-Learn Institute, as well as the LinkedIn Learning licenses offered by the RBC Foundation, four incentive options were tested in the pilot study to assess their impact on response rates: a CAD 10 e-transfer to each participant for completing the survey, a chance to win one of two CAD 50 cash draws, a chance to win one of two CAD 150 cash draws, and a LinkedIn Learning license. The results from this test would later inform the longitudinal study.

1.2.6. Use of LinkedIn

For the longitudinal study, which focuses on youth employment, the Youth & Innovation Project’s research team decided to request LinkedIn profile URLs from participants as a second method of contact. Since LinkedIn is an employment-oriented platform, it provides a suitable way to reach participants for employment-focused research. However, the research team was unsure whether young people aged 15–29 actively use LinkedIn or would be willing to share their profile URLs. To assess this, they decided to request LinkedIn profile URLs in the pilot study.
Building on the FBM [4] and the context of this research, this pilot study tested four strategies to increase youth participation. The following hypotheses were examined:
Hypothesis 1 (H1).
(Trigger): Participants who receive visually branded, youth-friendly recruitment emails will have a higher response rate than those who receive academic-style messages.
Hypothesis 2 (H2).
(Motivation): Participants offered a guaranteed monetary incentive (CAD 10 e-transfer) will have a higher response rate than those offered randomized cash-draw incentives or non-monetary rewards (e.g., LinkedIn Learning licenses).
Hypothesis 3 (H3).
(Future motivation and ability): Participants who receive guaranteed incentives (CAD 10 e-transfer or LinkedIn Learning licenses) will be more likely to share their LinkedIn profile URLs than those offered chance-based incentives (CAD 50 and CAD 150 cash draws).

2. Materials and Methods

2.1. Participant Selection

A post-program evaluation survey is completed by youth at the end of their participation in an RBC-funded program. The focus of this survey is to understand the skills and knowledge that young people gained while participating in these programs. At the end of this survey, there is an opt-in question where participants can opt into the longitudinal study. From the 7004 participants that opted in to the longitudinal study from May 2021 to April 2022, 800 participants were randomly selected to participate in the pilot study to ensure a manageable and representative sample to test the survey instrument and engagement strategies.
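A random draw of this kind can be sketched with Python’s standard library; the identifiers below are illustrative stand-ins, not the study’s actual data:

```python
import random

# Illustrative pool standing in for the 7004 opted-in participants
opted_in_ids = [f"participant_{i}" for i in range(7004)]

rng = random.Random(42)  # fixed seed so the draw is reproducible
pilot_sample = rng.sample(opted_in_ids, k=800)  # sample without replacement
```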

2.2. Data Collection

2.2.1. Survey

The pilot study was approved by the University of Waterloo’s Ethics Board on 26 May 2022.
The survey data collection for the pilot study began on 15 July 2022 and concluded on 26 September 2022. Participants were surveyed using the survey software Qualtrics 2024. Consent forms were integrated into Qualtrics and completed by participants before they filled out the survey. Parental/guardian consent for underage participants was not required, as the Tri-Council Policy Statement 2 (TCPS 2) does not set a minimum age of consent for research. Instead, participants were informed about the purpose of the research, as well as its risks and potential benefits, and that they had the capacity to decide for themselves whether or not to participate in the research [31].

2.2.2. Focus Group

Two focus groups were hosted with pilot study participants who either fully or partially completed a survey, with the purpose of gathering their feedback on the different recruitment materials, survey instruments, and incentives. At the beginning of the survey, participants were asked if they were interested in participating in one of the two focus groups. From the 38 participants who expressed interest, 20 were selected across the two focus groups based on demographic characteristics and the extent to which they had completed the survey.
Two focus groups were held: five participants attended the first, and six attended the second. Participants were provided with a CAD 20 cash incentive for their participation. The focus groups lasted an hour and were held over Zoom. Participants provided their consent prior to attending. The meetings were hosted by the Youth & Innovation Project’s staff and recorded for transcription purposes only.

2.3. Data Analysis

2.3.1. Recruitment Messages

To analyze participants’ opinions of the recruitment messages, content analysis was conducted on the focus group transcripts; participants’ opinions on the incentives, the survey, and the use of LinkedIn were analyzed in the same way.

2.3.2. Survey Completion Time and Rates

To determine the completion time, the metrics that Qualtrics provides were used, and the time participants took to respond to the survey was measured in seconds. Qualtrics calculates the entire duration of the response, which means that if a respondent stops in the middle of the survey, closes the browser, and comes back another day, that time in between is counted. For the completion rate, the progress metric that Qualtrics provides was used, which gives the exact percentage of survey questions each participant answered.
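Both metrics can be reproduced from exported response data; the sketch below assumes each response records a total duration in seconds and a count of answered questions (the field names are hypothetical, not Qualtrics’ actual export schema):

```python
TOTAL_QUESTIONS = 65  # the pilot survey's question count

# Hypothetical exported responses
responses = [
    {"duration_s": 540,   "answered": 65},  # finished in one sitting
    {"duration_s": 920,   "answered": 65},
    {"duration_s": 90000, "answered": 40},  # returned a day later, partial
]

for r in responses:
    r["duration_min"] = r["duration_s"] / 60                   # completion time
    r["progress_pct"] = 100 * r["answered"] / TOTAL_QUESTIONS  # completion rate
```

Because the exported duration spans the whole response window, multi-day responses inflate the average completion time unless they are excluded, as discussed in the Results.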

2.3.3. Incentives and Survey Responses

The 800 participants selected for the pilot study were randomly assigned to the four different incentive options, forming four groups of 200 participants each. The incentive options were CAD 10 cash for completing the survey, the chance to win one of two CAD 50 cash draws, the chance to win one of two CAD 150 cash draws, and a LinkedIn Learning license for completing the survey. Response rates were calculated by dividing the number of people who responded to the survey by the total number of people invited to participate in the survey. An overall response rate was calculated as well as separate rates for each incentive group. It is important to note that the response rates include individuals who did not complete the survey.
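The response-rate calculation can be sketched as follows; the per-group counts are illustrative, chosen to be consistent with the rates reported in Section 3 (the LinkedIn group’s count is inferred by subtraction from the overall total of 91 respondents):

```python
# (invited, responded) per incentive group; responses include partial completions
groups = {
    "CAD 10 e-transfer":         {"invited": 200, "responded": 34},
    "CAD 50 cash draw":          {"invited": 200, "responded": 19},
    "CAD 150 cash draw":         {"invited": 200, "responded": 16},
    "LinkedIn Learning license": {"invited": 200, "responded": 22},
}

def response_rate(responded, invited):
    """Respondents (complete or partial) divided by invitations, as a percentage."""
    return 100 * responded / invited

per_group = {name: response_rate(g["responded"], g["invited"])
             for name, g in groups.items()}
overall = response_rate(sum(g["responded"] for g in groups.values()),
                        sum(g["invited"] for g in groups.values()))
# overall is 11.375, matching the reported 11.38% rate
```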

2.3.4. Use of LinkedIn

At the end of the survey, participants were asked if they would be willing to share their LinkedIn profile URL to stay in touch regarding the longitudinal study. The LinkedIn profile URL itself was not collected; rather, participants could only answer “Yes” or “No” depending on their willingness to share their LinkedIn profile URL. This question was included in a separate survey within the main survey; therefore, survey answers and participants’ willingness to share their LinkedIn profiles were not linked. Participants were informed of the intended method for collecting LinkedIn profile URLs for the longitudinal study. A question in the focus group was also asked regarding the participants’ use of LinkedIn.

3. Results

3.1. Recruitment Messages

When feedback was gathered from focus group participants about the formal and academic set of messages, recommended by the University of Waterloo’s Ethics Board, they noted that while the email invitation was easy to read, it was text-heavy. Some even mistook it for a phishing email, especially since it had been so long since they had participated in an RBC-funded program. They suggested that incorporating a digital text signature to show authenticity would make them feel that the email was coming from a real person. They also suggested that including logos from the University of Waterloo and the RBC Foundation would also make them feel more comfortable responding to the survey. All these suggestions were incorporated in the youth-friendly version of the recruitment emails, which included branding and marketing strategies.
The response rate of the group receiving the first message was 16%. Contrary to expectations, the response rate for the second message, which included branded marketing strategies, was only 6.75%, much lower than anticipated. These results do not support H1, which proposed that participants who receive visually branded, youth-friendly recruitment emails would have a higher response rate than those who receive formal, academic-style messages. Discussions among the Youth & Innovation Project’s team and the University of Waterloo’s Office of Research revealed that surveys sent in June and July typically result in better response rates, while those sent in September or close to holidays tend to perform worse. The first recruitment message was sent in July, while the second was sent in September, just before the Labor Day weekend, which likely contributed to the lower response rates for the second group.

3.2. Survey Completion Time and Rates

The survey completion rate was 97%. Among the participants who completed the survey, Table 2 shows that 75% did so in 15 min or less, 18.18% took between 16 and 90 min, and only 6.8% took longer than 90 min. It is important to highlight that at least three participants grouped in the “More than 91 min” category completed the survey over more than a day. It is likely that these participants opened the survey and returned later to finish it, which skews the average completion time. When this group is excluded from the analysis, Table 3 shows that the average survey completion time is 15.34 min; when included, this group raises the average to 444.63 min.
Insights from the focus group reveal that the majority of participants found the survey accessible, with clear and well-structured content that facilitated their experience. They highlighted key accessibility features, such as adequate color contrast, appropriate font size, and smooth navigation on Qualtrics. These responses align with the survey results, suggesting an overall positive user experience.

3.3. Incentives and Response Rates

Of the 800 participants recruited, 91 participated in the study, resulting in a response rate of 11.38%. This figure includes people who responded to the survey but did not complete it. Due to data limitations, we could not perform statistical tests on response rates across incentives; however, the descriptive analysis in Table 4 shows that the highest response rate was achieved when participants were offered CAD 10 for completing the survey (17%). In contrast, offering cash draws or LinkedIn Learning licenses resulted in lower response rates. Specifically, the cash draws produced the lowest response rates, with 9.5% for the CAD 50 cash draws and 8% for the CAD 150 cash draws.
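The descriptive comparison above amounts to a per-arm rate calculation. As a sketch, the per-arm invitee counts below are assumptions (four equal arms of 200; the paper reports only the 800 total), with respondent counts chosen so the rates match Table 4:

```python
# Hypothetical per-arm denominators for illustration only; only the 800 total
# and the per-arm rates are reported in the paper.
arms = {
    "CAD 10":                 {"invited": 200, "responded": 34},
    "LinkedIn Learning":      {"invited": 200, "responded": 22},
    "CAD 50 cash draw":       {"invited": 200, "responded": 19},
    "CAD 150 cash draw":      {"invited": 200, "responded": 16},
}

# Response rate per incentive arm, in percent.
rates = {name: a["responded"] / a["invited"] * 100 for name, a in arms.items()}

# Overall rate pools respondents and invitees across all arms (91 / 800).
overall = (sum(a["responded"] for a in arms.values())
           / sum(a["invited"] for a in arms.values()) * 100)

for name, rate in rates.items():
    print(f"{name}: {rate:.1f}%")
print(f"Overall: {overall:.2f}%")
```

Note that the overall rate is the pooled ratio of respondents to invitees, not the average of the four arm rates.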
These findings are consistent with H2, which claimed that participants offered a guaranteed monetary incentive would respond at higher rates than those offered randomized cash draws or non-monetary incentives. Focus group participants also supported this finding, as they unanimously agreed that the CAD 10 cash payment was one of the main motivations for completing the survey.

3.4. Use of LinkedIn

From the total number of participants who answered the survey, 49.38% indicated their willingness to share their LinkedIn profile URLs to be contacted about the longitudinal study. Table 5 presents the comparison of results across different incentive groups, showing variation in participants’ willingness to share their LinkedIn profile URLs based on the incentives offered for completing the survey. To test H3, which hypothesized that participants who received a guaranteed incentive (CAD 10 or a LinkedIn Learning license) would be more likely to share their LinkedIn profile URLs than those offered randomized cash-draw incentives (CAD 50 and CAD 150 cash draws), a one-way ANOVA was conducted. The analysis showed a marginally significant effect (F(3, 83) = 2.36, p = 0.078), suggesting a potential association between incentive type and willingness to engage with employment-oriented platforms.
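A one-way ANOVA of this kind can be computed by hand from between-group and within-group sums of squares. The binary responses below (willingness to share, coded 1/0) are synthetic placeholders and do not reproduce the reported F(3, 83) = 2.36:

```python
# One-way ANOVA on a binary outcome, grouped by incentive arm.
# Responses are synthetic illustration data, not the study's data.
groups = {
    "CAD 10":                 [1, 1, 1, 0, 1, 0, 1, 1],
    "LinkedIn Learning":      [1, 1, 0, 1, 1, 1, 0, 1],
    "CAD 50 cash draw":       [0, 1, 0, 0, 1, 0, 0, 1],
    "CAD 150 cash draw":      [1, 0, 0, 1, 0, 0, 1, 0],
}

all_vals = [x for vals in groups.values() for x in vals]
grand_mean = sum(all_vals) / len(all_vals)

# Between-group sum of squares: group sizes times squared deviation
# of each group mean from the grand mean.
ss_between = sum(len(v) * (sum(v) / len(v) - grand_mean) ** 2
                 for v in groups.values())
# Within-group sum of squares: squared deviations from each group's own mean.
ss_within = sum(sum((x - sum(v) / len(v)) ** 2 for x in v)
                for v in groups.values())

df_between = len(groups) - 1             # k - 1
df_within = len(all_vals) - len(groups)  # N - k
f_stat = (ss_between / df_between) / (ss_within / df_within)
print(f"F({df_between}, {df_within}) = {f_stat:.2f}")
```

In the study, N = 87 respondents across k = 4 arms yields the reported degrees of freedom (3, 83); the p-value is then read from the F distribution with those degrees of freedom.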
The LinkedIn Learning licenses and the CAD 10 groups had the highest percentage of “Yes” responses, supporting the direction of H3. Additional feedback from focus group participants indicated that in addition to email, LinkedIn would be the best social media platform to communicate with them about the longitudinal study.

4. Discussion

Developing the pilot study provided significant insights into how to collaboratively design and implement a research study that would later inform a longitudinal study. Several key lessons were learned about incentives, survey design and response rates, and young participants’ engagement.
The pilot study also showed that the branding and messaging of recruitment emails may play an important role in response rates for youth participants. Focus group participants suggested that branded emails with signatures and logos create a sense of authenticity and trust, making them more likely to engage in the study. Although the response rate for the youth-friendly message was lower, contrary to what was proposed in H1, feedback from participants indicated that this approach was more engaging to them. Thus, incorporating clear, personalized, and visually engaging emails is a valuable strategy for research recruitment. However, since the lower response rate for the youth-friendly emails was likely influenced by a holiday weekend, future studies should carefully monitor response rate trends based on seasonality.
These findings are consistent with FBM [30], which posits that behavior is most likely to occur when motivation, ability and trigger align simultaneously. In this context, the youth-friendly set of messages likely functioned as a more effective trigger by capturing attention and providing a clear, personalized call to action. However, FBM also emphasizes that if the timing of the trigger is not adequate, even when motivation and ability are high, the desired behavior may not occur.
Despite concerns about survey length, the study achieved a high completion rate of 97%, with most participants completing the survey within 15 min or less, which is considered within good parameters in the literature [6,7,8,9]. These results suggest that collaborating with the YAC was crucial in designing the survey as their feedback helped ensure that the survey was easy to understand for young people, accessible, and designed to minimize fatigue. Feedback from focus group participants confirmed that the survey was easy to follow and understandable, which likely contributed to the high completion rates. This emphasizes the importance of a collaborative survey design that engages the target population as contributors to the research process.
Following the ability component of FBM [30], the accessible language, clear structure, and user-friendly design reduced task complexity for participants, thereby increasing their ability to complete the survey successfully. According to Fogg, when a task feels easy to perform, the likelihood of the desired behavior increases even among individuals with varying levels of motivation [30]. Future studies should consider including a control group to compare response rates and user experience between a standard survey and one co-designed with youth, as in the pilot study.
Among the four incentives offered—CAD 10 to each participant for completing the survey, two CAD 50 and two CAD 150 cash draws, and the LinkedIn Learning licenses—the CAD 10 payment resulted in the highest response rate. This finding aligns with the wage-payment model, which emphasizes the importance of compensating participants directly for their time [10]. It also aligns with previous research suggesting that offering small amounts of cash to participants leads to higher response rates than larger cash draws [3].
From the perspective of FBM [30], the guaranteed cash incentive contributed to increased motivation. The immediate reward likely increased participants’ drive to complete the survey, especially when compared to less certain incentives such as the randomized cash draws. These findings support H2, which proposed that participants offered a guaranteed monetary incentive would respond at higher rates than those offered randomized cash draws or non-monetary incentives such as the LinkedIn Learning licenses.
During the testing of the cash draws, it became evident that clearly defined rules were essential for ensuring transparency and participant trust. In the first round of cash draws, rules were not specified; in the second round, rules and regulations were included. These outlined the date and time when the draws would be conducted and how winners would be selected, and required winners to answer a simple mathematical skill question and to have a Canadian bank account. Additionally, participants were informed that if they did not respond within five business days, a new winner would be selected. The rules and regulations were developed by the University of Waterloo’s Legal and Immigration Services. Clear communication with participants regarding the timing of incentive payments was crucial for both the cash draws and the CAD 10 incentives. While paying CAD 10 to each participant results in higher response rates, it is important to recognize that it requires significant administrative effort. The process is labor-intensive for staff, as it requires high levels of attention to detail and coordination. Based on focus group feedback, however, the personalized approach is considered valuable and may foster higher engagement. As a result, automating this process is not being considered.
Regarding the LinkedIn Learning licenses, the results indicated that the CAD 10 incentive and the LinkedIn Learning licenses each encouraged more participants to express interest in sharing their profiles, supporting H3. Although about half of the participants indicated a willingness to share their LinkedIn profiles, actual sharing was not measured, limiting the ability to conclude whether participants would in fact provide their LinkedIn profile URLs.
Future research should test additional strategies, such as using social media to maintain contact with participants over time [17]. While FBM does not address long-term retention behaviors, the participants’ willingness to share their LinkedIn URLs reflects a degree of motivation and ability to engage beyond the immediate survey.
Finally, the collaboration with both the YAC and the PAC, as well as with colleagues and departments at the University of Waterloo, provided invaluable insights into effectively engaging young people in research. Their feedback significantly influenced the survey design, incentives, and recruitment strategies, shaping the pilot study and informing the longitudinal study.

5. Conclusions

This pilot study underscores the importance of testing and refining research design and implementation for longitudinal research. The findings for each of the four objectives provide critical insights into recruitment, survey implementation, incentive effectiveness, and long-term engagement strategies that will strengthen future research efforts.
The comparison of recruitment messages emphasized the importance of clear, branded, and personalized communication while also highlighting external factors such as timing and seasonality that may influence response rates. These findings are essential to informing outreach and engagement strategies for longitudinal studies. High survey completion rates, achieved within a 15 min time frame, validate the co-design approach, particularly with the involvement of the Youth Advisory Council. This result emphasizes the role of participatory research design in improving clarity, accessibility, and overall survey design.
The analysis of different incentive strategies revealed that small, guaranteed financial incentives (CAD 10 per participant) led to higher engagement compared to larger, randomized cash draws. Although direct payments require greater administrative capacity, the increased response rates and positive feedback from participants suggest that the approach is both effective and appropriate in this context. Lastly, the willingness of participants to provide contact information and express interest in LinkedIn-based engagement offers an opportunity to maintain engagement in the research study over multiple years.
Overall, the pilot study contributes to the broader field of longitudinal research by informing best practices for recruitment, retention, and survey administration. Through the lens of FBM [30], the study demonstrates how aligning motivation (incentives), ability (accessible survey design), and timely triggers (youth-friendly messages) can effectively prompt desired behaviors. Ultimately, this study advances the understanding of designing inclusive and effective research methodologies that foster collaboration between youth and sector partners.

6. Limitations of the Study

Regarding the limitations of the study, it is important to note that the pilot study did not test a panel management strategy, which is crucial for longitudinal studies [20], nor did it consider alternative survey formats, such as phone or in-person surveys for communities with limited access to online surveys [24]. Additionally, we did not include a control group without incentives as the decision was made to provide incentives to participants regardless. These are areas that require further consideration for future studies.
Finally, we did not study how different youth characteristics might have influenced their preferences [32]. Future studies could consider how intersectional identities lead to diverse youth perspectives and thus consider the impact on youth engagement strategies in research.

Author Contributions

Conceptualization, V.C.C., A.F., M.R., I.D. and A.C.; methodology, V.C.C., A.F., M.R., I.D., A.C., S.K., E.O., I.K. and M.K.; formal analysis, V.C.C.; investigation, data curation, V.C.C.; writing—original draft preparation, V.C.C.; writing—review and editing, V.C.C., A.F., M.R., I.D., A.C., S.K., E.O., I.K. and M.K.; project administration, V.C.C. and I.D.; funding acquisition, I.D. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Rideau Hall Foundation and the RBC Foundation.

Institutional Review Board Statement

The pilot study was approved by the University of Waterloo’s Ethics Board on 26 May 2022 (Ethics Protocol # 44046).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data supporting this study are not publicly available due to participants not providing consent for data sharing.

Acknowledgments

We would like to thank the Youth & Innovation Project staff, Mariah Jolin, Youth Advisory Council member, Daniel Wang, and the University of Waterloo for their support and contributions to this research.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Smith, C.A. The uses of pilot studies in sociology: A processual understanding of preliminary research. Am. Sociol. 2019, 50, 589–607. [Google Scholar] [CrossRef]
  2. Ismail, N.; Kinchin, G.; Edwards, J.-A. Pilot study, does it really matter? Learning lessons from conducting a pilot study for a qualitative PhD thesis. Int. J. Soc. Sci. Res. 2017, 6, 1–17. [Google Scholar] [CrossRef]
  3. Henderson, M.; Wight, D.; Nixon, C.; Hart, G. Retaining Young People in a Longitudinal Sexual Health Survey: A Trial of Strategies to Maintain Participation. 2010. Available online: http://www.biomedcentral.com/1471-2288/10/9 (accessed on 31 July 2024).
  4. Fogg, B. A behavior model for persuasive design. In Proceedings of the 4th International Conference on Persuasive Technology; ACM Digital Library: New York, NY, USA, 2009; pp. 1–7. [Google Scholar] [CrossRef]
  5. Karrasch, N. Easy Branding Strategies for Your Research Study. Available online: https://trialfacts.com/articles/easy-branding/ (accessed on 10 October 2024).
  6. Grier, S.; Bryant, C. Social marketing in public health. Annu. Rev. Public Health 2005, 26, 319–339. [Google Scholar] [CrossRef]
  7. Liu, M.; Wronski, L. Examining completion rates in web surveys via over 25,000 real-world surveys. Soc. Sci. Comput. Rev. 2018, 36, 116–124. [Google Scholar] [CrossRef]
  8. Revilla, M.; Ochoa, C. Ideal and maximum length for a web survey. Int. J. Mark. Res. 2017, 59, 557–565. [Google Scholar] [CrossRef]
  9. Saleh, A.; Bista, K. Examining factors impacting online survey response rates in educational research: Perceptions of graduate students. J. Multidiscip. Eval. 2017, 13, 63–74. [Google Scholar] [CrossRef]
  10. Galesic, M.; Bosnjak, M. Effects of questionnaire length on participation and indicators of response quality in a web survey. Public Opin. Q. 2009, 73, 349–360. [Google Scholar] [CrossRef]
  11. Afkinich, J.L.; Blachman-Demner, D.R. Providing incentives to youth participants in research. J. Empir. Res. Hum. Res. Ethics Int. J. 2020, 15, 202–215. [Google Scholar] [CrossRef] [PubMed]
  12. Rice, M.; Broome, M. Incentives for children in research. J. Nurs. Sch. 2004, 36, 167–172. [Google Scholar] [CrossRef]
  13. Wendler, D.; Rackoff, J.E.; Emanuel, E.J.; Grady, C. The ethics of paying for children’s participation in research. J. Pediatr. 2002, 141, 166–171. [Google Scholar] [CrossRef]
  14. Bagley, S.; Reynolds, W.; Nelson, R. Is a ‘wage-payment’ model for research participation appropriate for children? Pediatrics 2007, 119, 46–51. [Google Scholar] [CrossRef] [PubMed]
  15. Nguyen, T.T.; Jayadeva, V.; Cizza, G.; Brown, R.J.; Nandagopal, R.; Rodriguez, L.M.; Rother, K.I. Challenging recruitment of youth with type 2 diabetes into clinical trials. J. Adolesc. Health 2014, 54, 247–254. [Google Scholar] [CrossRef]
  16. Singer, E.; Ye, C. The use and effects of incentives in surveys. Ann. Am. Acad. Politi. Soc. Sci. 2013, 645, 112–141. [Google Scholar] [CrossRef]
  17. Gelinas, L.; Pierce, R.; Winkler, S.; Cohen, I.G.; Lynch, H.F.; Bierer, B.E. Using social media as a research recruitment tool: Ethical issues and recommendations. Am. J. Bioeth. 2017, 17, 3–14. [Google Scholar] [CrossRef]
  18. Weisblum, M.; Trussell, E.; Schwinn, T.; Pacheco, A.R.; Nurkin, P. Screening and retaining adolescents recruited through social media: Secondary analysis from a longitudinal clinical trial. JMIR Pediatr. Parent. 2024, 7, e47984. [Google Scholar] [CrossRef] [PubMed]
  19. Whiteley, J.A.; Faro, J.M.; Mavredes, M.; Hayman, L.L.; Napolitano, M.A. Application of social marketing to recruitment for a digital weight management intervention for young adults. Transl. Behav. Med. 2021, 11, 484–494. [Google Scholar] [CrossRef] [PubMed]
  20. Gupta, A.; Calfas, K.J.; Marshall, S.J.; Robinson, T.N.; Rock, C.L.; Huang, J.S.; Epstein-Corbin, M.; Servetas, C.; Donohue, M.C.; Norman, G.J.; et al. Clinical trial management of participant recruitment, enrollment, engagement, and retention in the SMART study using a Marketing and Information Technology (MARKIT) model. Contemp. Clin. Trials 2015, 42, 185–195. [Google Scholar] [CrossRef]
  21. Andreß, H.-J.; Golsch, K.; Schmidt, A.W. Applied Panel Data Analysis for Economic and Social Surveys; Springer: Berlin/Heidelberg, Germany, 2013. [Google Scholar] [CrossRef]
  22. Morissette, R. Chapter 2: Youth Employment in Canada. In Portrait of Youth in Canada: Data Report; Statistics Canada: Ottawa, ON, Canada, 2021. [Google Scholar]
  23. Government of Canada. Employment and Social Development Canada, Understanding the Realities: Youth Employment in Canada—Interim Report of the Expert Panel on Youth Employment. 2016. Available online: https://www.canada.ca/en/employment-social-development/corporate/youth-expert-panel/interim-report.html (accessed on 4 May 2024).
  24. Nichols, L. Working through the unknowns: Canadian youth’s experience of employment during the COVID-19 pandemic. Can. J. Fam. Youth 2023, 15, 113–129. [Google Scholar] [CrossRef]
  25. Feor, L.; Clarke, A.; Dougherty, I. Social Impact Measurement: A Systematic Literature Review and Future Research Directions. World 2023, 4, 816–837. [Google Scholar] [CrossRef]
  26. Shamrova, D.; Cummings, C. Participatory action research (PAR) with children and youth: An integrative review of methodology and PAR outcomes for participants, organizations, and communities. Child. Youth Serv. Rev. 2017, 81, 400–412. [Google Scholar] [CrossRef]
  27. Anyon, Y.; Bender, K.; Kennedy, H.; Dechants, J. A systematic review of youth participatory action research (YPAR) in the United States: Methodologies, youth outcomes, and future directions. Health Educ. Behav. 2018, 45, 865–878. [Google Scholar] [CrossRef] [PubMed]
  28. Gelinas, L.; Largent, E.A.; Cohen, I.G.; Kornetsky, S.; Bierer, B.E.; Lynch, H.F. A framework for ethical payment to research participants. N. Engl. J. Med. 2018, 378, 766–771. [Google Scholar] [CrossRef] [PubMed]
  29. Ontario Living Wage Network, Rates. Available online: https://www.ontariolivingwage.ca/rates (accessed on 14 March 2022).
  30. LinkedIn. LinkedIn Learning Overview. Available online: https://www.linkedin.com/help/learning/answer/a700789 (accessed on 2 August 2024).
  31. Government of Canada. Canadian Institutes of Health Research, Natural Sciences and Engineering Research Council of Canada, and Social Sciences and Humanities Research Council of Canada, Tri-Council Policy Statement: Ethical Conduct for Research Involving Humans. 2018. Available online: https://ethics.gc.ca/eng/policy-politique_tcps2-eptc2_2018.html (accessed on 1 May 2024).
  32. Cornelius-Hernandez, T.; Clarke, A. The current state of integrating equity, diversity and inclusion into knowledge mobilization: A systematic literature review. Equal. Divers. Incl. Int. J. 2024. ahead of print. [Google Scholar] [CrossRef]
Table 1. Incentive strategies and rationale.

| Incentive | Value | Rationale |
|---|---|---|
| Direct incentive | CAD 10 e-transfer per person | The wage-payment model aligns with valuing participants’ time and contributions, based on Ontario’s living wage. |
| Cash draw 1 | CAD 50 × 2 winners | Initially budgeted at CAD 50 per draw per cohort, designed as the most cost-effective alternative to individual payments. |
| Cash draw 2 | CAD 150 × 2 winners | A higher-value cash draw was recommended by the PAC, as they believed it would be more attractive to participants. |
| LinkedIn Learning License | CAD 324 per year | Offered by the RBC Foundation, providing educational and employment-related learning opportunities relevant to the study’s focus on career development. |
Table 2. Survey completion time.

| Minutes | Percentage |
|---|---|
| 0–15 min | 75.00% |
| 16–90 min | 18.18% |
| More than 91 min | 6.82% |
Table 3. Survey completion time with and without outlier group.

| | Mean | Std. Dev. | Min. | Max. |
|---|---|---|---|---|
| Completion time (n = 91) | 444.63 | 2622.28 | 0.7 | 22,854.5 |
| Completion time excluding “More than 91 min” group (n = 84) | 15.34 | 16.25 | 0.7 | 88.27 |
Table 4. Response rates achieved by incentives.

| Incentive | Response Rate |
|---|---|
| CAD 10 | 17% |
| LinkedIn Learning license | 11% |
| CAD 50 cash draw | 9.5% |
| CAD 150 cash draw | 8% |
| Overall response rate | 11.38% |
Table 5. LinkedIn profile URLs.

| Incentive | Yes | No |
|---|---|---|
| LinkedIn Learning license | 36.36% | 13.95% |
| CAD 10 | 31.82% | 44.19% |
| CAD 150 cash draw | 18.18% | 16.28% |
| CAD 50 cash draw | 13.64% | 25.58% |
p < 0.9.