1. Introduction
Over the last two decades, web or internet surveys have become a popular method of data collection in the academic community [
1,
2]. Web surveys are conducted in web-based environments where invited potential participants can find and fill out an online questionnaire. Web surveys offer several advantages, such as easy access and participant anonymity [
3], quicker responses, savings in time and cost, simple processing and fast storage of data, and the ability to collect data from large population areas [
3,
4,
5,
6,
7,
8,
9]. Due to these advantages, web surveys have increasingly replaced traditional forms of research conducted by mail or telephone. Despite the advantages offered, web surveys present meagre response rates [
6,
8,
9,
10,
11,
12,
13], which primarily affect the validity of the results and the reliability of the outcomes, threatening the inferential value of the survey method [
1,
14,
15,
16]. This fact is important because much of the published academic research in influential journals is based on web surveys [
17].
Considering the low response rate of web surveys, researchers have focused on investigating the factors that may be related to this issue. In the last decade, several studies have examined the factors that affect survey response rates. Most of them investigated the factors that affect the response rate of web surveys (e.g., [
3,
7,
18]), while fewer studies examined the response rate of both web surveys and postal or paper surveys (e.g., [
19,
20]). Notably, researchers have identified the problem of low response rates in web surveys from various perspectives (see
Section 2). However, each of the relevant studies investigated only some of these factors rather than all of them together. These factors include “consent to participate in the survey” [
21], “location of demographic information in the questionnaire”, “demographic and other personal information that discourage respondents from participating”, “mandatory answers”, “a reference to the survey’s precise purposes” [
2], the option to choose the answer “I do not know/I do not answer” [
22], “time limit (deadline)” [
9,
23,
24], and “approval by an ethics committee”. In the present study, we attempt to present the factors influencing the response rate through an extensive literature review.
Additionally, most previous research has been conducted in countries outside Europe, such as the United States of America [
1,
2,
4,
7,
13,
20,
22,
25,
26,
27], Japan [
15], and Chile [
23]. Only six of them have been conducted in Europe and specifically in Denmark [
8,
19], Germany [
9], Spain [
3], Slovenia [
24] and Sweden [
18], while no relevant research has been done in Greece.
Moreover, these surveys mainly concern participants not drawn from a particular cohort [
8,
18,
19,
24,
26], graduate and undergraduate students [
1,
2,
13,
19,
28,
29], primary care health professionals [
3], radiologists [
27], physician specialists [
25], and only three concern educators, specifically university faculty [
4] and school principals [
20,
23]. Similar surveys worldwide that specifically study teachers’ response rates have not been found in the relevant literature.
Therefore, in this study, with a sample of Greek teachers, we investigate a wide range of possible factors that explain teachers’ participation in web surveys conducted through online questionnaires. This study has significant implications since teachers are frequently asked to complete surveys. We thus hope to enrich the existing literature by providing empirical insights into the factors that explain the response rate of web surveys in education.
2. Literature Review
Increasing participation rates in web surveys is challenging for researchers [
20]. The international literature has presented several factors that explain the willingness of individuals to participate in a survey. The factors that researchers identify are (a) incentives, (b) authority, (c) survey structure/form, (d) ethical issues, (e) pre-notification and reminders, and (f) survey time received.
Specifically, the factor “incentives” refers, on the one hand, to “external incentives” such as small financial incentives [
1,
7,
13,
20,
25,
26,
27,
29,
30] or donations to charities [
8]. “Monetary incentives” are one of the most critical factors that can increase the response rate of a survey [
20]. This is evidenced by Smith et al.’s [
26] study, where, in order to increase participation rates, the authors included in the invitation a sum of
$2. Indeed, the results of their research showed that prepaid monetary incentives were an effective method of improving response rates. On the other hand, “internal” incentives include interest in the research subject [
8,
9,
25] or the participants’ attitudes towards research [
2,
10]. The category “attitude towards research” also includes altruistic incentives that influence a respondent to participate in research. An altruistic motivation can be assumed when respondents take part in the research because they consider that their answers can contribute to a good purpose or simply because they wish to help by participating [
22,
24]. “Interest” in the subject of the research is an essential factor for increasing the response rate of any survey. Indeed, as Park et al. [
7] confirmed, respondents are much more likely to participate in a survey when its topic is related to their interests.
The sponsor or “authority” of the research also seems to explain an individual’s willingness to participate in a web survey [
2,
7,
24,
28,
29]. When the sponsor is a prestigious agency, such as a governmental or educational agency, rather than a commercial one, participants are more likely to feel secure in entrusting their personal information and data [
24]. However, even though researchers often highlight this factor, some [
2,
7] argue that it does not significantly influence the participants’ willingness to complete a survey. Therefore, it would be of particular research interest to examine it further.
The “survey structure/form” category encompasses factors such as personalization of invitations, open-ended/closed-ended questions, and questionnaire length. In more detail, the factor “personalization of invitations” [
7,
13,
22,
25,
29,
31] has a positive effect on response rates, mainly in web surveys [
13,
25]. Response rates can be increased when the researcher uses personalization tactics, such as email invitations that include the respondents’ first or last names in the subject line [
22]. The “length of the survey” [
7,
22,
26,
27,
29] could influence a person’s participation decision. Researchers conclude that the less time a questionnaire takes to be completed, the more likely it is to be completed by the respondent. Surveys lasting more than 13 min present lower response rates [
22] than surveys lasting less than 10 min [
7]. Saleh and Bista [
2] also indicate in their research that open-ended questions discourage respondents from completing the survey. In this context, they suggest using closed-ended questions to achieve higher response rates.
“Ethical issues” such as consent for participation, data safety, privacy, and anonymity, although they have scarcely been studied (e.g., [
21]), seem to explain the response rate of the web surveys. According to Nayak and Narayan [
21], giving respondents the choice to consent to participate is a critical issue for ethical surveys. Also, researchers should ensure the anonymity of the participants, since data safety and privacy have been pointed out as having a strong correlation with high response rates [
2].
“Reminders and pre-notification” [
1,
2,
3,
9,
22,
25,
31] are considered essential factors since they can increase the response rate. Specifically, on the one hand, “pre-notification” can provide some information and an invitation for the participants to join the survey [
10]. On the other hand, “reminders” act as an additional tool in case the potential respondents do not initially receive the invitation or have not studied the necessary information from the first invitation [
10]. Indeed, the optimal number of “reminders” that avoids annoying the participants is at the centre of research interest [
25].
“Survey time received” is also identified in the literature as a factor that explains the response rates of web surveys [
23]. Specifically, Madariaga et al. [
23] mentioned that scheduling when the survey will be sent to the respondent for completion is a factor that can affect the response rate, considering that some groups, such as school principals, have increasingly busy daily schedules. However, this factor needs further examination, as Saleh and Bista [
2] did not find similar findings.
7. Results
To present the teachers’ responses more cohesively, all tables report in detail the percentages of “disagree” and “strongly disagree” as well as “agree” and “strongly agree”. In each table, we also present the Goodman-Kruskal gamma (γ) coefficients, a measure of association between teachers’ intention to participate in a web survey with a questionnaire and each possible factor that explains this behaviour. The statements were sorted in descending order according to the last column (i.e., items with the highest association coefficient appear at the top).
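For readers unfamiliar with the statistic, the following minimal sketch (illustrative only, not the authors’ code; the variable names and Likert codes are invented) shows how the Goodman-Kruskal gamma can be computed in Python from two ordinal items as γ = (C − D)/(C + D), where C and D are the numbers of concordant and discordant pairs:

    def goodman_kruskal_gamma(x, y):
        """Gamma = (concordant - discordant) / (concordant + discordant); tied pairs are ignored."""
        concordant = discordant = 0
        for i in range(len(x)):
            for j in range(i + 1, len(x)):
                dx, dy = x[i] - x[j], y[i] - y[j]
                if dx * dy > 0:      # pair ordered the same way on both items
                    concordant += 1
                elif dx * dy < 0:    # pair ordered in opposite ways
                    discordant += 1
        return (concordant - discordant) / (concordant + discordant)

    # Hypothetical 5-point Likert codes (1 = strongly disagree ... 5 = strongly agree)
    factor_item = [4, 5, 3, 2, 5, 4]
    intention = [4, 5, 4, 2, 5, 3]
    print(round(goodman_kruskal_gamma(factor_item, intention), 2))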
Regarding the teachers’ intention to participate in a web survey with a questionnaire (see
Appendix Item 1, “I will participate in a web survey with a questionnaire”), almost all participants answered that they agree (67%) or strongly agree (30%). Only 3% of them disagree or strongly disagree. Pearson chi-square results do not indicate significant associations between the teachers’ intention and any of the demographic variables: gender (
χ2(2) = 0.001,
p = 0.99), age group (
χ2(12) = 12.218,
p = 0.43), educational stage (
χ2(6) = 4.936,
p = 0.55), postgraduate studies (
χ2(2) = 2.562,
p = 0.28), and research experience of teachers (
χ2(2) = 0.232,
p = 0.89).
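As an illustration of the kind of test reported above, the following sketch (an assumed workflow, not the authors’ script; the cell counts are invented solely to demonstrate the call) runs a Pearson chi-square test of independence between a demographic variable and the intention item using scipy:

    import numpy as np
    from scipy.stats import chi2_contingency

    # Rows: gender (male, female); columns: intention
    # (disagree/strongly disagree, agree, strongly agree) -- counts are hypothetical
    observed = np.array([
        [3, 70, 31],
        [5, 106, 48],
    ])

    chi2, p, dof, expected = chi2_contingency(observed)
    print(f"chi2({dof}) = {chi2:.3f}, p = {p:.2f}")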
Regarding the authority conducting the web survey (
Table 2), participants presented a high percentage of positive responses (agree or strongly agree) to the statements related to this factor. However, the Goodman-Kruskal gamma showed moderate, significant positive associations with teachers’ intention to participate in the survey only for web surveys conducted by a prestigious organization or by known colleagues.
Regarding the incentives that motivate participants to complete a survey (
Table 3), teachers stated that they mainly agree or strongly agree with the first four statements, which relate to internal incentives, and to a lower degree with the last two statements, which relate to external motivations. Similarly, the two coefficients confirm that internal incentives mainly motivate them to participate in a web survey.
Regarding the survey’s structure/form (
Table 4), a high percentage of teachers declared that they prefer to complete questionnaires that mainly include concise, closed-ended questions and offer the option “I do not know and/or I do not answer”, and when they know in advance the time required to complete the questionnaire and the number of questions. In parallel, according to the two coefficients, these factors correlate with the participants’ intention to participate in a web survey. However, the professional look of the questionnaire, mandatory answers, and the inclusion of participants’ names do not present a significant association with teachers’ intention to participate in the survey.
Regarding the time required to complete a questionnaire (see
Appendix Item 20, with possible answers: less than 5′, 6–10′, 11–15′, 16′+), overall, 85.9% of the sample participants stated that they prefer to complete questionnaires when the total time does not exceed 10 min. Regarding the deadline for completion of a questionnaire (see
Appendix Item 21, “I will participate in a web survey with a questionnaire if the deadline is: one week, two weeks, three weeks, or four weeks”), 61.7% answered up to one week, 24.5% answered up to two weeks, and the remaining teachers answered up to four weeks. Regarding the place of demographic information in a questionnaire (see
Appendix Item 22, with possible answers: at the beginning, at the end, not included), most of the sample (87.5%) answered that they prefer it to be at the beginning of the questionnaire.
Regarding ethical issues related to research (
Table 5), teachers mainly agree or strongly agree that these issues determine their participation in a web survey. This is confirmed by the significant correlations of these factors with teachers’ intention to participate in the survey.
Regarding the demographic information that seems to discourage participants (see
Appendix Item 28) from completing a web survey (
Figure 1), the items that discourage them most are the email address (66%), the city (29.60%), the country (23.90%), and the marital status (18.60%).
Regarding reminders and pre-notification, teachers declared that they prefer to complete a web survey (
Table 6) when the invitation to participate is sent by email and, to a lower degree, when they see the invitation on a website or receive a reminder to complete the survey. Similarly, the gamma coefficient revealed a significant positive correlation between the invitation being sent by email and teachers’ intention to participate in the survey.
8. Discussion of Results
This research aimed to examine whether specific factors correlate with the teachers’ intention to participate in a web survey with a questionnaire and, therefore, to indicate the factors that probably influence the web survey response rate. The discussion below is mainly based on the statements’ statistically significant associations (gamma and delta coefficients) with the teachers’ intention to participate in the web survey.
The findings reveal that respondents are more likely to participate in a web survey when the survey sponsor has a good reputation or is known to the participants. These results support earlier findings (e.g., [
2,
10,
28,
29]). However, the results diverge from the similar surveys of Park et al. [
7] with a sample of 540 undergraduate and graduate students and the study of Petrovcic et al. [
24] with an extensive sample of 2500 respondents from an online health community.
Regarding the factor of incentives, the teachers seem motivated to participate in a web survey mainly by internal incentives. The results converge with the survey of Petrovcic et al. [
24], which indicated that a “plea for help” is an essential factor for the response rate. Interest in the topic and content of the web survey also seems to be an essential factor that affects the response rate of web surveys. The results are in line with the existing research literature on the impact of the survey topic on the response rate [
2,
7,
10]. Moreover, our survey results indicated two additional factors that affect the response rate of web surveys: an explicit statement of the survey’s purpose and the promise to inform participants about the results.
Results also indicated that paying attention to the questionnaire structure/form leads teachers to participate in a web survey. Participants declared that they were more likely to complete a survey only if the questions were simple, comprehensible, and closed-ended, and when they knew in advance both the time required to complete it and the number of questions it contained. The results agree with the survey of Fan and Yan [
10]. Moreover, most of the sample stated that they are discouraged from completing web surveys when the questions are open-ended [
2]. Another factor within the “survey structure/form” dimension that seems to affect the response rate of web surveys is the option “I do not know and/or I do not answer”. Common sense suggests that longer questionnaires will tend to yield lower response rates than shorter questionnaires, as they require more time from the respondents. Indeed, almost nine out of ten sample participants stated that they prefer to complete web questionnaires when the total time is no more than 10 min and the deadline to complete them is up to three weeks. This finding agrees with the existing literature on the impact of survey length on response rate (e.g., [
7,
22,
37,
38]). We also investigated the location of demographic information in a web survey as a factor influencing the response rate. The results showed that most participants (about 9 out of 10) would prefer it to be at the beginning of the web questionnaire.
Our findings revealed that consent for participation, data safety, privacy, and anonymity are critical ethical issues in web surveys. These results seem to align with the existing literature on the ethical and moral issues governing research [
2,
10,
21]. The results of our research highlighted two more factors that seem to explain the web-survey response rate. The first is the approval of the research by an official ethics committee, and the second is the disclosure to the participants of how the researcher gained access to their accounts, in case the survey invitation is sent by email. We also examined whether some demographic information discouraged participants from completing the survey, and the results showed that they were mainly discouraged when asked for their email addresses and city of residence.
Moreover, the results indicated that participants are more likely to complete a web survey when the invitation to participate is sent by email [
37]. It is worth noting that, in most cases, emails are sent blindly (using Blind Carbon Copy) by researchers to the respondents based on an email list and do not include a personal salutation for each participant. So, this is not contrary to our finding that most of the participants refuse to complete a web survey when they have to provide their email addresses. Therefore, this result seems to align with the “ethical issues” that govern a survey, as mentioned above. Also, the results demonstrated that a reminder to complete the questionnaire does not explain the response rate. This finding differs from earlier studies about reminders, which demonstrate a positive effect on response rate [
3,
4,
7,
9,
13,
25].
Finally, it is worth noting the impact of the pandemic on web survey responses, as the forced shift to a fully digital work and life context has significantly affected them. The number of web surveys has increased dramatically, and various approaches have been applied [
39]. In this context, these findings suggest that participants are more likely to complete a web survey after the experience of remote teaching during the pandemic and the familiarity they gained with these tools [
39,
40].
9. Implications and Limitations
One implication of these results for achieving higher response rates and improving the quality of web surveys in general is that researchers would be well advised to seek out sample population groups with a particular interest in the survey topic. Researchers can also seek the help of large organizations or prestigious individuals to share the survey. Moreover, an introductory text on the cover page with details about the research’s purpose and how the participants’ responses would support this purpose is likely to increase the response rate of the web survey. Additionally, attention should be paid to the web survey’s structure/form during its planning. Teachers are more likely to complete a questionnaire if the questions are simple, comprehensible, and closed-ended, and the questionnaire requires a reasonable amount of time, such as up to ten minutes.
Moreover, to reduce teachers’ discomfort with questions they do not wish to answer, researchers should include the option “I do not know and/or I do not answer” and place the demographic information at the beginning of the web questionnaire. Our findings indicated that researchers should pay attention to teachers’ data safety, privacy, and consent for participation. Likewise, the research plan should be checked by an official committee responsible for ethical approval. Requests for demographic information such as email addresses and the city of residence should be avoided.
This study has some limitations that could serve as avenues for further research. Due to the use of cross-sectional data, we should be cautious about inferences about causality [
32]. Also, the fact that volunteers were asked to present their views is an issue that usually leads to response biases [
41]. Another limitation of this research is the somewhat limited number of respondents. Although we sent invitations to 400 teachers, only 263 of them completed the survey (a response rate of about 66%). However, we submit that this sample is not negligible since, in Greece, similar surveys studying the response rate of web surveys with a sample of teachers have not been conducted. Another limitation may be that no reminder was sent to participants to complete the survey. Future research using mixed methods could carry out the same survey on a larger sample of teachers so that the results can be more generalizable.
Moreover, this research instrument could be used with teachers in other countries to identify possible similarities or differences. Afterwards, other characteristics of participants, such as their personality traits, could be considered, as they may influence teachers’ intention to participate in a web survey with a questionnaire. In addition, it would be advisable to further investigate factors such as “authority” and “number of reminders”, as the results in the relevant literature seem contradictory.
Finally, considering the results of our research regarding the factors that influence someone to participate in a web survey, we suggest a list of recommendations that a researcher should keep in mind when designing a web survey to prevent low response rates:
Target a sample whose interests are relevant to the subject of the web survey
Disclose to the respondents information about the organization or the person conducting the web survey so that they know it is a prestigious body
State in the invitation the exact purpose of the web survey and the reasons the teacher should complete the questionnaire, and indicate how participants will be informed about the results
Indicate to the respondents that the research has been approved by an ethics committee
Ask for the participants’ consent before starting the web survey
The questions should be formulated in a straightforward manner
Prefer closed-ended questions and insert the option “I do not know/I do not answer”
Indicate the required time to complete the web survey and the number of questions it contains. The web survey should not take more than 10 min to complete
The timeframe for completion should be about three weeks
Place demographic information at the beginning of the web questionnaire; however, avoid asking for the respondents’ email address and city of residence
Send the invitation for participation in the web survey by email using the Blind Carbon Copy (BCC) approach.