Article

Does Participation Intention Equal Participation Behavior? The Role of Dynamic Competition in Crowdsourcing Contests

1 Faculty of Business Information, Shanghai Business School, Shanghai 201400, China
2 School of Information Management & Engineering, Shanghai University of Finance and Economics, Shanghai 200433, China
3 Shanghai Key Laboratory of Financial Information Technology, Shanghai University of Finance and Economics, Shanghai 200433, China
* Authors to whom correspondence should be addressed.
J. Theor. Appl. Electron. Commer. Res. 2026, 21(4), 99; https://doi.org/10.3390/jtaer21040099
Submission received: 25 August 2025 / Revised: 17 October 2025 / Accepted: 22 October 2025 / Published: 25 March 2026

Abstract

Crowdsourcing contest platforms provide enterprises with opportunities to seek external resources at a lower cost, and increasing solver participation is key to the success of crowdsourcing contests. Many previous studies use participation intention as a proxy for participation behavior, relying on surveys of participation intention or continued participation intention. However, participation intention and participation behavior differ significantly, and the dynamic nature of influencing factors has a more complex effect on solvers’ participation. Therefore, based on social exchange theory, we use dynamic data from winvk.com to construct a two-stage model of view and submission and explore the effect of dynamic competition on participation intention and participation behavior. The results show that external competition has a consistent negative effect on both participation intention and participation behavior. The effect of internal competition differs: it has no significant effect on participation intention but a significant positive effect on participation behavior. In addition, rewards exacerbate the effect of competition on participation behavior. These findings provide empirical evidence on the differences between participation intention and behavior, and offer practical suggestions for enterprises and platforms seeking to improve solver participation within a short time frame.

1. Introduction

With the rapid development of information technology, the internet has played a key role in promoting the formation of distributed collaborative systems and integrating human resources [1]. It has broken down the time and space constraints of individual activity participation [2]. The public can utilize fragmented time to complete tasks, and enterprises can gather and utilize talent from around the world [3,4]. Contests initiated by third-party platforms are a typical and important form of crowdsourcing. They bring together seekers who need to solve problems and solvers who provide solutions [5]. Seekers post tasks through the platform, set a participation timeframe, and offer a reward [6]. Solvers with initial participation intention select tasks of interest and review the content and requirements. They then evaluate the tasks’ various design parameters and competitive factors to decide whether to implement a specific behavior. Through crowdsourcing platforms, seekers can access a large workforce, significantly reducing production costs and improving the quality of solutions [7]. Due to high efficiency and low costs [8], crowdsourcing contest platforms have become an effective option for seekers to solve problems [9].
Improving the performance of crowdsourcing contests is an important research question. Existing studies show that more solvers lead to more submissions [10], and more submissions increase the probability of obtaining the best solution. Therefore, increasing the actual participation of solvers is crucial for improving the efficiency and quality of crowdsourcing contests [11]. Many previous studies assume that participation intention is a prerequisite for participation behavior and that behavior necessarily builds on intention. Scholars therefore mostly use questionnaires or interviews to investigate participation intention or continued participation intention [12,13,14] as a substitute for analyzing participation behavior. However, other scholars find that respondents who express strong participation intention at the beginning of a survey often do not participate later. In the field of innovation and entrepreneurship, a large proportion of entrepreneurs with strong intentions have not taken actual entrepreneurial action; fear of failure is the main factor hindering the expression of behavior [15]. In the field of green innovation, a survey of residents’ willingness to sort their waste found that although residents express a strong desire to do so, the actual rate of waste sorting is low, and further analysis showed that incentives can effectively improve the sorting rate [16]. Therefore, owing to perceptions of incentives and risks, there is a significant discrepancy between participation intention and participation behavior. In the crowdsourcing field, a large portion of solvers simply register on the platform without participating in actual contests [17]. If platforms and seekers expect solvers to take concrete action, they need to understand how solvers’ participation behaviors are formed.
Therefore, paying attention to the initial participation intention of solvers and promoting their in-depth participation are key issues in improving crowdsourcing performance.
The risk of competition [14,17] and the probability of winning [9,18] influence the formation and transformation of behavior. In crowdsourcing contests, task rewards are payments from the seeker to the solver that compensate solvers’ time and effort. Previous studies show that higher rewards attract more solvers and motivate them to provide high-quality solutions [19]. However, solvers’ participation behavior is affected not only by the amount of the rewards; solvers also evaluate the likelihood of obtaining them. Generally speaking, tasks with more intense competition offer a lower probability of receiving rewards, which in turn reduces participation intention [17]. Therefore, competition has a significant effect on solvers’ participation. Previous research on competition has mostly relied on static data. However, the layout of the crowdsourcing platform and the timing of tasks make crowdsourcing contests process-oriented, and the participation of solvers is likewise a dynamic process [20]. Dynamic competition has a more complex impact on solvers’ behavior. If seekers want to obtain a satisfactory solution within the task period, they need to adopt a dynamic perspective and pay attention to real-time changes in competition. In addition, dynamic competition means the risks of participation and the probability of winning change constantly, so solvers must continually weigh costs and benefits. Only when the expected benefits outweigh the costs will solvers take real action [20]. However, previous studies mostly use motivation theory and incentive theory to explain solvers’ participation behavior [21,22], focusing on the effect of incentives. They overlook that participation in a crowdsourcing contest is an exchange behavior driven by both costs and benefits. Social exchange theory reflects this weighing of costs and benefits [23].
Therefore, we use social exchange theory to explain the participation behavior of solvers. Based on the above analysis, we propose the following research questions:
RQ1: Are there differences in the solvers’ participation intention and participation behavior?
RQ2: How do dynamic competitive factors influence solvers’ participation intention and participation behavior?
To address these questions, we use the logo design tasks of the winvk.com crowdsourcing contest platform as the research object. Logo tasks have a short average period, their data fluctuate greatly, and participation behavior is more sensitive to changes in influencing factors [9]. Based on social exchange theory, we constructed a two-stage research model of view and submission and used data from ongoing tasks to explore the differences in the effect of dynamic competition on solvers’ participation intention and participation behavior. The main contributions are as follows: (1) We explore the differences between participation intention and participation behavior from a dynamic perspective, and show that competition has different effects on different psychological states and behaviors. (2) We deepen the application of social exchange theory in the field of crowdsourcing, explaining solvers’ participation behavior from a cost–benefit perspective. This study provides practical suggestions for seekers and crowdsourcing platforms to promote the participation behavior of solvers, and also enriches research on crowdsourcing contests.
This paper is organized as follows: Section 2 introduces relevant research on solvers’ participation intention and participation behavior, and explains social exchange theory. Section 3 introduces the research model and hypotheses. Section 4 conducts empirical analysis and explains the results. Section 5 discusses the conclusions and summarizes the theoretical and practical significance. Section 6 provides future prospects.

2. Literature Review

2.1. Research on the Participation Intention and Participation Behavior of Solvers

The participation of solvers is the key to task success and the sustainable development of the platform. Therefore, most studies focus on how to motivate solvers to participate. The forms of solvers’ participation mainly include participation intention and participation behavior. Participation intention refers to the cognitive propensity before actual participation, representing an individual’s subjective likelihood of performing a specific action [24,25]. It affects solvers’ decision-making processes and subsequent behavioral implementation [26,27,28,29]. Participation behavior refers to solvers’ actual actions, which can take the form of submitting a solution or contributing ideas; in this process, solvers must pay real costs. Previous studies suggest that participation intention is a direct predictor of actual behavior [30]. Therefore, researchers attempt to analyze solvers’ participation intention or continued participation intention as a proxy for participation behavior. However, surveys on consumer behavior and green innovation find that many respondents exhibit a significant discrepancy between participation intention and participation behavior [31,32]. Although respondents initially express strong participation intention, subsequent follow-up surveys reveal a low conversion rate to actual participation [16,33]. In the field of crowdsourcing, existing studies primarily focus on participation intention, with insufficient attention paid to actual participation behavior. While participation intention is the first step in participating in tasks, participation behavior is the key to the success of crowdsourcing tasks [34]. Therefore, it is necessary to consider solvers’ participation intention and participation behavior together, with a focus on exploring participation behavior.
Competition affects solvers’ perception of risk, thereby affecting the expression of participation intention and participation behavior. Competition includes external competition and internal competition. External competition refers to market competition, that is, competition between tasks of the same type in the market. For example, on the winvk.com crowdsourcing platform, competition forms between multiple logo design tasks carried out on the same day. Since solvers can only choose to participate in a limited number of tasks, their participation in a specific task is affected by other tasks [35]. Generally speaking, the greater the number of tasks carried out simultaneously, the more intense the market competition, which reduces the number of solvers in a single task [36]. Internal competition refers to competition within a single task, that is, competition among all solvers within a task. For example, all solvers in a logo design task compete for the reward. It is usually expressed in terms of the number of solvers or the number of experienced solvers [5]. Generally speaking, more participants lead to more intense internal competition, which reduces the probability of a single participant winning. However, some studies find that intense competition can help increase the creativity of solvers’ ideas [5]. Furthermore, solvers’ experience also influences the intensity of task competition. Research finds that solvers’ past wins influence the participation of other solvers [17]. Specifically, when there are more experienced or higher-ranked solvers in a task, the competition is perceived as more intense, reducing potential solvers’ participation. Based on the above analysis, competition has an important effect on solvers’ participation. In addition, the purpose of solvers’ efforts is to obtain rewards, and their perception of economic risks affects the expression of behavior after the intention is formed.
Since the process of submitting a proposal involves actual costs, rewards are the best compensation for these costs [37]. Solvers assess the cost–benefit balance and only take action when they believe the benefits outweigh the costs. Therefore, rewards play a significant role in the relationship between competition and solvers’ behavior [17].
Based on the above analysis, we focus on competition factors and explore the effect of competition on participation intention and participation behavior. Considering the relative and real-time nature of the task process, we incorporate dynamic competitive factors that influence participation behavior, making up for the lack of research on dynamic factors influencing participation. Furthermore, as an important incentive mechanism, rewards can compensate for actual costs, so we investigate the moderating effect of rewards between competition and participation behavior.

2.2. Social Exchange Theory

Social Exchange Theory (SET) interprets the interaction in interpersonal relationships as an exchange process, the purpose of which is to maximize personal interests [38]. In a crowdsourcing contest, the solver submits a solution and receives a reward, which is a concrete manifestation of exchange. The costs and benefits of participating are weighed to determine whether to take actual action. When the perceived benefits outweigh the perceived costs, solvers are inclined to take risks and participate in the task; conversely, solvers refuse to take specific action. The decision-making process reflects the individual’s pursuit of maximizing benefits. During the participation process, solving problems and contributing solutions requires time and energy [39], and rewards are the best compensation for the cost of time and energy. Furthermore, solvers can only choose to participate in a limited number of tasks within a limited timeframe, making opportunity cost a crucial consideration. In summary, due to the existence of competitive factors, participants need to make more reasonable assessments based on time cost, effort cost, opportunity cost, etc., so as to maximize their interests in a dynamic environment.
Although previous studies have used social exchange theory to analyze the effect of perceptual factors on solvers’ participation behavior [40], the existence of competition increases the complexity of cost–benefit evaluation, and its effect on participation intention and participation behavior varies; empirical research and support in this area are currently lacking. Therefore, we extend existing research and explore the effect of multiple forms of competition on participation intention and participation behavior, while considering the role of rewards in the participation process. To demonstrate the contribution of this paper more clearly, a summary of related studies is presented in Table 1. First, previous studies have mostly explored solvers’ participation intention or participation behavior through surveys, without accurately distinguishing between different modes of participation. Second, analyses of influencing factors from the perspective of motivation or incentives have failed to consider cost–benefit measures. Finally, static studies have overlooked the processual nature of tasks and failed to capture the dynamic changes in influencing factors during task execution. For these reasons, we construct a two-stage empirical model based on social exchange theory and examine the effect of dynamic competition on solvers’ participation intention and participation behavior from the perspective of costs and benefits.

3. Research Framework and Hypotheses

The process of solvers participating in tasks mainly includes: task viewing, task attention, task solving, and solution submission, as shown in Figure 1. During the view stage, solvers select the tasks of interest, review contents and requirements, and then pay attention to facilitate subsequent submission. During the submission stage, solvers need to invest actual time and energy to conceive and design proposals based on the seeker’s requirements, ultimately submitting the solutions. Research by Koh et al. [42] shows that most participation behaviors in virtual communities can be divided into view and submission. View reflects solvers’ initial intention, while submission is an expression of actual action, which can reflect the difference between participation intention and participation behavior. Based on the solvers’ participation process, we select the view and submission stages as the key research objects, and explore the differences in the effect of competition on participation intention and participation behavior.
Based on the participation process and previous research, combined with social exchange theory, we established a two-stage model of view and submission to explore solvers’ participation intention and participation behavior. First, we explore the effect of dynamic competition (including external competition and internal competition) on participation intention and participation behavior. Considering the actual costs involved in submission, we further explore the moderating role of rewards in the relationship between competition and participation behavior. The research model is shown in Figure 2.

3.1. Research on the Effect of External Competition on Participation Intention and Participation Behavior

Crowdsourcing is a widely recognized and used outsourcing method. Given the advantages of crowdsourcing contest platforms, seekers with needs often choose to post tasks on the platforms to obtain solutions quickly [43]. Therefore, new tasks may start and end at any time. Since multiple tasks exist on the same platform simultaneously, solvers can choose to participate in one or more tasks, generating competition between tasks. During the view stage, viewing indicates a possibility of initial participation. In the process of reviewing task content, solvers need to spend a certain amount of time and energy on judging and screening; if there are too many tasks of the same type, viewing all ongoing tasks takes considerable time. During the submission stage, solvers need to put in real effort. Due to limited time and energy, solvers cannot participate in all ongoing tasks at the same time. Therefore, after viewing the task summaries, they compare multiple tasks and select one or more to participate in. According to social exchange theory, when faced with multiple options, solvers weigh costs and benefits [38]. Because multiple similar tasks exist simultaneously, seekers all strive to gain more attention within time constraints, leading to competition for solvers. Solvers make their choices based on a cost–benefit comparison between the target task and other tasks. Tasks whose perceived benefits exceed their costs tend to attract more solvers, leading to a decrease in participation in other tasks [17]. Therefore, we hypothesize:
Hypothesis 1a.
External competition has a negative effect on solvers’ participation intention; that is, more intense market competition leads to lower views.
Hypothesis 2a.
External competition has a negative effect on solvers’ participation behavior; that is, more intense market competition leads to lower submissions.

3.2. Research on the Effect of Internal Competition on Participation Intention and Participation Behavior

Regarding internal competition, competition among solvers can affect individuals’ expectations of winning and their enthusiasm for working hard, thereby influencing the submission of solutions [17]. Previous studies have mostly used the number of solvers to represent competitive intensity. However, existing research suggests that the perceived competition brought about by one high-capability solver can exceed that of ten low-capability solvers [17], so the number of solvers alone is not an effective measure of competitive intensity. The experience and ability of solvers convey signals of expertise or winning probability and can more effectively reflect competitive intensity [18]. Therefore, during the view stage, if potential solvers find that many experienced people have already joined a task, their participation intention decreases. By the submission stage, the chosen tasks are the most suitable options, having been judged and screened during the view stage. Internal competition within the task can therefore motivate solvers to continuously make more effort to modify and improve their solutions [5], increasing the number of submissions. In addition, intensified competition increases solvers’ motivation to work harder [5,44], so tasks that already have many participants can attract more participants later. According to social exchange theory, during the view stage, tasks with a higher proportion of experienced solvers indicate stronger competition; to avoid wasted effort, potential solvers view highly competitive tasks less. Since tasks have already been screened during the view stage, participating solvers continuously refine their proposals to provide higher-quality solutions, thereby increasing their chances of winning [45]. Therefore, we hypothesize:
Hypothesis 1b.
Internal competition has a negative effect on solvers’ participation intention; that is, stronger task competition leads to lower views.
Hypothesis 2b.
Internal competition has a positive effect on solvers’ participation behavior; that is, stronger task competition leads to more submissions.

3.3. The Moderating Effect of Relative Rewards

In crowdsourcing contests, competition cannot be analyzed in isolation and needs to be considered together with incentives [5]. Rewards are considered a crucial incentive for solvers, playing a key role in influencing participants’ motivation to compete [46,47]. During the submission stage, solvers conceive, design, modify, and refine proposals based on requirements, incurring real costs in this process. Rewards can directly compensate for costs. For external competition, solvers can only participate in a limited number of tasks due to time and capacity constraints. Therefore, solvers need to consider the opportunity cost of forgoing other tasks. When the task reward is higher than other similar tasks in the current period, solvers’ opportunity cost is reduced and the task competitiveness is enhanced, so solvers are more likely to participate. For internal competition, tasks with relatively higher rewards can motivate more effort [19]. Especially for solvers with high skill levels, they usually choose to participate in tasks with higher rewards in order to obtain more potential benefits [41]. According to social exchange theory, only when the benefits outweigh the costs will solvers choose to participate or increase investment. Task rewards are the most important tangible compensation [21]. When the task reward is higher, it not only reduces the opportunity cost of solvers but also effectively compensates for the time and energy spent by solvers. Thus, relative rewards intensify the effect of competition on participation behavior [36]. Therefore, we hypothesize:
Hypothesis 3a.
Relative rewards strengthen the negative effect of external competition on participation behavior.
Hypothesis 3b.
Relative rewards strengthen the positive effect of internal competition on participation behavior.

4. Methodology

4.1. Data

Winvk.com is one of the most mature and popular crowdsourcing contest platforms in China. Currently, winvk.com has launched more than 500,000 tasks and obtained more than 10 million solutions, becoming a key way for domestic seekers to solve problems. The main reasons for choosing the logo design task as a representative example are: (1) Logo design tasks are one of the important types supporting the development of crowdsourcing platforms, accounting for more than 78.4% [22], which can provide a large amount of data for research. (2) Logo design tasks have clear skill requirements and a relatively low skill threshold, which can motivate more solvers to participate. (3) Logo design tasks have a short average period, and data fluctuations are more obvious in a short time, which facilitates the research of dynamic factors.
In order to explore the dynamic changes in competitive factors in the task process, data from the logo design task is captured at a fixed point every day for the ongoing task. Considering that the average period of the logo design task is about seven days, which is a short-term task, we randomly selected continuous data from March to May 2021. We cleaned the obtained data based on the crawled content and experimental requirements, and eliminated invalid data, ultimately obtaining 1768 valid task data. Table 2 shows examples of the original data after structured transformation.
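To make the data structure concrete, the daily fixed-point captures described above can be stacked into a long task-day panel. The sketch below assumes a simplified snapshot layout; the task IDs, values, and column names are illustrative, not the actual crawled schema:

```python
import pandas as pd
from io import StringIO

# Hypothetical daily snapshots: in practice one crawl per day of all
# ongoing logo design tasks. Task IDs, values, and column names are
# illustrative, not the actual crawled schema.
day1 = "task_id,views,submissions\n101,10,0\n102,5,0\n"
day2 = "task_id,views,submissions\n101,18,2\n102,9,1\n103,3,0\n"

frames = []
for day, raw in enumerate([day1, day2], start=1):
    df = pd.read_csv(StringIO(raw))
    df["day"] = day  # tag each snapshot with its capture day
    frames.append(df)

# Stack snapshots into a long task-day panel. Tasks may start or end
# on any day (task 103 appears only on day 2), so the panel is unbalanced.
panel = pd.concat(frames, ignore_index=True)
print(panel)
```

Because tasks enter and exit the panel at different times, any subsequent per-task differencing has to be done within task groups rather than across the raw rows.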

4.2. Measurements

According to the research model, corresponding data are collected and processed as needed. The interpretation and measurement of the main variables are as follows.
Dependent Variable: Our research focuses on solvers’ participation intention and participation behavior. Participation intention indicates solvers’ inclination to participate in tasks and is a prerequisite for behavior. Since this study aims to explore solvers’ participation under dynamic competition conditions, we focus on the daily increase in views, which is represented by the difference between the views in period t and the views in period t − 1. Participation behavior refers to solvers’ actual involvement, which requires real time and effort. Given the lag between view and submission, we focus on the daily increase in submissions, which is represented by the difference between the submissions in period t + 1 and the submissions in period t.
Independent variables: We explore the effect of competitive factors on participation intention and participation behavior. The competitive factors include external competition and internal competition. External competition refers to the competition between tasks, represented by the number of tasks of the same type during the same period; we use the number of ongoing logo design tasks on the platform per day to measure it. Internal competition refers to competition within a single task, where a higher proportion of experienced participants indicates more intense competition. The platform categorizes solvers into Unranked, Star, Diamond, and Crown levels based on the number of participations and wins; Crown-level solvers fall in the top 25% and represent high-quality solvers. Therefore, we use the daily percentage of Crown-level solvers to measure internal competition.
Moderating variable: According to previous studies, rewards affect the role of competition. In dynamic competition, new tasks are added or completed every day. The reward amount for each task is different, and the effect of varying reward amounts on solvers is also different. Therefore, we use relative rewards to explore the effect of rewards, that is, the ratio of the reward for task i to the average reward of all ongoing tasks simultaneously.
Control variables: Existing studies have shown that the title and details of tasks can reveal the needs of seekers and the complexity of tasks, which influence the judgment and choice of solvers [48]. The number of characters in the title and details is used to represent them. In addition, new tasks are posted daily, resulting in real-time changes in the position of tasks. Due to the primacy effect and the recency effect, the position of tasks also has a certain effect on solvers’ participation. Therefore, we use the order in which tasks appear on the page to represent position, with 1 being the first and so on.
The variables and descriptive statistical analysis are shown in Table 3. Table 4 shows the correlation analysis of the variables.
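The variable constructions above (daily increases via first differences within each task, and relative rewards via a same-day mean) can be sketched with pandas on a toy panel. All column names and values below are illustrative assumptions, not the actual winvk.com data:

```python
import pandas as pd

# Hypothetical task-day panel: one row per task per capture day.
# Column names (task_id, day, views, submissions, reward) are
# illustrative, not the platform's actual schema.
panel = pd.DataFrame({
    "task_id":     [1, 1, 1, 2, 2, 2],
    "day":         [1, 2, 3, 1, 2, 3],
    "views":       [10, 18, 25, 5, 9, 12],
    "submissions": [0, 2, 5, 0, 1, 3],
    "reward":      [300, 300, 300, 500, 500, 500],
})
panel = panel.sort_values(["task_id", "day"])
g = panel.groupby("task_id")

# DV1 (participation intention): daily increase in views,
# View_{i,t} - View_{i,t-1}
panel["d_view"] = g["views"].diff()

# DV2 (participation behavior): daily increase in submissions with a
# one-period lead, Submission_{i,t+1} - Submission_{i,t}
panel["d_submission"] = g["submissions"].shift(-1) - panel["submissions"]

# Moderator: relative reward = task reward / mean reward of all
# tasks running on the same day
panel["rel_reward"] = panel["reward"] / panel.groupby("day")["reward"].transform("mean")

print(panel[["task_id", "day", "d_view", "d_submission", "rel_reward"]])
```

The within-group `diff` and `shift` keep each task's differences from bleeding across task boundaries, which matters because the panel is unbalanced.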

4.3. Model

From the perspective of the task process, and based on the differences in purpose between solvers’ participation intention and participation behavior, a two-stage model of view and submission was established. The task data collected in this paper are multidimensional and dynamic: tasks may start and end at any time, so the number of observation periods differs substantially across tasks. Based on the type and practical meaning of these dynamic data, a multivariate linear regression model was chosen for empirical analysis. Model (1) explores the relationship between dynamic competition and views, capturing the effect of competition on participation intention. Model (2) explores the effect of dynamic competition on submissions, as well as the moderating effect of relative rewards on the relationship between competition and submissions, capturing the effect of competition on participation behavior. The models are specified as follows:
View_{i,t−(t−1)} = α_0 + α_1 External_{it} + α_2 Internal_{i,t−1} + γ X_{it} + θ_{it}   (1)
Submission_{i,(t+1)−t} = β_0 + β_1 External_{it} + β_2 Internal_{it} + β_3 Reward_{it} + β_4 Reward_{it} × External_{it} + β_5 Reward_{it} × Internal_{it} + δ Z_{it} + ε_{it}   (2)
Among them, the subscript i represents the i-th task, and the subscript t represents the t-th period of the task process.
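As an illustration, a submission model of this form can be estimated by ordinary least squares with interaction terms. The sketch below fits such a model to simulated data; the data-generating coefficients merely mirror the hypothesized signs (negative for external competition and its reward interaction, positive for internal competition and its interaction) and are not the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Simulated standardized regressors (illustrative, not the paper's data)
external = rng.normal(size=n)       # number of concurrent same-type tasks
internal = rng.normal(size=n)       # share of Crown-level solvers
reward   = rng.normal(size=n)       # relative reward
controls = rng.normal(size=(n, 2))  # e.g. title length, page position

# Outcome generated with signs matching the hypotheses:
# beta1 < 0, beta2 > 0, beta4 < 0, beta5 > 0
y = (-0.6 * external + 0.35 * internal + 0.2 * reward
     - 0.35 * reward * external + 0.38 * reward * internal
     + controls @ np.array([0.1, -0.1])
     + rng.normal(scale=0.1, size=n))

# Design matrix: intercept, main effects, interaction terms, controls
X = np.column_stack([
    np.ones(n), external, internal, reward,
    reward * external, reward * internal, controls,
])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta[:6], 2))
```

With low noise and n = 200, the recovered coefficients land close to the generating values, and in particular the signs of the main effects and interactions are reproduced.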

4.4. Results

Before estimating the models, the variables were tested for multicollinearity. The VIF values of all variables were below 3, indicating no serious multicollinearity problem. The empirical analysis was then carried out; the results are shown in Table 5.
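The multicollinearity screening described above can be reproduced with a small helper: each regressor is regressed on the remaining ones, and VIF_j = 1 / (1 − R_j²). The data below are synthetic and purely illustrative:

```python
import numpy as np

def vif(X):
    """VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing
    column j on the remaining columns (with an intercept)."""
    X = np.asarray(X, dtype=float)
    n, k = X.shape
    out = []
    for j in range(k):
        y = X[:, j]
        Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ coef
        r2 = 1 - resid.var() / y.var()
        out.append(1.0 / (1.0 - r2))
    return out

# Illustrative check on weakly correlated synthetic regressors
rng = np.random.default_rng(1)
a = rng.normal(size=500)
b = rng.normal(size=500)
c = 0.3 * a + rng.normal(size=500)  # mild correlation with a

vals = vif(np.column_stack([a, b, c]))
print([round(v, 2) for v in vals])
```

With only mild correlation among regressors, every VIF stays close to 1 and well under the usual threshold of 3 (or 10), matching the screening criterion used in the text.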
Table 5 shows the hierarchical regression results for participation intention and participation behavior. Column 1 shows the effect of factors in the view stage, that is, the effect of competition on participation intention. Column (1-1) is the model containing only the control variables, while column (1-2) is the baseline model that adds all independent variables to column (1-1). The results show that external competition has no significant effect on views (α1 = −0.252, p > 0.05), which does not support Hypothesis H1a. Viewing usually means noticing tasks of interest and is a concrete manifestation of participation intention. Since logo design tasks are small, with relatively simple requirements and content, they have a short lifespan. Simply viewing therefore does not take up much time or cognitive cost, and solvers can usually complete a quick view of all ongoing tasks in a short time; an increase in the number of tasks on the platform does not affect view behavior. Consistent with Hypothesis H1b, internal competition has a negative effect on views (α2 = −0.528, p < 0.001). Crown-level solvers possess greater experience or success, indicating stronger competitiveness, and a greater number of Crown-level solvers among existing participants indicates more intense internal competition. Solvers can make decisions based on the circumstances of multiple tasks, minimizing costs and increasing their chances of winning. Therefore, tasks with more high-quality solvers attract less attention from potential solvers, negatively affecting participation intention.
Column 2 shows the effect of factors at the submission stage, that is, the effect of competition on participation behavior. Column (2-1) contains only control variables, column (2-2) is the baseline model with the independent variables added, and column (2-3) further adds interaction terms to the baseline model to explore the moderating effect of rewards. The results show that external competition has a significant negative effect on submissions (β1 = −0.664, p < 0.001), consistent with Hypothesis H2a. Although new tasks continually appear on the platform and the market remains active, the pool of solvers is relatively stable and does not surge with market activity. Therefore, the more tasks released in a given period, the more intense the market competition. To participate, solvers must analyze the seeker's needs and then conceive and complete solutions; because those needs may change during the design process, solvers must continuously refine their proposals to improve their chances of winning. With limited time and energy, solvers cannot participate in all tasks simultaneously. Intensified market competition therefore reduces the number of solvers for any single task; that is, external competition has a negative effect on participation behavior. Consistent with Hypothesis H2b, internal competition has a significant positive effect on submissions (β2 = 0.358, p < 0.01). During participation, solvers view tasks, make a preliminary assessment and screening based on basic task information, and then take specific action. Once they have selected a task, solvers expend greater effort to offset the opportunity cost of forgoing other tasks, as well as the cognitive and time costs already invested, typically revising continuously and submitting multiple solutions. Therefore, internal competition increases solvers' participation behavior.
Regarding the moderating effect of rewards, column (2-3) shows that relative rewards strengthen the negative effect of external competition on submissions (β4 = −0.347, p < 0.05), as shown in Figure 3; Hypothesis H3a is supported. Submission represents concrete participation: solvers expend time and effort during the proposal process and forgo the opportunity to earn rewards from other tasks, so they make participation decisions based on cost–benefit considerations. Rewards are the most tangible compensation for solvers' costs [17], making the reward amount a crucial factor in the decision to participate. When a task's reward is higher than that of other tasks in the same period, the task becomes more competitive. Consequently, higher rewards intensify competition, and solvers are more inclined to participate in higher-reward tasks. Consistent with Hypothesis H3b, relative rewards strengthen the positive effect of internal competition on submissions (β5 = 0.377, p < 0.01), as shown in Figure 4. According to social exchange theory, solvers expect a return commensurate with their efforts. For a single task, a higher reward better compensates solvers' costs, attracting potential solvers to compete and encouraging existing solvers to continue providing solutions.
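The moderation results in column (2-3) imply that the slope of submissions with respect to each competition measure shifts with the relative reward. A minimal sketch, assuming mean-centered interaction terms and using only the reported coefficients (all other terms omitted for illustration):

```python
# Reported coefficients from column (2-3) of Table 5.
B_EXTERNAL, B_INTERNAL = -0.664, 0.358          # main effects on submissions
B_EXT_X_REWARD, B_INT_X_REWARD = -0.347, 0.377  # interactions with relative reward

def marginal_effects(relative_reward):
    """Slope of submissions w.r.t. each competition measure at a given
    (centered) relative reward, holding everything else fixed."""
    return (B_EXTERNAL + B_EXT_X_REWARD * relative_reward,
            B_INTERNAL + B_INT_X_REWARD * relative_reward)

low = marginal_effects(-1.0)   # reward one unit below the mean
high = marginal_effects(+1.0)  # reward one unit above the mean

# Higher rewards make the negative external effect more negative (H3a) ...
assert high[0] < low[0] < 0
# ... and the positive internal effect more positive (H3b).
assert high[1] > low[1]
```

This mirrors the simple-slopes logic behind Figures 3 and 4: the reward does not change the sign of either competition effect, it amplifies both.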
All hypothesis test results are summarized in Table 6.

4.5. Robustness Tests

To verify the robustness of the results, we adjusted the model variables in several ways. First, we added important control variables. Position refers not only to the top-to-bottom arrangement on the same page but also to the front-to-back arrangement across pages. As the number of tasks increases, tasks' positions shift back one by one, and solvers must flip through the pages to view all ongoing tasks, which takes relatively more time. We therefore added the page number as a control variable: 1 represents a task displayed on the platform's first page, 2 a task on the second page, and so on. In addition, the task period has a significant effect on solvers, especially on participation behavior. Tasks with relatively longer periods give solvers sufficient time to conceive and create, whereas excessively long periods weaken the advantage of more experienced solvers. We therefore reran the analysis with the task period as an additional control variable. The results are shown in Table 7. The significance and signs of the variables remain consistent with the main analysis, demonstrating the robustness of this study.
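The page-number control described above can be derived mechanically from a task's position in the listing. A minimal sketch, assuming (hypothetically) a fixed number of tasks per page; the platform's actual page size is not reported:

```python
TASKS_PER_PAGE = 35  # assumption for illustration only

def page_of(rank):
    """1-based page on which the task at 1-based list position `rank` appears."""
    return (rank - 1) // TASKS_PER_PAGE + 1

# Tasks within the first page share page number 1; the next block shifts to page 2.
assert page_of(1) == 1
assert page_of(35) == 1
assert page_of(36) == 2
```

As tasks accumulate, a given task's rank (and hence its page number) grows, which is exactly why the page number captures the extra search time solvers face.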
Second, we changed the measurement of the dependent variable. Besides daily increased submissions, total daily submissions can also represent solvers' participation behavior. We therefore used total daily submissions instead of daily increased submissions in the empirical analysis. The results, shown in Table 8, are consistent with the main analysis, further demonstrating the robustness of the results.
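The relationship between the two dependent-variable measures is a simple first-difference one: daily increased submissions are the day-over-day change in total submissions. A small illustration with made-up numbers:

```python
# Illustrative cumulative submissions for one task across five days.
totals = [3, 7, 12, 12, 18]

# Daily increased submissions: first day's total, then day-over-day differences.
increments = [totals[0]] + [b - a for a, b in zip(totals, totals[1:])]

assert increments == [3, 4, 5, 0, 6]
assert sum(increments) == totals[-1]  # increments always re-sum to the final total
```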
Third, we changed the measurement method of the moderator variable. Previous studies have mainly used the amount of reward to explore the effect of competition. To demonstrate the generalizability of our results, we replaced the relative reward with the task reward and conducted an experiment. The results are shown in Table 9. These results are consistent with the main model and provide strong support for our theoretical framework.
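To make the two reward measures concrete, one plausible construction of the relative reward is a task's reward divided by the average reward of concurrently running tasks; the paper does not restate its exact formula, so the helper below is an assumption for illustration (the reward figures come from the Table 2 examples):

```python
def relative_reward(task_reward, concurrent_rewards):
    """A task's reward relative to the mean reward of same-period tasks
    (assumed construction; >1 means an above-average reward)."""
    return task_reward / (sum(concurrent_rewards) / len(concurrent_rewards))

rewards = [500, 100, 100]  # rewards of concurrent tasks, as in Table 2

r = relative_reward(500, rewards)     # high-reward task
assert abs(r - 500 * 3 / 700) < 1e-9  # 500 / (700/3)
assert r > 1                          # above-average reward
assert relative_reward(100, rewards) < 1
```

The robustness check simply swaps this ratio for the raw task reward, which is why consistent results across Table 9 and the main model suggest the findings do not hinge on the normalization.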
Finally, we excluded part of the data. Holidays occurred during the data collection period. Holidays reduce the number of tasks released by seekers, while giving solvers more time to decide whether to participate; they therefore affect both sides. For these reasons, we removed holiday-period observations and re-ran the analysis. The results, shown in Table 10, are consistent with the main model, confirming robustness from the sample perspective.
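The holiday exclusion can be implemented as a simple filter over the daily panel. A sketch with hypothetical field names and illustrative dates (the actual holiday calendar and data schema are not reported):

```python
import datetime as dt

# Illustrative holiday dates to exclude.
HOLIDAYS = {dt.date(2024, 10, 1), dt.date(2024, 10, 2)}

# Hypothetical task-day panel records.
panel = [
    {"task": 1, "date": dt.date(2024, 9, 30), "submissions": 4},
    {"task": 1, "date": dt.date(2024, 10, 1), "submissions": 1},  # holiday, dropped
    {"task": 2, "date": dt.date(2024, 10, 3), "submissions": 6},
]

# Keep only non-holiday observations before re-estimating the models.
filtered = [row for row in panel if row["date"] not in HOLIDAYS]
assert len(filtered) == 2
assert all(row["date"] not in HOLIDAYS for row in filtered)
```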

5. Discussion and Implications

5.1. Discussion

The key to improving the performance of crowdsourcing contests is attracting enough solvers. Previous studies hold that participation intention is a prerequisite for participation behavior and can directly predict its occurrence, so many scholars study participation intention or continued participation intention as a stand-in for participation behavior. However, there is a discrepancy between the two [16]. Participation intention is merely an expression of preference, requires no actual cost, and may not fully translate into action. Participation behavior is a genuine expression of action that involves expending time, energy, and other costs, and can therefore reflect the compensating effect of rewards. Participation intention thus cannot directly substitute for participation behavior.
For the above reasons, based on social exchange theory, we constructed a two-stage model of view and submission to explore the effect of dynamic competition on solvers' participation, and discussed the moderating effect of relative rewards on the relationship between competition and participation behavior. The results show that external competition has no significant effect on views. First, logo design tasks are typically straightforward, and viewing simply involves reviewing the task content, which does not take much time. Second, the number of ongoing tasks in the same period is relatively small, and more than 90% are concentrated on the first two pages, so viewing all ongoing tasks does not divert much of solvers' attention. Given this relatively low burden, the number of tasks does not significantly affect solvers' participation intention. However, internal competition has a significant negative effect on participation intention. A higher proportion of crown-level solvers indicates more intense competition for the task and a higher risk of failure for entrants. To avoid this risk in advance, solvers reduce their viewing of such tasks.
During the submission stage, external competition has a significant negative effect on submissions. Submission entails concrete action: solvers must understand the task details, analyze the seeker's requirements, and complete and submit proposals. Because these actions carry substantial cognitive and time costs, solvers are unlikely to participate in all ongoing tasks simultaneously, which inevitably creates competition for solvers between tasks. Therefore, external competition has a negative effect on participation behavior. Internal competition, by contrast, has a positive effect on submissions. As previous research has shown, intention is a prerequisite for behavior [30], so actual action can be considered a further expression of participation intention. In other words, during the view stage, solvers have already screened tasks and chosen which to join. To increase their chances of winning, solvers then continuously improve their work or submit multiple types of solutions according to the seeker's needs [45]. Therefore, internal competition encourages more participation behavior.
The existence of rewards exacerbates the effect of competition. Rewards are direct financial compensation for costs and are also the ultimate goal of solvers’ participation in the task. For external competition, higher rewards mean that the opportunity cost of giving up other tasks can be better compensated, thereby gaining more attention. For internal competition, tasks with higher rewards encourage the participation of potential solvers and motivate existing solvers to continuously submit new proposals.
In summary, there are significant differences in the effect of competition on views and submissions, and intention is not always converted into behavior. Therefore, using participation intentions directly instead of participation behavior is biased.

5.2. Theoretical Implications

This study provides a theoretical basis for exploring the participation behavior of solvers. First, we accurately depict participation intention and participation behavior, using actual transaction data to demonstrate the difference between the two, enriching the research content in the field of crowdsourcing. Previous studies consider participation intention to be a necessary condition for participation behavior and usually use participation intention as a substitute for participation behavior [37]. Research usually focuses on participation intention or continued participation intention. However, research in areas such as consumer behavior and green innovation finds that participation intention does not fully translate into participation behavior [31,32]. Although the difference between participation intention and participation behavior is recognized [49], due to limitations in scenarios and measurement methods, there is currently a lack of empirical research on the distinction between the two in the crowdsourcing field. Considering the dynamic nature of crowdsourcing contests, we obtain real-time dynamic data to explore the effect of competition on views and submissions. View is an expression of participation tendency and can be considered a proxy for participation intention, while submission is the actual participation behavior. The results show that there are significant differences in the effect of competition on views and submissions. Therefore, the previous method of using participation intention instead of participation behavior is biased to a certain extent.
Second, based on the cost–benefit perspective of social exchange theory, we analyze the effect of competition on participation intention and participation behavior, providing theoretical support for a more comprehensive analysis of solvers' behavior. Previous studies focus primarily on the effect of motivation and incentives on solvers' participation [50,51]. Although Ye and Kankanhalli [40] used social exchange theory to analyze perceived factors such as reputation and hedonic benefits, as well as cognitive and knowledge losses, data obtained through questionnaires cannot capture the costs and benefits incurred in the actual participation process. Since competition risk affects whether intention is transformed into behavior, we comprehensively consider the costs and benefits of the real participation process and explore the effect of competition on participation intention and participation behavior. Our study suggests that during the submission stage, solvers consider not only the time and effort invested but also the opportunity cost of forgoing other projects [9]. Rewards are the best compensation for all these costs, and the participation process reflects the trade-off between them. In summary, social exchange theory can more comprehensively explain solvers' participation behavior.
Finally, our study enriches the application of social exchange theory. Social exchange theory has been primarily used to understand and analyze knowledge-sharing and knowledge-seeking behaviors in networked societies [52,53,54], explaining the dual effects of costs and benefits in social interactions. Unlike social interactions, reciprocity in crowdsourcing contests cannot be guaranteed, because not all submitted proposals are ultimately accepted. In most cases, only one proposal can receive the reward, and the costs paid by other participants cannot be compensated. Although previous studies have explored the effect of perception factors on participation behavior from the perspective of costs and benefits [40], the data obtained through surveys cannot effectively represent behavioral choices in real environments because they do not generate real costs and benefits. Therefore, the research content and methods have certain limitations. Based on previous research, we use actual data to clarify the measurement of benefits and costs, and consider the effect of competition factors on participation intention and participation behavior, enriching the content of social exchange theory.

5.3. Practical Implications

This study provides practical suggestions for seekers and crowdsourcing platforms. First, surveys of participation intention can serve as a preliminary step in researching participation behavior, but their results still need to be adjusted to the actual situation. Previous studies hold that intention is a necessary condition for behavior, so scholars use participation intention as a substitute for participation behavior, trying to understand actual behavior by exploring intention [32]. Our results also confirm that view behavior has a significant positive effect on submission behavior. Seekers can therefore use measures of participation intention to form an initial picture of solvers' likely participation. However, our study also demonstrates a discrepancy between participation intention and participation behavior: external and internal competition affect views and submissions differently. Research on participation intention can thus only serve as a preliminary step before studying actual behavior, helping seekers and platforms gauge the potential effect of specific measures on solvers. Accurately understanding participation behavior requires actual participation data.
Second, seekers should consider both the reward and the competition surrounding a task and set an appropriate reward amount. According to social exchange theory, solvers weigh potential benefits against actual costs. Viewing only provides a general understanding of the task requirements; it incurs no actual cost and earns no monetary reward. Submitting a proposal, by contrast, requires solvers to analyze the task and craft content to suit the seeker's tastes and preferences, incurring real cognitive, time, and opportunity costs, which rewards can compensate. Regarding the amount, seekers need to consider the rewards set for other similar tasks. A seeker who wants to obtain more solutions quickly should set a relatively high reward within a reasonable range to enhance the task's competitiveness. Market competition also matters: since the number of solvers is relatively stable within a given period, more concurrent tasks only divide solvers' attention and reduce participation. Seekers should therefore weigh rewards against competition and avoid entering the platform when the market is flooded with tasks. If the need is urgent, increasing the reward amount is a feasible solution; the amount can be benchmarked against the average reward of same-period tasks to attract as many solvers as possible.
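The closing suggestion, pricing a new task relative to the average reward of same-period tasks, can be expressed as a small heuristic; the markup factor below is purely illustrative, and the reward figures echo the Table 2 examples:

```python
def suggested_reward(current_rewards, markup=1.2):
    """Price a new task at the average reward of currently running tasks
    times an illustrative markup, so it stands out in its period."""
    avg = sum(current_rewards) / len(current_rewards)
    return round(avg * markup)

# With concurrent rewards of 500, 100, and 100 (mean 700/3), a 20% markup
# suggests a posting of 280.
assert suggested_reward([500, 100, 100]) == 280
assert suggested_reward([100, 100], markup=1.0) == 100  # no markup: match the mean
```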
Finally, the platform needs to comprehensively consider seekers and solvers, reduce service costs, and increase the benefits of solvers. The purpose of the crowdsourcing platform is to help seekers more easily obtain better solutions. Therefore, attracting more attention from solvers and ensuring their participation are concerns for platform managers. Rewards are the best way to compensate for time and effort, and they are also an important factor in motivating solvers to continue participating [18]. When seekers post tasks, they are required to pay 20% of the total rewards as a platform service fee. The winning solvers ultimately receive only 80% of the total rewards. Therefore, platforms can consider adopting diversified charging methods for different tasks and amounts [55]. This form can effectively reduce costs for seekers and increase profits for solvers, thereby encouraging more solvers to continue participating and maintaining platform activity.
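The fee split described above is straightforward arithmetic: with a 20% service fee, the winning solver ultimately receives 80% of the posted reward. A minimal sketch:

```python
FEE_RATE = 0.20  # platform service fee, per the text

def payout(total_reward):
    """Winner's net payout and the platform's fee for a posted reward."""
    fee = total_reward * FEE_RATE
    return total_reward - fee, fee

# For a 500-unit reward, the winner nets 400 and the platform keeps 100.
net, fee = payout(500)
assert net == 400.0 and fee == 100.0
```

Diversified fee schedules, as suggested, would amount to making `FEE_RATE` a function of task type and amount rather than a constant.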

6. Limitations and Future Directions

This study has certain limitations, but it also provides ideas for future research. First, the crowdsourcing contest platform we selected has characteristics such as financial rewards, time constraints, and third-party hosting. The logo design task also requires unique professional skills. However, the complexity and required skills of different tasks on different platforms vary. Therefore, future research can be expanded to other platforms and markets to verify the generalizability of this study.
Second, our study focuses on a single platform, and the market competition measure is derived only from that platform. However, the crowdsourcing market contains multiple similar platforms, and solvers may participate in tasks on several platforms simultaneously, so our measurement of market competition is incomplete. Future research could control for the influence of other platforms to further explore the effect of market competition on solvers' participation behavior.
Finally, the data exhibit periodic fluctuations across quarters, holidays, and other time points. In the future, we hope to collect more data to rule out changes in solvers' behavior driven by such temporal fluctuations and further verify the robustness of the results.

Author Contributions

Conceptualization, X.L. and X.H.; methodology, X.L. and X.F.; validation, X.L. and X.F.; formal analysis, X.L. and X.H.; investigation, X.L. and Z.P.; data curation, X.L. and Z.P.; writing—original draft preparation, X.L.; writing—review and editing, X.L. and X.H.; funding acquisition, X.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Social Science Foundation of China (20BGL287).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors on request.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Deng, X.; Joshi, K.D.; Galliers, R.D. The duality of empowerment and marginalization in microtask crowdsourcing. MIS Q. 2016, 40, 279–302.
2. Hossain, M.; Islam, K.Z. Ideation through online open innovation platform: Dell IdeaStorm. J. Knowl. Econ. 2015, 6, 611–624.
3. Piezunka, H.; Dahlander, L. Distant search, narrow attention: How crowding alters organizations’ filtering of suggestions in crowdsourcing. Acad. Manag. J. 2015, 58, 856–880.
4. Ye, H.J.; Feng, Y.; Choi, B.C. Understanding knowledge contribution in online knowledge communities: A model of community support and forum leader support. Electron. Commer. Res. Appl. 2015, 14, 34–45.
5. Dissanayake, I.; Yasar, M.; Vaghefi, M.S.; Nerur, S.P. Driving the innovation race: Effect of competitiveness in crowdsourcing contest. J. Manag. Inform. Syst. 2024, 41, 1142–1172.
6. Mridha, S.K.; Bhattacharyya, M. A network based mechanism for managing decomposable tasks via crowdsourcing. Electron. Commer. Res. 2018, 18, 869–881.
7. Yang, Y.; Dong, C.; Yao, X.; Lee, P.K.; Cheng, T.C.E. Improving the effectiveness of social media-based crowdsourcing innovations: Roles of assurance mechanism and innovator’s behaviour. Ind. Manag. Data Syst. 2021, 121, 478–497.
8. Cardamone, E.; Marozzo, V.; Miceli, G.N.; Raimondo, M.A. Co-creating through win and quick: The role of type of contest and constraints on creativity. J. Knowl. Econ. 2023, 14, 4449–4465.
9. Chen, P.Y.; Pavlou, P.; Wu, S.; Yang, Y. Attracting high-quality contestants to contest in the context of crowdsourcing contest platform. Prod. Oper. Manag. 2021, 30, 1751–1771.
10. Boudreau, K.J.; Lacetera, N.; Lakhani, K.R. Incentives and problem uncertainty in innovation contests: An empirical analysis. Manag. Sci. 2011, 57, 843–863.
11. Renard, D.; Davis, J.G. Social interdependence on crowdsourcing platforms. J. Bus. Res. 2019, 103, 186–194.
12. Alalwan, A.A. Mobile food ordering apps: An empirical study of the factors affecting customer e-satisfaction and continued intention to reuse. Int. J. Inf. Manag. 2020, 50, 28–44.
13. Liu, Y.; Liu, Y. The effect of workers’ justice perception on continuance participation intention in the crowdsourcing market. Internet Res. 2019, 29, 1485–1508.
14. Wu, W.; Yang, Q.; Gong, X.; Davison, R.M. Understanding sustained participation in crowdsourcing platforms: The role of autonomy, temporal value, and hedonic value. Inf. Technol. People 2023, 36, 734–757.
15. Kong, F.; Zhao, L.; Tsai, C.H. The relationship between entrepreneurial intention and action: The effects of fear of failure and role model. Front. Psychol. 2020, 11, 229.
16. Zhang, B.; Lai, K.H.; Wang, B.; Wang, Z. From intention to action: How do personal attitudes, facilities accessibility, and government stimulus matter for household waste sorting? J. Environ. Manag. 2019, 233, 447–458.
17. Li, D.; Hu, L. Exploring the effects of reward and competition intensity on participation in crowdsourcing contests. Electron. Mark. 2017, 27, 199–210.
18. Liu, T.X.; Yang, J.; Adamic, L.A.; Chen, Y. Crowdsourcing with all-pay auctions: A field experiment on taskcn. Manag. Sci. 2014, 60, 2020–2037.
19. Wang, M.M. Strategically reward solvers in crowdsourcing contests: The role of seeker feedback. Behav. Inf. Technol. 2022, 41, 3124–3137.
20. Tekic, A.; Pacheco, D.V.A. Contest design and solvers’ engagement behaviour in crowdsourcing: The neo-configurational perspective. Technovation 2024, 132, 102986.
21. Wang, M.M.; Wang, J.J.; Zhang, W.N. How to enhance solvers’ continuance intention in crowdsourcing contest: The role of interactivity and fairness perception. Online Inf. Rev. 2020, 44, 238–257.
22. Zheng, H.; Li, D.; Hou, W. Task design, motivation, and participation in crowdsourcing contests. Int. J. Electron. Commer. 2011, 15, 57–88.
23. Hashem, G. Examining intention to use crowd logistics: A synergy of unified theory of acceptance and use of technology and social exchange theory. Sustain. Futures 2025, 9, 100687.
24. Gao, L.; Wang, S.; Li, J.; Li, H. Application of the extended theory of planned behavior to understand individual’s energy saving behavior in workplaces. Resour. Conserv. Recycl. 2017, 127, 107–113.
25. Xiao, L.; Zhang, G.; Zhu, Y.; Lin, T. Promoting public participation in household waste management: A survey based method and case study in Xiamen city, China. J. Clean Prod. 2017, 144, 313–322.
26. Alam, S.L.; Campbell, J. Temporal motivations of volunteers to participate in cultural crowdsourcing work. Inf. Syst. Res. 2017, 28, 744–759.
27. Heo, M.; Toomey, N. Motivating continued knowledge sharing in crowdsourcing: The impact of different types of visual feedback. Online Inf. Rev. 2015, 39, 795–811.
28. Piezunka, H.; Dahlander, L. Idea rejected, tie formed: Organizations’ feedback on crowdsourced ideas. Acad. Manag. J. 2019, 62, 503–530.
29. Wang, M.M.; Wang, J.J. Understanding solvers’ continuance intention in crowdsourcing contest platform: An extension of expectation-confirmation model. J. Theor. Appl. Electron. Commer. Res. 2019, 14, 17–33.
30. Locke, E.A.; Latham, G.P. Building a practically useful theory of goal setting and task motivation: A 35-year odyssey. Am. Psychol. 2002, 57, 705.
31. Wang, S.; Wang, J.; Yang, S.; Li, J.; Zhou, K. From intention to behavior: Comprehending residents’ waste sorting intention and behavior formation process. Waste Manag. 2020, 113, 41–50.
32. Rausch, T.M.; Kopplin, C.S. Bridge the gap: Consumers’ purchase intention and behavior regarding sustainable clothing. J. Clean Prod. 2021, 278, 123882.
33. Czajkowski, M.; Kądziela, T.; Hanley, N. We want to sort! Assessing households’ preferences for sorting waste. Resour. Energy Econ. 2014, 36, 290–306.
34. Feller, J.; Finnegan, P.; Hayes, J.; O’Reilly, P. ‘Orchestrating’ sustainable crowdsourcing: A characterisation of solver brokerages. J. Strateg. Inf. Syst. 2012, 21, 216–232.
35. Krishnan, V.; Ulrich, K.T. Product development decisions: A review of the literature. Manag. Sci. 2001, 47, 1–21.
36. Shao, B.; Shi, L.; Xu, B.; Liu, L. Factors affecting participation of solvers in crowdsourcing: An empirical study from China. Electron. Mark. 2012, 22, 73–82.
37. Sun, Y.; Fang, Y.; Lim, K.H. Understanding sustained participation in transactional virtual communities. Decis. Support Syst. 2012, 53, 12–22.
38. Horng, S.M.; Wu, C.L. How behaviors on social network sites and online social capital influence social commerce intentions. Inf. Manag. 2020, 57, 103176.
39. Afuah, A.; Tucci, C.L. Crowdsourcing as a solution to distant search. Acad. Manag. Rev. 2012, 37, 355–375.
40. Ye, H.J.; Kankanhalli, A. Solvers’ participation in crowdsourcing platforms: Examining the impacts of trust, and benefit and cost factors. J. Strateg. Inf. Syst. 2017, 26, 101–117.
41. Wu, W.; Gong, X.; Yang, Q. Role of motivations, self-regulations, and perceived competitive intensity in solvers’ continuance intention in crowdsourcing contests. Behav. Inf. Technol. 2023, 42, 2152–2175.
42. Koh, J.; Kim, Y.G.; Butler, B.; Bock, G.W. Encouraging participation in virtual communities. Commun. ACM 2007, 50, 68–73.
43. Zou, T.; Cao, Z. Momentum effects in crowdsourcing contest platforms. Inf. Manag. 2025, 62, 104206.
44. Körpeoğlu, E.; Cho, S.-H. Incentives in contests with heterogeneous solvers. Manag. Sci. 2018, 64, 2709–2715.
45. Jiang, Z.; Huang, Y.; Beil, D.R. The role of feedback in dynamic crowdsourcing contests: A structural empirical analysis. Manag. Sci. 2022, 68, 4858–4877.
46. Steils, N.; Hanine, S. Recruiting valuable participants in online IDEA generation: The role of brief instructions. J. Bus. Res. 2019, 96, 14–25.
47. Zhang, X.; Xia, E.; Shen, C.; Su, J. Factors influencing solvers’ behaviors in knowledge-intensive crowdsourcing: A systematic literature review. J. Theor. Appl. Electron. Commer. Res. 2022, 17, 1297–1319.
48. Liu, X.; Hao, X. Do dynamic signals affect high-quality solvers’ participation behavior? Evidence from the crowdsourcing platform. J. Theor. Appl. Electron. Commer. Res. 2024, 19, 561–580.
49. Sheeran, P.; Webb, T.L. The intention-behavior gap. Soc. Personal. Psychol. Compass 2016, 10, 503–518.
50. Bakici, T. Comparison of crowdsourcing platforms from social-psychological and motivational perspectives. Int. J. Inf. Manag. 2020, 54, 102121.
51. Xu, H.; Wu, Y.; Hamari, J. What determines the successfulness of a crowdsourcing campaign: A study on the relationships between indicators of trustworthiness, popularity, and success. J. Bus. Res. 2022, 139, 484–495.
52. Dissanayake, I.; Nerur, S.; Wang, J.; Yasar, M.; Zhang, J. The impact of helping others in coopetitive crowdsourcing communities. J. Assoc. Inf. Syst. 2021, 22, 7.
53. Phang, C.W.; Kankanhalli, A.; Sabherwal, R. Usability and sociability in online communities: A comparative study of knowledge seeking and contribution. J. Assoc. Inf. Syst. 2009, 10, 2.
54. Ye, H.J.; Kankanhalli, A. Investigating the antecedents of organizational task crowdsourcing. Inf. Manag. 2015, 52, 98–110.
55. Fang, D.; Noe, T.; Strack, P. Turning up the heat: The discouraging effect of competition in contests. J. Polit. Econ. 2020, 128, 1940–1975.
Figure 1. Solvers’ participation in the task process.
Figure 2. Research Model.
Figure 3. The moderating effect of reward on the relationship between external competition and participation behavior.
Figure 4. The moderating effect of reward on the relationship between internal competition and participation behavior.
Table 1. Summary of related research.
| Author | Theory | Subjects | Data | Data Type |
|---|---|---|---|---|
| Liu et al. [18] | Motivation theory | Participation behavior; submission quality | Field experiments | Static |
| Ye & Kankanhalli [40] | Social exchange theory | Participation behavior | Electronic questionnaire | Static |
| Li & Hu [17] | Expected value theory | Participation behavior | Secondary data | Static |
| Wang et al. [21] | Self-determination theory; social exchange theory | Continuance intention | Questionnaire | Static |
| Yang et al. [7] | Uncertainty reduction theory | Seeker’s retention | Secondary data | Static |
| Wu et al. [41] | Self-determination theory | Continuance intention | Questionnaire | Static |
| Tekic & Pacheco [20] | Social exchange theory | Solvers’ attraction; solvers’ participation | Secondary data | Static |
Table 2. Task data examples.

| Variable | Logo Design | E-Commerce Technology Logo Design | Speed Selection |
|---|---|---|---|
| Reward | 500 | 100 | 100 |
| Distribution | One winner | One winner | Multiple winners |
| Period | 7 | 6 | 6 |
| Detail | 127 | 33 | 25 |
| View | 355 | 82 | 260 |
| Submission | 58 | 21 | 25 |
| Participation | 46 | 16 | 21 |
| Task status | Join now | Join now | Successful conclusion |
Table 3. Variables and descriptive statistics.

| Variable Type | Variable | Min | Max | Mean | SD |
|---|---|---|---|---|---|
| Dependent variables | Submission | 0 | 33 | 2.18 | 3.504 |
| | View | 1 | 454 | 42.11 | 57.282 |
| Independent variables | External | 52 | 84 | 74.65 | 7.737 |
| | Internal | 0 | 1 | 0.58 | 0.15 |
| Moderating variable | Reward | 0.26 | 6.59 | 1.04 | 1.03 |
| Control variables | Title | 3 | 33 | 9.54 | 4.628 |
| | Detail | 21 | 980 | 121.62 | 229.663 |
| | Position | 1 | 8 | 2.12 | 1.208 |
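Descriptive statistics of this form (min, max, mean, and sample standard deviation per variable) are straightforward to reproduce. The following is a minimal illustrative sketch with made-up values; the column names echo Table 3 but none of the numbers are the paper’s data.

```python
import pandas as pd

# Hypothetical task-day records; values are illustrative only,
# not the dataset analyzed in the paper.
df = pd.DataFrame({
    "Submission": [0, 2, 5, 1, 3],
    "View": [12, 40, 90, 25, 60],
    "Reward": [0.26, 1.0, 6.59, 0.8, 1.2],
})

# Min/Max/Mean/SD as reported in Table 3 (pandas `std` is the
# sample standard deviation, ddof=1).
stats = df.agg(["min", "max", "mean", "std"]).T.round(3)
print(stats)
```

Each row of `stats` then corresponds to one row of a Table 3-style summary.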
Table 4. Correlation analysis of variables.

| Variable | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 |
|---|---|---|---|---|---|---|---|---|
| 1 Submission | 1 | | | | | | | |
| 2 View | 0.577 ** | 1 | | | | | | |
| 3 External | −0.142 ** | −0.101 ** | 1 | | | | | |
| 4 Internal | 0.026 | −0.096 ** | 0.016 | 1 | | | | |
| 5 Reward | 0.363 ** | 0.331 ** | −0.006 | 0.033 | 1 | | | |
| 6 Title | 0.025 | −0.024 | 0.025 | −0.038 | 0.091 ** | 1 | | |
| 7 Detail | 0.199 ** | 0.067 ** | −0.059 * | 0.162 ** | 0.336 ** | 0.123 ** | 1 | |
| 8 Position | −0.356 ** | −0.644 ** | 0.089 ** | −0.054 * | 0.050 * | −0.042 | 0.145 ** | 1 |

Note: ** p < 0.01, * p < 0.05.
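A starred Pearson correlation of the kind shown in Table 4 can be computed pairwise; here is a small sketch on simulated data (the two series are hypothetical stand-ins, not the paper’s View and Submission variables).

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

# Illustrative correlated series standing in for two study variables.
view = rng.normal(size=200)
submission = 0.6 * view + rng.normal(size=200)

def starred(x, y):
    """Pearson r formatted with the significance stars used in Table 4."""
    r, p = pearsonr(x, y)
    stars = "**" if p < 0.01 else "*" if p < 0.05 else ""
    return f"{r:.3f}{stars}"

print(starred(view, submission))
```

Applying `starred` to every variable pair yields the lower triangle of a Table 4-style matrix.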
Table 5. Results of two-stage regression analysis.

| Variable | View (1-1) | View (1-2) | Submission (2-1) | Submission (2-2) | Submission (2-3) |
|---|---|---|---|---|---|
| External_it | | −0.252 (0.176) | | −0.630 *** (0.144) | −0.664 *** (0.144) |
| Internal_it | | −0.528 *** (0.109) | | 0.252 * (0.109) | 0.358 ** (0.115) |
| View_i,t−(t−1) | | | | 0.354 *** (0.022) | 0.353 *** (0.022) |
| Reward_it | | | | 0.144 *** (0.018) | 0.153 *** (0.018) |
| External_it × Reward_it | | | | | −0.347 * (0.157) |
| Internal_it × Reward_it | | | | | 0.377 ** (0.144) |
| Title_i | −0.185 *** (0.045) | −0.188 *** (0.045) | −0.045 (0.042) | 0.015 (0.037) | 0.017 (0.037) |
| Detail_i | 0.133 *** (0.014) | 0.143 *** (0.014) | 0.153 *** (0.013) | 0.063 *** (0.012) | 0.059 *** (0.012) |
| Position_it | −0.610 *** (0.016) | −0.619 *** (0.016) | −0.276 *** (0.015) | −0.052 ** (0.019) | −0.048 * (0.019) |
| Constant | 4.312 (0.112) | 5.702 *** (0.763) | 0.840 *** (0.104) | 1.889 ** (0.638) | 1.971 ** (0.637) |
| N | 1768 | 1768 | 1768 | 1768 | 1768 |
| R² | 0.446 | 0.454 | 0.191 | 0.390 | 0.394 |
| ΔR² | 0.445 | 0.452 | 0.190 | 0.388 | 0.390 |
| F | 472.811 *** | 292.801 *** | 139.013 *** | 160.646 *** | 126.714 *** |

Note: Standard errors are reported in parentheses; *** p < 0.001, ** p < 0.01, * p < 0.05.
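The hierarchical structure of the submission models (main effects first, then the competition × reward interactions, comparing R² across steps) can be sketched with ordinary least squares. The data below are simulated and all variable names are assumptions; this illustrates the estimation pattern, not the paper’s results.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Simulated stand-ins for the study variables (names are assumptions).
external = rng.normal(size=n)
internal = rng.normal(size=n)
reward = rng.normal(size=n)
submission = (-0.6 * external + 0.3 * internal + 0.15 * reward
              + 0.35 * internal * reward + rng.normal(size=n))

def ols_r2(X_cols, y):
    """Fit OLS with an intercept via least squares; return (coefficients, R²)."""
    X = np.column_stack([np.ones(len(y)), *X_cols])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    r2 = 1 - resid.var() / y.var()
    return beta, r2

# Step 1 (cf. model 2-2): main effects only.
_, r2_main = ols_r2([external, internal, reward], submission)
# Step 2 (cf. model 2-3): add the competition × reward interactions.
beta, r2_full = ols_r2([external, internal, reward,
                        external * reward, internal * reward], submission)

print(f"R2 gain from interactions: {r2_full - r2_main:.3f}")
```

The change in R² between the two steps plays the role of the ΔR² comparison across model columns, and the interaction coefficients correspond to the moderation terms in model 2-3.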
Table 6. Hypothesis test results.

| No. | Research Hypothesis | Result |
|---|---|---|
| H1a | External competition has a negative effect on solvers’ participation intention; that is, more intense market competition leads to lower views. | Not supported |
| H2a | External competition has a negative effect on solvers’ participation behavior; that is, more intense market competition leads to lower submissions. | Supported |
| H1b | Internal competition has a negative effect on solvers’ participation intention; that is, stronger task competition leads to lower views. | Supported |
| H2b | Internal competition has a positive effect on solvers’ participation behavior; that is, stronger task competition leads to more submissions. | Supported |
| H3a | Relative rewards strengthen the negative effect of external competition on participation behavior. | Supported |
| H3b | Relative rewards strengthen the positive effect of internal competition on participation behavior. | Supported |
Table 7. Regression analysis with added control variables.

| Variable | View (1-1) | View (1-2) | Submission (2-1) | Submission (2-2) | Submission (2-3) |
|---|---|---|---|---|---|
| External_it | | −0.273 (0.176) | | −0.628 *** (0.144) | −0.663 *** (0.144) |
| Internal_it | | −0.517 *** (0.109) | | 0.251 * (0.110) | 0.357 ** (0.115) |
| View_i,t−(t−1) | | | | 0.355 *** (0.022) | 0.353 *** (0.022) |
| Reward_it | | | | 0.144 *** (0.019) | 0.153 *** (0.019) |
| External_it × Reward_it | | | | | −0.347 * (0.157) |
| Internal_it × Reward_it | | | | | 0.377 ** (0.144) |
| Title_i | −0.198 *** (0.046) | −0.200 *** (0.045) | −0.041 (0.042) | 0.017 (0.038) | 0.018 (0.038) |
| Detail_i | 0.143 *** (0.015) | 0.152 *** (0.015) | 0.150 *** (0.014) | 0.062 *** (0.013) | 0.058 *** (0.013) |
| Position_it | −0.590 *** (0.019) | −0.599 *** (0.019) | −0.282 *** (0.017) | −0.053 ** (0.020) | −0.049 * (0.020) |
| Period_i | −0.086 * (0.039) | −0.081 * (0.039) | 0.026 (0.036) | 0.008 (0.032) | 0.004 (0.032) |
| Constant | 4.457 *** (0.129) | 5.921 *** (0.770) | 0.796 *** (0.120) | 1.864 ** (0.646) | 1.959 ** (0.645) |
| N | 1768 | 1768 | 1768 | 1768 | 1768 |
| R² | 0.447 | 0.455 | 0.192 | 0.390 | 0.394 |
| ΔR² | 0.446 | 0.453 | 0.190 | 0.387 | 0.390 |
| F | 356.633 *** | 245.200 *** | 104.363 *** | 140.498 *** | 113.980 *** |

Note: Standard errors are reported in parentheses; *** p < 0.001, ** p < 0.01, * p < 0.05.
Table 8. Regression analysis with the substitution of the dependent variable.

| Variable | View (1-1) | View (1-2) | Submission (2-1) | Submission (2-2) | Submission (2-3) |
|---|---|---|---|---|---|
| External_it | | −0.255 (0.132) | | −0.392 *** (0.109) | −0.411 *** (0.109) |
| Internal_it | | −1.328 *** (0.082) | | 0.149 (0.092) | 0.232 * (0.098) |
| View_i,t−(t−1) | | | | 0.577 *** (0.024) | 0.586 *** (0.025) |
| Reward_it | | | | 0.173 *** (0.015) | 0.174 *** (0.015) |
| External_it × Reward_it | | | | | −0.009 (0.119) |
| Internal_it × Reward_it | | | | | 0.276 * (0.110) |
| Title_i | −0.230 *** (0.037) | −0.240 *** (0.034) | −0.023 (0.039) | 0.088 ** (0.029) | 0.090 ** (0.029) |
| Detail_i | 0.072 *** (0.012) | 0.098 *** (0.011) | 0.053 *** (0.013) | −0.029 ** (0.010) | −0.032 ** (0.010) |
| Period_i | 0.363 *** (0.027) | 0.352 *** (0.025) | 0.189 *** (0.029) | −0.054 * (0.022) | −0.055 * (0.022) |
| Constant | 4.995 *** (0.104) | 6.845 *** (0.579) | 2.819 *** (0.110) | 1.640 ** (0.508) | 1.628 ** (0.508) |
| N | 1768 | 1768 | 1768 | 1768 | 1768 |
| R² | 0.169 | 0.279 | 0.050 | 0.509 | 0.511 |
| ΔR² | 0.168 | 0.277 | 0.049 | 0.507 | 0.508 |
| F | 119.718 *** | 136.688 *** | 31.053 *** | 260.473 *** | 203.788 *** |

Note: Standard errors are reported in parentheses; *** p < 0.001, ** p < 0.01, * p < 0.05.
Table 9. Regression analysis with the substitution of the moderator variable.

| Variable | View (1-1) | View (1-2) | Submission (2-1) | Submission (2-2) | Submission (2-3) |
|---|---|---|---|---|---|
| External_it | | −0.252 (0.176) | | −0.480 ** (0.142) | −0.474 ** (0.143) |
| Internal_it | | −0.528 *** (0.109) | | 0.183 (0.109) | 0.236 * (0.114) |
| View_i,t−(t−1) | | | | 0.332 *** (0.022) | 0.334 *** (0.022) |
| Reward_it | | | | 0.259 *** (0.026) | 0.258 *** (0.025) |
| External_it × Reward_it | | | | | −0.405 * (0.187) |
| Internal_it × Reward_it | | | | | 0.199 (0.163) |
| Title_i | −0.185 *** (0.045) | −0.188 *** (0.045) | −0.045 (0.042) | 0.051 (0.037) | 0.054 (0.037) |
| Detail_i | 0.133 *** (0.014) | 0.143 *** (0.014) | 0.153 *** (0.013) | 0.040 ** (0.013) | 0.038 ** (0.013) |
| Position_it | −0.610 *** (0.016) | −0.619 *** (0.016) | −0.276 *** (0.015) | −0.073 *** (0.019) | −0.069 *** (0.019) |
| Constant | 4.312 (0.112) | 5.702 *** (0.763) | 0.840 *** (0.104) | 0.146 (0.637) | 0.078 (0.639) |
| N | 1768 | 1768 | 1768 | 1768 | 1768 |
| R² | 0.446 | 0.454 | 0.191 | 0.403 | 0.405 |
| ΔR² | 0.445 | 0.452 | 0.190 | 0.401 | 0.402 |
| F | 472.811 *** | 292.801 *** | 139.013 *** | 169.745 *** | 132.939 *** |

Note: Standard errors are reported in parentheses; *** p < 0.001, ** p < 0.01, * p < 0.05.
Table 10. Regression analysis with deleted samples.

| Variable | View (1-1) | View (1-2) | Submission (2-1) | Submission (2-2) | Submission (2-3) |
|---|---|---|---|---|---|
| External_it | | −0.255 (0.132) | | −0.777 *** (0.191) | −0.809 *** (0.192) |
| Internal_it | | −1.328 *** (0.082) | | 0.157 (0.114) | 0.256 * (0.119) |
| View_i,t−(t−1) | | | | 0.296 *** (0.025) | 0.297 *** (0.025) |
| Reward_it | | | | 0.163 *** (0.019) | 0.171 *** (0.019) |
| External_it × Reward_it | | | | | −0.251 (0.187) |
| Internal_it × Reward_it | | | | | 0.381 ** (0.145) |
| Title_i | −0.167 *** (0.045) | −0.178 *** (0.044) | −0.042 (0.042) | −0.008 (0.038) | −0.006 (0.038) |
| Detail_i | 0.106 *** (0.013) | 0.124 *** (0.014) | 0.143 *** (0.013) | 0.064 *** (0.012) | 0.060 *** (0.012) |
| Position_it | −1.475 *** (0.036) | −1.512 *** (0.036) | −0.705 *** (0.034) | −0.260 *** (0.049) | −0.251 *** (0.049) |
| Constant | 3.968 *** (0.108) | 3.321 ** (0.980) | 0.698 *** (0.103) | 2.832 ** (0.842) | 2.908 ** (0.842) |
| N | 1634 | 1634 | 1634 | 1634 | 1634 |
| R² | 0.509 | 0.520 | 0.238 | 0.401 | 0.404 |
| ΔR² | 0.508 | 0.519 | 0.236 | 0.399 | 0.401 |
| F | 563.024 *** | 352.824 *** | 169.215 *** | 155.606 *** | 122.380 *** |

Note: Standard errors are reported in parentheses; *** p < 0.001, ** p < 0.01, * p < 0.05.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Liu, X.; Hao, X.; Pang, Z.; Fan, X. Does Participation Intention Equal Participation Behavior? The Role of Dynamic Competition in Crowdsourcing Contests. J. Theor. Appl. Electron. Commer. Res. 2026, 21, 99. https://doi.org/10.3390/jtaer21040099