Article

Effects of Privacy Regulatory Protection on Users’ Data Sharing in Mobile Apps

1 School of Economics and Management, Beijing University of Posts and Telecommunications, Beijing 100876, China
2 College of Business Administration, Capital University of Economics and Business, Beijing 100070, China
3 Business School, Beijing Technology and Business University, Beijing 100048, China
* Author to whom correspondence should be addressed.
J. Theor. Appl. Electron. Commer. Res. 2025, 20(3), 153; https://doi.org/10.3390/jtaer20030153
Submission received: 19 May 2025 / Revised: 9 June 2025 / Accepted: 24 June 2025 / Published: 1 July 2025

Abstract

While data-driven digital technologies (e.g., mobile applications) have brought convenience to users, they have also raised data privacy concerns. Regulators have taken various actions that ensure user data privacy to balance the protection and exploitation of personal data. However, the effect of privacy regulatory protection in preparing mobile users for data sharing remains unclear. This research develops and empirically tests an integrative model of how mobile users adjust their intention to share personal data in response to privacy regulatory protection. The results indicate that privacy regulatory protection can increase users’ intention to share personal data through enhancing psychological ownership of data (i.e., entitlement effect) and decreasing privacy concerns (i.e., reassurance effect). Moreover, the entitlement effect can be attenuated as app control over data increases, while the reassurance effect can be accentuated as users’ privacy efficacy increases. This research provides new insights into the role of privacy regulatory protection in promoting user data sharing by highlighting the psychological mechanisms underlying users’ responses to such regulations. It provides implications for digital platforms seeking to balance the challenges of user data protection with the benefits of data-driven marketing.

1. Introduction

In the digital age, effective data-driven marketing requires digital platform firms to generate market insights by adopting digital technologies (e.g., mobile apps) and collecting user data from various wireless devices (e.g., smartphones, tablets, smartwatches, and other smart wearables) [1]. However, data privacy conflicts arise when users claim ownership of their personal data and seek to restrict firms’ access to it [2]. To balance the protection and exploitation of user data, regulators have taken various actions that address user data privacy issues [3]. For example, the General Data Protection Regulation (GDPR) constrains firms’ data-driven marketing by requiring explicit customer consent before data use. Although users expect regulators, rather than firms, to play the primary role in protecting their data privacy [4], the effectiveness of privacy regulatory protection in preparing users for data sharing remains unclear. A recent study revealed that 75% of respondents were still concerned about privacy intrusion, even though more than 90% reported being aware of data protection laws [5]. Thus, it is important to investigate whether and how privacy regulatory protection influences users’ data sharing (e.g., prevention or promotion), as well as how digital platform firms might refine their strategies to cope with privacy regulations [6].
However, the extant literature has paid limited attention to the mechanisms through which privacy regulatory protection influences users’ data sharing, hindering a comprehensive understanding of its effects. Firstly, as shown in Table 1, existing studies have mainly focused on the mediating role of users’ privacy perceptions (e.g., privacy risk and privacy concerns), omitting other mechanisms through which privacy regulatory protection affects users’ data sharing. For example, although privacy regulatory protection is expected to empower individuals to exercise their rights over personal data [5], there is still limited understanding of how this empowerment influences users’ data sharing. Secondly, few studies, with the exception of Xu et al. [7], have explored the contingent factors that might adjust the effect of privacy regulatory protection. These oversights are noteworthy because both the mechanisms and the contingencies could play a critical role in shaping the effectiveness of privacy regulatory protection.
To address the above gaps, this research centers on two questions: (1) What are the mechanisms through which privacy regulatory protection influences users’ data sharing? (2) Are the effects of privacy regulatory protection contingent on any firm- or user-related factors? This research proposes a dual mediation model based on the communication process model and regulatory focus theory to account for the effects of privacy regulatory protection. Specifically, regulators communicate and deliver privacy regulatory information to users to affect their behavior. Users decode and respond to the received information in line with their regulatory focus (i.e., promotion focus and prevention focus), seeking matches for gains (e.g., increased psychological ownership of data) and nonlosses (e.g., decreased privacy concerns) [13]. Study 1 (Preliminary Study), using observational data from Facebook app user reviews, found that Facebook users’ psychological ownership of data increased and privacy concerns decreased after the passage of the California Consumer Privacy Act (CCPA). Study 2 (Main Study) tested the conceptual model using a sample of 1095 mobile app users. The results showed that privacy regulatory protection could increase users’ intention to share personal data through enhancing psychological ownership of data (i.e., entitlement effect) and decreasing privacy concerns (i.e., reassurance effect). Moreover, the entitlement effect can be attenuated as firm control over data increases, while the reassurance effect can be accentuated as users’ privacy efficacy increases.
This research contributes to the literature on privacy regulatory protection, as it proposes and empirically validates an integrative model that sheds light on the psychological mechanisms—particularly psychological ownership and privacy concerns—through which privacy regulatory protection shapes users’ behavior. This contribution helps to understand the dual role of privacy regulatory protection as both a constraint and an enabler in data-driven digital ecosystems.
This paper is organized as follows. The next section provides the theoretical background and conceptual model. Hypotheses are developed in Section 3. Section 4 presents the preliminary study and Section 5 reports the main study. The final section concludes with a broad discussion of findings and implications, as well as the limitations and suggestions for future research.

2. Theoretical Background and Conceptual Model

2.1. Users’ Data Sharing in Mobile Context

Data sharing refers to individuals intentionally and voluntarily revealing data or information about themselves to others [14]. In this research, users’ data sharing in a mobile app refers to users sharing their personal data or information with the firm that owns the app. In the mobile context, a firm often requests various permissions from users through the app, the digital interface between firms and users, to access their personal data, such as location, identity, photos/media, device and call information, usage history, camera, Wi-Fi connection information, microphone, and contacts [15]. Usually, users have to agree to an app’s data access permissions to continue the full use of the app. For example, users of the Facebook app intentionally and voluntarily share their location information to use the location-based social networking service [16]. Consequently, firms can record, store, and leverage users’ shared data for other agreed marketing applications, such as targeted ads, personalized recommendations, and user persona generation.

2.2. Privacy Regulatory Protection

The privacy literature describes three major approaches to protecting users’ data privacy: individual self-protection, industry self-regulation, and privacy regulatory protection [17]. Privacy regulatory protection refers to the degree to which regulators devise privacy regulations to direct and police firms’ use of user data [18]. It is expected to be more effective than individual self-protection and industry self-regulation because it simultaneously grants users rights and restrains the misuse of personal data [3,19].
The process of privacy regulatory protection influencing users’ data sharing can be viewed as a communication process. According to the communication process model, which depicts how messages flow from senders to receivers, regulators (i.e., senders) deliver privacy regulatory protection information (i.e., messages) to users (i.e., receivers). As receivers, users decode the received information based on their motivational states and make decisions about whether to share personal data accordingly. Moreover, noise in the communication process (i.e., firms’ behavior and users’ characteristics) may distort the message and affect regulators’ communication with users.
Privacy regulatory protection may lead to different user responses. Previous literature has documented the positive effect of privacy regulatory protection on users’ trust [12], privacy empowerment [19], and perceived control [11,17], as well as its negative effect on users’ privacy concerns and perceived risk [7,8,9,20].

2.3. Regulatory Focus Theory

Regulatory focus theory has been used to predict users’ broad attitudes toward public policies and regulatory priorities [21]. It distinguishes two types of regulatory focus when individuals engage in goal-directed behavior, such as decoding information [13]. Notably, regulatory focus is a motivational state, and the two types of regulatory focus can coexist independently of each other in every person [22]. A promotion focus involves sensitivity to positive outcomes (e.g., advancement, growth, and accomplishment), emphasizing gains and nongains. A prevention focus involves sensitivity to negative outcomes (e.g., security, duties, and obligations), emphasizing losses and nonlosses. For example, promotion-focused users tend to interpret privacy regulatory protection regulations as opportunities to promote accomplishment, such as gaining psychological ownership of data. Alternatively, prevention-focused users tend to consider privacy regulatory protection information as a measure to avoid risks, thus reducing privacy concerns.

2.4. Psychological Ownership of Data

Psychological ownership is defined as the state in which individuals feel as though the target of ownership or a piece of that target is “theirs” (i.e., “It is mine!”) [23]. The target of psychological ownership varies greatly, including physical targets (e.g., mug, shirt, and shoes) as well as intangible targets (e.g., ideas, experience, and data) [24]. Psychological ownership of data refers to the state in which individuals regard certain data as their data (i.e., “The data is mine!”).
Individual control of data is a powerful route through which psychological ownership of data emerges [23]. Control means the ability to use and govern the use of an object [25]. Perceived control can be instantiated through actions such as touching, manipulating, and moving an object [26].
The effects of psychological ownership of data on users’ data sharing are mixed in the literature. Although some studies suggest that psychological ownership of data may reduce users’ propensity to share data (e.g., deter others from accessing or using the target) [27], other empirical studies evidence that psychological ownership of data can encourage users’ data sharing due to their territorial expansion motives [28,29]. Individuals with high-level psychological ownership are more likely to exchange information for other valuable resources. For example, users may be willing to share their data in exchange for appropriate compensation despite strong feelings of ownership [30]. Thus, this research expects a positive effect of psychological ownership of data on users’ data sharing.

2.5. Privacy Concerns

Privacy concerns refer to individuals’ concerns about organizational data privacy practices [31]. In the mobile environment, users’ privacy concerns are defined as concerns about possible loss of privacy as a result of data sharing with a specific app [32].
Privacy concerns are strongly affected by users’ perceptions of a firm’s data practices. Unfair data practices, such as excessive data collection, improper data access, and unauthorized secondary use of personal data, will increase user privacy concerns [31]. Moreover, these perceptions are also affected by institutional or societal factors. For example, institutional trust can counteract privacy concerns and varies across industries [33]. Societal factors such as government regulation play an important role in shaping individuals’ evaluation of a firm’s data practices [34]. As governments become more involved in corporate data privacy management, firms may adopt stricter management of privacy issues, and individuals will be less concerned about data privacy.
Privacy concerns have become one of the major inhibitors of users’ data sharing in mobile apps. Privacy concerns can increase users’ perception of risky consequences, such as identity theft and economic loss [35]. This negative perception subsequently leads users to intentionally inhibit their data disclosure behavior [36]. Moreover, privacy concerns can increase users’ feelings of anxiety and discomfort, resulting in inactive user participation [37]. For example, users will be discouraged from clicking on a specific ad due to its personalized advertising driven by personal data [38].

2.6. Conceptual Model

From the perspective of the communication process model, as mentioned previously, users’ data sharing depends on their decoding of information regarding privacy regulatory protection as well as contextual factors. Drawing on regulatory focus theory, users decode and respond to the received information in line with their regulatory focus, seeking matches for gains (i.e., increased psychological ownership of data) and nonlosses (i.e., decreased privacy concerns). Accordingly, this research includes psychological ownership of data and privacy concerns to account for the effect of privacy regulatory protection, as well as firm control over data and user privacy efficacy as the moderators (see Figure 1).

3. Hypothesis Development

3.1. Privacy Regulatory Protection, Psychological Ownership of Data, and Intention to Share Personal Data

Privacy regulatory protection can help users establish psychological ownership of their personal data through increasing perceived control [17,23]. Privacy regulatory protection empowers individuals with control over their data, granting rights to access, delete, or modify personal data and requiring explicit consent prior to data processing [39]. For example, the GDPR recognizes that individuals own and control their personal data in perpetuity through the rights to explicit consent, to be forgotten, and to data portability [40]. Given the explicit consent mechanism, if users do not consent to data sharing with a firm, the firm cannot collect personal data from them. The right to be forgotten gives users the ability to selectively remove their personal data from firms’ databases, granting them greater control over the sharing of their data [41]. The right to data portability ensures users the free portability of personal data from one firm to another [42]. In sum, privacy regulatory protection can increase users’ control over their personal data, which is an immediate driver of psychological ownership of data [24]. Thus, it is hypothesized that
H1. 
Privacy regulatory protection is positively related to users’ psychological ownership of their personal data.
When users perceive something as valuable and believe they can offer it, they will become more willing to exchange it [43]. Psychological ownership of data can increase users’ perceived value of the data and their sense of entitlement to control the data, thus encouraging users to exchange the data for benefits or returns. First of all, users who report higher psychological ownership of data tend to perceive the data as valuable for exchange [41]. Once the incentives (e.g., discounts, monetary compensation, and personalized service) for data exchange are adequate, users are encouraged to share data for benefits. Moreover, users with higher psychological ownership of their personal data are more likely to categorize this data as a part of themselves and feel empowered to explore, operate, and master it [28]. This feeling of empowerment can promote users’ willingness to share personal data. For example, users often exchange their own personal data, rather than the information they hold about others, for rewards [29]. Thus, it is hypothesized that
H2. 
Users’ psychological ownership of their personal data is positively related to their intention to share personal data.

3.2. Privacy Regulatory Protection, Privacy Concerns, and Intention to Share Personal Data

Regulators can reduce users’ privacy concerns through enforcing organizational accountability, including enacting laws and regulations or conducting corporate privacy compliance investigations [40]. First of all, regulators typically rely on the judicial or legislative branches to protect data privacy [7]. For example, the United States has sector-specific privacy laws that apply to particular industries or to classes of sensitive information, such as health information. The existence of these laws makes users feel safe and secure about their data privacy [8]. Moreover, with legal structures in place, firms’ illegal data practices can be deterred through the threat of punishment. This creates an environment where firms are held accountable for mishandling user data, further reinforcing users’ trust in the regulatory system and firms’ data practices. Thus, when privacy regulatory protection is sufficiently effective, users are more likely to trust that firms will collect and use personal data respectfully and fairly, leading to lower privacy concerns. Accordingly, it is hypothesized that
H3. 
Privacy regulatory protection is negatively related to users’ privacy concerns.
Users who are concerned about the data privacy practices of firms tend to share less personal data while using mobile apps due to fears of psychological or financial damage. Firstly, users with high privacy concerns worry about unwanted identification of personal information. The data that users share while using mobile apps may reveal users’ interpersonal relationships with friends, family, and business partners, as well as their private information such as daily routines or health status [44]. Sharing such data may make users more vulnerable to in-person harassment or threats. Moreover, once users’ important personal data is accidentally leaked, it may lead to identity theft or financial loss, as some personal data, such as fingerprints, palmprints, voice, and facial dynamics, often contain significant identity-related information that is intrinsically linked to personal assets [45]. Thus, it is hypothesized that
H4. 
Users’ privacy concerns are negatively related to their intention to share personal data.

3.3. The Moderating Role of Firm Control over Data

Firm control over data refers to the extent to which a firm can control the release and dissemination of personal data [17]. A firm with high control over personal data can leverage it for purposes such as generating revenue through data sharing and targeted advertising. When users perceive that their personal data is highly controlled by a firm, they are less likely to perceive the data as belonging to them. Users may feel it reasonable for the firm to own the data because control over an object is a key characteristic of the phenomenon of ownership [25]. In other words, high levels of perceived firm control over personal data create unfavorable conditions for users to psychologically claim to be the sole owner of their personal data. Under this condition, even though regulatory protection provides clarity on users’ data rights, it is less possible for users to claim the privilege of controlling their personal data and develop a sense of ownership of the data. Accordingly, it is hypothesized that
H5. 
Firm control over data attenuates the relationship between privacy regulatory protection and users’ psychological ownership of their personal data. That is, as the level of firm control over data increases, the positive effect of privacy regulatory protection on users’ psychological ownership of their personal data will become weaker.

3.4. The Moderating Role of User Privacy Efficacy

Privacy efficacy refers to the belief in one’s ability to protect data privacy while using mobile apps [46]. Prior work has revealed that privacy efficacy predicts users’ intentions to engage in privacy-protective activities [47]. For this reason, users with high privacy efficacy may be more sensitive to and understanding of privacy regulatory protection [48]. In addition, they can better utilize privacy-enhancing techniques to manage their data privacy, such as utilizing privacy-protective settings [49]. This enhanced self-management creates a favorable environment for reducing users’ anxiety and increasing their trust in the effectiveness of privacy regulatory protection. Thus, with a high level of privacy efficacy, the effect of privacy regulation in reducing users’ concerns about privacy becomes more salient. Accordingly, it is hypothesized that
H6. 
User privacy efficacy accentuates the relationship between privacy regulatory protection and privacy concerns. That is, as the level of users’ privacy efficacy increases, the negative effect of privacy regulatory protection on privacy concerns will become stronger.

4. Preliminary Study

This preliminary study investigates the effects of privacy regulatory protection on users’ perception of data ownership and privacy concerns with observational data. An exploratory dictionary-based text analysis of user reviews of the Facebook app was conducted to examine whether users’ psychological ownership of data and privacy concerns differed before and after the passage of the CCPA on 28 June 2018, a milestone of privacy regulatory protection for users’ personal data and information in the United States.

4.1. Method

4.1.1. Research Context and Data Collection

The data for this study were collected from user reviews of the Facebook app on Apple’s App Store. The data set covered a four-year period from 28 June 2016 to 27 June 2020, spanning two years before and two years after the passage of the CCPA. This study used the passage of the CCPA to represent privacy regulatory protection, as it is a landmark consumer privacy law comparable to the GDPR. Moreover, this study coded psychological ownership and privacy concerns based on user reviews of the Facebook app, as Facebook is one of the largest and most representative online social network platforms in the United States, characterized by strong social ties and a large collection of various personal data about users. In addition, given the exploratory nature of this preliminary study, only the user reviews of the Facebook iOS app were collected for analysis.
The full data set contains 202,464 user reviews of the Facebook app from Apple’s App Store, including the publication date, rating, and text of each review. To ensure that the reviews were related to personal data, the data set was filtered using the terms “data” or “information”. This procedure generated a final sample of 3766 user reviews.
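To make the filtering step concrete, the following minimal Python sketch shows how such a keyword filter could be implemented; the file name and column names ("date", "rating", "text") are illustrative assumptions, not the authors' actual scripts.

```python
import pandas as pd

# Hypothetical export of the scraped App Store reviews.
reviews = pd.read_csv("facebook_app_reviews.csv", parse_dates=["date"])

# Keep the four-year window around the passage of the CCPA (28 June 2018).
window = reviews[(reviews["date"] >= "2016-06-28") & (reviews["date"] <= "2020-06-27")]

# Retain only reviews mentioning personal data, mirroring the
# "data" / "information" keyword filter described above.
mask = window["text"].str.contains(r"\b(?:data|information)\b", case=False, na=False)
sample = window[mask].copy()
print(len(sample))  # the paper reports 3766 such reviews
```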

4.1.2. Measures

Firstly, a dummy variable (i.e., independent variable) was created to indicate the passage of the CCPA. It was coded as 1 if a user review was generated on or after the date of CCPA’s passage and 0 otherwise.
Secondly, a dictionary-based text analysis was used to measure the dependent variables of psychological data ownership and privacy concerns as dummy variables. Following Humphreys and Wang’s suggestions [50], this analysis used the psychological ownership dictionary developed by Smith et al. [51] and the privacy dictionary developed by Visentin et al. [52] for variable coding (see Appendix A). Each variable was coded as 1 if a review contained any of the words in the corresponding dictionary and 0 otherwise.
Thirdly, as suggested by Smith et al. [51], review length (word count) and review rating were included as control variables in the analysis.
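A minimal sketch of this variable coding, continuing the hypothetical `sample` data frame above, might look as follows. The seven ownership stems are those listed in Appendix A; the 24 privacy stems from Visentin et al. [52] are not reproduced here.

```python
import re
import pandas as pd

CCPA_DATE = pd.Timestamp("2018-06-28")
OWNERSHIP_WORDS = {"me", "mine", "my", "our", "own", "myself", "ownership"}
PRIVACY_WORDS: set = set()  # populate with the 24 stems from Visentin et al. [52]

def contains_any(text: str, vocabulary: set) -> int:
    """Code 1 if any token of the review matches the dictionary, else 0."""
    tokens = set(re.findall(r"[a-z']+", str(text).lower()))
    return int(bool(tokens & vocabulary))

sample["ccpa"] = (sample["date"] >= CCPA_DATE).astype(int)  # independent variable
sample["ownership"] = sample["text"].apply(contains_any, vocabulary=OWNERSHIP_WORDS)
sample["privacy"] = sample["text"].apply(contains_any, vocabulary=PRIVACY_WORDS)
sample["length"] = sample["text"].str.split().str.len()     # control: word count
```

Appendix A describes the stemming step that precedes this dictionary lookup.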

4.2. Results

As shown in Figure 2, the percentage of reviews related to psychological ownership of data increased over time during the observation window, while the percentage of reviews related to privacy concerns decreased over time.
Binary logistic regression models were used to estimate the effects of the passage of the CCPA on psychological ownership of data and privacy concerns at the user review level. As shown in Table 2, the passage of the CCPA showed a positive effect on psychological ownership of data (odds ratio = 1.254, p < 0.01) and a negative effect on privacy concerns (odds ratio = 0.628, p < 0.001).
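The following sketch shows how such review-level logit models could be estimated with statsmodels; variable names follow the coding sketch above.

```python
import numpy as np
import statsmodels.formula.api as smf

m_own = smf.logit("ownership ~ ccpa + length + rating", data=sample).fit()
m_priv = smf.logit("privacy ~ ccpa + length + rating", data=sample).fit()

# Exponentiated coefficients are odds ratios; the paper reports
# OR = 1.254 (ownership) and OR = 0.628 (privacy concerns) for the CCPA dummy.
print(np.exp(m_own.params["ccpa"]), np.exp(m_priv.params["ccpa"]))
```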

4.3. Discussion

The results of this preliminary study provide some initial evidence that privacy regulatory protection, as measured by the passage of the CCPA, positively influences users’ psychological ownership of data and negatively influences users’ privacy concerns, thus lending support to the hypotheses regarding the relationship between the independent variable and mediators (i.e., H1 and H3). Building on these findings, the main study turned to test the research model, that is, how privacy regulatory protection influences users’ data sharing in mobile apps.

5. Main Study

5.1. Method

5.1.1. Research Context and Data Collection

The data for this study were collected from an online survey among mobile app users in China. China has a diverse mobile app landscape spanning social media, mobile payment, and e-commerce. Moreover, China has recently implemented several significant privacy regulations, such as the Personal Information Protection Law, providing an appropriate context for observing the impact of privacy regulatory protection on mobile app users.
A convenience sample of 1095 respondents was recruited for this survey, 59.9% of whom were female, and the mean age was 41. The distribution of age (20.2% aged 18–29, 22.0% aged 30–39, 30.7% aged 40–49, and 27.1% aged 50 and above) was comparable to the age distribution of Internet users in China disclosed in the statistical report on China’s Internet development [53], showing that this sample was representative and suitable for this study. Appendix B reports the distribution of demographics.
At the beginning of this survey, respondents were instructed to recall their favorite mobile app and indicate its category. The top four categories of the reported apps were short video (48.7%), online shopping (15.5%), social networking (10.2%), and instant messaging (10.0%). The remaining apps were from other categories (e.g., news, finance, and education), each accounting for less than 5%. Respondents were then required to keep their experience with this recalled app in mind while completing the survey. At the end of the survey, respondents indicated their demographics (e.g., age, gender, and previous privacy experience) and were financially rewarded.

5.1.2. Measures

The measurement scales of all constructs used in this study were adapted from the existing literature. Specifically, the five-item measurement of perceived privacy regulatory protection was adapted from Miltgen and Smith [20] to measure users’ perceptions regarding the existence and adequacy of provisions and systems for protecting their personal data. The measurement of psychological ownership of data includes three items adapted from Peck and Shu [26] to measure the degree to which users perceive ownership of personal data. The measurement of mobile app users’ privacy concerns was a second-order model (i.e., perceived surveillance, perceived intrusion, and secondary use of information) with nine items adapted from Xu et al. [32]. Perceived firm control over data was measured with five items adapted from Xu et al. [17] to assess the extent to which users perceive a firm has control over personal data. The eight-item measurement of privacy efficacy was adapted from Crossler and Bélanger [49] to assess users’ confidence in their ability to protect data privacy on their phones. The measurement of intention to share personal data was adapted from Grosso et al. [54] with four items. The measurement of previous privacy experience was adapted from Xu et al. [55] with three items. All items were measured using a seven-point Likert-type scale ranging from 1 = “strongly disagree” to 7 = “strongly agree”.
The original questionnaire was presented to two marketing researchers and revised in line with their feedback. A pilot study (N = 107) was conducted to confirm the final questionnaire for the main study. Table 3 provides the measurement items and the measurement properties of each construct in the main study.
All multi-item measured constructs were assessed for reliability, as well as convergent validity and discriminant validity based on confirmatory factor analysis (CFA). Regarding the multi-dimensionality of the privacy concern measures, we followed Ramani and Kumar’s [56] approach, using an aggregated scale consisting of the average scores of the three dimensions as the indicators of privacy concerns for further analyses, given the second-order CFA results (see Appendix C).
As reported in Table 3, the Cronbach’s alpha values of all constructs were above 0.7, indicating acceptable reliability of multi-item scales [57]. Convergent validity was supported as all factor loadings were above 0.6, average variance extracted (AVE) values for each latent construct were above 0.5, and all composite reliabilities exceeded 0.7. The square root of the AVE of each construct was greater than the correlations between the construct itself and other constructs (see Table 4). Thus, the discriminant validity was supported [58].
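For reference, the convergent validity criteria can be computed directly from standardized CFA loadings, as in this short sketch (the loadings shown are illustrative, not the paper's values).

```python
import numpy as np

def convergent_validity(loadings):
    """AVE and composite reliability from standardized factor loadings."""
    lam = np.asarray(loadings, dtype=float)
    ave = np.mean(lam ** 2)
    cr = lam.sum() ** 2 / (lam.sum() ** 2 + np.sum(1.0 - lam ** 2))
    return ave, cr

ave, cr = convergent_validity([0.82, 0.79, 0.88])
# AVE > 0.5 and CR > 0.7 indicate convergent validity; discriminant
# validity requires sqrt(AVE) to exceed the construct's correlations
# with all other constructs (Fornell-Larcker criterion).
print(round(ave, 3), round(cr, 3), round(float(np.sqrt(ave)), 3))
```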

5.2. Results

5.2.1. Structural Model Testing

Structural equation modeling (SEM) was used to estimate the structural model. The proposed structural model (see Model 1 in Table 5) demonstrated a good fit to the data (χ2[86] = 282.303, CFI = 0.984, TLI = 0.980, RMSEA = 0.046). The results showed that perceived privacy regulatory protection positively influenced psychological ownership of data (γ = 0.09, p < 0.05) while negatively influencing privacy concerns (γ = −0.36, p < 0.001), thus supporting H1 and H3, which was consistent with the results of the preliminary study. Furthermore, intention to share personal data was positively affected by psychological ownership of data (γ = 0.15, p < 0.001) while being negatively affected by privacy concerns (γ = −0.56, p < 0.001). Thus, H2 and H4 were supported.
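As an illustration, the structural model could be specified in lavaan-style syntax with the semopy package; the data file and indicator names below are placeholders for the survey items, not the authors' specification.

```python
import pandas as pd
import semopy

df = pd.read_csv("survey_responses.csv")  # hypothetical respondent-level data

# PRP = privacy regulatory protection, PO = psychological ownership,
# PC = privacy concerns (averaged dimension scores), SH = intention to share.
desc = """
PRP =~ prp1 + prp2 + prp3 + prp4 + prp5
PO =~ po1 + po2 + po3
PC =~ ps + pi + supi
SH =~ sh1 + sh2 + sh3 + sh4
PO ~ PRP
PC ~ PRP
SH ~ PO + PC
"""
model = semopy.Model(desc)
model.fit(df)
print(model.inspect())           # path estimates and significance tests
print(semopy.calc_stats(model))  # chi-square, CFI, TLI, RMSEA
```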
A follow-up analysis exploring the mediating roles of psychological ownership of data and privacy concerns was conducted by testing a nested model with the direct effect path (i.e., perceived privacy regulatory protection → intention to share personal data). It used the bootstrapping method with 5000 resamples and a bias-corrected confidence interval of 95%, and the results are reported in Model 2 of Table 5. The direct effect was significant (estimate = 0.240, CI = [0.1947, 0.2916]) and the overall model fit was significantly improved (χ2[85] = 246.180, CFI = 0.986, TLI = 0.983, RMSEA = 0.042; Δχ2[1] = 36.123, p < 0.001) after adding the direct path. Moreover, the total effect of perceived privacy regulatory protection on intention to share personal data was significantly positive (estimate = 0.474, CI = [0.3857, 0.5717]). The indirect effect of perceived privacy regulatory protection on intention to share personal data via either psychological ownership of data (estimate = 0.014, CI = [0.0025, 0.0311]) or privacy concerns (estimate = 0.225, CI = [0.1833, 0.2753]) was also significant. Therefore, psychological ownership of data and privacy concerns partially mediate the effects of perceived privacy regulatory protection.
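A percentile bootstrap of the two indirect effects can be sketched as follows; the paper uses bias-corrected intervals, whereas the simpler percentile version is shown here, and the column names (prp, po, pc, sh) continue the hypothetical data frame above.

```python
import numpy as np
import statsmodels.formula.api as smf

def indirect_effects(d):
    # a-paths: PRP -> each mediator; b-paths: mediators -> intention,
    # with both mediators and the direct path entered together.
    a_po = smf.ols("po ~ prp", data=d).fit().params["prp"]
    a_pc = smf.ols("pc ~ prp", data=d).fit().params["prp"]
    m = smf.ols("sh ~ po + pc + prp", data=d).fit()
    return a_po * m.params["po"], a_pc * m.params["pc"]

# 5000 bootstrap resamples of the respondents, then 95% percentile CIs.
draws = np.array([indirect_effects(df.sample(frac=1, replace=True))
                  for _ in range(5000)])
print(np.percentile(draws, [2.5, 97.5], axis=0))  # one CI per mediator
```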

5.2.2. Moderation Effect Testing

This study conducted a stepwise hierarchical regression analysis to test the moderation effects in H5 and H6 (see Table 6). The results of regression analysis (including previous privacy experience, age, gender, and app category as control variables) revealed that the moderation effects of both firm control over data (β = −0.11, p < 0.001) and user privacy efficacy (β = −0.12, p < 0.001) were significant. As shown in Figure 3, the positive effect of perceived privacy regulatory protection on psychological ownership of data became weaker when firm control over data was high (+1SD), while the negative effect of perceived privacy regulatory protection on privacy concerns became stronger when user privacy efficacy was high (+1SD). Therefore, H5 and H6 were supported.
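The moderation tests can be illustrated with mean-centered interaction terms in an OLS regression; variable names are again placeholders, and the app category dummies included in the reported models are omitted for brevity.

```python
import statsmodels.formula.api as smf

# Mean-center the predictor and moderators before forming interactions.
for col in ["prp", "fc", "pe"]:
    df[col + "_c"] = df[col] - df[col].mean()

# H5: firm control over data moderating PRP -> psychological ownership.
m5 = smf.ols("po ~ prp_c * fc_c + ppe + age + gender", data=df).fit()
# H6: privacy efficacy moderating PRP -> privacy concerns.
m6 = smf.ols("pc ~ prp_c * pe_c + ppe + age + gender", data=df).fit()

# Negative interaction terms correspond to the reported attenuation
# (beta = -0.11) and accentuation (beta = -0.12) effects.
print(m5.params["prp_c:fc_c"], m6.params["prp_c:pe_c"])
```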
Given the significant moderation effects of firm control over data and user privacy efficacy, this study conducted a post hoc analysis using a custom model in PROCESS [59] to examine the possible moderated mediation effects (see Appendix D and Appendix E). This analysis used the bootstrapping method with a sample of 5000 and a bias-corrected confidence interval of 95%. The control variables were included as covariates in this analysis.
The results showed that firm control over data moderates the mediation effect of psychological ownership of data (index of moderated mediation = −0.0194, CI = [−0.0353, −0.0064], not including zero). Specifically, as firm control over data increased, the positive indirect effect (effect = 0.0389, CI = [0.0162, 0.0682], not including zero) of perceived privacy regulatory protection on intention to share personal data through psychological ownership of data became nonsignificant (effect = 0.0001, CI = [−0.0119, 0.0154], including zero). User privacy efficacy moderates the mediation effect of privacy concerns (index of moderated mediation = 0.0679, CI = [0.0397, 0.0988], not including zero). As user privacy efficacy increased, the positive indirect effect (effect = 0.0400, CI = [0.0110, 0.0688], not including zero) of perceived privacy regulatory protection on intention to share personal data through privacy concerns became stronger (effect = 0.1758, CI = [0.1051, 0.2505], not including zero).

5.2.3. Common Method Bias

This study employed three techniques to examine the potential bias caused by common method variance (CMV). First, Harman’s one-factor test was conducted with CFA. The one-factor model indicated a poor fit to the data (χ2[434] = 17442.954, CFI = 0.393, TLI = 0.349, RMSEA = 0.189) compared to the original CFA model (χ2[413] = 942.862, CFI = 0.981, TLI = 0.979, RMSEA = 0.034). Second, this study utilized the marker variable technique. A marker variable measuring perceptions of the role of diplomas (i.e., “Diplomas are becoming less and less useful.”: 1 = “strongly disagree”, 7 = “strongly agree”) met Lindell and Whitney’s [60] criterion of being theoretically unrelated to the studied constructs. Adjusting the original correlations among constructs for the smallest correlation of the marker variable with all other constructs (i.e., r = 0.0003) revealed that all adjusted correlations maintained their size and pattern of significance (see Table 4). Third, with this marker as a control variable, the structural model was re-estimated after adding paths from the marker variable to all endogenous variables. As reported in Model 3 of Table 5 (χ2[97] = 249.054, CFI = 0.984, TLI = 0.980, RMSEA = 0.043), all the structural relationships that were originally significant in Model 1 of Table 5 remained significant. Thus, common method variance, if present, did not pervasively bias the results of this study.
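The marker-variable adjustment follows Lindell and Whitney’s [60] partial-correlation formula, which is simple to reproduce:

```python
def marker_adjusted(r: float, r_m: float) -> float:
    """CMV-adjusted correlation: r_adj = (r - r_M) / (1 - r_M),
    where r_M is the marker's smallest correlation with the
    substantive constructs (here, r_M = 0.0003)."""
    return (r - r_m) / (1 - r_m)

# With such a tiny r_M, observed correlations are essentially unchanged:
print(marker_adjusted(0.50, 0.0003))  # ~0.4998
```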

6. Conclusions and Discussion

6.1. Major Findings

This research examines the relationship between privacy regulatory protection and users’ data sharing. First, this study elucidates the positive effect of privacy regulatory protection on users’ data sharing, contributing to the literature on privacy protection. Specifically, privacy regulatory protection can facilitate users’ data sharing through enhancing their psychological ownership of data (i.e., entitlement effect) and reducing their privacy concerns (i.e., reassurance effect).
Second, the effects of privacy regulatory protection can be moderated. The reassurance effect of privacy regulatory protection on users’ intention to share personal data is accentuated when users’ privacy efficacy is high. This is because users’ privacy efficacy strengthens the negative effect of privacy regulatory protection on privacy concerns. In addition, the entitlement effect of privacy regulatory protection on users’ intention to share personal data diminishes when users perceive that a firm exercises greater control over their data. This is because firm control over data mitigates the positive effect of privacy regulatory protection on users’ psychological ownership of data.

6.2. Theoretical Implications

This research contributes to the existing research in several ways. Firstly, this research expands the view of privacy regulatory protection as a marketing environmental variable. Privacy regulatory protection should be regarded as a favorable factor that facilitates user data collection, rather than merely a constraint of firms’ data-driven marketing.
Secondly, this research broadens the application of regulatory focus theory in digital privacy. Focusing on the digital privacy of mobile users, this research adopts regulatory focus theory to explain how privacy regulatory protection affects users’ data sharing, providing a new research direction for this theory in emerging digital technology and data privacy fields.
Thirdly, this research advances the theoretical development in the digital marketing literature on balancing user data privacy protection and utilization. The two coexisting and independent psychological routes (i.e., privacy concerns and psychological ownership of data) remind researchers to consider not only negative privacy factors but also positive psychological factors in digital marketing studies.
Lastly, this research explores contingent factors that influence the effectiveness of privacy regulatory protection (i.e., firm control over data and user privacy efficacy). These insights offer actionable guidance for data-driven digital platform firms seeking to tailor their data collection strategies.

6.3. Managerial Implications

This research offers some implications for digital platform firms that want to unleash the power of privacy regulatory protection. Firstly, digital platforms should build a comprehensive understanding of privacy regulatory protection. They should regard privacy regulatory protection as a favorable environmental factor that facilitates user data collection, as this research unveils the entitlement and reassurance effects of privacy regulatory protection on users’ data sharing. In this regard, digital platform firms should showcase their efforts to reassure users of data safety, such as developing clear and transparent privacy policies.
Secondly, digital platform firms that want to encourage users to share data should consider the extent of control firms have over personal data. Firms should avoid excessive control over users’ data, especially when releasing and disseminating personal data without explicit user consent. Instead, they can offer more flexible options for users to determine which data can be made public or shared with a third party.
Thirdly, digital platform firms should take users’ digital literacy and privacy efficacy into account while developing data privacy strategies to encourage users’ data sharing. Firms can enhance users’ digital literacy by offering educational resources or user guides that clarify data sharing safety and privacy practices. In addition, providing easy-to-use privacy-enhancing tools, such as private browsing and identity anonymizers, can empower users to take control of their data security, ultimately boosting their privacy efficacy and confidence in sharing data.

6.4. Limitations and Future Research

Firstly, although this research was conducted in the context of mobile application usage with samples from the United States and China, the results may be context-specific due to the research context and sampling limitations. Future research is encouraged to incorporate a broader range of user demographics, such as nationality, culture, and value orientations, and to distinguish between different types of personal data to test whether the proposed research model still holds.
Secondly, in this research, regulatory focus theory was only used as a theoretical framework for exploring the influence mechanism of psychological ownership and privacy concerns. However, users with a promotion (prevention) focus might respond more strongly to the entitlement (reassurance) effect. Future research is encouraged to further investigate the boundary roles of prevention or promotion focus in this influence mechanism.
Thirdly, this research measured users’ behavioral intentions rather than their actual data sharing behavior. Although behavioral intentions can reflect real behavior to a certain extent, there may be a hypothetical bias. Future research could conduct field experiments to test the effects of real regulatory events or firms’ data strategies on user behavior.
Lastly, this research only examines the overall impact of privacy regulatory protection. Based on the framework proposed in this research, future research can further explore the impact of various regulatory measures (e.g., public service ads, corporate data processing licenses, and institutional rating assessments) and implementable corporate measures for enhancing users’ privacy efficacy (e.g., digital literacy education and privacy protection tool usage guidelines) on users’ data-sharing behavior.

Author Contributions

Conceptualization, J.K., J.L., S.H. and L.C.; methodology, J.L. and S.H.; investigation, J.L. and L.C.; writing—original draft preparation, J.L. and S.H.; writing—review & editing, J.K., J.L. and S.H.; funding acquisition, J.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Beijing Municipal Social Science Foundation, grant number 21GLC065.

Institutional Review Board Statement

The ethical review was approved by the Ethics Committee of the School of Economics and Management, Beijing University of Posts and Telecommunications on 20 July 2024.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data supporting reported results are available from the authors on request.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Following Humphreys and Wang [50], the two dictionaries used in the preliminary study were adapted from existing research. Specifically, we employed the stemming method of the NLTK (Natural Language Toolkit) package in Python for the dictionary-based English text analysis, as it efficiently reduces different word forms to their basic forms, avoiding semantic omissions caused by word form variations and improving the accuracy and consistency of the text analysis. Thus, the elements contained in the dictionaries are word stems.
The dictionary of psychological ownership was adapted from the seven-word psychological ownership dictionary of Smith et al. [51], generating seven stems of words that all happened to be the original words themselves (i.e., “me”, “mine”, “my”, “our”, “own”, “myself”, “ownership”).
The dictionary of privacy concerns was adapted from the privacy dictionary of Visentin et al. [52], generating 24 stems of words. Figure A1 shows the detailed construction process of the privacy concerns dictionary.
Figure A1. Privacy concerns dictionary generation process [52].
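A minimal sketch of the stem-based dictionary lookup, using NLTK’s Porter stemmer and the seven ownership stems listed above (the simple regex tokenizer is a simplifying assumption):

```python
import re
from nltk.stem import PorterStemmer

stemmer = PorterStemmer()
OWNERSHIP_STEMS = {"me", "mine", "my", "our", "own", "myself", "ownership"}

def mentions_ownership(review: str) -> bool:
    tokens = re.findall(r"[a-z']+", review.lower())
    # Stemming maps inflected forms (e.g., "owned", "owning") onto "own"
    # before the dictionary lookup, avoiding semantic omissions caused
    # by word form variations.
    return any(stemmer.stem(t) in OWNERSHIP_STEMS for t in tokens)

print(mentions_ownership("They owned my data without asking"))  # True
```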

Appendix B

Table A1. Sample demographics (N = 1095) (Section 5.1.1).

Demographic            Category                     Frequency   Percentage (%)
Gender                 Male                         439         40.1
                       Female                       656         59.9
Age                    18–29                        221         20.2
                       30–39                        241         22.0
                       40–49                        336         30.7
                       50 and above                 297         27.1
Education              College and below            289         26.4
(highest completed)    Bachelor’s degree            626         57.2
                       Master’s degree and above    180         16.4
Monthly income (CNY)   Less than 5000               244         22.3
                       5001–10,000                  443         40.5
                       10,001–15,000                226         20.6
                       15,001–20,000                100         9.1
                       More than 20,000             82          7.5

Appendix C

Table A2. The second-order CFA results (Section 5.1.2). (A) Using a second-order conceptualization of privacy concerns. (B) Using average scores for the three dimensions of privacy concerns.

(A)
Construct   Indicator   Standardized Loading   Unstandardized Loading   SE      t-Value   p
PC a        PS b        0.960                  1.000
PC a        PI b        0.978                  0.973                    0.021   47.206    0.000
PC a        SUPI b      0.913                  0.950                    0.022   42.410    0.000
PS          PS1         0.917                  0.878                    0.016   55.725    0.000
PS          PS2         0.929                  0.941                    0.016   58.323    0.000
PS          PS3         0.938                  1.000
PI          PI1         0.885                  0.846                    0.017   49.313    0.000
PI          PI2         0.895                  0.892                    0.017   51.003    0.000
PI          PI3         0.937                  1.000
SUPI        SUPI1       0.943                  0.921                    0.014   64.907    0.000
SUPI        SUPI2       0.948                  1.000                    0.015   66.590    0.000
SUPI        SUPI3       0.947                  1.000

(B)
Construct   Indicator   Standardized Loading   Unstandardized Loading   SE      t-Value   p
PC          PS          0.942                  1.032                    0.020   50.848    0.000
PC          PI          0.964                  0.975                    0.018   53.195    0.000
PC          SUPI        0.888                  1.000

Notes: PC = privacy concerns, PS = perceived surveillance, PI = perceived intrusion, SUPI = secondary use of personal information. The respective indicators of PS, PI, and SUPI are numbered serially (e.g., PS1, PS2, PS3, …, SUPI3). a Second-order factor. b Second-order indicator.

Appendix D

The moderated mediation custom model analysis was conducted with PROCESS version 4.3.1 code, which was downloaded from www.processmacro.org (accessed on 9 April 2023).
Figure A2. Conceptual diagram.
Before running the PROCESS code, we set the parameter moments to 1 (located in line 3037 of the PROCESS code) to estimate conditional effects at the mean and at plus and minus one standard deviation from the mean. Then, we ran the SPSS syntax “process y=SH /x=PRP /w=PE /z=FC /m=PC PO /conf=95 /cov=PPE AGE GEN APP1 APP2 APP3 APP4 /boot=5000 /bmatrix=1,1,0,0,1,1 /wmatrix=1,0,0,0,0,0 /zmatrix=0,1,0,0,0,0.” to estimate the custom model.

Appendix E

Table A3. Results of moderated mediation analysis (Section 5.2.2). (A) Mediator: psychological ownership of data. (B) Mediator: privacy concerns.

(A)
Moderator: Firm Control over Data   Indirect Effect   BootSE   BootLLCI   BootULCI
Low                                 0.0389            0.0133   0.0162     0.0682
Medium                              0.0195            0.0075   0.0072     0.0364
High                                0.0001            0.0068   −0.0119    0.0154
Index of moderated mediation        −0.0194           0.0074   −0.0353    −0.0064

(B)
Moderator: User Privacy Efficacy    Indirect Effect   BootSE   BootLLCI   BootULCI
Low                                 0.0400            0.0147   0.0110     0.0688
Medium                              0.1079            0.0241   0.0611     0.1560
High                                0.1758            0.0373   0.1051     0.2505
Index of moderated mediation        0.0679            0.0151   0.0397     0.0988

References

  1. Kotler, P.; Kartajaya, H.; Setiawan, I. Marketing 5.0: Technology for Humanity, 1st ed.; John Wiley & Sons: Hoboken, NJ, USA, 2021. [Google Scholar]
  2. Cui, T.H.; Ghose, A.; Halaburda, H.; Iyengar, R.; Pauwels, K.; Sriram, S.; Tucker, C.; Venkataraman, S. Informational Challenges in Omnichannel Marketing: Remedies and Future Research. J. Mark. 2021, 85, 103–120. [Google Scholar] [CrossRef]
  3. Quach, S.; Thaichon, P.; Martin, K.D.; Weaven, S.; Palmatier, R.W. Digital Technologies: Tensions in Privacy and Data. J. Acad. Mark. Sci. 2022, 50, 1299–1323. [Google Scholar] [CrossRef]
  4. Generation Privacy: Young Consumers Leading the Way. Available online: https://www.cisco.com/c/dam/en_us/about/doing_business/trust-center/docs/cisco-consumer-privacy-report-2023.pdf (accessed on 13 May 2025).
  5. Marikyan, D.; Papagiannidis, S.; Rana, O.F.; Ranjan, R. General Data Protection Regulation: A Study on Attitude and Emotional Empowerment. Behav. Inf. Technol. 2024, 43, 3561–3577. [Google Scholar] [CrossRef]
  6. Research Priorities 2022–2024. Available online: https://www.msi.org/wp-content/uploads/2022/10/MSI-2022-24-Research-Priorities-Final.pdf (accessed on 13 May 2025).
  7. Xu, H.; Teo, H.H.; Tan, B.C.Y.; Agarwal, R. The Role of Push-Pull Technology in Privacy Calculus: The Case of Location-Based Services. J. Manag. Inform. Syst. 2009, 26, 135–173. [Google Scholar] [CrossRef]
  8. Zhao, L.; Lu, Y.; Gupta, S. Disclosure Intention of Location-Related Information in Location-Based Social Network Services. Int. J. Electron. Commer. 2012, 16, 53–90. [Google Scholar] [CrossRef]
  9. Ioannou, A.; Tussyadiah, I.; Lu, Y. Privacy Concerns and Disclosure of Biometric and Behavioral Data for Travel. Int. J. Inf. Manag. 2020, 54, 102122. [Google Scholar] [CrossRef]
  10. Urbonavicius, S.; Degutis, M.; Zimaitis, I.; Kaduskeviciute, V.; Skare, V. From Social Networking to Willingness to Disclose Personal Data When Shopping Online: Modelling in the Context of Social Exchange Theory. J. Bus. Res. 2021, 136, 76–85. [Google Scholar] [CrossRef]
  11. Tang, J.; Zhang, B.; Akram, U. What Drives Authorization in Mobile Applications? A Perspective of Privacy Boundary Management. Information 2021, 12, 311. [Google Scholar] [CrossRef]
  12. Urbonavicius, S. Relative Power of Online Buyers in Regard to a Store: How it Encourages Them to Disclose Their Personal Data? J. Retail. Consum. Serv. 2023, 75, 103510. [Google Scholar] [CrossRef]
  13. Higgins, E.T. Beyond Pleasure and Pain. Am. Psychol. 1997, 52, 1280–1300. [Google Scholar] [CrossRef]
  14. Zhang, X.; Liu, S.; Chen, X.; Wang, L.; Gao, B.; Zhu, Q. Health Information Privacy Concerns, Antecedents, and Information Disclosure Intention in Online Health Communities. Inf. Manag. 2018, 55, 482–493. [Google Scholar] [CrossRef]
  15. Maseeh, H.I.; Nahar, S.; Jebarajakirthy, C.; Ross, M.; Arli, D.; Das, M.; Rehman, M.; Ashraf, H.A. Exploring the Privacy Concerns of Smartphone App Users: A Qualitative Approach. Mark. Intell. Plan. 2023, 41, 945–969. [Google Scholar] [CrossRef]
  16. Sun, Y.; Wang, N.; Shen, X.L.; Zhang, J.X. Location Information Disclosure in Location-Based Social Network Services: Privacy Calculus, Benefit Structure, and Gender Differences. Comput. Hum. Behav. 2015, 52, 278–292. [Google Scholar] [CrossRef]
  17. Xu, H.; Teo, H.H.; Tan, B.C.Y.; Agarwal, R. Effects of Individual Self-Protection, Industry Self-Regulation, and Government Regulation on Privacy Concerns: A Study of Location-Based Services. Inf. Syst. Res. 2012, 23, 1342–1363. [Google Scholar] [CrossRef]
  18. Lwin, M.; Wirtz, J.; Williams, J.D. Consumer Online Privacy Concerns and Responses: A Power-Responsibility Equilibrium Perspective. J. Acad. Mark. Sci. 2007, 35, 572–585. [Google Scholar] [CrossRef]
  19. Bandara, R.; Fernando, M.; Akter, S. Managing Consumer Privacy Concerns and Defensive Behaviors in the Digital Marketplace. Eur. J. Market. 2021, 55, 219–246. [Google Scholar] [CrossRef]
  20. Miltgen, C.L.; Smith, H.J. Exploring Information Privacy Regulation, Risks, Trust, and Behavior. Inf. Manag. 2015, 52, 741–759. [Google Scholar] [CrossRef]
  21. Lucas, G.; Molden, D.C. Motivating Political Preferences: Concerns with Promotion and Prevention as Predictors of Public Policy Attitudes. Motiv. Emot. 2011, 35, 151–164. [Google Scholar] [CrossRef]
  22. Schumacher, C.; Eggers, F.; Verhoef, P.C.; Maas, P. The Effects of Cultural Differences on Consumers’ Willingness to Share Personal Information. J. Interact. Mark. 2023, 58, 72–89. [Google Scholar] [CrossRef]
  23. Pierce, J.L.; Kostova, T.; Dirks, K.T. The State of Psychological Ownership: Integrating and Extending a Century of Research. Rev. Gen. Psychol. 2003, 7, 84–107. [Google Scholar] [CrossRef]
  24. Peck, J.; Luangrath, A.W. A Review and Future Avenues for Psychological Ownership in Consumer Research. Consum. Psychol. Rev. 2023, 6, 52–74. [Google Scholar] [CrossRef]
  25. Pierce, J.L.; Kostova, T.; Dirks, K.T. Toward a Theory of Psychological Ownership in Organizations. Acad. Manag. Rev. 2001, 26, 298–310. [Google Scholar] [CrossRef]
26. Peck, J.; Shu, S.B. The Effect of Mere Touch on Perceived Ownership. J. Consum. Res. 2009, 36, 434–447.
27. Kirk, C.P.; Peck, J.; Swain, S.D. Property Lines in the Mind: Consumers’ Psychological Ownership and Their Territorial Responses. J. Consum. Res. 2018, 45, 148–168.
28. Luo, Y.; Zhou, L.; Huang, J.; Wang, X.; Sun, R.; Zhu, G. Platform Perspective versus User Perspective: The Role of Expression Perspective in Privacy Disclosure. J. Retail. Consum. Serv. 2023, 73, 103372.
29. Demmers, J.; Weihrauch, A.N.; Thompson, F.H.M. Your Data Are (Not) My Data: The Role of Social Value Orientation in Sharing Data about Others. J. Consum. Psychol. 2022, 32, 500–508.
30. Cichy, P.; Salge, T.O.; Kohli, R. Privacy Concerns and Data Sharing in the Internet of Things: Mixed Methods Evidence from Connected Cars. MIS Q. 2021, 45, 1863–1891.
31. Smith, H.J.; Milberg, S.J.; Burke, S.J. Information Privacy: Measuring Individuals’ Concerns about Organizational Practices. MIS Q. 1996, 20, 167–196.
32. Xu, H.; Gupta, S.; Rosson, M.B.; Carroll, J.M. Measuring Mobile Users’ Concerns for Information Privacy. In Proceedings of the 33rd International Conference on Information Systems, Orlando, FL, USA, 16–19 December 2012. Available online: https://aisel.aisnet.org/icis2012/proceedings/ISSecurity/10 (accessed on 25 September 2024).
33. Swani, K.; Milne, G.R.; Slepchuk, A.N. Revisiting Trust and Privacy Concern in Consumers’ Perceptions of Marketing Information Management Practices: Replication and Extension. J. Interact. Mark. 2021, 56, 137–158.
34. Milberg, S.J.; Smith, H.J.; Burke, S.J. Information Privacy: Corporate Management and National Regulation. Organ. Sci. 2000, 11, 35–57.
35. Dinev, T.; Hart, P. An Extended Privacy Calculus Model for E-Commerce Transactions. Inf. Syst. Res. 2006, 17, 61–80.
36. Kang, J.; Lan, J.; Yan, H.; Li, W.; Shi, X. Antecedents of Information Sensitivity and Willingness to Provide. Mark. Intell. Plan. 2022, 40, 787–803.
37. Degirmenci, K. Mobile Users’ Information Privacy Concerns and the Role of App Permission Requests. Int. J. Inf. Manag. 2020, 50, 261–272.
38. Yin, J.; Qiu, X.; Wang, Y. The Impact of AI-Personalized Recommendations on Clicking Intentions: Evidence from Chinese E-Commerce. J. Theor. Appl. Electron. Commer. Res. 2025, 20, 21.
39. Akanfe, O.; Lawong, D.; Rao, H.R. Blockchain Technology and Privacy Regulation: Reviewing Frictions and Synthesizing Opportunities. Int. J. Inf. Manag. 2024, 76, 102753.
40. Ke, T.T.; Sudhir, K. Privacy Rights and Data Security: GDPR and Personal Data Markets. Manag. Sci. 2023, 69, 4389–4412.
41. Morewedge, C.K.; Monga, A.; Palmatier, R.W.; Shu, S.B.; Small, D.A. Evolution of Consumption: A Psychological Ownership Framework. J. Mark. 2021, 85, 196–218.
42. De Hert, P.; Papakonstantinou, V.; Malgieri, G.; Beslay, L.; Sanchez, I. The Right to Data Portability in the GDPR: Toward User-Centric Interoperability of Digital Services. Comput. Law Secur. Rev. 2018, 34, 193–203.
43. Houston, F.S.; Gassenheimer, J.B. Marketing and Exchange. J. Mark. 1987, 51, 3–18.
44. Liu, Y.L.; Wu, Y.; Li, C.; Song, C.; Hsu, W.Y. Does Displaying One’s IP Location Influence Users’ Privacy Behavior on Social Media? Evidence from China’s Weibo. Telecommun. Policy 2024, 48, 102759.
45. Yang, Q.; Gong, X.; Zhang, K.Z.K.; Liu, H.; Lee, M.K.O. Self-Disclosure in Mobile Payment Applications: Common and Differential Effects of Personal and Proxy Control Enhancing Mechanisms. Int. J. Inf. Manag. 2020, 52, 102065.
46. Rifon, N.J.; LaRose, R.; Choi, S.M. Your Privacy Is Sealed: Effects of Web Privacy Seals on Trust and Personal Disclosures. J. Consum. Aff. 2005, 39, 339–362.
47. Johnston, A.C.; Warkentin, M. Fear Appeals and Information Security Behaviors: An Empirical Study. MIS Q. 2010, 34, 549–566.
48. Kim, K.; Kim, J. Third-Party Privacy Certification as an Online Advertising Strategy: An Investigation of the Factors Affecting the Relationship between Third-Party Certification and Initial Trust. J. Interact. Mark. 2011, 25, 145–158.
49. Crossler, R.E.; Bélanger, F. Why Would I Use Location-Protective Settings on My Smartphone? Motivating Protective Behaviors and the Existence of the Privacy Knowledge–Belief Gap. Inf. Syst. Res. 2019, 30, 995–1006.
50. Humphreys, A.; Wang, R.J.H. Automated Text Analysis for Consumer Research. J. Consum. Res. 2018, 44, 1274–1306.
51. Smith, L.W.; Rose, R.L.; Zablah, A.R.; McCullough, H.; Saljoughian, M.M. Examining Post-Purchase Consumer Responses to Product Automation. J. Acad. Mark. Sci. 2023, 51, 530–550.
52. Visentin, M.; Tuan, A.; Di Domenico, G. Words Matter: How Privacy Concerns and Conspiracy Theories Spread on Twitter. Psychol. Mark. 2021, 38, 1828–1846.
53. The 54th China Statistical Report on Internet Development. Available online: https://www.cnnic.cn/NMediaFile/2024/0906/MAIN17255881028985DZD0SVVQH.pdf (accessed on 13 May 2025).
54. Grosso, M.; Castaldo, S.; Li, H.A.; Larivière, B. What Information Do Shoppers Share? The Effect of Personnel-, Retailer-, and Country-Trust on Willingness to Share Information. J. Retail. 2020, 96, 524–547.
55. Xu, H.; Dinev, T.; Smith, J.; Hart, P. Information Privacy Concerns: Linking Individual Perceptions with Institutional Privacy Assurances. J. Assoc. Inf. Syst. 2011, 12, 798–824.
56. Ramani, G.; Kumar, V. Interaction Orientation and Firm Performance. J. Mark. 2008, 72, 27–45.
57. Bagozzi, R.P.; Yi, Y. On the Evaluation of Structural Equation Models. J. Acad. Mark. Sci. 1988, 16, 74–94.
58. Fornell, C.; Larcker, D.F. Evaluating Structural Equation Models with Unobservable Variables and Measurement Error. J. Mark. Res. 1981, 18, 39–50.
59. Hayes, A.F. Introduction to Mediation, Moderation, and Conditional Process Analysis: A Regression-Based Approach, 2nd ed.; Guilford Publications: New York, NY, USA, 2018.
60. Lindell, M.K.; Whitney, D.J. Accounting for Common Method Variance in Cross-Sectional Research Designs. J. Appl. Psychol. 2001, 86, 114–121.
Figure 1. Conceptual model. Notes: The arrows show the hypothesized relationships among the variables.
Figure 2. Percentage of reviews related to psychological ownership of data or privacy concerns, by quarter. Notes: The dashed line marks the passage of the CCPA.
Figure 3. Moderation effect analysis. Notes: Panel (a) shows the moderation effect of firm control over data, and panel (b) shows the moderation effect of user privacy efficacy.
Table 1. Selected studies on privacy regulatory protection and individual data sharing.
| Study | Research Context | Key Findings | Mediators | Moderators |
|---|---|---|---|---|
| Xu et al. [7] ▲ | Location-based services in Singapore | Government regulation increases user intention to share personal information. | Privacy risks | Information delivery mechanisms |
| Zhao et al. [8] ▲● | Location-based social network services in China | Awareness of legislation increases intention to share location-based information. | Privacy concerns | |
| Ioannou et al. [9] ▲● | Travelers’ online services in the United Kingdom | Perceptions of privacy protection regulation are negatively associated with privacy concerns; however, the negative link between privacy concerns and data sharing was not empirically supported. | Privacy concerns | |
| Urbonavicius et al. [10] ▲● | Social networking and online buying in Lithuania | Perceived regulatory effectiveness has a positive direct effect and a negative indirect effect on willingness to share personal data in e-buying. | Perceived lack of control | |
| Tang et al. [11] ● | Social media mobile apps in China | Government regulation increases user intention to authorize personal information. | Perceived privacy control, perceived privacy risk, privacy concern, trust | |
| Urbonavicius [12] ▲● | Online buying in Lithuania | Privacy regulation increases willingness to share personal data. | Store trust | |
| This research | Mobile apps in China | Privacy regulatory protection exerts a positive effect on users’ intention to share personal data. | Psychological ownership of data, privacy concerns | Firm control over data, user privacy efficacy |
Notes: ▲ indicates that this study mainly focused on the mediating role of users’ privacy perception; ● indicates that this study did not explore the contingent factors.
Table 2. Binary logistic regression results.
| Predictor | Psychological Ownership of Data: Odds Ratio | SE | Privacy Concerns: Odds Ratio | SE |
|---|---|---|---|---|
| Review length | 1.016 *** | 0.001 | 1.003 *** | 0.001 |
| Review rating | 1.022 | 0.038 | 0.914 * | 0.039 |
| CCPA (0 = before the date of passage, 1 = after the date of passage) | 1.254 ** | 0.072 | 0.628 *** | 0.076 |
| Log-likelihood | 4590.559 | | 4447.458 | |
| Number of observations | 3766 | | 3766 | |
Notes: The dependent variables were measured as dummy variables. *** p < 0.001; ** p < 0.01; * p < 0.05.
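For readers who want to reproduce an analysis like Table 2, the sketch below shows one way to fit such a binary logit and report odds ratios. This is a minimal illustration, not the authors' code; the DataFrame `reviews` and its column names (`po_data` for a dummy marking reviews that mention psychological ownership of data, `review_length`, `review_rating`, and `post_ccpa` for a dummy marking reviews posted after the CCPA's passage) are hypothetical placeholders.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def fit_logit(df: pd.DataFrame, dv: str) -> pd.DataFrame:
    """Fit a binary logit and return odds ratios with standard errors."""
    # Predictors as in Table 2: review length, review rating, CCPA dummy.
    X = sm.add_constant(df[["review_length", "review_rating", "post_ccpa"]])
    res = sm.Logit(df[dv], X).fit(disp=False)
    return pd.DataFrame({
        "odds_ratio": np.exp(res.params),  # exponentiated coefficients
        "se": res.bse,                     # SEs of the log-odds coefficients
        "p_value": res.pvalues,
    })

# Example usage (reviews is a hypothetical coded-review DataFrame):
# print(fit_logit(reviews, dv="po_data"))
```

An odds ratio above 1 (e.g., 1.254 for the CCPA dummy in the ownership model) indicates that the predictor raises the odds of the outcome, while one below 1 (e.g., 0.628 in the concerns model) indicates it lowers them.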
Table 3. Measurement items.
| Construct / Item | λ | α | CR | AVE |
|---|---|---|---|---|
| Perceived privacy regulatory protection [20] | | 0.87 | 0.83 | 0.58 |
| Chinese legislation can cope with the growing number of people leaving personal data on apps | 0.77 | | | |
| I believe that the systems used by the public authorities to manage the citizens’ personal data are technically secure | 0.85 | | | |
| I believe citizens will be able to keep a good level of control over their personal data | 0.61 | | | |
| I will always be able to rely on public authorities for help if problems arise with my personal data | 0.76 | | | |
| I believe that the authorities that manage my personal data are professional and competent | 0.80 | | | |
| Psychological ownership of data [26] | | 0.89 | 0.80 | 0.71 |
| I feel like my personal data is mine | 0.78 | | | |
| I feel a very high degree of personal ownership of my personal data | 0.89 | | | |
| I feel like I own my personal data | 0.86 | | | |
| Privacy concerns [32] a | | 0.95 | 0.85 | 0.87 |
| Perceived surveillance b | 0.95 | | | |
| I believe that the location of my mobile device is monitored at least part of the time | | | | |
| I am concerned that this app is collecting too much data about me | | | | |
| I am concerned that this app may monitor my activities on my mobile device | | | | |
| Perceived intrusion b | 0.96 | | | |
| I feel that as a result of my using this app, others know about me more than I am comfortable with | | | | |
| I believe that as a result of my using this app, data about me that I consider private is now more readily available to others than I would want | | | | |
| I feel that as a result of my using this app, data about me is out there that, if used, will invade my privacy | | | | |
| Secondary use of personal data b | 0.89 | | | |
| I am concerned that mobile apps may use my personal data for other purposes without notifying me or getting my authorization | | | | |
| When I give personal data to use mobile apps, I am concerned that apps may use my data for other purposes | | | | |
| I am concerned that mobile apps may share my personal data with other entities without getting my authorization | | | | |
| Firm control over data [17] | | 0.95 | 0.88 | 0.78 |
| The firm that owns this app has control over my personal data that has been released | 0.88 | | | |
| The firm that owns this app has control over the amount of my personal data to be collected | 0.86 | | | |
| Overall, the firm that owns this app has control over my personal data provided to the app | 0.86 | | | |
| The firm that owns this app has control over who can get access to my personal data | 0.88 | | | |
| The firm that owns this app has control over how my personal data is being used by the app | 0.94 | | | |
| User privacy efficacy [49] | | 0.94 | 0.88 | 0.68 |
| I can figure out which apps to trust on my phone | 0.76 | | | |
| I am confident I know how to prevent receiving targeted ads on my phone | 0.78 | | | |
| I believe I know how to limit the data I share with apps from my phone | 0.83 | | | |
| I am confident that I am aware of when my location is being used on my phone | 0.82 | | | |
| I know how to change the settings of my phone to protect my privacy | 0.78 | | | |
| I am able to protect myself against the release of personal data on my phone | 0.88 | | | |
| Overall, I am confident that I can protect my privacy on my phone | 0.90 | | | |
| It is easy to control the sharing of location data on my phone | 0.83 | | | |
| Intention to share personal data [54] | | 0.92 | 0.85 | 0.76 |
| I am willing to share my personal data with this app | 0.84 | | | |
| I will probably share my personal data with this app | 0.89 | | | |
| I will likely share my personal data with this app | 0.84 | | | |
| I will possibly share my personal data with this app | 0.91 | | | |
| Previous privacy experience [55] | | 0.92 | 0.83 | 0.80 |
| I have experienced incidents that I felt were an improper invasion of data privacy | 0.91 | | | |
| I have heard or read during the past year about the use and potential misuse of the data collected from apps | 0.89 | | | |
| I have experienced incidents where my personal data was used by an app without my authorization | 0.89 | | | |
Notes: The model fit of the CFA with the seven latent factors: χ2(413) = 942.862, CFI = 0.981, TLI = 0.979, RMSEA = 0.034. The model fit of the CFA with the second-order model of privacy concerns: χ2(24) = 105.834, CFI = 0.994, TLI = 0.991, RMSEA = 0.056. λ = standardized loading, α = Cronbach’s alpha, CR = composite reliability, AVE = average variance extracted. a Second-order factor. b Second-order indicator.
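For reference, the CR and AVE statistics in Table 3 follow the conventional formulas for a construct with n standardized loadings λ_i [57,58]:

```latex
\mathrm{CR} = \frac{\left(\sum_{i=1}^{n}\lambda_i\right)^{2}}
{\left(\sum_{i=1}^{n}\lambda_i\right)^{2} + \sum_{i=1}^{n}\left(1-\lambda_i^{2}\right)},
\qquad
\mathrm{AVE} = \frac{\sum_{i=1}^{n}\lambda_i^{2}}{n}
```

As a check, the three psychological-ownership loadings (0.78, 0.89, 0.86) give AVE = (0.78² + 0.89² + 0.86²)/3 ≈ 0.71, matching the table.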
Table 4. Means, standard deviations, and correlations of the variables.
| Variable | Mean | SD | 1 | 2 | 3 | 4 | 5 | 6 | 7 |
|---|---|---|---|---|---|---|---|---|---|
| 1. Perceived privacy regulatory protection | 5.71 | 0.94 | 0.76 | 0.09 ** | −0.36 ** | −0.07 * | 0.54 ** | 0.37 ** | −0.15 ** |
| 2. Psychological ownership of data | 5.65 | 1.25 | 0.09 * | 0.84 | 0.04 | −0.09 ** | 0.09 ** | 0.13 ** | 0.03 |
| 3. Privacy concerns | 4.47 | 1.76 | −0.36 ** | 0.04 | 0.93 | 0.30 ** | −0.58 ** | −0.55 ** | 0.51 ** |
| 4. Firm control over data | 5.04 | 1.44 | −0.07 * | −0.09 ** | 0.30 ** | 0.88 | −0.16 ** | −0.12 ** | 0.30 ** |
| 5. User privacy efficacy | 4.95 | 1.28 | 0.54 ** | 0.09 ** | −0.58 ** | −0.16 ** | 0.82 | 0.49 ** | −0.27 ** |
| 6. Intention to share personal data | 4.88 | 1.36 | 0.37 ** | 0.13 ** | −0.55 ** | −0.12 ** | 0.49 ** | 0.87 | −0.23 ** |
| 7. Previous privacy experience | 5.32 | 1.45 | −0.15 ** | 0.03 | 0.51 ** | 0.30 ** | −0.27 ** | −0.23 ** | 0.90 |

Notes: Original correlations among the constructs are below the diagonal; adjusted correlations for potential common method variance are above the diagonal. Numbers on the diagonal are the square roots of the average variance extracted (AVE) values. ** p < 0.01; * p < 0.05.
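The adjusted correlations above the diagonal appear to follow the marker-variable correction of Lindell and Whitney [60], in which a proxy for common method variance, typically the smallest observed correlation r_M, is partialed out of each uncorrected correlation r_U:

```latex
r_{\mathrm{adj}} = \frac{r_{U} - r_{M}}{1 - r_{M}}
```

Because the key correlations retain their signs and significance after this adjustment, common method variance is unlikely to account for the observed relationships.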
Table 5. Structural model testing results.
| Path | Model 1 γ | Model 1 t | Model 2 γ | Model 2 t | Model 3 γ | Model 3 t |
|---|---|---|---|---|---|---|
| Perceived privacy regulatory protection → Psychological ownership of data (H1) | 0.09 * | 2.55 | 0.08 * | 2.44 | 0.09 * | 2.59 |
| Psychological ownership of data → Intention to share personal data (H2) | 0.15 ** | 5.44 | 0.14 ** | 4.85 | 0.15 ** | 5.45 |
| Perceived privacy regulatory protection → Privacy concerns (H3) | −0.36 ** | −11.08 | −0.35 ** | −10.92 | −0.34 ** | −10.43 |
| Privacy concerns → Intention to share personal data (H4) | −0.56 ** | −19.09 | −0.49 ** | −16.22 | −0.56 ** | −18.83 |
| Perceived privacy regulatory protection → Intention to share personal data | | | 0.18 ** | 5.97 | | |
Notes: Model 1 was a fully mediated model with no direct effect; Model 2 was a partially mediated model with direct effects; and Model 3 was a structural model used to test common method bias. The dependent variables were measured using Likert-type scales. γ = standardized estimates. ** p < 0.01; * p < 0.05.
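Indirect effects like those implied by Models 1 and 2 are typically probed with bootstrap confidence intervals in the spirit of Hayes [59]. The sketch below is a minimal illustration of a percentile bootstrap for the reassurance path (regulation → privacy concerns → sharing intention); it is not the authors' code, and the DataFrame `df` with columns `pprp`, `concerns`, and `intent` is a hypothetical placeholder for the survey data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def indirect_effect(d: pd.DataFrame) -> float:
    """a-path (X -> M) times b-path (M -> Y, controlling for X)."""
    a = sm.OLS(d["concerns"], sm.add_constant(d[["pprp"]])).fit().params["pprp"]
    b = sm.OLS(d["intent"], sm.add_constant(d[["concerns", "pprp"]])).fit().params["concerns"]
    return a * b

def bootstrap_ci(d: pd.DataFrame, n_boot: int = 5000, seed: int = 0):
    """95% percentile bootstrap CI for the indirect effect."""
    rng = np.random.default_rng(seed)
    estimates = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(d), size=len(d))  # resample rows with replacement
        estimates.append(indirect_effect(d.iloc[idx]))
    return np.percentile(estimates, [2.5, 97.5])

# Example usage: print(bootstrap_ci(df))
```

A confidence interval that excludes zero indicates a significant indirect effect; with the Table 5 estimates, the product of the negative a-path (H3) and negative b-path (H4) yields a positive indirect effect of regulatory protection on sharing intention.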
Table 6. Regression analysis results.
Dependent variables: psychological ownership of data (Models 4 and 5); privacy concerns (Models 6 and 7).

| Predictor | Model 4 β | Model 4 t | Model 5 β | Model 5 t | Model 6 β | Model 6 t | Model 7 β | Model 7 t |
|---|---|---|---|---|---|---|---|---|
| Perceived privacy regulatory protection | 0.10 | 3.14 ** | 0.12 | 3.74 *** | −0.26 | −10.18 *** | −0.15 | −4.58 *** |
| Firm control over data | | | −0.07 | −2.27 * | | | | |
| User privacy efficacy | | | | | | | −0.42 | −15.50 *** |
| Perceived privacy regulatory protection × firm control over data | | | −0.11 | −3.55 *** | | | | |
| Perceived privacy regulatory protection × user privacy efficacy | | | | | | | −0.12 | −4.26 *** |
| Previous privacy experience | 0.03 | 0.98 | 0.06 | 2.05 * | 0.44 | 16.97 *** | 0.35 | 15.01 *** |
| Age | −0.01 | −0.37 | −0.01 | −0.36 | −0.02 | −0.58 | −0.05 | −2.08 * |
| a: Male | 0.18 | 5.89 *** | 0.17 | 5.73 *** | −0.01 | −0.47 | 0.03 | 1.49 |
| b: Short video | −0.05 | −1.06 | −0.05 | −1.16 | 0.01 | 0.20 | 0.02 | 0.55 |
| b: Online shopping | −0.02 | −0.46 | −0.03 | −0.66 | −0.02 | −0.46 | −0.01 | −0.26 |
| b: Social networking | 0.02 | 0.46 | 0.02 | 0.53 | 0.08 | 2.57 ** | 0.05 | 1.70 |
| b: Instant messaging | 0.04 | 1.04 | 0.03 | 0.86 | 0.03 | 0.87 | 0.03 | 0.97 |
| Adj-R² | 0.04 | | 0.06 | | 0.30 | | 0.44 | |
Notes: Categorical variables: a = gender (vs. female); b = app category (vs. others). The dependent variables were measured using Likert-type scales. β = standardized coefficient. *** p < 0.001; ** p < 0.01; * p < 0.05.
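The interaction coefficients in Models 5 and 7 can be read through a standard simple-slopes decomposition. Writing X for perceived privacy regulatory protection and W for the moderator, the fitted model and the conditional effect of X are:

```latex
\hat{Y} = \beta_0 + \beta_1 X + \beta_2 W + \beta_3 (X \times W),
\qquad
\frac{\partial \hat{Y}}{\partial X} = \beta_1 + \beta_3 W
```

Evaluating the conditional slope at W one standard deviation above and below its mean, the negative β₃ in Model 5 (−0.11) implies that the positive effect of regulatory protection on psychological ownership weakens as firm control over data rises, and the negative β₃ in Model 7 (−0.12) implies that regulatory protection reduces privacy concerns more strongly for users with higher privacy efficacy, consistent with Figure 3.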