Article

Trust, Privacy Fatigue, and the Informed Consent Dilemma in Mobile App Privacy Pop-Ups: A Grounded Theory Approach

Glorious Sun School of Business and Management, Donghua University, Shanghai 200051, China
* Author to whom correspondence should be addressed.
J. Theor. Appl. Electron. Commer. Res. 2025, 20(3), 179; https://doi.org/10.3390/jtaer20030179
Submission received: 6 June 2025 / Revised: 10 July 2025 / Accepted: 11 July 2025 / Published: 14 July 2025

Abstract

As data becomes a core driver of modern business innovation, mobile applications increasingly collect and process users’ personal information, posing significant challenges to the effectiveness of informed consent and the legitimacy of user authorization. Existing research on privacy informed consent mechanisms has predominantly focused on privacy policy texts and normative legal discussions, often overlooking a critical touchpoint—the launch-time privacy pop-up window. Moreover, empirical investigations from the user’s perspective remain limited. To address these issues, this study employs a two-stage approach combining compliance audit and grounded theory. The preliminary audit of 21 mobile apps assesses the compliance of privacy pop-ups, and the formal study uses thematic analysis of interviews with 19 participants to construct a dual-path explanatory framework. Key findings reveal that: (1) while the reviewed apps partially safeguarded users’ right to be informed, compliance deficiencies still persist; (2) trust and privacy fatigue emerge as dual motivations driving user consent. Trust plays a critical role in amplifying the impact of positive messages within privacy pop-ups by enhancing the consistency among users’ cognition, affect, and behavior, thereby reducing resistance to privacy consent and improving the effectiveness of the current informed consent framework. Conversely, privacy fatigue increases the inconsistency among these factors, undermining consent effectiveness and exacerbating the challenges associated with informed consent. This study offers a user-centered framework to explain the dynamics of informed consent in mobile privacy pop-ups and provides actionable insights for regulators, developers, and privacy advocates seeking to enhance transparency and user autonomy.

1. Introduction

In the context of data-driven business development, the collection, storage, and utilization of user information have become critical to the sustainable growth of mobile applications (apps). Both the EU General Data Protection Regulation (GDPR) and the Personal Information Protection Law of the People’s Republic of China (PIPL) mandate that data processors obtain “freely given, specific, informed and unambiguous consent” (GDPR Article 4(11)) prior to data processing. These provisions effectively establish the principle of “informed consent” as a fundamental norm in privacy practices, balancing the rights and obligations of all entities involved. Specifically, they impose on data processors the duty to disclose information and obtain consent, while granting data subjects the rights to be informed and to make autonomous decisions [1].
In privacy practices, apps employ privacy policies to disclose the scope, purposes, and methods of personal information collection [2], thereby fulfilling transparency obligations and demonstrating accountability. These policies not only inform users of their data rights and choices but also provide regulators with a basis for preliminary compliance assessment. However, when apps employ complex legal jargon and pre-checked default options that substantially impair user comprehension, the so-called informed consent degenerates into a legal fiction [3]. This indicates that the realization of idealized informed consent continues to encounter considerable obstacles.
On the one hand, privacy policies are often presented as text links, fulfilling a basic “notice” function. However, this mechanism often proves to be a mere formality due to inherent characteristics of privacy policy content. Specifically, the policies’ length [4] and frequent use of complex legal jargon [5,6] render them difficult for users to effectively understand. Indeed, even experts find these policies to be misleading and coercive [7]. This dual characteristic not only reduces policy transparency but also directly exacerbates the existing information asymmetry between apps and users. Consequently, consent given under these conditions is superficial and essentially irrational. On the other hand, leveraging data for commercial prosperity requires app developers to invest substantial resources in data tracking, processing, and analysis [8]. This continuous resource investment can generate tension with commercial objectives. Driven by profitability pressures, app developers may adopt aggressive strategies for acquiring user privacy data, reducing the privacy policies they construct to mere instruments for passively responding to legal regulation.
While prior research has largely centered on privacy policies, it often neglects the interactive components that users actually engage with—namely, privacy pop-ups (typically displayed during app launch). These pop-ups offer concise summaries of data processing practices and are designed to secure user consent in a timely and ostensibly compliant manner. Functionally, privacy pop-ups and full-length privacy policies operate as complementary tools: pop-ups deliver essential highlights and direct users to policy links for further details. Importantly, a user’s consent via a pop-up is often interpreted as legal agreement to the full privacy policy. Despite regulatory mandates like China’s Cybersecurity Standard Practice Guide—Self-Assessment Guide for the Collection and Use of Personal Information by Mobile Internet Apps (CSPG) requiring explicit privacy policy disclosure via pop-ups upon initial launch, academic research has paid less attention to the content of privacy pop-ups. This raises two research questions: (RQ1) To what extent do current app privacy pop-ups comply with legal principles of transparent data collection and explicit notification? (RQ2) From a user perspective, how does interaction with privacy pop-ups contribute to informed consent dilemmas, and what underlying factors shape this process?
To address the aforementioned research questions, this study employs textual review and grounded theory in two sequential investigations. The preliminary study develops 7 evaluation dimensions and 14 assessment indicators based on the CSPG and related regulatory frameworks, conducting a compliance review of privacy pop-up content in 21 widely used Chinese mobile apps. Building on these findings, the formal study conducts semi-structured interviews with 19 participants regarding privacy pop-up interfaces, followed by manual coding and thematic analysis using grounded theory’s three-step coding procedure (open, axial, and selective coding), ultimately constructing a theoretical framework that elucidates the process of user informed consent. Our findings challenge the prevailing practice of relying solely on privacy pop-up content to obtain valid user consent, and advocate for a user-centric transformation in the design and presentation of privacy pop-ups.

2. Literature Review

In the medical field, the principle of informed consent aims to safeguard the autonomy and self-determination of potential subjects or patients by providing a structured framework and procedure for communication with those receiving medical treatment or participating in research [9]. This principle has been extended to privacy protection practices in mobile applications. The Autonomous Authorization Model proposed by Faden and Beauchamp [10] encompasses four key elements: understanding, absence of control/voluntariness, intentionality, and authorization. In the context of commercial interactions, this model underscores the responsibility of organizations to respect consumers’ autonomy by providing clear and accurate information and avoiding manipulative or coercive practices, thereby enabling genuinely informed and voluntary authorization.
The principle of informed consent reflects humanity’s aspiration toward fairness and justice. However, users do not make decisions regarding privacy pop-ups in a vacuum; rather, their choices are context-dependent and subject to dual constraints—individual cognitive limitations and designer-imposed frameworks. As the processes of data collection, storage, and utilization become increasingly complex and opaque, it becomes difficult for users to continuously track and comprehend how their personal data is being used [11]. Moreover, the foundational assumption that privacy decisions are grounded in individual rationality has been widely questioned and challenged by scholars. Simon [12] argued that human rationality is at best a rough approximation of full rationality. In practice, users often ignore privacy terms and click the “agree” button out of habit [13], exhibiting behaviors that are characterized by bounded rationality and even irrationality.
Numerous mobile applications intentionally employ manipulative interaction flows and “dark pattern” interface designs to nudge users toward developer-preferred choices [14]. A prominent example is the widely adopted “agree or exit” pseudo-choice architecture, which effectively strips users of substantive decision-making agency [15]. Such deceptive design strategies, both in terms of content framing and presentation, often result in users providing passive consent under conditions of incomplete information, thereby transforming consent into a largely automated, routinized process [16]. To mitigate these issues, scholars have proposed simplifying the language of privacy policies, employing visual aids and interactive interfaces to enhance informational transparency and content comprehensibility [17,18]. Additionally, negotiating consent is a powerful interaction mechanism that engages users and can enable them to strike a balance between privacy and pricing concerns [19].
Ensuring users’ informed consent is not only a prerequisite for the fair utilization of data but also a cornerstone for fostering trust in commercial society. Existing studies have extensively explored this issue from perspectives such as empirical cases and research design. Building upon prior research, this paper adopts a user-centric perspective to examine how the content of privacy pop-up interfaces influences users’ cognition and decision-making behaviors, thereby offering a novel theoretical lens for understanding the dilemma of informed consent.

3. Preliminary Study

Privacy pop-ups are the primary interface for communicating privacy policies, conveying data processing rules and facilitating informed consent. However, mobile apps often implement informed consent mechanisms during registration or initial launch, which prematurely closes negotiation space and reduces user bargaining power [20]. Moreover, current regulatory frameworks lack unified standards for presenting privacy pop-up content, and the effectiveness of their design in safeguarding users’ right to informed consent remains questionable. Therefore, our preliminary study systematically evaluates the compliance of privacy pop-up content in mainstream mobile applications through textual analysis.

3.1. Sample Selection and Analytical Framework

This study builds upon the classification of mobile applications specified in the Regulations on the Scope of Necessary Personal Information for Common Types of Mobile Internet Applications and applies a multi-criteria selection protocol to ensure the representativeness and validity of our privacy pop-up compliance analysis.
First, we ensured broad industry coverage by adopting a one-to-one mapping strategy, in which each of the 20 selected mobile apps represents a distinct usage category (e.g., Meituan for food delivery, TikTok for mobile short videos). This approach enabled us to capture a wide range of user-facing digital services and conduct comparative analyses across different sectors. Second, the selected apps demonstrated high user penetration. Each app ranked highly in major Chinese Android app markets (e.g., Huawei App Gallery, Tencent App Store) in terms of download volume and active user base. Their widespread adoption and strong brand recognition suggest that their privacy authorization practices are representative of their respective categories. Third, we avoided including multiple apps from the same developer to minimize bias arising from interface design homogeneity. For example, having already selected Baidu Maps in the navigation category, we chose Quark Browser instead of Baidu Browser in the browser category. Finally, given the rapid development and widespread adoption of generative artificial intelligence (e.g., ChatGPT, Kimi), we additionally included the category of “AI assistants” to reflect emerging privacy authorization scenarios. In total, the sample consisted of 21 mobile applications, from which we collected approximately 13,000 Chinese characters of privacy pop-up content for compliance analysis.
PIPL, CSPG, and China’s National Standard GB/T 35273-2020 (Information Security Technology—Personal Information Security Specification) [21] establish key compliance benchmarks for reviewing mobile app privacy policies. Drawing on these standards, we further incorporated assessment indicators proposed by Zaeem and Barber [22] in their evaluation of privacy policy compliance. Based on this, we developed a comprehensive evaluation framework encompassing seven assessment dimensions and fourteen specific indicators, which was then used to systematically assess the compliance of privacy pop-up content. The detailed results of this evaluation are presented in Table 1.
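To make the audit procedure concrete, the sketch below shows one way such a checklist could be operationalized in code. It is a minimal illustration only: the dimension and indicator names are hypothetical stand-ins (the actual 7 dimensions and 14 indicators are defined in Table 1), and the study’s audit was conducted through manual textual review rather than software.

```python
from dataclasses import dataclass, field

# Hypothetical indicator set: the study's actual 7 dimensions and 14
# indicators appear in Table 1; the names below are illustrative stand-ins.
INDICATORS = {
    "notice": ["explicit_title", "policy_link_present"],
    "consent": ["no_prechecked_boxes", "separate_agree_button"],
    "data_scope": ["purpose_stated", "data_types_listed"],
    "third_party": ["sharing_disclosed", "recipients_identified"],
    "storage": ["retention_period", "storage_location"],
    "updates": ["change_notification"],
    "redress": ["feedback_channel", "refusal_alternative", "withdrawal_path"],
}

@dataclass
class PopupAudit:
    app_name: str
    # indicator -> True (compliant) / False (non-compliant), judged manually
    results: dict = field(default_factory=dict)

    def dimension_scores(self) -> dict:
        """Share of satisfied indicators per assessment dimension."""
        scores = {}
        for dim, inds in INDICATORS.items():
            hits = sum(self.results.get(i, False) for i in inds)
            scores[dim] = hits / len(inds)
        return scores

# Example: recording an auditor's judgments for one app's launch pop-up.
audit = PopupAudit("ExampleApp", {
    "explicit_title": True, "policy_link_present": True,
    "no_prechecked_boxes": True, "retention_period": False,
    "change_notification": False, "feedback_channel": False,
})
print(audit.dimension_scores())
```

A dimension score of 1.0 would indicate that every indicator in that dimension was satisfied for the audited pop-up.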

3.2. Main Findings and Conclusions

The splash-screen privacy pop-ups of the 21 analyzed mobile applications primarily consist of links to user agreements, privacy policies, and permission requests, reflecting a commonly adopted platform strategy to fulfill the formal requirements of informed consent. The primary findings of the preliminary study are summarized as follows:
First, more than half of the apps used explicit titles such as “User Agreement and Privacy Policy,” while six apps employed vague labels like “Kind Reminder.” The remaining apps lacked any title altogether, substituting it with icons or welcome messages. These differences indicate considerable variation in privacy communication strategies, with some designs potentially aimed at minimizing users’ privacy awareness.
Second, app responses to users’ initial refusal to consent varied significantly. Most platforms attempted to persuade users by re-emphasizing the importance of the privacy policy. However, several adopted a rigid “reject-to-exit” model, whereby access to the app is blocked upon denial of consent. Only one app offered a limited-function mode to preserve minimal usability. These divergent interaction patterns may reflect differences in the extent to which the core business models of apps depend on user privacy data, as well as inconsistencies in developers’ interpretation and implementation of Article 16 of the PIPL, which mandates that refusal to authorize should not restrict access to basic services.
Third, although all apps had eliminated pre-selected checkboxes and avoided premature data collection, significant shortcomings remain in the protection of users’ right to be informed. As shown in Table 1, the key deficiencies include: (1) Lack of update notifications: none of the apps specified a mechanism to notify users of policy changes, thus failing to comply with Article 17 of the PIPL. (2) Opaqueness in data storage: no app disclosed essential information such as data retention periods or storage locations. (3) Insufficient disclosure of data sharing: only eight apps briefly mentioned third-party data sharing, while the remaining apps lacked relevant information. (4) Absence of feedback channels: no app provided a visible or accessible way for users to raise concerns or suggestions during the consent process.
In sum, although privacy pop-ups serve as an initial layer of compliance, their content remains superficial and largely symbolic. The mechanisms in place often prioritize procedural formality over substantive transparency and user empowerment. To better understand how users perceive and react to such interfaces, the subsequent formal study employs semi-structured interviews to explore the informed consent experience from a user-centered perspective, addressing the limitations identified in this preliminary phase.

4. Formal Study

4.1. Research Design

4.1.1. Methodology

Grounded theory, proposed by Glaser et al. [23], advocates that theoretical development should emerge from collected data rather than be derived deductively from existing theoretical frameworks. Unlike approaches that emphasize purely theoretical analysis, grounded theory also highlights the importance of the researcher’s personal experience, allowing researchers to draw upon their own knowledge and insights to retrospectively code the collected data and construct theory through a bottom-up, data-driven process.
Given the variations in privacy pop-up content across apps and the study’s focus on exploring informed consent dilemmas from users’ perspectives, we adopted the three-stage coding paradigm of Procedural Grounded Theory. Specifically, based on the preliminary study, we developed an interview guide and collected primary data through semi-structured interviews. Using the qualitative analysis software NVivo 14, we first conducted open coding to extract initial concepts, followed by axial coding to establish the logical relationships among these concepts, and finally performed selective coding to integrate and construct the core theoretical framework. Before finalizing the framework, we conducted a theoretical saturation test to ensure its completeness and reliability.
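To illustrate how the three coding stages progressively structure the data, the following sketch traces a few fragments from Table A1 through open, axial, and selective coding. It is purely schematic: the study performed this coding manually in NVivo 14, and the core-category labels below are illustrative stand-ins for the B1–B14 set reported in Table 3.

```python
from collections import defaultdict

# Illustrative fragments only; the actual analysis coded ~98,000 words
# of interview transcripts. Labels follow Table A1.
open_codes = {
    "The terms contain too much jargon": "a1 Highly Specialized Agreement Terms",
    "If I don't agree, I can't use the app": "a3 Consent Required to Use the App",
    "Larger apps are okay": "a24 Corporate Size",
}

# Axial coding: initial concepts -> initial categories -> core categories.
concept_to_category = {
    "a1 Highly Specialized Agreement Terms": "A1 Cognitive Cost",
    "a3 Consent Required to Use the App": "A3 Choice Constraint",
    "a24 Corporate Size": "A19 App Size",
}
category_to_core = {  # core labels are stand-ins for B1-B14 in Table 3
    "A1 Cognitive Cost": "Perceived Cost",
    "A3 Choice Constraint": "Perceived Coercion",
    "A19 App Size": "App Characteristics",
}

# Selective coding: relate every core category to the integrative core
# category identified in the study through a storyline.
INTEGRATIVE_CORE = "Habitual Consent to Privacy Pop-Ups"
storyline = defaultdict(list)
for fragment, concept in open_codes.items():
    core = category_to_core[concept_to_category[concept]]
    storyline[core].append(fragment)

for core, fragments in storyline.items():
    print(f"{core} -> {INTEGRATIVE_CORE}: {fragments}")
```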

4.1.2. Interview Participants

The interview participants, primarily aged 25–35 and representing diverse occupations, were selected for their high app usage and varied contexts. To enhance representativeness, participants from other age groups and backgrounds were also included. Given the potential use of specialized terminology, most interviewees held a bachelor’s degree or higher. Based on these criteria, the final sample comprised 19 participants, whose demographic characteristics are detailed in Table 2.

4.1.3. Interview Content and Procedure

Based on the preliminary analysis of privacy pop-ups, we developed an interview guide, which focused on exploring users’ decision-making and covered the following main topics: (1) users’ attitudes and behavioral tendencies toward privacy pop-up content when strong usage motivations are excluded; (2) users’ evaluations of the readability and credibility of privacy policy links within pop-ups, as well as their routine clicking behavior; (3) users’ perceptions of the privacy protection promises offered in the pop-ups; (4) users’ acceptance of various types of privacy permission requests; (5) users’ evaluations of information sharing, data processing methods, and privacy management channels; and (6) users’ perceptions and evaluations of secondary pop-ups appearing after rejecting initial privacy prompts.
The interviews were conducted online using a one-on-one format, guided by a flexible question-and-answer structure that allowed for adaptive probing based on participant responses. Each session lasted between 45 and 70 min. Prior to each interview, participants gave explicit consent for audio recording and screen capture, and were informed that all content would be anonymized and used solely for academic research. The order of questions was adjusted dynamically during the interviews to ensure natural conversation flow and allow deeper exploration of relevant issues. Upon completion, the recordings were transcribed verbatim, and all content unrelated to the research aims was excluded. Between December 2024 and February 2025, a total of 19 in-depth interviews were conducted, with a cumulative duration of approximately 800 min and yielding about 98,000 words of valid transcript data.

4.2. Data Analysis

4.2.1. Open Coding

Open coding, as the initial stage of data analysis in grounded theory, requires researchers to maintain an open and neutral stance, avoiding any preconceived theoretical frameworks or subjective biases. During this process, researchers repeatedly read and compare the original interview data, extracting key concepts line by line and performing preliminary coding to ensure that the analytical results accurately capture the core content of the interviews. Following this procedure, the researchers conducted careful comparisons and multiple rounds of synthesis, ultimately identifying 47 initial concepts (a1–a47) and 34 initial categories (A1–A34). The results of the open coding are presented in Table A1.

4.2.2. Axial Coding

Axial coding further compares and analyzes the results of open coding to identify and construct logical relationships among the initial categories, reorganizing them into more comprehensive core categories. Based on the 34 initial categories presented, an in-depth analysis of their nature and interrelations was conducted, resulting in the synthesis of 14 core categories (B1–B14). These core categories, along with their corresponding initial categories and category meanings, are detailed in Table 3. Moreover, to more clearly illustrate the hierarchical relationships among the core categories and their roles in explaining the informed consent mechanism, the 14 core categories were classified into seven classes—cognition (positive/negative), affect (positive/negative), conation, internal factors, and external factors—based on the Cognitive-Affective-Conative (CAC) framework.
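The resulting classification can be pictured as a simple mapping from core categories to the seven CAC-based classes. The sketch below is a schematic reconstruction assembled from the category names discussed in Sections 5.1 and 5.2, not a verbatim reproduction of Table 3.

```python
# Illustrative grouping of core categories into the seven CAC-based classes.
# The B1-B14 numbering and exact class membership appear in Table 3; treat
# the names below as a sketch derived from Sections 5.1-5.2.
CAC_CLASSES = {
    "positive cognition": ["Perceived Benefits", "Perceived Control",
                           "Perceived Legal Effectiveness"],
    "negative cognition": ["Perceived Cost", "Perceived Risk",
                           "Perceived Coercion"],
    "positive affect":    ["Trust"],
    "negative affect":    ["Distrust"],
    "conation":           ["Habitual Consent to Privacy Pop-Ups"],
    "internal factors":   ["Privacy Fatigue"],
    "external factors":   ["App Characteristics", "Data Attributes",
                           "Social Norms"],
}

# The full framework contains 14 core categories in total.
assert sum(len(v) for v in CAC_CLASSES.values()) <= 14
```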

4.2.3. Selective Coding

Selective coding, as an extension of axial coding, entails a refined comparative analysis of the identified core categories and their interrelationships to determine an integrative core category that conceptually unifies all others. This integrative core category is subsequently connected to the remaining categories through the construction of a “storyline,” thereby forming a coherent and systematic theoretical framework [24]. By thoroughly aligning the axial categories with the research objectives, the conative dimension category (“habitual consent to privacy pop-ups”) was identified as the integrative core category (see Table 4 for selective coding results). The developed storyline elucidates two critical pathways within the informed consent mechanism:
In the first pathway, trust serves as a critical mediator, positively linking cognitive appraisal with habitual behavior and also mediating the relationship between external factors and such behavior. Overall, trust helps alleviate the challenges associated with informed consent in the context of privacy pop-ups. However, in the second pathway, although privacy pop-ups elicit negative cognition and subsequent distrust, users do not adopt cautious decision-making regarding privacy permissions. Instead, they continue to exhibit habitual consent behaviors. Privacy fatigue provides an explanation for this paradox, as its cumulative effect further diminishes users’ decision-making capacity, thereby exacerbating the informed consent dilemma.

4.2.4. Theoretical Saturation Test

Theoretical saturation refers to the point at which additional data no longer yield new categories, relationships, or theoretical insights. We tested for saturation using the two methods proposed by Morse [25]. First, we conducted follow-up inquiries with seven participants, confirming that the research framework was consistent with their experiences and expectations. Additionally, we performed theoretical sampling on three reserved interview transcripts from the coding phase, which did not generate any new categories or relationships. These results indicate that the current model was sufficiently developed and had reached theoretical saturation.
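Procedurally, the second check amounts to coding the reserved transcripts and asking whether anything falls outside the established category set. Below is a minimal sketch of that logic, with a hypothetical coder stub standing in for manual coding and an abbreviated category set:

```python
def saturated(known_categories: set, reserved_transcripts: list,
              code_transcript) -> bool:
    """Theoretical sampling check: code each reserved transcript and report
    whether any category outside the existing framework emerges."""
    for transcript in reserved_transcripts:
        new = code_transcript(transcript) - known_categories
        if new:
            print("New categories found:", new)
            return False
    return True

# Illustrative use: three reserved interviews, as in the study, coded
# against an abbreviated set of established core categories.
known = {"Trust", "Privacy Fatigue", "Perceived Cost"}   # abbreviated
reserved = ["transcript_17", "transcript_18", "transcript_19"]
stub_coder = lambda t: {"Trust", "Perceived Cost"}       # stand-in for manual coding
print(saturated(known, reserved, stub_coder))            # True -> saturated
```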

5. Explanation of the Theoretical Model

The CAC framework conceptualizes decision-making as a three-stage process: cognition refers to individuals’ evaluations and interpretations of external stimuli; affect denotes the emotional responses triggered by these cognitive appraisals; and conation captures the behavioral inclinations or intentions that emerge from the interaction between cognition and affect [26]. Prior studies further suggest that both cognitive and affective components can be divided into positive and negative dimensions [27]. Building on this theoretical lens, the current study integrates internal and external influencing factors into the CAC structure, organizing 14 core categories into seven conceptual clusters. This categorization underpins a dual-pathway model that explains user consent behavior in the context of privacy pop-ups by delineating two opposing processes—alleviation and exacerbation—of the informed consent dilemma, as elaborated in the following sections (see Figure 1).

5.1. Alleviating the Informed Consent Dilemma

5.1.1. Positive Cognition and Consent to Privacy Pop-Ups

When users encounter privacy pop-ups that present clear and reasonable permission requests, they tend to develop positive cognitive evaluations. A key component of such positive cognition is perceived benefits, which in privacy contexts differ from traditional material incentives and manifest primarily as perceived convenience, perceived personalization, and perceived usefulness.
With the advancement of intelligent services, users increasingly expect customized experiences tailored to their needs. Perceived benefits of personalization often motivate users to disclose personal data [28]. In our interviews, a participant passionate about cooking reported that personalized recommendations based on his browsing history on video platforms significantly enhanced his experience by matching his content preferences. Others mentioned that e-commerce recommendations not only met their needs but also reduced time spent on search and decision-making, enhancing convenience. Moreover, when users understand that disclosing certain information is essential for core app functionality, nearly all interviewees deemed such requests reasonable. A frequently cited example was: “If a food delivery app lacks location access, it cannot fulfill orders.”
Another key aspect of positive cognition is perceived control, as reflected in participant remarks such as, “The app provides management explanations, which at least makes me feel I can manage my personal information—it gives me the sense that my privacy belongs to me.” Users value a sense of control over their personal data. Even minimal management options in privacy pop-ups elicited positive responses and enhanced consent willingness. Clear and explicit app communications further strengthened users’ perception of being informed. For instance, many participants favored contextual consent, which clarified how their data would be used.
Perceived legal effectiveness refers to individuals’ general perceptions of the authority and efficacy of government and legal frameworks, functioning as an agency mechanism whereby users entrust the outcomes of their privacy decisions to the regulatory power of privacy-related laws [29,30]. Common pop-up statements such as “in accordance with relevant legal requirements” significantly influence user decisions. For Chinese users, privacy laws are viewed as highly authoritative and a critical means of protecting personal rights (e.g., “If privacy issues arise, I can rely on legal provisions to pursue legal action”). Higher perceived legal effectiveness is associated with stronger consent intentions and may lead to faster decision-making.

5.1.2. External Factors and Consent to Privacy Pop-Ups

External factors such as social norms, platform characteristics, and data attributes may directly influence users’ decisions to consent to privacy pop-ups. For instance, one participant noted, “If the app is recommended by friends, I will most likely agree directly,” indicating that observing others’ behaviors shapes individual privacy decisions. When individuals see that members of groups they identify with consistently consent to privacy pop-ups, this encourages their own positive decision to consent. This phenomenon may also reflect that, in the current privacy context, users perceive privacy risks as a collective issue, leading to an “everyone agrees, so I agree too” decision logic.
Most interviewees expressed trust in large platforms (e.g., “at least big platforms are more legitimate”), which in turn increased their likelihood of consenting to privacy pop-ups on these platforms. In contrast, smaller platforms face more critical user scrutiny due to privacy concerns, which can reduce users’ willingness to consent and hinder platform growth. Additionally, for newly downloaded apps, the number of downloads serves as an important cue; a high download count increases users’ readiness to consent without extensive review. Therefore, salient app characteristics, such as size, reputation, and popularity, directly affect users’ consent decisions.
Differences in data relevance and sensitivity also directly affect privacy pop-up consent. For example, one interviewee noted, “It’s strange that a gaming app would ask for my contact information,” while granting camera access for a homework-scanning app is usually perceived as reasonable. Users also vary in their perceptions of information sensitivity; for instance, email addresses and account passwords are considered to represent different levels of sensitivity.

5.1.3. Trust as a Mediator of Privacy Pop-Up Consent

Positive cognition not only directly influences users’ consent to privacy pop-ups but also exerts an indirect effect through trust as a mediating variable. Trust arises from users’ favorable cognitive evaluations of the content presented in privacy pop-ups and is typically conceptualized along three dimensions: platform trust, legal trust, and trust in authorization mechanisms. Platform trust refers to users’ positive assessment of a platform’s ability to safeguard personal privacy; legal trust reflects users’ recognition of the legal validity and enforceability of privacy policies; and trust in authorization mechanisms primarily concerns contextual consent, which enhances users’ perceived control and right to be informed, thereby fostering trust. Compared to blanket consent, contextual consent provides users with greater transparency and autonomy in privacy decision-making, eliciting positive emotional responses that manifest as trust.
Beyond individual cognitive appraisals, external factors such as app characteristics, data attributes, and social norms also play a critical role in shaping user trust and thereby indirectly affect consent behavior. For example, beliefs such as “Large companies are more capable of handling my privacy properly” or “At least large companies are more regulated” suggest that corporate reputation can reduce privacy concerns and enhance users’ inclination to trust. Similarly, data requests that align with app functionality (e.g., a scanner app requesting camera access) are perceived as more reasonable and trustworthy. In contrast, irrelevant or excessively sensitive data requests can heighten user vigilance and reduce trust. Additionally, observing widespread acceptance of privacy pop-ups for a particular app may lead users to infer the safety and legitimacy of these practices, thereby increasing their willingness to consent.
It is important to note that the essence of the informed consent dilemma lies in whether users endorse their own consent behavior. Such endorsement may stem from positive perceptions of the privacy pop-up content or from favorable external cues; both can serve as valid reasons for habitual consent, reducing psychological resistance. In this process, trust functions as a critical mediating mechanism that bridges cognition, external context, and user behavior. In summary, fostering trust can play a pivotal role in mitigating the current challenges associated with informed consent in digital environments.

5.2. Exacerbating the Informed Consent Dilemma

5.2.1. The Direct Impact of Negative Cognition on Distrust

Poorly designed privacy pop-ups exacerbate users’ negative experiences by reinforcing adverse cognitive responses and fostering distrust. In this context, perceived cost refers to users’ subjective evaluation of the time and cognitive effort required to comprehend privacy-related information. Many interviewees expressed frustration with hyperlinked privacy policies, criticizing their excessive length and the use of technical jargon. For example, one noted, “The privacy content is too complex and full of technical terms—I can’t understand it,” which often led to further concerns: “With so much text, there could be hidden clauses that harm my interests without my knowledge.”
The privacy protection commitments made by apps have failed to effectively alleviate these concerns. The perceived risks primarily originate from the platforms themselves, particularly the intrusive nature of the technologies employed. Several interviewees reported feeling highly discomforted when their chat content was converted into corresponding recommendations. Moreover, some participants believed that various platforms still suffer from security vulnerabilities that cannot be entirely eliminated. Additionally, although privacy pop-ups notify users that the app may share their data with third parties for certain purposes, the absence of clear and comprehensive disclosure regarding the purposes and the identities of these third parties has further intensified users’ distrust.
Perceived coercion reflects users’ recognition of the imbalanced power relationship between themselves and mobile applications. Interview participants frequently reported feeling subordinate to platforms, often describing privacy pop-ups as “negotiations where the platform holds most of the bargaining chips.” This perceived power asymmetry is primarily driven by two factors: a constrained choice architecture typified by the “accept or exit” dilemma, which may foster an illusion of user autonomy, and the repeated display of privacy prompts even after explicit rejections. Such coercive design patterns systematically amplify users’ distrust toward data collection motives, fostering both attitudinal skepticism and behavioral resistance. While these negative perceptions may not immediately threaten platforms’ commercial performance due to their dominant market position, they can gradually erode user trust, posing long-term risks to organizational legitimacy and sustainable development.
Finally, apps often request relevant permissions under the pretext of providing personalized services; however, interviewees exhibited two contrasting attitudes toward this justification. Some participants believed it helped fulfill their personalized needs, while others questioned its legitimacy. For instance, many respondents noted that “recommendations often arrive only after I have already made a purchase,” highlighting the awkward delay in intelligent recommendations. Furthermore, users expressed dissatisfaction with the homogeneous content recommended on short-video platforms, which often foster “information cocoons.” This growing aversion to algorithmic recommendations may ultimately erode trust in platforms’ permission requests that rely on such justifications.

5.2.2. The Paradox of Consent in Privacy Pop-Ups

Users’ negative cognitions frequently give rise to distrust toward the app. Notably, although users generally express skepticism toward apps’ privacy protection commitments, data management capabilities, and information-sharing practices, such distrust does not lead to more cautious decision-making when confronted with privacy pop-ups. Nearly all respondents reported habitually clicking “agree” whenever such pop-ups appeared. This tendency to continue granting consent despite negative affect reflects a typical manifestation of the paradox of consent. Acquisti and Gross [31] found through their survey of student Facebook users that even individuals with strong privacy awareness tend to disclose a substantial amount of personal information in practice. This finding suggests that negative affect does not always lead to adverse privacy behaviors. Hsu et al. [32] also indicate that, under certain conditions, the relationship between privacy concerns and information sharing may exhibit a U-shaped nonlinear pattern.
“I know my privacy will definitely be compromised, but in the end, I still have to click agree.” This statement illustrates that even when users are fully aware of potential privacy risks, their decision-making behavior often reflects an automated and habitual tendency to consent. Similarly, O’Maonaigh and Saxena [33] found that although smart speaker users recognize the privacy risks associated with their use, this awareness does not significantly influence their continued engagement with the product. Likewise, Dienlin and Trepte [34] observed that even after experiencing privacy violations, users’ privacy-related behaviors do not necessarily undergo meaningful change.
Integrating prior research with findings from the present interviews, we propose that the relationship between users’ negative cognitions, distrust, and their consent behavior in response to privacy pop-ups is not a simple linear negative correlation; rather, it may reflect a more complex nonlinear pattern. Specifically, negative cognitions and distrust may, to some extent, suppress users’ willingness to consent; however, under certain conditions, this willingness may rise again, resulting in a paradox of consent.
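To make the conjectured nonlinearity concrete, the toy computation below traces consent willingness as a quadratic function of distrust, falling at first and recovering at high distrust levels, where fatigue-driven resignation plausibly takes over. The functional form and coefficients are invented for illustration and are not estimated from our data.

```python
import numpy as np

# Toy illustration of a U-shaped consent pattern: willingness to consent
# first falls as distrust grows, then recovers at high distrust levels.
# The coefficients below are invented purely for illustration.
distrust = np.linspace(0, 1, 5)
consent_willingness = 0.9 - 1.6 * distrust + 1.4 * distrust**2

for d, c in zip(distrust, consent_willingness):
    print(f"distrust={d:.2f} -> consent_willingness={c:.2f}")
# Output falls from 0.90 to a minimum near distrust ~0.57, then rises to 0.70.
```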

5.2.3. Privacy Fatigue as a Moderator of Privacy Pop-Up Consent

Cynicism and emotional exhaustion together constitute privacy fatigue. The former denotes a defeatist belief in the inevitability of privacy violations, while the latter reflects the cognitive depletion resulting from continuous engagement in privacy management tasks [35]. Empirical evidence suggests that emotional exhaustion significantly weakens the positive relationship between privacy concerns and protective behavioral intentions, while cynicism may even reverse this relationship [36]. We argue that privacy fatigue also moderates the paradoxical relationship proposed in this study.
Specifically, when experiencing high levels of privacy fatigue, users exhibit mechanized and resigned behaviors when consenting to privacy notices. Under the influence of cynicism, users perceive privacy risks as ubiquitous and feel that “My resistance is futile,” thereby neutralizing the inhibitory effect of distrust on consent. Meanwhile, emotional exhaustion drives heuristic processing, leading to habitual agreement as a cognitive shortcut (“I’m too tired to care about privacy risks”). This dual mechanism elucidates the phenomenon of fatalistic consent, where negative privacy cognitions and affect paradoxically correlate with higher consent rates, effectively manifesting as behavioral “privacy surrender.” As one respondent lamented, “What can I do if I’m not satisfied? I can’t change anything, so whatever the platform says, that’s how it is.” Conversely, when privacy fatigue is low, users are more cognitively and emotionally capable of acting on their privacy concerns or distrust, thereby reducing consent likelihood and mitigating the privacy paradox.
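For the quantitative follow-up this account invites, the moderation claim could be written as a standard interaction specification. The equation below is one possible operationalization, not a model estimated in this study:

```latex
\mathrm{Consent}_i = \beta_0 + \beta_1\,\mathrm{Distrust}_i + \beta_2\,\mathrm{Fatigue}_i + \beta_3\,(\mathrm{Distrust}_i \times \mathrm{Fatigue}_i) + \varepsilon_i
```

Under the dual-path account, one would expect β1 < 0 (distrust suppresses consent) and β3 > 0 (high fatigue offsets that suppression), with fatalistic consent corresponding to fatigue levels at which the interaction term dominates.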
Privacy fatigue may lead users to agree to privacy pop-ups in ways that, while superficially analogous to trust-based consent, stem from distinct psychological mechanisms. Such behaviors often conflict with users’ privacy-related cognitions and affective states, resulting in a dissonance between their genuine willingness and manifested actions. This discrepancy compromises the autonomy and validity of user consent, further undermining the legitimacy and operational efficacy of existing informed consent frameworks.

6. Conclusions

This study sheds light on the informed consent dilemma in mobile app privacy pop-ups by combining compliance analysis and grounded theory. The following sections discuss the theoretical and practical implications of the findings.

6.1. Theoretical Implications

First, this study advances theoretical understanding by shifting the focus from traditional privacy policy text analysis, which has largely concentrated on static and document-based representations of privacy practices, toward privacy pop-ups as the primary interactive medium influencing users’ informed consent behavior. This shift better reflects real user contexts, thereby enhancing the practical relevance and explanatory power of theoretical constructs. Moreover, in the preliminary phase of this study, we developed a set of compliance assessment indicators for privacy pop-ups and conducted an initial synthesis, providing a useful reference point for theoretically informed analyses of how compliance cues embedded in interface design influence user perceptions and behaviors.
Second, this study not only employs textual analysis but also integrates user interview data alongside a compliance assessment framework for privacy pop-ups, offering exploratory empirical evidence to illuminate the informed consent dilemma at the intersection of legal scholarship and information behavior research. This multifaceted approach helps address a notable gap in existing studies, which have largely focused on compliance assessment while paying limited attention to the underlying mechanisms driving user consent behaviors.
Finally, the proposed dual-path model elucidates how users’ cognitive disparities, affective responses, and subsequent behavioral outcomes unfold when encountering privacy pop-ups. Notably, the paradoxical relationship observed among negative cognitions, affect, and behavioral intention in the exacerbation path provides new insights into the privacy paradox. Furthermore, interview data reveal moderating effects of internal factors such as privacy fatigue, which delineate the boundary conditions of the dual-path model and suggest promising directions for future research.

6.2. Practical and Managerial Implications

Although relevant authorities and industry standards have imposed regulations on platforms’ data collection and processing practices, thereby enhancing transparency to some extent, significant gaps remain in fully safeguarding users’ right to be informed. Consequently, the role and regulatory positioning of privacy pop-ups require clear legal definition, while their content and format standards should be progressively standardized. Such standardization can serve as an important supplementary mechanism for ensuring app compliance with privacy laws.
From an app developer’s perspective, while ambiguous privacy designs may not pose immediate risks to short-term interests, the gradual accumulation of negative sentiments such as distrust can ultimately undermine long-term platform growth. This underscores the critical need for large platforms to proactively engage in addressing these challenges to ensure sustainable development. For smaller apps operating in highly competitive markets, implementing robust privacy protections is not only a matter of regulatory compliance but also a strategic avenue for cultivating user trust and building a positive reputation.

6.3. Research Limitations and Future Directions

Although this study has reached some interesting conclusions, there are still several limitations. First, the preliminary study’s limited app selection may have constrained the comprehensiveness of the analysis, underscoring the need to broaden the range of apps examined and apply large-scale textual analyses of privacy pop-up content. Additionally, the interview sample predominantly consisted of young and well-educated individuals, which may limit the generalizability of findings to older or less-educated populations. Due to their lower digital literacy, these groups might face greater cognitive barriers in processing privacy information or exhibit different privacy decision-making behaviors. Expanding the participant diversity would enhance the external validity and applicability of the proposed theoretical framework.
Second, at the beginning of the interviews, participants were instructed to set aside their strong usage motivations when evaluating privacy pop-ups and to focus instead on the content of the pop-ups themselves. However, in real-world settings, powerful usage needs may lead users to overlook all associated risks. The dual-path framework proposed in this study thus serves as a supplementary perspective beyond demand-driven influences. Future research could consider incorporating the influence of demand-driven factors and conducting related exploratory studies.
Another limitation lies in the unclear interactions between the two explanatory pathways and the extent of their respective contributions. When both pathways lead to the same behavioral outcome (i.e., consenting to privacy pop-ups), future research should quantitatively assess the relative dominance of trust versus privacy fatigue, supported by robust empirical evidence.
Lastly, while grounded theory serves as a valuable qualitative approach for theoretical construction, it cannot quantify the strength of relationships between variables. Future studies could employ mixed-methods designs, combining qualitative insights with quantitative measurements to provide more comprehensive evidence. Specifically, large-scale experimental studies or longitudinal data analyses would help validate and extend the current findings.

Author Contributions

Conceptualization, M.C. (Ming Chen) and M.C. (Meimei Chen); methodology, M.C. (Ming Chen); software, M.C. (Ming Chen); validation, M.C. (Ming Chen); formal analysis, M.C. (Meimei Chen); investigation, M.C. (Meimei Chen); resources, M.C. (Meimei Chen); data curation, M.C. (Ming Chen); writing—original draft preparation, M.C. (Ming Chen); writing—review and editing, M.C. (Meimei Chen); visualization, M.C. (Ming Chen); supervision, M.C. (Meimei Chen); project administration, M.C. (Meimei Chen); funding acquisition, M.C. (Meimei Chen). All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Social Science Foundation of China under Grant number 20BGL284 “Research on the neural mechanism of user experiences for innovation of intelligent recommendation services”.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by Donghua University (protocol code SRSY202406150027, 28 June 2024).

Informed Consent Statement

Informed consent was obtained from all subjects involved in this study.

Data Availability Statement

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

Table A1. Results of Open Coding.

Initial Category | Initial Concept | Representative Original Statements (Excerpt)
A1 Cognitive Cost | a1 Highly Specialized Agreement Terms | The terms contain too much jargon incomprehensible to laypersons.
A2 Time Cost | a2 Excessively Lengthy Agreement | I feel like I need to scroll for a long time to get through the agreement.
A3 Choice Constraint | a3 Consent Required to Use the App | If I don’t agree, I can’t use the app.
A4 Power Asymmetry | a4 Platform Dominance | The privacy pop-up feels like a negotiation, and the app holds all the chips.
A5 Second Pop-up Pressure | a5 Repeated Pop-up Harassment | I’ve already explicitly rejected it, yet it asks again—it’s basically like spam calls.
A6 Third-party Platform Risk | a6 Opaque Data Handling; a7 Risks of Data Leakage | Third-party data usage remains undisclosed; cross-platform data leaks occur frequently.
A7 Platform-Inherent Risks | a8 Technological Intrusiveness; a9 Security Vulnerabilities | Big data technologies are scary—the app seems to know what I’m thinking; the platform may still have loopholes that people exploit illegally.
A8 Privacy Breach Experience | a10 Personal Information Leakage | After using my phone number to consult about teacher qualification exams, I was harassed by various sales calls.
A9 Perceived Usefulness | a11 Basic Function Realization | Without location, a food delivery app can’t deliver food.
A10 Perceived Convenience | a12 Time-saving | The app recommends products to me based on my information, so I don’t have to spend time searching, which is very convenient.
A11 Perceived Personalization | a13 Personalized Recommendations | The app’s recommendation of cooking videos aligns with my culinary interests, providing both engagement and educational value.
A12 Reliability | a14 Reliability of Law | I feel that the law might be more reliable than technical protection statements.
A13 Constraints | a15 Legal Constraints on App Behavior | The law at least constrains what the app can do, restricting its actions within legal boundaries.
A14 Basis for Rights Defense | a16 Legal Basis for Rights Protection | If a privacy issue occurs, I can rely on the law to demand compensation for my losses from the app.
A15 Right to Be Informed | a17 Data Collection Transparency | Scenario-based authorization lets me know what kinds of personal data will be collected.
A16 Autonomous Choice Right | a18 Empowerment of Choice; a19 Sense of Autonomy | Contextual authorization empowers users to decide whether to consent; providing clear explanations of privacy procedures enhances my sense of autonomy, reducing concerns about privacy leakage even if I choose not to use the service.
A17 Cynicism | a20 Helplessness over Routine Data Breaches; a21 Futility of Effort | “Privacy erosion seems inevitable”; “Individual resistance proves futile.”
A18 Emotional Exhaustion | a22 Cognitive Resource Depletion; a23 Cumulative Negative Affect | The dense and complex terms are cognitively exhausting; reviewing these privacy agreements induces frustration, making me unwilling to invest time in reading them.
A19 App Size | a24 Corporate Size | Larger apps are okay because they won’t deceive you.
A20 App Download Volume | a25 App Download Volume | If an app has very few downloads, I don’t really trust its privacy pop-up content.
A21 Data Sensitivity | a26 Low-sensitive Data; a27 High-sensitive Data | If it only collects my social media accounts or email, I don’t really mind. However, when it involves my passwords or financial information, I become highly cautious.
A22 Perceived Relevance | a28 Low-relevance Permissions; a29 High-relevance Permissions | Playing a game shouldn’t require my contacts—that’s weird; but camera permissions for a scanning app make sense, or it can’t work.
A23 Recommendation Latency | a30 Delayed Recommendations | I think the recommendation system isn’t smart enough—by the time it suggests things, I’ve already bought them.
A24 Homogeneous Recommendations | a31 Repetitive Content | It keeps recommending the same type of things, making my view narrow.
A25 Peer Influence Reference | a32 Herd Behavior | I tend to install apps recommended by friends without verifying their security.
A26 Socialization of Risk | a33 Collective Risk Bearing | When risks materialize, the consequences are collectively borne by all parties.
A27 Platform Trust | a34 Platform’s Privacy Guarantees | At least apps downloaded from XX’s official store won’t have viruses, so privacy feels relatively safe.
A28 Legal Trust | a35 Legal Guarantees for Privacy | If an app claims to follow the law to protect privacy, it increases my trust.
A29 Trust in Authorization Method | a36 Preference for Scenario-based Authorization | It’s better to request camera permission when taking a photo than to force consent upon installation.
A30 Vague Protection Commitments | a37 Undisclosed Technical Standards; a38 Unclear Protection Measures | I don’t understand how the so-called advanced protection technology works; privacy protection is just a verbal commitment without substance, and I don’t know how it actually protects my privacy.
A31 Unclear Management Path | a39 Opaque Operational Guidance | The app provides a privacy management pathway, but I’m not clear on how to operate it.
A32 Opaque Information Sharing | a40 Hidden Third-party Information; a41 Unclear Sharing Purpose | I’m not clear which third-party platforms my information is shared with; the privacy policy does not clearly state the specific purpose of information sharing, which is unsettling.
A33 Habitual Consent (Privacy Fatigue-driven) | a42 Reluctant Acceptance; a43 Reducing Information Overload; a44 Privacy Resignation | As individual users in front of platforms, we can only agree and have no other choice; I can’t possibly study every detail, so I just agree directly. I feel there’s no such thing as privacy nowadays, so I don’t care whether I agree or not.
A34 Habitual Consent (Trust-driven) | a45 Authorizing Familiar Platforms; a46 Authorizing Large Platforms; a47 Consenting Due to Trust in Legal Safeguards | I directly agree for big, familiar companies—they won’t misuse data; I feel their products are legit, so I just agree; I trust the law, so if it says it complies, I’m fine with it.

References

  1. Hurd, H.M. The normative force of consent. In The Routledge Handbook of the Ethics of Consent; Routledge: London, UK, 2018; pp. 44–54. [Google Scholar]
  2. Marotta-Wurgler, F. Self-regulation and competition in privacy policies. J. Leg. Stud. 2016, 45 (Suppl. S2), S13–S39. [Google Scholar] [CrossRef]
  3. Bechmann, A. Non-informed consent cultures: Privacy policies and app contracts on Facebook. J. Media Bus. Stud. 2014, 11, 21–38. [Google Scholar] [CrossRef]
  4. Jensen, C.; Potts, C. Privacy policies as decision-making tools: An evaluation of online privacy notices. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Vienna, Austria, 24–29 April 2004; pp. 471–478. [Google Scholar]
  5. Milne, G.R.; Culnan, M.J.; Greene, H. A longitudinal assessment of online privacy notice readability. J. Public Policy Mark. 2006, 25, 238–249. [Google Scholar] [CrossRef]
  6. Slavin, R.; Wang, X.; Hosseini, M.B.; Hester, J.; Krishnan, R.; Bhatia, J.; Breaux, T.D.; Niu, J. Toward a framework for detecting privacy policy violations in android application code. In Proceedings of the 38th International Conference on Software Engineering, Austin, TX, USA, 14–22 May 2016; pp. 25–36. [Google Scholar]
  7. Reidenberg, J.R.; Breaux, T.; Cranor, L.F.; French, B.; Grannis, A.; Graves, J.T.; Liu, F.; McDonald, A.; Norton, T.B.; Ramanath, R. Disagreeable privacy policies: Mismatches between meaning and users’ understanding. Berkeley Technol. Law J. 2015, 30, 39. [Google Scholar] [CrossRef]
  8. West, S.M. Data capitalism: Redefining the logics of surveillance and privacy. Bus. Soc. 2019, 58, 20–41. [Google Scholar] [CrossRef]
  9. Tymchuk, A.J. Informing for consent: Concepts and methods. Can. Psychol./Psychol. Can. 1997, 38, 55. [Google Scholar] [CrossRef]
  10. Faden, R.R.; Beauchamp, T.L. A History and Theory of Informed Consent; Oxford University Press: Oxford, UK, 1986. [Google Scholar]
  11. Mikalef, P.; Boura, M.; Lekakos, G.; Krogstie, J. Big data analytics capabilities and innovation: The mediating role of dynamic capabilities and moderating effect of the environment. Br. J. Manag. 2019, 30, 272–298. [Google Scholar] [CrossRef]
  12. Simon, H.A. A behavioral model of rational choice. Q. J. Econ. 1955, 69, 99–118. [Google Scholar] [CrossRef]
  13. Böhme, R.; Köpsell, S. Trained to accept? A field experiment on consent dialogs. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Atlanta, GE, USA, 10–15 April 2010; pp. 2403–2406. [Google Scholar]
  14. Gray, C.M.; Kou, Y.; Battles, B.; Hoggatt, J.; Toombs, A.L. The dark (patterns) side of UX design. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 21–26 April 2018; pp. 1–14. [Google Scholar]
  15. Schermer, B.W.; Custers, B.; Van der Hof, S. The crisis of consent: How stronger legal protection may lead to weaker consent in data protection. Ethics Inf. Technol. 2014, 16, 171–182. [Google Scholar]
  16. Solove, D. Privacy self-management and the consent dilemma. Harv. Law Rev. 2013, 126, 1880–1903. [Google Scholar]
  17. Van den Berg, B.; Van der Hof, S. What happens to my data? A novel approach to informing users of data processing practices. First Monday 2012, 17. [Google Scholar] [CrossRef]
  18. McDonald, A.M.; Lowenthal, T. Nano-notice: Privacy disclosure at a mobile scale. J. Inf. Policy 2013, 3, 331–354. [Google Scholar] [CrossRef]
  19. Baarslag, T.; Alan, A.T.; Gomer, R.C.; Liccardi, I.; Marreiros, H.; Gerding, E.H.; Schraefel, M. Negotiation as an interaction mechanism for deciding app permissions. In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, San Jose, CA, USA, 7–12 May 2016; pp. 2012–2019. [Google Scholar]
  20. Nissenbaum, H. A contextual approach to privacy online. Daedalus 2011, 140, 32–48. [Google Scholar] [CrossRef]
  21. GB/T35273-2020; Information Security Technology—Personal Information Security Specification. Standardization Administration of China: Beijing, China, 2020.
  22. Zaeem, R.N.; Barber, K.S. The effect of the GDPR on privacy policies: Recent progress and future promise. ACM Trans. Manag. Inf. Syst. (TMIS) 2020, 12, 1–20. [Google Scholar] [CrossRef]
  23. Glaser, B.G.; Strauss, A.L.; Strutzel, E. The discovery of grounded theory: Strategies for qualitative research. Nurs. Res. 1968, 17, 364. [Google Scholar] [CrossRef]
  24. Pandit, N.R. The creation of theory: A recent application of the grounded theory method. Qual. Rep. 1996, 2, 1–15. [Google Scholar] [CrossRef]
  25. Morse, J.M. Critical analysis of strategies for determining rigor in qualitative inquiry. Qual. Health Res. 2015, 25, 1212–1222. [Google Scholar] [CrossRef]
  26. Fishbein, M.; Ajzen, I. Belief, Attitude, Intention, and Behavior: An Introduction to Theory and Research. Philos. Rhetor. 1977, 10, 130–132. [Google Scholar]
  27. Chih, W.-H.; Liou, D.-K.; Hsu, L.-C. From positive and negative cognition perspectives to explore e-shoppers’ real purchase behavior: An application of tricomponent attitude model. Inf. Syst. E-Bus. Manag. 2015, 13, 495–526. [Google Scholar] [CrossRef]
  28. Zeng, F.; Ye, Q.; Li, J.; Yang, Z. Does self-disclosure matter? A dynamic two-stage perspective for the personalization-privacy paradox. J. Bus. Res. 2021, 124, 667–675. [Google Scholar] [CrossRef]
  29. Xu, H.; Teo, H.-H.; Tan, B.C.Y.; Agarwal, R. Effects of individual self-protection, industry self-regulation, and government regulation on privacy concerns: A study of location-based services. Inf. Syst. Res. 2012, 23, 1342–1363. [Google Scholar]
  30. Gong, X.; Zhang, K.Z.; Chen, C.; Cheung, C.M.; Lee, M.K. What drives self-disclosure in mobile payment applications? The effect of privacy assurance approaches, network externality, and technology complementarity. Inf. Technol. People 2020, 33, 1174–1213. [Google Scholar] [CrossRef]
  31. Acquisti, A.; Gross, R. Imagined communities: Awareness, information sharing, and privacy on the Facebook. In International Workshop on Privacy Enhancing Technologies; Springer: Berlin/Heidelberg, Germany, 2006; pp. 36–58. [Google Scholar]
  32. Hsu, C.-L.; Liao, Y.-C.; Lee, C.-W.; Chan, L.K. Privacy concerns and information sharing: The perspective of the u-shaped curve. Front. Psychol. 2022, 13, 771278. [Google Scholar] [CrossRef]
  33. O’Maonaigh, C.; Saxena, D. Investigating personalisation-privacy paradox among young Irish consumers: A case of smart speakers. arXiv 2021, arXiv:2108.09945. [Google Scholar]
  34. Dienlin, T.; Trepte, S. Is the privacy paradox a relic of the past? An in-depth analysis of privacy attitudes and privacy behaviors. Eur. J. Soc. Psychol. 2015, 45, 285–297. [Google Scholar] [CrossRef]
  35. Choi, H.; Park, J.; Jung, Y. The role of privacy fatigue in online privacy behavior. Comput. Hum. Behav. 2018, 81, 42–51. [Google Scholar] [CrossRef]
  36. Tian, X.; Chen, L.; Zhang, X. The role of privacy fatigue in privacy paradox: A psm and heterogeneity analysis. Appl. Sci. 2022, 12, 9702. [Google Scholar] [CrossRef]
Figure 1. The Dual-Path of Informed Consent in Privacy Pop-Ups.
Table 1. Results of Compliance Review.

| Assessment Point | Assessment Indicator | CC |
|---|---|---|
| 1. Public Disclosure of Personal Information Collection Rules | Clearly prompts users to read privacy policies and related documents | 21 |
| | Description of information processor details | 7 |
| 2. Explicit Declaration of Purpose, Method, and Scope of Data Collection | Whether notification is provided when changes occur | 0 |
| | Explains the purpose of requesting permissions/sensitive information | 8 |
| 3. User Consent Prior to Data Collection | No collection of personal information prior to obtaining user consent | 21 |
| | Whether non-explicit methods (e.g., default consent) are used to obtain user consent | 0 |
| 4. Data Minimization Principle | Avoids collecting non-service-related personal information | 7 |
| 5. User Consent for Third-Party Data Sharing | Whether there is an explanation of information sharing | 8 |
| | Whether an explanation is provided before obtaining consent to share with third parties | 8 |
| 6. User Privacy Rights Management (Deletion/Correction/Complaints) | Whether valid account deletion instructions are provided | 8 |
| | Provision of effective ways to correct or delete personal information | 8 |
| | Whether complaint/report channels are established and publicly accessible | 2 |
| 7. Technical Safeguards and Data Storage | Whether a technical statement on privacy protection is provided | 6 |
| | Whether explanations related to information storage are provided | 0 |

Note: CC = Compliance Count (out of the 21 audited apps). All assessment points were evaluated against the CSPG, the PIPL, GB/T35273-2020, and [22].
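For readers who want to reproduce the tally behind the CC column, the sketch below shows one way to derive it from per-app boolean checklists. This is a minimal illustration under stated assumptions: the indicator keys, the `audits` records, and the `compliance_counts` helper are hypothetical names invented for this example, and the toy data do not correspond to the 21 audited apps.

```python
# Hypothetical sketch of the CC tally in Table 1. Each audited app is
# represented as a boolean checklist over assessment indicators; CC for an
# indicator is the number of apps whose checklist marks it True.
from collections import Counter

# Illustrative indicator keys (abridged; not the study's instrument).
INDICATORS = [
    "prompts_users_to_read_privacy_policy",   # Assessment Point 1
    "describes_information_processor",        # Assessment Point 1
    "notifies_on_changes",                    # Assessment Point 2
    "no_collection_before_consent",           # Assessment Point 3
]

def compliance_counts(audits: list[dict[str, bool]]) -> Counter:
    """Tally, per indicator, how many audited apps satisfy it."""
    counts: Counter = Counter()
    for checklist in audits:
        for indicator, satisfied in checklist.items():
            if satisfied:
                counts[indicator] += 1
    return counts

# Toy data for three apps (values invented purely for illustration).
audits = [
    {ind: True for ind in INDICATORS},
    {**{ind: False for ind in INDICATORS}, "prompts_users_to_read_privacy_policy": True},
    {**{ind: True for ind in INDICATORS}, "notifies_on_changes": False},
]

for indicator, cc in compliance_counts(audits).items():
    print(f"{indicator}: CC = {cc}")
```

Run over the full audit records, the same tally would reproduce the CC column directly.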
Table 2. Demographic Characteristics of Participants.

| ID | Gender | Age | Occupation | Education |
|---|---|---|---|---|
| N1 | Male | 29 | University Student | PhD |
| N2 | Female | 24 | Freelancer | Bachelor |
| N3 | Male | 63 | University Lecturer | PhD |
| N4 | Male | 31 | Freelancer | High School |
| N5 | Female | 29 | Lawyer | Master |
| N6 | Male | 30 | Auxiliary Police | Bachelor |
| N7 | Female | 29 | Government Staff | Bachelor |
| N8 | Female | 29 | University Lecturer | Master |
| N9 | Male | 31 | Primary School Teacher | Bachelor |
| N10 | Female | 30 | Bank Staff | Bachelor |
| N11 | Male | 30 | Company Employee | Bachelor |
| N12 | Female | 22 | University Student | Master |
| N13 | Male | 34 | Company Employee | Bachelor |
| N14 | Female | 41 | University Lecturer | Master |
| N15 | Male | 20 | University Student | Bachelor |
| N16 | Male | 29 | Researcher | PhD |
| N17 | Male | 37 | Company Employee | Bachelor |
| N18 | Female | 28 | University Student | PhD |
| N19 | Female | 29 | Company Employee | Bachelor |
Table 3. Results of Axial Coding.

| Classification | Core Category | Initial Category | Conceptual Definition |
|---|---|---|---|
| Negative Cognition | B1 Perceived Cost | A1 Cognitive Cost | The cognitive resources and effort users invest when reading privacy-related content. |
| | | A2 Time Cost | The amount of time users spend reading and understanding privacy-related content. |
| | B2 Perceived Coercion | A3 Choice Constraint | Users’ perception of having few choices when interacting with privacy pop-ups. |
| | | A4 Power Asymmetry | Users’ perception of power imbalance between themselves and the platform in privacy decisions. |
| | | A5 Second Pop-up Pressure | Pressure users feel from repeated privacy pop-up reminders after an initial rejection. |
| | B3 Perceived Risk | A6 Third-Party Platform Risk | Potential risks arising from privacy information being shared with third-party platforms. |
| | | A7 Platform-Inherent Risk | Potential risks of privacy breaches caused by the platform itself. |
| | | A8 Privacy Breach Experience | Users’ prior experiences with privacy breaches. |
| | B4 Algorithm Aversion | A23 Recommendation Latency | Lag between app recommendations and users’ actual needs. |
| | | A24 Homogeneous Recommendations | High degree of similarity or redundancy in the app’s recommended content. |
| Positive Cognition | B5 Perceived Benefits | A9 Perceived Usefulness | Users’ perception of usefulness derived from granting privacy permissions. |
| | | A10 Perceived Convenience | Users’ perception of convenience gained from granting privacy permissions. |
| | | A11 Perceived Personalization | Users’ perception that granting permissions facilitates personalized recommendations. |
| | B6 Perceived Legal Effectiveness | A12 Reliability | Users’ perception of the reliability of the privacy-related legal framework. |
| | | A13 Constraints | Users’ perception of the binding force of privacy regulations on app data practices. |
| | | A14 Basis for Rights Defense | Users’ perception that privacy laws can serve as a basis for defending their rights. |
| | B7 Perceived Control | A15 Right to Be Informed | The extent to which users understand the process and content of permission requests. |
| | | A16 Autonomous Choice Right | The extent to which users feel they can autonomously make privacy-related decisions. |
| Positive Affect | B8 Trust | A27 Platform Trust | Users’ belief that the platform will protect their privacy. |
| | | A28 Legal Trust | Users’ belief that the law can effectively protect their privacy rights. |
| | | A29 Trust in Authorization Method | Scenario-based authorization boosts users’ trust in the transparency of permission requests. |
| Negative Affect | B9 Distrust | A30 Vague Protection Commitments | Lack of concrete measures and clear standards in privacy protection commitments. |
| | | A31 Unclear Management Path | Difficulty in locating and navigating privacy settings. |
| | | A32 Opaque Information Sharing | Lack of transparency in the content, scope, and methods of information sharing between platforms. |
| Conation | B10 Habitual Consent to Privacy Pop-Ups | A33 Habitual Consent (Privacy Fatigue-driven) | Consent to privacy pop-ups to avoid the hassle and pressure of frequent privacy decisions. |
| | | A34 Habitual Consent (Trust-driven) | Users’ habitual consent to privacy pop-ups as a result of trust. |
| External Factors | B11 App Characteristics | A19 App Size | Users’ comprehensive evaluation of the app and its parent company. |
| | | A20 App Download Volume | Total number of app downloads. |
| | B12 Data Characteristics | A21 Data Sensitivity | Users’ assessment of the importance of the type of data involved. |
| | | A22 Perceived Relevance | Users’ perception of the relevance between requested permissions and the platform’s core business. |
| | B13 Social Norms | A25 Peer Influence Reference | Users refer to others’ behavior when making privacy decisions. |
| | | A26 Socialization of Risk | Users perceive privacy risks as widespread, collectively shared societal issues. |
| Internal Factors | B14 Privacy Fatigue | A17 Cynicism | Users’ negative attitudes and skepticism toward privacy protection. |
| | | A18 Emotional Exhaustion | Negative emotions caused by constant and exhausting privacy management. |
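Because the axial coding scheme is strictly hierarchical, it can also be expressed as a nested mapping and checked mechanically for consistency, e.g., that every initial category (A…) sits under exactly one core category (B…). The sketch below encodes a small slice of Table 3 under that assumption; `CODING_SCHEME` and `check_unique_assignment` are illustrative names, not artifacts of the study.

```python
# Hypothetical encoding of part of the Table 3 hierarchy:
# classification -> core category -> initial categories.
CODING_SCHEME: dict[str, dict[str, list[str]]] = {
    "Negative Cognition": {
        "B1 Perceived Cost": ["A1 Cognitive Cost", "A2 Time Cost"],
        "B2 Perceived Coercion": [
            "A3 Choice Constraint",
            "A4 Power Asymmetry",
            "A5 Second Pop-up Pressure",
        ],
    },
    "Positive Affect": {
        "B8 Trust": [
            "A27 Platform Trust",
            "A28 Legal Trust",
            "A29 Trust in Authorization Method",
        ],
    },
}

def check_unique_assignment(scheme: dict[str, dict[str, list[str]]]) -> None:
    """Raise if any initial category appears under two core categories."""
    seen: dict[str, str] = {}
    for cores in scheme.values():
        for core, initials in cores.items():
            for initial in initials:
                if initial in seen:
                    raise ValueError(
                        f"{initial} assigned to both {seen[initial]} and {core}"
                    )
                seen[initial] = core
    print(f"OK: {len(seen)} initial categories, each under one core category.")

check_unique_assignment(CODING_SCHEME)
```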
Table 4. Results of Selective Coding.

| Relational Type | Typical Relationship | Path and Connotation | Representative Example Sentences |
|---|---|---|---|
| Causal Relationship | Positive Cognition → Conation (trust-driven) | Positive cognitive evaluations of privacy pop-up content lead to habitual consent behaviors. | “It can provide personalized services for me, so I definitely click ‘Agree’.” |
| | External Factors → Conation (trust-driven) | Variations in social norms, platform characteristics, and data sensitivity influence trust-based consent behavior. | “If a food delivery app doesn’t have my address, it can’t deliver the order. This is reasonable, so I usually agree right away.” |
| | Negative Cognition → Negative Affect | Negative perceptions of privacy pop-up content trigger distrust and other negative affect. | “Sharing with third parties to enable certain features? This justification is too vague—they might actually use it for other purposes.” |
| Mediating Relationship | Positive Cognition → Positive Affect → Conation (trust-driven) | Users’ positive perceptions of privacy pop-up content foster trust-based positive affect, thereby promoting habitual consent. | “I still have faith in legal protections—at the very least, they serve as a safety net if issues arise. The provision of privacy management explanations gives me a sense of control over my personal data, thereby enhancing my sense of reassurance.” |
| | External Factors → Positive Affect → Conation (trust-driven) | External factors (e.g., social norms, app characteristics, data attributes) strengthen users’ trust-based positive affect, thereby promoting habitual consent to privacy notices. | “When encountering apps from reputable companies, I typically consent without hesitation, as their operations are generally more compliant with regulations. If the app has a low number of downloads, I tend to be more cautious and may sometimes decline to install it.” |
| Paradoxical Relationship | Negative Cognition → Conation (privacy fatigue-driven) | Despite negative perceptions of pop-up content, users seldom adopt cautious consent strategies; consent remains largely habitual and automatic. | “There’s always an imbalance between users and platforms. Even though I know there might be some privacy issues, I can’t really do much about it, so I just hit ‘agree’ every time.” |
| | Negative Affect → Conation (privacy fatigue-driven) | Users’ distrust of privacy notice content does not lead to cautious consent decisions; consent remains habitual. | “Major tech companies may seem legitimate on the surface, but in reality, they might not be that trustworthy. Still, in the end, I just end up clicking ‘agree’ anyway.” |
| Moderating Relationship | Negative Cognition × Internal Factors (Privacy Fatigue) → Conation | The degree of privacy fatigue moderates the paradoxical relationship between negative cognition and habitual consent behavior. | “Privacy no longer exists nowadays. Facing this repeatedly makes me numb, so I just click ‘Agree’.” |
| | Negative Affect × Internal Factors (Privacy Fatigue) → Conation | The degree of privacy fatigue moderates the paradoxical relationship between negative affect and habitual consent behavior. | “Sometimes I feel skeptical when I see these permission requests. But after a while, I just end up clicking ‘Agree’ because refusing doesn’t really change anything.” |
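The relationships in Table 4 can likewise be read as a small labeled graph, which makes the dual paths into consent explicit. The sketch below is an illustrative encoding only; the `EDGES` list and the `paths_into` helper are hypothetical names, not part of the original analysis.

```python
# Hypothetical edge-list encoding of the Table 4 paths:
# (source construct, target construct, relational type).
EDGES = [
    ("Positive Cognition", "Conation (trust-driven)", "causal"),
    ("External Factors", "Conation (trust-driven)", "causal"),
    ("Negative Cognition", "Negative Affect", "causal"),
    ("Positive Cognition -> Positive Affect", "Conation (trust-driven)", "mediating"),
    ("External Factors -> Positive Affect", "Conation (trust-driven)", "mediating"),
    ("Negative Cognition", "Conation (privacy fatigue-driven)", "paradoxical"),
    ("Negative Affect", "Conation (privacy fatigue-driven)", "paradoxical"),
    ("Negative Cognition x Privacy Fatigue", "Conation", "moderating"),
    ("Negative Affect x Privacy Fatigue", "Conation", "moderating"),
]

def paths_into(target: str) -> list[tuple[str, str]]:
    """Return (source, relational type) pairs for edges ending at `target`."""
    return [(src, rel) for src, dst, rel in EDGES if dst == target]

# The trust-driven route runs through positive cognition, external factors,
# and positive affect; the fatigue-driven route through negative cognition
# and negative affect.
print(paths_into("Conation (trust-driven)"))
print(paths_into("Conation (privacy fatigue-driven)"))
```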
