Trust, Privacy Fatigue, and the Informed Consent Dilemma in Mobile App Privacy Pop-Ups: A Grounded Theory Approach
Abstract
1. Introduction
2. Literature Review
3. Preliminary Study
3.1. Sample Selection and Analytical Framework
3.2. Main Findings and Conclusions
4. Formal Study
4.1. Research Design
4.1.1. Methodology
4.1.2. Interview Participants
4.1.3. Interview Content and Procedure
4.2. Data Analysis
4.2.1. Open Coding
4.2.2. Axial Coding
4.2.3. Selective Coding
4.2.4. Theoretical Saturation Test
5. Explanation of the Theoretical Model
5.1. Alleviating the Informed Consent Dilemma
5.1.1. Positive Cognition and Consent to Privacy Pop-Ups
5.1.2. External Factors and Consent to Privacy Pop-Ups
5.1.3. Trust as a Mediator of Privacy Pop-Up Consent
5.2. Exacerbating the Informed Consent Dilemma
5.2.1. The Direct Impact of Negative Cognition on Distrust
5.2.2. The Paradox of Consent in Privacy Pop-Ups
5.2.3. Privacy Fatigue as a Moderator of Privacy Pop-Up Consent
6. Conclusions
6.1. Theoretical Implications
6.2. Practical and Managerial Implications
6.3. Research Limitations and Future Directions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A
Initial Category | Initial Concept | Representative Original Statements (Excerpt) |
---|---|---|
A1 Cognitive Cost | a1 Highly Specialized Agreement Terms | The terms contain too much jargon incomprehensible to laypersons. |
A2 Time Cost | a2 Excessively Lengthy Agreement | I feel like I need to scroll for a long time to get through the agreement. |
A3 Choice Constraint | a3 Consent Required to Use the App | If I don’t agree, I can’t use the app. |
A4 Power Asymmetry | a4 Platform Dominance | The privacy pop-up feels like a negotiation, and the app holds all the chips. |
A5 Second Pop-up Pressure | a5 Repeated Pop-up Harassment | I’ve already explicitly rejected it, yet it asks again—it’s basically like spam calls. |
A6 Third-party Platform Risk | a6 Opaque Data Handling; a7 Risks of Data Leakage | Third-party data usage remains undisclosed; Cross-platform data leaks occur frequently. |
A7 Platform-Inherent Risks | a8 Technological Intrusiveness; a9 Security Vulnerabilities | Big data technologies are scary—the app seems to know what I’m thinking; the platform may still have loopholes that people exploit illegally. |
A8 Privacy Breach Experience | a10 Personal Information Leakage | After using my phone number to consult about teacher qualification exams, I was harassed by various sales calls. |
A9 Perceived Usefulness | a11 Basic Function Realization | Without location, a food delivery app can’t deliver food. |
A10 Perceived Convenience | a12 Time-saving | The app recommends products to me based on my information, so I don’t have to spend time searching, which is very convenient. |
A11 Perceived Personalization | a13 Personalized Recommendations | The app’s recommendation of cooking videos aligns with my culinary interests, providing both engagement and educational value. |
A12 Reliability | a14 Reliability of Law | I feel that the law might be more reliable than technical protection statements. |
A13 Constraints | a15 Legal Constraints on App Behavior | The law at least constrains what the app can do, restricting its actions within legal boundaries. |
A14 Basis for Rights Defense | a16 Legal Basis for Rights Protection | If a privacy issue occurs, I can rely on the law to demand compensation for my losses from the app. |
A15 Right to Be Informed | a17 Data Collection Transparency | Scenario-based authorization lets me know what kinds of personal data will be collected. |
A16 Autonomous Choice Right | a18 Empowerment of Choice; a19 Sense of Autonomy | Contextual authorization empowers users to decide whether to consent; providing clear explanations of privacy procedures enhances my sense of autonomy, reducing concerns about privacy leakage even if I choose not to use the service. |
A17 Cynicism | a20 Helplessness over Routine Data Breaches; a21 Futility of Effort | “Privacy erosion seems inevitable”; “Individual resistance proves futile.” |
A18 Emotional Exhaustion | a22 Cognitive Resource Depletion; a23 Cumulative Negative Affect | The dense and complex terms are cognitively exhausting; reviewing these privacy agreements induces frustration, making me unwilling to invest time in reading them. |
A19 App Size | a24 Corporate Size | Larger apps are okay because they won’t deceive you. |
A20 App Download Volume | a25 App Download Volume | If an app has very few downloads, I don’t really trust its privacy pop-up content. |
A21 Data Sensitivity | a26 Low-sensitive Data; a27 High-sensitive Data | If it only collects my social media accounts or email, I don’t really mind. However, when it involves my passwords or financial information, I become highly cautious. |
A22 Perceived Relevance | a28 Low-relevance Permissions; a29 High-relevance Permissions | Playing a game shouldn’t require my contacts—that’s weird; but camera permissions for a scanning app make sense, or it can’t work. |
A23 Recommendation Latency | a30 Delayed Recommendations | I think the recommendation system isn’t smart enough—by the time it suggests things, I’ve already bought them. |
A24 Homogeneous Recommendations | a31 Repetitive Content | It keeps recommending the same type of things, making my view narrow. |
A25 Peer Influence Reference | a32 Herd Behavior | I tend to install apps recommended by friends without verifying their security. |
A26 Socialization of Risk | a33 Collective Risk Bearing | When risks materialize, the consequences are collectively borne by all parties. |
A27 Platform Trust | a34 Platform’s Privacy Guarantees | At least apps downloaded from XX’s official store won’t have viruses, so privacy feels relatively safe. |
A28 Legal Trust | a35 Legal Guarantees for Privacy | If an app claims to follow the law to protect privacy, it increases my trust. |
A29 Trust in Authorization Method | a36 Preference for Scenario-based Authorization | It’s better to request camera permission when taking a photo than to force consent upon installation. |
A30 Vague Protection Commitments | a37 Undisclosed Technical Standards; a38 Unclear Protection Measures | I don’t understand how the so-called advanced protection technology works; privacy protection is just a verbal commitment without substance, and I don’t know how it actually protects my privacy. |
A31 Unclear Management Path | a39 Opaque Operational Guidance | The app provides a privacy management pathway, but I’m not clear on how to operate it. |
A32 Opaque Information Sharing | a40 Hidden Third-party Information; a41 Unclear Sharing Purpose | I’m not clear which third-party platforms my information is shared with; the privacy policy does not clearly state the specific purpose of information sharing, which is unsettling. |
A33 Habitual Consent (Privacy Fatigue-driven) | a42 Reluctant Acceptance; a43 Reducing Information Overload; a44 Privacy Resignation | As individual users in front of platforms, we can only agree and have no other choice; I can’t possibly study every detail, so I just agree directly; I feel there’s no such thing as privacy nowadays, so I don’t care whether I agree or not. |
A34 Habitual Consent (Trust-driven) | a45 Authorizing Familiar Platforms; a46 Authorizing Large Platforms; a47 Consenting Due to Trust in Legal Safeguards | I directly agree for big, familiar companies—they won’t misuse data; I feel their products are legit, so I just agree; I trust the law, so if it says it complies, I’m fine with it. |
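For readers who want to work with the coding scheme programmatically, the open-coding rows above reduce to a simple record structure. The sketch below is illustrative only: the labels and quotes are copied from the table, while the `OpenCode` class and `APPENDIX_A` list are hypothetical conveniences, not instruments from the study.

```python
from dataclasses import dataclass

# Illustrative sketch: one record per open-coding row in Appendix A.
# Field values are copied from the table; names are hypothetical.
@dataclass
class OpenCode:
    initial_category: str          # e.g., "A1 Cognitive Cost"
    initial_concepts: list[str]    # e.g., ["a1 Highly Specialized Agreement Terms"]
    representative_statement: str  # excerpted interviewee quote

APPENDIX_A = [
    OpenCode(
        "A1 Cognitive Cost",
        ["a1 Highly Specialized Agreement Terms"],
        "The terms contain too much jargon incomprehensible to laypersons.",
    ),
    OpenCode(
        "A6 Third-party Platform Risk",
        ["a6 Opaque Data Handling", "a7 Risks of Data Leakage"],
        "Third-party data usage remains undisclosed; cross-platform data leaks occur frequently.",
    ),
    # ... the remaining rows (A2–A34) follow the same pattern.
]

# Count how many initial concepts this excerpt covers.
print(sum(len(row.initial_concepts) for row in APPENDIX_A))
```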
References
- Hurd, H.M. The normative force of consent. In The Routledge Handbook of the Ethics of Consent; Routledge: London, UK, 2018; pp. 44–54.
- Marotta-Wurgler, F. Self-regulation and competition in privacy policies. J. Leg. Stud. 2016, 45 (Suppl. S2), S13–S39.
- Bechmann, A. Non-informed consent cultures: Privacy policies and app contracts on Facebook. J. Media Bus. Stud. 2014, 11, 21–38.
- Jensen, C.; Potts, C. Privacy policies as decision-making tools: An evaluation of online privacy notices. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Vienna, Austria, 24–29 April 2004; pp. 471–478.
- Milne, G.R.; Culnan, M.J.; Greene, H. A longitudinal assessment of online privacy notice readability. J. Public Policy Mark. 2006, 25, 238–249.
- Slavin, R.; Wang, X.; Hosseini, M.B.; Hester, J.; Krishnan, R.; Bhatia, J.; Breaux, T.D.; Niu, J. Toward a framework for detecting privacy policy violations in Android application code. In Proceedings of the 38th International Conference on Software Engineering, Austin, TX, USA, 14–22 May 2016; pp. 25–36.
- Reidenberg, J.R.; Breaux, T.; Cranor, L.F.; French, B.; Grannis, A.; Graves, J.T.; Liu, F.; McDonald, A.; Norton, T.B.; Ramanath, R. Disagreeable privacy policies: Mismatches between meaning and users’ understanding. Berkeley Technol. Law J. 2015, 30, 39.
- West, S.M. Data capitalism: Redefining the logics of surveillance and privacy. Bus. Soc. 2019, 58, 20–41.
- Tymchuk, A.J. Informing for consent: Concepts and methods. Can. Psychol./Psychol. Can. 1997, 38, 55.
- Faden, R.R.; Beauchamp, T.L. A History and Theory of Informed Consent; Oxford University Press: Oxford, UK, 1986.
- Mikalef, P.; Boura, M.; Lekakos, G.; Krogstie, J. Big data analytics capabilities and innovation: The mediating role of dynamic capabilities and moderating effect of the environment. Br. J. Manag. 2019, 30, 272–298.
- Simon, H.A. A behavioral model of rational choice. Q. J. Econ. 1955, 69, 99–118.
- Böhme, R.; Köpsell, S. Trained to accept? A field experiment on consent dialogs. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Atlanta, GA, USA, 10–15 April 2010; pp. 2403–2406.
- Gray, C.M.; Kou, Y.; Battles, B.; Hoggatt, J.; Toombs, A.L. The dark (patterns) side of UX design. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 21–26 April 2018; pp. 1–14.
- Schermer, B.W.; Custers, B.; Van der Hof, S. The crisis of consent: How stronger legal protection may lead to weaker consent in data protection. Ethics Inf. Technol. 2014, 16, 171–182.
- Solove, D. Privacy self-management and the consent dilemma. Harv. Law Rev. 2013, 126, 1880–1903.
- Van den Berg, B.; Van der Hof, S. What happens to my data? A novel approach to informing users of data processing practices. First Monday 2012, 17.
- McDonald, A.M.; Lowenthal, T. Nano-notice: Privacy disclosure at a mobile scale. J. Inf. Policy 2013, 3, 331–354.
- Baarslag, T.; Alan, A.T.; Gomer, R.C.; Liccardi, I.; Marreiros, H.; Gerding, E.H.; Schraefel, M. Negotiation as an interaction mechanism for deciding app permissions. In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, San Jose, CA, USA, 7–12 May 2016; pp. 2012–2019.
- Nissenbaum, H. A contextual approach to privacy online. Daedalus 2011, 140, 32–48.
- GB/T35273-2020; Information Security Technology—Personal Information Security Specification. Standardization Administration of China: Beijing, China, 2020.
- Zaeem, R.N.; Barber, K.S. The effect of the GDPR on privacy policies: Recent progress and future promise. ACM Trans. Manag. Inf. Syst. 2020, 12, 1–20.
- Glaser, B.G.; Strauss, A.L.; Strutzel, E. The discovery of grounded theory: Strategies for qualitative research. Nurs. Res. 1968, 17, 364.
- Pandit, N.R. The creation of theory: A recent application of the grounded theory method. Qual. Rep. 1996, 2, 1–15.
- Morse, J.M. Critical analysis of strategies for determining rigor in qualitative inquiry. Qual. Health Res. 2015, 25, 1212–1222.
- Fishbein, M.; Ajzen, I. Belief, Attitude, Intention, and Behavior: An Introduction to Theory and Research. Philos. Rhetor. 1977, 10, 130–132.
- Chih, W.-H.; Liou, D.-K.; Hsu, L.-C. From positive and negative cognition perspectives to explore e-shoppers’ real purchase behavior: An application of tricomponent attitude model. Inf. Syst. E-Bus. Manag. 2015, 13, 495–526.
- Zeng, F.; Ye, Q.; Li, J.; Yang, Z. Does self-disclosure matter? A dynamic two-stage perspective for the personalization-privacy paradox. J. Bus. Res. 2021, 124, 667–675.
- Xu, H.; Teo, H.-H.; Tan, B.C.Y.; Agarwal, R. Effects of individual self-protection, industry self-regulation, and government regulation on privacy concerns: A study of location-based services. Inf. Syst. Res. 2012, 23, 1342–1363.
- Gong, X.; Zhang, K.Z.; Chen, C.; Cheung, C.M.; Lee, M.K. What drives self-disclosure in mobile payment applications? The effect of privacy assurance approaches, network externality, and technology complementarity. Inf. Technol. People 2020, 33, 1174–1213.
- Acquisti, A.; Gross, R. Imagined communities: Awareness, information sharing, and privacy on the Facebook. In International Workshop on Privacy Enhancing Technologies; Springer: Berlin/Heidelberg, Germany, 2006; pp. 36–58.
- Hsu, C.-L.; Liao, Y.-C.; Lee, C.-W.; Chan, L.K. Privacy concerns and information sharing: The perspective of the U-shaped curve. Front. Psychol. 2022, 13, 771278.
- O’Maonaigh, C.; Saxena, D. Investigating personalisation-privacy paradox among young Irish consumers: A case of smart speakers. arXiv 2021, arXiv:2108.09945.
- Dienlin, T.; Trepte, S. Is the privacy paradox a relic of the past? An in-depth analysis of privacy attitudes and privacy behaviors. Eur. J. Soc. Psychol. 2015, 45, 285–297.
- Choi, H.; Park, J.; Jung, Y. The role of privacy fatigue in online privacy behavior. Comput. Hum. Behav. 2018, 81, 42–51.
- Tian, X.; Chen, L.; Zhang, X. The role of privacy fatigue in privacy paradox: A PSM and heterogeneity analysis. Appl. Sci. 2022, 12, 9702.
Assessment Point | Assessment Indicator | CC | Reference |
---|---|---|---|
1. Public Disclosure of Personal Information Collection Rules | Clearly prompts users to read privacy policies and related documents | 21 | CSPG; PIPL; GB/T35273-2020; [22] |
 | Description of information processor details | 7 | |
2. Explicit Declaration of Purpose, Method, and Scope of Data Collection | Whether notification is provided when changes occur | 0 | |
 | Explains the purpose of requesting permissions/sensitive information | 8 | |
3. User Consent Prior to Data Collection | No collection of personal information prior to obtaining user consent | 21 | |
 | Whether non-explicit methods (e.g., default consent) are used to obtain user consent | 0 | |
4. Data Minimization Principle | Avoids collecting non-service-related personal information | 7 | |
5. User Consent for Third-Party Data Sharing | Whether there is an explanation of information sharing | 8 | |
 | Whether an explanation is provided before obtaining consent to share with third parties | 8 | |
6. User Privacy Rights Management (Deletion/Correction/Complaints) | Whether valid account deletion instructions are provided | 8 | |
 | Provision of effective ways to correct or delete personal information | 8 | |
 | Whether complaint/report channels are established and publicly accessible | 2 | |
7. Technical Safeguards and Data Storage | Whether a technical statement on privacy protection is provided | 6 | |
 | Whether explanations related to information storage are provided | 0 | |
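To make the counts comparable across indicators, the CC column can be turned into compliance rates. The minimal sketch below rests on two inferences that the excerpt does not state explicitly: that CC records how many of the sampled apps satisfied each indicator, and that the sample comprised 21 apps (the column's maximum value). Indicator labels are shortened paraphrases.

```python
# Illustrative sketch: converting the CC column into compliance rates.
# ASSUMPTIONS (not stated in the excerpt): CC counts sampled apps meeting
# each indicator, and the sample size is 21.
SAMPLE_SIZE = 21  # assumed

cc_counts = {
    "Prompts users to read privacy policies": 21,
    "Describes information processor details": 7,
    "Notifies users when collection rules change": 0,
    "Explains purpose of permission requests": 8,
    "No collection before consent": 21,
    "Avoids non-service-related collection": 7,
    # ... remaining indicators follow the same pattern.
}

for indicator, count in cc_counts.items():
    print(f"{indicator}: {count}/{SAMPLE_SIZE} ({count / SAMPLE_SIZE:.0%})")
```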
ID | Gender | Age | Occupation | Education | ID | Gender | Age | Occupation | Education |
---|---|---|---|---|---|---|---|---|---|
N1 | Male | 29 | University Student | PhD | N11 | Male | 30 | Company Employee | Bachelor |
N2 | Female | 24 | Freelancer | Bachelor | N12 | Female | 22 | University Student | Master |
N3 | Male | 63 | University Lecturer | PhD | N13 | Male | 34 | Company Employee | Bachelor |
N4 | Male | 31 | Freelancer | High School | N14 | Female | 41 | University Lecturer | Master |
N5 | Female | 29 | Lawyer | Master | N15 | Male | 20 | University Student | Bachelor |
N6 | Male | 30 | Auxiliary Police | Bachelor | N16 | Male | 29 | Researcher | PhD |
N7 | Female | 29 | Government Staff | Bachelor | N17 | Male | 37 | Company Employee | Bachelor |
N8 | Female | 29 | University Lecturer | Master | N18 | Female | 28 | University Student | PhD |
N9 | Male | 31 | Primary School Teacher | Bachelor | N19 | Female | 29 | Company Employee | Bachelor |
N10 | Female | 30 | Bank Staff | Bachelor |
Classification | Core Category | Initial Category | Conceptual Definition |
---|---|---|---|
Negative Cognition | B1 Perceived Cost | A1 Cognitive Cost | The cognitive resources and effort users invest when reading privacy-related content. |
 | | A2 Time Cost | The amount of time users spend reading and understanding privacy-related content. |
 | B2 Perceived Coercion | A3 Choice Constraint | Users’ perception of having few choices when interacting with privacy pop-ups. |
 | | A4 Power Asymmetry | Users’ perception of the power imbalance between themselves and the platform in privacy decisions. |
 | | A5 Second Pop-up Pressure | The pressure users feel from repeated privacy pop-up reminders after an initial rejection. |
 | B3 Perceived Risk | A6 Third-party Platform Risk | Potential risks arising from privacy information being shared with third-party platforms. |
 | | A7 Platform-Inherent Risks | Potential risks of privacy breaches caused by the platform itself. |
 | | A8 Privacy Breach Experience | Users’ prior experiences with privacy breaches. |
 | B4 Algorithm Aversion | A23 Recommendation Latency | The lag between app recommendations and users’ actual needs. |
 | | A24 Homogeneous Recommendations | The high degree of similarity or redundancy in the app’s recommended content. |
Positive Cognition | B5 Perceived Benefits | A9 Perceived Usefulness | Users’ perception of usefulness derived from granting privacy permissions. |
 | | A10 Perceived Convenience | Users’ perception of convenience gained from granting privacy permissions. |
 | | A11 Perceived Personalization | Users’ perception that granting permissions facilitates personalized recommendations. |
 | B6 Perceived Legal Effectiveness | A12 Reliability | Users’ perception of the reliability of the privacy-related legal framework. |
 | | A13 Constraints | Users’ perception of the binding force of privacy regulations on app data practices. |
 | | A14 Basis for Rights Defense | Users’ perception that privacy laws can serve as a basis for defending their rights. |
 | B7 Perceived Control | A15 Right to Be Informed | The extent to which users understand the process and content of permission requests. |
 | | A16 Autonomous Choice Right | The extent to which users feel they can autonomously make privacy-related decisions. |
Positive Affect | B8 Trust | A27 Platform Trust | Users’ belief that the platform will protect their privacy. |
 | | A28 Legal Trust | Users’ belief that the law can effectively protect their privacy rights. |
 | | A29 Trust in Authorization Method | The boost that scenario-based authorization gives to users’ trust in the transparency of permission requests. |
Negative Affect | B9 Distrust | A30 Vague Protection Commitments | The lack of concrete measures and clear standards in privacy protection commitments. |
 | | A31 Unclear Management Path | The difficulty of locating and navigating privacy settings. |
 | | A32 Opaque Information Sharing | The lack of transparency in the content, scope, and methods of information sharing between platforms. |
Conation | B10 Habitual Consent to Privacy Pop-Ups | A33 Habitual Consent (Privacy Fatigue-driven) | Consent to privacy pop-ups given to avoid the hassle and pressure of frequent privacy decisions. |
 | | A34 Habitual Consent (Trust-driven) | Users’ habitual consent to privacy pop-ups as a result of trust. |
External Factors | B11 App Characteristics | A19 App Size | Users’ overall assessment of the scale of the app and its parent company. |
 | | A20 App Download Volume | The total number of app downloads. |
 | B12 Data Characteristics | A21 Data Sensitivity | Users’ assessment of the importance of the type of data involved. |
 | | A22 Perceived Relevance | Users’ perception of the relevance between requested permissions and the platform’s core business. |
 | B13 Social Norms | A25 Peer Influence Reference | Users’ reference to others’ behavior when making privacy decisions. |
 | | A26 Socialization of Risk | Users’ perception of privacy risks as widespread, collectively shared societal issues. |
Internal Factors | B14 Privacy Fatigue | A17 Cynicism | Users’ negative attitudes and skepticism toward privacy protection. |
 | | A18 Emotional Exhaustion | Negative emotions caused by constant and exhausting privacy management. |
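The axial-coding result above is, in effect, a two-level mapping from classifications to core (B) categories to initial (A) categories, which is convenient to encode for traceability checks against Appendix A. As before, this is a hedged sketch: the labels come from the table, while `AXIAL_CODES` and `core_category_of` are hypothetical helpers, not study artifacts.

```python
from __future__ import annotations

# Illustrative sketch: the axial-coding hierarchy as a nested mapping
# (classification -> core category -> initial categories), excerpted.
AXIAL_CODES: dict[str, dict[str, list[str]]] = {
    "Negative Cognition": {
        "B1 Perceived Cost": ["A1 Cognitive Cost", "A2 Time Cost"],
        "B2 Perceived Coercion": [
            "A3 Choice Constraint", "A4 Power Asymmetry", "A5 Second Pop-up Pressure",
        ],
    },
    "Internal Factors": {
        "B14 Privacy Fatigue": ["A17 Cynicism", "A18 Emotional Exhaustion"],
    },
    # ... the remaining classifications follow the same pattern.
}

def core_category_of(initial_category: str) -> str | None:
    """Return the core (B) category containing a given initial (A) category."""
    for core_categories in AXIAL_CODES.values():
        for core, initials in core_categories.items():
            if initial_category in initials:
                return core
    return None

print(core_category_of("A17 Cynicism"))  # -> "B14 Privacy Fatigue"
```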
Relational Type | Typical Relationship | Path and Connotation | Representative Example Sentences |
---|---|---|---|
Causal Relationship | Positive Cognition → Conation (trust-driven) | Positive cognitive evaluations of privacy pop-up content lead to habitual consent behaviors. | It can provide personalized services for me, so I definitely click ‘Agree’. |
 | External Factors → Conation (trust-driven) | Variations in social norms, platform characteristics, and data sensitivity influence trust-based consent behavior. | If a food delivery app doesn’t have my address, it can’t deliver the order. This is reasonable, so I usually agree right away. |
 | Negative Cognition → Negative Affect | Negative perceptions of privacy pop-up content trigger distrust and other negative affect. | Sharing with third parties to enable certain features? This justification is too vague—they might actually use it for other purposes. |
Mediating Relationship | Positive Cognition → Positive Affect → Conation (trust-driven) | Users’ positive perception of privacy pop-up content fosters trust-based positive affect, which in turn promotes habitual consent. | I still have faith in legal protections—at the very least, they serve as a safety net if issues arise. The provision of privacy management explanations gives me a sense of control over my personal data, thereby enhancing my sense of reassurance. |
 | External Factors → Positive Affect → Conation (trust-driven) | External factors (e.g., social norms, app characteristics, data attributes) strengthen users’ trust-based positive affect, thereby promoting habitual consent to privacy notices. | When encountering apps from reputable companies, I typically consent without hesitation, as their operations are generally more compliant with regulations. If the app has a low number of downloads, I tend to be more cautious and may sometimes decline to install it. |
Paradoxical Relationship | Negative Cognition → Conation (privacy fatigue-driven) | Despite negative perceptions of pop-up content, users seldom adopt cautious consent strategies; instead, consent remains largely habitual and automatic. | There’s always an imbalance between users and platforms. Even though I know there might be some privacy issues, I can’t really do much about it, so I just hit “agree” every time. |
 | Negative Affect → Conation (privacy fatigue-driven) | Users’ distrust of privacy notice content does not lead to cautious consent decisions; instead, consent remains habitual. | Major tech companies may seem legitimate on the surface, but in reality, they might not be that trustworthy. Still, in the end, I just end up clicking “agree” anyway. |
Moderating Relationship | Negative Cognition × Internal Factors (Privacy Fatigue) → Conation | The degree of privacy fatigue moderates the paradoxical relationship between negative cognition and habitual consent behavior. | Privacy no longer exists nowadays. Facing this repeatedly makes me numb, so I just click ‘Agree’. |
 | Negative Affect × Internal Factors (Privacy Fatigue) → Conation | The degree of privacy fatigue moderates the paradoxical relationship between negative affect and habitual consent behavior. | Sometimes I feel skeptical when I see these permission requests. But after a while, I just end up clicking ‘Agree’ because refusing doesn’t really change anything. |
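Although the model is derived qualitatively, the mediating and moderating relationships summarized above map naturally onto a moderated regression specification that a follow-up quantitative study could test. The equations below are a hypothetical formalization only; the variable abbreviations (PosCog, Ext, NegCog, NegAff, Fatigue) are shorthand introduced here for illustration, not measured constructs from this study:

$$
\begin{aligned}
\mathrm{Trust} &= \alpha_0 + \alpha_1\,\mathrm{PosCog} + \alpha_2\,\mathrm{Ext} + \varepsilon_1,\\
\mathrm{Consent} &= \beta_0 + \beta_1\,\mathrm{Trust} + \beta_2\,\mathrm{NegCog} + \beta_3\,\mathrm{NegAff} + \beta_4\,\mathrm{Fatigue}\\
&\quad + \beta_5\,(\mathrm{NegCog}\times\mathrm{Fatigue}) + \beta_6\,(\mathrm{NegAff}\times\mathrm{Fatigue}) + \varepsilon_2.
\end{aligned}
$$

In this reading, trust mediates the benign paths (the indirect effects $\alpha_1\beta_1$ and $\alpha_2\beta_1$), while privacy fatigue moderates the paradoxical paths through the interaction coefficients $\beta_5$ and $\beta_6$.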