Synthetic Social Alienation: The Role of Algorithm-Driven Content in Shaping Digital Discourse and User Perspectives
Abstract
1. Introduction
2. Literature Review
2.1. Alienation, Algorithms and the Digital Media Ecosystem
2.2. The Sociopsychological Dimensions of Algorithmic Alienation
2.3. Rethinking Synthetic Social Alienation
3. Materials and Methods
3.1. Research Questions and Methodological Orientation
3.2. Interview Structure, Analytical Procedures, and Bias Mitigation
3.3. Limitations and Methodological Reflexivity
3.4. Profile of the Interviewees and Methodological Considerations
4. Results
4.1. Dataset for Sentiment Analyses
- interviews_train.docx: Labeled training data. Each interview begins with the marker “Interviewee:” and ends with one of the emotion labels “Positive”, “Negative”, or “Notr” (the Turkish label for “neutral”).
- interviews_test.docx: Unlabeled test data, used for label prediction and the SSA analysis (a loading sketch follows this list).
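To make the data handling concrete, the following is a minimal sketch, not the authors' published code, of how the labeled file could be parsed. It assumes the python-docx library; the file name, the “Interviewee:” marker, and the three labels come from the description above, while the function name load_labeled_interviews is illustrative.

```python
# Minimal parsing sketch (assumption: python-docx; the authors' actual code is not published).
from docx import Document  # pip install python-docx

LABELS = {"Positive", "Negative", "Notr"}  # "Notr" is the Turkish label for "neutral"

def load_labeled_interviews(path: str = "interviews_train.docx") -> list[tuple[str, str]]:
    """Return (interview_text, label) pairs from the labeled .docx file."""
    lines = [p.text.strip() for p in Document(path).paragraphs if p.text.strip()]
    interviews, current = [], None
    for line in lines:
        if line.startswith("Interviewee:"):
            current = [line[len("Interviewee:"):].strip()]  # a new interview begins
        elif line in LABELS and current is not None:
            interviews.append((" ".join(current).strip(), line))  # label line closes it
            current = None
        elif current is not None:
            current.append(line)
    return interviews
```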
4.2. Data Preprocessing
- Lowercase conversion,
- Removal of numbers and punctuation,
- Removal of stopwords (via NLTK’s stopwords.words), and
- Retention of only those words longer than three letters.
All of these operations are performed by the clean_text() function, sketched below.
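A minimal reconstruction of clean_text() under these four steps follows. The NLTK English stopword list is an assumption, as the paper names stopwords.words but not the language; everything else follows the list above.

```python
# Sketch of the clean_text() pipeline described above.
# Assumption: English stopwords; the paper cites stopwords.words without naming a language.
import re
import nltk
from nltk.corpus import stopwords

nltk.download("stopwords", quiet=True)
STOPWORDS = set(stopwords.words("english"))

def clean_text(text: str) -> str:
    text = text.lower()                    # 1. lowercase conversion
    text = re.sub(r"[^a-z\s]", " ", text)  # 2. remove numbers and punctuation
    tokens = [
        w for w in text.split()
        if w not in STOPWORDS              # 3. remove stopwords
        and len(w) > 3                     # 4. keep only words longer than three letters
    ]
    return " ".join(tokens)
```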
4.3. SSA and Emotional Keyword Matching
- SSA_PHRASES: Statements representing synthetic social alienation (SSA) in the digital environment.
- EMOTION_WORDS: Words reflecting positive and negative emotions (a matching sketch follows this list).
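The paper does not reproduce the two lexicons, so the entries in the sketch below are illustrative placeholders; only the SSA_PHRASES / EMOTION_WORDS structure and the idea of matching them against cleaned interview text come from the description above.

```python
# Keyword-matching sketch; both lexicons are illustrative placeholders,
# not the authors' actual SSA_PHRASES and EMOTION_WORDS lists.
SSA_PHRASES = [
    "same content", "endless loop", "echo chamber", "feel disconnected",
]
EMOTION_WORDS = {
    "positive": {"helpful", "enjoy", "connected", "inspiring"},
    "negative": {"manipulative", "drained", "isolated", "stuck"},
}

def score_interview(cleaned_text: str) -> dict:
    """Count SSA phrase hits and positive/negative emotion words in one cleaned interview."""
    words = cleaned_text.split()
    return {
        "ssa_hits": sum(cleaned_text.count(phrase) for phrase in SSA_PHRASES),
        "positive": sum(w in EMOTION_WORDS["positive"] for w in words),
        "negative": sum(w in EMOTION_WORDS["negative"] for w in words),
    }
```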
4.4. Findings and Visualization
- Algorithmic Manipulation: Systematic control of content visibility through opaque algorithmic processes (Gillespie, 2014).
- Digital Alienation: Psychological disconnection arising from the replacement of real human interaction with mediated communication (Turkle, 2015).
- Platform Dependency: Behavioral dependence on digital platforms for social approval and information consumption (van Dijck, 2013).
- Echo Chamber Effects: Reinforcement of existing beliefs through algorithmic filtering and selective exposure (Pariser, 2011).
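For the visualization step, a minimal matplotlib sketch is given below. The per-interview scores are hypothetical stand-ins for the output of score_interview() above; the paper's actual figures are not reproduced here.

```python
# Visualization sketch; the scores are hypothetical stand-ins for real results.
import matplotlib.pyplot as plt
import numpy as np

scores = [  # hypothetical per-interview output of score_interview()
    {"ssa_hits": 3, "positive": 2, "negative": 5},
    {"ssa_hits": 1, "positive": 4, "negative": 1},
    {"ssa_hits": 4, "positive": 1, "negative": 6},
]

x = np.arange(len(scores))
width = 0.25
for i, key in enumerate(["ssa_hits", "positive", "negative"]):
    plt.bar(x + (i - 1) * width, [s[key] for s in scores], width, label=key)
plt.xticks(x, [f"Interview {n + 1}" for n in x])
plt.ylabel("Count")
plt.title("SSA phrase hits and emotion-word counts per interview")
plt.legend()
plt.tight_layout()
plt.show()
```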
5. Discussion
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Andrejevic, M. (2019). Automated media. Routledge.
- Baumeister, R. F., & Leary, M. R. (1995). The need to belong: Desire for interpersonal attachments as a fundamental human motivation. Psychological Bulletin, 117(3), 497–529.
- Berry, D. M. (2014). Critical theory and the digital. Bloomsbury Publishing.
- Bok, S. K. (2023). Enhancing user experience in e-commerce through personalization algorithms. Available online: https://www.theseus.fi/bitstream/handle/10024/815645/Bok_Sun%20Khi.pdf?sequence=2&isAllowed=y (accessed on 8 August 2025).
- Bonini, T., & Treré, E. (2025). Furthering the agenda of algorithmic resistance: Integrating gender and decolonial perspectives. Dialogues on Digital Society, 1(1), 121–125.
- Bruns, A. (2017, September 14–15). Echo chamber? What echo chamber? Reviewing the evidence. 6th Biennial Future of Journalism Conference (FOJ17), Cardiff, UK.
- Bruns, A. (2019). Filter bubble. Internet Policy Review, 8(4), 14261.
- Bucher, T. (2018). If… then: Algorithmic power and politics. Oxford University Press.
- Buhmann, A., Paßmann, J., & Fieseler, C. (2020). Managing algorithmic accountability: Balancing reputational concerns, engagement strategies, and the potential of rational discourse. Journal of Business Ethics, 163(2), 265–280.
- Chavanayarn, S. (2024). Epistemic injustice and ideal social media: Enhancing X for inclusive global engagement. Topoi, 43(5), 1355–1368.
- Costa Netto, Y., & Maçada, A. C. G. (2019, June 8–14). Social media filter bubbles and echo chambers influence IT identity construction. 27th European Conference on Information Systems (ECIS) (pp. 1–14), Stockholm-Uppsala, Sweden.
- Couldry, N., & Mejias, U. A. (2019). The costs of connection: How data is colonizing human life and appropriating it for capitalism. Stanford University Press.
- Dean, J. (2019). Communicative capitalism and revolutionary form. Millennium, 47(3), 326–340.
- Fardouly, J., Willburger, B. K., & Vartanian, L. R. (2018). Instagram use and young women’s body image concerns and self-objectification: Testing mediational pathways. New Media & Society, 20(4), 1380–1395.
- Festinger, L. (1954). A theory of social comparison processes. Human Relations, 7(2), 117–140.
- Fisher, E., & Mehozay, Y. (2019). How algorithms see their audience: Media epistemes and the changing conception of the individual. Media, Culture & Society, 41(8), 1176–1191.
- Flaxman, S., Goel, S., & Rao, J. M. (2016). Filter bubbles, echo chambers, and online news consumption. Public Opinion Quarterly, 80(S1), 298–320.
- Foucault, M. (1980). Power/knowledge: Selected interviews and other writings 1972–1977. Pantheon.
- Fuchs, C. (2014). Social media: A critical introduction. Sage Publications.
- Gillespie, T. (2014). The relevance of algorithms. In P. J. Boczkowski, K. A. Foot, & T. Gillespie (Eds.), Media technologies: Essays on communication, materiality, and society (pp. 167–193). MIT Press.
- Guest, G., Bunce, A., & Johnson, L. (2006). How many interviews are enough? An experiment with data saturation and variability. Field Methods, 18(1), 59–82.
- Gürkan, H., Serttaş, A., & Sarıkaya, T. (2024). The virtual mask: The dark underbelly of digital anonymity and gender identity construction in Turkey. Journal of Arab & Muslim Media Research, 17(1), 47–65.
- Helberger, N., Huh, J., Milne, G., Strycharz, J., & Sundaram, H. (2020). Macro and exogenous factors in computational advertising: Key issues and new research directions. Journal of Advertising, 49(4), 377–393.
- Kitchin, R. (2021). The data revolution: A critical analysis of big data, open data and data infrastructures. Sage Publications.
- Klinger, U., & Svensson, J. (2018). The end of media logics? On algorithms and agency. New Media & Society, 20(12), 4653–4670.
- Kossowska, M., Kłodkowski, P., & Siewierska-Chmaj, A. (2023). Internet-based micro-identities as a driver of societal disintegration. Humanities and Social Sciences Communications, 10, 955.
- Loecherbach, F., Moeller, J., Trilling, D., & van Atteveldt, W. (2020). The unified framework of media diversity: A systematic literature review. Digital Journalism, 8(5), 605–642.
- Lup, K., Trub, L., & Rosenthal, L. (2015). Instagram #instasad?: Exploring associations among Instagram use, depressive symptoms, negative social comparison, and strangers followed. Cyberpsychology, Behavior, and Social Networking, 18(5), 247–252.
- Magalhães, J. C. (2018). Do algorithms shape character? Considering algorithmic ethical subjectivation. Social Media + Society, 4(2), 2056305118768301.
- Malterud, K., Siersma, V. D., & Guassora, A. D. (2016). Sample size in qualitative interview studies: Guided by information power. Qualitative Health Research, 26(13), 1753–1760.
- Marx, K. (1978). The economic and philosophic manuscripts of 1844. In R. C. Tucker (Ed.), The Marx-Engels reader (2nd ed., pp. 66–125). W.W. Norton & Company. (Original work published 1844).
- Mihailidis, P., & Viotty, S. (2017). Spreadable spectacle in digital culture: Civic expression, fake news, and the role of media literacies in “post-fact” society. American Behavioral Scientist, 61(4), 441–454.
- Milan, S. (2015). When algorithms shape collective action: Social media and the dynamics of cloud protesting. Social Media + Society, 1(2), 2056305115622481.
- Mosco, V. (2016). The political economy of communication (3rd ed.). Sage Publications.
- Möller, J., Trilling, D., Helberger, N., & Van Es, B. (2020). Do not blame it on the algorithm: An empirical assessment of multiple recommender systems and their impact on content diversity. In Digital media, political polarization and challenges to democracy (pp. 45–63). Routledge.
- Nguyen, C. T. (2020). Echo chambers and epistemic bubbles. Episteme, 17(2), 141–161.
- Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York University Press.
- Pariser, E. (2011). The filter bubble: What is the internet hiding from you? Penguin.
- Perez Vallejos, E., Dowthwaite, L., Creswich, H., Portillo, V., Koene, A., Jirotka, M., & McAuley, D. (2021). The impact of algorithmic decision-making processes on young people’s well-being. Health Informatics Journal, 27(1), 1–21.
- Poloni, M. (2024). The erosion of the middle class in the age of information: Navigating post-capitalist paradigms of power. Universitat Autònoma de Barcelona.
- Rehman, S., Ullah, S., & Tahir, P. (2024). The intersection of language, power, and AI: A discourse analytical approach to social media algorithms. Sociology & Cultural Research Review, 2(4), 277–291.
- Reviglio, U. (2020). Personalization in social media: Challenges and opportunities for democratic societies. In Polarization, shifting borders and liquid governance. Springer.
- Ross Arguedas, A., Robertson, C., Fletcher, R., & Nielsen, R. (2022). Echo chambers, filter bubbles, and polarization: A literature review. Reuters Institute for the Study of Journalism.
- Sanseverino, G. G. (2023). Politics and ethics of user generated content: A cross-national investigation of engagement and participation in the online news ecosystem in 80 news sites [Ph.D. dissertation, Université Paul Sabatier-Toulouse III].
- Saurwein, F., & Spencer-Smith, C. (2021). Automated trouble: The role of algorithmic selection in harms on social media platforms. Media and Communication, 9(4), 222–233.
- Seaver, N. (2017). Algorithms as culture: Some tactics for the ethnography of algorithmic systems. Big Data & Society, 4(2), 1–12.
- Seuren, A. J. (2024). Bypassing algorithms, reinforcing stereotypes: Social media experiences of female creators [Ph.D. dissertation, Murdoch University].
- Shin, D., & Park, Y. J. (2019). Role of fairness, accountability, and transparency in algorithmic affordance. Computers in Human Behavior, 98, 277–284.
- Stark, B., Stegmann, D., Magin, M., & Jürgens, P. (2020). Are algorithms a threat to democracy? The rise of intermediaries: A challenge for public discourse. AlgorithmWatch. Available online: https://algorithmwatch.org/en/wp-content/uploads/2020/05/Governing-Platforms-communications-study-Stark-May-2020-AlgorithmWatch.pdf (accessed on 5 August 2025).
- Sunstein, C. R. (2001). Republic.com. Harvard Journal of Law & Technology, 14(3), 753–766.
- Taylor, A. S. (2022). Authenticity as performativity on social media. Palgrave Macmillan.
- Terranova, T. (2000). Free labor: Producing culture for the digital economy. Social Text, 63(18), 33–58.
- Turkle, S. (2015). Reclaiming conversation: The power of talk in a digital age. Penguin.
- van Dijck, J. (2013). The culture of connectivity: A critical history of social media. Oxford University Press.
- Yeung, K. (2018). Algorithmic regulation: A critical interrogation. Regulation & Governance, 12(4), 505–523.
- Zimmer, F., Scheibe, K., & Stock, W. G. (2019, January 3–5). Echo chambers and filter bubbles of fake news in social media: Man-made or produced by algorithms? Hawaii University International Conferences on Arts, Humanities, Social Sciences & Education, Honolulu, HI, USA.
- Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. Profile Books.
1. What motivates you to use social media? (e.g., entertainment, news, work, connection)
2. How familiar are you with algorithms curating content on social media platforms?
3. Do you notice patterns in the kind of content recommended to you? Can you give examples?
4. How do you feel about the personalization of content by these algorithms?
5. Do you think the content you see on social media reflects a wide range of perspectives? Why or why not?
6. How often do you encounter content or opinions that challenge your beliefs?
7. How does the content you see on social media affect your perception of others outside your immediate circles?
8. Do you think social media algorithms make connecting with people from different backgrounds or beliefs easier or harder?
9. Have you ever tried to bypass or limit algorithmic recommendations? How did that affect your experience?
| Theme | Common Observations | Comments from Interviews |
|---|---|---|
| Motivations for Social Media Use | Entertainment, work, and connection are the primary drivers, varying by age and occupation. | Profile 6: “I use TikTok for entertainment.”; Profile 9: “I use Instagram to promote causes.” |
| Awareness of Algorithms | Most users are somewhat aware of algorithms but differ in depth of understanding. | Profile 8: “I am very aware of algorithms and how they shape what I see.”; Profile 7: “I do not understand how it works.” |
| Patterns in Recommended Content | Users observe repetitive content loops, often aligned with their preferences. | Profile 4: “I see much political content that aligns with my views.”; Profile 3: “I get recommended gaming videos all the time.” |
| Feelings About Personalization | Mixed feelings: some appreciate the convenience, others find it intrusive. | Profile 6: “I find the recommendations helpful.”; Profile 9: “It is manipulative; it makes me see only what I like.” |
| Exposure to Diverse Perspectives | Echo chambers limit exposure to differing viewpoints. | Profile 7: “All I see is content that confirms what I already know.”; Profile 9: “It is hard to reach new audiences.” |
| Encountering Challenging Content | Encounters with belief-challenging content are rare, especially among younger users. | Profile 3: “I never see content that challenges my views.”; Profile 9: “Sometimes I face polarized discussions.” |
| Effects on Perceptions of Others | Limited or skewed portrayals in content shape perceptions. | Profile 10: “I feel disconnected from others.”; Profile 8: “My feed shapes my view of people in my field.” |
| Connections Across Differences | Algorithmic filtering makes it difficult to connect with diverse groups. | Profile 7: “I mostly see content from people like me.”; Profile 4: “It is hard to engage with differing perspectives.” |
| Attempts to Bypass Algorithms | Attempts include manual search or following diverse accounts, with mixed success. | Profile 3: “I follow accounts manually to get different content.”; Profile 10: “I try to use the platform’s tools, but they do not work well.” |
| User Type | Discourse Patterns | Key Linguistic Markers | Implications |
|---|---|---|---|
| Passive Consumers (Profiles 1, 2, 3, 5, 8) | Repetitive, deterministic | “I always see the same,” “It’s an endless loop” | Low agency, algorithmic dependence |
| Active Curators (Profiles 4, 6, 7, 9, 10) | Adaptive, strategic | “I try to manipulate it,” “I avoid certain content” | Algorithmic literacy, resistance discourse |
| Algorithm-Dependent Users (professionals, activists) | Paradoxical, negotiated | “I need it but hate it,” “I optimize for reach” | Tension between reliance and critique |
| Theme | Common Observations | Comments from Interviews |
|---|---|---|
| Understanding of SSA | Most users recognize that algorithms can create a sense of detachment or alienation, though the depth of their understanding varies. | Profile 1: “It feels like I am constantly being fed the same stuff, and the platform does not care about what I need.” |
| Perceptions of Social Interaction | Users experience a sense of isolation or detachment from genuine human connection due to algorithmic filtering and content curation. | Profile 6: “Even though I interact with people online, I feel like I am not truly connecting with them.”; Profile 7: “It is like we are all in echo chambers, and it does not feel real.” |
| Impact on Emotional Well-being | SSA is often linked to frustration, dissatisfaction, and emotional disengagement from the content users consume. | Profile 8: “I sometimes feel emotionally drained from the repetitive content that does not resonate with me.”; Profile 3: “The endless gaming videos make me feel stuck.” |
| Alienation from Diverse Perspectives | The algorithmic environment limits exposure to differing perspectives, reinforcing detachment from broader social conversations. | Profile 9: “I do not see much of the other side of issues, which makes me feel disconnected from people with different views.” |
| Attempts to Counter SSA | Some users actively try to break out of SSA by seeking more diverse content, though algorithmic filtering limits their success. | Profile 2: “I manually search for new topics to get out of the bubble, but it does not always work.”; Profile 10: “I follow accounts from different perspectives, but it does not make much of a difference.” |
| Social Engagement in Digital Spaces | SSA is often associated with superficial or transactional interactions in digital spaces rather than meaningful, in-depth connections. | Profile 4: “I am engaging with the same type of people, but it feels more like networking than true connection.”; Profile 5: “There is only so much I can gain from these platforms before they start feeling empty.” |
| Theme | Common Observations | Comments from Interviews |
|---|---|---|
| Active Content Curation | Some users actively engage with algorithms by curating their feeds, choosing specific accounts to follow, or using search features. | Profile 1: “I try to follow various accounts to diversify my feed.” |
| Manual Content Search | Users rely on manual searches and browsing to find content outside algorithmic suggestions. | Profile 6: “I search for things manually to find new content that the algorithm does not suggest.” |
| Engagement with Diverse Accounts | Some users follow a range of diverse accounts or topics to counter algorithmic homogeneity. | Profile 9: “I follow accounts that challenge my views to broaden my perspective.” |
| Frequent Unfollowing/Muting | Some users unfollow or mute accounts to control their feeds and prevent overexposure to repetitive content. | Profile 2: “I mute accounts that keep pushing the same content I do not find interesting.” |
| Limiting Time on Platforms | Reducing overall platform usage is one strategy for coping with content overload and algorithmic influence. | Profile 10: “I limit my time on social media so I do not get caught in these loops.” |
| Seeking Alternative Platforms | Some users move to platforms with less algorithmic control or a different content structure. | Profile 5: “I have started using a new platform where I can curate my content more freely.” |
| Awareness and Avoidance of Bias | Some users consciously avoid bias-reinforcing content by seeking diverse opinions or questioning algorithmic suggestions. | Profile 4: “I question whether the content I see is biased, especially in politics.” |
| Engaging with Algorithmic Feedback | A few users try to steer algorithmic suggestions through likes, shares, and other feedback signals to shape their recommendations. | Profile 8: “I like videos that are more diverse to try to influence my recommendations.” |
| Concept | Description | Key Implications |
|---|---|---|
| Synthetic Social Alienation (SSA) | The phenomenon whereby algorithm-driven environments reshape social interactions and identities, leading to detachment from real-world connections. | Alienation in digital spaces, cognitive narrowing, and fragmented social bonds. |
| Simulated Relationships | Users engage with algorithmically curated content and personas, prioritizing engagement over authenticity. | Weakens genuine human connections, fosters parasocial relationships. |
| Commodification of Engagement | Social interactions (likes, shares, comments) are converted into data and exploited for profit. | Reduces relationships to transactional engagements, reinforcing corporate control. |
| Erosion of Realness | Users create and maintain digital personas that prioritize visibility over authenticity, shaped by algorithmic incentives. | Disconnection from the true self, loss of diverse perspectives. |
| Algorithmic Detachment Syndrome (ADS) | A form of digital alienation in which users experience cognitive and social estrangement due to constant algorithmic mediation. | Reinforces echo chambers, reduces exposure to new ideas. |
| Surveillance Capitalism | Platforms extract behavioral data, manipulate user behavior, and reinforce engagement loops for monetization. | Loss of autonomy, increased corporate influence over thought and behavior. |
| Discursive Fragmentation | Algorithmic curation prioritizes content that maximizes engagement, producing ideological echo chambers. | Limits intellectual diversity and increases polarization. |
| Lexical and Rhetorical Shifts | Sensationalized and emotionally charged content dominates discourse, simplifying discussions. | Undermines critical thinking, encourages reactionary communication. |
| Algorithmic Visibility and Speech Economy | Algorithms determine whose voices are amplified and whose are suppressed, shaping public discourse. | Concentrates power in digital platforms, marginalizes counter-narratives. |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).