Systematic Review

Trap of Social Media Algorithms: A Systematic Review of Research on Filter Bubbles, Echo Chambers, and Their Impact on Youth

Mukhtar Ahmmad, Khurram Shahzad, Abid Iqbal and Mujahid Latif
1 Department of Media Studies, Government College University, Lahore 54000, Pakistan
2 Department of Library, Government College University, Lahore 54000, Pakistan
3 Department of Central Library, Prince Sultan University, Riyadh 11586, Saudi Arabia
4 President Office, Prince Sultan University, Riyadh 11586, Saudi Arabia
* Author to whom correspondence should be addressed.
Societies 2025, 15(11), 301; https://doi.org/10.3390/soc15110301
Submission received: 2 October 2025 / Revised: 24 October 2025 / Accepted: 25 October 2025 / Published: 30 October 2025
(This article belongs to the Special Issue Algorithm Awareness: Opportunities, Challenges and Impacts on Society)

Abstract

This systematic review synthesizes a decade of peer-reviewed research (2015–2025) examining the interplay of filter bubbles, echo chambers, and algorithmic bias in shaping youth engagement within social media. A total of 30 studies, identified using the PRISMA 2020 framework, were analyzed, encompassing computational audits, simulation modeling, surveys, ethnographic accounts, and mixed-methods designs across diverse platforms, including Facebook, YouTube, Twitter/X, Instagram, TikTok, and Weibo. Results reveal three consistent patterns: (i) algorithmic systems structurally amplify ideological homogeneity, reinforcing selective exposure and limiting viewpoint diversity; (ii) youth demonstrate partial awareness and adaptive strategies to navigate algorithmic feeds, though their agency is constrained by opaque recommender systems and uneven digital literacy; and (iii) echo chambers not only foster ideological polarization but also serve as spaces for identity reinforcement and cultural belonging. Despite these insights, the evidence base suffers from geographic bias toward Western contexts, limited longitudinal research, methodological fragmentation, and conceptual ambiguity in key definitions. This review highlights the need for integrative, cross-cultural, and youth-centered approaches that bridge empirical evidence with lived experiences.

1. Introduction

Over the past decade, social media platforms have become the primary arenas where young people communicate, learn, and construct their worldviews. Platforms such as Instagram, TikTok, YouTube, and X (formerly Twitter) serve not only as entertainment hubs but also as central sources of news and information for this generation, shaping both the content they encounter and the ways they interpret it [1,2]. Yet the architecture of these platforms is not neutral. It is powered by complex recommender systems: social media algorithms designed to filter, rank, and suggest content based on users’ inferred preferences and behaviors [3,4]. While these algorithms offer a highly personalized and engaging experience, they also introduce significant risks. Social media algorithms operate by analyzing user behaviors, such as clicks, shares, and time spent on particular posts, to predict and recommend content that aligns with those behaviors [5]. Although this personalization enhances convenience and relevance, it can simultaneously limit exposure to diverse perspectives, reinforcing a fragmented information environment that mirrors existing beliefs [3]. Recent advances in propaganda detection highlight the potential of hierarchical graph-based integration networks to identify misinformation with high accuracy across diverse contexts, including during crises like COVID-19 [6]. Such approaches illustrate how artificial intelligence can capture hidden semantic and syntactic patterns, reinforcing the urgency of studying youth exposure to manipulative algorithmic environments.
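To make this mechanism concrete, the following minimal Python sketch reduces engagement-based ranking to a weighted sum of behavioral signals. The field names and weights are hypothetical illustrations, not any platform’s actual system:

```python
# Illustrative sketch of engagement-based ranking; all fields and weights
# are hypothetical assumptions, not any platform's actual recommender.
from dataclasses import dataclass

@dataclass
class Post:
    topic: str
    clicks: int        # past clicks by this user on similar content
    shares: int        # past shares of similar content
    dwell_secs: float  # time spent on similar posts

def engagement_score(post: Post) -> float:
    # Weighted sum of behavioral signals; higher = more likely to be shown.
    return 1.0 * post.clicks + 2.0 * post.shares + 0.1 * post.dwell_secs

feed = [
    Post("politics-left", clicks=40, shares=5, dwell_secs=300),
    Post("politics-right", clicks=2, shares=0, dwell_secs=20),
    Post("science", clicks=10, shares=1, dwell_secs=90),
]

# Ranking by predicted engagement pushes already-preferred topics to the
# top; each ranking round then feeds the next round's behavioral signals,
# which is the feedback loop behind filter-bubble formation.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(post.topic, round(engagement_score(post), 1))
```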
Although frequently conflated, filter bubbles and echo chambers reflect distinct though interrelated mechanisms of information isolation. A filter bubble refers to the personalization effects of algorithmic curation that limit exposure to diverse viewpoints [7], while an echo chamber emerges through selective social interaction and confirmation bias, where users engage primarily with like-minded others [8]. Recognizing this conceptual distinction is critical for understanding how algorithms and social dynamics jointly shape youth engagement with digital information.
A key outcome of this algorithmic curation is the formation of filter bubbles. Filter bubbles arise when algorithms systematically reduce the diversity of information presented, prioritizing content that resonates with prior interests or opinions while limiting exposure to alternative viewpoints [9]. This process fosters selective exposure and entrenches confirmation bias. Closely related are echo chambers, environments where individuals predominantly engage with like-minded voices, amplifying shared perspectives and marginalizing dissenting ones [5]. Echo chambers are particularly concerning in contexts of political and ideological polarization, where they can deepen divisions and even fuel extremism [10,11]. Complementing this, hybrid feature engineering models like HAPI demonstrate how propaganda on social media can be systematically classified using combinations of linguistic, sentimental, and structural features [12]. The success of these models underscores that algorithmic personalization interacts with strategically designed disinformation, further shaping how young audiences perceive political and social realities.
Another critical dimension is algorithmic bias: the systematic favoritism embedded within algorithmic systems. Such bias often reproduces prevailing social, cultural, and political hierarchies, shaping users’ perceptions of reality by elevating dominant narratives while silencing marginalized ones, frequently without the user’s awareness [13,14]. Combined with selective exposure, algorithmic bias exacerbates polarization, spreads misinformation, and undermines democratic discourse [15,16].
For young social media users, these dynamics are particularly consequential. Adolescents and young adults are at a formative stage of cognitive, social, and political development, making them especially susceptible to the subtle influence of algorithmic curation. Their media habits are largely shaped within opaque digital ecosystems whose operations they often do not fully understand [17]. Studies indicate that many young users remain unaware of how algorithms structure their online experiences; even those who are aware often lack the critical skills to diversify their information sources [18,19]. Consequently, algorithms can profoundly shape what young people know, how they interact, and whom they trust, often without conscious recognition. Beyond propaganda, spam detection research shows the effectiveness of contextualized word embeddings such as BERT and ELMo in identifying hidden semantic cues that traditional models miss [20]. These advances highlight that the same technical sophistication driving personalized feeds can also be repurposed to enhance safeguards against manipulative or harmful information flows.
Although research on filter bubbles, echo chambers, and algorithmic bias has grown rapidly, the literature remains fragmented across multiple disciplines, including computer science, communication studies, psychology, and political science. Existing reviews [11] have synthesized broad insights, yet they typically treat “users” as a homogeneous category, overlooking the distinct vulnerabilities and agency of youth. Much of the scholarship either emphasizes the technical architecture of algorithms or focuses on adult political engagement, with comparatively limited attention to the intersection of algorithmic curation, youth media literacy, and socio-civic outcomes.
This absence of a youth-centered synthesis presents several challenges. First, young people constitute one of the most active user groups on algorithm-driven platforms, making them central to understanding both risks and opportunities [21,22]. Second, as youth are in the process of identity formation, exposure to homogeneous or polarized content can have long-term consequences for their democratic participation, cultural openness, and mental health [23]. Third, without drawing on diverse methodological traditions, ranging from computational audits [13,14] to ethnographic analyses [24], interventions to mitigate these risks are likely to be partial or ineffective.
This review also draws on broader perspectives from media and youth sociology to contextualize algorithmic effects. Scholars such as Sonia Livingstone [25,26] emphasize how digital mediation transforms young people’s civic and learning ecologies, while Nick Couldry [27] highlights the symbolic power of platforms in shaping the conditions of public voice and participation. Integrating these perspectives underscores that algorithmic influence extends beyond technical filtering to include deeper sociocultural and normative implications for youth agency and identity.
Against this backdrop, the rationale for the present review is clear: despite urgent global debates over algorithmic personalization and its societal consequences, youth perspectives remain underrepresented in academic discourse. A systematic review consolidating empirical and conceptual scholarship on the intersection of algorithmic curation and youth is essential for informing policy design, platform governance, educational strategies, and future research. This review seeks to bridge disciplinary silos, foreground youth experiences, and provide a coherent evidence base to guide both scholarly debate and practical efforts aimed at cultivating healthier and more diverse online environments for young people.

2. Objectives of the Study

The objectives of the study are as follows:
  • Map the methodological approaches, theoretical frameworks, and platforms studied in youth-related research on filter bubbles, echo chambers, and algorithmic bias.
  • Identify the mechanisms through which algorithms shape youth information exposure, belief formation, and social interaction.
  • Examine documented social, political, and psychological impacts of these phenomena on young people.

3. Methodology

3.1. Review Design and Framework

This study employed a systematic literature review (SLR) methodology to identify, evaluate, and synthesize peer-reviewed empirical research on filter bubbles, echo chambers, and algorithmic bias, with a particular focus on their impact on youth in social media contexts. The review adhered to the PRISMA 2020 guidelines [28] to ensure transparency, replicability, and rigor throughout the process. Following established recommendations [29,30], the review was carried out in four iterative phases: planning and protocol design, literature search and screening, eligibility and quality assessment, and data extraction followed by thematic synthesis.

3.2. Planning and Search Strategy

The search strategy was developed following an initial scoping of foundational scholarship on algorithmic personalization, online polarization, and youth digital behavior [1,9,15]. Based on this scoping, keyword clusters were formulated around four core constructs. Table 1 displays keyword clusters used to retrieve the required literature.
A pilot search was conducted to refine platform-specific syntax.
The Boolean search string combined platform-specific and conceptual terms as follows:
(“filter bubble*” OR “echo chamber*” OR “algorithmic bias” OR “personalization algorithm*”) AND (“youth” OR “adolescent*” OR “young people”) AND (“social media” OR “Facebook” OR “Instagram” OR “Twitter” OR “TikTok” OR “YouTube” OR “Weibo”).
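As an illustrative reconstruction (not the actual search tool used), the short Python sketch below shows how the Table 1 keyword clusters combine into such a string: terms are OR-joined within a cluster and the clusters are AND-ed together, following standard block-search practice:

```python
# Illustrative reconstruction of the Boolean string from keyword clusters;
# cluster contents follow Table 1, the helper itself is an assumption.
clusters = {
    "concept": ['"filter bubble*"', '"echo chamber*"', '"algorithmic bias"',
                '"personalization algorithm*"'],
    "population": ['"youth"', '"adolescent*"', '"young people"'],
    "platform": ['"social media"', '"Facebook"', '"Instagram"', '"Twitter"',
                 '"TikTok"', '"YouTube"', '"Weibo"'],
}

def boolean_query(clusters: dict) -> str:
    # OR within a cluster, AND across clusters.
    blocks = ["(" + " OR ".join(terms) + ")" for terms in clusters.values()]
    return " AND ".join(blocks)

print(boolean_query(clusters))
```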

3.3. Databases Searched

To ensure comprehensive coverage, searches were conducted across ten multidisciplinary and subject-specific databases known for indexing communication, sociology, political science, and computer science research. Table 2 shows the databases utilized to explore the required content from worldwide published studies.
Additionally, Google Scholar was used for backward and forward citation chasing. Searches were conducted independently by two reviewers between 15 May and 15 August 2025.

3.4. PRISMA Flow and Screening Outcome

Following title and abstract screening, 163 full-text articles were assessed for eligibility. After applying inclusion and exclusion criteria, 30 studies were retained for final synthesis. Reasons for exclusion at the full-text stage included lack of youth focus, non-empirical design, or absence of algorithmic dimension. The screening process is summarized in the PRISMA 2020 Flow Diagram (Figure 1), adapted from Page et al. [28].

3.5. Inclusion and Exclusion Criteria

Eligibility criteria were established in advance to ensure consistency in study selection. Table 3 reveals inclusion and exclusion criteria of the study.
While purely theoretical essays were excluded, three studies integrating conceptual analysis with empirical or computational elements were retained, as they provided theoretical value for the youth-focused synthesis.
Disagreements during screening were resolved by discussion, with arbitration by a third reviewer where necessary.
A potential limitation of the review lies in publication bias, particularly the predominance of English-language and Western-based studies in indexed databases. This overrepresentation may obscure diverse socio-cultural experiences of algorithmic media use in the Global South. Future reviews could mitigate this issue by incorporating multilingual searches and fostering collaborations with Global South research networks.

3.6. Quality Appraisal

Quality assessment was performed using tools that align with the study designs of the included articles. Table 4 displays the quality assessment tool. Out of all the included studies, 19 were rated high quality, 9 moderate, and 2 low but acceptable (due to limited transparency in sampling or analysis). Inter-rater reliability was assessed using Cohen’s Kappa (κ = 0.84), indicating strong agreement between reviewers during both the screening and quality assessment phases [31].
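For readers unfamiliar with the statistic, Cohen’s kappa can be computed as in the sketch below; it uses scikit-learn, and the two reviewers’ decision vectors are invented purely for illustration:

```python
# Illustration of the inter-rater agreement statistic reported above;
# the include/exclude decisions are hypothetical example data.
from sklearn.metrics import cohen_kappa_score

reviewer_a = ["include", "exclude", "include", "include", "exclude",
              "include", "exclude", "exclude", "include", "include"]
reviewer_b = ["include", "exclude", "include", "exclude", "exclude",
              "include", "exclude", "exclude", "include", "include"]

kappa = cohen_kappa_score(reviewer_a, reviewer_b)
# Kappa corrects raw agreement (here 9/10) for chance agreement;
# values above 0.80 are conventionally read as strong agreement.
print(f"Cohen's kappa = {kappa:.2f}")  # prints 0.80 for this example
```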

3.7. Synthesis Approach

Given the methodological heterogeneity of the included studies, a narrative thematic synthesis was adopted to integrate and interpret the findings. To ensure analytical clarity, the studies were organized into four thematic clusters: algorithmic processes, which examined how algorithms shape exposure and interactions; youth behavioral responses, focusing on awareness, adaptation, and resistance strategies; socio-political impacts, including polarization, civic engagement, and misinformation; and interventions and solutions, addressing policy reforms, media literacy initiatives, and design-based approaches. Thematic maps and comparative tables were developed to visualize both convergences and divergences across the studies. Evidence gaps were explicitly highlighted, enabling a nuanced understanding of where current scholarship converges and where further investigation is warranted. This approach ensured that the synthesis captured not only the breadth of available research but also the specific nuances of youth-centered dynamics in the social media landscape.

3.8. Data Extraction and Coding

Following the final selection of studies, data were systematically extracted using a standardized template to ensure both consistency and comparability. For each study, key information was recorded, including study design, youth demographics (age range and operational definition of “youth,” where applicable), platform(s) examined, algorithmic focus (e.g., filter bubbles, echo chambers, algorithmic bias, amplification), primary findings related to youth exposure, beliefs, and socio-civic outcomes, as well as relevant methodological considerations. The coding framework combined deductive categories derived from the review’s core constructs with inductive codes emerging directly from the data, following the six-step thematic analysis process proposed by Braun & Clarke [35]. In cases where studies were not exclusively youth-focused but offered strong mechanistic insights relevant to youth, they were clearly marked as “general users; youth-relevant” and retained for synthesis.
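The extraction template can be pictured as a structured record. The Python dataclass below is an illustrative rendering of the fields described above; the field names, types, and example values are assumptions for exposition, not the authors’ actual instrument:

```python
# Illustrative sketch of the standardized extraction template; field names
# mirror the text, example values are invented for demonstration only.
from dataclasses import dataclass, field

@dataclass
class ExtractionRecord:
    study_id: int
    citation: str
    design: str                # e.g., "computational audit", "survey"
    youth_definition: str      # age range or "general users; youth-relevant"
    platforms: list = field(default_factory=list)
    algorithmic_focus: list = field(default_factory=list)  # bubbles, chambers, bias, amplification
    key_findings: str = ""
    methodological_notes: str = ""

record = ExtractionRecord(
    study_id=16,
    citation="de Groot et al. (2023)",
    design="mixed-methods (survey + interviews)",
    youth_definition="youth sample (as defined in the study)",
    platforms=["Instagram", "TikTok", "YouTube"],
    algorithmic_focus=["filter bubbles", "algorithm awareness"],
)
print(record.citation, record.platforms)
```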

3.9. Ethical Considerations

As this review relied solely on published and publicly available academic literature, no formal ethical approval was required. All studies were accessed through institutional subscriptions or open-access repositories, and appropriate citation practices were followed to ensure academic integrity and proper acknowledgment of original sources.

3.10. Registration Statement

This study was not preregistered in PROSPERO or any other database, as it was conducted retrospectively based on previously published literature. However, all procedures adhered to the PRISMA 2020 guidelines to ensure methodological rigor, transparency, and reproducibility.

4. Results

4.1. Overview of Included Studies

This review synthesized 30 peer-reviewed journal articles published between 2015 and 2025, investigating the influence of filter bubbles, echo chambers, and algorithmic bias on youth and young adults within social media environments. The studies encompassed a range of methodologies, including computational audits, simulation modeling, experimental and survey-based research, qualitative ethnographies, and mixed-methods designs.
While several studies explicitly targeted youth populations [17,19,36], others examined general social media users but provided findings with strong relevance to youth engagement, civic participation, or susceptibility to algorithmic curation [1,15]. Together, these studies offer insight into:
  • the mechanisms by which algorithmic systems shape online information exposure,
  • the extent of ideological segregation in social networks,
  • the socio-psychological and contextual factors influencing youth susceptibility to polarization, and
  • potential pathways for intervention, including media literacy and platform design changes.

4.2. Descriptive Characteristics of Included Studies

Table 5 presents the descriptive characteristics of the 30 included studies, organized by author, study design, country/region, platform(s), and thematic focus.

4.3. Methodological Typology

The included studies reflected diverse methodological approaches, with computational audits and modeling dominating, while survey-based and ethnographic approaches highlighted the youth perspective. Figure 2 shows methods employed in the selected studies.
It illustrates the proportion of studies conducted under various research design methodologies. Computational/Algorithm Audits dominate with 33% of the studies, followed by Surveys & Experiments (20%) and Simulation/Agent-Based Modeling (17%). Qualitative/Ethnographic approaches account for 13% of the studies, while Conceptual/Theoretical and Mixed-Methods methodologies make up 10% and 7%, respectively.

4.4. Geographic Distribution

Figure 3 displays the distribution of studies across different regions, highlighting both youth-specific studies and non-youth-specific studies. The chart shows that the USA has the highest number of total studies (11), with a significant portion (3) focused on youth. Western Europe follows with 9 studies, 4 of which are youth-specific. Other regions, such as Eastern Europe (3 studies), Asia-Pacific (4 studies), Australia (2 studies), and Latin America (1 study), show fewer total studies, with a varying proportion of youth-specific research. The stacked structure of the chart allows for a clear visual comparison of the study types within each region.

4.5. Temporal Trends (2015–2025)

The timeline of publications reveals a strong growth in interest after 2018, aligning with heightened public concern over misinformation and political polarization. More recent studies (2023–2025) are notable for their emphasis on youth-centered perspectives and cross-cultural comparisons. Figure 4 shows the distribution of studies over time, from 2015 to 2025, with notable trends emerging in different periods. The chart highlights the growth in the number of studies, particularly after 2018, when topics like polarization, radicalization, and personality traits took center stage. The trend continues to rise in the years 2021–2023, focusing on youth awareness and misinformation. The most recent studies (2024–2025) reflect a shift toward global comparisons and a growing concern around the drift in recommendations. The chart clearly shows the increase in research volume, with each period color-coded for easy identification of themes tied to the studies.

4.6. Thematic Synthesis of Findings

Figure 5 illustrates the distribution of studies across four thematic clusters. Algorithmic Processes comprises the largest portion, accounting for 40% of the studies, followed by Socio-Political Impacts at 27%. Youth Behavioral Responses represents 20% of the studies, while Interventions & Solutions makes up the smallest share at 13%. This chart highlights the relative focus of research on algorithmic risks and socio-political effects, while emphasizing the relatively lesser attention on interventions and solutions.

4.7. Theoretical Framework Usage

Figure 6 shows the distribution of theoretical frameworks across the included studies. Each node represents a distinct theoretical framework, with node size proportional to its frequency of use.

4.8. Methodological Limitations Identified

Table 6 summarizes the main methodological limitations reported across the 30 studies. The most frequent issue was Western bias (15/30), followed by a lack of longitudinal data (12/30). Algorithm opacity was noted in 10 studies, while youth-specific sampling gaps (8/30) and small qualitative samples (4/30) highlighted weaknesses in representativeness and external validity. Collectively, these limitations point to the need for more diverse, longitudinal, and youth-focused research.

4.9. Most Frequently Used Terms and Concepts

Terms such as algorithmic bias, social media, youth engagement, and polarization dominate the word cloud, emphasizing their central role in the research. Figure 7 aggregates the most significant keywords from the study’s thematic clusters into a word cloud. It highlights the primary areas of focus, such as the impact of algorithmic bias on social media, the role of youth engagement in polarization, and the increasing concern over filter bubbles and echo chambers. The size of each word reflects its relevance and frequency in the research corpus, offering an intuitive understanding of the key issues addressed in the literature. This figure supports the ongoing narrative by visually reinforcing the dominance of these themes and their intersection with socio-political factors, media literacy, and digital youth behavior.

4.10. Platform-Specific Distribution

Figure 8 visualizes the 30 studies across five platform categories. Facebook (7) is primarily linked to polarization, with strong youth relevance. YouTube (6) is associated with radicalization, showing very strong youth relevance. Twitter/X (6) emphasizes echo chambers with moderate youth impact. Instagram/TikTok (2) explore algorithm awareness, an area assessed as under-researched but of strong youth relevance. Multi-platform studies (9) cover comparative bias, with varied relevance. The flow widths correspond to the number of studies, showing that multi-platform and Facebook research dominate, while TikTok/Instagram remain under-studied.

5. Discussion

This systematic review synthesized a decade of peer-reviewed scholarship (2015–2025) on filter bubbles, echo chambers, and youth engagement within algorithmically curated social media environments. Drawing from 30 studies across Europe, North America, Asia, and cross-regional contexts, the findings reveal both convergence and divergence in how researchers conceptualize, measure, and interpret the interplay between platform algorithms, selective exposure, and youth agency. While early work [1,9] approached algorithmic effects primarily through large-scale computational analysis of exposure diversity, more recent studies expand the scope to include youth awareness, critical literacy, identity formation, and cross-cultural contexts [17,45]. Across methodological traditions (audits, simulations, surveys, ethnographies, and conceptual analyses), three dominant patterns emerged.
First, there is a consistent observation across computational audits and simulation studies that platform curation systems amplify ideologically homogeneous content, reinforcing confirmation bias and limiting incidental exposure to diverse viewpoints [1,4,37]. These structural dynamics provide the “default” informational environment in which youth engagement unfolds. Simulation models highlight how small initial biases are magnified by recommender systems, producing polarization cascades at the network level [2,10,38]. Evidence from YouTube demonstrates how personalization drifts toward sensationalist and radical material [14,41,49]. Such findings underscore that algorithmic bias is not a marginal technical quirk but a structural driver shaping everyday media diets. For youth, this environment is especially influential: platforms such as TikTok, Instagram, and YouTube are central not only for entertainment but also for identity work and civic socialization [17]. The narrowing of exposure may thus have longer-term consequences for political learning and civic participation.
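The amplification dynamic described here can be illustrated with a minimal bounded-confidence opinion model in the spirit of Sîrbu et al. [38]. The Python sketch below is a simplified illustration: the parameter values and the biased partner-selection rule are chosen for clarity, not fidelity to any published implementation:

```python
# Minimal bounded-confidence opinion model with an algorithmic bias
# parameter, loosely inspired by Sirbu et al. [38]; all parameters and
# the selection rule are illustrative simplifications.
import random

def step(opinions, epsilon=0.2, mu=0.5, gamma=1.0):
    i = random.randrange(len(opinions))
    # Algorithmic bias: partners with closer opinions are more likely to
    # be "recommended" (weight ~ distance^-gamma); gamma=0 is random mixing.
    others = [j for j in range(len(opinions)) if j != i]
    weights = [(abs(opinions[i] - opinions[j]) + 1e-6) ** -gamma for j in others]
    j = random.choices(others, weights=weights, k=1)[0]
    # Bounded confidence: only sufficiently similar agents influence each other.
    if abs(opinions[i] - opinions[j]) < epsilon:
        shift = mu * (opinions[j] - opinions[i])
        opinions[i] += shift
        opinions[j] -= shift

random.seed(1)
opinions = [random.random() for _ in range(200)]
for _ in range(20000):
    step(opinions)

# Stronger bias toward similar partners (higher gamma) tends to leave more,
# smaller opinion clusters, i.e., greater fragmentation of the population.
print("approx. opinion clusters:", len({round(x, 1) for x in opinions}))
```

Varying gamma in such a model reproduces, in miniature, the review’s structural finding: even a small preference for like-minded interaction, applied repeatedly, hardens into persistent opinion clusters.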
Contrary to deterministic accounts of “algorithmic capture,” the reviewed studies highlight significant but uneven youth agency. Survey and interview-based research suggests that many young people possess a partial awareness of algorithmic personalization and actively attempt to diversify their feeds, for example by following cross-cutting accounts or using secondary platforms [17,19]. Ethnographic accounts further show that youth sometimes resist or parody algorithmic power through practices of “algorithmic gossip” and irony [24,40]. Yet, this agency is constrained. Opaque platform architectures limit the effectiveness of user strategies, while broader political and cultural contexts shape the space for resistance. In authoritarian or highly polarized societies, youth face added pressures: in Morocco, YouTube echo chambers around rap culture reveal how algorithms intersect with socio-political marginalization [50]; in Turkey, restrictive environments amplify the chilling effects of echo chambers on freedom of expression [45]. In China, echo chambers during COVID-19 rumor rebuttals demonstrated how algorithmic amplification interacts with state-directed discourse [16]. Together, these cases illustrate that while youth are not passive, their ability to exercise agency varies dramatically by context.
A third key theme is the intertwining of algorithmic curation with identity processes. Experimental and survey studies confirm that curated feeds can strengthen partisan attitudes and affective polarization [18,42]. For youth, however, polarization is not experienced only as political division but also as cultural and subcultural identity reinforcement. Ethnographic and qualitative studies describe how algorithmic environments facilitate the co-creation of shared vocabularies, memes, and humor that both strengthen in-group belonging and exclude outsiders [24,45]. Comparative work shows that individual differences such as personality traits also mediate susceptibility to echo chamber effects [19]. Taken together, this suggests that algorithms do not only “trap” youth in bubbles but also furnish symbolic resources for identity work, making interventions more complex than simply diversifying exposure.
The findings also indicate that technological fixes alone, such as minor tweaks to recommendation algorithms, are insufficient. While algorithmic transparency and auditing are necessary [14,22], they must be complemented by investments in youth-centered algorithmic literacy. Programs should move beyond basic fact-checking to equip young users with knowledge about data collection, personalization processes, and strategies for intentional content diversification. For policymakers, the findings highlight an urgent need for regulatory frameworks that ensure accountability in recommendation systems. The European Union’s Digital Services Act is one step forward, but youth-specific protections remain underdeveloped [51]. NGOs and educational institutions should work with platforms to implement better design interventions, such as customizable feed options, that balance user empowerment with systemic safeguards.

Methodological Gaps and Future Research

This review identified four recurring methodological limitations that shape the current evidence base. First, there is a marked geographic imbalance, with most studies focusing on the United States and Western Europe, leaving youth in the Global South largely underrepresented. Only a few investigations address non-Western contexts, such as Morocco [50], Turkey [45], or China [16], limiting the generalizability of findings across diverse political and cultural environments. Second, the literature demonstrates significant temporal blindness, as very few studies adopt longitudinal designs. This gap restricts our understanding of how filter bubble and echo chamber effects evolve over time, particularly during critical developmental stages of political socialization and civic identity formation. Third, there is a reliance on fragmented methodological approaches: computational audits and modeling studies often provide detailed insights into algorithmic structures but remain disconnected from the lived experiences of youth, whereas qualitative and ethnographic accounts offer depth but lack generalizability.
Mixed-method approaches that could bridge these divides remain scarce. Finally, the field is hindered by conceptual ambiguity. Key terms such as “filter bubble” and “echo chamber” are frequently used interchangeably despite their distinct theoretical origins, leading to inconsistencies in operationalization and synthesis across studies [48]. Addressing these gaps will require more geographically inclusive, temporally sensitive, methodologically integrated, and conceptually precise research agendas. Future research should therefore adopt cross-cultural comparative designs, employ multi-platform approaches, and incorporate experimental intervention testing to evaluate the impact of literacy training or algorithm redesigns.
To address this gap, future studies could establish regional research consortia or partnerships with Global South universities to ensure inclusion of diverse cultural and linguistic contexts. Additionally, supporting open-access repositories for non-English publications and youth-centered participatory research can foster more equitable global knowledge production.
Overall, the findings highlight that while algorithms contribute to ideological segmentation, young people are not passive consumers. Many demonstrate adaptive strategies, such as cross-platform verification and intentional content diversification, that reflect emerging forms of algorithmic literacy. Cultivating these skills through media education could mitigate the isolating effects of personalized information flows.

6. Conclusions

This review shows that filter bubbles and echo chambers are socio-technical phenomena, products of algorithmic design, user agency, and broader cultural and political contexts. Youth are neither passive victims of algorithmic determinism nor fully autonomous navigators of their digital worlds; rather, they negotiate complex environments shaped by both constraint and creativity. The challenge for researchers, educators, and policymakers is to move beyond binary framings of “trapped versus free” and to cultivate approaches that recognize youth as active participants in algorithmically mediated publics. Addressing risks will require a dual strategy: structural reforms in platform governance and deep, critical algorithmic literacy embedded in education. Only by combining these can we ensure that young people are not merely surviving but thriving in an algorithmic society.

Author Contributions

Conceptualization, M.A.; Methodology, M.A. and K.S.; Validation, M.A. and K.S.; Formal analysis, M.A.; Investigation, M.A.; Resources, A.I. and M.L.; Data curation, A.I. and M.L.; Writing—original draft preparation, M.A.; Writing—review and editing, K.S.; Visualization, A.I. and M.L.; Supervision, M.A.; Project administration, M.A.; Funding acquisition, A.I. and M.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding. The authors gratefully acknowledge the support and cooperation provided by Prince Sultan University, KSA, which facilitated the completion of this study. The Article Processing Charge (APC) was funded by Prince Sultan University, KSA.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

This study did not involve human participants; therefore, informed consent was not required.

Data Availability Statement

All data analyzed in this study were derived from publicly available, peer-reviewed journal articles identified through academic databases. No new empirical data were generated. The datasets extracted from the 30 selected studies are summarized in Table 5.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Bakshy, E.; Messing, S.; Adamic, L.A. Exposure to ideologically diverse news and opinion on Facebook. Science 2015, 348, 1130–1132. [Google Scholar] [CrossRef]
  2. Berman, R.; Katona, Z. Curation algorithms and filter bubbles in social networks. Mark. Sci. 2020, 39, 296–316. [Google Scholar] [CrossRef]
  3. Baeza-Yates, R. Bias on the web. Commun. ACM 2018, 61, 54–61. [Google Scholar] [CrossRef]
  4. Kulshrestha, J.; Eslami, M.; Messias, J.; Zafar, M.B.; Ghosh, S.; Gummadi, K.P.; Karahalios, K. Search bias quantification: Investigating political bias in social media and web search. Inf. Retr. J. 2019, 22, 188–227. [Google Scholar] [CrossRef]
  5. Bucher, T. If…Then: Algorithmic Power and Politics; Oxford University Press: Oxford, UK, 2018. [Google Scholar]
  6. Ahmad, P.N.; Guo, J.; AboElenein, N.M.; Haq, Q.M.U.; Ahmad, S.; Algarni, A.D.; Ateya, A.A. Hierarchical graph-based integration network for propaganda detection in textual news articles on social media. Sci. Rep. 2025, 15, 1827. [Google Scholar] [CrossRef]
  7. Pariser, E. The Filter Bubble: What the Internet Is Hiding from You; Penguin: London, UK, 2011. [Google Scholar]
  8. Sunstein, C.R. #Republic: Divided Democracy in the Age of Social Media; Princeton University Press: Princeton, NJ, USA, 2017. [Google Scholar]
  9. Flaxman, S.; Goel, S.; Rao, J.M. Filter bubbles, echo chambers, and online news consumption. Public Opin. Q. 2016, 80, 298–320. [Google Scholar] [CrossRef]
  10. Geschke, D.; Lorenz, J.; Holtz, P. The triple-filter bubble: Using agent-based modelling to test a meta-theoretical framework for the emergence of filter bubbles and echo chambers. Br. J. Soc. Psychol. 2019, 58, 129–149. [Google Scholar] [CrossRef]
  11. Terren, L.; Borge, R. Echo Chambers on Social Media: A Systematic Review of the Literature. Rev. Commun. Res. 2021, 9, 99–118. [Google Scholar] [CrossRef]
  12. Khanday, A.M.U.D.; Wani, M.A.; Rabani, S.T.; Khan, Q.R.; Abd El-Latif, A.A. HAPI: An efficient Hybrid Feature Engineering-based Approach for Propaganda Identification in social media. PLoS ONE 2024, 19, e0302583. [Google Scholar] [CrossRef]
  13. Metaxa, D.; Park, J.S.; Robertson, R.E.; Karahalios, K.; Wilson, C.; Hancock, J.; Sandvig, C. Auditing algorithms. Found. Trends Hum. Comput. Interact. 2021, 14, 272–344. [Google Scholar] [CrossRef]
  14. Srba, I.; Moro, R.; Tomlein, M.; Pecher, B.; Simko, J.; Stefancova, E.; Kompan, M.; Hrckova, A.; Podrouzek, J.; Gavornik, A.; et al. Auditing YouTube’s Recommendation Algorithm for Misinformation Filter Bubbles. ACM Trans. Recomm. Syst. 2023, 1, 1–33. [Google Scholar] [CrossRef]
  15. Guess, A.M.; Malhotra, N.; Pan, J.; Barberá, P.; Allcott, H.; Brown, T.; Crespo-Tenorio, A.; Dimmery, D.; Freelon, D.; Gentzkow, M.; et al. How do social media feed algorithms affect attitudes and behavior in an election campaign? Science 2023, 381, 398–404. [Google Scholar] [CrossRef]
  16. Qian, Y.; Wang, D. Echo Chamber Effect in Rumor Rebuttal Discussions About COVID-19 in China: Social Media Content and Network Analysis Study. J. Med. Internet Res. 2021, 23, e27009. [Google Scholar] [CrossRef]
  17. de Groot, T.; de Haan, M.; van Dijken, M. Learning in and about a filtered universe: Young people’s awareness and control of algorithms in social media. Learn. Media Technol. 2023, 48, 701–713. [Google Scholar] [CrossRef]
  18. Baumgaertner, B.; Justwan, F. The preference for belief, issue polarization, and echo chambers. Synthese 2022, 200, 412. [Google Scholar] [CrossRef]
  19. Sindermann, C.; Elhai, J.D.; Moshagen, M.; Montag, C. Age, gender, personality, ideological attitudes and individual differences in a person’s news spectrum: How many and who might be prone to “filter bubbles” and “echo chambers” online? Heliyon 2020, 6, e03214. [Google Scholar] [CrossRef]
  20. Alshattnawi, S.; Shatnawi, A.; AlSobeh, A.M.R.; Magableh, A.A. Beyond Word-Based Model Embeddings: Contextualized Representations for Enhanced Social Media Spam Detection. Appl. Sci. 2024, 14, 2254. [Google Scholar] [CrossRef]
  21. Broto Cervera, R.; Pérez-Solà, C.; Batlle, A. Overview of the Twitter conversation around #14F 2021 Catalonia regional election: An analysis of echo chambers and presence of social bots. Soc. Netw. Anal. Min. 2024, 14, 96. [Google Scholar] [CrossRef]
  22. Cakmak, M.C.; Agarwal, N.; Oni, R. The bias beneath: Analyzing drift in YouTube’s algorithmic recommendations. Soc. Netw. Anal. Min. 2024, 14, 171. [Google Scholar] [CrossRef]
  23. Guo, S.; Song, X.; Gao, Y. Personality traits and their influence on Echo chamber formation in social media: A comparative study of Twitter and Weibo. Front. Psychol. 2024, 15, 1323117. [Google Scholar] [CrossRef] [PubMed]
  24. Bishop, S. Managing visibility on YouTube through algorithmic gossip. New Media Soc. 2019, 21, 2589–2606. [Google Scholar] [CrossRef]
  25. Livingstone, S. Developing social media literacy: How children learn to interpret risky opportunities on social network sites. Communications 2014, 39, 283–303. [Google Scholar] [CrossRef]
  26. Livingstone, S. Children and the Internet; Polity Press: Cambridge, UK, 2009. [Google Scholar]
  27. Couldry, N. Media, Society, World: Social Theory and Digital Media Practice; Polity Press: Cambridge, UK, 2012. [Google Scholar]
  28. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, M.; et al. The PRISMA 2020 Statement: An Updated Guideline for Reporting Systematic Reviews. BMJ 2021, 372, n71. [Google Scholar] [CrossRef]
  29. Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G. Preferred Reporting Items for Systematic Reviews and Meta-Analyses: The PRISMA Statement. PLoS Med. 2009, 6, e1000097. [Google Scholar] [CrossRef] [PubMed]
  30. Tranfield, D.; Denyer, D.; Smart, P. Towards a Methodology for Developing Evidence-Informed Management Knowledge by Means of Systematic Review. Br. J. Manag. 2003, 14, 207–222. [Google Scholar] [CrossRef]
  31. McHugh, M.L. Interrater reliability: The kappa statistic. Biochem. Medica 2012, 22, 276. [Google Scholar] [CrossRef]
  32. CASP. Critical Appraisal Skills Programme Qualitative Checklist. 2018. Available online: https://casp-uk.net/casp-tools-checklists/qualitative-studies-checklist/ (accessed on 9 August 2025).
  33. Hong, Q.N.; Fàbregues, S.; Bartlett, G.; Boardman, F.; Cargo, M.; Dagenais, P.; Gagnon, M.P.; Griffiths, F.; Nicolau, B.; O’Cathain, A.; et al. The Mixed Methods Appraisal Tool (MMAT) version 2018 for information professionals and researchers. Educ. Inf. 2018, 34, 285–291. [Google Scholar] [CrossRef]
  34. Downes, M.J.; Brennan, M.L.; Williams, H.C.; Dean, R.S. Development of a critical appraisal tool to assess the quality of cross-sectional studies (AXIS). BMJ Open 2016, 6, e011458. [Google Scholar] [CrossRef] [PubMed]
  35. Braun, V.; Clarke, V. Using Thematic Analysis in Psychology. Qual. Res. Psychol. 2006, 3, 77–101. [Google Scholar] [CrossRef]
  36. Guo, Q. Intentional echo chamber management: Chinese celebrity fans’ information-seeking and sense-making practices on social media. J. Doc. 2025, 81, 236–252. [Google Scholar] [CrossRef]
  37. Chen, W.; Pacheco, D.; Yang, K.C.; Menczer, F. Neutral bots probe political bias on social media. Nat. Commun. 2021, 12, 5580. [Google Scholar] [CrossRef]
  38. Sîrbu, A.; Pedreschi, D.; Giannotti, F.; Kertész, J. Algorithmic bias amplifies opinion fragmentation and polarization: A bounded confidence model. PLoS ONE 2019, 14, e0213246. [Google Scholar] [CrossRef]
  39. Papakyriakopoulos, O.; Carlos, J.; Serrano, M.; Hegelich, S. Political communication on social media: A tale of hyperactive users and bias in recommender systems. J. Online Soc. Netw. Media 2019, 15, 100058. [Google Scholar] [CrossRef]
  40. Bucher, T. The algorithmic imaginary: Exploring the ordinary affects of Facebook algorithms. Inf. Commun. Soc. 2017, 20, 30–44. [Google Scholar] [CrossRef]
  41. Ledwich, M.; Zaitsev, A. Algorithmic Extremism: Examining YouTube’s Rabbit Hole of Radicalization. arXiv 2019. [Google Scholar] [CrossRef]
  42. Shmargad, Y.; Klar, S. Sorting the News: How Ranking by Popularity Polarizes Our Politics. Political Commun. 2020, 37, 423–446. [Google Scholar] [CrossRef]
  43. Del Cerro, C.C. The power of social networks and social media’s filter bubble in shaping polarisation: An agent-based model. Appl. Netw. Sci. 2024, 9, 69. [Google Scholar] [CrossRef]
  44. Peralta, A.F.; Kertész, J.; Iñiguez, G. Opinion formation on social networks with algorithmic bias: Dynamics and bias imbalance. J. Phys. Complex. 2021, 2, 045009. [Google Scholar] [CrossRef]
  45. Aydoğan, B.B.; Köse, H. The Problem of Freedom of Expression in the Public Sphere of Social Media: Descriptive Analysis of the Echo Chamber Effect. Siyasal J. Political Sci. 2024, 33, 277–317. [Google Scholar] [CrossRef]
  46. Ackermann, K.; Stadelmann-Steffen, I. Voting in the Echo Chamber? Patterns of Political Online Activities and Voting Behavior in Switzerland. Swiss Political Sci. Rev. 2022, 28, 377–400. [Google Scholar] [CrossRef]
  47. Avnur, Y. What’s Wrong with the Online Echo Chamber: A Motivated Reasoning Account. J. Appl. Philos. 2020, 37, 578–593. [Google Scholar] [CrossRef]
  48. Cakmak, M.C.; Agarwal, N. Influence of symbolic content on recommendation bias: Analyzing YouTube’s algorithm during Taiwan’s 2024 election. Appl. Netw. Sci. 2025, 10, 23. [Google Scholar] [CrossRef]
  49. Coady, D. Stop Talking about Echo Chambers and Filter Bubbles. Educ. Theory 2024, 74, 92–107. [Google Scholar] [CrossRef]
  50. Moreno-Almeida, C. Memes as snapshots of participation: The role of digital amateur activists in authoritarian regimes. New Media Soc. 2021, 23, 1545–1566. [Google Scholar] [CrossRef]
  51. European Commission. The Digital Services Act. 2022. Available online: https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/digital-services-act_en (accessed on 18 July 2025).
Figure 1. PRISMA 2020 flow diagram for study selection, adapted from Page et al. [28].
Figure 2. Research methods employed in the studies.
Figure 3. Geographic distribution of studies by region.
Figure 4. Temporal trends (2015–2025).
Figure 5. Thematic synthesis of findings.
Figure 6. Theoretical framework usage.
Figure 7. Frequently used terms and concepts.
Figure 8. Characteristics and population focus.
Table 1. Keyword clusters.
Core Construct | Example Keywords
Algorithmic Curation | “algorithmic bias,” “recommender systems,” “personalization”
Filter Bubbles and Echo Chambers | “echo chamber,” “filter bubble,” “information cocoon,” “content polarization”
Youth and Adolescents | “youth,” “teenagers,” “young adults,” “adolescents”
Social Media Platforms | “Twitter,” “Facebook,” “YouTube,” “Instagram,” “TikTok,” “Weibo”
Table 2. Databases searched.
Database | Disciplines Covered
Web of Science (Core Collection) | Multidisciplinary
Scopus | Multidisciplinary, social sciences
Communication & Mass Media Complete (CMMC) | Communication studies
ACM Digital Library | Computer science, technology
IEEE Xplore | Engineering, technology
JSTOR | Humanities, social sciences
SAGE Journals Online | Social sciences, education
Taylor & Francis Online | Social sciences, humanities
SpringerLink | Multidisciplinary
ProQuest Social Science | Social sciences, political science
Table 3. Inclusion and exclusion criteria.
Inclusion Criteria | Exclusion Criteria
Peer-reviewed journal articles | Grey literature (blogs, theses, op-eds)
Published 2015–2025 | Published before 2015
Written in English | Non-English publications
Focus on filter bubbles, echo chambers, or algorithmic bias | Studies on general internet use without focus on these phenomena
Explicit examination of youth (adolescents, young adults ≤ 30) | Studies without youth-specific data or analysis
Any empirical method (qualitative, quantitative, mixed) | Purely theoretical essays, commentaries
Focus on social media platforms (e.g., Facebook, Twitter, YouTube, Instagram, TikTok) | Offline or non-social media contexts
Table 4. Quality assessment tool.
Study Design | Quality Assessment Tool | Reference
Qualitative studies | CASP (Critical Appraisal Skills Programme) | CASP [32]
Mixed-methods studies | MMAT (Mixed Methods Appraisal Tool) | Hong et al. [33]
Cross-sectional surveys | AXIS (Appraisal of Cross-Sectional Studies) | Downes et al. [34]
Table 5. Characteristics of included studies.
No. | Author(s) & Year | Study Design | Country/Region | Platform(s) | Thematic Focus
1 | Bakshy et al. [1] | Large-scale computational analysis | USA | Facebook | Exposure to ideologically diverse news
2 | Flaxman et al. [9] | Quantitative web traffic analysis | USA | Google News, social media | Filter bubbles and news consumption
3 | Guess et al. [15] | Field experiment | USA | Facebook, Instagram | Algorithmic feeds and political attitudes
4 | Chen et al. [37] | Experimental audit with bots | USA | Twitter | Detecting political bias in curation
5 | Metaxa et al. [13] | Systematic audit study | USA | Multiple platforms | Bias in algorithmic audits
6 | Srba et al. [14] | Algorithm audit & content analysis | Slovakia | YouTube | Misinformation filter bubbles
7 | Kulshrestha et al. [4] | Quantitative bias measurement | Germany/USA | Facebook, Google | Political bias in search & feeds
8 | Sîrbu et al. [38] | Simulation modeling | Italy | Generic networks | Algorithmic bias amplifying polarization
9 | Baeza-Yates [3] | Conceptual & empirical synthesis | Chile/USA | Web | Web bias and fairness
10 | Berman & Katona [2] | Simulation & network analysis | USA | Generic networks | Filter bubble formation
11 | Papakyriakopoulos et al. [39] | Computational communication analysis | Germany | Twitter | Hyperactive users & recommender bias
12 | Bishop [24] | Qualitative digital ethnography | UK | YouTube | Algorithmic gossip & visibility
13 | Bucher [40] | Qualitative interviews | Norway | Facebook | Algorithmic imaginary
14 | Ledwich & Zaitsev [41] | Data-driven analysis | Australia | YouTube | Radicalization & algorithmic extremism
15 | Shmargad & Klar [42] | Online experiment | USA | News ranking platforms | Popularity ranking & polarization
16 | de Groot et al. [17] | Mixed-methods (survey + interviews) | Netherlands | Instagram, TikTok, YouTube | Youth awareness of algorithms
17 | Cakmak, Agarwal, & Oni [22] | Computational audit | USA | YouTube | Drift in recommendation bias
18 | Broto Cervera et al. [21] | Social network analysis | Spain | Twitter | Echo chambers & bots
19 | Chueca Del Cerro [43] | Agent-based modeling | Spain | Generic networks | Polarisation dynamics
20 | Sindermann et al. [19] | Survey study | Germany | Multiple | Personality & bubble susceptibility
21 | Peralta et al. [44] | Simulation | Hungary | Generic networks | Opinion formation with bias
22 | Baumgaertner & Justwan [18] | Experimental survey | USA | Multiple | Belief preference & polarization
23 | Guo, S. et al. [23] | Comparative network analysis | China/USA | Twitter, Weibo | Personality traits & echo chambers
24 | Aydoğan & Köse [45] | Qualitative descriptive | Turkey | Multiple | Freedom of expression & echo chambers
25 | Qian & Wang [16] | Network/content analysis | China | Weibo | Echo chambers in COVID-19 rebuttals
26 | Ackermann & Stadelmann-Steffen [46] | Survey | Switzerland | Facebook, Twitter | Online activity & voting
27 | Avnur [47] | Philosophical/theoretical | USA | Multiple | Motivated reasoning in echo chambers
28 | Coady [48] | Critical conceptual analysis | Australia | Multiple | Rethinking filter bubbles
29 | Geschke et al. [10] | Agent-based modeling | Germany | Generic networks | Triple-filter bubble dynamics
30 | Terren & Borge [11] | Literature review | Spain | Multiple | Echo chambers research synthesis
Note: Studies are listed in random order rather than by publication date.
Table 6. Common limitations.
Limitation | Studies Reporting | Consequences
Western bias | 15/30 | Limited generalizability
Lack of longitudinal data | 12/30 | No evidence of sustained effects
Algorithm opacity | 10/30 | Reliance on black-box inference
Low youth-specific sampling | 8/30 | Gaps in adolescent-focused findings
Small qualitative samples | 4/30 | Weak external validity