Article

Truth and Trust in the News: How Young People in Portugal and Finland Perceive Information Operations in the Media

1 Unit of Social Research, Faculty of Social Sciences, Tampere University, 33014 Tampere, Finland
2 Research Center for Arts and Communication (CIAC-UMAIA), University of Maia, 4475-690 Maia, Portugal
3 Artificial Intelligence and Computer Science Laboratory (LIACC), University of Porto, 4200-465 Porto, Portugal
4 Communication and Society Research Centre (CECS), University of Minho, 4710-057 Braga, Portugal
* Author to whom correspondence should be addressed.
Journal. Media 2026, 7(1), 13; https://doi.org/10.3390/journalmedia7010013
Submission received: 14 November 2025 / Revised: 9 January 2026 / Accepted: 15 January 2026 / Published: 20 January 2026

Abstract

This study explores how young people in Finland and Portugal perceive media trust and vulnerability to information operations in the digital era. While both groups rely heavily on digital platforms for news, they view online sources as less reliable due to disinformation and fake news, especially on TikTok and Instagram. Trust and truth appear emotionally driven, with influencers and entertainment content often considered credible, increasing susceptibility to manipulation. Despite identifying as ‘digital natives’, participants rarely question source credibility or algorithmic influence, leaving them exposed to adversarial actors, such as Russia.

1. Introduction

Young people’s trust in news is a complex and contested issue, particularly in the context of mis- and disinformation on digital platforms (Costera Meijer, 2007; Melro & Pereira, 2023). States like Russia and China strategically use disinformation and selective framing to promote biased narratives in the global media landscape (Yanchenko & Humprecht, 2025). While these countries are known for conducting information operations using false content, other nations are also adopting similar tactics to foster polarized narratives across social and mainstream media (Albert et al., 2023; Fang, 2024; Yanchenko & Humprecht, 2025). Information operations are strategic efforts by state or non-state actors that use information, influencers, and digital platforms to manipulate opinions, behaviors, and decision-making for political or military advantage. These operations aim to dominate the cognitive sphere, often thriving in environments where truth is not the primary concern.
Mis- and disinformation are common tools in information warfare (Meriläinen, 2024, 2025). According to the American Psychological Association (APA, 2023), disinformation refers to deliberately false content meant to mislead, while misinformation involves inaccurate or incorrect information shared without intent to deceive. Digital platforms (social media, the Internet, and gaming) amplify both, especially targeting young audiences. Although young people often rely on online media and influencers for news (Meriläinen, 2022, 2024), previous studies reveal that they place more trust in information from traditional media outlets, showing a paradox between what they believe to be true and which information they trust (Costera Meijer, 2007; Melro & Pereira, 2023). In other words, they place greater trust in what they believe to be the ‘truth’, yet they do not actually seek out the information they consider trustworthy (Melro & Pereira, 2023). Additionally, recent research highlights growing concerns about the reliability of digital platforms and online actors, framing them as potential national security risks in the context of information operations (Kalpokas, 2017; Meriläinen, 2025).
The socially accepted role of youth and their idealized view of news are being challenged by digital communities and personalized content, which often create echo chambers and filter bubbles. In this environment, truth becomes blurred: labeling something as “fake news” or cancelling someone is easy, and opinions often outweigh facts in a post-truth context. The rapid spread of mis- and disinformation online has further fueled populist narratives and deepened political polarization (Dmytro & Viacheslav, 2025). This political polarization is capturing young people’s attention, as extreme political movements from left to right gain ground across the world. Much research has been done on the negative impacts of digital platforms on youth (Amichai-Hamburger & Ben-Artzi, 2003; Markey & Daniels, 2022; Nilan et al., 2015; Parris et al., 2022). Yet, it is important to recognize the central role these platforms play in the lives of young people, especially as news and information sources (Meriläinen, 2022, 2025). Meanwhile, Americans’ trust in legacy news media, for example, is at a record low (Brenan, 2025). High awareness of mis- and disinformation has led to growing distrust in traditional news media. As a result, many young people are turning away from legacy TV news and increasingly consuming non-mainstream sources, reflecting a broader shift toward anti-establishment perspectives (Hameleers et al., 2022), which is exploited as a tool in information operations as part of cognitive warfare (Meriläinen, 2024, 2025).
Moreover, disinformation may not be considered while making decisions, especially in times of warfare when, for instance, Russian claims are portrayed as facts on social media and the Internet (Kanet, 2024). The power of digital platforms is that the content can be anything, from facts to fiction, and still appear as facts. Sikorski (2024) argues that hate speech threatens modern societies as Russia exploits the media to manipulate public opinion and destabilize political opponents. The low trust in institutions and the unregulated nature of the Internet and social media make people vulnerable to radicalizing narratives and global power disruption (Sikorski, 2024).
Combined with cognitive biases, algorithms reinforce echo chambers and hinder efforts to combat misinformation, playing a central role in political elections, as demonstrated, for instance, in the Cambridge Analytica case (Burton, 2023; Dmytro & Viacheslav, 2025; Heawood, 2018). Given this context of distrust and uncertainty, young people’s opinions about trust, and what they believe to be true, may vary with sociocultural background and national political history. Considering the contrasting histories of northeast and southwest Europe, this study compares youngsters’ perspectives on what they trust and believe to be true regarding political information and information operations, especially on digital platforms. The aim of this study is to examine how young people in Finland and Portugal understand trust and truth in political information on digital platforms, which are shaped by mis- and disinformation and information operations. The study argues that youth often rely on emotional and identity-based cues when evaluating the truthfulness and trustworthiness of information and actors online, leaving them vulnerable to manipulative content despite high confidence in their own critical media literacy. It contributes new comparative insights into how digital platforms, sociopolitical context, and media habits shape young people’s democratic engagement and exposure to information operations across different geographical regions and historical backgrounds. This study, therefore, seeks to answer the following research questions: (RQ1) What are young people’s perspectives about (un)truth and (dis)trust regarding the media and political information? (RQ2) How do Finnish and Portuguese youth perceive information operations and the future of democracy in the digital and influencer culture?

2. Literature Review

The literature review examines digital platforms, influencers, and information operations; together, these subsections build toward the study’s aim of understanding young people’s perceptions of trust and truth in media and political information, while addressing the research gap in how such sociotechnical dynamics shape young people’s vulnerability to information operations across different national contexts on opposite sides of Europe.

2.1. Digitality and Credible Content

Digital platforms are a central part of the lives of young people (Bibi et al., 2023; Livingstone, 2024; Melro, 2018). In Finland, young people’s trust in and relationship with legacy media are changing (Ehrlén et al., 2023; Meriläinen, 2022). Many rely on platforms like YouTube, Reddit, Instagram, TikTok, and Snapchat—as well as influencers and digitally active young politicians—for news and information (Klopfenstein Frei et al., 2024; Meriläinen, 2022, 2025). Similarly, young people in Portugal primarily consume news through social media, affecting their lives and beliefs (Feio & Oliveira, 2025). In this form of ‘ambient journalism’ (Hermida, 2010), traditional gatekeeping is bypassed, and platforms shape how youth access news, stay informed, and engage in society. However, these platforms also host problematic content, including widely spread disinformation (Aïmeur et al., 2023; Belotti et al., 2022; Meriläinen, 2022, 2024; Theocharis & de Moor, 2021). While digital media can foster participation, peer engagement, and minority empowerment, they are also spaces where cyberbullying, harassment, and hate speech thrive—issues exacerbated by the lack of accountability from tech companies regarding online human rights violations (Ahmed & Mizan, 2018; Craig et al., 2021; Garcia et al., 2020; Harasgama & Jayamaha, 2022; Sambasivan et al., 2019; Waisbord, 2020; Zalnieriute & Milan, 2019).
Content on digital platforms is consumed through selective framing. As Entman (1993) explains, framing involves emphasizing certain aspects of perceived reality to promote specific interpretations or solutions. At its core, framing refers to the placing of a selective emphasis on a particular aspect of the attributes of issues, actors, and events (Meriläinen, 2014). Digital platforms have enabled new agenda setters—such as influencers—to shape news and information flows, bypassing traditional media gatekeepers.

2.2. Influencers as the New(er) Agenda Setters

Influencers have gained significant access to social and political discourse through digital platforms, where they legitimize and amplify political messages (Goodwin et al., 2023; Meriläinen, 2025). Perceived as trustworthy by their followers, they often act as agenda setters—sometimes even unknowingly serving the interests of anonymous actors or adversary states (Goodwin et al., 2023; Meriläinen, 2024; Schouten et al., 2020). At their most influential, they function as issue owners with large, cross-platform audiences (Agarwal & Damle, 2020; Meriläinen, 2025). While influencers can enhance young people’s political engagement (Harff & Schmuck, 2023), they may also be exploited in information warfare involving AI, digital technologies, and algorithms (Meriläinen, 2025).
Algorithms play a key role in shaping the news youth encounter online, raising concerns about platforms like TikTok and their impact on trust in legacy media (Tuominen, 2025). For young people, aesthetics, fashion, and lifestyle heavily influence perceptions of political credibility (Meriläinen, 2024; Meriläinen et al., 2024; Parmelee et al., 2023). Political influencers—whether paid or unpaid—are content creators who advocate for social and political issues, seeking credibility, visibility, and influence, often in close connection with campaigns and the influencer economy (Goodwin et al., 2023). Their broad reach makes them valuable in information operations and a risk in national security contexts, where they can shape ideologies or even turn individuals against their own country (Meriläinen, 2025).
Influencers use selective framing differently from legacy media, largely due to the influence of ICT, AI, and algorithms, and their lack of accountability. As Banks (2001) notes, digital creators highlight certain aspects of reality, contributing to ideological shifts and polarization. Ditto et al. (2019) found that both liberals and conservatives tend to accept information that aligns with their beliefs, showing partisan bias. While influencers can strategically shape narratives and influence young audiences by sharing aligned values, they are not bound by journalistic ethics or fact-checking standards and are rarely held accountable for mis- or disinformation. Despite recognizing the unreliability of online content, young people often trust information shared by friends, family, influencers, or celebrities (Clark & Marchi, 2017; Meriläinen, 2022). Repetition across platforms can reinforce false beliefs (Meriläinen, 2022, 2024; Skurnik et al., 2005), especially in a convergent digital environment where users constantly switch between apps. This dynamic fosters competing narratives as media outlets and content creators vie for attention (Mejova et al., 2023).
When influencers are seen as credible by young people, their content, motives, funding, and affiliations are rarely questioned by followers (Meriläinen, 2024, 2025). Those who do question them risk being vilified or cancelled (Meriläinen, 2024, 2025). Unlike legacy media, which discloses authorship and follows journalistic standards, influencers and digital platforms often benefit from spreading provocative, viral content (Meriläinen, 2025; Samoilenko & Suvorova, 2023). In turn, influencers are used by adversary states in information warfare.

2.3. Information Operations in Online Warfare

In information operations, content, influencers, and audiences are strategically used as tools of warfare. These operations aim to influence adversaries’ decision-making and achieve strategic goals by addressing societal values and beliefs and manipulating opinions and behaviors (Meriläinen, 2024, 2025; Mozur et al., 2021; Weedon et al., 2017; Whyte et al., 2021). As sociotechnical phenomena, they rely on networks of actors and digital infrastructures to spread biased information (Arif et al., 2018; Singh & Sharma, 2022; Zarocostas, 2020), seeking to dominate individuals’ cognitive spheres and often linking to kinetic warfare with long-term effects (Marić, 2016; Meriläinen, 2025).
Starbird et al. (2019) emphasize the participatory nature of these operations, where online crowds help amplify messages. However, digital platforms—primarily tech companies—are not designed to ensure factual accuracy or support democratic values (Meriläinen, 2025; Pennycook & Rand, 2021). Consequently, adversary states and influencers use these platforms to spread disinformation and propaganda (Cohen & Bar’el, 2017; Meriläinen, 2025; Willett, 2022). As digital platforms have become widespread, mis- and disinformation have evolved into powerful political weapons (Karpf, 2020; Rid, 2021; Schia & Gjesvik, 2020; Singer & Brooking, 2019) and are capable of reaching vast audiences, posing a serious threat in modern warfare. They often involve coordinated use of AI, bots, algorithms, and digital content—ranging from fake accounts to memes and historical narratives—to influence public opinion and behavior (Eleferenko, 2023; Geissler et al., 2023; Meriläinen, 2024, 2025; Mozur et al., 2021; Titifanue et al., 2016; Weedon et al., 2017).

2.4. Influencers and Information Operations

Influencers have become global political figures, using relatable and casual communication styles to attract and engage young audiences (Balaban & Mustățea, 2019; Meriläinen, 2024; Sinha et al., 2023). Increasingly, they are also being used in modern military operations by various states (Meriläinen, 2025; Yao, 2023). Online information operations are usually preceded by careful planning and intelligence gathering through OSINT (open-source intelligence) and SOCMINT (social media intelligence), mostly to target young people (Giles, 2016; Putter & Henrico, 2022). According to Stewart et al. (2024) and Briant (2023), influencers employ relatable ideologies and language, significantly shaping young people’s political opinions through the entertaining content, such as videos and memes, that is used in information operations. Compared with the traditional agenda setting of legacy news media, influencers increase followers’ internal political efficacy and serve as relevant, trustworthy political agenda setters for young people (Harff & Schmuck, 2023; Schouten et al., 2020). They are thus suitable instruments in warfare because, in addition to promoting products, they legitimize and magnify knowledge, politics, and content on social media (Goodwin et al., 2023). Overall, influencers help engage young people in politics by offering important information and online news sources (Harff & Schmuck, 2023), while also serving as tools of information operations. In addition, hostile states can maintain plausible deniability by involving influencers in information operations (Fang, 2024).
Social media serves as fertile ground for the spread of mis- and disinformation, contributing to political polarization and the erosion of democratic institutions and trust in legacy media (van der Linden et al., 2020). While influencers support political communication, education, and engagement through digital platforms, they also risk amplifying radical ideologies and conspiracy theories (Riedl et al., 2021).

3. Materials and Methods

This study aimed to understand young people’s perceptions of media trust and truth in political information and information operations, comparing Portugal and Finland. The study followed a qualitative methodology, conducting semi-structured interviews with a convenience sample of young people between 18 and 25 years old. The interviews were conducted face-to-face between January 2024 and January 2025 in the north of Portugal and in western, middle, and eastern Finland. This paper is part of a broader study on information operations and warfare. The interviews followed a semi-structured script composed of four main themes (traditional media, truth, trust, and influencers), within which more specific questions were asked, as shown in Table 1.
The data were analyzed in NVivo using thematic analysis, with the coding categories derived from the data. The codebook in Appendix A includes the coding categories and the respective number of cases and references. The findings report the number of cases, articulated with participant quotes.

Participants

In total, there were 20 participants, 10 from each country (Portugal—PT and Finland—FI), as shown in Table 2. To preserve anonymity, the participants are described with an alphanumeric ID code that contains the country initials.
Regarding gender, there were 12 females (60%) and 8 males (40%) in total. Ages ranged between 18 and 25 years old, with Finnish participants still in high school, vocational school, or completing a double degree combining high school and vocational school simultaneously, while Portuguese participants were already attending higher education. Despite these different educational backgrounds, the focus of this study is on the cross-country comparison, specifically between northeastern and southwestern Europe. The mean age was M = 20.6; SD = 1.7.

4. Results

4.1. Media Uses and (Dis)Trust

Participants from both countries access the news on a regular basis (Figure 1), mostly on social media (n = 15; 75%), followed by TV (n = 12; 60%) and print or online newspapers (n = 9; 45%). However, some youngsters seem confused about what ‘traditional media’ means. As one participant states: “I don’t really know what traditional media means. When I was born, the Internet already existed” (FI6). They seem to understand traditional media as offline rather than online news. Hence, many did not say they followed traditional media, but when asked directly, the majority claimed to access mainstream media news on social media, mainly on Instagram and X, where they usually click on links to access the news online. On the other hand, others were able to see the media as a continuum between online and offline: “I see the media as a whole. I don’t separate newspapers or social media from each other. Of course, it’s easy to go to social media, because you don’t have to pay for it” (FI8). Most news is accessed on social media and, in some cases, by reading online newspapers or browsing the web.
There are some differences in media use. For instance, TV news is watched more by Portuguese participants (n = 10) than by Finnish participants (n = 2), because they believe it is more trustworthy and easier to access than other types of media: “Yes, on television. (…) It’s more practical, and it’s also where I have easier access” (PT10). For some Portuguese participants, watching TV is part of their daily routine “in moments of leisure and family gatherings at the dining table” (PT4), as opposed to “never sitting in front of the TV and watching the news at any particular fixed hour” (FI2), because on the Internet they can access it anytime. As for the radio, some participants tune in occasionally, but primarily to listen to music: “I do listen to radio that predominantly plays music and sometimes talks about current and political issues” (FI1). Another Finnish participant added that they only listen to music because they find the news quite “heavy” (FI4), which they believe “drives young people away from the news” (FI4). This is also shared by another participant who perceives news stories to be negative and therefore a “burden to young people (…) even though they may be true” (FI9). For those interested in following current events on the radio, online podcasts allow them to get a weekly news summary (PT3) or listen to special talks and debates (FI2). Still, four participants (two from each country) said they do not follow the news on a regular basis (daily) but end up receiving updates on social media.
In terms of media trust (Figure 2), legacy media (TV and newspapers) were perceived as more trustworthy by Portuguese participants, while Finnish participants trusted online/social media sources more. Many participants (70%, n = 14) referred to specific TV channels as the ones they trust the most. In the Portuguese sample, young people mostly trusted TV stations such as RTP (public broadcaster), SIC, TVI, and CNN. In the Finnish sample, the trusted TV stations were Yle (public broadcaster), BBC, Fox News, and CNN. Both samples regarded public broadcasters as trustworthy. Even though most participants said they often watch TV news (60%) because they find it trustworthy, when it comes to online sources (search engines like Google, platforms like Wikipedia, etc.) and social media news (TikTok, Instagram, YouTube, Reddit, Discord, etc.), the gap between news access and news trust is wider. Despite social media being where most participants get their news (75%), trust in social media news (n = 4) and in online news from browsers or other platforms (n = 4) was expressed only by the Finnish sample.
Similar to TV news trust, trusting newspapers (55%, n = 11) is aligned with those who mostly access newspapers online. Compared to the Finnish sample (n = 2), the Portuguese sample (n = 9) showed a higher level of newspaper trust. For the Portuguese participants, trusted newspapers were Jornal de Notícias, Expresso, and Público, while Finnish participants said they placed more trust in Helsingin Sanomat (Hesari) and the tabloid Iltalehti. This participant explains why they trust and read online newspapers:
In my case, when something happens, I go straight to the newspaper. I don’t see it on social media, because I don’t follow any newspapers on social media either. So, I end up seeing it in the online newspaper. And I think it ends up being more trustworthy. Because they are not so concerned with the newspaper’s views but rather with passing on the information that needs to be passed on (PT3).
As for printed newspapers, one participant said they were more trustworthy and “immune” compared with the immediacy of television, radio, and online information: “We look and know that it is truly correct. In printed newspapers, there may be more work done in terms of checking before printing” (PT1). In terms of radio trust, only two participants mentioned trusting podcasts. Furthermore, trust in streaming platforms like Netflix, HBO, and Yle Areena was mentioned by three Finnish participants, which underscores the importance of entertainment culture, with TV series, movies, and documentaries acting as a trusted environment for young people. Similarly, some participants view humor and entertainment as reliable (n = 4), trusting specific comedians and perceiving infotainment as valid information sources: “Trust and entertainment do not exclude each other. (…) Can’t serious news be presented in an entertaining way? They are easy to follow” (FI9). This is also shared by another participant who feels they trust and “learn more through memes than from the news” (FI4).
Besides trusting friends and family, participants said they trust specific journalists, politicians, scientists, philosophers, familiar faces, influencers, or “someone who looks and can communicate in an intelligent manner” (FI1). Most respondents believe younger journalists are more trustworthy than their older counterparts because they are more like them; these participants report ‘outdated’ political views among older journalists and call for collaboration between journalists and young people, since “journalists and young people are needed in journalism” (FI1). For them, older people “cannot understand young people and bring them truths” (FI5). Alternatively, one participant viewed older people as more reliable due to their experience and familiarity:
I remember perfectly watching RTP1 when I was little, and now too; and he remains in that position, so he also gives us that level of credibility. There are also people aged in their 40s or 50s, which I think helps a little. I think if they were younger, like our age, we would be less convinced (PT1).
In terms of distrust, half the sample revealed suspicion towards mainstream media outlets (50%, n = 10), such as Helsingin Sanomat (Hesari) and Yleisradio (Yle) in Finland and CMTV in Portugal. Furthermore, critiques were built around social media (n = 8), which many associate with the spread of disinformation. For instance: “There is a lot of bullying, lies, and fake news on social media. Anyone can write what they want, and this is why you cannot trust social media” (FI8). Despite being aware of false information on digital platforms, young people use them to keep updated about current events:
All media can be distorted, and not everything should be believed. In the media, a lot of images are edited, and texts may be falsified. Celebrities modify Instagram photos and can even make them completely different from what the photo was originally (…). Publishing them on multiple platforms does not make them trustworthy (…). If I could decide, I would completely delete the existence of TikTok, but I can’t, because there is so much news there that other media doesn’t have (FI8).
Additionally, some Portuguese participants were distrustful of the main right-wing candidate running for the elections, addressing his manipulative style of promoting the party, especially on TV and social media. Others were more extreme, claiming not to trust “almost anyone” (PT8).

4.2. Truths, Untruths, and Post-Truths

Crossovers were found between the concepts of ‘truth’ and ‘trust’, with participants using both terms interchangeably. In these findings, the coded answers reveal what participants considered to be ‘truth’ providers. For two Portuguese participants, TV news in general is what they believe presents truthful information. For Finnish participants, by contrast, information from friends and online sources is seen as truthful, especially from social media, influencers, and platforms like Google and Wikipedia. Among the Finnish responses, there were nuances of a post-truth narrative, in which truth is aligned with feelings or beliefs. For instance: “If you come across the right information, i.e., something that feels right and comes across in a credible way (…)” (FI9). This almost touches on intuition versus being presented with factual information, as this participant describes: “I can feel when I do not trust someone. It is just a strong feeling” (FI2). In these cases, truth means speaking in a language they understand and, most importantly, being on their “side” (FI3):
You can sense if something is true. You see who speaks or writes, and you know instantly if they are telling the truth. If you have doubts about it, you search for the person and ask your friends about it (FI7).
When asked about how they describe information operations, participants revealed a certain level of awareness about the motives related to mis- and disinformation spread. Many defined disinformation as incorrect, distorted, or incomplete information that is intentionally delivered to gain public attention, generate profit, get views, manipulate, or provoke interest in certain political parties. For instance:
Disinformation for me is (…) a process in which we are involved in a bubble (…) of fake news or news that is not 100% correct, that is altered to generate more enthusiasm among the population or concern, sometimes, that is not correct and is not true (PT1).
This includes the idea that media sensationalism contributes to disinformation since “we live in a time where people only want clicks/likes/visibility on posts and therefore distort/exaggerate things to create visibility” (FI8). Furthermore, “in the media, a lot of images are edited, and texts may be falsified” (FI8), especially using AI technology to fabricate content and disseminate it on digital platforms. While Portuguese participants discuss disinformation in broader terms regarding political manipulation, Finnish participants describe information operations in the context of the delicate relationship between Finland and Russia, showing an awareness of Russia spreading false information with the intent to ‘invade’ Finland (FI8):
I’m sure Russia is influencing us. There has been a lot of talk on the Internet that Russia is trying to influence us. They break cables in the Baltic Sea through Chinese ships, and my online bank didn’t work. Who else but Chinese bots and the Kremlin would be responsible for that? (FI7).
In some cases, participants believed they were not being influenced or manipulated by military or political forces, especially because they thought these operations were targeted at adults or decision-makers. This is also the case for the majority of participants (60%) who did not see themselves as targets of disinformation. Many of these youngsters considered themselves to be ‘digital natives’, so they thought they knew ‘instantly’ when news is reliable or not, claiming that older people are more likely to believe fake news, since they also have less “capacity for research” (PT6):
I think older people are more prone to disinformation, mainly because our grandparents, for example, who never had access to the Internet in their adolescence and childhood, when they come across information, tend to believe in everything they see on social media, mainly (PT1).
Still, when they elaborated on the verification of content, they sometimes used expressions like ‘sense’ or ‘feel’, aligned with the post-truth narrative:
I’m a net native, and I can spot fake news from afar. Relatable news and information feels real (FI1).
Fake news can be recognized because it doesn’t feel real. That’s the feeling…we’re Internet natives, so we know how to identify false information (FI9).
I don’t know if they target influencing young people. Somehow, I feel that they influence adults first, for example, politicians and decision-makers (FI8).
Since they viewed themselves as ‘digital natives’, they perceived media and digital literacy as a ‘natural’ predisposition or trait that they were born with, rather than skills they learn and develop through media education.

4.3. From Influencer Culture to Political Beliefs and Views on Democracy

In understanding how participants perceive the formation of their beliefs, especially regarding political ideologies and opinions, the findings reveal that the surrounding environment strongly influences young people’s opinions (Figure 3), in particular friends, family, and cultural/religious beliefs (35%, n = 7). Family members play an important role in debating politics “because there is a lot of discussion between (…) like ‘who do you support? Such party and you?’… and then there is a bit of a clash of ideas and ideologies” (PT7). Additionally, friends sharing the same views are central in young people’s lives because they “are understanding (…) maybe the people closest to our stage of life (…) who are living at the same time as us” (PT10). Another participant reinforces the importance of religious beliefs: “Christian values and my relationship with God and my ancestors are important because they have fought for this country at the risk of their lives” (FI3). In the immediate surroundings, contributions from teachers or educators (10%, n = 2) are also valued: “I can say that he [the teacher] helped me understand things about other parties that I didn’t know. And I may have even changed my mind about some things” (PT4).
Digital influencers also play an important role in participants’ opinion-making and ideology (35%, n = 7). Some get guidance from influencers on social media, where “you can find so many truths that you cannot from the mainstream media” (FI3). Young influencers are generally preferred because “I feel like they’re on my side when no one else in this country is” (FI5). Taking a more extreme position, one participant explained their appreciation for certain influencers: “I love how he [right-wing influencer] tells every leftist to f*** off in an intelligent manner. How he yells at the protesters. He is a hero, not the elitist politicians who are owned by Soros and Brussels (EU)” (FI3). Young people follow influencers not only to get informed or learn how to do things, but also to form their opinions on certain matters:
I watch [an influencer’s] podcast weekly, and he ends up making a summary of the most important events of the week. And talk more about them, ending up giving a little bit of their opinion too. But this way I can also stay up-to-date with what’s going on in a lighter way (PT3).
Most participants were aware of influencer culture, highlighting the motives behind influencers’ media production practices. Many stated that influencers mix informative and entertaining content with advertising, so their motives are mostly driven by money and followers:
I think it’s more the financial part, because they really want to become bigger, and I think the financial part speaks louder. So sometimes you’ve heard that they partner with brands they don’t even like, or with ideas they don’t even support, because they must or want to. Will it be financial, or will it be audience power… followers, more protagonism, and going with the majority’s idea, even if they don’t believe in it (PT1).
When asked ‘who influences the influencers’, some pointed to the news media and other influencers as reference sources, claiming that influencers cannot serve as vehicles of information operations because they have their own will, even though sponsorships may compromise that independence. One person noted how important it is for influencers to follow public opinion amid cancel culture: “…they are afraid of being cancelled (…). So, they convey exactly the idea that people agree” (PT8). Participants shared the belief that the more followers influencers have, the greater their impact on audiences: “I think that within politics, people look for whoever has the most influence and whoever has the most views to try to reach the youth” (PT6). Otherwise, “someone from a small organization or an influencer without followers—I don’t trust them at all” (FI2). Journalists and news media outlets (25%, n = 5) also affect young people’s political opinions, with some saying that “news media has an impact on my thoughts. Journalists have a lot of power over what I think” (FI8). Another participant perceives that this influence depends on the storytelling: “I can be influenced by a journalist who knows how to make news that interests me. Big news stories are interesting if they have an addictive story” (FI9). Another perspective concerns the comments section of social media news, where participants feel influenced by the opinions of other users: “When you read the news, you will almost automatically be reading the opinions [comments]. And whether we like it or not, this ends up influencing our perspective a little” (PT8). Despite that, many participants claimed they do not let themselves be influenced by anyone (30%, n = 6), because they “have very strong critical thinking skills” (FI1). They said they follow their own opinion regardless, even though they inform themselves about the parties and listen to those closest to them:
It’s only been two years since I started voting, and I want to start forming my own opinion. And I listen, obviously, to my family, but I also always want to try to have my own ideas… (PT10).
The social environment and the media are important in shaping young people’s political orientation and views on democracy. Although a few participants expressed no interest in politics, political orientations ranging from left-wing to centrist to right-wing were identified in their discourses. The cluster analysis of the coding references, conducted in NVivo (Figure 4), shows an alignment of perspectives according to political orientation. For instance, those who lean towards the left are more concerned about human rights and fear the impact of technology or military invasion, while those leaning towards the right are more protective of the nation and young people.
For Finnish participants, the future of democracy is tied to their proximity to Russia and a history of invasion, a concern heightened by the war in Ukraine. Most of these participants fear a military invasion by Russia:
Historically, Russia has had a need to own Finland. Even though there are problems in Finland, such as growing street gangs, roadman clowns, and difficulties in accessing health care, I want to protect Finland. NATO membership was a good thing, but the war scares me… (FI8).
This fear seems exacerbated when youngsters find it hard to trust politicians, claiming “politicians are a joke” and that “if [Russian] nuclear missiles come over Finland, then the whole world is screwed” (FI4). Relatedly, the rise of extreme right- and left-wing leaders concerns participants, who fear the consequences of political polarization in social media campaigns. One participant recognized the strategy of luring those without strong political convictions: “(…) people then choose extremism because they are tired of always listening to the same things, because there are more and more people without an opinion, who go with the flow” (PT10).
Online campaigns, algorithms, filter bubbles, and influencers are pointed out as factors that may interfere with the future of democracy in a tech-driven world. In addition, cancel culture can intensify the fear of voicing views that diverge from the crowd:
I think the fear of cancellation is my fear that…other people are afraid of it. That is, to invalidate your opinion in order to have the opinion of the group. It’s going with the flow. I think it’s much more psychological that people go crazy. And then the filter bubbles too. Because I think people get so caught up in that world because of the algorithm; that’s all that appears to us, and we end up not even having the perspective of other parties (PT3).
In contrast, technologies such as artificial intelligence and machine learning are perceived as enabling more democratic tools: “Bitcoin and apps are good examples of democratic tools that can help strengthen democracies” (FI1). Overall, both samples also expressed concerns about safeguarding human rights in the future. In particular, they worry about racism, sexual harassment and bullying, women’s rights, minorities, global safety, the environment, and animal rights. For example, “I believe that girls and women will be subjected to brutality in the coming years. Some see no value in women and girls” (FI2). Conversely, one right-wing participant feels “that feminism and woke have gone too far”, claiming not to have “a strong belief in the state of democracy, because part of the population has been silenced in Finland as well” (FI6).
It is important to highlight that some participants felt that most decisions about the future do not include young people, claiming the narratives about democracy are adult-centric:
We may not be able to express things in the right language, but we have a lot to say. I do not feel that we are involved in decision-making and that we are not included in democracy. Adults make decisions past us (FI9).
In addition, they believe society puts pressure on youth to solve future problems.

5. Discussion

Drawing on qualitative data from young people in Finland and Portugal, this study has revealed a complex mix of perspectives and practices regarding media information and political opinions. A particularly noteworthy finding is that many young participants did not perceive themselves as targets of information operations or disinformation, expressing strong confidence in their ability to identify reliable news and actors. This self-assured stance is rooted in the “digital native” narrative and creates a paradox: young people believe they are resistant to manipulation while simultaneously relying on emotionally driven cues, influencers, and entertainment content that can expose them to subtle forms of influence by adversarial actors such as Russia and China. By attributing susceptibility to disinformation primarily to older generations, participants overlook the ways algorithms, sociotechnical infrastructures, and covert information strategies specifically target young audiences. This misplaced confidence highlights a critical vulnerability that warrants stronger emphasis in future studies, as it illustrates how overestimating one’s digital literacy may increase rather than reduce exposure to information operations and cognitive warfare across digital platforms.
Regarding media trust, Portuguese youth still trust legacy media as a news source to some degree, watching TV news more regularly than their Finnish counterparts, largely as a result of family habits. While both samples use digital platforms the most to access information, online news is perceived as less reliable than legacy media outlets (mostly TV and newspapers), especially due to disinformation spread on digital platforms such as TikTok and Instagram. This was more dominant in the Portuguese sample and is consistent with previous studies showing a paradox of trusting legacy media while using online media to get informed (Costera Meijer, 2007; Melro & Pereira, 2023). Moreover, the findings revealed a misunderstanding of what ‘traditional media’ means to some participants, who believe that what they access online is not ‘traditional’. In fact, most participants use online media to access mainstream news, with only a minority claiming not to get informed on a regular basis. Overall, young people are generally engaged with current events and use digital platforms to stay informed.
In exploring perceptions of truth, this study found that participants’ notions of truthful information and of trust are intertwined, both appealing to the feelings, values, and identities of youth. ‘Trust’ and ‘truth’ are therefore used interchangeably, both being related to emotions and beliefs (Hannan, 2018; Quintana-Paz, 2018). For Finnish participants in particular, information from influencers and friends was seen as truthful and therefore reliable, revealing a post-truth narrative in which ‘truthful’ means what they believe and what feels right, rather than what is supported by fact-based information. Age also plays a role in perceived trustworthiness, with some participants placing more trust in content delivered by young journalists or influencers, as they feel less understood by older actors. Furthermore, entertainment content (memes, comedians, influencers, streaming platforms) was seen as a reliable source by some participants. While entertainment has been reported as an important but controversial news source for youth for decades (Marchi, 2012; Melro, 2018), the viral spread of biased content online can enhance the effectiveness of information operations by adversarial actors in building young people’s trust through content they relate to. Moreover, participants confidently stated that they do not see themselves as targets of information operations, as they view themselves as ‘digital natives’ who can easily spot mis- and disinformation. This misconception of Prensky’s (2001) label places youth in a vulnerable position, ignoring the challenges they face with technology and information in the digital age (Evans & Robertson, 2020). Participants believed that only older people, whom they regard as lacking the digital and media skills to verify information, fall for disinformation, thereby placing themselves outside the reach of information operations and leaving youth an easy target for adversarial actors.
Despite viewing themselves as media- and digitally literate, young participants did not question the sources they trust (their funding, their networks, or who influences them), nor the influence of AI and algorithms in framing and setting the agenda of online information. These results echo previous findings (Meriläinen, 2024, 2025). This allows digital influencers to act as vehicles of information warfare while remaining relatable and trustworthy to younger audiences (Schouten et al., 2020). Meanwhile, trust in Russia was low among Finnish youth, most of whom feared a Russian invasion, especially given the current war in Ukraine. This fear was largely absent among the Portuguese participants, partly because of their geographical distance. Overall, young people held both right- and left-leaning ideologies, with some fearing the impact of political polarization, technology, and cancel culture on democracy in their countries.

6. Conclusions

This study highlights how young people’s perceptions of trusted and truthful information are closely tied to their feelings and beliefs, revealing their vulnerability to information operations. Two key issues emerge. First, media education is more critical than ever in a digital age where algorithms, AI, and the viral spread of disinformation exploit young people’s overconfidence in their digital literacy, making targeted media literacy education for youth urgent (de los Santos et al., 2025). Corneille et al. (2020) suggest orienting individuals towards evaluations of fakeness instead of evaluations of truth. Second, digital platforms and tech companies need to be held accountable for information bias, hate speech, and data privacy violations, with stronger regulations and policies worldwide. Understanding the exposure of youth to information operations on digital platforms is critical for safeguarding democratic resilience in an era dominated by algorithmic influence and viral disinformation.
A key limitation of this study is that only 20 young people were interviewed, which restricts the generalizability of the findings. The difference in educational levels between the two country samples is another limitation that should be addressed in further research. In the next phase of this research, a larger and more diverse group of young participants will be interviewed to further examine and validate the phenomena identified in this study.

Author Contributions

Conceptualization, N.M.; methodology, A.M.; software, A.M.; validation, A.M. and N.M.; formal analysis, A.M.; investigation, N.M. and A.M.; resources, N.M. and A.M.; data curation, N.M. and A.M.; writing—original draft preparation, N.M. and A.M.; writing—review and editing, N.M. and A.M.; visualization, N.M. and A.M.; supervision, N.M. and A.M.; project administration, N.M.; funding acquisition, N.M. All authors have read and agreed to the published version of the manuscript.

Funding

In Finland, this research was funded by the Kone Foundation under the project “Kuljeksivat teinit: eriarvoisuuden tuottaminen “amisnuoriin” liittyvässä kielenkäytössä ja laajemmin suomalaisessa yhteiskunnassa”, with an amount of 248,100 EUR. In Portugal, this research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and the EU GDPR. Ethical review and approval were waived for this study due to the minimal risk to participants who voluntarily participated in the study.

Informed Consent Statement

Written informed consent has been obtained from the participants to take part and publish this paper.

Data Availability Statement

The data presented in this study are available on request from the corresponding author due to privacy reasons.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

Figure A1. Codebook of the thematic analysis.

References

  1. Agarwal, S., & Damle, M. (2020). Sentiment analysis to evaluate influencer marketing: Exploring to identify the parameters of influence. PalArch’s Journal of Archaeology of Egypt/Egyptology, 17(6), 4784–4800. [Google Scholar]
  2. Ahmed, S. I., & Mizan, S. A. (2018). Silencing the minority through domination in social media platform: Impact on the pluralistic Bangladeshi society. In ELCOP yearbook of human rights 2018. Available online: https://ssrn.com/abstract=3326478 (accessed on 25 January 2025).
  3. Aïmeur, E., Amri, S., & Brassard, G. (2023). Fake news, disinformation and misinformation in social media: A review. Social Network Analysis and Mining, 13(1), 30. [Google Scholar] [CrossRef]
  4. Albert, C. D., Aleroud, A., Yang, Y., Melhem, A., & Rutland, J. (2023). Twitter propaganda operations: Analyzing sociopolitical issues in Saudi Arabia. Social Media + Society, 9(4), 20563051231216964. [Google Scholar] [CrossRef]
  5. Amichai-Hamburger, Y., & Ben-Artzi, E. (2003). Loneliness and Internet use. Computers in Human Behavior, 19(1), 71–80. [Google Scholar] [CrossRef]
  6. APA. (2023, October). Misinformation and disinformation. American Psychological Association. Available online: https://www.apa.org/topics/journalism-facts/misinformation-disinformation (accessed on 3 March 2025).
  7. Arif, A., Stewart, L. G., & Starbird, K. (2018). Acting the part. Proceedings of the ACM on Human-Computer Interaction, 2(CSCW), 1–27. [Google Scholar] [CrossRef]
  8. Balaban, D., & Mustățea, M. (2019). Users’ perspective on the credibility of social media influencers in Romania and Germany. Romanian Journal of Communication and Public Relations, 21(1), 31–46. [Google Scholar] [CrossRef]
  9. Banks, J. A. (2001). Citizenship education and diversity. Journal of Teacher Education, 52(1), 5–16. [Google Scholar] [CrossRef]
  10. Belotti, F., Donato, S., Bussoletti, A., & Comunello, F. (2022). Youth activism for climate on and beyond social media: Insights from FridaysForFuture-Rome. The International Journal of Press/Politics, 27(3), 718–737. [Google Scholar] [CrossRef]
  11. Bibi, D., Noreen, D., & Nawaz, M. (2023). Social media concerns faced by university students during COVID-19. Journal of Positive School Psychology, 7(1), 653–665. [Google Scholar]
  12. Brenan, M. (2025, October 2). Trust in media at new low of 28% in U.S. Gallup. Available online: https://news.gallup.com/poll/695762/trust-media-new-low.aspx (accessed on 3 March 2025).
  13. Briant, E. (2023). Ethics in dystopia? Digital adaptation and US military information operations. The Journal of Intelligence, Conflict, and Warfare, 5(3), 136–140. [Google Scholar] [CrossRef]
  14. Burton, J. (2023). Algorithmic extremism? The securitization of artificial intelligence (AI) and its impact on radicalism, polarization and political violence. Technology in Society, 75, 102262. [Google Scholar] [CrossRef]
  15. Clark, L. S., & Marchi, R. (2017). Young people and the future of news. Cambridge University Press. [Google Scholar] [CrossRef]
  16. Cohen, D., & Bar’el, O. (2017). The use of cyberwarfare in influence operations. Available online: https://en-cyber.tau.ac.il/sites/cyberstudies-english.tau.ac.il/files/media_server/cyber%20center/cyber-center/Cyber_Cohen_Barel_ENG.pdf (accessed on 26 January 2025).
  17. Corneille, O., Mierop, A., & Unkelbach, C. (2020). Repetition increases both the perceived truth and fakeness of information: An ecological account. Cognition, 205, 104470. [Google Scholar] [CrossRef]
  18. Costera Meijer, I. (2007). The paradox of popularity: How young people experience the news. Journalism Studies, 8(1), 96–116. [Google Scholar] [CrossRef]
  19. Craig, S. L., Eaton, A. D., McInroy, L. B., Leung, V. W. Y., & Krishnan, S. (2021). Can social media participation enhance LGBTQ+ youth well-being? Development of the social media benefits scale. Social Media + Society, 7(1), 2056305121988931. [Google Scholar] [CrossRef]
  20. de los Santos, T., Smith, E., & Johnson, J. (2025). ‘Straight to the source’: How Teens’ experiences shape their understanding of and expectations for news. Electronic News, 19(4), 195–213. [Google Scholar] [CrossRef]
  21. Ditto, P. H., Liu, B. S., Clark, C. J., Wojcik, S. P., Chen, E. E., Grady, R. H., Celniker, J. B., & Zinger, J. F. (2019). At least bias is bipartisan: A meta-analytic comparison of partisan bias in liberals and conservatives. Perspectives on Psychological Science, 14(2), 273–291. [Google Scholar] [CrossRef] [PubMed]
  22. Dmytro, G., & Viacheslav, S. (2025). Infodemics and populism in the digital age: Threats to political stability and security challenges. Society and Security, 2(8), 61–71. [Google Scholar] [CrossRef]
  23. Ehrlén, V., Talvitie-Lamberg, K., Salonen, M., Koivula, M., Villi, M., & Uskali, T. (2023). Confusing content, platforms, and data: Young adults and trust in news media. Media and Communication, 11(4), 320–331. [Google Scholar] [CrossRef]
  24. Eleferenko, A. (2023). How can digital diplomacy reconcile Russia and the West? IE International Policy Review (IPR), 4(1), 1–20. Available online: https://ipr.blogs.ie.edu/ (accessed on 4 March 2025).
  25. Entman, R. M. (1993). Framing: Toward clarification of a fractured paradigm. Journal of Communication, 43(4), 51–58. [Google Scholar] [CrossRef]
  26. Evans, C., & Robertson, W. (2020). The four phases of the digital natives debate. Human Behavior and Emerging Technologies, 2(3), 269–277. [Google Scholar] [CrossRef]
  27. Fang, K. (2024). Wangbao (cyberbullying) and Jubao (reporting): Collaborative state-society online influence operations in China. Journal of Online Trust and Safety, 2(3). [Google Scholar] [CrossRef]
  28. Feio, C., & Oliveira, L. (2025). Scrolling through the feed—How do young people in Portugal consume news on social media? Observatorio (OBS*), 19(5), 1–15. [Google Scholar] [CrossRef]
  29. Garcia, P., Fernández, C. H., & Jackson, A. (2020). Counternarratives of youth participation among black girls. Youth & Society, 52(8), 1479–1500. [Google Scholar] [CrossRef]
  30. Geissler, D., Bär, D., Pröllochs, N., & Feuerriegel, S. (2023). Russian propaganda on social media during the 2022 invasion of Ukraine. EPJ Data Science, 12(1), 35. [Google Scholar] [CrossRef]
  31. Giles, K. (2016). Handbook of Russian information warfare. Available online: https://css.ethz.ch/content/dam/ethz/special-interest/gess/cis/center-for-securities-studies/resources/docs/NDC%20fm_9.pdf (accessed on 24 May 2025).
  32. Goodwin, A., Joseff, K., Riedl, M. J., Lukito, J., & Woolley, S. (2023). Political relational influencers: The mobilization of social media influencers in the political arena. International Journal of Communication, 17(2023), 1613–1633. Available online: http://ijoc.org/index.php/ijoc/article/view/18987 (accessed on 3 March 2025).
  33. Hameleers, M., Brosius, A., & de Vreese, C. H. (2022). Whom to trust? Media exposure patterns of citizens with perceptions of misinformation and disinformation related to the news media. European Journal of Communication, 37(3), 237–268. [Google Scholar] [CrossRef]
  34. Hannan, J. (2018). Trolling ourselves to death? Social media and post-truth politics. European Journal of Communication, 33(2), 214–226. [Google Scholar] [CrossRef]
  35. Harasgama, K., & Jayamaha, S. (2022, October 11). Violation of human dignity through online harassment: A Case for stronger protection of privacy in Sri Lanka. SLIIT International Conference on Advancements in Sciences and Humanities (pp. 112–119), Virtual Event. [Google Scholar] [CrossRef]
  36. Harff, D., & Schmuck, D. (2023). Influencers as empowering agents? Following political influencers, internal political efficacy and participation among youth. Political Communication, 40(2), 147–172. [Google Scholar] [CrossRef]
  37. Heawood, J. (2018). Pseudo-public political speech: Democratic implications of the Cambridge analytica scandal. Information Polity, 23(4), 429–434. [Google Scholar] [CrossRef]
  38. Hermida, A. (2010). Twittering the news: The emergence of ambient journalism. Journalism Practice, 4(3), 297–308. [Google Scholar] [CrossRef]
  39. Kalpokas, I. (2017). Information warfare on social media: A brand management perspective. Baltic Journal of Law & Politics, 10(1), 35–62. [Google Scholar] [CrossRef]
  40. Kanet, R. E. (2024). Moscow and the world: From soviet active measures to Russian information warfare. Applied Cybersecurity & Internet Governance, 3(1), 34–57. [Google Scholar] [CrossRef]
  41. Karpf, D. (2020). How digital disinformation turned dangerous. In W. L. Bennett, & S. Livingston (Eds.), The disinformation age (pp. 153–168). Cambridge University Press. [Google Scholar] [CrossRef]
  42. Klopfenstein Frei, N., Wyss, V., Gnach, A., & Weber, W. (2024). “It’s a matter of age”: Four dimensions of youths’ news consumption. Journalism, 25(1), 100–121. [Google Scholar] [CrossRef]
  43. Livingstone, S. (2024). Reflections on the meaning of “digital” in research on adolescents’ digital lives. Journal of Adolescence, 96(4), 886–891. [Google Scholar] [CrossRef] [PubMed]
  44. Marchi, R. (2012). With Facebook, blogs, and fake news, teens reject journalistic “objectivity”. Journal of Communication Inquiry, 36(3), 246–262. [Google Scholar] [CrossRef]
  45. Marić, S. (2016). Information society and information operations. Media Dialogues, 10(1), 115–136. Available online: https://www.researchgate.net/publication/378042824 (accessed on 3 March 2025).
  46. Markey, C. H., & Daniels, E. A. (2022). An examination of preadolescent girls’ social media use and body image: Type of engagement may matter most. Body Image, 42, 145–149. [Google Scholar] [CrossRef]
  47. Mejova, Y., Kalimeri, K., De, G., & Morales, F. (2023). Authority without care: Moral values behind the mask mandate response. arXiv. [Google Scholar] [CrossRef]
  48. Melro, A. (2018). The dis(interest) of young people in current events: Study on the role of the media in the information about the world [Doctoral thesis, Universidade do Minho]. [Google Scholar]
  49. Melro, A., & Pereira, S. (2023). “Seeing but not believing”: Undergraduate students’ media uses and news trust. International Journal of Communication, 17, 993–1018. [Google Scholar]
  50. Meriläinen, N. (2014). Understanding the framing of issues in multi-actor arenas: Power relations in the human rights debate [Doctoral thesis, University of Jyväskylä]. [Google Scholar]
  51. Meriläinen, N. (2022). “I find this really entertaining”—First look of the relationship between vocational school students and various media. On the Horizon, 30(2), 57–81. [Google Scholar] [CrossRef]
  52. Meriläinen, N. (2024). The possible role of digital platforms in information operations. European Conference on Social Media, 11(1), 137–143. [Google Scholar] [CrossRef]
  53. Meriläinen, N. (2025). Influencers as tools in hybrid operations online. International Conference on Cyber Warfare and Security, 20(1), 265–272. [Google Scholar] [CrossRef]
  54. Meriläinen, N., Ortbals, C. D., & Strachan, J. C. (2024). By the looks of her she is not credible: Sanna Marin and fashion’s influence on credibility. In K. M. Kedrowski, C. D. Ortbals, L. Poloni-Staudinger, & J. C. Strachan (Eds.), The Palgrave handbook of fashion and politics (pp. 183–198). Palgrave Macmillan. [Google Scholar] [CrossRef]
  55. Mozur, P., Zhong, R., Krolik, A., Aufrichtig, A., & Morgan, N. (2021, December 13). Inside a Chinese propaganda campaign: How Beijing influences the influencers. The New York Times. Available online: https://www.nytimes.com/interactive/2021/12/13/technology/china-propaganda-youtube-influencers.html (accessed on 12 May 2025).
  56. Nilan, P., Burgess, H., Hobbs, M., Threadgold, S., & Alexander, W. (2015). Youth, social media, and cyberbullying among Australian youth: “Sick friends”. Social Media + Society, 1(2), 2056305115604848. [Google Scholar] [CrossRef]
  57. Parmelee, J. H., Perkins, S. C., & Beasley, B. (2023). Personalization of politicians on Instagram: What generation Z wants to see in political posts. Information, Communication & Society, 26(9), 1773–1788. [Google Scholar] [CrossRef]
  58. Parris, L., Lannin, D. G., Hynes, K., & Yazedjian, A. (2022). Exploring social media rumination: Associations with bullying, cyberbullying, and distress. Journal of Interpersonal Violence, 37(5–6), NP3041–NP3061. [Google Scholar] [CrossRef] [PubMed]
  59. Pennycook, G., & Rand, D. G. (2021). The psychology of fake news. Trends in Cognitive Sciences, 25(5), 388–402. [Google Scholar] [CrossRef]
  60. Prensky, M. (2001). Digital natives, digital immigrants part 1. On the Horizon, 9(5), 1–6. [Google Scholar] [CrossRef]
  61. Putter, D., & Henrico, S. (2022). Social media intelligence: The national security–privacy nexus. Scientia Militaria, 50(1), 19–44. [Google Scholar] [CrossRef]
  62. Quintana-Paz, M. Á. (2018). Post-truth as a feature of hypermodern times. Edukacja Filozoficzna, 66(2), 143–161. [Google Scholar] [CrossRef]
  63. Rid, T. (2021). Active measures: The secret history of disinformation and political warfare. Profile Books. [Google Scholar]
  64. Riedl, M., Schwemmer, C., Ziewiecki, S., & Ross, L. M. (2021). The rise of political influencers—Perspectives on a trend towards meaningful content. Frontiers in Communication, 6, 752656. [Google Scholar] [CrossRef]
  65. Sambasivan, N., Batool, A., Ahmed, N., Matthews, T., Thomas, K., Gaytán-Lugo, L. S., Nemer, D., Bursztein, E., Churchill, E., & Consolvo, S. (2019, May 4–9). “They don’t leave us alone anywhere we go”: Gender and digital abuse in South Asia. 2019 CHI Conference on Human Factors in Computing Systems (pp. 1–14), Glasgow, UK. [Google Scholar] [CrossRef]
  66. Samoilenko, S. A., & Suvorova, I. (2023). Artificial intelligence and deepfakes in strategic deception campaigns: The U.S. and Russian experiences. In The Palgrave handbook of malicious use of AI and psychological security (pp. 507–529). Springer International Publishing. [Google Scholar] [CrossRef]
  67. Schia, N. N., & Gjesvik, L. (2020). Hacking democracy: Managing influence campaigns and disinformation in the digital age. Journal of Cyber Policy, 5(3), 413–428. [Google Scholar] [CrossRef]
  68. Schouten, A. P., Janssen, L., & Verspaget, M. (2020). Celebrity vs. Influencer endorsements in advertising: The role of identification, credibility, and Product-Endorser fit. International Journal of Advertising, 39(2), 258–281. [Google Scholar] [CrossRef]
  69. Sikorski, J. (2024). Hate speech as a flywheel of Russian information warfare. Research reconnaissance. Studia Administracji i Bezpieczeństwa, 17(17), 205–222. [Google Scholar] [CrossRef]
  70. Singer, P. W., & Brooking, E. T. (2019). LikeWar: The weaponization of social media. Mariner Books, Houghton Mifflin Harcourt. [Google Scholar]
  71. Singh, B., & Sharma, D. K. (2022). Predicting image credibility in fake news over social media using multi-modal approach. Neural Computing and Applications, 34(24), 21503–21517. [Google Scholar] [CrossRef]
  72. Sinha, K., Jhalani, P., Khan, A., & Mukherjee, P. C. (2023). Influencers as institutions: Impact of digital politics in the global south. Global Policy, 14(5), 912–924. [Google Scholar] [CrossRef]
  73. Skurnik, I., Yoon, C., Park, D. C., & Schwarz, N. (2005). How warnings about false claims become recommendations. Journal of Consumer Research, 31(4), 713–724. [Google Scholar] [CrossRef]
  74. Starbird, K., Arif, A., & Wilson, T. (2019). Disinformation as collaborative work. Proceedings of the ACM on Human-Computer Interaction, 3(CSCW), 1–26. [Google Scholar] [CrossRef]
  75. Stewart, B., Wenzlaff, R., & Cordero, C. (2024, August 24). Influencers get access to political conventions as campaigns court young voters. ABC News. Available online: https://abcnews.go.com/Politics/influencers-access-political-conventions-campaigns-court-young-voters/story?id=113100221 (accessed on 27 January 2025).
  76. Theocharis, Y., & de Moor, J. (2021). Creative participation and the expansion of political engagement. In Oxford research encyclopedia of politics. Oxford University Press. [Google Scholar] [CrossRef]
  77. Titifanue, J., Tarai, J., Kant, R., & Finau, G. (2016). Social media in the free West Papua campaign. Pacific Studies Journal, 39(3), 255–281. Available online: https://digitalcollections.byuh.edu/pacific-studies-journal (accessed on 31 March 2025).
  78. Tuominen, J. (2025, September 11). Nuoret saavat uutisensa algoritmilta, mutta se ei saa vaikuttaa julkaisupäätökseen [Young people get their news from an algorithm, but it must not influence publication decisions]. Media-alan Tutkimussäätiö. [Google Scholar]
  79. van der Linden, S., Roozenbeek, J., & Compton, J. (2020). Inoculating against fake news about COVID-19. Frontiers in Psychology, 11, 566790. [Google Scholar] [CrossRef] [PubMed]
  80. Waisbord, S. (2020). Mob censorship: Online harassment of US journalists in times of digital hate and populism. Digital Journalism, 8(8), 1030–1046. [Google Scholar] [CrossRef]
  81. Weedon, J., Nuland, W., & Stamos, A. (2017). Information operations and Facebook. Available online: https://www.facebook.com/notes/mark-zuckerberg/building-global-community/10154544292806634 (accessed on 13 February 2025).
  82. Whyte, C., Thrall, A. T., & Mazanec, B. M. (2021). Introduction. In C. Whyte, A. T. Thrall, & B. M. Mazanec (Eds.), Information warfare in the age of cyber conflict (pp. 1–11). Routledge. [Google Scholar]
  83. Willett, M. (2022). The cyber dimension of the Russia–Ukraine war. Survival, 64(5), 7–26. [Google Scholar] [CrossRef]
  84. Yanchenko, K., & Humprecht, E. (2025). Strategic narratives on social media: How information environments shape the Russian-Ukrainian war discourse in seven European countries. International Journal of Communication, 19, 25. Available online: https://ijoc.org/index.php/ijoc/article/view/25136 (accessed on 14 May 2025).
  85. Yao, J. (2023). DoD manipulates media in the US Wars: From information warfare to cyber operation. Communication and Public Diplomacy, 2(1), 233–244. [Google Scholar] [CrossRef]
  86. Zalnieriute, M., & Milan, S. (2019). Internet architecture and human rights: Beyond the human rights gap. Policy & Internet, 11(1), 6–15. [Google Scholar] [CrossRef]
  87. Zarocostas, J. (2020). How to fight an infodemic. Lancet, 395(10225), 676. [Google Scholar] [CrossRef] [PubMed]
Figure 1. News access by media outlet (N = 20).
Figure 2. Trusted media outlets of participants by country.
Figure 3. Participant perspectives on who influences their political opinion.
Figure 4. Coding cluster analysis of political orientation and perspectives of democracy.
Table 1. Main themes of the semi-structured interview script.
Themes and questions:

I. Traditional media
- Do you consume traditional media? Why?
- Please elaborate on your relationship with traditional media.
- Who do you trust within traditional media?

II. Truth
- Who is a truth provider in terms of news and politics?
- Who can affect you and your thoughts?
- Can you be a target of information operations? Fake news? Mis- or disinformation? Why?
- Can you please explain what these themes mean?
- Can you be a target of a hybrid operation?

III. Trust
- Who are trusted news and information sources for you? Why?
- Who do you trust or not trust?
- Where do the trusted sources operate (online/offline)?

IV. Influencers
- Where do influencers get their funding, news, and information?
- Who influences the influencers?
- How can influencers be used as tools in information operations?
- How is the content (news, information, entertainment, etc.) related to information operations and the current and future state of democracy in Finland/Portugal?
Table 2. Participants interviewed in Portugal and Finland.
| Participant ID | Country | Age | Gender | Educational Level |
|---|---|---|---|---|
| FI1 | Finland | 22 | Male | Secondary education |
| FI2 | Finland | 19 | Male | Secondary education |
| FI3 | Finland | 18 | Female | Secondary education |
| FI4 | Finland | 22 | Female | Secondary education |
| FI5 | Finland | 18 | Male | Secondary education |
| FI6 | Finland | 19 | Female | Secondary education |
| FI7 | Finland | 20 | Female | Secondary education |
| FI8 | Finland | 22 | Male | Secondary education |
| FI9 | Finland | 23 | Male | Secondary education |
| FI10 | Finland | 20 | Female | Secondary education |
| PT1 | Portugal | 20 | Female | Higher education |
| PT2 | Portugal | 20 | Female | Higher education |
| PT3 | Portugal | 20 | Female | Higher education |
| PT4 | Portugal | 22 | Male | Higher education |
| PT5 | Portugal | 21 | Male | Higher education |
| PT6 | Portugal | 25 | Male | Higher education |
| PT7 | Portugal | 20 | Female | Higher education |
| PT8 | Portugal | 20 | Female | Higher education |
| PT9 | Portugal | 20 | Female | Higher education |
| PT10 | Portugal | 20 | Female | Higher education |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Meriläinen, N.; Melro, A. Truth and Trust in the News: How Young People in Portugal and Finland Perceive Information Operations in the Media. Journal. Media 2026, 7, 13. https://doi.org/10.3390/journalmedia7010013
