Article

Algorithmic Curation and Users’ Civic Attitudes: A Study on Facebook News Feed Results

Dat-Act Lab, Department of Social and Political Sciences, Programme of Journalism, University of Cyprus, P.O. Box 20537, Nicosia 1678, Cyprus
* Author to whom correspondence should be addressed.
Information 2021, 12(12), 522; https://doi.org/10.3390/info12120522
Submission received: 2 November 2021 / Revised: 7 December 2021 / Accepted: 9 December 2021 / Published: 15 December 2021
(This article belongs to the Special Issue Advances in Interactive and Digital Media)

Abstract

Facebook users are exposed to diverse news and political content, which makes Facebook a significant tool for the enhancement of civic participation and engagement in politics. However, it has been argued that Facebook, through its algorithmic curation, reinforces the pre-existing attitudes of individuals rather than challenging or potentially altering them. The objective of this study is to elucidate the emotional and behavioural impact of the personalization of Facebook users’ News Feed results, and thereby to uncover a possible link between their online and offline civic attitudes. Firstly, we investigate the extent to which users’ Facebook News Feed results are personalized and customized to fit users’ pre-existing civic attitudes and political interests. Secondly, we explore whether users embody new roles as a result of their emotional and behavioural interaction with political content on Facebook. Our methodology is based on a quantitative survey involving 108 participants. Our findings indicate that, while Facebook can potentially expose users to varying political views and beliefs, it tends to reinforce existing civic attitudes and validate what users already hold to be true. Furthermore, we find that users often assume a proactive stance towards Facebook News Feed results, acquiring roles in which they filter and even censor the content to which they are exposed, thereby attempting to obfuscate algorithmic curation.

1. Introduction

In recent years, there has been increasing interest in the relationship between social media platforms and civic participation. The scandals which followed the 2016 US elections, along with the impact of algorithmic processes in social media on public opinion during the Brexit referendum, are indicative of the significance of this relationship. The exact way in which Facebook in general, and its algorithms in particular, predetermine the news media content displayed to users is uncertain; however, we do know that the content made visible to a user is determined by a series of actions set by multiple actors: the user, their friends, the designers, the advertisers, and the publishers [1]. Users ‘like’ and ‘follow’ their friends’ content, publishers and politicians target users with paid content, and algorithms rank and classify the content that is visible on users’ Facebook News Feeds [2].
Civic participation has long been synonymous with the democratic rights, liberties, and claims of citizens, which support political engagement and reinforce democratic representation [3]. In recent scholarship, many authors have argued that one of the major components of participation—being ‘informed’ and able to acquire political knowledge through and within the media—is highly dependent on the algorithmic functions imposed on users by social media platforms like Facebook [1]. It is argued that this is due to the partial conversion of social media into a source of political media which provides for and predicts the political tendencies of users [4] and aims to attract the attention of individuals by increasing their personal interest in the content. This is due in part to Facebook’s business model. As Thorson et al. argue, the objective of Facebook’s business model “is to create attention that can be sold to advertisers”, and not specifically to create political content [4] (p. 12). In the same vein, Bode [5], for example, has demonstrated that online political content is associated with, and provided to, users with similar ideological orientations—which raises questions about those who interact with such content without actively seeking it, and are thus exposed to content which does not correspond to their ideological beliefs (this is known as ‘incidental exposure’) [6]. The results of studies on the relation between social media and civic participation have been generally mixed, with many finding little or no effect of social media on civic participation [7]. Other, more optimistic approaches, such as that of Boulianne [8], argue for the primacy of positive rather than negative effects, claiming that “the metadata demonstrate a positive relationship between social media and participation” [9] (p. 524). Furthermore, other empirical studies have demonstrated that exposure to political content on Facebook may in fact constitute an interpretive tool for the elucidation of offline political participation [10]. Likewise, Kümpel [11] has argued that accidental exposure to news content shared by friends may be positively correlated with information gain and can strengthen offline political participation.
As regards Facebook, scholars seem to agree that the platform possesses embedded features which facilitate users’ exposure to the different types of news necessary to acquire political information around an issue [7]. This view has also been confirmed by previous studies which have illustrated how citizens’ consumption (whether intentional or unintentional) of political content tends to increase their political capital [12,13] and strengthen their political activity offline. This is related to the concepts of ‘selective exposure’ and ‘incidental exposure’, both of which play a central role in civic participation. Selective exposure emphasizes the role of individual choice in predicting political content exposure, while incidental exposure focuses on the role of friends and ‘unintentional’ encounters with content. Both concepts, however, tend to be insufficient to predict the way that algorithmic curation works [14,15].
Individuals’ pre-existing attitudes are a significant variable which must be accounted for when studying empirically the influence of social media on civic attitudes. Political efficacy can be considered as a variable when examining pre-existing understandings of politics and subjects’ real-life political participation. Real-life participation includes political activities such as active political campaigning, attendance at rallies and protests, and voting in elections. Online activities include sharing posts, liking pictures, and commenting on videos on digital platforms [16]. Pre-existing attitudes can in fact be strengthened and reinforced through social media [17,18] rather than changed. Experimental studies have indicated that individuals’ attitudes can be affected by exposure to, or discussion of, a political matter on social media, depending on the individuals’ level of engagement, their sociopolitical background, and the degree of politicization of the media message. The degree to which users are affected may depend on their level of awareness, their degree of partisanship, and the directionality of media politicization. Users’ civic attitudes may encourage them to prefer attitude-consistent information and to defend and amplify this preference [16,19]. In addition, a significant strand of research has confirmed that social media offers a way to reach users with political information, that this potentially allows these users to ‘catch up’ in terms of civic knowledge and civic interest, and that it may facilitate political dialogue [20]. It is evident that within datafied environments, where political content or information is arranged by algorithmic curation, the prediction of media exposure is relatively complex [21].
However, media exposure is also associated with users’ civic interest in a specific form of content. Several empirical studies examining the impact of the Facebook News Feed on civic participation have reported that civic interest can be a valuable predictor of political use of the internet [22], providing evidence for the interrelation of personal interest and attention on social media. While political interests may correlate with users’ specific interests and preferences in political news choices—thus being easier to predict—a question remains to be answered: what happens when users see content that they did not seek out and which may not correspond to their interests [23]? Civic information is often curated by algorithms which select and display content to users and attempt to form attention patterns [1]. While users may employ certain filtering options to pre-select the kind of content they wish to be exposed to, the efficacy of these filters remains debatable. The realm of social media can therefore be conceptualized as a space in which the mediation of political interests occurs regularly, thus rendering such space a means, rather than a cause, of social and political action [24].
This theoretical discussion posits that the relationship between Facebook use and civic participation may be mediated by the algorithmic curation process, which largely determines the selection of news and political content shown on users’ Facebook News Feeds. Therefore, to understand the communicative process hidden within the platform, we need to move “constantly between the technical and the social […] the inside and the outside of technical objects” [25]. The objective of this article is thus twofold: firstly, to determine the extent to which Facebook News Feed results related to news media articles are customized to reinforce the existing civic attitudes of users; and secondly, to identify the roles which users come to embody, as these relate to the selection of and exposure to different types of political content. These considerations invite further research in this field, and an investigation of whether users’ previous political interests, preferences, and beliefs, defined here as civic attitudes, are reinforced by the Facebook News Feed (thus influencing the classification of news and political content online).

2. Research Questions

RQ1: To what extent are Facebook News Feed results related to media articles customized and personalized to reinforce users’ existing civic attitudes and political interests?
RQ2: Which roles do users come to embody in relation to their exposure to types of political content on Facebook?
Hypothesis (H1).
Facebook News Feeds are customized to reinforce users’ existing civic attitudes through the exposure of users to specific political content and information.
Hypothesis (H2).
When exposed to oppositional political content, Facebook users come to embody roles which are discouraged by Facebook, whose logic is based on maximizing content reach among users within the platform. Once exposed to oppositional content, users will come to embody new roles, such as blocking friends or unfollowing a specific media page whose content does not correspond to the user’s pre-existing civic attitudes.

3. Materials and Methods

The quantitative data were retrieved from an online questionnaire, which was launched in January 2020 and involved 108 participants recruited from a Cypriot university. The purpose of the survey was to investigate the influence of algorithmically curated Facebook content on users’ pre-existing civic attitudes and behaviors. A total of 250 invitations, outlining the objectives and goals of the study, were sent by email, and 108 respondents aged between 18 and 30 completed the online questionnaire. The final sample consisted of 108 participants, 57.8% of whom were male and 42.6% of whom were female; 90 participants were Greek Cypriots and 18 were Greek. The sample size was considered appropriate for this kind of research [26] and was selected by means of non-probability sampling techniques (non-random sampling, in which participants were selected by the researcher based on specific characteristics) [27]. Our sample was, to a great extent, representative of the general Facebook population, with an average age of 25 years.
Respondents could participate from their personal computers at any time and place they liked. Upon receiving a positive response to the invitation, a consent form was sent, which had to be signed by the user and returned to the researcher in person or by e-mail. The goal was to make participants feel comfortable and ready to interact in the environment without any pressure or anxiety that might result from the presence of the researcher.
The questionnaire consisted of 54 multiple-choice and open-ended questions [26,27], which were designed to assess the experience of the participants during their interactions (posting, sharing, liking, etc.) on Facebook.
The questionnaire was divided into four sections. The first section concerned the demographic profile of the individuals participating in the study and aimed to determine certain characteristics of the participants (age, gender, political beliefs and political participation). The second section sought to illuminate users’ offline political participation and the ways in which they interpreted their political involvement and attitudes in society (their participation or non-participation in specific offline political events). Examples of questions in the second section include but are not limited to: “do you belong to any political party?” and “do you consider yourself to be an active participant in the political matters or issues of your community?”. The third section focused on online civic participation, analyzing the interaction between Facebook’s News Feed, algorithms, and users’ civic attitudes. Examples of questions in the third section include: “how often have you seen content about politics or political issues on Facebook news media pages in the past week?”, “how often do you share political content on Facebook?”, and “do you find yourself actively trying to avoid news stories from a particular Facebook page?”. The final section of the questionnaire concentrated on the emotional impact of political discussion online. An example of the questions in this fourth section is: “have you ever been surprised by someone’s views on politics or a political issue, based on something they posted on social media?”. Each section corresponded to a theme and its sub-themes. The theme of the second section was offline political participation, and the sub-themes were: offline political engagement, civic membership, civic identity, and offline engagement in political activities. The theme of the third section was issues of online civic participation, and the sub-themes were: political content online, political preferences online, political interest online and intended or unintended exposure to political content online. The fourth theme was emotions and political discussion online.
The data collection procedure was carried out in a semi-automatic and chain-like manner until data saturation was reached. Privacy, anonymity, and confidentiality were guaranteed. The data analysis was conducted using the Statistical Package for the Social Sciences (SPSS), and the analysis of the open-ended questions was carried out through a coding process: several themes and sub-themes emerged, resulting in a coding scheme based on the aims of the research. An inter-coder reliability test on a sample of the data set (50 questionnaires) revealed that two independent coders agreed on the segmentation in 71% of cases. We also evaluated the reliability and validity of the Likert scales used to capture participants’ interaction with the Facebook platform. Reliability and validity were used as indicators in refining the scales: the scale was shortened to fit the purpose of the current research, which makes it more practical than the long version at the cost of somewhat less satisfying content validity [28].
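To make the inter-coder check concrete, the sketch below shows how simple percent agreement and a chance-corrected Cohen’s kappa could be computed from two coders’ labels. This is only an illustration under assumed data; the code labels are hypothetical placeholders, not the study’s actual coding scheme or script.

```python
# Illustrative sketch only: percent agreement and Cohen's kappa for two coders.
# The code labels below are hypothetical placeholders, not the study's coding scheme.
from collections import Counter

coder_a = ["filter", "block", "ignore", "filter", "share", "block"]
coder_b = ["filter", "block", "filter", "filter", "share", "ignore"]

n = len(coder_a)
# Percent agreement: share of items on which both coders assigned the same code.
agreement = sum(a == b for a, b in zip(coder_a, coder_b)) / n

# Cohen's kappa corrects that agreement for chance, using each coder's marginal distribution.
freq_a, freq_b = Counter(coder_a), Counter(coder_b)
p_chance = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(coder_a) | set(coder_b))
kappa = (agreement - p_chance) / (1 - p_chance)

print(f"percent agreement: {agreement:.2f}")  # the study reports 71% agreement on 50 questionnaires
print(f"Cohen's kappa:     {kappa:.2f}")
```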

4. Results

The highest participation rate was among those aged between 18 and 30 years old (50.9%), while 57.8% of the sample were male and 42.6% were female. As shown in Table 1, most of the sample considered themselves to be ‘very interested’ in offline political life (30.6%), while the rest of the sample varied slightly between ‘moderately interested’ and ‘extremely interested’ in politics.
Although a high percentage of participants were interested in politics, as Table 2 shows, the vast majority of users did not belong to a political party (75%).
There seemed to be much more active participation in online rather than offline civic activities (see Table 3). In the last twelve months, most of the participants had sent a political message via Facebook (71.4%), and most had signed up as volunteers for a campaign or a political cause (54.3%). Smaller proportions of the sample had contributed to a campaign (31.4%), written to a politician (28.6%), subscribed to a political list service (24.3%), or written messages to the editor of a newspaper using the official Facebook page of the newspaper (22.9%). This finding demonstrates that the participants were actively engaging in political activities within social media.
Continuing the analysis of participants’ online civic attitudes and political interests, 71% of the respondents answered that they ‘always’ came across political content on Facebook, whereas 56% encountered such content ‘often’. A total of 71% of those who ‘always’ encountered political content on Facebook also sent political messages, while half of these individuals had also signed up as volunteers for a political campaign or cause. In addition, about one in three participants in that group had made a campaign contribution (Table 3 and Table 4). This finding suggests that our participants mostly engage in low-cost political activities online.
A significant positive correlation was found between all the actions (Table 5). The strongest correlation, however, was found between responses to the questions “How often do you create political content on Facebook?” and “How often do you share political content on Facebook?” (r(108) = 0.82, p < 0.01), followed by responses to the questions “How often do you read political content on Facebook?” and “How often did you see content on Facebook news media pages related to politics or political issues during the last week?” (r(108) = 0.72, p < 0.01). This finding indicates that the most active and engaged citizens are often the ones who create, share, see, and read political content on Facebook, and that social media can support high levels of engagement in various political activities online.
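As an illustration of this correlation analysis (which the authors performed in SPSS), the equivalent Pearson computation can be sketched in Python; the column names and Likert values below are assumptions, not the study’s data.

```python
# Minimal sketch, assuming 1-5 Likert-coded survey items in a DataFrame with
# hypothetical column names; equivalent to a Pearson correlation run in SPSS.
import pandas as pd
from scipy.stats import pearsonr

responses = pd.DataFrame({
    "create_political_content": [1, 2, 4, 5, 3, 1, 2, 4],
    "share_political_content":  [1, 2, 5, 5, 3, 1, 1, 4],
})

r, p = pearsonr(responses["create_political_content"], responses["share_political_content"])
print(f"r = {r:.2f}, p = {p:.3f}")  # the paper reports r(108) = 0.82, p < 0.01 for these two items
```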
A Chi-square test for association between responses to the questions “Do you have particular Facebook news media pages from which you get informed?” and “Did you join, like, or follow the Facebook media pages of your preferable news media sources?” was conducted (Table 6). All expected cell frequencies were greater than five. A statistically significant association was found between responses to these two questions (x2(1) = 17.726, p < 0.001). Another Chi-square test was conducted to test for association between responses to the questions “Which of the following have you done in the last year?” and “Did you join, like, or follow the Facebook media pages of your preferable news media sources?”. All expected cell frequencies were greater than five.
There were statistically significant associations between responses to the questions:
  • “Added, followed or became friends with a user or organization because of news items they had posted or shared”, x2(1) = 13.325, p < 0.001.
  • “Deleted or blocked another user or organization because of news they had posted or shared”, x2(1) = 6.130, p = 0.013.
  • “Changed my settings so that I would see more news from a user or organization”, x2(1) = 5.683, p = 0.017.
An additional Chi-square analysis was conducted to test for association between responses to the questions “Which of the following have you done in the last year?” and “Did you find yourself trying to avoid news stories from a particular Facebook source?” (Table 7). All expected cell frequencies were greater than five, satisfying the assumptions of the test. There was a statistically significant association for “Deleted or blocked another user or organization because of news they had posted or shared” (x2(1) = 7.691, p = 0.006); see Table 8.
The same analysis procedure (Chi-square) was used to test for association between responses to the questions “Did you find yourself trying to avoid news stories from a particular Facebook source?” and “Have you been exposed to an oppositional ideological form of content through news media articles?” (see Table 9). All expected cell frequencies were greater than five. There was a statistically significant association between the two actions: x2(1) = 3.864, p = 0.049. This finding indicates that participants who reported having been exposed to oppositional ideological content through news stories also found themselves trying to avoid news stories from a particular Facebook source. This suggests a tendency towards what has been called the ‘filter bubble’ effect, whereby users prefer to read, share, and comment on news stories which are similar to their own political beliefs. It is still uncertain whether algorithmic curation effectively amplifies individual preferences or encloses users in their own filter bubble. Users’ interests, preferences, and habits form the foundation of the platform’s algorithmic categorisation, which in turn promotes content deemed likely to attract users.
A Chi-square test for association between responses to the questions “Did you find yourself trying to avoid news stories from a particular Facebook source?” and “Have you ever been surprised by someone’s views on politics or a political issue, based on something they posted on social media?” was also conducted. All expected cell frequencies were greater than five. There was a statistically significant association between the two reactions, x2(1) = 3.984, p = 0.046 (Table 10).
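To clarify the test being run in this series of analyses, the sketch below re-computes one association from the 2 × 2 counts published in Table 8 (avoiding a source vs. deleting/blocking over news posts). It is an independent re-computation for illustration, not the authors’ SPSS output; correction=False is assumed because the plain Pearson statistic matches the reported value.

```python
# Chi-square test of association on the 2x2 counts from Table 8 (see below).
from scipy.stats import chi2_contingency

#                avoided source: Yes   avoided source: No
observed = [[45, 14],   # deleted/blocked a user or organization: Yes
            [18, 19]]   # deleted/blocked a user or organization: No

# correction=False gives the uncorrected Pearson chi-square,
# which matches the reported x2(1) = 7.691, p = 0.006.
chi2, p, dof, expected = chi2_contingency(observed, correction=False)
print(f"x2({dof}) = {chi2:.3f}, p = {p:.3f}")
print("expected cell frequencies:", expected.round(1))  # all greater than five
```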
Only four respondents answered ‘always’ to the question “How often do you share political content on Facebook?”. Of the 20 respondents who ‘often’ shared political content, 40% felt ‘inspired’, 35% felt ‘anger’, 30% felt ‘sadness’, and 30% felt ‘disgust’. Similar percentages appeared in the group that ‘occasionally’ shared content. On the other hand, the vast majority (67%) of those who never shared political content replied that they ‘didn’t care’ that they did not do so (Table 11).

5. Discussion

Our analysis indicates that Facebook’s News Feed is a novel and influential source of news content for users and plays a central role in conveying and controlling the flow of information in different formats [29] within the platform. It thus constitutes a key source of news information and encourages people to create and receive news stories from media sources and friends, among others [27]. Facebook’s News Feed, of course, operates by means of algorithmic processes, which comprise systems of criteria used to separate the political content to be included from that which is, so to speak, undesirable or does not correspond to the preferences of the user. As such, algorithmic processes shape each user’s experience within the platform in a unique way and according to many parameters, some of which are entirely technical, while others depend on the user’s online behaviour [27]. As Thorson et al. [26] point out, “Facebook uses digital trace data about each user to infer their interests in order to (a) aid the newsfeed ranking algorithm in deciding which stories will be most ‘meaningful’ and ‘relevant’ to that user, and (b) package that user to be targeted by advertisers”.
Other studies have identified a set of nine News Feed values: friend relationships, explicitly expressed user interests, prior user engagement, implicitly expressed user preferences, post age, platform priorities, page relationships, negatively expressed intentions, and content quality [30]. A variety of different News Feed values are used in everyday story selection, the most popular being “novelty or oddity, conflict or controversy, interest, importance, impact or consequence, sensationalism, timeliness, and proximity” [31]. News Feed values, which are algorithmically driven, lack a deep understanding of information due to technological limitations. Such limitations—known as technical bias—include technical issues in databases, storage (un)availability, processing power, and potential coding errors [32]. Algorithmic curation is mostly informed by individual users’ choices (habits, preferences, and interests), but also by the arbitrary internal processes of the technological artefact itself. This latter form of determination is often referred to as ‘pre-existing bias’ [33], and its effect on the formation of News Feed values may or may not take place in a conscious manner [34]; in either case, it can affect the basis of algorithmic curation. For instance, if the Facebook News Feed is to prioritize posts from close friends of the user, engineers must decide on the criteria that determine what a ‘close friendship’ is, as opposed to a mere contact or association. Engineers must translate this approach into a single, operational interpretation and embed it in the design of the algorithmic procedure, determining values around the definition of what a ‘close friend’ is. These value-based decisions are a prerequisite for Facebook as a business medium, because advertisers need access to users’ input in order to determine the connections and pre-existing preferences of friend groups so as to serve the advertisers’ promotional interests [35,36,37].
Facebook designers and engineers need to decide on the degree of relevance and choose the variables to use for evaluation [38], as well as the data sources the algorithm will draw from [32]. Furthermore, engineers can prioritize popular values based on individuals’ preferences and their previous interactions (feedback loops and filter bubbles) [39]. Nicholas Diakopoulos [40] identifies the following as the major functions of these algorithms: prioritization, classification, association, and filtering. Value decisions in these areas are evident in Facebook’s technical documentation for the News Feed [39,41]. Research has revealed a two-element system: (1) featuring objects (content), and (2) featuring edges (relational interactions—tags, comments, etc.). Edges include three components: the association between the perceiver and the creator, the type of edge (with comments being more important than likes), and ‘time rotting’. Creator and viewer relationships are based on a pathway of interaction through wall posts, likes, private messages, etc.
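The object/edge logic described above can be made concrete with a toy scoring function in the spirit of the EdgeRank-style descriptions cited here: an affinity between creator and viewer, an edge-type weight (comments above likes), and a time-decay factor. Every weight, name, and formula below is an assumption for illustration only; Facebook’s actual ranking system is proprietary and far more complex.

```python
# Toy EdgeRank-style scorer: affinity x edge-type weight x time decay.
# All weights and the half-life are illustrative assumptions, not Facebook's values.
EDGE_WEIGHTS = {"comment": 3.0, "share": 2.5, "like": 1.0}  # comments outweigh likes

def edge_score(affinity: float, edge_type: str, age_hours: float,
               half_life_hours: float = 24.0) -> float:
    """Score one interaction: closer relationships, heavier edge types,
    and fresher edges ('time rotting') all raise the score."""
    decay = 0.5 ** (age_hours / half_life_hours)
    return affinity * EDGE_WEIGHTS.get(edge_type, 0.5) * decay

def rank_story(edges: list[dict]) -> float:
    """A story's visibility score is the sum of the scores of its attached edges."""
    return sum(edge_score(e["affinity"], e["type"], e["age_hours"]) for e in edges)

# Hypothetical story: a close friend commented recently, a weak tie liked it a day ago.
story_edges = [
    {"affinity": 0.9, "type": "comment", "age_hours": 2},
    {"affinity": 0.2, "type": "like", "age_hours": 26},
]
print(f"story score: {rank_story(story_edges):.3f}")
```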
In general, News Feed story selection and algorithmic decision-making procedures are an ongoing process in which Facebook engineers and users interact, and in which improvements driven by user feedback and market changes lead to an increasingly effective personalization of individuals’ Facebook News Feeds [42]. Such personalization is bound up with the significant correlation between online and offline political and civic participation. In our results, this is most pronounced in the 18–30 age group (the age group most strongly represented in our sample): while most individuals in this group stated they were ‘very interested’ in politics, they nevertheless reported no involvement in any political party. This should come as no surprise, since mistrust of institutions and generalized suspicion towards mainstream politics are quite common among young people. This partial disengagement from formal political structures and procedures should not be mistaken for political apathy. On the contrary, it is evident (see Table 3) that online civic participation is quite prevalent among the age group in question due to its flexible, low-cost, and effortless nature. At the same time, the question arises whether these users abstain from conventional politics because they genuinely believe that online civic participation is more effective, or because such online action is easier and less demanding, providing a convenient yet inferior alternative to offline political activities.
Furthermore, a significant proportion of the participants reported seeing political content on their News Feeds on a regular basis, which indicates that they must have sought it out, and perhaps still do, by following pages with content of this nature. This indicates that although our participants do not engage with, or belong to, any political party, they are still interested in politics. This observation coincides with the empirical findings of Karnowski et al. [43], who demonstrated that individuals who are already interested in political matters are indeed more likely to come across political content in their Facebook News Feed through incidental exposure. It also coincides with previous studies arguing that citizens who, “unintentionally or not”, consume politically oriented content can gain in political capital [44] and thus potentially mobilize offline, particularly in collective action. This is often related to what scholars call selective exposure and incidental exposure, both of which can intervene in the making of civic participation. At the same time, this raises the central issue of the ‘filter bubble’ phenomenon, whereby actual exposure (selective or incidental) to a diversity of political news posts and views is hampered by feedback loops created by users’ data traces on the one hand, and algorithmic curation on the other. In other words, social media users are often trapped in “echo chambers” and “filter bubbles”, which expose them only to certain views and opinions, usually ones that they agree with in the first place. Yet algorithmic curation remains mostly invisible, ‘black-boxed’, to users [45]. Our findings support this idea (see Table 6), as statistically significant associations were established between responses to the questions “Added, followed or became friends with a user or organization because of news items they had posted or shared”, “Deleted or blocked another user or organization because of news they had posted or shared”, and “Changed my settings so that I would see more news from a user or organization”. All of these questions refer to a rather selective exposure on the part of users (in terms of news pages, organizations, and friends) to news posts and views, which supports the filter bubble effect.
The majority of those who constantly viewed political content on their News Feed also engaged in actions such as sending political messages and signing up to volunteer for various political causes; a third of this group went so far as to make campaign contributions. This is in line with Ekström and Shehata’s [46] findings, according to which social interaction on social media largely coincides with engagement with political information and interaction, seen as a rather unconventional form of participation, especially for young citizens. The fact that citizens are more frequently engaged within social media by sending a political message (71.4%) or by making an online campaign contribution is not surprising: these types of online activities are less demanding and more open in terms of political affiliations. The questions that naturally arise from such observations concern the nature of the political content being viewed, and potential fluctuations (or otherwise) in offline political activity. Thus, the strong correlation between the responses to “Did you join, like, or follow the Facebook media pages of your preferable news media sources?” and “Which of the following have you done in the last year?” perhaps provides an indication of the validity of our first hypothesis. Users reported having added or followed users or pages because of their news posts (stories); conversely, they also reported having deleted or blocked users or pages for precisely the same reason. Furthermore, a correlation was established between responses to the questions “Did you find yourself trying to avoid news stories from a particular Facebook source?” and “Deleted or blocked another user or organization because of news they had posted or shared”.
Our results complicate our understanding of the phenomena under investigation, due to the correlation observed between responses to the questions “Did you find yourself trying to avoid news stories from a particular Facebook source?” and “Have you been exposed to an oppositional ideological form of content regarding the news media articles?” When encountering pages that share views contrary to their own, a large proportion of users will most likely avoid these pages or users altogether instead of critically engaging with said views. Furthermore, many users will go so far as to delete or block pages and individuals because of their sharing of certain news stories. Such observations seem to confirm both the presence of the filter bubble phenomenon and our first Hypothesis (H1) regarding customization and the reinforcement of pre-existing views and convictions. That being the case, we argue that the customization processes initiated and established by the platform are often reinforced by users themselves, who seem to enjoy the cognitive consonance that personalization provides. In these instances, users are not exclusively the passive objects of algorithmic curation: they may also be active yet conservative subjects who are, for whatever reason, unwilling to engage with opposing views. Therefore, our second Hypothesis (H2) also seems to be validated by our findings. The proactive stance users take towards oppositional pages and views presupposes that they assume a specific role, in which they ‘censor indirectly’, so to speak, what is undesirable. Such censoring, however, is not necessarily conservative per se, since it may very well be applied for purposes related to the obfuscation and/or subversion of algorithmic curation.
The last section of the survey concerned the influence of political content on users’ emotional interactions, and participants were asked to specify their feelings towards the creation and/or sharing of political content on Facebook. The participants’ responses indicated that they were affected emotionally, albeit in a negative way: their affective states ranged from neutral to negative. This suggests that when users share or create online political content, they often aim to express a subjective, opposing viewpoint by filtering and criticizing the content they interact with, which is often contrary to their actual political beliefs. Overall, the often-neutral stance of users indicates indifference to political content that is not consistent with their personal political beliefs. Individuals’ emotional tendencies range from neutral to negative, depending on the political content in question and the current political context [44]. Therefore, the creation or sharing of political content satisfies a need for the expression of contrary viewpoints or opinions.

6. Conclusions

Several theoretical questions still surround research on the algorithmic personalization of users’ Facebook News Feed results. Further empirical studies must emphasize the specific role of technological affordances in order to uncover the following: To what degree are users aware of technological affordances which encourage specific lines of action while refusing or blocking others within social media? Will users eventually come to embody the role envisaged by social media platforms such as Facebook, that is, of active agents, or will they define other roles for themselves? These considerations and emerging questions invite further research in the field of critical data studies, investigating whether users’ prior political interests and preferences, as defined by users themselves and as encouraged by the technological artefact in question, might have implications not only online but offline as well.

Author Contributions

Conceptualization, V.P.; Methodology, V.P. and T.P.; Validation, V.P.; Analysis, V.P. and T.P.; General Draft preparation, V.P.; Writing, Review and Editing, V.P.; Supervision, V.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the University of Cyprus, grant number 3731/2019.

Institutional Review Board Statement

Ethical review and approval were waived for this study since the collection of data was anonymous and undertaken by the subjects online.

Informed Consent Statement

Informed consent was obtained from all subjects involved in this study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to privacy issues.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Thorson, K.; Wells, C. Curated flows: A framework for mapping media exposure in the digital age. Commun. Theory 2016, 26, 309–328. [Google Scholar] [CrossRef]
  2. Weeks, B.E.; Kim, D.H.; Hahn, L.B.; Diehl, T.H.; Kwak, N. Hostile media perceptions in the age of social media: Following politicians, emotions, and perceptions of media bias. J. Broadcasting Electron. Media 2019, 63, 374–392. [Google Scholar] [CrossRef]
  3. Clift, S.L. E-Government and Democracy: Representation and Citizen Engagement in the Information Age. Available online: https://www.publicus.net/articles/cliftegovdemocracy.pdf (accessed on 2 September 2021).
  4. Yoo, S.W.; Gil De Zúñiga, H. The role of heterogeneous political discussion and partisanship on the effects of incidental news exposure online. J. Inf. Technol. Politics 2019, 16, 20–35. [Google Scholar] [CrossRef]
  5. Bode, L. Political news in the news feed: Learning politics from social media. Mass Commun. Soc. 2016, 19, 24–28. [Google Scholar] [CrossRef]
  6. De Zúñiga, H.G.; Bachmann, I.; Hsu, S.H.; Brundidge, J. Expressive versus consumptive blog use: Implications for interpersonal discussion and political participation. Int. J. Commun. 2013, 7, 1538–1559. [Google Scholar]
  7. Theocharis, Y.; Lowe, Y.W. Does Facebook increase political participation? Evidence from a field experiment. Inf. Commun. Soc. 2015, 19, 1465–1486. [Google Scholar] [CrossRef]
  8. Boulianne, S. Does Internet Use Affect Engagement? A Meta-Analysis of Research. Political Commun. 2009, 26, 193–211. [Google Scholar] [CrossRef] [Green Version]
  9. Boulianne, S. Social media use and participation: A meta-analysis of current research. Inf. Commun. Soc. 2015, 18, 524–538. [Google Scholar] [CrossRef]
  10. Karakaya, S.; Glazier, R.A. Media, information, and political participation: The importance of online news sources in the absence of a free press. J. Inf. Technol. Politics 2019, 16, 1–17. [Google Scholar] [CrossRef]
  11. Kümpel, A.S. The Matthew Effect in social media news use: Assessing inequalities in news exposure and news engagement on social network sites (SNS). Journalism 2020, 21, 1083–1098. [Google Scholar] [CrossRef]
  12. Thorson, K. Attracting the news: Algorithms, platforms, and reframing incidental exposure. Journalism 2020, 21, 1067–1082. [Google Scholar] [CrossRef]
  13. Delli–Carpini, M.X.; Keeter, S. What Americans Know about Politics and Why It Matters; Yale University Press: London, UK, 1997; ISBN 9780300072754. [Google Scholar]
  14. Tewksbury, D.; Weaver, A.J.; Maddex, B.D. Accidentally Informed: Incidental News Exposure on the World Wide Web. J. Mass Commun. Q. 2001, 78, 533–554. [Google Scholar] [CrossRef]
  15. Knoke, D. Networks of political action: Toward Theory Construction. Soc. Forces 1990, 68, 1041–1063. [Google Scholar] [CrossRef]
  16. Park, C.S.; Kaye, B.K. Smartphone and self-extension: Functionally, anthropomorphically, and ontologically extending self via the smartphone. Mob. Media Commun. 2019, 7, 215–231. [Google Scholar] [CrossRef]
  17. Matthes, J.; Schmuck, D. The effects of anti-immigrant right-wing populist ads on implicit and explicit attitudes: A moderated mediation model. Commun. Res. 2017, 44, 556–581. [Google Scholar] [CrossRef]
  18. Saleem, M.; Prot, S.; Anderson, C.A.; Lemieux, A.F. Exposure to Muslims in Media and Support for Public Policies Harming Muslims. Commun. Res. 2015, 44, 817–840. [Google Scholar] [CrossRef]
  19. Winter, S.; Metzger, M.J.; Flanagin, A.J. Selective Use of News Cues: A Multiple-Motive Perspective on Information Selection in Social Media Environments. J. Commun. 2016, 66, 669–693. [Google Scholar] [CrossRef]
  20. Moehler, D.; Conroy-Krutz, J. Partisan Media and Engagement: A Field Experiment in a Newly Liberalized System. Political Commun. 2015, 33, 1–19. [Google Scholar] [CrossRef]
  21. Thorson, K.; Wells, C. How Gatekeeping Still Matters: Understanding Media Effects in an Era of Curated Flows. In Gatekeeping in Transition; Vos, T.P., Heinderyckx, F., Eds.; Routledge: London, UK, 2015; pp. 25–44. [Google Scholar]
  22. Thorson, K.; Xu, Y.; Edgerly, S. Political inequalities start at home: Parents, children, and the socialization of civic infrastructure online. Political Commun. 2018, 35, 178–195. [Google Scholar] [CrossRef]
  23. Fletcher, R.; Nielsen, R.K. Are people incidentally exposed to news on social media? A comparative analysis. New Media Soc. 2018, 20, 2450–2468. [Google Scholar] [CrossRef]
  24. Bond, R.M.; Fariss, C.J.; Jones, J.J.; Kramer, A.D.; Marlow, C.; Settle, J.E.; Fowler, J.H. A 61-million-person experiment in social influence and political mobilization. Nature 2012, 489, 295–298. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  25. Akrich, M. The De-Scription of Technical Objects. In Shaping Technology/Building Society: Studies in Sociotechnical Change; Bijker, W.E., Law, J., Eds.; MIT Press: Cambridge, MA, USA, 1992; pp. 205–224. ISBN 9780262023382. [Google Scholar]
  26. Thorson, K.; Cotter, K.; Medeiros, M.; Pak, C. Algorithmic inference, political interest, and exposure to news and politics on Facebook. Inf. Commun. Soc. 2021, 24, 183–200. [Google Scholar] [CrossRef]
  27. Sand-Jecklin, K.; Sherman, J. A quantitative assessment of patient and nurse outcomes of bedside nursing report implementation. J. Clin. Nurs. 2014, 23, 2854–2863. [Google Scholar] [CrossRef]
  28. Papa, V. ‘To activists: Please post and share your story’: Renewing understandings on civic participation and the role of Facebook in the Indignados movement. Eur. J. Commun. 2017, 32, 583–597. [Google Scholar] [CrossRef]
  29. Duggan, M.; Ellison, N.B.; Lampe, C.; Lenhart, A.; Madden, M. Pew Research Centre. Available online: https://www.pewresearch.org/internet/2015/01/09/social-media-update-2014/ (accessed on 3 October 2021).
  30. DeVito, M.A. From Editors to Algorithms: A values-based approach to understanding story selection in the Facebook news feed. Digit. Journal. 2016, 5, 753–773. [Google Scholar] [CrossRef]
  31. Gillespie, A. Foundations of Economics; Oxford University Press: Oxford, UK, 2014; ISBN 9780198806523. [Google Scholar]
  32. Friedman, B.; Nissenbaum, H. Bias in Computer Systems. ACM Trans. Inf. Syst. 1996, 14, 330–347. [Google Scholar] [CrossRef]
  33. Elder, M.D.; Jho, J.Y.; Rokosz, V.T.; Schirmer, A.L.; Schultz, M. System and Method for Building Social Networks Based on Activity around Shared Virtual Objects. U.S. Patent 7,249,123 B2, 31 October 2002. issued 24 July 2007. [Google Scholar]
  34. Andrejevic, M. Infoglut: How too Much Information Is Changing the Way We Think and Know; Routledge: London, UK, 2013; ISBN 9780415659086. [Google Scholar]
  35. Bucher, T. Want to be on the top? Algorithmic power and the threat of invisibility on Facebook. New Media Soc. 2012, 14, 1164–1180. [Google Scholar] [CrossRef]
  36. Domingos, P. Mining Social Networks for Viral Marketing. J. Retail. Consum. Serv. 2005, 20, 80–82. [Google Scholar] [CrossRef]
  37. Steiner, M.; Ahijevych, D.; Pinto, J.O.; Williams, J.K. Probabilistic forecasts of mesoscale convective system initiation using the random forest data mining technique. Weather Forecast. 2016, 31, 581–599. [Google Scholar]
  38. Pariser, E. TED: Ideas worth Spreading. Available online: https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles?language=en (accessed on 10 August 2020).
  39. Hoffman, A.L.; Proferes, N.; Zimmer, M. “Making the world more open and connected”: Mark Zuckerberg and the discursive construction of Facebook and its users. New Media Soc. 2016, 20, 199–218. [Google Scholar] [CrossRef] [Green Version]
  40. Diakopoulos, N. Algorithmic Accountability: Journalistic investigation of computational power structures. Digit. Journal. 2014, 3, 398–415. [Google Scholar] [CrossRef]
  41. Mager, A. Algorithmic Ideology: How capitalist society shapes search engines. Inf. Commun. Soc. 2012, 15, 769–787. [Google Scholar] [CrossRef]
  42. Ananny, M. Toward an ethics of algorithms: Convening, observation, probability, and timeliness. Sci. Technol. Hum. Values 2016, 41, 93–117. [Google Scholar] [CrossRef] [Green Version]
  43. Karnowski, V.; Leonhard, L.; Kümpel, A.S. Why Users Share the News: A Theory of Reasoned Action-Based Study on the Antecedents of News-Sharing Behavior. Commun. Res. Rep. 2017, 35, 91–100. [Google Scholar] [CrossRef]
  44. Photiadis, T.; Papa, V. ‘What’s up with ur emotions?’ Untangling emotional user experience on Second Life and Facebook. Behav. Inf. Technol. 2021. [Google Scholar] [CrossRef]
  45. Kothari, C.R. Research Methodology: Methods and Techniques, 2nd ed.; New Age International Publishers: New Delhi, India, 2004. [Google Scholar]
  46. Ekström, M.; Shehata, A. Social media, porous boundaries, and the development of online political engagement among young citizens. New Media Soc. 2018, 20, 740–759. [Google Scholar] [CrossRef]
Table 1. How much do you consider yourself to be interested in politics?
Extremely interested: 21.3% | Very interested: 30.6% | Moderately interested: 23.1% | Slightly interested: 16.7% | Not at all interested: 9.3%
Table 2. Do you belong to a political party?
Yes: 25% | No: 75%
Table 3. Have you engaged in any of the following online activities during the last twelve months?
Write to a politician: 28.6% | Make a campaign contribution: 31.4% | Subscribe to a political list service: 24.3% | Sign up to volunteer for a campaign/cause: 54.3% | Send a political message via Facebook: 71.4% | Write messages to the editor of a newspaper using the official Facebook page of the news media: 22.9%
Table 4. How often did you come across political content on Facebook in the last twelve months?
Proportion of participants — Always: 71% | Often: 56% | Occasionally: 9% | Rarely: 11% | Never: 17%
Table 5. Correlation between all actions and between specific actions.
Item | How often did you see content on Facebook news media pages related to politics or political issues during the last week? | How often do you read political content on Facebook?
How often do you create political content on Facebook? | 0.72 | –
How often do you share political content on Facebook? | – | 0.64
Table 6. A Chi-square test analysis for association between two actions in Facebook.
Column variable: Did you join, like, or follow the Facebook media pages of your preferable news media sources?
Row variable | Yes | No | Total | x2 | p-value
Do you have particular Facebook news media pages from which you get informed? (Yes) | 42 | 4 | 46 | 17.726 | 0.00
Table 7. Correlation between interactions on the Facebook platform.
Column variable: Did you join, like, or follow the Facebook media pages of your preferable news media sources?
Which of the following have you done in the last year? | Yes | No | Total | x2 | p-value
Added, followed, or became friends with a user or organization because of news items they had posted or shared (Yes) | 57 | 10 | 30 | 13.325 | 0.00
Table 8. Chi-square test analysis of actions and reactions in the last year on the Facebook platform.
Column variable: Did you find yourself trying to avoid news stories from a particular Facebook source?
Which of the following have you done in the last year? — Deleted or blocked another user or organization because of news they had posted or shared | Yes | No | Total | x2 | p-value
Yes | 45 | 14 | 59 | 7.691 | 0.006
No | 18 | 19 | 37 | |
Table 9. Chi-square test analysis of the association between responses to the two questions “Did you find yourself trying to avoid news stories from a particular Facebook source?” and “Have you been exposed to an oppositional ideological form of content through news media articles?”.
Column variable: Have you been exposed to any oppositional ideological form of content through news stories?
Did you find yourself trying to avoid news stories from a particular Facebook source? | Yes | No | Total | x2 | p-value
Yes | 41 | 19 | 60 | 3.864 | 0.049
Table 10. Have you ever been surprised by someone’s views on politics or a political issue, based on something they posted on social media?
Column variable: Have you ever been surprised by someone’s views on politics or a political issue, based on something they posted on social media?
Did you find yourself trying to avoid news stories from a particular Facebook source? | Yes | No | Total | x2 | p-value
Yes | 33 | 20 | 53 | 3.984 | 0.046
Table 11. Emotional interaction via online political content.
How do you feel when you share political content on Facebook? (columns: How often do you share political content on Facebook?)
Emotion | Always | Often | Occasionally | Rarely | Never
Anger | 100% | 35% | 32% | 14% | 10%
Sadness | 75% | 30% | 32% | 14% | 7%
Fear | 25% | 10% | 11% | 6% | 0%
Joy | 0% | 5% | 16% | 3% | 0%
Surprise | 0% | 20% | 11% | 6% | 0%
Disgust | 25% | 30% | 32% | 9% | 7%
Anticipation | 0% | 5% | 11% | 6% | 0%
Trust | 0% | 15% | 5% | 0% | 0%
Inspired | 50% | 40% | 37% | 20% | 3%
Annoyed | 25% | 25% | 26% | 20% | 10%
Amused | 25% | 5% | 5% | 9% | 0%
Don’t Care | 0% | 20% | 26% | 37% | 67%
Base (n) | 4 | 20 | 19 | 35 | 30
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

