Image: Vincenzo Cosenza
In this blog post, DATACTIVE research associate Antonio Martella looks ahead to the consequences of Facebook's news feed modifications resulting from larger corporate policy changes. He investigates and discusses the implications through three hypotheses: 1) the divide between the attention-rich and the attention-poor will grow; 2) increasing engagement with peer-created content will tighten the filter-bubble aspect of networking; and 3) the "new" news feed will have a negative impact on users' mood.
Guest Author: Antonio Martella
On November 11th, 2017, Facebook announced that the user timeline would change in January 2018. In their words:
“With this update, we will also prioritize posts that spark conversations and meaningful interactions between people. To do this, we will predict which posts you might want to interact with your friends about and show these posts higher in the feed. These are posts that inspire back-and-forth discussion in the comments and posts that you might want to share and react to – whether that’s a post from a friend seeking advice, a friend asking for recommendations for a trip, or a news article or video prompting lots of discussions. […] We will also prioritize posts from friends and family over public content, consistent with our News Feed values.” (Newsroom Facebook 2018)
Any modification of the feed algorithm will have many consequences, and these are not equally predictable. Facebook is a very complicated environment, semi-public in nature and not limited to friendship management. In fact, as the Pew Research Center reported last September, 67% of Americans consume news via social media. This pattern seems to apply to European news consumption too, where young people are exposed to news mostly in a social media context rather than through television or newspapers. Indeed, as the Reuters Institute's Digital News Report 2017 shows, many users follow others because of the news they share.
According to the Pew Research report, Facebook surpasses other social media as a source of news consumption. This is partially due to Facebook's large user base, and partially because news is interwoven with people's timelines. The Digital News Report also shows that exposure to news on Facebook is often incidental: a direct result of news shared by other users, of the wide range of news companies that are followed, and so on. Nevertheless, we need to keep in mind that exposure to any content in social media or search engines is algorithm-driven.
Following these considerations, there are several possible consequences of the Facebook news feed changes. This blog post investigates three probable implications:
- the divide between the attention-rich and the attention-poor will grow;
- continuous personalisation;
- a negative impact on users' mood.
1. The divide between the attention-rich and the attention-poor will grow
All pages and groups that share content on Facebook will lose visibility and the revenues that come from users reading their posts, clicking their links, and visiting their websites [1]. It is easy to guess that those who want to remain visible have two choices: either pay more for Facebook ads to make their posts visible, or create more engaging content. But the engagement generated on Facebook is deeply connected with the number of followers. This will probably widen the gap between the attention-rich and the attention-poor, in line with the observed Matthew effect (Merton, 1968) that rules many patterns and practices online (Barabasi, 2013) and in social media.
In fact, many aspects of society, both online and offline, are governed by the preferential attachment process that lies behind the so-called "Matthew effect" or the "80/20 rule". Hence, the more connections you have, the more visible you are, and the more new connections you will get as a consequence. This principle is easily illustrated by the fact that famous websites and people tend to have more followers on social media. But the reverse is equally true: the fewer connections you have, the less attention you will get. In conclusion, content produced by people or organisations with less power, fewer resources, and lower budgets will decrease in visibility.
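This rich-get-richer dynamic can be illustrated with a toy simulation, a minimal sketch of a Barabási-Albert-style preferential attachment model. The node count and the one-link-per-newcomer rule are illustrative assumptions, not a description of Facebook's actual mechanics:

```python
import random

def preferential_attachment(n_nodes, seed=42):
    """Grow a network where each new node links to one existing node
    chosen with probability proportional to that node's degree."""
    rng = random.Random(seed)
    degrees = {0: 1, 1: 1}  # start with two connected nodes
    # Repeated-nodes trick: node i appears in `pool` degrees[i] times,
    # so a uniform draw implements degree-proportional choice.
    pool = [0, 1]
    for new in range(2, n_nodes):
        target = rng.choice(pool)
        degrees[target] += 1
        degrees[new] = 1
        pool.extend([target, new])
    return degrees

degrees = preferential_attachment(10_000)
ranked = sorted(degrees.values(), reverse=True)
top_share = sum(ranked[:1000]) / sum(degrees.values())
print(f"Top 10% of nodes hold {top_share:.0%} of all connections")
```

Even though every newcomer contributes exactly one link, the early, well-connected nodes end up holding a disproportionate share of all connections, far more than the 10% a uniform distribution would give them.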
2. Continuous personalisation
The second consequence of the news feed change concerns the kind of content that will dominate users' feeds. According to Mark Zuckerberg, content produced and shared by "friends and family" will be more visible in all Facebook timelines. But a news feed dominated by friends' posts could arguably exacerbate two negative social media aspects, previously expressed through the notions of the filter bubble and the echo chamber. Online social networks developed on social media platforms are strongly based on homophily (Barberà, 2014; Aiello et al., 2012), meaning that users connect with others who share similar interests, values, political views, and so on. This typical behaviour is also found in offline social networks (McPherson, Smith-Lovin, Cook, 2001), and shows its most problematic characteristics when we focus on information diffusion.
On the one hand, this change will reinforce the filter bubble in which we are all involved. Filter bubbles (Pariser, 2011) are the result of users' activities on the web: social media algorithms continuously learn from every user's clicks and likes [2]. On the other hand, more homophily in social media due to the prevalence of "friends and family content" could easily sustain the echo chamber effect. This phenomenon precedes social media platforms: like-minded people love to talk to each other, reinforcing their opinions and biases. In social media, however, it is easier to avoid contrasting points of view, values, or interests as a consequence of the self-selection of "friends", pages, and groups. Indeed, as research has highlighted, users tend to promote their favourite narratives and to form polarised groups on Facebook (Quattrociocchi, Scala, Sunstein, 2016; Bakshy, Messing, Adamic, 2015), even though this is not a clear and deterministic process (Barberà et al., 2015).
Based on these considerations, another outcome of the news feed changes will be greater visibility for friends' opinions and points of view. This will most probably result in a more polarised information flow in users' news feeds and a limited number of different points of view and of professional (or semi-professional) content. In practice, this means that if we consider a contested news topic, such as whether glyphosate causes cancer, we have to take into account that information sources will be more socially driven: the chance of reading a different point of view or professional news coverage will be smaller than before.
3. Negative impact on users’ mood
The news feed changes will probably influence the mood of billions of people in an inscrutable way. One could argue that a news feed more populated by friends' content would have a negative impact on happiness. According to Mark Zuckerberg, "the research shows that when we use social media to connect with people we care about, it can be good for our well-being". In fact, according to an experiment conducted on users' timelines (Kramer, Guillory, Hancock, 2013), content on users' timelines does indeed influence their mood. As many researchers have shown, personal feelings (happiness, depression, etc.) flow through offline social networks (Fowler, Christakis, 2008), and their representation in online environments seems to share similar diffusion patterns. In other words: moods spread contagiously online. By extension, recent scholarly and non-scholarly work shows that scrolling through your Facebook feed can have a negative impact on well-being (Shakya, Christakis, 2017) [3]. Lastly, it has been shown that the constant bombardment of everyone's news biases the attempt to present the best representation of the self, and seems to have a negative impact on happiness.
Questions to ask
Through these hypotheses, I have tried to show some real-life aspects that might be affected by these important changes to Facebook's algorithms. As Facebook has stated, there are around 2 billion monthly active users on its platform.
These statements subsequently evoke two questions:
- Can these changes be made by a private company without any form of public discussion?
- Is it our democratic right to scrutinise algorithms as organisers of public space?
Further information on how Facebook's algorithms work can be found here: an interesting article by Share Lab that tries to shed some light on what lies behind the platform.
Aiello, Luca Maria, Barrat, Alain, Schifanella, Rossano, Cattuto, Ciro, Markines, Benjamin, Menczer, Filippo. 2012. Friendship prediction and homophily in social media. ACM Transactions on the Web 6(2), Article 9, 33 pages.
Bakshy, Eytan, Messing, Solomon, Adamic, Lada A. 2015. Exposure to ideologically diverse news and opinion on Facebook. Science 348(6239), pp. 1130-1132.
Pariser, Eli. 2011. The Filter Bubble: What the Internet Is Hiding from You. Penguin: London.
Quattrociocchi, Walter, Scala, Antonio, Sunstein, Cass R. 2016. Echo Chambers on Facebook. Available at SSRN: https://ssrn.com/abstract=2795110.
Shakya, Holly B., Christakis, Nicholas A. 2017. Association of Facebook Use With Compromised Well-Being: A Longitudinal Study. American Journal of Epidemiology 185(3), pp. 203-211.
Rogers, Richard. 2015. Digital Methods for Web Research. In Emerging Trends in the Social and Behavioral Sciences: An Interdisciplinary, Searchable, and Linkable Resource (eds. Scott, Robert; Buchmann, Marlis C.; Kosslyn, Stephen). Wiley & Sons: New York.
[1] For example, this is exactly what happened to the blog LittleThings, which had to shut down a month after the news feed change due to the drop in web traffic.
[2] This is already happening, as an Italian experiment on Facebook partially showed during the last Italian election (link, unfortunately only in Italian). According to this experiment, the Facebook news feed shows different kinds of content and media (photos, videos, web links) based on each user's likes, comments, and shares. Indeed, according to Facebook's statements, proposed content will be based even more on each user's (algorithmically predicted) intention to interact, fostering the visibility of tailored content.
[3] For example: «Liking others' content and clicking links posted by friends were consistently related to compromised well-being, whereas the number of status updates was related to reports of diminished mental health» (Shakya, Christakis, 2017, p. 210).
On the author: Antonio is a PhD candidate in Political Science at the University of Pisa. His research focuses on the populism of political leaders in social media. His approach follows the Digital Methods for Web Research recommendations (Rogers, 2015), and he is particularly interested in social media algorithms and their effects.