On February 7, 2019, the Internet Policy Review published an op-ed by Stefania Milan and Claudio Agosti. In it, we reflect on personalisation algorithms and elections, and share some ideas about algorithmic sovereignty and literacy. Thanks to Frédéric Dubois for the invitation.
“Personalisation algorithms allow platforms to carefully target web content to the tastes and interests of their users. They are at the core of social media platforms, dating apps, shopping and news sites. They make us see the world as we want to see it. By forging a specific reality for each user, they silently and subtly shape customised “information diets”, including around our voting preferences. We still remember Facebook’s CEO Mark Zuckerberg testifying before the US Congress (in April 2018) about the many vulnerabilities of his platform during election campaigns. With the elections for the European Parliament scheduled for May 2019, it is about time to look at our information diets and take seriously the role of platforms in shaping our worldviews. But how? Personalisation algorithms are kept a closely guarded secret by social media platform companies. The few experiments auditing these algorithms rely on data provided by the platform companies themselves. Researchers are sometimes subject to legal challenges by social media companies, which accuse them of violating the terms of service of their platforms. As we speak, technological fencing-off is emerging as the newest challenge to third-party accountability. Generally, algorithm audits fail to involve ordinary users, missing out on a crucial opportunity for awareness raising and behavioural change.
The Algorithms Exposed (ALEX) project, funded by a Proof of Concept grant from the European Research Council, intervenes in this space by promoting an approach to algorithm auditing that empowers and educates users. ALEX stabilises and expands the functionalities of a browser extension – fbtrex – an original idea of lead developer Claudio Agosti. By analysing the outcomes of Facebook’s news feed algorithm, our software enables users to monitor their own social media consumption and to volunteer their data for scientific or advocacy projects of their choosing. It also empowers advanced users, including researchers and journalists, to produce sophisticated investigations of algorithmic biases. Taking Facebook and the forthcoming EU elections as a test case, ALEX unmasks the functioning of personalisation algorithms on social media platforms.”
Continue reading on the website of the Internet Policy Review.
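For technically minded readers, the sketch below gives a very rough idea of what observing a news feed from inside the browser can look like: a content script that records which posts the ranking algorithm surfaces for a given user and, with that user's consent, sends minimal metadata to a collector. This is a hypothetical illustration, not fbtrex code: the DOM selector, the ObservedPost structure and the collection endpoint are all assumptions made for the example.

```typescript
// Illustrative browser-extension content script (not fbtrex internals).
// It records posts as the news feed algorithm surfaces them in the page.

interface ObservedPost {
  seenAt: string;           // ISO timestamp when the post appeared
  position: number;         // order in which the algorithm ranked it
  permalink: string | null; // link to the post, if one is present
}

const observed: ObservedPost[] = [];

function recordPost(node: Element): void {
  // 'div[role="article"]' and the anchor pattern are assumptions
  // about the feed's markup, used here only for illustration.
  const link = node.querySelector<HTMLAnchorElement>('a[href*="/posts/"]');
  observed.push({
    seenAt: new Date().toISOString(),
    position: observed.length + 1,
    permalink: link ? link.href : null,
  });
}

// Watch the page for new feed items as the user scrolls.
const observer = new MutationObserver((mutations) => {
  for (const mutation of mutations) {
    for (const added of Array.from(mutation.addedNodes)) {
      if (added instanceof Element && added.matches('div[role="article"]')) {
        recordPost(added);
      }
    }
  }
});

observer.observe(document.body, { childList: true, subtree: true });

// Periodically donate the collected metadata to a collector the user
// has opted into. The URL below is a placeholder, not a real endpoint.
setInterval(() => {
  if (observed.length === 0) return;
  void fetch('https://example.org/api/donate', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(observed.splice(0, observed.length)),
  });
}, 60_000);
```

The details of the real extension differ; the point of the sketch is simply that the ranked output of a personalisation algorithm, as each user actually sees it, can be captured client-side and shared only by deliberate choice of the user.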