
Welcome to DATACTIVE’s spin-off ALEX! An interview with fbtrex Lead Developer Claudio Agosti

by Tu Quynh Hoang and Stefania Milan

DATACTIVE is proud to announce that its spin-off project ALEX has been awarded a Proof of Concept grant by the European Research Council. ALEX, which stands for “ALgorithms Exposed (ALEX). Investigating Automated Personalization and Filtering for Research and Activism”, aims at unmasking the functioning of personalization algorithms on social media platforms, initially taking Facebook as a test case. ALEX marks the engagement of DATACTIVE with “data activism in practice”, that is to say, turning data into a point of intervention in society.

To mark the occasion, we publish an interview with Claudio Agosti, DATACTIVE Research Associate and Lead Developer of the facebook.tracking.exposed browser extension (fbtrex), whose open-source code is at the core of the ALEX project. Claudio was interviewed by DATACTIVE Principal Investigator Stefania Milan at the Internet Freedom Festival in Valencia, Spain, in relation to a project on content regulation on/by platforms.

Claudio (also known as vecna) is a self-taught technician in digital security. With the internet gradually becoming a central agent in the political debate, he moved from corporate security services to the defence of human rights in the digital sphere. Currently, he is exploring the influence of algorithms on society. Claudio is the coordinator of the free software projects behind https://tracking.exposed and a Founding Member and Vice-President of the Hermes Center for Transparency and Digital Human Rights.

Stefania: Is the spread of fake news predominantly a technical or social problem?

Claudio: It is a social problem in the sense that the lack of critical judgment in individuals creates the conditions for fake news or misinformation to spread. However, through technology, the dissemination of misinformation is much faster and can scale up. The problem we are facing now is that when the costs of spreading content drop, the possibility for an individual to deliver a successful information operation (or infops, a term I feel is more accurate than propaganda) is higher. However, it isn’t true that people lack critical judgment in absolute terms. At a personal level, one can only be knowledgeable about a limited range of subjects, but the information we receive is very diverse and, most of the time, outside our domain of expertise. As social media users and information consumers, we should have a way to validate that information, but would we even know how to validate it on our own? Such a mechanism does not exist in mainstream news media either. It is possible, for example, on Wikipedia, but everywhere else the way information is spread assumes that it is true on its own. A news report, a blog post or a status update on social media does not contain any information that helps validation. All in all, I think fake news is simultaneously a technical and a political problem, because those who create and spread information have a responsibility towards user expectations, and this also shapes users’ vulnerability to infops.

Stefania: As a developer, what is your main goal with the facebook.tracking.exposed browser extension?

Claudio: At the moment we don’t have the tools to assess responsibility with respect to infops. If we say that fake news is a social problem because people are gullible, we put the responsibility on users/readers. But it’s also a problem of those publishing the information, who allow themselves to publish incorrect information because they will hardly be held accountable. According to some observers, social media platforms such as Facebook are to be blamed for the spread of misinformation. We have three actors: the user/reader, the publisher, and the platform. With facebook.tracking.exposed, I’m trying to collect actual data that allows us to reflect on where the responsibilities lie. For example, sometimes Facebook is thought to be responsible when in fact the responsibility rests with the content publisher. And sometimes the publishers are to blame, but are not legally responsible. We want to collect actual data that can help investigate these assumptions. We do so from an external, neutral position.

Stefania: Based on your studies of the spread of information on social media during the recent elections in Argentina and Italy, can you tell us what the role of platforms is, and of Facebook in particular?

Claudio: In the analyses we did in Argentina and Italy, we realized that there are two accountable actors: the publisher and the platform. Some of the publishers are actually spamming users’ timelines by producing too many posts per day. I find it hard to believe that they are producing quality content that way. They just aim at occupying users’ timelines to exploit a few seconds of their attention. In my opinion, this is to be considered spam. What we also found is that Facebook’s algorithms are completely arbitrary in deciding what a user is or is not going to see. It’s frightening when we consider that a person who displays some kind of deviant behavior, such as reading and sharing only fascist or racist content, will keep being exposed to even less diverse content. From our investigations of social media content during two heated election campaigns, we have evidence that if a person expresses populist or fascist behavior, the platform is designed to show her less diverse information in comparison to other users, and that can only reinforce her position. We can also argue that the information experience of that person is of lower quality, assuming that maximum information exposure is always to be preferred.
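
To give a concrete sense of what “less diverse information” means in measurable terms, one simple metric a tool like fbtrex could compute is the Shannon entropy of the sources appearing in a user’s observed timeline. The sketch below is purely illustrative: the post structure and field names are hypothetical assumptions, not fbtrex’s actual schema.

```python
from collections import Counter
from math import log2

def source_diversity(posts):
    """Shannon entropy (in bits) of the distribution of sources in a
    list of observed timeline posts: 0.0 means every post comes from
    a single source; higher values mean a more varied information diet."""
    counts = Counter(post["source"] for post in posts)
    total = sum(counts.values())
    return -sum((n / total) * log2(n / total) for n in counts.values())

# Two hypothetical users observed on the same day:
user_a = [{"source": s} for s in ["pageX"] * 18 + ["pageY"] * 2]
user_b = [{"source": s} for s in ["pageX", "pageY", "pageZ", "pageW"] * 5]

print(source_diversity(user_a))  # ~0.47 bits: a near-monoculture timeline
print(source_diversity(user_b))  # 2.0 bits: four sources, evenly mixed
```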

Stefania: So what can users do to fix this problem? 

Claudio: I think users should be empowered to run their own algorithms, and they should have better tools at their disposal to select the sources of their information diets. This also has to become a task for information publishers. Although everybody on social media is both a publisher and a consumer, people who publish as their daily job bear ever more responsibility. For example, they should create much more metadata to go along with information, so as to permit the system to better filter and categorize content. Users, on the other hand, should have these tools in hand. When we don’t have that set of metadata, and thus the possibility to define our own algorithm, we have to rely on Facebook’s algorithms. But Facebook’s algorithms implicitly promote Facebook’s agenda and its capitalist imperative of maximizing users’ attention and engagement. For users to have the possibility of defining their own algorithms, we should first of all create the need and the interest to do so, by showing how much of the algorithm is the platform’s agenda and how it can really influence our perception of reality. That is what I’m doing now: collecting evidence about these problems and trying to explain them to a broader audience, raising awareness amongst social media users.
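
To illustrate what “running your own algorithm” could look like in practice, here is a minimal sketch of a user-defined feed that relies on publisher-supplied metadata, in the spirit of what Claudio describes. All field and parameter names are hypothetical, not part of fbtrex; the point is that filtering and ordering follow the user’s explicit preferences rather than an engagement objective.

```python
from datetime import datetime

def my_timeline(posts, trusted_sources, muted_topics):
    """A user-defined feed: keep only posts from sources the user
    trusts, drop muted topics, and order strictly by time -- no
    engagement-maximizing ranking involved."""
    kept = [
        p for p in posts
        if p["source"] in trusted_sources
        and not set(p.get("topics", [])) & muted_topics
    ]
    return sorted(kept, key=lambda p: p["published_at"], reverse=True)

# Hypothetical posts carrying publisher-supplied topic metadata:
posts = [
    {"source": "pageX", "topics": ["politics"], "published_at": datetime(2018, 6, 1, 9)},
    {"source": "pageY", "topics": ["sports"], "published_at": datetime(2018, 6, 1, 11)},
    {"source": "pageZ", "topics": ["politics"], "published_at": datetime(2018, 6, 1, 10)},
]

for p in my_timeline(posts, trusted_sources={"pageX", "pageY"}, muted_topics={"sports"}):
    print(p["source"], p["published_at"])
```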

Stefania: Do you think we should involve governments in the process? From your perspective as a software developer, do you think we need more regulation?

Claudio: Regulation is really key, because it’s important to keep corporations in check. But I’m afraid that, among other things, there is a misunderstanding in making regulations which seem to have direct benefits for people’s lives but might, for example, end up limiting some of the key features of open source software and its distribution. Therefore I’m quite skeptical. I have to say that high-level regulations like the General Data Protection Regulation do not try to regulate the technology but rather its effects, and in particular data usage. They are quite abstract and distant from the technology itself. If regulators want to tell companies what to do and what not to do, I’m afraid that in the democratic competition of the technical field the probability of making mistakes is higher. On the other hand, if we just regulate what users/consumers produce explicitly, we would end up reinforcing the goals of the data corporations even more. So far, regulations have in fact been exploited by the biggest fish in the market. In this game we can distinguish three political entities: users, companies, and governments. In retrospect, we see that there have been cases where companies have helped citizens against governments and, in other cases, governments have helped citizens against companies. I hope we can aggregate users and civil society organizations around our project, because that is the political entity most in need of guidance and support.

Stefania: So the solution ultimately lies with users?

Claudio: The problem is complex, thus the solution can’t be found in only one of the three entities. With ALEX we will have the opportunity to re-use our data with policies we determine, and therefore try to produce features which can, at least, offer a new social imaginary.

First of all, we aim at promoting diversity. Fbtrex will provide users with tools for comparing their social media timelines to those of other users, based on mutual sharing agreements which put the users, rather than the company, in the driver’s seat. The goal is to involve and compare a diverse group of users and their timelines across the globe. In so doing, we empower users to understand what is hidden from them on a given topic. Targeted communication and user-defined grouping, as implemented on most social media, lead to a fragmentation of knowledge. Filtered interactions confirming a user’s position have been complicit in this fragmentation. Our approach doesn’t intend to solve these technocratic subterfuges with other technological fixes, but to let users explore the diversity.
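
Conceptually, such a mutual comparison boils down to a set difference over the posts each consenting user was shown. A minimal sketch, with hypothetical post IDs and no claim to reflect fbtrex’s actual data model:

```python
def compare_timelines(mine, theirs):
    """Given the sets of post IDs two consenting users were shown,
    return what each side saw that the other did not -- i.e. what
    the ranking algorithm hid from whom."""
    return {
        "hidden_from_me": theirs - mine,
        "hidden_from_them": mine - theirs,
        "shown_to_both": mine & theirs,
    }

# Hypothetical observations for two users following the same pages:
mine = {"post-1", "post-2", "post-3"}
theirs = {"post-2", "post-3", "post-4", "post-5"}
print(compare_timelines(mine, theirs))
```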

In fact, the fragmentation of information and individuals produced by social media has made it even more difficult for users to relate to problems far removed from their reality. How do you understand the problems of migrants, for example, if you have never been away from home yourself, and you don’t spend time in their company? To counter this effect, thanks to the funding of the European Research Council, we will work on an advanced functionality which will… turn the so-called filter bubbles against themselves, so to speak.

Secondly, we want to support delegation and fact-checking, enabling third-party researchers to play a role in the process. The data mined by fbtrex will be anonymized and provided to selected third-party researchers, either individuals or collectives. These will be enabled to contextualize the findings, combine them with other data, and complement them with data obtained through other social science research methods such as focus groups. But, thanks to the innovative data reuse protocols we will devise, at any given moment users, as data producers, will have a say as to whether and how they want to volunteer their data. We will also work to create trusted relationships and networks with researchers and users.
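
A common building block for this kind of anonymized release is salted pseudonymization: real identifiers are replaced with stable tokens before any data leaves the project. The sketch below illustrates the general technique only; it is not a description of fbtrex’s actual pipeline, and the record fields are hypothetical.

```python
import hashlib
import secrets

# A per-release random salt, kept by the project and never shared, so
# researchers cannot brute-force pseudonyms back to real identifiers.
SALT = secrets.token_bytes(32)

def pseudonymize(user_id: str) -> str:
    """Map a real user identifier to a stable pseudonym: the same user
    gets the same token within one data release, but the token itself
    reveals nothing about the underlying account."""
    return hashlib.sha256(SALT + user_id.encode()).hexdigest()[:16]

# A record as it might be handed to researchers (fields hypothetical):
record = {"user": pseudonymize("some-user-id"), "source": "pageX", "seen_at": "2018-06-01T09:00"}
print(record)
```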

In conclusion, if users really want to be free, they have to be empowered to exercise their freedom. This means: they have to own their own data, run their own algorithms, and understand the political implications behind technological decision-making. To resort to a metaphor, this is exactly the difference between dictatorship and democracy: you can believe or accept that someone will do things for your own good, as in a dictatorship, or you can decide to assume your share of responsibility, taking things into your own hands and trying to do what is best for you while respecting others, which is exactly what democracy teaches us.

***

ALEX is a joint effort by Claudio Agosti, Davide Beraldo, Jeroen de Vos and Stefania Milan.

See more: the news in Dutch, the press release by the ERC, and our project featured in the highlights of the call.

Stay tuned for details.

The new website https://algorithms.exposed will go live soon!


Stefania at AlgoSov Summit in Copenhagen, 8 September

On the 8th of September Stefania will give a talk at the Algorithmic Sovereignty Summit in Copenhagen, in the framework of the TechFestival, a festival “to find human answers to the big questions of technology”.

The Summit is an initiative of Jaromil Rojo from Dyne.org, who also sits on DATACTIVE’s Ethics Advisory Board. The summit kickstarts the European Observatory on Algorithmic Sovereignty.

DATACTIVE welcomes a new member to the team

DATACTIVE welcomes its newest addition, Lara AlMalakeh, who joins the team as Managing Editor of the project’s three blogs.

Lara has just obtained an MA in Comparative Cultural Analysis from the University of Amsterdam. Before that she obtained a postgraduate qualification in Principles and Practice of Translation from City University London. Lara wears many hats: she initially studied Fine Arts at Damascus University and graduated with a specialization in oil painting. She then built a career in administration spanning over 12 years in Dubai, UAE. During that time, she joined Global Voices as the Arabic language editor and started translating for NGOs specializing in advocacy and digital rights, namely the Electronic Frontier Foundation and First Draft News.

DATACTIVE is glad about the diversity that this new addition brings to its ensemble of academics and activists. The team is looking forward to leveraging the various skills and attributes Lara brings along, whether from her professional background or her various involvements in the activism sphere.

Lara is a proud mother of two girls under 10. She enjoys discussing politics and debating art over a beer. Her newfound love is philosophy, and she dreads bikes.

XRDS Summer 2018 issue is out, with contributions from DATACTIVE

The latest issue of XRDS – The ACM Magazine for Students is out. The issue has been co-edited by our research associate Vasilis Ververis and features contributions by four of us: Stefania Milan, Niels ten Oever, Davide Beraldo, and Vasilis himself.

  1. Stefania’s piece ‘Autonomous infrastructure for a suckless internet’ explores the role of politically motivated techies in rethinking a human rights respecting internet.
  2. Niels and Davide, in their ‘Routes to rights’, discuss the problems of ossification and commercialization of internet architecture.
  3. Vasilis, together with Gunnar Wolf (also editor of the issue), has written on ‘Pseudonymity and anonymity as tools for regaining privacy’.

XRDS (Crossroads) is the quarterly magazine of the Association for Computing Machinery. You can read the full issue here.

Stefania keynotes at ‘The Digital Self’ workshop, King’s College London, July 6

On July 6, Stefania will deliver a keynote at the workshop ‘The Digital Self’, organized by the Departments of Digital Humanities and of Culture, Media & Creative Industries of King’s College London. This workshop focuses on how digital technology influences our daily lives, how it re-shapes culture, and, as a result, how our identities as workers, consumers, and media and cultural producers are changing. Stefania’s keynote is entitled ‘Identity and data infrastructure’. Read more.


Kersti presenting during RNW Media Global Weeks, July 6

With her talk ‘NGO Ethics in the Digital Age: How to Work with Data Responsibly’, Kersti will address a transnational team of media and advocacy practitioners during RNW Media‘s annual summit.

RNW Media works with youth in fragile and repressive states, aiming to empower young women and men to unleash their own potential for social change. As the organization has transitioned from a traditional international broadcaster towards increasing engagement in advocacy activities using digital means and data, it has reached a critical moment in moving towards a responsible data approach.

Kersti is advising RNW Media on the process of developing a responsible data framework and the respective program strategies.

[DATACTIVE event] Democracy Under Siege: Digital Espionage and Civil Society Resistance, July 4


July 4th, 20:00 hrs @ Spui25 (TICKETS HERE)

The most recent US elections, during which hackers exposed political parties’ internal communications, revealed the devastating power of digital espionage. But election meddling is only one aspect of this growing phenomenon. From Mexico to Egypt and Vietnam, human rights organizations, journalists, activists and opposition groups have been targeted by digital attacks. How can civil society defend itself against such threats?

The DATACTIVE project (University of Amsterdam) invites you to hear from leading experts on questions of digital espionage, cybersecurity and the protection of human rights in new technological environments. This public event aims to provide a global view of digital threats to civil society and discuss what can be done to fight back.

Ron Deibert (University of Toronto) will present the work of the Citizen Lab, which has pioneered investigation into information controls, covert surveillance and targeted digital espionage of civil society worldwide. He will be in conversation with Seda Gürses (KU Leuven) and Nishant Shah (ArtEZ University of the Arts/Leuphana University).

Speakers

Ronald Deibert is Professor of Political Science and Director of the Citizen Lab at the Munk School of Global Affairs, University of Toronto. The Citizen Lab undertakes interdisciplinary research at the intersection of global security, ICTs, and human rights. Deibert is the author of Black Code: Surveillance, Privacy, and the Dark Side of the Internet (Random House: 2013), as well as numerous books, chapters, articles, and reports on Internet censorship, surveillance, and cyber security. He was a founder and principal investigator of the OpenNet Initiative (2003-2014) and is a founder of Psiphon, a world leader in providing open access to the Internet.

Seda Gürses is an FWO post-doctoral fellow at COSIC/ESAT in the Department of Electrical Engineering at KU Leuven, Belgium. She works at the intersection of computer science, engineering and privacy activism, with a focus on privacy enhancing technologies. She studies conceptions of privacy and surveillance in online social networks, requirements engineering, software engineering and algorithmic discrimination and looks into tackling some of the shortcomings of the counter-surveillance movements in the US and EU.

Nishant Shah is the Dean of the Graduate School at ArtEZ University of the Arts, The Netherlands, Professor of Culture and Aesthetics of Digital Media at Leuphana University, Germany, and the co-founder of the Centre for Internet & Society, India. His work is informed by critical theory, political activism, and equality politics. He identifies as an accidental academic, radical humanist, and an unapologetic feminist, with particular interests in questions of life, love, and language. His current preoccupations are around digital learning and pedagogy, ethics and artificial intelligence, and being human in the face of seductive cyborgification.

This event is hosted by Spui25 and sponsored by the European Research Council (ERC) and the Amsterdam School of Cultural Analysis (ASCA).

Stefania at the Future Forum of IG Metall, Berlin

On June 19, Stefania will deliver a talk at the Zukunftsforum (‘Future Forum’) of IG Metall, the dominant metalworkers’ union in Germany and Europe’s largest industrial union. Stefania has been asked to reflect on how digitalisation and datafication change the dynamics of solidarity today. Check out the program of the day.

AGENDA Future Forum IG Metall 19.6.2018_final_eng

Guillén at the CRISP biennial Doctoral Training School

Every two years, the Centre for Research into Information, Surveillance and Privacy (CRISP) offers a summer school focused on Surveillance Studies. This time, it is the turn of the University of St Andrews (yes, the one where Kate and William fell in love! *sarcastic wink*) to host, from Monday 18 to Friday 22 June.

The one-week course will feature sessions on the intersections between surveillance and religion, activism, privacy, and freedom of information, as well as a two-day intensive research proposal competition. The activities are coordinated by Prof. Kirstie Ball.

EDIT: The team Guillén was a part of won the research proposal competition, getting awarded 2.5 million euros of fake funding for a project called “New Lateral Surveillance in Naming and Shaming Culture: The Impact of Viral Media on Liberal Democracies”. You can check the slides here.

“error 404. social life not found” at TEDxYouth AICS, June 4

On June 4, Stefania will give a TEDx talk at the TEDx Youth event of the Amsterdam International Community School. Entitled ‘error 404. social life not found. (but you can take it back)’, Stefania’s talk will contribute to this year’s theme ‘Next Nature’: what is the next nature of the human experience as we enter the technological age of big data, consumerism and automation? Read more on the TED website. You can also download the presentation.