Author: Stefania

Workshop ‘Big Data from the South: Towards a Research Agenda’, Amsterdam, December 4-5

What would datafication look like seen… ‘upside down’? What questions would we ask? What concepts, theories and methods would we embrace or have to devise? These questions are at the core of the two-day research seminar ‘Big Data from the South: Towards a Research Agenda’, scheduled to take place at the University of Amsterdam on December 4-5, 2018. The event is the third gathering of the Big Data from the South Initiative, launched in 2017 by Stefania Milan and Emiliano Treré (Cardiff University). It interrogates ‘Big Data from the South’, moving beyond the Western centrism and ‘digital universalism’ (Say Chan, 2013) of much of the critical scholarship on datafication and digitalization. It allows the Initiative to advance in charting its field of inquiry, bringing into the conversation practitioners from various corners of the globe and scholars from media studies, development studies, law, globalization studies, philosophy, science and technology studies, and critical data studies (and counting).

Watch the event here.

The event is made possible by the generous funding of the Amsterdam Center for Globalization Studies, the Amsterdam Center for European Studies, the Amsterdam School of Cultural Analysis, and the European Research Council. With the participation of SPUI25 and Terre Lente.

Rationale

The workshop builds on the work of DATACTIVE and the Data Justice Lab in thinking through the relation between data, citizenship and participation, but goes beyond it, engaging with a much-needed debate at the intersection of feminist theory, critical theory, and decolonial thinking which, ‘thinking in radical exteriority’ (Vallega, 2015, p. x), interrogates the coloniality of power. It also intends to contribute to the ongoing epistemological repositioning of the humanities and the social sciences in light of rising inequality. We depart from the observation that, ‘while the majority of the world’s population resides outside the West, we continue to frame key debates on democracy and surveillance—and the associated demands for alternative models and practices—by means of Western concerns, contexts, user behavior patterns, and theories’ (Milan and Treré, 2017). If, on the one hand, ‘we need concerted and sustained scholarship on the role and impact of big data on the Global South’ (Arora, 2015, p. 1693), on the other, ‘new’ theory and ‘new’ understandings are key, for ‘if the injustices of the past continue into the present and are in need of repair (and reparation), that reparative work must also be extended to the disciplinary structures that obscure as much as illuminate the path ahead’ (Bhambra & De Sousa Santos, 2017, p. 9). Thus, this event will be a stepping stone towards rethinking the sociotechnical dynamics of datafication in light of ‘the historical processes of dispossession, enslavement, appropriation and extraction […] central to the emergence of the modern world’ (Ibid.).

But what South are we referring to? First, our definition of ‘South’ is flexible and expansive, inspired by the writings of globalization sociologist Boaventura De Sousa Santos (2014), who is at the forefront of the reflection on the emergence and urgency of epistemologies from the South against the ‘epistemicide’ of neoliberalism. Including but also going beyond the geographical South, and emphasising the plurality of the South(s), our South is a place for and a metaphor of resistance, subversion, and creativity. Secondly, our notion emerges in dialectic interaction between the continuous critical interrogation and situating of our privilege as Western academics and the imperative to do ‘nothing about them without them’ (see Milan and Treré, 2017).

Participants (in alphabetical order)

Carla Alvial (NUMIES, Chile), Payal Arora (Erasmus University Rotterdam), Sérgio Barbosa (University of Coimbra), Davide Beraldo (UvA), Enrico Calandro (Research ICT Africa), Bernardo Caycedo (UvA), Fabien Cante (University of Birmingham), Alberto Cossu (UvA), Nick Couldry (LSE), Álvaro Crovo (ISUR, Colombia), Monika Halkort (American University of Lebanon), Becky Kazansky (UvA), Anja Kovacs (The Internet Democracy Project), Merlyna Lim (Carleton University), Joan Lopez (Fundación Karisma), Aaron Martin (Tilburg University), Silvia Masiero (Loughborough University), Ulises Mejias (SUNY Oswego), Stefania Milan (UvA), Hellen Mukiri-Smith (Tilburg University), Nelli Piattoeva (University of Tampere), Anita Say Chan (Illinois, Urbana-Champaign), Gabriela Sued (Tecnologico de Monterrey), Anna Suman (Tilburg University), Linnet Taylor (Tilburg University), Gunes Tavmen (Birkbeck College), Niels ten Oever (UvA), Emiliano Treré (Cardiff University), Guillen Torres (UvA), Etienne von Bertrab (UCL), Norbert Wildermuth (Roskilde University), Kersti Wissenbach (UvA)

Schedule 

DAY 1, December 4th
15.00-16.30

@UvA library, Singel 425, room ‘Belle van Zuylen’

Open session: Can Data be Decolonized? Data Relations and the Emerging Social Order of Capitalism, with Nick Couldry (London School of Economics and Political Science) & Ulises A. Mejias (State University of New York at Oswego)

This talk (which draws on the authors’ forthcoming book from Stanford University Press, The Costs of Connection: How Data is Colonizing Human Life and Appropriating it for Capitalism) examines how contemporary practices of data extraction and processing replicate colonial modes of exploitation. Couldry and Mejias present the concept of “data colonialism” as a tool to analyze emerging forms of political control and economic dispossession. To that effect, their analysis engages the disciplines of critical political economy, sociology of media, and postcolonial science and technology studies to trace continuities from colonialism’s historic appropriation of territories and material resources to the datafication of everyday life today. While the modes, intensities, scales and contexts of dispossession have changed, the underlying function remains the same: to acquire resources from which economic value can be extracted. Just as historic colonialism paved the way for industrial capitalism, this phase of colonialism prepares the way for a new economic order. In this context, the authors analyze the ideologies and rationalities through which “data relations” (social relations conducted and organized via data processes) contribute to the capitalization of human life. Their findings hold important implications for how we study the internet, and how we may advocate for the decolonization of data in the future.

Chair: Stefania Milan (DATACTIVE, University of Amsterdam)

17.00-19.30 @Terre Lente, Westerstraat 55: Informal research session with light dinner & drinks (for subscribed participants only)
20.00-21.30 @SPUI25, Spui 25: Public event: Big Data from the South: Decolonization, Resistance and Creativity, with Payal Arora (Erasmus University Rotterdam), Nick Couldry (London School of Economics), Merlyna Lim (Carleton University) and Ulises A. Mejias (State University of New York, College at Oswego).

Datafication has dramatically altered the way we understand the world around us. Understanding so-called ‘big data’ means exploring the profound consequences of the computational turn, as well as the limitations, errors and biases that affect the gathering of, interpretation of, and access to information on such a large scale. However, much of this critical scholarship has emerged along a Western axis ideally connecting Silicon Valley, Cambridge, MA and Northern Europe. What does it mean to think datafication from a Southern perspective? This roundtable interrogates the mythology and universalism of datafication and big data, moving beyond the Western centrism and ‘digital universalism’ (Say Chan, 2013) of the critical scholarship on datafication and digitalization. It asks: what would datafication look like seen… ‘upside down’? What problems should we address? What questions would we ask? We will explore these questions in conversation with four engaged academics: Payal Arora (Erasmus University Rotterdam), Nick Couldry (London School of Economics), Merlyna Lim (Carleton University), and Ulises A. Mejias (State University of New York, Oswego).

Chair: Stefania Milan (DATACTIVE, University of Amsterdam)
Moderator: Emiliano Treré (Data Justice Lab, Cardiff University)

Drinks will follow!

DAY 2, December 5th @e-lab, UvA Media Studies, Turfdraagsterpad 9 (for subscribed participants only)
10.00-10.15 Welcome by Stefania Milan (coffee & tea in the room!)
10.15-11.00 Setting the scene by Stefania and Emiliano Treré
11.00-11.45 Workgroup slot 1
11.45-12.30 Workgroup slot 2
12.30-12.40 Short presentation by Tecnológico de Monterrey (Mexico)
12.40-13.30 Lunch served in the room (by Terre Lente)
13.30-14.15 Workgroup slot 3
14.15-15.00 Workgroup slot 4
15.00-15.45 Workgroup slot 5
15.45-16.00 Stretching break
16.00-17.00 Plenary session: Reporting back and next steps

Follow the conversation online with the hashtag #BigDataSur

Check out the blog and subscribe to the mailing list!

 

Stefania at the Pathways to impact in the SSH research conference, Vienna, November 28-29

Stefania will contribute to the conference ‘Pathways to Impact in Social Science and Humanities (SSH) Research’, taking place in Vienna on November 28-29 in the framework of the Austrian Presidency of the European Union. Stefania serves on the Scientific Committee supporting the organization of the conference. In Vienna she will chair the session “Valuation pathways of SSH – drivers, barriers, successes and failures”. Both days of the conference are streamed live.

 

Stefania speaks at Falling Walls 2018 in Berlin

DATACTIVE PI Stefania Milan is in Berlin on November 8-9, as an invited speaker at the Falling Walls conference 2018. Falling Walls is an annual science event that coincides with the anniversary of the Fall of the Berlin Wall. The one-day scientific conference showcases the research work of international scientists from a wide range of fields. Stefania’s presentation will revolve around the theme of data empowerment. Check out the conference program and the description. The event is streamed live.

Stefania will also attend the Falling Walls Circle, whose theme this year is “Human genius in the age of Artificial Intelligence”.

Two more articles of the special issue “Big Data from the South” are online!

After the “teaser” by Nick Couldry and Ulises A. Mejias, two more articles of the Special Issue on “Big Data from the South” have now gone online! Happy reading!

Decolonizing Privacy Studies by Payal Arora

This paper calls for epistemic disobedience in privacy studies by decolonizing the approach to privacy. As technology companies expand their reach worldwide, the notion of privacy continues to be viewed through an ethnocentric lens. It disproportionately draws from empirical evidence on Western-based, white, and middle-class demographics. We need to break away from the market-driven neoliberal ideology and the Development paradigm long dictating media studies if we are to foster more inclusive privacy policies. This paper offers a set of propositions to de-naturalize and estrange data from demographic generalizations and cultural assumptions, namely, (1) predicting privacy harms through the history of social practice, (2) recalibrating the core-periphery as evolving and moving targets, and (3) de-exoticizing “natives” by situating privacy in ludic digital cultures. In essence, decolonizing privacy studies is as much an act of reimagining people and place as it is of dismantling essentialisms that are regurgitated through scholarship.

(Big) Data and the North-in-South: Australia’s Informational Imperialism and Digital Colonialism by Monique Mann and Angela Daly

Australia is a country firmly part of the Global North, yet geographically located in the Global South. This North-in-South divide plays out internally within Australia given its status as a British settler-colonial society which continues to perpetrate imperial and colonial practices vis-à-vis the Indigenous peoples and vis-à-vis Australia’s neighboring countries in the Asia-Pacific region. This article draws on and discusses five seminal examples forming a case study on Australia to examine big data practices through the lens of Southern Theory from a criminological perspective. We argue that Australia’s use of big data cements its status as a North-in-South environment where colonial domination is continued via modern technologies to effect enduring informational imperialism and digital colonialism. We conclude by outlining some promising ways in which data practices can be decolonized through Indigenous Data Sovereignty but acknowledge these are not currently the norm; so Australia’s digital colonialism/coloniality endures for the time being.

DATACTIVE Speaker Series: Can Data be Decolonized?, December 4

DATACTIVE is proud to announce a talk by Nick Couldry (London School of Economics and Political Science) and Ulises A. Mejias (State University of New York at Oswego) in the framework of the DATACTIVE Speaker Series and on the occasion of the Big Data from the South workshop. The talk, entitled “Can Data be Decolonized? Data Relations and the Emerging Social Order of Capitalism”, will take place on December 4th at 3pm, at the University Library (Potgieterzaal). You will find the blurb below.

Can Data be Decolonized? Data Relations and the Emerging Social Order of Capitalism
A talk by Nick Couldry (London School of Economics and Political Science) and Ulises A. Mejias (State University of New York at Oswego)

This talk (which draws on the authors’ forthcoming book from Stanford University Press, The Costs of Connection: How Data is Colonizing Human Life and Appropriating it for Capitalism) examines how contemporary practices of data extraction and processing replicate colonial modes of exploitation. Couldry and Mejias present the concept of “data colonialism” as a tool to analyze emerging forms of political control and economic dispossession. To that effect, their analysis engages the disciplines of critical political economy, sociology of media, and postcolonial science and technology studies to trace continuities from colonialism’s historic appropriation of territories and material resources to the datafication of everyday life today. While the modes, intensities, scales and contexts of dispossession have changed, the underlying function remains the same: to acquire resources from which economic value can be extracted. Just as historic colonialism paved the way for industrial capitalism, this phase of colonialism prepares the way for a new economic order. In this context, the authors analyze the ideologies and rationalities through which “data relations” (social relations conducted and organized via data processes) contribute to the capitalization of human life. Their findings hold important implications for how we study the internet, and how we may advocate for the decolonization of data in the future.

Stefania at the AoIR 2018 conference, Montreal

DATACTIVE PI Stefania Milan took part in the annual conference of the Association of Internet Researchers, in Montreal (Canada), October 10-13. This year’s conference theme was “Transnational materialities”. Among others, she presented a work in progress, co-authored with Miren Gutierrez (Universidad de Deusto), on the social consequences of engagement with data and data infrastructure. On October 14th, she took part in the academic Festschrift to celebrate the career of Prof. Marc Raboy. The event, entitled Networking Global Communication in and Beyond the Age of Social Media, took place at McGill University.

NEW article out: “Everyday acts of authoritarianism in the liberal West”, International Journal of Communication

DATACTIVE is happy to announce the publication of the article “Through a Glass, Darkly”: Everyday Acts of Authoritarianism in the Liberal West, co-authored by Arne Hintz (Data Justice Lab, Cardiff University) and Stefania Milan, in the International Journal of Communication. The essay is part of a Special Section on “Authoritarian Practices in the Digital Age”, edited by Marlies Glasius and Marcus Michaelsen, University of Amsterdam. The Special Section brings together nine papers that extend our understanding of the relationship between contemporary forms of authoritarianism and digital communication technologies. The contributions investigate Internet control and censorship, surveillance, and disinformation, presenting insights from China, Russia and Central Asia, Iran, Pakistan, Sub-Saharan Africa, and Western Europe. The articles are available in open access. The abstract of Through a Glass, Darkly is below.

“Through a Glass, Darkly”: Everyday Acts of Authoritarianism in the Liberal West

Institutional practices undermining citizen agency and infringing on individual freedoms are typically associated with authoritarian countries. However, they are also proliferating in Western democracies. This article redefines data-based surveillance as a “Western” authoritarian and illiberal practice in the digital realm, resulting from state–industry collaboration and alienated from accountability mechanisms. Straddling critical data studies and surveillance studies, the article explores these dynamics of surveillance in the West by focusing on two dimensions: the institutionalization of governmental practices in law and the societal normalization of surveillance in popular cultural practices. It thus investigates the renegotiation of the boundaries of state power along two axes—top down and bottom up. It connects the notions of “authoritarian and illiberal practices” and “surveillance cultures,” asking how the former are produced, negotiated, and legitimized and reviewing their consequences for citizens and civil society. Based on empirical data from two projects exploring the interplay between citizenship and surveillance, the article argues that acts of authoritarianism in the West are institutionalized at the intersection of top-down governmental practices and bottom-up popular reactions.

Keywords: authoritarian practices, surveillance, surveillance cultures, liberal democracy, Internet freedoms

Why we won’t be at APC 2018

In October 2018, the Amsterdam Privacy Conference (APC) will be back at the University of Amsterdam. Two DATACTIVE project team members, Stefania (Principal Investigator), and Becky (PhD candidate), enthusiastically supported the conference as coordinators of the ‘Digital Society and Surveillance’ theme. The Data Justice Lab at Cardiff University submitted a panel proposal, which was successfully included. Regretfully, neither will take part in the conference: DATACTIVE and the Data Justice Lab have decided to withdraw over the participation of the US-based software company Palantir as one of the APC’s Platinum Sponsors.

Our decision to withdraw stems from an active refusal to legitimize companies accused of enabling human rights abuses, and a concern with the lack of transparency surrounding sponsorship.

Palantir is a company specializing in big data analytics, which develops technologies for the military, law enforcement and border control. The deployment of Palantir’s technologies has raised widespread concern among civil liberties and human rights advocates. Reporting shows that, in the United States, Palantir has played an important role in enabling the efforts of ICE (Immigration and Customs Enforcement) to identify, detain, and deport undocumented immigrants, refugees, and asylum seekers. This has resulted in the indefinite detention of thousands of children who have been separated from their parents. This indefensible policy has come under strong criticism from the United Nations and prompted an alliance of technology workers and affected communities to call – so far, unsuccessfully – for Palantir to cancel its contracts with ICE.

We feel that providing Palantir with a platform, as a sponsor of a prominent academic conference on privacy, significantly undermines efforts to resist the deployment of military-grade surveillance against migrants and marginalized communities already affected by abusive policing. 

Because we have organized conferences ourselves, we believe transparency in sponsorship agreements is key. While we praise the APC organizing committee for committing to full transparency, we were not informed of sponsorship agreements until the very last minute. The APC Sponsors page, in addition, was only populated after the participant registration deadline. As conference coordinators and prospective participants, we feel that we were not given the chance to make an informed choice about our contribution.

Sponsorship concerns are not a new issue: the very same controversy, around the involvement of this very same company (as well as others), emerged during the 2015 edition of APC. Though we acknowledge the complexity of corporate sponsorship, we note that other prominent tech policy conferences, such as the Computers, Privacy and Data Protection (CPDP) conference, have recently stopped accepting sponsorship from Palantir. We thus believe this is a good moment for a larger discussion about how conferences should be organized in the future.

Academia—and especially publicly-funded universities—need to consider their role in efforts to neutralize or undermine human rights concerns. Such considerations are particularly pertinent in the context of what has been described as the increased neoliberalization of higher education, in which there is significant pressure to attract and pursue funding from different sources. As academics and as citizens, we will increasingly be asked to make choices of this kind. Hence, we believe it is time to set down a clear set of principles for sponsorship going forward.

 

Amsterdam and Cardiff, 19 September 2018

Stefania Milan and Becky Kazansky (DATACTIVE) & Lina Dencik, Arne Hintz, Joanna Redden, Fieke Jansen (Data Justice Lab)

Welcome to DATACTIVE’s spinoff ALEX! An interview with fbtrex Lead Developer Claudio Agosti

by Tu Quynh Hoang and Stefania Milan

DATACTIVE is proud to announce that its spin-off ALEX project has been awarded a Proof of Concept grant by the European Research Council. ALEX, which stands for “ALgorithms Exposed (ALEX). Investigating Automated Personalization and Filtering for Research and Activism”, aims at unmasking the functioning of personalization algorithms on social media platforms, initially taking Facebook as a test case. ALEX marks the engagement of DATACTIVE with ‘data activism in practice’, that is to say, turning data into a point of intervention in society.

To mark the occasion, we publish an interview with Claudio Agosti, DATACTIVE Research Associate and Lead Developer of the facebook.tracking.exposed browser extension (fbtrex), whose open-source code is at the core of the ALEX project. Claudio was interviewed by DATACTIVE Principal Investigator Stefania Milan at the Internet Freedom Festival in Valencia, Spain, in relation to a project on content regulation on/by platforms.

Claudio (also known as vecna) is a self-taught technician in digital security. With the internet gradually becoming a central agent in the political debate, he moved from corporate security services to the defence of human rights in the digital sphere. Currently, he is exploring the influence of algorithms on society. Claudio is the coordinator of the free software projects behind https://tracking.exposed and a Founding Member and Vice-President of the Hermes Center for Transparency and Digital Human Rights.

Stefania: Is the spread of fake news predominantly a technical or social problem?

Claudio: It is a social problem in the sense that the lack of critical judgment in individuals creates the conditions for fake news or misinformation to spread. However, through technology, the dissemination of misinformation is much faster and can scale up. The problem we are facing now is that when the costs of spreading content drop, the possibility for an individual to deliver a successful information operation (or infops, a term I feel is more accurate than propaganda) is higher. However, it isn’t true that people lack critical judgment in absolute terms. At a personal level, one can only be knowledgeable about a limited range of subjects, but the information we receive is very diverse and, most of the time, outside our domain of expertise. As social media users and information consumers, we should have a way to validate that information. I wonder: what if we knew how to validate it on our own? This does not exist in mainstream news media either. It is possible, for example, on Wikipedia, but anywhere else the way that information is spread implies that information is true on its own. A news report, a blog post or a status update on social media does not contain any information that helps validation. All in all, I think fake news is simultaneously a technical and a political problem, because those who create and spread information have a responsibility towards user expectations, and this also shapes users’ vulnerability to infops.

Stefania: As a developer, what is your main goal with the facebook.tracking.exposed browser extension?

Claudio: At the moment we don’t have the tools to assess responsibility with respect to infops. If we say that fake news is a social problem because people are gullible, we put the responsibility on users/readers. But it’s also a problem of those publishing the information, who allow themselves to publish incorrect information because they will hardly be held accountable. According to some observers, social media platforms such as Facebook are to blame for the spread of misinformation. We have three actors: the user/reader, the publisher, and the platform. With facebook.tracking.exposed, I’m trying to collect actual data that allows us to reflect on where the responsibilities lie. For example, sometimes Facebook is thought to be responsible when in fact it is the responsibility of the content publisher. And sometimes the publishers are to blame, but are not legally responsible. We want to collect actual data that can help investigate these assumptions. We do so from an external, neutral position.

Stefania: Based on your studies of the spread of information on social media during the recent elections in Argentina and Italy, can you tell us what the role of platforms is, and of Facebook in particular?

Claudio: In the analyses we did in Argentina and Italy, we realized that there are two accountable actors: the publisher and the platform. Some of the publishers are actually spamming users’ timelines, as they produce too many posts per day. I find it hard to believe that they are producing quality content in that way. They just aim at occupying users’ timelines to exploit a few seconds of their attention. In my opinion, this is to be considered spam. What we also found is that Facebook’s algorithms are completely arbitrary in deciding what a user is or is not going to see. It’s frightening when we consider that a person who displays some kind of deviant behavior, such as reading and sharing only fascist or racist content, will keep being exposed to even less diverse content. From our investigations of social media content during two heated election campaigns, we have evidence that if a person expresses populist or fascist behavior, the platform is designed to show her less diverse information in comparison to other users, and that can only reinforce her position. We can also argue that the information experience of that person is of lower quality, assuming that maximum information exposure is always to be preferred.

Stefania: So what can users do to fix this problem? 

Claudio: I think users should be empowered to run their own algorithms, and they should have better tools at their disposal to select the sources of their information diets. This also has to become a task of information publishers. Although everybody on social media is both a publisher and a consumer, people who do publishing as their daily job are all the more responsible. For example, they should create much more metadata to go along with information, so as to permit the system to better filter and categorize content. Users, on the other hand, should have these tools in hand. When we don’t have that set of metadata, and thus the possibility to define our own algorithm, we have to rely on Facebook’s algorithms. But Facebook’s algorithms implicitly promote Facebook’s agenda and its capitalist imperative of maximizing users’ attention and engagement. For users to have the possibility of defining their own algorithms, we should first of all create the need and the interest to do so, by showing how much of the algorithm is the platform’s agenda and how it can really influence our perception of reality. That is what I’m doing now: collecting evidence about these problems and trying to explain them to a broader audience, raising awareness amongst social media users.

Stefania: Do you think we should involve the government in the process? From your perspective as a software developer, do you think we need more regulation?

Claudio: Regulation is really key, because it’s important to keep corporations in check. But I’m afraid there is a risk that regulations which seem to have direct benefits for people’s lives might, for example, end up limiting some of the key features of open source software and its distribution. Therefore I’m quite skeptical. I have to say that high-level regulations like the General Data Protection Regulation do not try to regulate the technology but rather its effects, and in particular data usage. They are quite abstract and distant from the technology itself. If regulators want to tell companies what to do and what not to do, I’m afraid that in the democratic competition of the technical field the probability of making mistakes is higher. On the other hand, if we just regulate users’/consumers’ production explicitly, we would end up reinforcing the goals of the data corporations even more. So far, regulations have in fact been exploited by the biggest fish in the market. In this game we can distinguish three political entities: users, companies, and governments. In retrospect, we see that there have been cases where companies have helped citizens against governments and, in some other cases, governments have helped citizens against companies. I hope we can aggregate users and civil society organizations around our project, because that is the political entity most in need of guidance and support.

Stefania: So the solution is ultimately in users?

Claudio: The problem is complex, thus the solution can’t be found in only one of the three entities. With ALEX we will have the opportunity to re-use our data under policies we determine, and therefore try to produce features which can, at least, offer a new social imaginary.

First of all, we aim at promoting diversity. Fbtrex will provide users with tools for comparing their social media timelines to those of other users, based on mutual sharing agreements which put the users—rather than the company—in the driver’s seat. The goal is to involve and compare a diverse group of users and their timelines across the globe. In so doing, we empower users to understand what is hidden from them on a given topic. Targeted communication and user-defined grouping, as implemented on most social media, lead to a fragmentation of knowledge. Filtered interactions confirming a user’s position have been complicit in this fragmentation. Our approach doesn’t intend to solve these technocratic subterfuges with other technological fixes, but to let users explore diversity.
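The timeline-comparison idea described above can be pictured as a simple set difference over the posts each user was actually shown. The sketch below is purely illustrative: the function name, data shapes, and sample posts are hypothetical, not the actual fbtrex schema or implementation.

```python
# Hypothetical sketch of timeline comparison between two consenting users:
# given the posts each user's browser observed in their feed, surface the
# posts the platform showed one user but never showed the other.

def hidden_from(user_timeline, peer_timeline):
    """Return the peer's posts whose IDs never appeared in the user's feed."""
    seen = {post["id"] for post in user_timeline}
    return [post for post in peer_timeline if post["id"] not in seen]

# Illustrative data: what each user's extension recorded from their own feed.
alice = [{"id": "p1", "source": "outlet-A"}, {"id": "p2", "source": "outlet-B"}]
bob = [{"id": "p2", "source": "outlet-B"}, {"id": "p3", "source": "outlet-C"}]

# Posts the ranking algorithm surfaced for Bob but kept from Alice.
for post in hidden_from(alice, bob):
    print(post["id"], post["source"])  # p3 outlet-C
```

The interesting design question sits outside the code: under the mutual sharing agreements mentioned above, each user decides whose timelines they compare against, rather than the platform deciding for them.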

In fact, the fragmentation of information and individuals produced by social media has made it even more difficult for users to relate to problems far removed from their reality. How do you understand the problems of migrants, for example, if you have never been away from home yourself, and you don’t spend time in their company? To counter this effect, thanks to the funding of the European Research Council, we will work on an advanced functionality which will… turn the so-called filter bubbles against themselves, so to speak.

Secondly, we want to support delegation and fact-checking, enabling third-party researchers to play a role in the process. The data mined by fbtrex will be anonymized and provided to selected third-party researchers, either individuals or collectives. These will be enabled to contextualize the findings, combine them with other data, and complement them with data obtained through other social science research methods such as focus groups. But, thanks to the innovative data reuse protocols we will devise, at any given moment users, as data producers, will have a say as to whether and how they want to volunteer their data. We will also work to create trusted relationships and networks between researchers and users.

In conclusion, if users want to be truly free, they have to be empowered to exercise their freedom. This means: they have to own their own data, run their own algorithms, and understand the political implications behind technological decision-making. To resort to a metaphor, this is exactly the difference between dictatorship and democracy: you can believe or accept that someone will do things for your own good, as in a dictatorship, or you can decide to assume your share of responsibility, taking things into your own hands and trying to do what is best for you while respecting others—which is exactly what democracy teaches us.

***

ALEX is a joint effort by Claudio Agosti, Davide Beraldo, Jeroen de Vos and Stefania Milan.

See more: the news in Dutch, the press release by the ERC, our project featured in the highlights of the call

Stay tuned for details.

The new website https://algorithms.exposed will go live soon!