By Jeroen


Algorithms Exposed (ALEX) @MediaLiteracy Challenge

The ‘ALEX’s angels’ team, five members drawn from DATACTIVE, medialab SETUP and the user experience design studio ‘KO nieuwsgierig‘, made it to the next round of the MediaDiamond Challenge with their pitch for a game that helps young adults engage critically with social media personalisation algorithms. The game builds on the logic of algorithm inquiry also used in fbTREX and YtTREX. The team has until October 20th to work on a renewed proposal.

More information:

MediaWijzer, the media literacy organisation in the Netherlands

Algorithms Exposed, the DATACTIVE Proof of Concept trajectory to bring knowledge and software for personalisation-algorithm research to market


“Spotting Sharks”: new Working Paper by Jeroen de Vos

We are happy to announce the ALEX’s Competitor analysis, published as part of the DATACTIVE Working Paper Series, by Jeroen de Vos:

Vos, J. de (2019) “Spotting Sharks: ALEX’s Competitor analysis”, DATACTIVE Working paper series, No 2/2019 ISSN: 2666-0733.


This paper summarizes the output of the competitor analysis for fbTREX conducted as part of the market research for the project Algorithms Exposed (ALEX). fbTREX is a browser plugin that allows harvesting publicly available data on the user’s Facebook timeline; its development is currently hosted by the Algorithms Exposed initiative – an effort to facilitate the repurposing of personal social media data so as to scale up systematic empirical inquiry for academic, educational or journalistic purposes. The desk research is enhanced by several interviews and aims to: 1) create initial insights into existing, potentially competing organisations; 2) analyse the market potential present in a specific field; 3) situate the current understanding of fbTREX in the context of bringing a product to market; and 4) help prioritize the next steps. This research should be read as an intermediate product, which can provide valuable insights to both partners and competitors. Algorithms Exposed is funded by the ERC Proof of Concept grant [grant agreement number 825974].

About Algorithms Exposed
ALEX, a short-cut for “Algorithms Exposed. Investigating Automated Personalization and Filtering for Research and Activism”, aims at unmasking the functioning of personalization algorithms on social media platforms. From an original idea of lead developer Claudio Agosti, ALEX marks the engagement of DATACTIVE with “data activism in practice”—that is to say, turning data into a point of intervention in society. Link to the website.

About the DATACTIVE working paper series
The DATACTIVE Working Paper Series presents results of the DATACTIVE research project. The series aims to disseminate these results in an accessible manner to a wider audience. An editorial committee consisting of the DATACTIVE PI and postdoctoral fellows reviews the quality of the Working Papers. Readers are encouraged to provide the authors with feedback and/or questions.


Facebook’s Anatomy, DMI Summer School II.

Together with the Mercator working group, DATACTIVE had the pleasure of joining the DMI (Digital Methods Initiative) Summer School to work on a special project: Facebook’s Anatomy. As a form of data activism in practice, this project was devoted to dissecting the working mechanisms of the Facebook user interface, split into a more qualitative, visual-language/psychological analysis of the front-end and a more quantitative analysis of the back-end. The analysis tracked the ‘coming to life’ onboarding process and the way in which users are gently nudged and persuaded to enter more personal data through explicit performative steps (think drop-down menus and text bars). This was measured, on the one hand, against the role of language and of colour and placement in the design of this onboarding trajectory. On the other, this sequence of events was matched with the growth of the data inferred from these explicit actions and from implicit input (such as IP address, browser and operating system).

Find the wiki documenting the research here.

Our tentative findings are presented in the slides below. The Facebook Anatomy project sprang from the minds of the Mercator working group and was reworked into a DMI research sprint which accommodated 15 participants. DATACTIVE was represented by Guillen, Davide & Jeroen.


Slides: “The Anatomy of Facebook”



“Auditing the state”: new Working Paper by Guillen Torres

We are happy to announce the first in our 2019 DATACTIVE Working Paper Series, by Guillen Torres:

Torres, G. (2019) “Auditing the State: Everyday Forms of Institutional Resistance in the Mexican Freedom of Information Process”, DATACTIVE Working paper series, No 1/2019 ISSN: 2666-0733.


Governmental transparency through Freedom of Information (FOI) laws has become a standard in modern liberal democracies. However, a recent trend documented by practitioners and academics alike consists of governments stating on paper their support for transparency, but in practice implementing various strategies to limit the flow of information towards engaged citizens, increasing secrecy and opacity. While scholarly attention has mostly focused on the motivations and effects of secrecy within institutions, the consequences experienced by politically engaged citizens have received less interest. In this paper I focus on how information activists experience and make sense of instances of information control during the performance of the FOI process, through a case study set in Mexico. I suggest that the constant denials, delays and obstructions activists face during the process of requesting information can be productively analyzed through the concept of Everyday Forms of Resistance.

About the DATACTIVE working paper series
The DATACTIVE Working Paper Series presents results of the DATACTIVE research project. The series aims to disseminate these results in an accessible manner to a wider audience. An editorial committee consisting of the DATACTIVE PI and postdoctoral fellows reviews the quality of the Working Papers. Readers are encouraged to provide the authors with feedback and/or questions.



Student/volunteer needed for development of the plugin

As part of the spin-off project on algorithmic personalization, the DATACTIVE team is looking for an enthusiastic volunteer who would like to engage in qualitative market research for the development of the tool, a browser plugin that allows users to “re-appropriate” Facebook timeline data for research purposes. The position starts mid-February, two days a week (flexible), for a period of one month, with the possibility of extension should both parties be interested. If interested, please contact project manager Jeroen de Vos before Tuesday February 5th; he can be reached at

More information: DATACTIVE, ALEX (coming soon:


ALEX @DMI winterschool

Between the 7th and 11th of January, Claudio Agosti, Davide Beraldo and Jeroen de Vos from the ALEX / DATACTIVE team took part in the annual Digital Methods Initiative’s Winter School. This was the perfect occasion to kick off a project related to the development and application of a browser extension built to expose the functioning of Facebook’s secretive News Feed algorithm and adopted by the Algorithms Exposed (ALEX) project.

The ALEX project pitch attracted quite some interest, with about 15 people coming together for a week of collaborative thinking, experimenting and analyzing that was as hectic as it was fun. Given the many possible avenues of exploration and the variety of available skills, the group split into two subgroups: one was busy creating brand-new ‘bots’, with an experimental interest in assessing the role of emotional engagement and friendship-making in the Facebook timeline algorithm; the second worked on an existing dataset related to last year’s Italian national elections.

You can take a deeper look at what has been done and the insights that have been collected in the Winter School’s project wiki (TBA) and in the final presentation. Here is a bullet-point summary of the main findings:
• a bot’s life is a dangerous life… you gotta be smart not to be Facebook-killed
• being a bot is not that boring though… many new bot friends are ready to connect to you
• love wins over hate… consistently love-reacting to posts seems to trigger more content on the timeline than consistently “angry”/negative reactions
• tell me what you liked, and I’ll tell you what you’ll see… selectively liking posts from different political orientations affects the political issues you will see on your future timeline
• and lastly, and this is controversial… centre-left-wing bots are more prone to be exposed to controversial content than far-right-wing bots.

Many thanks to the organizers of the Winter School and, of course, to all those who contributed to the Algorithms Exposed group: Iain Emsley, Fatma Yalgin, Hannah Vischer, Victor Pak, Claudio Agosti, Mathilde Simon, Victor Bouwmeester, Yao Chen, Sophia Melanson, Hanna Jemmer, Patrick Kapsch, Giovanni Rossetti, Davide Beraldo, Giulia Corona, Leonardo Sanna, Jeroen de Vos.


#BigDataSur @LASA: An overview by Anita Say Chan

Why study Big Data from the South? This was the question we – the founder of the Big Data from the South Initiative and the author of this blog post – asked by pulling together a three-session workshop and panel series on “Big Data from the South” at the 2018 Latin American Studies Association Conference (LASA), which took place in May of this year in Barcelona, Spain. The timing of the series was auspicious. That very month, the European Union’s new General Data Protection Regulation (GDPR) – a law intended to strengthen EU citizens’ control over personal data and privacy rights, and to ensure that organizations collecting data do so only with a user’s consent and protect it from misuse and exploitation – had come into force. And only months earlier, the Facebook and Cambridge Analytica scandal had come to public light – a case that put the world’s biggest social network at the center of an international scandal involving the manipulation of user data and voter profiles for global misinformation campaigns. The case was all the more significant for demonstrating not only the possibility of hacking electoral processes in the 2016 US presidential election or the UK Brexit referendum campaign, but also for making evident the pre-existing and potentially continuing precarity of global electoral processes well beyond. That very month, while Silicon Valley corporate heads in the US pronounced to publics around the world that they should continue to be trusted – as data’s and Western liberal economies’ foremost technical experts – with the design and management of data ecologies, across the Atlantic, EU political representatives made parallel arguments for renewed public trust (voting scandals aside) around data policy, leveraging their authority as key spokespersons of the Western world’s legal and political expertise.

The varied crises currently facing Western data institutions – private and public alike – gave an immediate urgency to deepening our understanding and analyses of other forms of data practice and processing beyond the given centers of “data” expertise – technical, legal, or otherwise. But the work of this volume demonstrates the breadth of scholarship long underway across varied disciplines and research communities (bridging Latin American and global area studies, communications and new media studies, anthropology, sociology, science and technology studies, and emerging fields like critical data studies) to address such glaring imbalances – to ask what limited forms of citizen and user are indeed “spoken for” under the interests of Western innovation and political centers, and to ask how it is that such particular centers of knowledge production and research are still enabled to speak for (and in place of) the “global rest” – particularly when issues of technology, the digital, and now indeed, data are involved.

In bringing the LASA session series together, we thus noted how critical scholarship had already begun to undertake analyses of the politics surrounding big data – drawing attention to how datafication regimes bring about new and opaque techniques of population management, control, and discrimination – but how such accounts still largely stemmed from scholars based in institutions in the global north. Our aim was thus to build and expand upon such scholarship by engaging dialogues with new and existing work critical of the dominance of Western approaches to datafication, and that aimed towards recognition of the diversity of voices emerging from the Global South. Stressing opportunities for co-learning across dialogues, we tabled a range of questions that included:

  • How does the availability of data bring novel opportunities for research and collaboration across the Global South?
  • How do activists take advantage of big data for social justice advocacy?
  • What initiatives and actors ask for the release of data?
  • What negative consequences of datafication are activists and organizations facing in the Global South?
  • What practices of resistance emerge?
  • What frames of reference, imaginaries, and culture do people mobilize in relation to big data and massive data collection?
  • Which conceptual and methodological frameworks are best suited to capture the complexity and the peculiarities of data activism in the Global South?
  • And which alternative understandings and epistemologies could help us to better address the contested terrain of data power and activism in the Global South, and Latin America in particular?

The shift involved not only a broadening of geographical and political lenses, but also a broadening of frameworks to encompass – alongside the critical work of analyzing datafication regimes under development by state and corporate actors – new frameworks that could take new and existing practices around data activism seriously. Parallel with growing calls for broadening debates in information, technology and new media studies with “decolonial computing” frameworks (Amrute and Murillo 2018; Chan 2018; Philip and Sengupta 2018), such a broadened lens draws from work in Latin American and post-colonial studies around the “decolonization of knowledge” as a means to underscore the significance of the diverse ways through which citizens and researchers in the Global South engage in bottom-up data practices for social change, as well as of the resistance to uses of big data that increase oppression, inequality, or social harm. Indeed, the prominent collective of global scholars who wrote of decolonial thinking and the “decolonial option” in 2007 did so urging a broader recognition of the diverse contexts and agents of knowledge production who long represented “a colonial subaltern epistemology.” They wrote to draw attention to the long and diverse histories of decolonial interventions that emerged to confront the “variegated faces of the colonial wound inflicted [over] five hundred years of… modernity as a weapon of imperial/colonial global expansion” (Mignolo 2007; Mignolo and Escobar 2010).

Writing as researchers bridging conversations and debates across four continents, they renewed critiques of how the colonial underpinnings of global knowledge production continued to reassert Western frames of thought as universal scientific truths. And they underscored how this “historically worked to subordinate and negate ‘other’ frames [and] ‘other’ knowledge,… reproduc[ing] the meta-narratives of the West while discounting or overlooking the critical thinking produced by indigenous, Afro, and mestizos whose thinking… depart not from modernity alone but also from the long horizon of coloniality” (Walsh, 2007: 224). They thus stressed the vitality of “other” forms of knowledge production occurring “beyond the academy” (Mignolo and Escobar 2010:18), and highlighted the de-colonial options enacted by indigenous and other social movement actors as vital to future decolonial projects. Pressing on “the importance of thinking within” and alongside the perspective of these movements (Mignolo and Escobar 2010:19), they urged scholars not only to reimagine their roles as academic documentarians of movements (actors, that is, still dedicated to a reproduction of dominant forms of modern epistemologies), but to decenter their own forms of knowledge practice by beginning to “think with [movements] theoretically and politically.” As such, decolonialists posed the significance of how cultivating a politics of decentralization – and a de-centering of the self as expert and knowledge practitioner – might offer an affront to modernity’s domination of other epistemologies – and might open up possibilities for a more radical politics of inclusion and intentionality of dialogues across lines of difference.

And indeed, the encounter in Barcelona last May drew forth vital and vibrant responses from a diverse range of scholars who together represented more than 20 different research institutions (public and private) across more than a dozen national contexts and four different continents. Building upon the prompt of the editors of this volume, following the first conference on Big Data from the South in Cartagena, Colombia, to imagine what varied southern theories – in vital, vibrant plurality – around big data would entail (Milan and Trere, 2017), the participants of our second workshop collaboratively mapped a terrain marked by a complex of readily identifiable contemporary challenges and possibilities alike. These included varied forms of new datafication practices undertaken by the state – but conducted in fundamental partnership with corporate data industries – that were read as explicitly deleterious to civic forms of critical intervention. These encompassed projects that participants marked as material and techno-cultural articulations of “Nation branding,” “Surveillance” in urban and online spaces alike, growing “Smart City Initiatives” that saw to the “Automation of State Functions within Urban Infrastructures,” growing CCTV-like “Centers of Control with Cameras,” and indeed, “Bureaucracy.”

Other participants marked emerging data-driven projects launched under state and private sector partnerships that – rather than outright excluding or marginalizing civic participation – instead included narrowly-defined forms of citizen inclusion, typically based on recognizable forms of “innovation” practice. This included noticeably growing trends in “Open Government” and “Open Data” initiatives and “Open Innovation Centers” as a means to transform citizens’ perceptions of and relations to the state.

Mapping more promising vectors, participants noted new growth in the use of “media archives” in film, video, literature, or music and civic data collections as resources newly utilized for new citizen-driven projects around “Data Literacy,” “Memory Mappings and Weavings from Neighborhoods” (including those especially marked by conflict and violence, such as those in urban Colombia), “Communal and Neighborhood Open Street Maps”, and “Feminist Mappings of Femicides” and sex-based hate crimes. Participants also marked the development of new practices or use of existing data sets (acquired from either government, public, or corporate data sources) – as practices that drew from existing data resources or infrastructures, and reoriented or hacked them to create fundamentally new technocultural and material resources. This included the “Reappropriation of Stolen Archives” and cultural artifacts taken (whether under colonial powers or in the name of national patrimony) from traditional and indigenous communities, the “Use of Drones to Map Marches” and document potential state abuses, the “Use of Analog Phone Communication between Taxi Drivers” as a means to circumvent smart city programs in Mexico City, and even the outright “Rejection and Refusal” of dominant technology products and solutions, until alternative civic uses might be defined.

Working together over the course of the afternoon-long session, the participants brought to life a number of principles, underscored in the earliest iteration of Big Data from the South, that alternative theories and approaches to big data would entail: to consider the heterogeneity of data practices – coming from state, corporate and also civic actors who could facilitate or resist “datafication” processes; to center decolonial thinking that would attend to alternative practices, imaginaries, and epistemologies in relation to data; to consider the work of infrastructure within diverse contexts in the Global South; and to remain open to dialogue, in its varied vectors, between actors representing diverse and complex realities across “northern” and “southern” worlds.

In conversation, and in consideration of the recent globally-scaled data scandals of 2018 that had brought the legitimacy of national elections and the authority of dominant Western data institutions – private and public alike – into question, the roomful of participants began to collectively map a series of other concerns and problematics that built upon the earlier mappings. This included how data archives and practices had been influenced by community-defined communication infrastructures. It also included attention to the other objects that define people’s day-to-day contact with data resources: everything from seeds to digitally tagged farm animals (objects beyond cell phones and urban smart city infrastructures) might be recognized as implicating datafication in more-than-human worlds. How might such considerations and practices developing within community contexts – practices that draw attention to the rights, responsibilities and obligations around “community data” or “comuni-datos” – emerge as a collective argument and resource to defend, as an alternative to Western frameworks’ privileging of individual privacy rights (or of data as personal property)? How might recognizing the innovation within such work deepen a decolonial data project by decentering recognition of the conventional data expert – the industry-employed or IT-trained data scientist and engineer – towards more everyday forms of data expertise and practice centered around citizens and civic actors? And finally, could taking seriously the work of such processes as Data Dialogues help to forge new convergences, interfaces, or forms of technosociality that could further deepen the ethical debates and the intersectional, inter-allied work needed to energize the development of alternative data practices in the face of the evident global crises that dominant data institutions confront today?

It is worth noting that such a project, and the core concerns of a Data from the South initiative, find ready resonances within existing debates in critical data studies and the growing scholarship around algorithm studies, software and platform studies, and post-colonial computing. And while most of this scholarship has indeed emerged from institutions in the Global North, the varied concerns that scholars within such circles have signaled as key areas for future development indeed point towards potential convergences. This includes a reinforced rejection of data fundamentalism (boyd and Crawford 2012) and of the technological determinism infused within many analyses of algorithms in application, and a fundamental recentering of the human within data-fied worlds and data industries – one that resists the urge to read “algorithms as fetishized objects… and firmly resist[s] putting the technology in the explanatory driver’s seat… A sociological analysis must not conceive of algorithms as abstract, technical achievements, but must unpack the warm human and institutional choices that lie behind these cold mechanisms” (Gillespie 2014; Crawford 2016). It also involves treating data infrastructures, and the underlying algorithms that give them political life, intentionally as ambiguous yet approachable – to develop methodologies that “not only explore new empirical [and everyday] settings” for data politics, including airport security, credit scoring, academic writing, and social media, “but also find creative ways to make the figure of the algorithm productive for analysis… [and] show that mythologies like the algorithmic drama do not have to be reductive but can be rich and complex ‘stories that help people deal with contradictions in social life that can never fully be resolved’” (Mosco 2005, 28; see also Lévi-Strauss 1955).
Finally, in parallel with the approaches for a post-colonial computing that STS and critical informatics scholars have called for (Philip, Irani and Dourish 2010) in developing decolonial computing frameworks that aim for growing “tactics… that expand the transdisciplinary scope of what one needs to know,” developing approaches around and with Data from the South might further aim to develop new interfaces with allied scholars – from across varied disciplines and regions – required to “think within,” between and among the diverse perspectives of wide-ranging and widely-situated movements both inside and outside traditional research spaces. Writing now in the Fall of 2018, as alternative and urgently needed forms of global political imaginaries that no longer take for granted a presumed stability and centrality of Western liberalism and modernity are being called for, such forms of open-ended relating and experimentation indeed yield valuable lessons.



Amrute S. and L. R. Murillo. (2018). “Computing in/from the South.” Catalyst, 4(2).

Andrejevic, M. (2012). Exploitation in the data-mine. In C. Fuchs, K. Boersma, A. Albrechtslund, & M. Sandoval (Eds.), Internet and Surveillance: The Challenges of Web 2.0 and Social Media (pp. 71–88). New York: Routledge.

Arora, P. (2016). Bottom of the Data Pyramid: Big Data and the Global South. International Journal of Communication, 10, 19.

boyd, d., & Crawford, K. (2012). Critical questions for big data: Provocations for a cultural, technological, and scholarly phenomenon. Information, Communication & Society, 15(5), 662–679.

Chan, A. (2014). Networking Peripheries: Technological Futures and the Myth of Digital Universalism. Cambridge, MA: MIT Press.

Chan, A. (2018). “Decolonial Computing and Networking Beyond Digital Universalism.” Catalyst, 4(2).

Crawford, K. (2016). Can an Algorithm be Agonistic? Ten Scenes from Life in Calculated Publics. Science, Technology & Human Values, 41(1), 77–92.

Crawford, K., Miltner, K., and M. Gray. (2014). “Critiquing Big Data: Politics, Ethics, Epistemology,” International Journal of Communication 8, 1663–1672.

Dourish, P. (2016). “Algorithms and their others: Algorithmic culture in context.” Big Data & Society, July–December 2016: 1–11.

Eubanks, V. (2018). Automating Inequality: How High-Tech Tools Profile, Police and Punish the Poor. New York: St. Martin’s Press.

Gillespie, T. (2014). The relevance of algorithms. In T. Gillespie, P. Boczkowski, & K. Foot (Eds.), Media technologies: Essays on communication, materiality, and society (pp. 167–194). Cambridge, MA: MIT Press.

Mignolo, W. D. (2007). Introduction: Coloniality of power and de-colonial thinking. Cultural Studies, 21, (2 -3 March/May), 155 -167.

Mignolo, W. D. & E. A. Escobar (Eds.). (2010). Globalization and the decolonial. London, GB: Routledge Press.

Milan, S., & Trere, E. (2017). Big Data from the South: The Beginning of a Conversation We Must Have.

Mosco, V. (2005). The Digital Sublime: Myth, Power, and Cyberspace. Cambridge, MA: The MIT Press.

Noble, S. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. New York: NYU Press.

O’Neil, Cathy. (2016). Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. London: Allen Lane.

Philip, K., Irani, L. & Dourish, P. (2010). Postcolonial computing: A tactical survey. Science, Technology, & Human Values. 37(1), 3–29.

Philip, K. and A. Sengupta. (2018). “Afterword: Computing in/from the South.” Catalyst, 4(2).

Schäfer, M. & K. van Es, eds. (2017). The Datafied Society: Studying Culture through Data. Amsterdam: Amsterdam University Press.

Walsh, C. (2007). Shifting the geopolitics of critical knowledge: Decolonial thought and cultural studies “others” in the Andes. Cultural Studies 21, (2-3 March/May), 224 -239.

Ziewitz, M. (2015). “Governing algorithms: Myth, mess, and methods.” Science, Technology & Human Values, 41(1), 3–16.


About the author

Anita Say Chan is an Associate Research Professor of Communications in the Department of Media and Cinema Studies at the University of Illinois, Urbana-Champaign. Her research and teaching interests include globalization and digital cultures, innovation networks and the “periphery”, science and technology studies in Latin America, and hybrid pedagogies in building digital literacies. She received her PhD in 2008 from the MIT Doctoral Program in History; Anthropology; and Science, Technology, and Society. Her first book, Networking Peripheries: Technological Futures and the Myth of Digital Universalism, on the competing imaginaries of global connection and information technologies in network-age Peru, was released by MIT Press in 2014. Her research has been awarded support from the Center for the Study of Law & Culture at Columbia University’s School of Law and the National Science Foundation, and she has held postdoctoral fellowships at The CUNY Graduate Center’s Committee on Globalization & Social Change and at Stanford University’s Introduction to Humanities Program. She is a faculty affiliate at the Institute for Computing in Humanities, Arts and Social Sciences (I-CHASS), the Illinois Informatics Institute, the Unit for Criticism and Interpretive Theory, and the Collaborative for Cultural Heritage Management and Policy (CHAMP). She was a 2015-16 Faculty Fellow with the Illinois Program for Research in the Humanities, a 2017-18 Faculty Fellow with the National Center for Supercomputing Applications, and a 2017-19 Faculty Fellow with the Unit for Criticism and Interpretive Theory.


XRDS Summer 2018 issue is out -with contributions from DATACTIVE

The latest issue of XRDS – The ACM Magazine for Students is out. The issue was co-edited by our research associate Vasilis Ververis and features contributions from four of us: Stefania Milan, Niels ten Oever, Davide Beraldo, and Vasilis himself.

  1. Stefania’s piece ‘Autonomous infrastructure for a suckless internet’ explores the role of politically motivated techies in rethinking a human rights respecting internet.
  2. Niels and Davide, in their ‘Routes to rights’, discuss the problems of ossification and commercialization of internet architecture.
  3. Vasilis, together with Gunnar Wolf (also an editor of the issue), has written on ‘Pseudonymity and anonymity as tools for regaining privacy’.

XRDS (Crossroads) is the quarterly magazine of the Association for Computing Machinery. You can read the full issue here.


[DATACTIVE event] Democracy Under Siege: Digital Espionage and Civil Society Resistance, July 4



July 4th, 20.00 hrs @spui25 (TICKETS HERE)

The most recent US elections, during which hackers exposed political parties’ internal communications, revealed the devastating power of digital espionage. But election meddling is only one aspect of this growing phenomenon. From Mexico to Egypt and Vietnam, human rights organizations, journalists, activists and opposition groups have been targeted by digital attacks. How can civil society defend itself against such threats?

The DATACTIVE project (University of Amsterdam) invites you to hear from leading experts on questions of digital espionage, cybersecurity and the protection of human rights in new technological environments. This public event aims to provide a global view of digital threats to civil society and discuss what can be done to fight back.

Ron Deibert (University of Toronto) will present the work of the Citizen Lab, which has pioneered investigation into information controls, covert surveillance and targeted digital espionage of civil society worldwide. He will be in conversation with Seda Gürses (KU Leuven) and Nishant Shah (ArtEZ University of the Arts/Leuphana University).


Ronald Deibert is Professor of Political Science and Director of the Citizen Lab at the Munk School of Global Affairs, University of Toronto. The Citizen Lab undertakes interdisciplinary research at the intersection of global security, ICTs, and human rights. Deibert is the author of Black Code: Surveillance, Privacy, and the Dark Side of the Internet (Random House: 2013), as well as numerous books, chapters, articles, and reports on Internet censorship, surveillance, and cyber security. He was a founder and principal investigator of the OpenNet Initiative (2003-2014) and is a founder of Psiphon, a world leader in providing open access to the Internet.

Seda Gürses is an FWO post-doctoral fellow at COSIC/ESAT in the Department of Electrical Engineering at KU Leuven, Belgium. She works at the intersection of computer science, engineering and privacy activism, with a focus on privacy enhancing technologies. She studies conceptions of privacy and surveillance in online social networks, requirements engineering, software engineering and algorithmic discrimination and looks into tackling some of the shortcomings of the counter-surveillance movements in the US and EU.

Nishant Shah is the Dean of Graduate School at ArtEZ University of the Arts, The Netherlands, Professor of Culture and Aesthetics of Digital Media at Leuphana University, Germany, and the co-founder of the Centre for Internet & Society, India. His work is informed by critical theory, political activism, and equality politics. He identifies as an accidental academic, radical humanist, and an unapologetic feminist, with particular interests in questions of life, love, and language. His current preoccupations are around digital learning and pedagogy, ethics and artificial intelligence, and being human in the face of seductive cyborgification.

This event is hosted by Spui25 and sponsored by the European Research Council (ERC) and the Amsterdam School of Cultural Analysis (ASCA).


[blog] Data by citizens for citizens

Author: Miren Gutierrez

In spite of what we know about how big data are employed to spy on us, manipulate us, lie to us and control us, there are still people who get excited by hype-generating narratives around social media influence, machine learning and business insights. At the other end of the spectrum, there is apocalyptic talk that preaches that we must become digital anchorites in small, secluded and secret cyber-cloisters.

Don’t get me wrong; I am a big fan of encryption and virtual private networks. And yes, the CEOs of the technology corporations have more resources than governments to understand social and individual realities. The consequence of this unevenness is evident because companies do not share their information unless forced or in exchange for something else. Thus, public representatives and citizens lose their capacity for action vis-à-vis private powers.

But precisely because of the severe imbalances in practices of dataveillance (van Dijck 2014) it is vital to consider alternative forms of data that enable the less powerful to act with agency (Poell, Kennedy, and van Dijck 2015) in the era of the so-called “data power”. While the debate on big data is hijacked by techno-utopians and techno-pessimists and the big data progress stories come from the private sector, little is being said about what ordinary people and non-governmental organisations do with data; namely, how data are created, amassed and used by alternative actors to come up with their own diagnoses and solutions.


My new book Data Activism and Social Change talks about how people and organised society are using the data infrastructure as a critical instrument in their quests. These people include fellow action-oriented researchers, number-churning practitioners, and citizens generating new maps, platforms and alliances for a better world. And they are showing a high degree of ingenuity, against the odds.

The starting point of this book is an article in which Stefania Milan and I set the scene, link data activism to the tradition of citizens’ media and lay out the fundamental questions surrounding this new phenomenon (Milan and Gutierrez 2015).

Most of the thirty activists, practitioners and researchers I interviewed and the forty-plus organisations I observed for the book practice data activism in one way or another. In my analysis, I classify them into four not-so-neat boxes. The first are skills transferrers: organisations, such as DataKind, that transfer skills by deploying data scientists into non-governmental organisations so they can work together on projects. Other skills transferrers, for example Medialab-Prado and Civio, create platforms and tools or generate matchmaking opportunities for actors to meet and collaborate on data projects with social goals.

A second group, catalysts such as the Open Knowledge Foundation, sponsors some of these endeavours. A third, journalism producers, can include journalistic organisations, such as the International Consortium of Investigative Journalists, or civil society organisations, such as Civio, that provide analysis to support campaigns and advocacy efforts.


A moment in the “Western Africa’s Missing Fish” map showing irregular fish transshipments being conducted in Senegalese waters. See the interactive map here.

Proper data activists take it further: securing vital information and evidence of human rights abuses in sheltered archives (e.g. The Syrian Archive); recreating stories of human suffering and abuse (e.g. Forensic Architecture’s “Liquid Traces”); tracking illegal fishing and linking it to development issues (e.g. “Western Africa’s Missing Fish”, which I co-led at the Overseas Development Institute); visualising evictions and mobilising crowds to stop them (e.g. in San Francisco and Spain); and mapping citizen data to produce verified and actionable information during humanitarian crises and emergencies (e.g. the “Ayuda Ecuador” deployment of the Ushahidi platform), to mention just a few.

This classification is offered as a heuristic tool to think more methodically about real cases of data activism, and also to guide efforts to generate more projects.

We know datasets and algorithms do not speak for themselves and are not neutral. Data cannot be raw (Gitelman 2013); data and metadata are “made” in processes that are “made” as well (Boellstorff 2013). That is, data are not to be treated as natural resources, inevitable and spontaneous, but as cultural resources that must be curated and stored. And the fact that the data infrastructure is employed in good causes does not abolish the prejudices and asymmetries present in datasets, algorithms, hardware and data processes. But the exciting thing is that, even using flawed technology, these activists get results.

But where do these activists get data from? Because data can be difficult to find…

How do activists get their hands on data?

Corporations do not usually give their data away, and the level of government openness is not fantastic: “1) data is hard (or even impossible) to find online, 2) data is often not readily usable, 3) open licensing is rare practice and jeopardised by a lack of standards” (Global Open Data Index 2017). This lack of open access to public data is shocking considering that it is mostly information about how governments administer everyone’s resources and taxes.

So when governments and corporations do not open their data vaults, people get organised and generate their own data. This is the case of “Rede InfoAmazonia”, a project that maps water quality and quantity based on a network of sensors deployed by communities in the Brazilian Amazon. The map issues alarms to the community when water levels or quality rise above or fall below a range of standard indicators.

In my book, I discuss five ways in which data activists and practitioners can get their hands on data, from the simplest to the most complex: 1) someone else (e.g. a whistle-blower) can offer them the data; 2) they can resort to public data that can be acquired (e.g. automatic identification system signals captured by satellites from vessels) or are simply open; 3) they can build communities to crowdsource citizen data; 4) they can appropriate data through data scraping; and 5) they can deploy drones and sensors to gather images, or obtain data via primary research (e.g. surveys). Again, this taxonomy is offered as a tool to examine real cases.

Of them, crowdsourcing data can be a powerful process. An independent evaluation of the crowdsourced map set up using the Ushahidi platform in Haiti in 2010 found that it tackled “key information gaps” in the early period of the response, before large organisations were operative: it provided geolocalised data to small non-governmental organisations without a field presence, offered situational awareness and rapid information with a high degree of accuracy, and enabled citizens’ decision-making (Morrow, Mock, and Papendieck 2011). The Haiti map marked a transformation in the way emergencies and crises are tackled, giving rise to digital humanitarianism.


Forensic Architecture’s Liquid Traces.

Other forms of obtaining data are quite impressive too. Forensic Architecture’s “Liquid Traces” employed AIS signals, heat signatures of the ships, radar signals and other surveillance technologies to demonstrate that the failure to save a group of 72 people, who had been forced on board an inflatable craft by armed Libyan soldiers on March 27, 2011, was due to callousness, not the inability to locate them. Only nine survived. Another organisation, WeRobotics, helps communities in Nepal analyse and map vulnerability to landslides in a changing climate.

Alliances, maps and hybridisation

From the observation of how these organisations work, I have identified eleven traits that define data activists and organisations.

One interesting commonality is that data activists tend to work in alliances. This sounds quite commonsensical. Either the problems these activists are trying to analyse and solve are too big to tackle on their own (e.g. a humanitarian crisis or climate change), or the datasets they confront are too big (e.g. “Western Africa’s Missing Fish” and the ICIJ’s “Panama Papers” processed terabytes of data). I cannot think of any data project that does not include some form of collaboration.


The first Ushahidi map: Kenyan violence.

Another quality is that data activists often rely on maps as tools for analysis, coordination and mobilisation. Maps are objects bestowed with knowledge, power and influence (Denil 2011; Harley 1989; Hohenthal, Minoia, and Pellikka 2017). The rise of digital cartography, mobile media, data crowdsourcing platforms and geographic information systems reinforces the maps’ muscle. This trend overlaps with a growing interest in crisis and activist mapping, a practice that blends the capabilities of the geoweb with humanitarian assistance and campaigning. In the hands of people and organisations, maps have been a form of political counter-power (Gutierrez 2018). One example is Ushahidi’s first map (see map), set up in early 2008 to bypass an information shutdown during the bloodbath that followed the December 2007 presidential elections in Kenya, and to give voice to the anonymous victims. The deployment allowed victims to disseminate alternative narratives about the post-electoral violence.

The employment of maps is so usual in data activism that I have called this variety of data activism geoactivism, defined precisely by the way activists use digital cartography and often crowdsourced data to provide alternative narratives and spaces for communication and action. InfoAmazonia, dedicated to environmental issues and human rights in the Amazon region, is another organisation specialised in visualising geolocalised data, in this case for journalism and advocacy. I defend the idea that this near-default use of maps has generated a paradigm shift, standardising maps for humanitarianism and activism.


Vagabundos de la chatarra, the book.

Moreover, data activists usually have no qualms about mixing methods and tools from other trades. Not only are many data organisations hybrid, crossing the lines that separate journalism, advocacy, research and humanitarianism; they also combine repertoires of action from different areas. An example is “Los vagabundos de la chatarra”, a year-long project that combined comics journalism, a book, interactive maps, videos and a website to tell the stories of the people who gathered and sold scrap metal for a living on the edges of Barcelona during the economic crisis that started in 2007 (Gutierrez, Rodriguez, and Díaz de Guereñu 2018).

Civio, mentioned before, produces journalism, hosts data projects, and advocates around issues such as transparency, corruption, health and forest fires. “España en llamas” is a project hatched at Civio that, for the first time in Spain, paints a comprehensive picture of forest fires. Civio also opens the data behind these projects.

The values that motivate these data activists include sharing knowledge, collaborating, inspiring processes of social change and justice, uncovering and providing indisputable evidence for them, and deploying collective action powered by indignation as well as hope. These data activists deserve more attention.

*A version of this blog has been published at Medium.


Boellstorff, Tom. 2013. ‘Making Big Data, in Theory’. First Monday 18 (10).

Denil, Mark. 2011. ‘The Search for a Radical Cartography’. Cartographic Perspectives 68.

Gitelman, Lisa, ed. 2013. Raw Data Is an Oxymoron. Cambridge and London: The MIT Press.

Global Open Data Index. 2017. ‘The GODI 2016/17 Report: The State Of Open Government Data In 2017’.

Gutiérrez, Miren. 2018. ‘Maputopias: Cartographies of Knowledge, Communication and Action in the Big Data Society – The Cases of Ushahidi and InfoAmazonia’. GeoJournal 1–20.

Gutiérrez, Miren, Pilar Rodríguez, and Juan Manuel Díaz de Guereñu. 2018. ‘Journalism in the Age of Hybridization: Barcelona. Los Vagabundos de La Chatarra – Comics Journalism, Data, Maps and Advocacy’. Catalan Journal of Communication and Cultural Studies 10 (1): 43-62.

Harley, John Brian. 1989. ‘Deconstructing the Map’. Cartographica: The International Journal for Geographic Information and Geovisualization 26 (2): 1–20.

Hohenthal, Johanna, Paola Minoia, and Petri Pellikka. 2017. ‘Mapping Meaning: Critical Cartographies for Participatory Water Management in Taita Hills, Kenya’. The Professional Geographer 69 (3): 383–95.

Milan, Stefania, and Miren Gutiérrez. 2015. ‘Citizens’ Media Meets Big Data: The Emergence of Data Activism’. Mediaciones 14.

Morrow, Nathan, Nancy Mock, and Adam Papendieck. 2011. ‘Independent Evaluation of the Ushahidi Haiti Project’. Port-au-Prince: ALNAP.

Poell, Thomas, Helen Kennedy, and Jose van Dijck. 2015. ‘Special Theme: Data & Agency’. Big Data & Society.

van Dijck, Jose. 2014. ‘Datafication, Dataism and Dataveillance: Big Data between Scientific Paradigm and Ideology’. Surveillance & Society 12 (2): 197–208.


About Miren
Miren is a Research Associate at DATACTIVE. She is also a professor of Communication, director of the postgraduate programme “Data analysis, research and communication”, and member of the research team of the Communication Department at the University of Deusto, Spain. Miren’s main interest is proactive data activism, or how the data infrastructure can be utilized for social change in areas such as development, climate change and the environment. She is a Research Associate at the Overseas Development Institute of London, where she leads and participates in data-based projects exploring the intersection between biodiversity loss, environmental crime and development.