
Announcing the Magma project

By Vasilis Ververis, DATACTIVE

Magma aims to build a scalable, reproducible, standard methodology for measuring, documenting and circumventing internet censorship, information controls, internet blackouts and surveillance, in a way that can be streamlined and used in practice by researchers, front-line activists, field-workers, human rights defenders, organizations and journalists.

In recent years, research fellows, journalists, human rights activists, lawyers, and the larger research community have been working in high-risk contexts, which creates the need to treat their qualitative and quantitative research data as highly sensitive. Despite their competence and high qualifications in their respective areas (social and political science, usability, law, political economy analysis), they can rarely claim specific expertise or extensive experience when it comes to network services and systems, telecommunication infrastructure, applied analysis of network measurements, internet censorship, surveillance and information controls.

Ideally, researchers working with network measurement tools and frameworks such as the Open Observatory of Network Interference (OONI) should have qualified technical help and assistance, enabling them to develop appropriate testing methodologies suited to their research environment and needs.

Magma aims to build a research framework for people working on information controls and network measurements, facilitating their working process in numerous ways. As such, this framework will enable them to properly structure an activity plan, make informed choices regarding the required tools (including ethical and security aspects) and analyze the data produced by such tools.

Through Magma, we wish to provide our expertise and experience in network measurements, internet censorship research, assessment of ISP networks, surveillance probing and data analysis in order to:

  • Assess the risks by providing, implementing and maintaining technologies needed by researchers on the front lines and in areas where operational security, anti-surveillance measures and censorship circumvention are of paramount importance.
  • Provide tailored technical assistance, while developing appropriate testing methodologies for network measurements and evaluating and analyzing the resulting data and reports in line with the respective research questions.
  • On a long-term basis, build a scalable and reproducible methodology, backed by thorough documentation, for collecting, evaluating and analyzing data and reports in support of front-line researchers, activists, field-workers, human rights defenders, organizations and journalists.

Below, we list some examples of potential future research around internet censorship, information controls and surveillance, based mainly on conducting network measurements and analyzing their results:

Egypt: Media censorship, Tor interference, HTTPS throttling and ad injections?

A study of Tor network and media website blocking, network bandwidth throttling, and malicious network packet injections carrying malware and advertising content.

OONI Data Reveals How WhatsApp Was Blocked (Again) in Brazil

A study determining how WhatsApp was blocked across Brazil following a judge’s court order.

Understanding Internet Censorship Policy: The Case of Greece

An extensive, large-scale study analyzing the policies and techniques used to block content deemed illegal by the state, identifying transparency problems, collateral damage, and the implications of over- and under-blocking.

Identifying cases of DNS misconfiguration: Not quite censorship

A study of a non-malicious technical issue that rendered a regional news media outlet inaccessible across several networks and countries.
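The kind of analysis behind studies like these typically starts from aggregating measurement records collected from many vantage points. As a minimal sketch, assuming a simplified, hypothetical record format loosely inspired by (but not identical to) OONI report metadata, the snippet below tallies measurements flagged as anomalous per probe country:

```python
from collections import Counter

def tally_anomalies(measurements):
    """Count measurements flagged as anomalous, grouped by country code.

    `measurements` is a list of dicts with simplified, hypothetical fields:
      - 'probe_cc': two-letter country code of the measuring probe
      - 'input':    the URL or hostname that was tested
      - 'anomaly':  True if the test observed signs of interference
    """
    counts = Counter()
    for m in measurements:
        if m.get("anomaly"):
            counts[m["probe_cc"]] += 1
    return dict(counts)

# Three toy measurements from two vantage points.
sample = [
    {"probe_cc": "GR", "input": "https://example.org/", "anomaly": True},
    {"probe_cc": "GR", "input": "https://example.net/", "anomaly": False},
    {"probe_cc": "BR", "input": "https://example.org/", "anomaly": True},
]

print(tally_anomalies(sample))  # {'GR': 1, 'BR': 1}
```

In practice, real reports carry far richer fields (test name, resolver answers, HTTP response bodies), and an anomaly flag alone is not proof of censorship; distinguishing blocking from misconfiguration is precisely the kind of follow-up analysis the examples above describe.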

In this respect, we would like to hear from all of you who are interested in researching information controls and internet censorship, and who want to better understand how to work with network measurements and analyze data from various sources, including OONI reports.

We wanted to keep this post as concrete and terse as possible to encourage both technical and non-technical entities and individuals to get in touch with us, even those currently engaged in an ongoing project. The results of this collaboration will help form a comprehensive handbook shaped by the needs of the communities that work, or conduct research, in this field.

Please use any of these communications channels to get in touch with us.

 

Vasilis Ververis is a research associate with DATACTIVE and a practitioner of the principle of undoing and rebuilding the current centralization model of the internet. Their research deals with internet censorship and the investigation of collateral damage via information controls and surveillance. Some recent affiliations: Humboldt-Universität zu Berlin, Germany; Universidade Estadual do Piaui, Brazil; University Institute of Lisbon, Portugal.

 

This post is co-published with the Magma Project.

Lonneke on Open Sourcing Open Source Intelligence

In late September, I gave a talk considering the connections between Open Source Intelligence (OSINT) and data activism at the ‘DIGITAL CULTURES: Knowledge / Culture / Technology’ conference at Leuphana University Lüneburg. The presentation asked how OSINT might be understood through the prism of ‘data activist epistemologies’ (Milan and Van der Velden 2016).

The starting point for this interrogation is that Open Source Intelligence, despite its name, appears to have little in common with ‘open source’ cultures as we know them, for example through open source technologies. Open Source Intelligence simply means intelligence, for states or businesses, that is gathered from ‘open’ or publicly available sources. The initial question in the paper is, thus, one of terminology: What is really ‘open source’ about OSINT? And how might a critical interrogation of ‘open source’ change the way we think about OSINT? Hence the title of the talk: ‘Open Sourcing Open Source Intelligence’.

As a type of data activism, open source can be described as having its associated ‘epistemic culture’. This is a concept which refers to the diversity in modes of knowledge-making. ‘Epistemic culture’ originally comes from studies into scientific practices, and it directs attention to the ‘specific strategies that generate, validate, and communicate scientific accomplishments’ (Knorr-Cetina and Reichmann 2015, 873). It guides one’s focus toward the complex ‘relationships between experts, organisational formats, and epistemic objects’ (ibid. 873-4).

What we encounter in open source cultures is that knowledge is not legitimated institutionally, but technologically: the (open source) software functions as a token of trust. The knowledge is legitimated because the software and the verification model can be reviewed, methods and many of the findings are shared publicly, public learning is crucial and, ideally, expertise thus becomes distributed.

Open Source Intelligence (OSINT), by contrast, is a practice that seems to belong to – and to be legitimated by – formal and relatively closed institutions such as intelligence agencies. Yet the label can usefully be reclaimed to describe activist projects – such as the Syrian Archive – which seek to put open source tools and principles in the service of a different kind of knowledge-making, one that is genuinely public-oriented and collective. The question thus becomes: What can we learn from the interface between OSINT and open source? What kind of knowledge is being made, how? And how might activist forms of OSINT inform our understanding of data activism broadly speaking?

Stay tuned for the forthcoming paper, which is being co-authored with Jeff Deutch from the Syrian Archive. It will no doubt be enriched by the good discussion we had with the conference audience.

The abstract for the talk is available through the full conference programme (pp. 215-6).

 

Lonneke van der Velden is postdoctoral researcher with DATACTIVE and a lecturer at the department of media studies at the University of Amsterdam. Her research deals with internet surveillance and activism. She is part of the editorial board of Krisis, Journal for Contemporary Philosophy, and is on the Board of Directors of Bits of Freedom.   

 

References:

Knorr Cetina, Karin, and Werner Reichmann (2015) Epistemic cultures, in International Encyclopedia of the Social & Behavioral Sciences, ed. James D. Wright. Amsterdam: Elsevier, pp. 873-880.

Milan, Stefania, and Lonneke Van der Velden (2016) The Alternative Epistemologies of Data Activism. Digital Culture & Society 2(2) pp. 57-74.

26 October: Noortje Marres and DATACTIVE in conversation on the social science scene today

On 26 October, DATACTIVE hosts the philosopher and science studies scholar Noortje Marres to discuss and problematize the role of social science today. The DATACTIVE team will engage with Marres to discuss chapters of her book Digital Sociology: The Reinvention of Social Research. The exchange is expected to delve into the social sciences from various perspectives derived from team members’ research fields, and will be anchored in the contemporary challenges to digital societies and beyond.

Marres is Associate Professor in the Centre for Interdisciplinary Methodologies at the University of Warwick and sits on the advisory board of DATACTIVE. Currently, she is a Visiting Professor in the Centre for Science and Technology Studies at the University of Leiden. Her work is located in the interdisciplinary field of Science, Technology and Society (STS).

Organization After Social Media: orgnets and alternative socio-technical infrastructures

by Lonneke van der Velden

Last month, I was invited to be a respondent (together with Harriet Bergman) for the launch of Geert Lovink and Ned Rossiter’s latest book, Organization After Social Media. The book is a collection of essays which re-interrogate the concerns and contributions of social movements and counter-cultural collectives in light of a significant contemporary problem: the existence of tech-monopolies such as Google and Facebook.

If social media cannot deliver on their promise to help collectives organize, how then should movements proceed? How can such movements be made sustainable? The authors invite us to reflect on these issues through the central concept of ‘organized networks’, or ‘orgnets’.

I liked many aspects of the book, but will highlight here two things I found interesting from the perspective of DATACTIVE’s own concerns. The first has to do with a re-evaluation of encryption, and the second with where we search for historical and theoretical lessons to help us organize ‘after social media’.

Re-evaluating encryption?

One thing I read in the book is a re-evaluation of encryption. Encryption is presented, not as an individual intervention, but as an intervention with a potential to allow for the emergence of collective forms. “The trick,” the authors tell us, “is to achieve a form of collective invisibility without having to reconstitute authority” (p. 5).

I think this collective potential of encryption is interesting. Research into activism in the UK after the Snowden revelations (Dencik, Hintz & Cable 2016) showed that digital rights groups tended to operate in a rather specialized space, demarcated from issues championed by other groups and organizations. Digital rights organizations speak about privacy and freedom of speech, but hardly touch upon other social issues. And vice versa: organizations that work on, for instance, social justice issues tend to regard digital rights as a working package for those specific NGOs that are dedicated to privacy. Encryption does not feature as a strategy that is part of their activist work. This has only partly to do with a lack of knowledge. What’s more, activists told the researchers that they want to do things in public, and using encryption is associated with having something to hide. This is a reductive summary of some of the findings by Dencik and others, but the study provides food for thought about how encryption is often perceived.

What Lovink and Rossiter’s book nicely does is show that this is not the only possible way to conceive of encryption, opening up a different interpretation. Not one that stages privacy or security, which is a discourse about protection, but one that foregrounds organized unpredictability, which is a more productive discourse about what encryption has to offer in terms of collective organization. This idea might be more interesting for activist groups; that is, even if they are not interested in hiding, they might well want to remain unsuspected and surprising.

Against the background of the analysis that social media and algorithms make people readable and predictable, infrastructures that help organize unpredictability become important. In fact, from the discussion that followed with the authors during the book launch, it turned out that many of the concerns in the book relate to organizing unpredictability: merging the inventive (as exemplified by tactical media) with a wish for sustainability. How to build digital infrastructures that allow for the disruptive qualities that tactical media had in the past?

Some questions remain. Technologies of encryption are not infrastructures that can emerge out of the blue: they in turn need organized networks and communal work to remain up to date. Together with the audience at the book launch, we had an interesting back and forth about whether a notion of ‘community’, and community struggles, was needed.

Realizing organized networks

Another thing we talked about that evening was the tension between organized networks as a concept and as actually-existing practices. As the authors write: “Organized networks are out there. They exist. But they should still be read as a proposal. This is why we emphasize the design element. Please come on board to collectively define what orgnets could be all about.” (p. 16)

Hence, the authors invite anyone who has been part of an organized network, or thinks they have been part of one, or wishes their network had been more organized, to ‘fill in’ the core concept. That means that much in the book is left open to the inventive powers of orgnet-organizers.

Technological infrastructures are an exception: the book is quite prescriptive in this regard, arguing for example that servers should not be centralized, and that we should prevent the emergence of tech-giants and develop alternative protocols and new spaces for action.

I could not help but wonder about the other kinds of prescription that are not so present in the book. Might we also offer prescriptive accounts with respect to things social movements experience over and over again, such as burnout, internal strife, sexual harassment, and all the things that hinder the sustainability of networks? And shouldn’t we reach out for documentation from, say, social movement studies or feminist histories, in addition to media theory? I am thinking of these questions in echo of Kersti’s and others’ discussion of community and critical community studies.

All in all, given that the focus of Lovink and Rossiter’s book is on forms of organization ‘after social media’, the choice of focusing on (alternative) socio-technical infrastructures is as understandable as it is valuable in itself. Indeed, it is an issue our research group cares about a lot; we hope to contribute to some of the causes laid out in the book.

The book can be ordered here and is also freely available online in pdf.

 

Lonneke van der Velden is a lecturer at the department of media studies at the University of Amsterdam. Her research deals with conceptualizations of internet surveillance and internet activism. She is also on the Board of Directors of Bits of Freedom.   

 

Dencik, Lina, Arne Hintz, and Jonathan Cable. 2016. “Towards Data Justice? The Ambiguity of Anti-Surveillance Resistance in Political Activism.” Big Data & Society 3 (2): 1–12. https://doi.org/10.1177/2053951716679678.

Rossiter, Ned, and Geert Lovink. 2018. Organization After Social Media. Minor Compositions.

Why we won’t be at APC 2018

In October 2018, the Amsterdam Privacy Conference (APC) will be back at the University of Amsterdam. Two DATACTIVE project team members, Stefania (Principal Investigator) and Becky (PhD candidate), enthusiastically supported the conference as coordinators of the ‘Digital Society and Surveillance’ theme. The Data Justice Lab at Cardiff University submitted a panel proposal, which was successfully included. Regretfully, none of us will take part in the conference: DATACTIVE and the Data Justice Lab have decided to withdraw over the participation of the US-based software company Palantir as one of the APC’s Platinum Sponsors.

Our decision to withdraw stems from an active refusal to legitimize companies accused of enabling human rights abuses, and a concern with the lack of transparency surrounding sponsorship.

Palantir is a company specializing in big data analytics, which develops technologies for the military, law enforcement and border control. The deployment of Palantir’s technologies has raised widespread concern among civil liberties and human rights advocates. Reporting shows that, in the United States, Palantir has played an important role in enabling the efforts of ICE (Immigration and Customs Enforcement) to identify, detain, and deport undocumented immigrants, refugees, and asylum seekers. This has resulted in the indefinite detention of thousands of children who have been separated from their parents. This indefensible policy has come under strong criticism from the United Nations and prompted an alliance of technology workers and affected communities to call – so far, unsuccessfully – for Palantir to cancel its contracts with ICE.

We feel that providing Palantir with a platform, as a sponsor of a prominent academic conference on privacy, significantly undermines efforts to resist the deployment of military-grade surveillance against migrants and marginalized communities already affected by abusive policing. 

Because we have organized conferences ourselves, we believe transparency in sponsorship agreements is key. While we praise the APC organizing committee for committing to full transparency, we were not informed of the sponsorship agreements until the very last minute. The APC Sponsors page, in addition, was only populated after the participant registration deadline. As conference coordinators and prospective participants, we feel that we were not given the chance to make an informed choice about our contribution.

Sponsorship concerns are not a new issue: the very same controversy, around the involvement of this very same company (as well as others), emerged during the 2015 edition of APC. Though we acknowledge the complexity of corporate sponsorship, we note that other prominent tech policy conferences, such as the Computers, Privacy and Data Protection (CPDP) conference, have recently stopped accepting sponsorship from Palantir. We thus believe this is a good moment for a larger discussion about how conferences should be organized in the future.

Academia – and especially publicly funded universities – needs to consider its role in efforts to neutralize or undermine human rights concerns. Such considerations are particularly pertinent in the context of what has been described as the increased neoliberalization of higher education, in which there is significant pressure to attract and pursue funding from different sources. As academics and as citizens, we will increasingly be asked to make choices of this kind. Hence, we believe it is time to set down a clear set of principles for sponsorship going forward.

 

Amsterdam and Cardiff, 19 September 2018

Stefania Milan and Becky Kazansky (DATACTIVE) & Lina Dencik, Arne Hintz, Joanna Redden, Fieke Jansen (Data Justice Lab)

Data Colonialism – the first article of the Special Issue on “Big Data from the South” is out

Photo by London School of Economics Library and Political Science (via Wikimedia Commons, no restrictions)

Nick Couldry and Ulises A. Mejias re-frame the Big Data from the South debate within the context of modern-day colonialism: data colonialism, an alarming stage in which human life is “appropriated through data” and, eventually, “capitalized without limit”.

This essay marks the beginning of a series of articles in a special issue on Big Data from the South, edited by Stefania Milan and Emiliano Treré and published in the journal Television & New Media. The article will be freely accessible for the first month, so we encourage you to put it high up on your to-read list.

The special issue promises interesting takes and approaches from renowned scholars and experts in the field, such as Angela Daly and Monique Mann, Payal Arora, Stefania Milan and Emiliano Treré, Jean-Marie Chenou and Carolina Cepeda, Paola Ricaurte Quijano, Jacobo Najera and Jesús Robles Maloof, with a special commentary by Anita Say Chan. Stay tuned for our announcements of these articles as they come out.

DATACTIVE welcomes a new member to the team

DATACTIVE welcomes its newest addition, Lara AlMalakeh, who joins the team as managing editor of the project’s three blogs.

Lara has just obtained an MA in Comparative Cultural Analysis from the University of Amsterdam. Before that, she obtained a postgraduate degree in the Principles and Practice of Translation from City University London. Lara wears many hats: she initially studied Fine Arts at Damascus University, graduating with a specialization in oil painting, and then built a 12-year career in administration in Dubai, UAE. During that time, she joined Global Voices as the Arabic language editor and started translating for NGOs specializing in advocacy and digital rights, namely the Electronic Frontier Foundation and First Draft News.

DATACTIVE is glad of the diversity that this new addition brings to its ensemble of academics and activists. The team looks forward to leveraging the various skills and attributes Lara brings along, whether from her professional background or her various involvements in the activism sphere.

Lara is a proud mother of two girls under 10. She enjoys discussing politics and debating art over a beer. Her newfound love is philosophy, and she dreads bikes.

Kersti presenting during RNW Media Global Weeks, July 6

With her talk ‘NGO Ethics in the Digital Age: How to Work with Data Responsibly’, Kersti will address a transnational team of media and advocacy practitioners during RNW Media’s annual summit.

RNW Media works with youth in fragile and repressive states, aiming to empower young women and men to unleash their own potential for social change. As the organization has transitioned from a traditional international broadcaster towards increasing engagement in advocacy activities using digital means and data, the moment has come to move towards a responsible data approach.

Kersti is advising RNW Media on the process of developing a responsible data framework and the respective program strategies.