Category: show in team updates

Big Data from the South at the LASA conference in Barcelona

Stefania Milan is co-organizing, together with Emiliano Treré (Cardiff University) and Anita Say Chan (University of Illinois at Urbana-Champaign), three panels at the forthcoming congress of the Latin American Studies Association (LASA), held in Barcelona on May 23-26.

The (slightly revised) lineup:

Big Data from the Global South Part I (1033 // COL – Panel – Friday, 2:15pm – 3:45pm, Sala CCIB M211 – M2)
PROJECT FRAMING + GROUP BRAINSTORMING
·      Stefania Milan, Emiliano Treré, Anita Chan: From Media to Mediations, from Datafication to Data Activism
 
Big Data from the Global South Part II: Archive Power (1101 // COL – Panel – Friday, 4:00pm – 5:30pm, Sala CCIB M211 – M2)
·      Inteligencia Artificial y Campañas Electorales en la Era PostPolítica: Seguidores, Bots, Apps: Paola Ricaurte Quijano, Eloy Caloca Lafont 
·      Open Government, APPs and Citizen Participation in Argentina, Chile, Colombia, Costa Rica and Mexico: Luisa Ochoa; Fernando J Martinez de Lemos 
·      Cryptography, Subjectivity and Spyware: From PGP Source Code and Internals to Pegasus: Zac Zimmer
·      Engineering Data Conduct: Prediction, Precarity, and Data-fied Talent in Latin America’s Start Up Ecology: Anita J Chan
 
Big Data from the Global South Part III: Data Incorporations (1167 // COL – Panel – Friday, 5:45pm – 7:15pm, Sala CCIB M211 – M2)
·      Evidence of Structural Ageism in Intelligent Systems: Mireia Fernandez 
·      Doing Things with Code: Opening Access through Hacktivism: Bernardo Caycedo
·      Decolonizing Data: Monika Halkort
·      Maputopias: Miren Gutierrez

 

About LASA 2018

Latin American studies today is experiencing a surprising dynamism. The expansion of this field defies the pessimistic projections of the 1990s about the fate of area studies in general and offers new opportunities for collaboration among scholars, practitioners, artists, and activists around the world. This can be seen in the expansion of LASA itself, which since the beginning of this century has grown from 5,000 members living primarily in the United States to nearly 12,000 members in 2016, 45 percent of whom reside outside of the United States (36 percent in Latin America and the Caribbean). And while the majority of us reside in the Americas, there are also an increasing number of Latin American studies associations and programs in Europe and Asia, most of which have their own publications and annual seminars and congresses.

Several factors explain this dynamism. Perhaps the most important is the very maturity of our field. Various generations of Latin Americanists have produced an enormous, diverse, and sophisticated body of research, with a strong commitment to interdisciplinarity and to teaching about this important part of the world. Latin American studies has produced concepts and comparative knowledge that have helped people around the world to understand processes and problematics that go well beyond this region. For example, Latin Americanists have been at the forefront of debates about the difficult relationship between democracy, development, and dependence on natural resource exports—challenges faced around the globe. Migration, immigration, and the displacement of people due to political violence, war, and economic need are also deeply rooted phenomena in our region, and pioneering work from Latin America can shed light on comparable experiences in other regions today. Needless to say, Latin American studies also has much to contribute to discussions about populism and authoritarianism in their various forms in Europe and even the United States today.

With these contributions in mind, we propose that the overarching theme of the Barcelona LASA Congress be “Latin American Studies in a Globalized World”, and that we examine both how people in other regions study and perceive Latin America and how Latin American studies contributes to the understanding of comparable processes and issues around the globe.

Becky and Stefania at the Data Justice conference

Stefania will present on “Questioning data universalism” with Emiliano Treré (Cardiff University), and she will chair the session on Data Activism (14.00 – 15.30, Parallel Sessions B).

Becky will present on “It Depends On Your Threat Model: Understanding strategies for uncertainty amidst digital surveillance and data exploitation” as part of the Civil Society and Data session (chair: Isobel Rorison).

About the Data Justice conference (website, program)

Date: 21-22 May 2018
Location: Cardiff University, Cardiff, UK
Host: Data Justice Lab, Cardiff University, Cardiff, UK

The collection and processing of massive amounts of data has become an increasingly contentious issue. Our financial transactions, communications, movements, relationships, all now generate data that are used to profile and sort groups and individuals. What are the implications for social justice? How do we understand social justice in an age of datafication? In what way do initiatives around the globe address questions of data in relation to inequality, discrimination, power and control? What is the role of policy reform, technological design and activism? How do we understand and practice ‘data justice’? How does data justice relate to other justice concerns?

This conference will examine the intricate relationship between datafication and social justice by highlighting the politics and impacts of data-driven processes and exploring different responses. Speakers include Anita Gurumurthy (IT for Change, India), David Lyon (Queen’s University, Canada), Evelyn Ruppert (Goldsmiths, University of London, UK), Rob Kitchin (Maynooth University, Ireland), Sasha Costanza-Chock (MIT Center for Civic Media, US), Seeta Peña Gangadharan (London School of Economics, UK), Solon Barocas (Cornell University, US and FAT/ML).

Stefania at RightsCon 2018

Together with the Data Justice Lab (Cardiff University), Stefania is organising the following session:

The Fight Against the Institutional Datafication of Social Life: Challenges and Tactics

Friday 18 May, 4pm

Chair: Stefania Milan, Datactive Ideas Lab, University of Amsterdam

Speakers: Arne Hintz, Data Justice Lab, Cardiff University
Malavika Jayaram, Digital Asia Hub
Nandini Chami, IT for Change
Javiera Moreno, Datos Protegidos
Anita Say Chan, University of Illinois
Mitchell Baker, Mozilla (tbc)

This session will discuss challenges and opportunities for civil society advocacy in response to data analytics by governments. While the use of data analytics, scoring, and identification systems by state institutions is advancing rapidly, often underpinned by tightening surveillance legislation, civil society efforts to address the datafication of citizens and its consequences have faced difficulties. We will map these challenges with a view to exploring potential tactics and forms of influence. The session will allow participants to share knowledge about innovative strategies of intervention; engage in a dialogue on how citizens can have a say in the adoption and implementation of big data analytics; and advance a transnational mobilization connecting struggles over the consequences of datafication with the social justice and human rights agenda.

About RightsCon

As the world’s leading conference on human rights in the digital age, we bring together business leaders, policy makers, general counsels, government representatives, technologists, and human rights defenders from around the world to tackle pressing issues at the intersection of human rights and digital technology. This is where our community comes together to break down silos, forge partnerships, and drive large-scale, real-world change toward a more free, open, and connected world.

Jeroen presents at Bevrijdingsfestival Utrecht

As part of De Denkplaats [space to think], hosted by the Utrecht public library, Jeroen gave a talk entitled “On data, governance, and the role of the citizen”. The talk aimed to engage Bevrijdingsfestival visitors with questions about the governance of online public space, the distribution of responsibility and accountability, and the surveillance rationalities underpinning these practices.

 

About the Denkplaats (translated from Dutch):

De Denkplaats is at the Bevrijdingsfestival for a one-off edition. We continue the conversation about social media.

Google and Facebook earn heaps of money from their users. Those users’ personal data are the raw material of these multi-million-euro companies. More than half of all people use their services. Users get to use those services for free, preferably as much as possible.

What do all these people actually know about the intentions and methods of these giants? Is it a fair exchange: we get the convenience of fast, free communication, they get our data? Or have users unwittingly become merchandise?

Do digital services bring great freedom of communication, or rather unfreedom, because the companies behind them (can) run off with our data?

Come see us at the Vrijheidspodium [Freedom Stage]!

 

Niels presented his work in London, Gothenburg and Berlin

This spring, Niels ten Oever hit the road and gave several talks on Internet architecture and infrastructure at:
– the Alan Turing Institute, at a workshop on Protocol Governance: Internet Standard Bodies and Public Interest Questions;
– the University of Gothenburg’s School of Global Studies, Department of Political Science and Department of Social Work, at a conference on Internet Governance and Human Rights;
– the Humboldt Institut für Internet und Gesellschaft, at the workshop ‘“We are on a mission”: Exploring the role of future imaginaries’.

These visits helped Niels gather feedback on the preliminary findings of his research on Internet imaginaries, architecture consolidation, and quantitative mailing-list analysis. In addition, he got to engage with other Internet governance researchers on Internet imaginaries and governance innovations.

The questions, comments, and talks by other researchers yielded many insights, which Niels is now reworking into two forthcoming articles.

Annual DATACTIVE PhD Colloquium, May 4th

Date: Tomorrow, May 4th, 13:30

Location: Oudemanhuispoort 4-6, Amsterdam, room OMHP-E0.12

Tomorrow we will have our yearly PhD colloquium, a moment to showcase our work and receive feedback. You’re invited to join us.

This year’s guests, acting as respondents, are Marlies Glasius (Amsterdam School for Social Science Research) and Annalisa Pellizza (University of Twente). Our new postdoctoral fellow Fabien Cante will also be in attendance.

The program is as follows:

(13:30 – 14:15) Niels ten Oever: “The evolving notion of the public interest in the Internet architecture”

(14:20 – 15:05) Kersti Wissenbach: “Accounting for Power in a Datafied World: A Social Movement Approach to Civic Tech Activism”

(15:10 – 15:25) Coffee Break

(15:25 – 16:10) Becky Kazansky: “Infrastructures of Anticipation: civil society strategies in an age of ubiquitous surveillance”

(16:15 – 17:00) Guillen Torres: “Empowering information activists through institutional resistance”.

 

Welcome to two new team members: Hoang & Fabien

DATACTIVE is happy to welcome two new team members!

Fabien Cante will join us as a postdoc, mostly to help with empirical research. Fabien is interested in media as contested infrastructures of city life. His PhD work (London School of Economics, 2018) was grounded in Abidjan, Côte d’Ivoire; he hopes to continue asking what datafication means in an African metropolis. In addition to academic work, Fabien is comms officer for the Migrants’ Rights Network and active in neighbourhood struggles in South London.


 

We are also happy to have Hoang join us to help with our empirical research practices as part of her rMA studies.

Tu Quynh Hoang has a BA in Professional Communication from RMIT University. Concerned about human rights issues in Asia, she moved from working in media companies to doing research on Internet controls and citizens’ media. She is currently studying towards a Research MA in Media Studies at the University of Amsterdam.


Welcome both, we are very much looking forward to working with you!

NOW OUT! Special issue on ‘data activism’ of Krisis: Journal for Contemporary Philosophy

DATACTIVE is proud to announce the publication of the special issue on ‘data activism’ of Krisis: Journal for Contemporary Philosophy. Edited by Stefania Milan and Lonneke van der Velden, the special issue features six articles by Jonathan Gray, Helen Kennedy, Lina Dencik, Stefan Baack, Miren Gutierrez, and Leah Horgan and Paul Dourish; an interview with Boris Groys; and three book reviews. The journal is open access; you can read and download the articles from http://krisis.eu.

Issue 1, 2018: Data Activism
Digital data increasingly plays a central role in contemporary politics and public life. Citizen voices are increasingly mediated by proprietary social media platforms and are shaped by algorithmic ranking and re-ordering, but data informs how states act, too. This special issue wants to shift the focus of the conversation. Non-governmental organizations, hackers, and activists of all kinds provide a myriad of ‘alternative’ interventions, interpretations, and imaginaries of what data stands for and what can be done with it.

Jonathan Gray starts off this special issue by suggesting how data can be involved in providing horizons of intelligibility and organising social and political life. Helen Kennedy’s contribution advocates for a focus on emotions and everyday lived experiences with data. Lina Dencik puts forward the notion of ‘surveillance realism’ to explore the pervasiveness of contemporary surveillance and the emergence of alternative imaginaries. Stefan Baack investigates how data are used to facilitate civic engagement. Miren Gutiérrez explores how activists can make use of data infrastructures such as databases, servers, and algorithms. Finally, Leah Horgan and Paul Dourish critically engage with the notion of data activism by looking at everyday data work in a local administration. Further, this issue features an interview with Boris Groys by Thijs Lijster, whose work Über das Neue celebrated its 25th anniversary last year. Lastly, three book reviews illuminate key aspects of datafication: Patricia de Vries reviews Metahaven’s Black Transparency; Niels van Doorn writes on Platform Capitalism by Nick Srnicek; and Jan Overwijk comments on The Entrepreneurial Self by Ulrich Bröckling.

Stefania discusses data, citizenship and democracy in Lisbon, Bologna & Fribourg

On April 12, Stefania will give a talk on the politics of code and data at the ISCTE – Instituto Universitário de Lisboa, in Lisbon, Portugal.

On April 23, she will be in Bologna, Italy, at the School of Advanced International Studies of Johns Hopkins University. She will present her thoughts on ‘Citizenship Re-invented: The Evolution of Politics in the Datafied Society’.

Finally, on April 30 Stefania will lecture at the University of Fribourg, in Switzerland, upon invitation of Prof. Regula Haenggli. The lecture is entitled ‘Digitalization as a challenge to democracy: Possibilities of self-organization, emancipation, and autonomy’.

Para exercer plenamente a cidadania, é preciso conhecer os filtros virtuais [To fully exercise citizenship, one must know the virtual filters] (Época Negócios)

Stefania was commissioned to write an article for the Brazilian business magazine Época Negócios. In short, she argues that “being aware of the elements that profoundly shape our information universes is a fundamental step towards no longer being prisoners of the internet”. Continue reading the article in Portuguese online. Here you can read the original in English.

Why personalization algorithms are ultimately bad for you (and what to do about it)

Stefania Milan

I like bicycles. I often search online for bike accessories, clothing, and bike races. As a result, the webpages I visit, as well as my Facebook wall, often feature ads related to biking. The same goes for my political preferences, my last search for the cheapest flight, or the next holiday destination. This information is (usually) relevant to me. Sometimes I click on the banner; mostly, I ignore it. In most cases, I hardly notice it, but I process and “absorb” it as part of “my” online reality. This unsolicited yet relevant content contributes to making me feel “at home” in my wanderings around the web. I feel amongst my peers.

Behind the efforts to carefully target web content to our preferences are personalization algorithms. Personalization algorithms are at the core of social media platforms, dating apps, and generally of most of the websites we visit, including news sites. They make us see the world as we want to see it. By forging a specific reality for each individual, they silently and subtly shape customized “information diets”.

Our life, both online and offline, is increasingly dependent on algorithms. They shape our way of life, helping us find a ride on Uber or hip, fast food delivery on Foodora. They might help us find a job (or lose one), and locate a partner for the night or for life on Tinder. They mediate our news consumption and the delivery of state services. But what are they, and how do they do their magic? An algorithm can be seen as a recipe for baking an apple tart: just as grandma’s recipe tells us, step by step, what to do to make it right, in computing an algorithm tells the machine what to do with data, namely how to calculate or process it, and how to make sense of it and act upon it. As forms of automated reasoning, algorithms are usually written by humans; however, they operate in the realm of artificial intelligence: with the ability to train themselves over time, they might eventually take on a life of their own, so to speak.

The central role played by algorithms in our life should be of concern, especially if we conceive of the digital as complementary to our offline self. Today, our social dimension is simultaneously embedded in and (re)produced by technical settings. But algorithms, proprietary and opaque, are invisible to end users: their outcome is visible (e.g., the manipulated content that shows up on one’s customized interface), but it bears no indication of having been manipulated, because algorithms leave no trace and “exist” only when operational. Nevertheless, they do create rules for social interaction, and these rules indirectly shape the way we see, understand, and interact with the world around us. Far from being neutral, they are deeply political in nature, designed by humans with certain priorities and agendas.

While there are many types of algorithms, what affects us most today are probably personalization algorithms. They mediate our web experience, easing our choices by giving us information which is in tune with our clicking habits—and thus, supposedly, preferences.

They make sure the information we are fed is relevant to us, selecting it on the basis of our prior search history, social graph, gender, location, and, generally speaking, all the information we directly or unwittingly make available online. But because they are invisible to the eyes of users, most of us are largely unaware this personalization is even happening. We believe we see “the real world”, yet it is just one of many possible realities. This contributes to enveloping us in what US internet activist and entrepreneur Eli Pariser called the “filter bubble”, that is to say, the intellectual isolation caused by algorithms constantly guessing what we might like or not, based on the ‘image’ they have of us. In other words, personalization algorithms might eventually reduce our ability to make informed choices, as the options we are presented with and exposed to are limited and repetitive.
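The selection logic described above can be sketched in a few lines of code. This is a deliberately simplified illustration, not any platform’s actual algorithm; all names and data below are hypothetical. Candidate items are scored by how often the user has previously clicked on their topics, so the feed drifts toward what the user already engages with:

```python
from collections import Counter

def personalize(items, click_history, top_n=3):
    """Rank candidate items by overlap with the user's past clicks.

    items: list of (title, set_of_topics)
    click_history: list of topic strings the user previously clicked
    """
    interests = Counter(click_history)  # how often each topic was clicked

    def score(item):
        _, topics = item
        return sum(interests[t] for t in topics)  # weight by past clicks

    ranked = sorted(items, key=score, reverse=True)
    return [title for title, _ in ranked[:top_n]]

feed = personalize(
    items=[("Bike race recap", {"cycling", "sport"}),
           ("Election analysis", {"politics"}),
           ("Cheap flights to Lisbon", {"travel"}),
           ("New bike accessories", {"cycling"})],
    click_history=["cycling", "cycling", "travel"],
)
# Cycling items rise to the top; rarely-clicked topics sink out of view.
```

The point of the sketch is the feedback it implies: whatever the user clicked before determines what they see next, which is exactly the mechanism behind the “filter bubble” described above.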

Why should we care, if all of this eventually is convenient and makes our busy life easier and more pleasant?

First of all, this is ultimately surveillance, be it corporate or institutional. Data is constantly collected about us and our preferences, and it ends up “standing in” for the individual, who is made to disappear in favour of a representation that can be effortlessly classified and manipulated. “When you stare into the Internet, the Internet stares back into you”, once tweeted digital rights advocate @Cattekwaad. The web “stares back” by tracking our behaviours and preferences, and profiling each of us into categories ready for classification and targeted marketing. We might think of the Panopticon, a circular building designed in the late 18th century by the philosopher Jeremy Bentham as “a new mode of obtaining power of mind over mind” and intended to serve as a prison. In this special penal institution, a single guard would effortlessly be able to observe all inmates without them being aware of the condition of permanent surveillance they are subjected to.

But there is a fundamental difference between the idea of the Panopticon and today’s surveillance ecosystem. The jailbirds of the internet age are not only aware of the constant scrutiny they are exposed to; they actively and enthusiastically participate in the generation of data, prompted by the participation imperative of social media platforms. In this respect, as the UK sociologist Roy Boyne explained, the data collection machines of personalization algorithms can be seen as post-Panopticon structures, whereby a model rooted in coercion has been replaced by mechanisms of seduction in the age of big data. The first victim of personalization algorithms is our privacy: we seem keen to sacrifice freedom (including the freedom to be exposed to various opinions and freedom from the attention of others) on the altar of today’s aggressive personalized marketing, in exchange for convenience and functionality.

The second victim of personalization algorithms is diversity, of both opinions and preferences, and the third and ultimate casualty is democracy. While this might sound like an exaggerated claim, personalization algorithms dramatically, and above all silently, reduce our exposure to different ideas and attitudes, helping us reinforce our own and allowing us to disregard all others as “non-existent”. In other words, the “filter bubble” created by personalization algorithms isolates us in our own comfort zone, preventing us from accessing and evaluating the viewpoints of others.

The hypothesis of the existence of a filter bubble has been extensively tested. On the occasion of the recent elections in Argentina, last October, Italian hacker Claudio Agosti, in collaboration with the World Wide Web Foundation, conducted research using facebook.tracking.exposed, a software tool intended to “increase transparency behind personalization algorithms, so that people can have more effective control of their online Facebook experience and more awareness of the information to which they are exposed.”

The team ran a controlled experiment with nine profiles created ad hoc, a sort of “lab experiment” in which profiles were artificially polarized (keeping some variables constant, each profile “liked” different items). Not only did the data confirm the existence of a filter bubble; it showed a dangerous reinforcement effect which Agosti termed “algorithm extremism”.
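The reinforcement dynamic behind such findings can be illustrated with a toy feedback loop. This is a didactic sketch under simplified assumptions, not a reconstruction of Agosti’s experiment: each round, the feed surfaces the topic the profile has engaged with most, and every exposure further increases that topic’s weight.

```python
def simulate_feedback_loop(engagement, rounds=5):
    """Toy model of algorithmic reinforcement: the feed always surfaces
    the currently dominant topic, and surfacing it boosts it further."""
    counts = dict(engagement)  # topic -> engagement count
    for _ in range(rounds):
        top = max(counts, key=counts.get)  # the feed favours the leader
        counts[top] += 1                   # exposure reinforces the lead
    return counts

# A profile that starts only slightly polarized...
result = simulate_feedback_loop({"camp_A": 3, "camp_B": 2})
# ...ends up far more skewed: the initial one-point gap widens every round.
```

Even in this stripped-down model, a small initial difference in engagement is amplified round after round, which is the essence of the reinforcement effect the experiment observed.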

What can we do about all this? This question has two answers. The first is easy but uncomfortable. The second is a strategy for the long run and calls for an active role.

Let’s start with the easy one. We ultimately retain a certain degree of human (and democratic) agency: at any given moment, we can choose to opt out. To be sure, erasing our Facebook account doesn’t do the trick of protecting our long-eroded privacy: the company retains the right to keep our data, as per the Terms of Service, the long, convoluted legal contract we all agree to but rarely read. With the “exit” strategy we lose contacts, friendships, and joyful exchange, and we are no longer able to sneak into the lives of others; but we gain in privacy and, perhaps, reclaim our ability to think autonomously. I bet not many of you will do this after reading this article; I haven’t myself found the courage to disengage entirely from my leisurely existence on social media platforms.

But there is good news. As the social becomes increasingly entrenched in its algorithmic fabric, there is a second option, a sort of survival strategy for the long run. We can learn to live with and deal with algorithms. We can familiarize ourselves with their presence, engaging in a self-reflexive exercise that questions what they show us in any given interface, and why. While, understandably, not all of us may be inclined to learn the ropes of programming, “knowing” the algorithms that so much affect us is a fundamental step towards fully exercising our citizenship in the age of big data. “Knowing” here means primarily becoming acquainted with their presence and function, and questioning the fact that being turned into a pile of data has become an almost accepted fact of life. Because being able to think for oneself today also means questioning the algorithms that so much shape our information worlds.