
[blog] Growth for Critical Studies? Social scholars, let’s be shrewder

Author: Miren Gutierrez
This is a response to the call for critical community studies, ‘Tech, data and social change: A plea for cross-disciplinary engagement, historical memory, and … Critical Community Studies‘ by Kersti Wissenbach, and to the first contribution to the debate, ‘Can We Plan Slow – But Steady – Growth for Critical Studies?’ by Charlotte Ryan.
Commenting on the thought-provoking blogs by Charlotte Ryan and Kersti Wissenbach, I feel in good company. Both speak of the need for research to address the inequalities embedded in technology, of the critical role that communities play in remedying dominant techno-centric discourses and practices, and of the idea of new critical community studies. That is, they call for placing people and communities at the centre of our activity as researchers and practitioners, asking questions about the communities instead of about the technologies, and demanding a stronger collaboration between the two, while acknowledging the challenges that this approach generates.
Their blogs incite different but related ideas.

First, different power imbalances can be found in scholarship. Wissenbach suggests that dominant discourses in academia, as well as in practice and donor agendas, are driving the technology hype. But as Ryan proposes, academia is not a homogeneous terrain.

Always speaking from the point of view of critical data studies, the current predominant techno-centrism seems to be diverting research funding towards applied sciences, engineering and tools (what Wissenbach calls “the state of technology” and Ryan refers to as a “profit-making industry”). Writing about Canada, Srigley describes how, for a while, even the Social Sciences and Humanities Research Council of Canada “fell into line by focusing its funding on business-related degrees. All the while monies for teaching and research in the humanities, social sciences, and sciences with no obvious connection to industry, which is to say, most of it, began to dry up” (Srigley 2018). Srigley seems to summarise what is happening everywhere: “sponsored research” and institutions requiring that research be linked to business and industry partners appear to be the current mantra.

Other social scholars around me are coming up with similar stories: social sciences focusing critically on data are getting a fraction of the research funding opportunities vis-à-vis computational data-enabled science and engineering within the fields of business and industry, environment and climate, materials design, robotics, mechanical and aerospace engineering, and biology and biomedicine. Meanwhile, critical studies on data justice, governance and how ordinary people, communities and non-governmental organisations experience and use data are left for another day.
Thus, the current infatuation with technology appears not to be evenly distributed across donors and academia.

Second, I could not agree more with Wissenbach and Ryan when they say that we should take communities as entry points in the study of technology for social change. Wissenbach further argues against the objectification of “communities”, calling for genuinely needs-driven, engaged research that is more aligned with practice.

Then again, here lies another imbalance. Even if scholars work alongside practitioners to bolster new critical community studies, these actors are not in the same positions. We, social scholars, are gatekeepers of what is known in academia; we are often more skilful at accessing funds; we dominate the lingo. Inclusion therefore lies at the heart of this argument and remains challenging.

If funds for critical data studies are not abundant, resources to put in place data projects with social goals and more practice-engaged research are even scarcer. That is, communities facing inequalities may find themselves competing for resources not only within their circles (as Ryan suggests). Speaking also as a data activist involved in projects that look at illegal fishing’s social impacts on coastal communities in developing countries (and trying hard to fund-raise for them), I think that we must make sure that more funding for data activism research does not mean less funding for data activist endeavours. I know they are not the same funds, but there are ways in which research could foster practice, and one of them is precisely putting communities at the centre.

Third, another divide lies underneath the academy’s resistance to engaged scholarship. While so-called “hard sciences” have no problems with “engaging”, some scholars in the “soft sciences” seem to recoil from it. Even if few people still support Chris Anderson’s “end of theory” musings (Anderson 2008), some techno-utopians pretend that a state of asepsis exists, or is at least possible now, in the age of big data. They could not be more misleading. What can be more “engaged scholarship” than “sponsored research”? Research driven and financed by companies is necessarily “engaged” with the private sector and its interests, but rarely acknowledges its own biases. Meanwhile, five decades after Robert Lynd asked “Knowledge for what?” (Lynd 1967), this question still looms over the social sciences. Some social scientists shy away from causes and communities just in case they start marching into the realm of advocacy and any pretensions of “objectivity” disappear. If we know that computational data-enabled science and engineering cannot be “objective”, why not accept and embrace engaged scholarship in the social sciences, as long as we are transparent about our prejudices and systematically critical about our assumptions?

Fourth, data activism scholars have to be smarter in communicating findings and influencing discourses. Our lack of influence is not all attributable to market trends and donors’ obsessions; it is also of our own making. Currently, the stories of data success and progress come mostly from the private sector. And even when prevailing techno-enthusiastic views are contested, prominent criticism comes from the same quarters. An example is Bernard Marr’s article “Here’s Why Data Is Not the New Oil”. Marr does not mention the obvious: that data are not natural resources, spontaneous and inevitable, but cultural ones, “made” in processes that are also “made” (Boellstorff 2013). In his article, Marr refers only to certain characteristics that make data different from oil; for example, while fossil fuels are finite, data are “infinitely durable and reusable”. Is that all that donors, and publics, need to know about data? Although this debate has grown over the last few years, is it reaching donors’ ears? Not long ago, at the Datamovida conference organised by Vizzuality in Madrid in 2016, a fellow speaker – Aditya Agrawal of the Open Data Institute and the Global Partnership for Sustainable Development Data – opened his presentation by saying precisely that data were “the new oil”. If key data people in the UN system have not caught up with the main ideas emerging from critical data studies, we are in trouble, and it is partly of our own making.

This last argument is closely related to the other ideas in this blog. The more we can influence policy, public opinion, decision-makers and processes, the more resources data activism scholars can gather to work alongside practitioners in exploring how people and organisations appropriate data and their processes, create new data relations and reverse dominant discourses. We cannot be content with publishing a few blogs, getting our articles into indexed journals and meeting once in a while at congresses that seldom resonate beyond our privileged bubbles. Both Wissenbach and Ryan argue for stronger collaborations and direct community engagement; but this is not the rule in the social sciences.

Making an effort to reach broader publics could be a way to break the domination that, as Ryan says, brands, market niches and revenue streams seem to exert on academic institutions. Academia is a bubble, but not an entirely hermetic one. And even if critical community studies will never be a “cash cow”, they could be influential. There are other critical voices in the field of journalism, for example, which have denounced a sort of obsession with technology (Kaplan 2013; Rosen 2014). Maybe critical community studies should embrace not only involved communities and scholars but also other critical voices from journalism, donors and other fields. The collective “we” that Ryan talks about could be even more inclusive. And to do that, we have to expand beyond the usual academic circles, which is exactly what Wissenbach and Ryan contend.

I do not know what critical community studies could look like; I hope this is the start of a conversation. In Madrid in April, donors, platform developers, data activists and journalists met at the “Big Data for the Social Good” conference, organised by my programme at the University of Deusto, to focus on what works and what does not in critical data projects. The more we expand these types of debate, the more influence we could gain.

Finally, the message emerging from critical data studies cannot be only about dataveillance (van Dijck 2014) and ways of data resistance. However imperfect and biased, the data infrastructure is enabling ordinary people and organised society to produce diagnoses and solutions to their problems (Gutiérrez 2018). Engaged research means we need to look at what communities do with data and how they experience the data infrastructure, not only at how communities contest dataveillance, which I have the feeling has dominated critical data studies so far. Yes, we have to acknowledge that these technologies are often shaped by external actors with vested interests before communities use them, and that they embed power imbalances. But if we want to capture people’s and donors’ imagination, the stories of data success and progress within organised and non-organised society should be told by social scholarship as well. Paraphrasing Ryan, we may lose but live to fight another day.

Cited work
Anderson, Chris. 2008. ‘The End of Theory: The Data Deluge Makes the Scientific Method Obsolete’. Wired. https://www.wired.com/2008/06/pb-theory/.
Boellstorff, Tom. 2013. ‘Making Big Data, in Theory’. First Monday 18 (10). http://firstmonday.org/article/view/4869/3750.
Dijck, Jose van. 2014. ‘Datafication, Dataism and Dataveillance: Big Data between Scientific Paradigm and Ideology’. Surveillance & Society 12 (2): 197–208.
Gutiérrez, Miren. 2018. Data Activism and Social Change. Palgrave Pivot. London: Palgrave Macmillan.
Kaplan, David E. 2013. ‘Why Open Data Isn’t Enough’. Global Investigative Journalism Network (GIJN). 2 April 2013. http://gijn.org/2013/04/02/why-open-data-isnt-enough/.
Lynd, Robert Staughton. 1967. Knowledge for What: The Place of Social Science in American Culture. Princeton: Princeton University Press. http://onlinelibrary.wiley.com/doi/10.1525/aa.1940.42.1.02a00250/pdf.
Rosen, Larry. 2014. ‘Our Obsessive Relationship With Technology’. Huffington Post, 2014. https://www.huffingtonpost.com/dr-larry-rosen/our-obsession-relationshi_b_6005726.html?guccounter=1.
Srigley, Ron. 2018. ‘Whose University Is It Anyway?’, 2018. https://lareviewofbooks.org/article/whose-university-is-it-anyway/#_ednref37.

[blog] Making ‘community’ critical: Tech collectives through the prism of power

Author: Fabien Cante

In her recent blog post, Kersti Wissenbach expresses her frustration with the field of “civic tech,” which, as she puts it, remains far more focused on the “tech” than the “civic.” This resonates with me in many ways. I write as someone who is possibly more of an outsider to the field than Wissenbach: my previous research was on local radio (all analog), and as my DATACTIVE colleagues have found out, I am clueless about even the basics of encryption, so anything more technically complex will leave me flummoxed. In agreeing with Wissenbach, then, I do not mean to diminish the wonders of tech itself, as a field of knowledge and intervention, but rather to underscore that the civic (or “the social,” as my former supervisor Nick Couldry would put it) is itself an immensely complex realm of knowledge, let alone action.

Wissenbach proposes that our thinking efforts, as scholars and activists concerned about the relations between technology and social change, shift toward “Critical Community Studies.” By this she means that we should ask “critical questions beyond technology and about communities instead.” I strongly agree. The research projects around data, tech and society that most excite me are the ones that are rooted in some community or other – transnational activist communities, in the case of DATACTIVE, or marginalised urban communities, in the case of the Our Data Bodies project. However, like Charlotte Ryan (whose response to Wissenbach can be read here), I would also like to be a bit cautious. In what follows, I really emphasise the critical and contextual aspects of Critical Community Studies, as envisioned by Wissenbach. I do so because I am a bit sceptical about the middle term – community.

I am involved in urban planning struggles in south London where the word “community” is frequently employed. Indeed, it serves as a kind of talisman: it invokes legitimacy and embeddedness. Community is claimed by activists, local authorities, and even developers, for obviously very different aims. This experience has shown me that, politically, community is an empty signifier. Bullshit job titles like “Community Manager” in marketing departments (see also Mark Zuckerberg speeches) further suggest to me that community is one of the most widely misappropriated words of our time.

More seriously, and more academically perhaps, community denotes a well-defined and cohesive social group, based on strong relationships, and as such a self-evident unit of analysis. This is not, in many if not most circumstances, what collectives actually look like in real life. Anthropologist John Postill (2008), studying internet uptake in urban Malaysia, writes that researchers too often approach tech users as either “communities” or “networks.” Neither of these concepts captures how technology is woven into social relations. Where community presumes strong bonds and a shared identity, network reduces human relations to interaction frequencies and distance between nodes, flattening power differentials.

As Wissenbach rightly notes, people who use tech, either as producers or users, are “complex [beings] embedded in civil society networks and power structures.” It is these power structures, and the often tense dynamics of embeddedness, that Wissenbach seems to find most interesting – and I do too. This, for me, is the vital question behind Critical Community Studies (or, for that matter, the study of data activism): what specific power relations do groups enact and contest?

Still from Incoming (2017), by Richard Mosse - http://www.richardmosse.com/projects/incoming

The critical in Critical Community Studies thus asks tough questions about race, class, gender, and other lines of inequality and marginalization. It asks how these lines intersect both in the community under study (in the quality of interactions, the kinds of capital required to join the collective, language, prejudices, etc.) and beyond it (in wider patterns of inequality, exclusion, and institutionalized domination). We see examples of such questioning happening, outside academia, through now widespread feminist critiques calling out pervasive gender inequalities in the tech industry, or through Data for Black Lives’ efforts to firmly center race as a concern for digital platforms’ diversity and accountability. Within the university, Seda Gürses, Arun Kundnani and Joris Van Hoboken’s (2016) paper “Crypto and Empire,” which could be said to examine the “crypto community” (however diffuse), offers some brilliant avenues to think data/tech communities critically, and thereby “re-politicize” data itself. More broadly, a wealth of feminist, post/decolonial (e.g. Mignolo 2011; Bhambra 2014; or Flavia Dzodan’s stellar Twitter feed) and critical race theory (see for example Browne 2015) can help us think through the histories from which civic tech communities arise, their positions in a complex landscape of power and inequality, and the ways in which they see their place in the world.

There is always a risk, when researchers consider community critically, that they put certain communities under strain; that they are seen to hinder positive work through their (our) critical discourse. Certainly, challenging a community’s inclusiveness is hard (and researchers are very bad at challenging their “own” community). But I think this is a limited view of critique as “not constructive” (a crime in certain circles where “getting things done” is a primary imperative). I would argue that collectives are strengthened through critique. As Charlotte Ryan beautifully puts it, Critical Community Studies can be instrumental in forming “a real ‘we’.” She adds: “an aggregate of individuals, even if they share common values, does not constitute ‘us’.” Building a “we” requires, at every step, asking difficult questions about who that “we” is (“we” the social movement, or “we” the civic tech community), who doesn’t fall under “we’s” embrace, and why.

Bibliography

Bhambra, Gurminder K. (2014) Connected Sociologies. London: Bloomsbury Press

Browne, Simone (2015) Dark Matters: On the Surveillance of Blackness. Durham, NC & London: Duke University Press

Gürses, Seda, Kundnani, Arun & Joris Van Hoboken (2016) “Crypto and Empire: The Contradictions of Counter-Surveillance Advocacy” Media, Culture & Society 38 (4), 576-590

Mignolo, Walter (2011) The Darker Side of Western Modernity: Global Futures, Decolonial Options. Durham, NC & London: Duke University Press

Postill, John (2008) “Localizing the Internet Beyond Communities and Networks” New Media & Society 10 (3), 413-431

[blog] Data by citizens for citizens

Author: Miren Gutierrez

In spite of what we know about how big data are employed to spy on us, manipulate us, lie to us and control us, there are still people who get excited by hype-generating narratives around social media influence, machine learning and business insights. At the other end of the spectrum, there is apocalyptic talk that preaches that we must become digital anchorites in small, secluded and secret cyber-cloisters.

Don’t get me wrong; I am a big fan of encryption and virtual private networks. And yes, the CEOs of technology corporations have more resources than governments to understand social and individual realities. The consequence of this unevenness is evident: companies do not share their information unless forced to, or in exchange for something else. Thus, public representatives and citizens lose their capacity for action vis-à-vis private powers.

But precisely because of the severe imbalances in practices of dataveillance (van Dijck 2014) it is vital to consider alternative forms of data that enable the less powerful to act with agency (Poell, Kennedy, and van Dijck 2015) in the era of the so-called “data power”. While the debate on big data is hijacked by techno-utopians and techno-pessimists and the big data progress stories come from the private sector, little is being said about what ordinary people and non-governmental organisations do with data; namely, how data are created, amassed and used by alternative actors to come up with their own diagnoses and solutions.


My new book Data activism and social change talks about how people and organised society are using the data infrastructure as a critical instrument in their quests. These people include fellow action-oriented researchers and number-churning practitioners and citizens generating new maps, platforms and alliances for a better world. And they are showing a high degree of ingenuity, against the odds.

The starting point of this book is an article in which Stefania Milan and I set the scene, link data activism to the tradition of citizens’ media and lay out the fundamental questions surrounding this new phenomenon (Milan and Gutierrez 2015).

Most of the thirty activists, practitioners and researchers I interviewed and the forty-plus organisations I observed for the book practise data activism in one way or another. In my analysis, I classify them into four not-so-neat boxes. The first are skills transferrers: organisations, such as DataKind, that transfer skills by deploying data scientists into non-governmental organisations so they can work together on projects. Other skills transferrers, for example Medialab-Prado and Civio, create platforms and tools or generate matchmaking opportunities for actors to meet and collaborate in data projects with social goals.

A second group – catalysts such as the Open Knowledge Foundation – sponsors some of these endeavours. A third group, journalism producers, can include journalistic organisations, such as the International Consortium of Investigative Journalists, or civil society organisations, such as Civio, providing analysis that can support campaigns and advocacy efforts.


This is a moment in the “Western Africa’s Missing Fish” map in which irregular fish transshipments are being conducted in Senegalese waters. See the interactive map here.

Proper data activists take it further: securing vital information and evidence of human rights abuses in sheltered archives (e.g. The Syrian Archive); recreating stories of human suffering and abuse (e.g. Forensic Architecture’s “Liquid Traces”); tracking illegal fishing and linking it to development issues (e.g. “Western Africa’s Missing Fish”, co-led by me at the Overseas Development Institute); visualising evictions and mobilising crowds to stop them (e.g. in San Francisco and Spain); and mapping citizen data to produce verified and actionable information during humanitarian crises and emergencies (e.g. the “Ayuda Ecuador” application of the Ushahidi platform), to mention just a few.

This classification is offered as a heuristic tool to think more methodically about real cases of data activism, and also to guide efforts to generate more projects.

We know datasets and algorithms do not speak for themselves and are not neutral. Data cannot be raw (Gitelman 2013); data and metadata are “made” in processes that are “made” as well (Boellstorff 2013). That is, data are not to be treated as natural resources, inevitable and spontaneous, but as cultural resources that need to be curated and stored. And the fact that the data infrastructure is employed in good causes does not abolish the prejudices and asymmetries present in datasets, algorithms, hardware and data processes. But the exciting thing is that, even using flawed technology, these activists get results.

But where do these activists get data from? Because data can be difficult to find…

How do activists get their hands on data?

Corporations do not usually give their data away, and the level of government openness is not fantastic: “1) Data is hard (or even impossible) to find online, 2) data is often not readily usable, 3) open licensing is rare practice and jeopardised by a lack of standards” (Global Open Data Index 2017). This lack of open access to public data is shocking when one considers that it is mostly information about how governments administer everyone’s resources and taxes.

So when governments and corporations do not open their data vaults, people get organised and generate their own data. This is the case of “Rede InfoAmazonia”, a project that maps water quality and quantity based on a network of sensors deployed by communities in the Brazilian Amazon. The map issues alarms to the community when water levels or quality move outside a range of standard indicators.

In my book, I discuss five ways in which data activists and practitioners can get their hands on data, from the simplest to the most complex: 1) someone else (e.g. a whistle-blower) can offer them the data; 2) data activists can resort to public data that can be acquired (e.g. automatic identification system signals captured by satellites from vessels) or are simply open; 3) they can generate communities to crowdsource citizen data; 4) they can appropriate data or resort to data scraping; and 5) they can deploy drones and sensors to gather images, or obtain data via primary research (e.g. surveys). Again, this taxonomy is offered as a tool to examine real cases.

Of these, crowdsourcing data can be a powerful process. An independent evaluation found that the crowdsourced map set up on the Ushahidi platform in Haiti in 2010 tackled “key information gaps” in the early period of the response, before large organisations were operative: it provided geolocalised data to small non-governmental organisations that did not have a field presence, offered situational awareness and rapid information with a high degree of accuracy, and enabled citizens’ decision-making (Morrow, Mock, and Papendieck 2011). The Haiti map marked a transformation in the way emergencies and crises are tackled, giving rise to digital humanitarianism.


Forensic Architecture’s Liquid Traces.

Other forms of obtaining data are quite impressive too. Forensic Architecture’s “Liquid Traces” employed AIS signals, the ships’ heat signatures, radar signals and other surveillance technologies to demonstrate that the failure to save a group of 72 people, who had been forced on board an inflatable craft by armed Libyan soldiers on 27 March 2011, was due to callousness, not the inability to locate them. Only nine would survive. Another organisation, WeRobotics, helps communities in Nepal analyse and map vulnerability to landslides in a changing climate.

Alliances, maps and hybridisation

From the observation of how these organisations work, I have identified eleven traits that define data activists and organisations.

One interesting commonality is that data activists tend to work in alliances. This sounds quite commonsensical. Either the problems these activists are trying to analyse and solve are too big to tackle on their own (e.g. a humanitarian crisis, climate change), or the datasets that they confront are too big (e.g. “Western Africa’s Missing Fish” and the ICIJ’s “Panama Papers” processed terabytes of data). I cannot think of any data project that does not include some form of collaboration.


The first Ushahidi map: Kenyan violence.

Another quality is that data activists often rely on maps as tools for analysis, coordination and mobilisation. Maps are objects bestowed with knowledge, power and influence (Denil 2011; Harley 1989; Hohenthal, Minoia, and Pellikka 2017). The rise of digital cartography, mobile media, data crowdsourcing platforms and geographic information systems reinforces the maps’ muscle. This trend overlaps with a growing interest in crisis and activist mapping, a practice that blends the capabilities of the geoweb with humanitarian assistance and campaigning. In the hands of people and organisations, maps have been a form of political counter-power (Gutierrez 2018). One example is Ushahidi’s first map (see map), which was set up in 2008 to bypass an information shutdown during the bloodbath that arose after the presidential elections in Kenya a year earlier, and to give voice to the anonymous victims. The deployment allowed victims to disseminate alternative narratives about the post-electoral violence.

The employment of maps is so usual in data activism that I have called this variety of data activism geoactivism, defined precisely by the way activists use digital cartography and often crowdsourced data to provide alternative narratives and spaces for communication and action. InfoAmazonia, an organisation dedicated to environmental issues and human rights in the Amazon region, is another example of an organisation specialised in visualising geolocalised data, in this case for journalism and advocacy. I defend the idea that this use of maps almost by default has generated a paradigm shift, standardising maps for humanitarianism and activism.

Vagabundos

Vagabundos de la chatarra, the book.

Besides, data activists usually do not have any qualms about mixing methods and tools from other trades. Not only are many data organisations hybrid, crossing the lines that separate journalism, advocacy, research and humanitarianism, but they also combine repertoires of action from different areas. An example is “Los vagabundos de la chatarra”, a year-long project that includes comics journalism, a book, interactive maps, videos and a website to tell the stories of the people who gathered and sold scrap metal for a living on the edges of Barcelona during the economic crisis that started in 2007 (Gutiérrez, Rodríguez, and Díaz de Guereñu 2018).

Civio, mentioned before, produces journalism, hosts data projects, and advocates on issues such as transparency, corruption, health and forest fires. “España en llamas” is a project hatched at Civio that, for the first time in Spain, paints a comprehensive picture of forest fires. Civio also opens the data behind these projects.

The values that motivate these data activists include sharing knowledge, collaborating and inspiring processes of social change and justice, uncovering and providing undisputable evidence for them, and deploying collective action powered by indignation and also by hope. These data activists deserve more attention.

*A version of this blog has been published at Medium.

References

Boellstorff, Tom. 2013. ‘Making Big Data, in Theory’. First Monday 18 (10). http://firstmonday.org/article/view/4869/3750.

Denil, Mark. 2011. ‘The Search for a Radical Cartography’. Cartographic Perspectives 68. http://cartographicperspectives.org/index.php/journal/article/view/cp68-denil/14.

Gitelman, Lisa, ed. 2013. Raw Data Is an Oxymoron. Cambridge and London: The MIT Press.

Global Open Data Index. 2017. ‘The GODI 2016/17 Report: The State Of Open Government Data In 2017’. https://index.okfn.org/insights/.

Gutiérrez, Miren. 2018. ‘Maputopias: Cartographies of Knowledge, Communication and Action in the Big Data Society – The Cases of Ushahidi and InfoAmazonia’. GeoJournal 1–20. https://doi.org/10.1007/s10708-018-9853-8.

Gutiérrez, Miren, Pilar Rodríguez, and Juan Manuel Díaz de Guereñu. 2018. ‘Journalism in the Age of Hybridization: Barcelona. Los Vagabundos de La Chatarra – Comics Journalism, Data, Maps and Advocacy’. Catalan Journal of Communication and Cultural Studies 10 (1): 43-62. https://doi.org/10.1386/cjcs.10.1.43_1

Harley, John Brian. 1989. ‘Deconstructing the Map’. Cartographica: The International Journal for Geographic Information and Geovisualization 26 (2): 1–20.

Hohenthal, Johanna, Paola Minoia, and Petri Pellikka. 2017. ‘Mapping Meaning: Critical Cartographies for Participatory Water Management in Taita Hills, Kenya’. The Professional Geographer 69 (3): 383–95. https://doi.org/10.1080/00330124.2016.1237294.

Milan, Stefania, and Miren Gutiérrez. 2015. ‘Citizens’ Media Meets Big Data: The Emergence of Data Activism’. Mediaciones 14. http://biblioteca.uniminuto.edu/ojs/index.php/med/article/view/1086/1027.

Morrow, Nathan, Nancy Mock, and Adam Papendieck. 2011. ‘Independent Evaluation of the Ushahidi Haiti Project’. Port-au-Prince: ALNAP. http://www.alnap.org/resource/6000.

Poell, Thomas, Helen Kennedy, and Jose van Dijck. 2015. ‘Special Theme: Data & Agency’. Big Data & Society. http://bigdatasoc.blogspot.com.es/2015/12/special-theme-data-agency.html.

van Dijck, Jose. 2014. ‘Datafication, Dataism and Dataveillance: Big Data between Scientific Paradigm and Ideology’. Surveillance & Society 12 (2): 197–208.


About Miren
Miren is a Research Associate at DATACTIVE. She is also a professor of Communication, director of the postgraduate programme “Data analysis, research and communication”, and member of the research team of the Communication Department at the University of Deusto, Spain. Miren’s main interest is proactive data activism, or how the data infrastructure can be utilized for social change in areas such as development, climate change and the environment. She is a Research Associate at the Overseas Development Institute of London, where she leads and participates in data-based projects exploring the intersection between biodiversity loss, environmental crime and development.

[blog] Can We Plan Slow – But Steady – Growth for Critical Studies?

Author: Charlotte Ryan (University of Massachusetts, Lowell / Movement-Media Research Action Project), member of the DATACTIVE ethics board.

This is a response post to the blog ‘Tech, data and social change: A plea for cross-disciplinary engagement, historical memory, and … Critical Community Studies‘ written by Kersti Wissenbach.

To maximize technologies’ value in social change efforts, Kersti Wissenbach urges researchers to join with communities facing power inequalities to draw lessons from practice. In short, the liberating potential of technologies for social change cannot be realized without holistically addressing broader inequalities. Her insights are many; in fact, communication activists and scholars could use her blog as a guide for ongoing conversations. Three points especially resonate with my experiences as a social movement scholar/activist working in collaboration with communities and other scholars:

  • Who is at the table?
    Wissenbach stresses the critical role of proactive communities in fostering technologies for social change as a corrective to the “dominant civic tech discourse [that] seems to keep departing from the ‘tech’ rather than the ‘civic’.” She stresses that an inclusive “we” emerges from intentional and sustained working relationships.
  • Power (and inequalities of power) matter!
    Acknowledging that technologies’ possibilities are often shaped long before many constituencies are invited to participate, Wissenbach asks those advancing social change technologies to notice the creation and recreation of power structures:
    “Only inclusive communities,” she cautions, “can really translate inclusive technology approaches, and consequently, inclusive governance.”
  • Tech for social change needs critical community studies
    Wissenbach calls for the emergence of critical community studies that, like critical development, communication, feminist, and subaltern studies, crosses disciplines, “taking the community as an entry point in the study of technology for social change.” Practitioners and scholars would reflect together to draw and disseminate shared lessons from experience. This would allow “communities, supposed to benefit from certain decisions, [to] have a seat on the table.”

Anyone interested in the potential of civic tech—activists, scholar-activists, engineers, designers, artists, or other social communication innovators—will warmly welcome Wissenbach’s vision of Critical Community Studies. She proposes not another sub-specialty with esoteric journals and self-referential jargon, but a research network of learning communities expanding conceptual dialogs across the usual divides. And, she recognizes the urgent need to preserve and broadly disseminate learning about technologies for social change.

I agree but cautiously. It is just what’s needed. But the academy tends to resist engaged scholarship. We need to think about where to locate transformative theory-building; sadly, calls to break with traditional research approaches may be more warmly received outside academic institutions than within. The academy itself, at least in the United States, is under duress. How would Critical Community Studies explain itself to academic institutions fascinated by brand, market niche, and revenue streams? Critical Community Studies is not likely to be a cash cow generating more profits faster, and with less investment. The U.S. trend to turn education into a profit-making industry may be extreme, but it raises the need to look before we leap.

Like Wissenbach, I entered the academy with deep roots in social movements and community activism. Like her, I want the academy to produce knowledge and technology for the social good. Like her, I want communities directly affected to be fully vested in all phases of learning. Like her, I am eager to move beyond vague calls for participation and inclusion. My experiences to date, however, give me pause.


Caption: Thirty years in buttons

In the mid-1980’s, I was among a dozen established and emerging scholars who formed the university-based Media Research Action Project (MRAP). We were well-positioned to bridge the theorist-practitioner divide; many of us had begun as movement activists and we had ties to practitioners. This made it easier for MRAP to work with under-represented and misrepresented communities and constituencies to identify and challenge barriers to democratic communication and to build communication capacity.

U.S. based social movements face recurring challenges: our movements hemorrhage learning between generations; we still need to grapple with the legacies of slavery, colonialism and jingoism; our labor movement has withered. Living amidst relative plenty, U.S. residents may feel far removed from crises elsewhere. Competitive individualism, market pressures, and dismantled social welfare programs leave U.S. residents feeling precarious —even if we embrace liberatory ideals.

In light of these material conditions, MRAP wanted to broaden political dialogs about equality and justice. At first, we focused on transferring communication skills in one- and two-day workshops. We soon realized that we needed ongoing working relationships to test strategies and build infrastructure and shared conceptual frameworks. But it took years to find the funds to run a more sustained program. Foundations, even when they liked our work, wanted us to ‘scale up’ fast (one national foundation asked us to take on 14 cities). In contrast, we saw building viable working relations as labor-intensive and slow. One U.S. federal agency offered hefty funding for proposals to “bridge the digital divide.” MRAP filed a book-length application with ten community partner organizations, eight in communities of color. The agency responded positively to MRAP’s plan and urged us to resubmit, but asked that we dump our partners and replace them with mainstream charities, preferably statewide.

And so the constraints tightened. Government and foundations’ preference for quick gains could marginalize (again) the very partners MRAP formed to support. To support ourselves, we could take day jobs, but this limited our availability. Over and over, we found—at least in the U.S. context—talk of addressing power inequalities far exceeded public will and deeds. Few mainstream institutions would commit the labor, skill, and time to reduce institutionalized power inequalities. Nor did they appreciate that developing shared lessons from practical experiences is labor intensive. (Wissenbach notes a number of these obstacles).

Despite all of the above, MRAP and our partners had victories. One neighborhood collaboration took over local political offices; another defeated an attempt to shut down an important community school; others passed legislation; and made common cause with the Occupy Movement to challenge the demonization of poor people in America. We won…sometimes. More often, we lost but lived to fight another day. And we helped document the ups and downs of our social movements. It was enormous fun even when it was really hard. As the designated holders and tellers of these histories, MRAP participants deepened our understanding of the macro-mezzo-micro interplay of political, social, economic, and cultural power.

From hundreds of conversations, dozens of collaborations, and gigabytes of notes, case studies, and foundation proposals, came a handful of collaborations that advanced our understanding of how U.S. movement organizations synchronize communication, political strategizing, coalition building, and leader and organizational development, and how groups integrate learning into ongoing campaigns.

We have begun to upload MRAP’s work at www.mrap.info. But those pursuing a transformed critical research tradition should acknowledge that the academy has resisted grounded practice, and that the best critical reflections have often been led by activists outside the academy, rooted in communities directly facing power inequalities. In light of this, Wissenbach’s insistence that communities directly affected “be at the table” becomes an absolute.

Let me turn to Critical Community Studies more specifically. To maximize publishing, U.S. scholars tend to communicate within, not across, disciplines. Anxious about slowing their productivity, they tend to avoid the unpredictability of practical work. For their part, civic tech networks and communities facing inequalities find themselves competing for resources, a competition that can undermine the very collaborations they want to build. Even if resources are located, efforts may fade when a grant ends or a government changes hands.

So while I welcome the call for researchers to join practitioners in designing mutually beneficial projects, I want to do it right, and that may mean doing it slowly. First off, who is the “we/us” mentioned twenty times by Wissenbach (or an equal number of times by me)? We need a real “we”: transforming institutional practices and priorities, whether in academic or communication systems, is a collective process. An aggregate of individuals, even if they share common values, does not constitute an “us”; social movements are dialogic communities that consider, test, and unite around strategies. (As Wissenbach underscores, “we” need to shift power, and this requires shared strategies, efficient use of sustainable resources, and a capacity to learn from experience.)

In short, transforming scholarly research from individual to collective models will take movement building. A first step may be recognizing that “we” needs to be built. Calling “we” a social construction does not mean it’s unreal; it means it’s our job to make it real.

Conclusion

I share Wissenbach’s respect for past and present efforts to lessen social inequalities via communication empowerment. I agree that “only inclusive communities can really translate inclusive technology approaches and, consequently, inclusive governance.” And I know that this will be hard to achieve. Progress may lie ahead, but so do precarity and heavy work. A beloved friend says to me these days, “Getting old is not for the faint of heart.” Neither is movement building.

 

Bibliography:

Howley, K. (2005). Community media: People, places, and communication technologies. Cambridge, UK; New York: Cambridge University Press.

Kavada, A. (2010). Email lists and participatory democracy in the European social forum. Media, Culture & Society, 32(3), 355. doi: 10.1080/13691180802304854

Kavada, A. (2013). Internet cultures and protest movements: The cultural links between strategy, organizing and online communication. In B. Cammaerts, A. Mattoni & P. McCurdy (Eds.), Mediation and protest movements (pp. 75–94). Bristol, England: Intellect.

Kidd, D., Barker-Plummer, B., & Rodriguez, C. (2005). Media democracy from the ground up: Mapping communication practices in the counter public sphere. Report to the Social Science Research Council. New York.

Kidd, D., Rodriguez, C., & Stein, L. (2009). Making our media: Global initiatives toward a democratic public sphere. Cresskill: Hampton Press.

Lentz, R. G., & Oden, M. D. (2001). Digital divide or digital opportunity in the Mississippi Delta region of the US. Telecommunications policy, 25(5), 291-313.

Lentz, R. G. (2011). Regulation as linguistic engineering. In Mansell, R., & Raboy, M. (Eds.), The Handbook of Global Media and Communication Policy (pp. 432–448). John Wiley & Sons.

Magallanes-Blanco, C., & Pérez-Bermúdez, J. A. (2009). Citizens’ publications that empower: social change for the homeless. Development in practice, 19(4-5), 654-664.

Mattoni, A. (2016). Media practices and protest politics: How precarious workers mobilise. Routledge.

Mattoni, A., & Treré, E. (2014). Media practices, mediation processes, and mediatization in the study of social movements. Communication theory, 24(3), 252-271.

Milan, S. (2009). Four steps to community media as a development tool. Development in Practice, 19(4-5), 598-609.

Rubin, N. (2002). Highlander media justice gathering final report. New Market, TN: Highlander Research and Education Center.

Treré, E., & Magallanes-Blanco, C. (2015). Battlefields, experiences, debates: Latin American struggles and digital media resistance. International Journal of Communication, 9, 3652–366.

[blog] #Data4Good, Part II: A necessary debate

By Miren Gutiérrez*
In the context of the Cambridge Analytica scandal, fake news, the use of personal data for propagandistic purposes and mass surveillance, the Postgraduate Programme “Data analysis, research and communication” proposed a singular debate on how the (big) data infrastructure and other technologies can serve to improve people’s lives and the environment. The discussion was conceived as the second part of an ongoing conversation that started in Amsterdam with the Data for the Social good conference in November 2017.

We understand that four communities converge in the realisation of data projects with social impact: organisations that transfer skills, create platforms and tools and generate opportunities; the catalysts, which provide the funds and the means; those that produce data journalism; and the data activists. However, we rarely see them debate together in public. Last April 12, at the headquarters of the Deusto Business School in Madrid, we met with representatives of these four communities, namely:


From left to right in the picture: Adolfo Antón Bravo, head of the DataLab at Medialab-Prado, where he has led the experimentation, production and dissemination of projects around data culture and the promotion of open data. Adolfo has also been a representative of the Open Knowledge Foundation Spain, a catalyst organisation dedicated to financing and promoting data projects, among others.

Mar Cabra, a well-known investigative journalist specialising in data analysis, who has been in charge of the Data and Research Unit of the International Consortium of Investigative Journalists (ICIJ), winner of the 2017 Pulitzer Prize for the investigation known as the “Panama Papers”.

Juan Carlos Alonso, designer at Vizzuality, an organisation that builds applications helping people better understand data through visualisation and comprehend global processes such as deforestation, disaster preparedness, the global flow of trade in agricultural products, and action against climate change around the world.

Ignacio Jovtis, head of Research and Policies of Amnesty International Spain. AI uses testimonies, digital cartography, data and satellite photography to denounce and produce evidence of human rights abuses, for example in the war in Syria and the military appropriation of Rohingya land in Myanmar.

And Juanlu Sánchez, another well-known journalist, co-founder and deputy director of eldiario.es, who specialises in digital content, new media and independent journalism. Drawing on data analysis, he has led and collaborated on various investigative stories that have rocked Spain, such as the Bankia scandal.

The prestigious illustrator Jorge Martín facilitated the conversation with a 3.5×1 m mural summarising the main issues tackled by the panellists and the audience.


The conference’s formula was not conventional, as the panellists were asked not to offer a typical presentation, but to engage in a dialogue with the audience, most of whom belonged to the four communities mentioned earlier, representing NGOs, foundations, research centres and news media organisations.

Together, we talked about:

• the secret of successful data projects combining a “nose for a good story”, legwork (including hanging out in bars) and data in sufficient quantity and quality;
• the need to merge wetware and algorithms;
• the skills gaps within organisations;
• the absolute necessity to collaborate to tackle datasets and issues that are too big to handle alone;
• the demand to engage funders at all levels, from individuals to foundations, to make these projects possible;
• the advantages of a good visualisation for both analysis and communication of findings;
• where and how to obtain data, when public data is often not actually public, much less open;
• the need for projects of any nature to have real social impact and shape policy;
• the combination of analogue methodologies (e.g. interviews, testimonies, documents) with data-based methodologies (e.g. satellite imagery, interactive cartography and statistics), and how this is disrupting humanitarianism, human rights and environmental campaigning, and newsrooms;
• the need to digitise paper archives (e.g. using optical character recognition) to incorporate the past into the present;
• the magic of combining seemingly unrelated datasets;
• the imperative to share not only datasets but also code, so others can contribute to the conversation, for example by exploring avenues that were not apparent to us;
• the importance of generating social communities around projects;
• the blurring of lines separating journalism, activism and research when it comes to data analysis;
• the experiences of using crowds, not only to gather data but also to analyse them.

Cases and issues discussed included Amnesty’s “troll patrol”, an initiative that assigns digital volunteers to analyse abusive tweets aimed at women, and its investigation of the army’s appropriation of Rohingya land in Myanmar based on satellite imagery; Trase, a Vizzuality project that tracks agricultural trade flows (including commodities such as soy, beef and palm oil), based both on massive digitised datasets and on the paper trail left by commodities in ports; the “Panama Papers”, and the massive collaborative effort that involved analysing 2.6 terabytes of data across 109 media outlets in 76 countries; the successful eldiario.es business model, based on data and investigative journalism and supported by subscribers who believe in independent reporting; and the DataLab’s workshops, focused on data journalism and visualisation, which have been running for six years and have given birth to projects still active today.

The main conclusions could be summarised as follows:

1) the human factor (wetware) is as essential for the success of data projects with social impact as software and hardware, since technology alone is not a magic bullet;
2) collaboration among actors from the four communities, with their different competencies and resources, is essential for these projects to succeed and have an impact; and
3) a social transformation is also needed within non-profit and media organisations, so that data culture spreads far and wide and the data infrastructure is maximised for the transformation of the whole of society and the conservation of nature.

* Dr Miren Gutiérrez is the director of the postgraduate Programme “Data analysis, research and communication” at the University of Deusto and a Lecturer on Communication. She is also a Research Associate at Datactive.

Para exercer plenamente a cidadania, é preciso conhecer os filtros virtuais (Época Negócios)

Stefania was commissioned to write an article by the Brazilian business magazine Época Negócios. In sum, she argues that “estar ciente dos elementos que moldam profundamente nossos universos de informação é um passo fundamental para deixarmos de ser prisioneiros da internet” (being aware of the elements that profoundly shape our information universes is a fundamental step toward ceasing to be prisoners of the internet). Continue reading the article in Portuguese online. Here you can read the original in English.

Why personalization algorithms are ultimately bad for you (and what to do about it)

Stefania Milan

I like bicycles. I often search online for bike accessories, clothing, and bike races. As a result, the webpages I visit as well as my Facebook wall often feature ads related to biking. The same goes for my political preferences, or my last search for the cheapest flight or the next holiday destination. This information is (usually) relevant to me. Sometimes I click on the banner; mostly, I ignore it. In most cases, I hardly notice it but process and “absorb” it as part of “my” online reality. This unsolicited yet relevant content contributes to making me feel “at home” in my wanderings around the web. I feel amongst my peers.

Behind the efforts to carefully target web content to our preferences are personalization algorithms. Personalization algorithms are at the core of social media platforms, dating apps, and generally of most of the websites we visit, including news sites. They make us see the world as we want to see it. By forging a specific reality for each individual, they silently and subtly shape customized “information diets”.

Our life, both online and offline, is increasingly dependent on algorithms. They shape our way of life, helping us find a ride on Uber or hip, fast food delivery on Foodora. They might help us find a job (or lose one), and locate a partner for the night or for life on Tinder. They mediate our news consumption and the delivery of state services. But what are they, and how do they do their magic? An algorithm can be seen as a recipe for baking an apple tart: just as grandma’s recipe tells us, step by step, what to do to make it right, in computing an algorithm tells the machine what to do with data, namely how to calculate or process it, and how to make sense of it and act upon it. As forms of automated reasoning, they are usually written by humans; however, they operate in the realm of artificial intelligence: with the ability to train themselves over time, they might eventually take on a life of their own, so to speak.
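The recipe metaphor can be made concrete in a few lines of code. The toy sketch below is an invented illustration, not drawn from any real system: an explicit sequence of steps applied to data, exactly what an algorithm is.

```python
# A toy "recipe": turn raw temperature readings into a single summary figure,
# step by step, the way a kitchen recipe prescribes each action in order.

def summarise_readings(readings):
    # Step 1: discard obviously broken values (sensor glitches).
    cleaned = [r for r in readings if -50 <= r <= 60]
    # Step 2: convert Celsius to Fahrenheit.
    fahrenheit = [r * 9 / 5 + 32 for r in cleaned]
    # Step 3: reduce to one number the machine can act upon.
    return sum(fahrenheit) / len(fahrenheit)

print(summarise_readings([21.0, 19.5, 999.0, 20.5]))  # the 999.0 glitch is ignored
```

Nothing here is intelligent; the point is that each step is explicit and mechanical, which is all "algorithm" means before any self-training enters the picture.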

The central role played by algorithms in our life should be of concern, especially if we conceive of the digital as complementary to our offline self. Today, our social dimension is simultaneously embedded in and (re)produced by technical settings. But algorithms, proprietary and opaque, are invisible to end users: their outcome is visible (e.g., the manipulated content that shows up on one’s customized interface), but it bears no indication of having been manipulated, because algorithms leave no trace and “exist” only when operational. Nevertheless, they do create rules for social interaction, and these rules indirectly shape the way we see, understand and interact with the world around us. And far from being neutral, they are deeply political in nature, designed by humans with certain priorities and agendas.

While there are many types of algorithms, what affects us most today are probably personalization algorithms. They mediate our web experience, easing our choices by giving us information which is in tune with our clicking habits—and thus, supposedly, preferences.

They make sure the information we are fed is relevant to us, selecting it on the basis of our prior search history, social graph, gender and location, and, generally speaking, all the information we directly or unwittingly make available online. But because they are invisible to the eyes of users, most of us are largely unaware this personalization is even happening. We believe we see “the real world”, yet it is just one of many possible realities. This contributes to enveloping us in what US internet activist and entrepreneur Eli Pariser called the “filter bubble”, that is to say, the intellectual isolation caused by algorithms constantly guessing what we might like or not, based on the ‘image’ they have of us. In other words, personalization algorithms might eventually reduce our ability to make informed choices, as the options we are presented with and exposed to are limited and repetitive.
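A crude sketch can make the mechanism tangible. The following invented example ranks a feed by past clicks; the topic labels, item structure and ranking rule are illustrative assumptions, not any platform's actual algorithm. It shows how content we have already engaged with crowds out everything else:

```python
from collections import Counter

# Invented sketch of a personalization step: items whose topic the user has
# clicked before score higher, so the feed narrows toward past behaviour --
# the basic mechanism behind the "filter bubble".

def rank_feed(click_history, candidate_items):
    """Order candidates by how often the user clicked their topic before."""
    topic_affinity = Counter(click_history)
    return sorted(candidate_items,
                  key=lambda item: topic_affinity[item["topic"]],
                  reverse=True)

history = ["cycling", "cycling", "politics", "cycling"]
feed = rank_feed(history, [
    {"title": "Election analysis", "topic": "politics"},
    {"title": "New bike gear",     "topic": "cycling"},
    {"title": "Opera review",      "topic": "culture"},
])
print([item["title"] for item in feed])
```

The cycling story rises to the top and the never-clicked topic sinks to the bottom; iterate this loop daily and the "image" the system holds of the user hardens into the only reality shown.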

Why should we care, if all of this eventually is convenient and makes our busy life easier and more pleasant?

First of all, this is ultimately surveillance, be it corporate or institutional. Data is constantly collected about us and our preferences, and it ends up “standing in” for the individual, who is made to disappear in favour of a representation that can be effortlessly classified and manipulated. “When you stare into the Internet, the Internet stares back into you”, once tweeted digital rights advocate @Cattekwaad. The web “stares back” by tracking our behaviours and preferences, and profiling each of us into categories ready for classification and targeted marketing. We might think of the Panopticon, a circular building designed in the late 18th century by the philosopher Jeremy Bentham as “a new mode of obtaining power of mind over mind” and intended to serve as a prison. In this special penal institution, a single guard would effortlessly be able to observe all inmates without them being aware of the condition of permanent surveillance they are subjected to.

But there is a fundamental difference between the idea of the Panopticon and today’s surveillance ecosystem. The jailbirds of the internet age are not only aware of the constant scrutiny they are exposed to; they actively and enthusiastically participate in generating data, prompted by the imperative to participate on social media platforms. In this respect, as the UK sociologist Roy Boyne explained, the data collection machines of personalization algorithms can be seen as post-Panopticon structures, whereby a model rooted in coercion has been replaced by mechanisms of seduction in the age of big data. The first victim of personalization algorithms is our privacy, as we seem keen to sacrifice freedom (including the freedom to be exposed to various opinions and the freedom from the attention of others) on the altar of today’s aggressive personalized marketing, in favour of convenience and functionality.

The second victim of personalization algorithms is diversity, of both opinions and preferences, and the third and ultimate casualty is democracy. While this might sound like an exaggerated claim, personalization algorithms dramatically—and especially, silently—reduce our exposure to different ideas and attitudes, helping us to reinforce our own and allowing us to disregard any other as “non-existent”. In other words, the “filter bubble” created by personalization algorithms isolates us in our own comfort zone, preventing us from accessing and evaluating the viewpoints of others.

The hypothesis of the existence of a filter bubble has been extensively tested. On the occasion of the recent elections in Argentina last October, the Italian hacker Claudio Agosti, in collaboration with the World Wide Web Foundation, conducted research using facebook.tracking.exposed, a software tool intended to “increase transparency behind personalization algorithms, so that people can have more effective control of their online Facebook experience and more awareness of the information to which they are exposed.”

The team ran a controlled experiment with nine profiles created ad hoc, a sort of “lab experiment” in which the profiles were artificially polarized (e.g., keeping some variables constant, each profile “liked” different items). Not only did the data confirm the existence of a filter bubble; it showed a dangerous reinforcement effect, which Agosti termed “algorithm extremism”.

What can we do about all this? This question has two answers. The first is easy but uncomfortable. The second is a strategy for the long run and calls for an active role.

Let’s start with the easy one. We ultimately retain a certain degree of human (and democratic) agency: at any given moment, we can choose to opt out. To be sure, erasing our Facebook account doesn’t do the trick of protecting our long-eroded privacy: the company has the right to retain our data, as per the Terms of Service, the long, convoluted legal document (a contract, that is) we all sign but rarely read. With the “exit” strategy we lose contacts, friendships and joyful exchange, and we are no longer able to sneak into the lives of others, but we gain in privacy and, perhaps, reclaim our ability to think autonomously. I bet not many of you will do this after reading this article; I haven’t myself found the courage to disengage entirely from my leisurely existence on social media platforms.

But there is good news. As the social becomes increasingly entrenched in its algorithmic fabric, there is a second option, a sort of survival strategy for the long run. We can learn to live with, and deal with, algorithms. We can familiarize ourselves with their presence, engaging in a self-reflexive exercise that questions what they show us in any given interface, and why. While understandably not all of us will be inclined to learn the ropes of programming, “knowing” the algorithms that so much affect us is a fundamental step towards fully exercising our citizenship in the age of big data. “Knowing” here means primarily becoming acquainted with their occurrence and function, and questioning the fact that being turned into a pile of data is almost an accepted fact of life these days. Because being able to think with one’s own head today also means questioning the algorithms that so much shape our information worlds.


[blog] Critical reflections on FAT* 2018: a historical idealist perspective

Author: Sebastian Benthall, Research Scientist at NYU Steinhardt and PhD Candidate UC Berkeley School of Information.

In February, 2018, the inaugural 2018 FAT* conference was held in New York City:

The FAT* Conference 2018 is a two-day event that brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems. This inaugural conference builds on success of prior workshops like FAT/ML, FAT/Rec, DAT, Ethics in NLP, and others.

FAT stands for “Fairness, Accountability, Transparency”, and the asterisk, pronounced “star”, is a wildcard character, which indicates that the conference ranges more widely than the earlier workshops it succeeds, such as FAT/ML (ML meaning “machine learning”) and FAT/Rec (Rec meaning “recommender systems”). You might conclude from the amount of geekery in the title and history of the conference that FAT* is a computer science conference.

You would be half right. Other details reveal that the conference has a different, broader agenda. It was held at New York University’s Law School, and many of the committee chairs are law professors, not computer science professors. The first keynote speaker, Latanya Sweeney, argued that technology is the new policy as more and more decisions are delegated to automated systems. The responsibility of governance, it seems, is falling to the creators of artificial intelligence. The keynote speaker on the second day was Prof. Deborah Hellman, who provided a philosophical argument for why discrimination is morally wrong. This opened into a conversation about the relationship between random fate and justice with computer scientist Cynthia Dwork. The other speakers in the program in one way or another grappled with the problem of how to responsibly wield technological power over society.

It was a successful conference, and it holds great promise as a venue for future work. It has this promise because it has been set up to expand intellectually beyond the confines of the current state of discourse around accountability and automation. This post is about the tensions within FAT* that make it intellectually dynamic. FAT* reflects the conditions of a particular historical, cultural, and economic moment. The contention of this post is that the community involved in the conference has the opportunity to transcend that moment if it confronts its own contradictions head-on through praxis.

One significant tendency among the research at FAT* was the mathematization of ethics. Exemplified by Menon and Williamson’s “The cost of fairness in binary classification” (2018) (winner of a best paper award at the conference), many researchers come to FAT* to translate ethical injunctions, and the tradeoffs between them, into mathematical expressions. This striking intellectual endeavor sits at the center of a number of controversies between the humanities and sciences that have been going on for decades and continue today.

As has long been recognized in the foundational theory of computer science, computational algorithms are powerful because they are logically equivalent to the processes of mathematical proof. Algorithms, in the technical sense of the term, can be no more and no less powerful than mathematics itself. It has long been a concern that a world controlled by algorithms would be an amoral one; in his 1947 book Eclipse of Reason, Max Horkheimer argued that the increasing use of formal reason (which includes mathematics and computation) for pragmatic purposes would lead to a world dominated by industrial power that was indifferent to human moral considerations of what is right or good. Hannah Arendt, in The Human Condition (1959), wrote about the power of scientists who spoke in obscure mathematical language and were therefore beyond the scrutiny of democratic politics. Because mathematics is universal, it is unable to express political interests, which arise from people’s real, particular situations.

We live in a strikingly different time from the mid-20th century. Ethical concerns with the role of algorithms in society have been brought to trained computer scientists, and their natural and correct inclination has been to determine the mathematical form of the concern. Many of these scholars would sincerely like to design a better system.

Perhaps disappointingly, all the great discoveries in foundations of computing are impossibility results: the Halting Problem, the No Free Lunch theorem, etc. And it is no different in the field of Fairness in Machine Learning. What computer scientists have discovered is that life isn’t, and can’t be, fair, because “fairness” has several different definitions (twenty-one at last count) that are incompatible with each other (Hardt et al., 2016; Kleinberg et al., 2016). Because there are inherent tradeoffs to different conceptions of fairness and any one definition will allocate outcomes differently for different kinds of people, the question of what fairness is has now been exposed as an inherently political question with no compelling scientific answer.
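To make the incompatibility concrete, here is a minimal sketch with made-up data. It computes two of the competing fairness criteria for a hypothetical binary classifier: demographic parity (equal selection rates across groups) and equal opportunity in the sense of Hardt et al. (2016) (equal true positive rates across groups). The groups, outcomes, and decisions are invented for illustration; the point is that when base rates differ between groups, a classifier can satisfy one criterion while violating the other.

```python
# Toy illustration (invented data): two common fairness criteria that
# generally cannot both hold when base rates differ across groups.
# Each record is (group, y, yhat): true outcome y, classifier decision yhat.

def rate(records, pred):
    """Fraction of records satisfying pred (0.0 if records is empty)."""
    if not records:
        return 0.0
    return len([r for r in records if pred(r)]) / len(records)

data = [
    ("A", 1, 1), ("A", 1, 1), ("A", 0, 1), ("A", 0, 0),
    ("B", 1, 1), ("B", 0, 0), ("B", 0, 0), ("B", 0, 0),
]

for g in ("A", "B"):
    grp = [r for r in data if r[0] == g]
    positives = [r for r in grp if r[1] == 1]
    # Demographic parity compares P(yhat=1 | group) across groups.
    selection_rate = rate(grp, lambda r: r[2] == 1)
    # Equal opportunity compares P(yhat=1 | y=1, group) across groups.
    true_positive_rate = rate(positives, lambda r: r[2] == 1)
    print(g, selection_rate, true_positive_rate)
```

On this data both groups have a true positive rate of 1.0 (equal opportunity holds), yet group A is selected at rate 0.75 and group B at 0.25 (demographic parity fails), simply because the groups have different base rates of positive outcomes. Which criterion to enforce is exactly the kind of political choice the impossibility results leave open.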

Naturally, computer scientists are not the first to discover this. What’s happened is that it is their turn to discover this eternal truth, because in this historical moment computer science is the scientific discipline most emblematic of power. This is because the richest and most powerful companies, the ones almost everybody depends on daily, are technology companies, and these companies project the image that their success is due mainly to the scientific genius of their early employees and the quality of the technology at their operational core.

The problem is that computer science as a scientific discipline has very little to do with why large technology companies have so much power and sometimes abuse that power. These companies are much more than their engineers; they also include designers, product managers, salespeople, public relations people, and of course executives and shareholders. As sociotechnical organizations, they are most responsive to the profit motive, government regulations, and consumer behavior. Even if being fair were technically possible, they would still be businesses with very non-technical reasons for being unfair or unaccountable.

Perhaps because these large companies are so powerful, few of the papers at the conference critiqued them directly. Instead, the focus was often on the software systems used by municipal governments. These were insightful and important papers. Barabas et al.’s paper questioned the assumptions motivating much of the inquiry around “fairness in machine learning” by delving into the history and ideology of actuarial risk assessment in criminal sentencing. Chouldechova et al.’s case study in the workings of a child mistreatment hotline (winner of a best paper award) was a realistic and balanced study of the challenges of operating an algorithmic risk assessment system in municipal social services. At its best, FAT* didn’t look much like a computer science conference at all, even when the speakers and authors had computer science training. At its best, FAT* was grappling towards something new.

Some of this grappling is awkward. Buolamwini and Gebru presented a technically and politically interesting study of how commercially available facial recognition technologies underperform on women, on darker-skinned people, and intersectionally on darker-skinned women. In addition to presenting their results, the speakers proudly described how some of the facial recognition companies responded to their article by improving the accuracy of their technology. For some at the conference, this was a victory for fairer representation and accountability of a facial recognition technology that was otherwise built to favor lighter-skinned men. But others found it difficult to celebrate the improved effectiveness of a technology for automated surveillance. Out of context, it’s impossible to know whether this technology does good or ill to those wearing the faces it recognizes. What was presented as a form of activism against repressive or marginalizing political forces may just as well have been playing into their hands.

This political ambiguity was glossed over, not resolved. And therein lay the crux of the political problem at the heart of FAT*: it’s full of well-intentioned people trying to discover technical band-aids for what are actually systemic social and economic problems. Their intentions and their technical contributions are both laudable. But there was something ideologically fishy going on, a fishiness reflective of a broader historical moment. Nancy Fraser (2016) has written about the phenomenon of progressive neoliberalism, an ideology that sounds like an oxymoron but in fact reflects the alliance between the innovation sector and identity-based activist movements. Fraser argues that progressive neoliberalism was a hegemonic force until very recently. This year’s FAT*, with its mainly progressive sense of Fairness and Accountability and arguably neoliberal emphasis on computational solutions, was a throwback to what for many at the conference was a happier political time. I hope that next year’s conference takes a cue from Fraser and is more critical of the zeitgeist.

For now, as a form of activism that changes things for the better, this year’s conference largely fell short because it would not address the systemic elephants in the room. A dialectical sublation is necessary and imminent. To achieve it, the conference may need to add another letter to its name, representing another value. Michael Veale has suggested that the conference add an “R”, for reflexivity, perhaps a nod to the cherished value of critical qualitative scholars, who are clearly welcome in the room. However, if the conference is to realize its highest potential, it should add a “J”, for justice, and see what the bright minds of computer science think of that.

References

Arendt, Hannah. The Human Condition. Doubleday, 1959.

Barabas, Chelsea, et al. “Interventions over Predictions: Reframing the Ethical Debate for Actuarial Risk Assessment.” arXiv preprint arXiv:1712.08238 (2017).

Buolamwini, Joy, and Timnit Gebru. “Gender shades: Intersectional accuracy disparities in commercial gender classification.” Conference on Fairness, Accountability and Transparency. 2018.

Chouldechova, Alexandra, et al. “A case study of algorithm-assisted decision making in child maltreatment hotline screening decisions.” Conference on Fairness, Accountability and Transparency. 2018.

Fraser, Nancy. “Progressive neoliberalism versus reactionary populism: A choice that feminists should refuse.” NORA-Nordic Journal of Feminist and Gender Research 24.4 (2016): 281-284.

Hardt, Moritz, Eric Price, and Nati Srebro. “Equality of opportunity in supervised learning.” Advances in Neural Information Processing Systems. 2016.

Hellman, Deborah. “Indirect Discrimination and the Duty to Avoid Compounding Injustice.” (2017).

Horkheimer, Max. Eclipse of Reason. 1947. Reprint, New York: Continuum, 1974.

Kleinberg, Jon, Sendhil Mullainathan, and Manish Raghavan. “Inherent trade-offs in the fair determination of risk scores.” arXiv preprint arXiv:1609.05807 (2016).

[blog] Cloud communities and the materiality of the digital (GLOBALCIT project, EUI)


This invited blog post originally appeared in the forum ‘Cloud Communities: The Dawn of Global Citizenship?’ of the GLOBALCIT project (European University Institute). It is part of an interesting multidisciplinary conversation accessible from the GLOBALCIT website. I wish to thank Rainer Bauböck and Liav Orgad for the invitation to contribute to the debate.

Cloud communities and the materiality of the digital

By Stefania Milan (University of Amsterdam)

As a digital sociologist, I have always found ‘classical’ political scientists and lawyers a tad too reluctant to embrace the idea that digital technology is a game changer in so many respects. In the debate spurred by Liav Orgad’s provocative thoughts on blockchain-enabled cloud communities, I am particularly fascinated by the tension between techno-utopianism on the one hand (above all, Orgad and Primavera De Filippi), and socio-legal realism on the other (e.g., Rainer Bauböck, Michael Blake, Lea Ypi, Jelena Dzankic, Dimitry Kochenov). I find myself somewhere in the middle. In what follows, I take a sociological perspective to explain why there is something profoundly interesting in the notion of cloud communities, why however little of it is really new, and why the obstacles ahead are bigger than we might like to think. The point of departure for my considerations is a number of experiences in the realm of transnational social movements and governance: what can we learn from existing experiments that might help us contextualize and rethink cloud communities?

Three problems with Orgad’s argument

To start with, while I sympathise with Orgad’s provocative claims, I cannot but notice that what he deems new in cloud communities—namely the global dimension of political membership and its networked nature—is in fact rather old. Since the 1990s, transnational social movements for global justice have offered non-territorial forms of political membership—not unlike those described as cloud communities. Similar to cloud communities, these movements were the manifestation of political communities based on consent, gathered around shared interests and only minimally rooted in physical territories corresponding to nation states (see, e.g., Tarrow, 2005). In the fall of 2011 I observed with earnest interest the emergence of yet another global wave of contention: the so-called Occupy mobilisation. As a sociologist of the web, I set off in search of a good metaphor to capture the evolution of organised collective action in the age of social media, and the obvious candidate was… the cloud. In a series of articles (see, for example, here and here) and book chapters (e.g., here and here), I developed my theory of ‘cloud protesting’, intended to capture how the algorithmic environment of social media alters the dynamics of organized collective action. In light of my empirical work, I agree with Bauböck, who acknowledges that cloud communities might have something to do with the “expansion of civil society, of international organizations, or of traditional territorial polities into cyberspace”. He also points out how, sadly, people can express their political views – and, I would add, engage in disruptive actions, as happens at some fringes of the movement for global justice – only because “a secure territorial citizenship” protects their exercise of fundamental rights, such as freedom of expression and association. Hence the questions a sociologist might ask: do we really need the blockchain to enable the emergence of cloud communities?
If, as I argue, the existence of “international legal personas” is not a pre-requisite for the establishment of cloud communities, what would the creation of “international legal personas” add to the picture?[1]

Secondly, while I understand why a blockchain-enabled citizenship system would make life easier for the many who do not have access to a regular passport, I am wary of its “institutionalisation”, on account of the probable discrepancies between the ideas (and the mechanisms) associated with a Westphalian state and those of political activists and radical technologists alike. On the one hand, citizens interested in “advanced” forms of political participation (e.g., governance and the making of law) might not necessarily be inclined to form a state-like entity. For example, many accounts of the so-called “movement for global justice” (McDonald, 2006; della Porta & Tarrow, 2005) show how “official” membership and affiliation is often not required, not expected and especially not considered desirable. Activism today is characterised by a dislike and distrust of the state, and a tendency to privilege flexible, multiple identities (e.g., Bennett & Segerberg, 2013; Juris, 2012; Milan, 2013). On the other hand, the “radical technologists” behind the blockchain project are animated by values—an imaginaire (Flichy, 2007)—deeply distinct from those of the state (see, e.g., Reijers & Coeckelbergh, 2018). While blockchain technology is enabled by a complex constellation of diverse actors, it is legitimate to ask whether a technology built with an “underlying philosophy of distributed consensus, open source, transparency and community”, intended to “be highly disruptive” (Walport, 2015), can be bent to serve purposes similar to those of states.

Thirdly, Orgad’s argument falls short of a clear description of what the ‘cloud’ stands for in his notion of cloud communities. When thinking about ‘clouds’, as a metaphor and a technical term, we cannot but think of cloud computing, a “key force in the changing international political economy” (Mosco, 2014, p. 1) of our times, which entails a process of centralisation of software and hardware allowing users to reduce costs by sharing resources. The cloud metaphor, I argued elsewhere (Milan, 2015), is an apt one as it exposes a fundamental ambivalence of contemporary processes of “socio-legal decentralisation”. While claiming distance from the values and dynamics of the neoliberal state, a project of building blockchain-enabled communities still relies on commercially-owned infrastructure to function.

Precisely to reflect on this ambiguity, my most recent text on cloud protesting interrogates the materiality of the cloud. We have long lived in the illusion that the internet was a space free of geography. Yet, as IR scholar Ron Deibert argued, “physical geography is an essential component of cyberspace: Where technology is located is as important as what it is” (original italics). The Snowden revelations, to name just one example, have brought to the forefront the role of the national state in—openly or covertly—setting the rules of user interactions online. What’s more, we can no longer blame the state alone, but rather the “surveillant assemblage” of state and corporations (Murakami Wood, 2013). To me, the big absence in this debate is the private sector and corporate capital. De Filippi briefly mentioned how the “new communities of kinship” are anchored in “a variety of online platforms”. However, what Orgad’s and partially also Bauböck’s contributions underscore is the extent to which intermediation by private actors stands in the way of creating a real alternative to the state—or at least the fulfilment of certain dreams of autonomy, best represented today by the fascination for blockchain technology. Bauböck rightly notes that “state and corporations… will find ways to instrumentalise or hijack cloud communities for their own purposes”. But there is more to it: the infrastructure we use to enable our interpersonal exchanges and, why not, the blockchain, are owned and controlled by private interests subjected to national laws. They are not merely neutral pipes, as Dumbrava reminds us.

Self-governance in practice: A cautionary tale

To be sure, many experiments allow “individuals the option to raise their voice … in territorial communities to which they do not physically belong”, as beautifully put by Francesca Strumia. Internet governance is a case in point. Since the early days of the internet, cyberlibertarian ideals, enshrined for instance in the ‘Declaration of Independence of Cyberspace’ by the late John Perry Barlow, have attributed little to no role to governments—both in deciding the rules for the ‘new’ space and the citizenship of its users (read: the right to participate in the space and in the decision-making about the rules governing it). In those early flamboyant narratives, cyberspace was to be a space where users—but really engineers above all—would translate into practice their wildest dreams in matters of self-governance, self-determination and, to some extent, fairness. While cyberlibertarian views have been appropriated by conservative (anti-state) and progressive forces alike, some of their founding principles have spilled over into real governance mechanisms—above all the governance of standards and protocols by the Internet Engineering Task Force (IETF), and the management of the Domain Name System (DNS) by the Internet Corporation for Assigned Names and Numbers (ICANN).[2] Here I focus on the latter, where I have been active for about four years (2014-2017).

ICANN is organized in constituencies of stakeholders, including contracted parties (the ‘middlemen’, that is to say registries and registrars that on a regional basis allocate and manage the names and numbers on behalf of ICANN, and whose relationship with ICANN is regulated by contract), non-contracted parties (corporations doing business on the DNS, e.g. content or infrastructure providers) and non-commercial internet users (read: us). ICANN’s proceedings are fully recorded and accessible from its website; its public meetings, thrice a year and rotating around the globe, are open to everyone who wants to walk in. Governments are represented in a sort of United Nations-style entity called the Government Advisory Committee. While corporate interests are well represented by an array of professional lobbyists, the Non-Commercial Stakeholder Group (NCSG), which stands in for civil society,[3] is a mixed bag of advocates of varied backgrounds, expertise and nationality: internet governance academics, nongovernmental organisations promoting freedom of expression, and independent individuals who take an interest in the functioning of the logical layer of the internet.

The 2016 transition of the stewardship over the DNS from the US Department of Commerce to the “global multistakeholder community” has achieved a dream unique in its kind, straight out of the cyberlibertarian vision of the early days: the technical oversight of the internet[4] is in the hands of the people who make and use it, and the (advisory) role of the state is marginal. Accountability now rests solely within the community behind ICANN, which envisioned (and is still implementing) a complex system of checks and balances to allow the various stakeholder voices to be fairly represented. No other critical infrastructure is regulated by its own users. To build on Orgad’s reasoning, the community around ICANN is a cloud community, which operates by voluntary association and consensus,[5] and is entitled to produce “governance and the creation of law”.[6]

But the system is far from perfect. Let’s look at how so-called civil society is represented, focusing on one such entity, the NCSG. Firstly, given that everyone can participate, the variety of views represented is enormous, which often hinders the ability of the constituency to be effective in policy negotiations. Yet the size of the group is relatively small: at the time of writing, the Non-Commercial User Constituency (the larger of the two that form the NCSG) comprises “538 members from 161 countries, including 118 noncommercial organizations and 420 individuals”, making it the largest constituency within ICANN. This is nothing when compared to the global internet population it serves, confirming, as Dzankic argues, that “direct democracy is not necessarily conducive to broad participation in decision-making”. Secondly, ICANN policy-making is highly technical and specialised; the learning curve is dramatically steep. Thirdly, to be effective, the amount of time a civil society representative should spend on ICANN is largely incompatible with a regular daily job; civil society cannot compete with corporate lobbyists. Fourthly, with ICANN meetings rotating across the globe, one needs to be on the road for at least a month per year, with considerable personal and financial costs.[7] In sum, while participation is in principle open to everyone, informed participation has much higher access barriers, which have to do with expertise, time, and financial resources (see, e.g., Milan & Hintz, 2013).

As a result, we observe a number of dangerous distortions of political representation. For example, when only the highly motivated participate, the views and “imaginaries” represented are often at the opposite ends of the spectrum (cf. Milan, 2014). Only the most involved really partake in decision-making, in a mechanism well known in sociology: the “tyranny of structurelessness” (Freeman, 1972), which is typical of participatory, consensus-based organising. The extreme personalisation of politics that we observe within civil society at ICANN—a small group of long-term advocates with high personal stakes—also yields another, similar mechanism, known as “the tyranny of emotions” (Polletta, 2002), by which the most invested, regardless of the suitability of their curricula vitae, end up assuming informal leadership roles—and, as the case of ICANN shows, even in the presence of formal and carefully weighted governance structures. Decision-making is thus based on a sort of “microconsensus” within small decision-making cliques (Gastil, 1993).[8] To make things worse, ICANN is increasingly making exceptions to its own, community-established rules, largely under the pressure of corporations as well as law enforcement: for example, the corporation has recently been accused of bypassing consensus policy-making through voluntary agreements and private contracting.

Why not (yet?): On new divides and bad players

In conclusion, while I value the possibilities blockchain technology opens for experimentation as much as Primavera De Filippi does, I do not believe it will really solve our problems in the short to medium term. Rather, as is always the case with technology because of its inherent political nature (cf. Bijker, Hughes, & Pinch, 2012), new conflicts will emerge—and they will concern both its technical features and its governance.

Earlier contributors to this debate have raised important concerns which are worth listening to. Besides Bauböck’s concerns over the perils for democracy represented by a consensus-based, self-governed model, endorsed also by Blake, I want to echo Lea Ypi’s reminder of the enormous potential for exclusion embedded in technologies, as digital skills (but also income) are not equally distributed across the globe. For the time being, a citizenship model based on blockchain technology would be for the elites only, and would contribute to creating new divides and amplifying existing ones. The first fundamental step towards the cloud communities envisioned by Orgad would thus see the state stepping in (once again) and taking charge of creating appropriate data and algorithmic literacy programmes whose scope is out of reach for corporations and organised civil society alike.

There is more to it, however. The costs of blockchain technology to our already fragile ecosystem are rising along with its popularity. These infrastructures are energy-intensive: writing about the cryptocurrency Bitcoin, the tech magazine Motherboard estimated that each transaction consumes 215 kilowatt-hours of electricity—the equivalent of the weekly consumption of an American household. A world built on blockchain would have a vast environmental footprint (see also Mosco, 2014). Once again, the state might play a role in imposing adequate regulation mindful of the environmental costs of such programs.

But I do not intend to glorify the role of the state. On the contrary, I believe we should also watch out for any attempts by the state to curb innovation. The relatively brief history of digital technology, and even more that of the internet, is awash with examples of late but extremely damaging state interventions. As soon as a given technology performs roles or produces information that are of interest to the state (e.g., interpersonal communications), the state wants to jump in, and often does so in pretty clumsy ways. The recent surveillance scandals have abundantly shown how state powers firmly inhabit the internet (cf., Deibert, 2009; Deibert, Palfrey, Rohozinski, & Zittrain, 2010; Lyon, 2015)—and, as the Cambridge Analytica case reminds us, so do corporate interests. Moreover, the two are, more often than not, dangerously aligned.

I do not intend, with my cautionary tales, to hinder any imaginative effort to explore the possibilities offered by blockchain to rethink how we understand and practice citizenship today. The case of Estonia shows that different models based on alternative infrastructure are possible, at least on a small scale and in the presence of a committed state. As scholars we ought to explore those possibilities. Much work is needed, however, before we can proclaim the blockchain revolution.

References

Bennett, L. W., & Segerberg, A. (2013). The Logic of Connective Action Digital Media and the Personalization of Contentious Politics. Cambridge, UK: Cambridge University Press.

Bijker, W. E., Hughes, T. P., & Pinch, T. (Eds.). (2012). The Social Construction of Technological Systems. New Direction in the Sociology and History of Technology. Cambridge, MA and London, England: MIT Press.

Deibert, R. J. (2009). The geopolitics of internet control: censorship, sovereignty, and cyberspace. In A. Chadwick & P. N. Howard (Eds.), The Routledge Handbook of Internet Politics (pp. 323–336). London: Routledge.

Deibert, R. J., Palfrey, J. G., Rohozinski, R., & Zittrain, J. (Eds.). (2010). Access Controlled: The Shaping of Power, Rights, and Rule in Cyberspace. Cambridge, MA: MIT Press.

della Porta, D., & Tarrow, S. (Eds.). (2005). Transnational Protest and Global Activism. Lanham, MD: Rowman & Littlefield.

Flichy, P. (2007). The internet imaginaire. Cambridge, Mass.: MIT Press.

Freeman, J. (1972). The Tyranny of Structurelessness.

Gastil, J. (1993). Democracy in Small Groups. Participation, Decision Making & Communication. Philadelphia, PA and Gabriola Island, BC: New Society Publishers.

Juris, J. S. (2012). Reflections on #Occupy Everywhere: Social Media, Public Space, and Emerging Logics of Aggregation. American Ethnologist, 39(2), 259–279.

Lyon, D. (2015). Surveillance After Snowden. Cambridge and Malden, MA: Polity Press.

McDonald, K. (2006). Global Movements: Action and Culture. Malden, MA and Oxford: Blackwell.

Milan, S. (2013). WikiLeaks, Anonymous, and the exercise of individuality: Protesting in the cloud. In B. Brevini, A. Hintz, & P. McCurdy (Eds.), Beyond WikiLeaks: Implications for the Future of Communications, Journalism and Society (pp. 191–208). Basingstoke, UK: Palgrave Macmillan.

Milan, S. (2015). When Algorithms Shape Collective Action: Social Media and the Dynamics of Cloud Protesting. Social Media + Society, 1(1).

Milan, S., & Hintz, A. (2013). Networked Collective Action and the Institutionalized Policy Debate: Bringing Cyberactivism to the Policy Arena? Internet & Policy, 5, 7–26.

Milan, S., & ten Oever, N. (2017). Coding and encoding rights in internet infrastructure. Internet Policy Review, 6(1).

Mosco, V. (2014). To the Cloud: Big Data in a Turbulent World. New York: Paradigm Publishers.

Murakami Wood, D. (2013). What Is Global Surveillance?: Towards a Relational Political Economy of the Global Surveillant Assemblage. Geoforum, 49, 317–326.

Polletta, F. (2002). Freedom Is an Endless Meeting: Democracy in American Social Movements. Chicago: University of Chicago Press.

Reijers, W., & Coeckelbergh, M. (2018). The Blockchain as a Narrative Technology: Investigating the Social Ontology and Normative Configurations of Cryptocurrencies. Philosophy & Technology, 31(1), 103–130.

Tarrow, S. (2005). The New Transnational Activism. New York: Cambridge University Press.

Walport, M. (2015). Distributed Ledger Technology: Beyond Blockchain. London: UK Government Office for Science.

Notes:

[1] I am aware that there is a fundamental drawback in social movements when compared to cloud communities: unlike the latter, the former are not rights providers. However, these are the questions one could ask taking a sociological perspective.

[2] The system of unique identifiers of the DNS comprises the so-called “names”, standing in for domain names (e.g., www.eui.eu), and “numbers”, or Internet Protocol (IP) addresses (the “machine version” of a domain name that, for example, a router can understand). The DNS can be seen as a sort of “phone book” of the internet.

[3] Civil society representation in ICANN is more complex than what is described here. The NCSG is composed of two (litigious) constituencies, namely the Non-Commercial User Constituency (NCUC) and the Non-Profit Operational Concerns (NPOC). In addition, “non-organised” internet users can elect their representatives in the At-Large Advisory Committee (ALAC), organised on a regional basis. The NCSG, however, is the only one that directly contributes to policy-making.

[4] Technically, of the DNS, which is only a portion of what we call “the internet”, although the most widely used one.

[5] ICANN is both a nonprofit corporation registered under Californian law, and a community of volunteers who set the rules for the management of the logical layer of the internet by consensus. See also the ICANN Bylaws (last updated in August 2017).

[6] This should at least in part address Post’s doubts about the ability of a political community to govern those outside of its jurisdiction. One might argue that internet users are, perhaps unwillingly or simply unconsciously, within the “jurisdiction” of ICANN. I do believe, however, that the case of ICANN is an interesting one for its being in between the two “definitions” of political communities.

[7] ICANN allocates considerable but not sufficient resources to support civil society participation in its policymaking. These include travel bursaries, accommodation costs, and fellowship programs for the induction of newcomers.

[8] Although a quantitative analysis of the stickiness of participation in relation to discursive change reveals a more nuanced picture (see, for example, Milan & ten Oever, 2017).

 

[blog] Tech, data and social change: A plea for cross-disciplinary engagement, historical memory, and … Critical Community Studies

Kersti R. Wissenbach | March 2018

It has been a while since I first got my feet into the universe of technology and socio-political change. Back then, coming from a critical development studies and communication science background, I was fascinated by the role community radio could play in fostering dialogue among communities in remote areas, and between those communities and their government representatives.

My journey started in the early 2000s, in the most remote parts of Upper West Ghana, with Radio Progress, a small community radio station doing a great job in embracing diversity. Single-feature mobile phones were about to become a thing in the country, and the radio started to experiment with call-in programs to engage citizens in live discussions with local politicians. Before, radio volunteers would drive to the different villages to collect people’s concerns, and only then bring those recorded voices back into a studio-based discussion with invited politicians. The community could merely listen in as their concerns were discussed. With the advent of mobile phones, people could suddenly do more than just passively listen to the responses: finally they could engage in real-time dialogue with their representatives, hearing their own voices on air. Typically, people would gather with family and other community members during the call-in hours to voice their concerns collectively. Communities would not only raise concerns, but also share positive experiences of local representatives following up on their requests. These stories encouraged neighbouring communities to get involved in the call-in programs as well, raising their own concerns and needs.

Fast forward to today and much has changed on the ‘tech for social change’ horizon, at least if we listen to donor agendas and the dominant discourses in the field and in academia. But what has really changed is largely one thing: the state of technology [1]. In the space of two decades, our enthusiasm, and donor attention, has fixed successively on the ubiquity of mobile technologies, followed by online (crowdsourcing) platforms, social media, everything data (oh, wait … BIG data), and blockchain technology.

Whilst much of what has changed in these regards over the last few decades can be bundled under the Information and Communication Technologies for Development (ICT4D) label, one aspect remains constant: change, if it is meant to happen and last, has to be rooted in the contexts and needs of those it intends to address. This is the ultimate ingredient for the direct and inclusive engagement of so-called civil society. Like a cake that needs yeast to rise, no matter whether we add chocolate or lemon, socio-political change in the interest of the people requires the buy-in of the people, no matter what tech is on the menu at a certain moment in time, and in a certain place in the world.

We have learnt many lessons along the way, sometimes the hard way. Some are condensed in initiatives such as the Principles for Digital Development, a living set of principles helping practitioners who engage with the role of technologies in social or political change programs to learn from past experiences, in order to avoid falling into the same traps – be they technological, political, or ethical in nature.

We have observed an upsurge in ‘civic’ uses of technologies for facilitating people’s direct engagement in governance, coupled with an emphasis on ‘open government’ models. Much of this work emerged in parallel to or from earlier ICT4D experiences, and largely taps into the same funding structures. The lessons learned should be a shared heritage in the field. With various early programs coming to an end, this transnational community of well-intended practitioners, many of whom have been involved in what we earlier called ICT4D work, is now reflecting on the effectiveness of technology in promoting civil society participation in governance dynamics. What puzzles me year after year, however, is how practitioners of civic tech and open government, currently producing ‘first lessons learned’ on the effectiveness of technology in civil society participation in governance, are largely reproducing what we already know, and thus lessons we should already have learnt. As critical as I am towards project work driven by traditional development cooperation, all this leaves me wondering what is novel, if anything, in these newest networks – largely drawing from the same funding pots.

New developments in the tech field do not liberate us from the responsibility to learn from what has already been learned – and to build on it. The lessons learnt in decades of development communication and ICT4D work evidently cut across technological innovations, and apply to mobile technology as much as to the blockchain. Most importantly: different socio-political contexts call for tailored solutions, given that the challenges remain distinct and increase in complexity, as we can see in the growing literature on critical data studies (see e.g. Dalton et al., 2016; Kitchin and Lauriault, 2014).

The critical role of proactive communities, their contexts, and their needs in fostering social or political change has been discussed for decades. Besides, as the Radio Progress anecdote shows, it applies across technologies. Sadly, once again, the dominant civic tech discourse seems to keep departing from the ‘tech’ rather than the ‘civic’. Analyses start off from the technology-in-governance side, rather than from the much-needed critical discourse on the fundamental role of power in governance: how it is constructed, reproduced, and distributed.

Departing from the aseptic end of the spectrum confines us to a tech-centric perspective, with all the limitations highlighted since the early days of Communication for Social Change and ICT4D critique. Instead, we should reflect on how power structures are seeded and nourished from within the very same communities. This relates to issues such as geographical as much as skill-related biases, which generate patterns of exclusion that no technology alone can solve. Those biases are then reproduced, not solved, by technological solutions whose aim would be, instead, to enable inclusive forms of governance.

For the civic tech field to move forward, we should move beyond an emphasis on feedback allocation and end-users, which ultimately centres on the technological component; we should instead adopt a broader perspective that recognises the user not merely as a tech consumer/adopter, but as a complex being embedded in civil society networks and power structures. We should therefore ask critical questions beyond technology, and about communities instead; we should ask ourselves, for example, how best to integrate people’s needs and backgrounds across all stages of civic tech programs. Such a perspective should include a critical examination of who the driving forces of the civic tech community are and how they subsequently affect decision-making on the development of infrastructures. What is crucial to understand, I argue, is that only inclusive communities can really translate into inclusive technology approaches and, consequently, inclusive governance.

From the perspective of an academic observer, a disciplinary evolution is in order too, if we are to capture, understand, and critically contribute to these dynamics. The proposed shift of focus from the ‘tech’ to the ‘civic’ should be mirrored in the literature with a new sub-field, which we may call Critical Community Studies. Emerging at the crossroads of disciplines such as Social Movement Studies, Communication for Social Change, and Critical Data Studies, Critical Community Studies would encourage taking the community as the entry point in the study of technology for social change. This means, in a case such as the civic tech community, addressing issues such as internal diversity, the inclusiveness of decision-making processes, and the different ways of engaging people. It also relates to the roots of decisions made in civic tech projects, and to how far the communities supposed to benefit from those decisions have a seat at the table. More generally, Critical Community Studies should invite critical reflection on the concept of inclusion, both in practitioner agendas and in academic frameworks. It would also encourage us to contextualise, take a step back, and ask difficult questions, departing from critical development and communication studies (see e.g. Enghel, 2014; Freire, 1968; Rodriguez, 2016), while taking a feminist perspective (see e.g. Haraway, 1988; Mol, 1999).

Since such a disciplinary evolution cannot but happen in dialogue with existing approaches and thinkers, I would wish to see this post evolve into a vibrant, cross-disciplinary conversation on what Critical Community Studies could look like.

 

I would like to thank Stefania Milan for very valuable and in-depth feedback and insights whilst writing this post.

 

 

Cited work

Dalton CM, Taylor L and Thatcher (alphabetical) J (2016) Critical Data Studies: A dialog on data and space. Big Data & Society 3(1): 2053951716648346. DOI: 10.1177/2053951716648346.

Enghel F (2014) Communication, Development, and Social Change: Future Alternatives. In: Global communication: new agendas in communication. Routledge, pp. 129–141.

Freire P (1968) Pedagogy of the Oppressed. New York: Herder and Herder.

Haraway D (1988) Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective. Feminist Studies 14(3): 575–599. DOI: 10.2307/3178066.

Kitchin R and Lauriault T (2014) Towards Critical Data Studies: Charting and Unpacking Data Assemblages and Their Work. ID 2474112, SSRN Scholarly Paper. Rochester, NY: Social Science Research Network. Available at: https://papers.ssrn.com/abstract=2474112 (accessed 19 March 2018).

Mol A (1999) Ontological politics. A word and some questions. The Sociological Review 47(S1): 74–89. DOI: 10.1111/j.1467-954X.1999.tb03483.x.

Rodriguez C (2016) Human agency and media praxis: Re-centring alternative and community media research. Journal of Alternative and Community Media 1(0): 36–38.

 

[1] I am consciously not using the term innovation here, since I truly believe that innovation can only be what truly fits people’s contexts and needs. Innovation, then, is not to be confused with the latest tech advancement or hype.

[blog] Facebook newsfeed changes: Three hypotheses to look into the future

Image: Vincenzo Cosenza

In this blog post, DATACTIVE research associate Antonio Martella looks ahead to the consequences of Facebook’s news feed modifications, which result from larger corporate policy changes. He investigates and discusses the implications through three hypotheses: 1) the divide between the attention-rich and the attention-poor will grow; 2) increasing engagement with peer-created content will tighten the filter-bubble aspect of networking; and 3) the “new” news feed will have a negative impact on users’ mood.

Guest Author: Antonio Martella

On November 11th, 2017, Facebook announced that the user news feed would change in January 2018. In their words:

“With this update, we will also prioritize posts that spark conversations and meaningful interactions between people. To do this, we will predict which posts you might want to interact with your friends about and show these posts higher in the feed. These are posts that inspire back-and-forth discussion in the comments and posts that you might want to share and react to – whether that’s a post from a friend seeking advice, a friend asking for recommendations for a trip, or a news article or video prompting lots of discussions. […] We will also prioritize posts from friends and family over public content, consistent with our News Feed values.” (Newsroom Facebook 2018)

Any modification to the feed algorithm will have many consequences, and these are not equally predictable. Facebook is a very complicated environment, semi-public in nature and not only related to friendship management. In fact, as the Pew Research Center reported last September, 67% of Americans consume news via social media. This pattern seems to apply to European news consumption too, where youngsters are exposed to news mostly in a social media context rather than via television or newspapers. Indeed, as the Reuters Institute’s Digital News Report 2017 shows, many users follow others because of the news they share.

According to the Pew Research report, Facebook surpasses other social media as a source of news consumption. This is partially due to Facebook’s large user base, and partially because news is interwoven with people’s timelines. The Digital News Report also shows that exposure to news on Facebook is often incidental: a direct result of news shared by other users, of the wide range of news companies that are followed, and so on. Nevertheless, we need to keep in mind that exposure to any content on social media or search engines is algorithm-driven.

Following these considerations, there are several possible consequences of the Facebook news feed changes. This blog post investigates three probable implications:

  1. the divide between the attention-rich and the attention-poor will grow;
  2. continuous personalisation;
  3. a negative impact on users’ mood.

1. The divide between the attention-rich and the attention-poor will grow

All pages and groups that share content on Facebook will lose visibility, as well as the revenues that come from users reading their posts, clicking their links, and visiting their websites [1]. It is easy to guess that those who want to remain visible have two choices: either pay more for Facebook ads in order to make their posts visible, or create more engaging content. But the engagement generated on Facebook is deeply connected with the number of followers. This will probably widen the gap between the attention-rich and the attention-poor, in line with the observed Matthew effect (Merton, 1968) that rules many patterns and practices online (Barabasi, 2013) and in social media.

In fact, many aspects of society, both online and offline, are governed by the preferential attachment process that lies behind the so-called “Matthew effect” or the “80/20 rule”. Hence, the more connections you have, the more visible you are, and the more new connections you will get as a consequence. This principle is easily illustrated by the fact that famous websites and people tend to have more followers on social media. But the other way around is equally true: the fewer connections you have, the less attention you will get. In conclusion, content produced by people or organisations with less power, fewer resources, and lower budgets will decrease in visibility.
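The “rich get richer” dynamic of preferential attachment can be sketched in a few lines of Python. This is a toy simulation for illustration only (the function name and parameters are my own, not taken from the cited works): each new account follows one existing account, chosen with probability proportional to how many followers that account already has.

```python
import random

def preferential_attachment(n_accounts, seed=42):
    """Simulate follower growth where each newcomer follows one existing
    account, picked with probability proportional to its current count."""
    rng = random.Random(seed)
    followers = [1, 1]   # two seed accounts, one follower each
    # Flat list in which account i appears followers[i] times,
    # so rng.choice() samples proportionally to follower count.
    targets = [0, 1]
    for new in range(2, n_accounts):
        chosen = rng.choice(targets)
        followers[chosen] += 1
        followers.append(1)
        targets.extend([chosen, new])
    return followers

followers = preferential_attachment(10_000)
top_1pct = sum(sorted(followers, reverse=True)[:100]) / sum(followers)
print(f"share of links held by the top 1% of accounts: {top_1pct:.0%}")
```

Run with these settings, the top 1% of accounts end up holding several times their “fair” 1% share of all follower links, while most accounts keep the single follower they started with, illustrating the 80/20 pattern the post describes.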

2. Continuous personalisation

The second consequence of the news feed change deals with the kind of content that will be dominant in users’ feeds. According to Mark Zuckerberg, content produced and shared by “friends and family” will be more visible in all Facebook timelines. But a news feed dominated by friends’ posts could arguably exacerbate two negative social media phenomena, previously described through the notions of the filter bubble and the echo chamber. Online social networks developed on social media platforms are strongly based on homophily (Barberá, 2014; Aiello et al., 2012), meaning that users connect with others who share similar interests, values, political views, etc. This typical behaviour is also found in offline social networks (McPherson, Smith-Lovin, Cook, 2001), and shows its most problematic characteristics when we focus on information diffusion.

On the one hand, this change will reinforce the filter bubble in which we are all enclosed. In fact, filter bubbles (Pariser, 2011) are the result of users’ activities on the web: social media algorithms continuously learn from every user’s clicks and likes [2]. On the other hand, more homophily in social media due to the prevalence of “friends and family” content could easily sustain the echo chamber effect. This phenomenon preceded social media platforms, for like-minded people love to talk to each other, reinforcing their opinions and biases. However, on social media it is easier to avoid contrasting points of view, values, or interests as a consequence of the self-selection of “friends”, pages, and groups. Indeed, as research has highlighted, users tend to promote their favourite narratives and to form polarised groups on Facebook (Quattrociocchi, Scala, Sunstein, 2016; Bakshy, Messing, Adamic, 2015), even though this is not a clear-cut, deterministic process (Barberá et al., 2015).

Based on these considerations, another outcome of the news feed changes will be a growth in the visibility of friends’ opinions and points of view. This will most probably result in a more polarised information flow in users’ news feeds, and in a limited number of differing points of view and of professional (or semi-professional) content. In practice, this means that for a contested news story, such as whether glyphosate causes cancer, we have to take into account that information sources will be more socially driven; the chance of encountering a different point of view or professional news coverage will be smaller than before.
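The click-driven learning loop behind the filter bubble can be illustrated with a toy simulation. The topics, click rates, and function below are invented for this sketch; it is a stylised model of reinforcement-based ranking, not Facebook’s actual algorithm.

```python
import random

def simulate_feed(steps=2000, topics=("politics", "sports", "cooking"),
                  favourite="politics", seed=1):
    """Toy ranking loop: the feed shows topics in proportion to the
    weights it has learned, and every click reinforces that weight."""
    rng = random.Random(seed)
    weights = {t: 1.0 for t in topics}   # the ranker's learned preferences
    shown = {t: 0 for t in topics}       # impressions per topic
    for _ in range(steps):
        # Show a topic with probability proportional to its learned weight.
        topic = rng.choices(topics, weights=[weights[t] for t in topics])[0]
        shown[topic] += 1
        # The user clicks their favourite 70% of the time, others 10%.
        click_rate = 0.7 if topic == favourite else 0.1
        if rng.random() < click_rate:
            weights[topic] += 1.0        # reinforcement: clicks narrow the feed
    return shown

shown = simulate_feed()
print(shown)
```

Even though the simulated user never hides the other topics, the reinforcement loop alone narrows the feed: clicks raise a topic’s weight, a higher weight yields more impressions, and more impressions yield more clicks.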

3. Negative impact on users’ mood

The news feed changes will probably influence the mood of billions of people in an inscrutable way. One could argue that a news feed more heavily populated by friends’ content would have a negative impact on happiness. According to Mark Zuckerberg, “the research shows that when we use social media to connect with people we care about, it can be good for our well-being”. In fact, an experiment conducted on users’ timelines (Kramer, Guillory, Hancock, 2014) showed that content in the timeline does indeed influence users’ mood. As many researchers have shown, personal feelings (happiness, depression, etc.) flow through offline social networks (Fowler, Christakis, 2008), and their representation in online environments seems to follow similar diffusion patterns. In other words: moods spread contagiously online. By extension, recent scholarly and non-scholarly work shows that scrolling through your Facebook feed can have a negative impact on well-being (Shakya, Christakis, 2017) [3]. Lastly, the constant bombardment of everyone’s updates, biased by each user’s attempt to present the best version of themselves, seems to have a negative impact on happiness.

Questions to ask

Through these hypotheses, I have tried to show some real-life aspects that might be affected by these important changes to Facebook’s algorithms. As Facebook has stated, the platform has around 2 billion monthly active users.

These considerations evoke two questions:

  1. Can such changes be made by a private company without any form of public discussion?
  2. Do we have a democratic right to scrutinise algorithms as organisers of public space?

Further information on how Facebook’s algorithms work can be found here: an interesting article by Share Lab that tries to shed some light on what lies behind the platform.

 

References

Aiello, Luca Maria, Barrat, Alain, Schifanella, Rossano, Cattuto, Ciro, Markines, Benjamin, Menczer, Filippo. 2012. Friendship prediction and homophily in social media. ACM Trans. Web 6, 2, Article 9, 33 p. 66.

Bakshy, Eytan, Messing, Solomon, Adamic, Lada A. 2015. Exposure to ideologically diverse news and opinion on Facebook. Science, 5 June 2015: Vol. 348, Issue 6239, pp. 1130–1132.

Pariser, Eli. 2011. The Filter Bubble: What the Internet Is Hiding from You. London: Penguin.

Quattrociocchi, Walter, Scala, Antonio, Sunstein, Cass R. 2016. Echo Chambers on Facebook. Available at SSRN: https://ssrn.com/abstract=2795110.

Shakya, Holly B., Christakis, Nicholas A. 2017. Association of Facebook Use With Compromised Well-Being: A Longitudinal Study in American Journal of Epidemiology, 185:3, pp. 203–211.

Rogers, Richard, 2015. Digital Methods for Web Research, in Emerging Trends in the Social and Behavioral Sciences: An Interdisciplinary, Searchable, and Linkable Resource (ed. Scott, Roberts; Buchmann, Marlis C.; Kosslyn Stephan), Wiley & Sons: New York

 

  1. For example, this is exactly what happened to the blog LittleThings, which had to shut down a month after the news feed change due to the drop in web traffic.
  2. This is already happening, as an Italian experiment on Facebook partially showed during the last Italian election (link, unfortunately only in Italian). According to this experiment, the Facebook news feed shows different kinds of content and media (photos, videos, web links) based on each user’s likes, comments, and shares. Indeed, according to Facebook’s statements, proposed content will be based more on each user’s (algorithmically predicted) intention to interact, fostering the visibility of tailored content.
  3. For example, «Liking others’ content and clicking links posted by friends were consistently related to compromised well-being, whereas the number of status updates was related to reports of diminished mental health» (Shakya, Christakis, 2017, p. 210).

 

On the author: Antonio is a PhD candidate in Political Science at the University of Pisa. His research focuses on political leaders’ populism in social media. His approach follows the Digital Methods for Web Research recommendations (Rogers, 2015), and he is particularly interested in social media algorithms and their effects.