
[BigDataSur] India’s Aadhaar: The datafication of anti-poverty programmes and its implications

By Silvia Masiero, Loughborough University

The notion of datafication implies rendering existing objects, actions and processes into data. Widely studied in the field of business intelligence, datafication is known to restructure consumer behaviour and the functioning of markets in multiple ways. But a less-widely researched aspect pertains to the datafication of public welfare and social protection programmes, on which the livelihoods of many poor and vulnerable people worldwide are based. The field of information and communication technology for development (ICT4D), which for more than thirty years has focused on the roles of informatics in development processes, is coming to realize the growing importance of datafication in the enactment of social policies.

Datafication acquires a particular meaning when referring to anti-poverty programmes, social protection schemes designed specifically for the poor. In such schemes, what is converted into machine-readable data is first of all the population of entitled users. This restructures two core functions of anti-poverty schemes: first, the recognition of beneficiaries, automating the process that distinguishes entitled individuals and households from non-entitled ones; second, the correct assignment of entitlements, based on the availability of machine-readable data for their determination. While both functions were previously paper-based or only partially digitized, datafication affords the power to automate them, with a view to infusing greater effectiveness and accountability into programme design.

Against this backdrop, my research focuses on two concomitant aspects: the effects of datafication on the architecture of anti-poverty programmes, and its consequences for the entitlements that beneficiaries receive through them. My PhD thesis focused on the digitalization of the Public Distribution System (PDS), India’s largest food security programme, which centres on distributing primary necessity items (mainly rice, wheat, sugar and kerosene) at subsidized prices to the nation’s poor. The back-end digitalization of the scheme, begun at the state level in the early 2000s, is now culminating in the datafication of the programme through the Unique Identity Project (Aadhaar), an identity scheme that constitutes the biggest biometric identification database in the world. Built with the declared purpose of facilitating the socioeconomic inclusion of India’s poor, Aadhaar assigns each enrolee a 12-digit number linked to their biometric details, to ensure, among other things, that each enrolee obtains their social benefits through a simple operation of biometric recognition.

Datafication contributes to a deep transformation of anti-poverty programmes, with mixed effects on programme architecture and the entitlements of beneficiaries

My data collection on the datafied PDS took place in the two southern Indian states of Kerala and Karnataka, and also comprises a review of the state-level cases of Aadhaar-based PDS currently operating in India. Over the years, my research has developed three lines of reflection, which I summarize below.

First, datafication is constructed by the Indian central government as a tool for simplifying access and improving users’ ability to obtain their entitlements under existing schemes. The Aadhaar-based PDS is indeed designed to reduce the inclusion error, meaning access to the programme by non-entitled people, and the exclusion error (Swaminathan 2002), meaning the denial of subsidy to the entitled. In doing so, the biometric system traces sales from PDS ration shops to reduce diversion (the ‘rice mafia’), an illegal network through which foodgrains meant for the poor are diverted to the open market for higher margins. What emerges from my research is a strong governmental narrative portraying Aadhaar as a problem-solver for the PDS: government officials depict the technology as a simplifier of the existing system, facilitating the better and more accountable functioning of a leakage-prone anti-poverty scheme that has been in operation for a long time.

Second, recipients’ view of the datafied PDS is mixed: it reveals some positive changes, but also a set of issues that were not in place before the advent of the biometric system. One, by making access conditional on enrolment in the Aadhaar database, the new system subordinates the universal right to food to enrolment in a biometric database, leading the poor to ‘trade’ their data for the food rations needed for their livelihoods. Two, while the programme is designed to combat the inclusion error, new forms of exclusion are caused by system malfunctions that lead to failures in user recognition, which in turn result in families being denied their food rations, sometimes for several months in a row. Three, the system is not built to act on back-end diversion (PDS commodities being diverted before they reach the ration shops where users buy them), where, according to existing studies of the PDS supply chain, the greatest part of goods is diverted (Khera 2011; Drèze & Khera 2015).

Third, there is a specific restructuring intent behind the creation of an Aadhaar-based PDS. From documents and narratives released by the central government, a clear teleology emerges: Aadhaar is conceived not simply to streamline the PDS, but to substitute it, in the longer run, with a system of cash transfers to the bank accounts of beneficiaries. As government officials declare, this serves the purpose of reducing the distortion caused by subsidies and creating a more effective system where existing leakages cannot take place. A large majority of beneficiaries, however, are suspicious of cash transfers (Drèze et al. 2017): a prominent argument is that these are more complex to collect and handle compared with the secure materiality of PDS food rations. What is sure, beyond points of view on the appropriateness of cash transfers, is that the teleology behind the Aadhaar-based PDS is not that of streamlining the system, but that of creating a new one in which the logic of buying goods on the market replaces the existing logic of subsidies.

Aadhaar helps enable a shift from in-kind subsidies to cash transfers, with uncertain consequences for poor people’s entitlements

Rooted in field research on datafied anti-poverty systems, these reflections offer two main contributions to extant theorizations of datafication in the Global South. First, they highlight the role of state governments in using datafied systems to construct a positive image of themselves, portraying datafication as a problem-solving tool adopted to tackle the most pressing issues affecting existing programmes. The power of datafication, embodied by large biometric infrastructures such as Aadhaar, is used to project an image of accountability and effectiveness, relied upon at election time and in the construction of public consensus. At the same time, citizens’ perspectives reveal forms of data injustice (Heeks & Renken 2018) that did not exist before datafication, such as the denial of subsidies following failures of user recognition by point-of-sale machines, or the subordination of the right to food to enrolment in a national biometric database.

Second, datafication is often portrayed by governments and public entities as a means to streamline anti-poverty programmes, improving the mechanisms at the basis of their functioning. By contrast, my research suggests a more pervasive role for datafication, capable of transforming the very basis on which existing social protection systems are grounded (Masiero 2015). The Aadhaar case is revealing in this respect: as it is incorporated into extant subsidy systems, Aadhaar does not aim simply to improve their functioning, but to substitute the logic of in-kind subsidies with a market-based architecture of cash transfers. By moving the drivers of governance of anti-poverty systems from the state to the market, datafication is hence implicated in a deep reformative effort, which may have massive consequences for programme architecture and the entitlements of the poor.

Entrenched in the Indian system of social protection, Aadhaar is today the greatest datafier of anti-poverty programmes in the world. Here we have outlined its primary effects, and especially its ability to reshape existing anti-poverty policies at their very basis. Ongoing research across ICT4D, data ethics and development studies pertains to the ways datafication will affect anti-poverty programme entitlements, for the many people whose livelihoods are predicated on them.

 

Silvia Masiero is a lecturer in International Development at the School of Business and Economics, Loughborough University. Her research concerns the role of information and communication technologies (ICTs) in socio-economic development, with a focus on the participation of ICT artefacts in the politics of anti-poverty programmes and emergency management.

 

References:

Drèze, J., & Khera, R. (2015). Understanding leakages in the Public Distribution System. Economic and Political Weekly, 50(7), 39-42.

Drèze, J., Khalid, N., Khera, R., & Somanchi, A. (2017). Aadhaar and Food Security in Jharkhand. Economic & Political Weekly, 52(50), 50-60.

Heeks, R., & Renken, J. (2018). Data justice for development: What would it mean? Information Development, 34(1), 90-102.

Khera, R. (2011). India’s Public Distribution System: utilisation and impact. Journal of Development Studies, 47(7), 1038-1060.

Masiero, S. (2015). Redesigning the Indian food security system through e-governance: The case of Kerala. World Development, 67, 126-137.

Swaminathan, M. (2002). Excluding the needy: The public provisioning of food in India. Social Scientist, 30(3-4), 34-58.

[BigDataSur] My experience in training women on digital safety

by Cecilia Maundu

I remember it was September 2015 when I was invited to a two-day workshop on digital safety by the Association of Media Women in Kenya. At first I was very curious, because I had not heard much about digital security. The two-day workshop was an eye-opener. Afterwards I found myself hungry for more information on the issue.

Naturally, I went online to find more information. I must say I was shocked at the statistics I came across on the number of women who have been abused online, and continue to suffer. Women were being subjected to sexist attacks. They were attacked because of their identity as women, not because of their opinions. I asked myself what can I do? I am well aware that I am just a drop in the ocean, but any little change I can bring will help in some way. That was a light bulb moment for me.

It was in that moment that I knew I wanted to be a digital safety trainer. I wanted to learn how to train people, especially women, on how to stay safe online. The internet is the vehicle of the future. This future is now, and we cannot afford for women to be left behind.

Online violence eventually pushes victims to stay offline. It is censorship hidden behind the veil of freedom of expression.

After this realization, I embarked on the quest to become a digital safety trainer. As fate would have it, my mentor Grace Githaiga came across the SafeSister fellowship opportunity and sent it to me. I applied and got into the program. The training was taking place in Ngaruga lodge, Uganda. The venue of the training was beyond serene. The calm lake waters next to the hotel signified how we want the internet to be for women: a calm place and a safe space where women can express themselves freely without fear of being victimized, or their voices being invalidated.

On arrival we were met by one of the facilitators, Helen, who gave us a warm welcome. The training was conducted by five facilitators, all of whom were women.

The training was student-friendly. The topics were broken down in a way that allowed everyone to understand what was being discussed. Each facilitator had her own way and style of delivering the different topics, from using charts to PowerPoint presentations. I must say they did an exceptional job. I got to learn more about online gender violence and how deeply rooted it is in our society, and hence the importance of digital security trainings.

Being a trainer is not only about having digital safety skills; it also requires you to be a well-rounded person. While giving trainings you are bound to meet different types of people with different personalities, and it is your duty to make them feel comfortable and to make sure the environment around them is safe. It is in this safe space that they will be able to talk and express their fears and desires, and, most importantly, be willing to learn. As a digital security trainer, you should first find out more about your participants and how much they know about digital security. This will enable you to package your material according to their learning needs.

Being a trainer requires you to read a lot on digital security, because this keeps you updated and therefore allows you to relay accurate information to your trainees. As a trainer, it is also necessary to understand the concept of hands-on training, because it gives participants the opportunity to put into practice what they have learnt. For example, when you are teaching about privacy settings on Facebook, you don’t just talk about it; you should instead ask the participants to open their Facebook accounts – that is, if they have any – and go through the instructions step by step with them until they are able to complete the task. As a trainer there is also the possibility of meeting a participant who does not give the rest of the group the opportunity to express their views, as they want to be the one talking throughout. The trainer needs to take charge and make sure that each participant is given an equal opportunity to talk.

Before the training we had each been given a topic to present on, and mine was encryption; VeraCrypt, to be more specific. At first it sounded like Greek to me, but I resorted to my friend Google for more details (which raises the question: how was life before the internet?). By the time I was leaving Kenya for Uganda I had mastered VeraCrypt. We kept discussing our topics with the rest of the group, to the point where they started calling me Vera. To my surprise, my presentation went very well. The week went by so fast; before we realized it, it was over and it was time to go back home and start implementing what we had learnt.

We continued receiving very informative material online from the trainers. In September 2017 they opened up a pool of funding to which we could apply to fund a training in our home countries. I got the funding and chose to hold the training at Multimedia University, where I lecture part time. The reason behind my choice was that this is home to up-and-coming young media women, and we needed to train them on how to stay safe online, especially since media women in Kenya form the majority of victims of gender-based violence. They needed to know what awaits them out there and the mechanisms they need to protect themselves from attacks. The training was a success, and the young ladies walked away happy and strong.

The second, and last, part of SafeSister (I am barely holding back my tears here, because the end did come) took place in Uganda at the end of March 2018. It was such a nice reunion, meeting the other participants and our trainers after a year. This time the training was more relaxed. We were each given a chance to talk about the trainings we had conducted, the challenges we encountered, the lessons learnt and what we would have done differently. For me, the main challenge was time management: the trainers had prepared quite informative materials, so we ran over time, compounded by a 30-minute delayed start.

This was my first training, and one take-home lesson for me as a digital safety trainer was that not all participants will be enthusiastic about the training, but one shouldn’t be discouraged or feel like they are not doing enough. The trainer just needs to make sure that no participant is left out, not simply by throwing questions at participants or asking their opinions on different digital safety issues, but by genuinely drawing them in. As time progresses, they gradually become enthusiastic and start feeling more at ease.

One thing I have learnt since becoming a digital security trainer is that many people are unaware of digital security matters. People go to cybercafés and forget to sign out of their email accounts, or use the same password for more than one account, and then they ask you “why would someone want to hack into my account or abuse me when I am not famous?” Such questions should not discourage you; on the contrary, they should motivate you to give more trainings, because people don’t know how vulnerable they are online when their accounts and data are not protected. Also, as a trainer, when you can and when the need arises, give as many free trainings as you can, since not everyone can afford to pay you. It is through these trainings that you continue to sharpen your skills and become an excellent trainer.

After the training we were each awarded a certificate. It felt so good to know that I am now a certified digital security trainer; nothing can beat that feeling. As they say, all good things must come to an end. Long live Internews, long live DefendDefenders. Asante sana! I will forever be grateful.

 

Cecilia Mwende Maundu is a broadcast journalist in Kenya, a digital security trainer and consultant with a focus on teaching women how to stay safe online. She is also a user experience (UX) trainer, collecting user information feedback and sharing it with developers.

XRDS Summer 2018 issue is out, with contributions from DATACTIVE

The latest issue of XRDS – The ACM Magazine for Students is out. It was co-edited by our research associate Vasilis Ververis and features contributions by four of us: Stefania Milan, Niels ten Oever, Davide Beraldo, and Vasilis himself.

  1. Stefania’s piece ‘Autonomous infrastructure for a suckless internet’ explores the role of politically motivated techies in rethinking a human-rights-respecting internet.
  2. Niels and Davide, in their ‘Routes to rights’, discuss the problems of ossification and commercialization of internet architecture.
  3. Vasilis, together with Gunnar Wolf (also an editor of the issue), has written on ‘Pseudonymity and anonymity as tools for regaining privacy’.

XRDS (Crossroads) is the quarterly magazine of the Association for Computing Machinery. You can read the full issue here.

DATACTIVE at EASST 2018

Stefania and Guillén will be present this week at EASST 2018: Meetings – Making Science, Technology and Society Together, in Lancaster, UK.

If you are around, drop by our panel “After data activism: reactions to civil society’s engagement with data” on Saturday morning (9:30) at the Elizabeth Livingston Lecture Theatre. We will be focusing on how data governance, data science and social technologies are co-producing asymmetries of power, through five papers dealing with data flows, data sharing, the scoring society, civil society and data practices, and resistance through data.

Apart from that, Stefania will also present, alongside Anita Chan, a paper on “Data cultures from the Global South: decentering data universalism”, and will participate in a panel organized by the European Research Council.

Come say hi!

BigBang v0.2.0 ‘Tulip Revolution’ released

DATACTIVE has been collaborating with researchers from New York University and the University of California, Berkeley to release version 0.2.0 of BigBang, a quantitative mailing-list analysis tool. Mailing lists are among the most widely used communication tools in Internet governance institutions and among software developers. They therefore lend themselves well to analysis of how communities develop, what topics they discuss, and how those topics propagate through the community. BigBang, a Python-based tool, facilitates exactly this. You can start analyzing mailing lists with BigBang by following the installation instructions.

This release, BigBang v0.2.0 Tulip Revolution, marks a new milestone in BigBang development. A few new features:
– Gender participation estimation
– Improved support for IETF and ICANN mailing list ingest
– Extensive gardening and upgrade of the example notebooks
– Upgraded all notebooks to Jupyter 4
– Improved installation process based on user testing

En route to this milestone, the BigBang community made a number of changes to its procedures. These include:

– The adoption of a Governance document for guiding decision-making.
– The adoption of a Code of Conduct establishing norms of respectful behavior within the community.
– The creation of an ombudsteam for handling personal disputes.

For this milestone, the community also decided to adopt the GNU Affero General Public License v3.0.

If you have any questions or comments, feel free to join the mailing list, join us on Gitter chat, or file an issue on GitHub.

If you are interested in using BigBang but don’t know where to start, we are happy to help you on your way via videochat or organize a webinar for you and your community. Feel free to get in touch!

[blog] Growth for Critical Studies? Social scholars, let’s be shrewder

Author: Miren Gutierrez
This is a response to the call for critical community studies, ‘Tech, data and social change: A plea for cross-disciplinary engagement, historical memory, and … Critical Community Studies‘ by Kersti Wissenbach, and to the first contribution to the debate, ‘Can We Plan Slow – But Steady – Growth for Critical Studies?’ by Charlotte Ryan.
Commenting on the thought-provoking blogs by Charlotte Ryan and Kersti Wissenbach, I feel in good company. Both speak of the need for research to address inequalities embedded in technology, of the critical role that communities play in remedying dominant techno-centric discourses and practices, and of the idea of new critical community studies. That is, the need to place people and communities at the centre of our activity as researchers and practitioners, asking questions about the communities instead of about the technologies, demanding stronger collaboration between the two, and facing the challenges that this approach generates.
Their blogs prompt different but related ideas.

First, different power imbalances can be found in scholarship. Wissenbach suggests that dominant discourses in academia, as well as in practice and donor agendas, are driving the technology hype. But as Ryan proposes, academia is not a homogeneous terrain.

Speaking from the point of view of critical data studies, the current predominant techno-centrism seems to be diverting research funding towards applied sciences, engineering and tools (what Wissenbach calls “the state of technology” and Ryan refers to as a “profit-making industry”). Writing about Canada, Srigley describes how, for a while, even the Social Sciences and Humanities Research Council of Canada “fell into line by focusing its funding on business-related degrees. All the while monies for teaching and research in the humanities, social sciences, and sciences with no obvious connection to industry, which is to say, most of it, began to dry up” (Srigley 2018). Srigley seems to summarise what is happening everywhere: “sponsored research” and institutions requiring that research be linked to business and industry partners appear to be the current mantra.

Other social scholars around me are coming up with similar stories: social sciences focusing critically on data are getting a fraction of the research funding opportunities vis-à-vis computational data-enabled science and engineering within the fields of business and industry, environment and climate, materials design, robotics, mechanical and aerospace engineering, and biology and biomedicine. Meanwhile, critical studies on data justice, governance and how ordinary people, communities and non-governmental organisations experience and use data are left for another day.
Thus, the current infatuation with technology appears not to be evenly distributed across donors and academia.

Second, I could not agree more with Wissenbach and Ryan when they say that we should take communities as entry points in the study of technology for social change. Wissenbach further argues against the objectification of “communities”, calling for genuinely needs-driven, engaged research more aligned with practice.

Then again, here lies another imbalance. Even if scholars work alongside practitioners to bolster new critical community studies, these actors are not in the same positions. We, social scholars, are gatekeepers of what is known in academia; we are often more skilful in accessing funds; we dominate the lingo. Inclusion therefore lies at the heart of this argument and remains challenging.

If funds for critical data studies are not abundant, resources to put in place data projects with social goals and more practice-engaged research are even scarcer. That is, communities facing inequalities may find themselves competing for resources not only within their circles (as Ryan suggests). Speaking also as a data activist involved in projects that look at illegal fishing’s social impacts on coastal communities in developing countries (and trying hard to fundraise for them), I think we must make sure that more funding for data activism research does not mean less funding for data activist endeavours. I know they are not the same funds, but there are ways in which research could foster practice, and one of them is precisely putting communities at the centre.

Third, another divide lies underneath the academy’s resistance to engaged scholarship. While so-called “hard sciences” have no problems with “engaging”, some scholars in the “soft sciences” seem to recoil from it. Even if few people still support Chris Anderson’s “end of theory” musings (Anderson 2008), some techno-utopians pretend that a state of asepsis exists, or is at least possible now, in the age of big data. But they could not be more misleading. What can be more “engaged scholarship” than “sponsored research”? Research driven and financed by companies is necessarily “engaged” with the private sector and its interests, but rarely acknowledges its own biases. Meanwhile, five decades after Robert Lynd asked “Knowledge for what?” (Lynd 1967), this question still looms over the social sciences. Some social scientists shy away from causes and communities just in case they start marching into the realm of advocacy and any pretensions of “objectivity” disappear. Since we know computational data-enabled science and engineering cannot be “objective”, why not accept and embrace engaged scholarship in the social sciences, as long as we are transparent about our prejudices and systematically critical of our assumptions?

Fourth, data activism scholars have to be smarter in communicating findings and influencing discourses. Our lack of influence is not all attributable to market trends and donors’ obsessions; it is also of our own making. Currently, the stories of data success and progress come mostly from the private sector. And even when prevailing techno-enthusiastic views are contested, prominent criticism comes from the same quarters. An example is Bernard Marr’s article “Here’s Why Data Is Not the New Oil”. Marr does not mention the obvious: that data are not natural resources, spontaneous and inevitable, but cultural ones, “made” in processes that are also “made” (Boellstorff 2013). In his article, Marr refers only to certain characteristics that make data different from oil; for example, while fossil fuels are finite, data are “infinitely durable and reusable”, and so on. Is that all that donors, and publics, need to know about data? Although this debate has grown over recent years, is it reaching donors’ ears? Not long ago, at the Datamovida conference organised by Vizzuality in Madrid in 2016, a fellow speaker – Aditya Agrawal of the Open Data Institute and the Global Partnership for Sustainable Development Data – opened his presentation by saying precisely that data were “the new oil”. If key data people in the UN system have not caught up with the main ideas emerging from critical data studies, we are in trouble, and it is partly of our own making.

This last argument is closely related to the other ideas in this blog. The more we can influence policy, public opinion, decision-makers and processes, the more resources data activism scholars can gather to work alongside practitioners in exploring how people and organisations appropriate data and data processes, create new data relations and reverse dominant discourses. We cannot be content with publishing a few blogs, getting our articles into indexed journals and meeting once in a while at congresses that seldom resonate beyond our privileged bubbles. Both Wissenbach and Ryan argue for stronger collaborations and direct community engagement, but this is not the rule in the social sciences.

Making an effort to reach broader publics could be a way to break the domination that, as Ryan says, brands, market niches and revenue streams seem to exert on academic institutions. Academia is a bubble but not entirely hermetic. And even if critical community studies will not ever be a “cash cow”, they could be influential. There are other critical voices in the field of journalism, for example, which have denounced a sort of obsession with technology (Kaplan 2013; Rosen 2014). Maybe critical community studies should embrace not only involved communities and scholars but also other critical voices from journalism, donors and other fields. The collective “we” that Ryan talks about could be even more inclusive. And to do that, we have to expand beyond the usual academic circles, which is exactly what Wissenbach and Ryan contend.

I do not know what critical community studies could look like; I hope this is the start of a conversation. In April, donors, platform developers, data activists and journalists met in Madrid at the “Big Data for the Social Good” conference, organised by my programme at the University of Deusto, which focused on what works and what does not in critical data projects. The more we expand this type of debate, the more influence we could gain.

Finally, the message emerging from critical data studies cannot be only about dataveillance (van Dijck 2014) and ways of data resistance. However imperfect and biased, the data infrastructure is enabling ordinary people and organised society to produce diagnoses and solutions to their problems (Gutiérrez 2018). Engaged research means we need to look at what communities do with data and how they experience the data infrastructure, not only at how communities contest dataveillance, which I have the feeling has dominated critical data studies so far. Yes, we have to acknowledge that these technologies are often shaped by external actors with vested interests before communities use them, and that they embed power imbalances. But if we want to capture people’s and donors’ imagination, the stories of data success and progress within organised and non-organised society should be told by social scholarship as well. Paraphrasing Ryan, we may lose but live to fight another day.

Cited work
Anderson, Chris. 2008. ‘The End of Theory: The Data Deluge Makes the Scientific Method Obsolete’. Wired. https://www.wired.com/2008/06/pb-theory/.
Boellstorff, Tom. 2013. ‘Making Big Data, in Theory’. First Monday 18 (10). http://firstmonday.org/article/view/4869/3750.
Dijck, Jose van. 2014. ‘Datafication, Dataism and Dataveillance: Big Data between Scientific Paradigm and Ideology’. Surveillance & Society 12 (2): 197–208.
Gutiérrez, Miren. 2018. Data Activism and Social Change. Palgrave Pivot. London: Palgrave Macmillan.
Kaplan, David E. 2013. ‘Why Open Data Isn’t Enough’. Global Investigative Journalism Network (GIJN), 2 April 2013. http://gijn.org/2013/04/02/why-open-data-isnt-enough/.
Lynd, Robert Staughton. 1967. Knowledge for What? The Place of Social Science in American Culture. Princeton: Princeton University Press.
Rosen, Larry. 2014. ‘Our Obsessive Relationship With Technology’. Huffington Post, 2014. https://www.huffingtonpost.com/dr-larry-rosen/our-obsession-relationshi_b_6005726.html.
Srigley, Ron. 2018. ‘Whose University Is It Anyway?’. Los Angeles Review of Books, 2018. https://lareviewofbooks.org/article/whose-university-is-it-anyway/#_ednref37.

Stefania keynotes at ‘The Digital Self’ workshop, King’s College London, July 6

On July 6, Stefania will deliver a keynote at the workshop ‘The Digital Self’, organised by the Departments of Digital Humanities and of Culture, Media & Creative Industries of King’s College London. The workshop focuses on how digital technology influences our daily lives, how it re-shapes culture, and how, as a result, our identities as workers, consumers and media and cultural producers are changing. Stefania’s keynote is entitled ‘Identity and data infrastructure’. Read more.

Kersti presenting during RNW Media Global Weeks, July 6

With her talk ‘NGO Ethics in the Digital Age: How to Work with Data Responsibly’, Kersti will address a transnational team of media and advocacy practitioners during RNW Media‘s annual summit.

RNW Media works with youth in fragile and repressive states, aiming to empower young women and men to unleash their own potential for social change. As the organisation has transitioned from a traditional international broadcaster towards growing engagement in advocacy activities that rely on digital means and data, the moment has come to adopt a responsible data approach.

Kersti is advising RNW Media on the process of developing a responsible data framework and the respective programme strategies.

Advisory Board Workshop, July 4-5

On July 4-5, DATACTIVE gathered its Advisory Board members for a sharing and feedback workshop.

Participants include Anita Say Chan (University of Illinois, Urbana-Champaign), Chris Csikszentmihályi (Madeira Interactive Technologies Institute), Ronald Deibert (University of Toronto), Seda Gürses (KU Leuven), Evelyn Ruppert (Goldsmiths, University of London), Nishant Shah (ArtEZ) … and the DATACTIVE team. Day 1 only: Hisham Al-Miraat (Justice & Peace Netherlands), Julia Hoffmann (Hivos).

*Day 1*
Fireside chat from 4pm @ Terre Lente, followed by light dinner [CLOSED]
Public event at 8pm @ SPUI25: ‘Democracy Under Siege: Digital Espionage and Civil Society Resistance’, with Ronald Deibert (Citizen Lab – University of Toronto), Seda Gürses (COSIC/ESAT – KU Leuven), and Nishant Shah (ArtEZ / Leuphana University)

*Day 2* @ the University Library, Singel 425 [BY INVITATION ONLY]
9.30 Welcome & coffee
9.45 Intro (Stefania)
Session 1 (10-11am): The framework: Concepts and Infrastructure (Presenters: Stefania & Davide)
Session 2 (11.10-12.15): Data as stakes (Presenters: Becky & Niels)
Lunch break (12.15-1.30)
Session 3 (1.30-2.35): Data as tools (Presenters: Guillen & Kersti)
Session 4 (2.40-3.45): Next in line: Emerging work (Presenters: Fabien, Jeroen, Quynn)
Session 5 (4-4.30) Wrap-up (Stefania, all)

[blog] Making ‘community’ critical: Tech collectives through the prism of power

Author: Fabien Cante

In her recent blog post, Kersti Wissenbach expresses her frustration with the field of “civic tech,” which, as she puts it, remains far more focused on the “tech” than the “civic.” This resonates with me in many ways. I write as someone who is possibly more of an outsider to the field than Wissenbach: my previous research was on local radio (all analog), and as my DATACTIVE colleagues have found out, I am clueless about even the basics of encryption, so anything more technically complex will leave me flummoxed. In agreeing with Wissenbach, then, I do not mean to diminish the wonders of tech itself, as a field of knowledge and intervention, but rather underscore that the civic (or “the social,” as my former supervisor Nick Couldry would put it) is itself an immensely complex realm of knowledge, let alone action.

Wissenbach proposes that our thinking efforts, as scholars and activists concerned about the relations between technology and social change, shift toward “Critical Community Studies.” By this she means that we should ask “critical questions beyond technology and about communities instead.” I strongly agree. The research projects around data, tech and society that most excite me are the ones that are rooted in some community or other – transnational activist communities, in the case of DATACTIVE, or marginalised urban communities, in the case of the Our Data Bodies project. However, like Charlotte Ryan (whose response to Wissenbach can be read here), I would also like to be a bit cautious. In what follows, I really emphasise the critical and contextual aspects of Critical Community Studies, as envisioned by Wissenbach. I do so because I am a bit sceptical about the middle term – community.

I am involved in urban planning struggles in south London where the word “community” is frequently employed. Indeed, it serves as a kind of talisman: it invokes legitimacy and embeddedness. Community is claimed by activists, local authorities, and even developers, for obviously very different aims. This experience has shown me that, politically, community is an empty signifier. Bullshit job titles like “Community Manager” in marketing departments (see also Mark Zuckerberg speeches) further suggest to me that community is one of the most widely misappropriated words of our time.

More seriously, and more academically perhaps, community denotes a well-defined and cohesive social group, based on strong relationships, and as such self-evident for analysis. This is not, in many if not most circumstances, what collectives actually look like in real life. Anthropologist John Postill (2008), studying internet uptake in urban Malaysia, writes that researchers too often approach tech users as either “communities” or “networks.” Neither of these concepts captures how technology is woven into social relations. Where community presumes strong bonds and a shared identity, network reduces human relations to interaction frequencies and distance between nodes, flattening power differentials.

As Wissenbach rightly notes, people who use tech, either as producers or users, are “complex [beings] embedded in civil society networks and power structures.” It is these power structures, and the often tense dynamics of embeddedness, that Wissenbach seems to find most interesting – and I do too. This, for me, is the vital question behind Critical Community Studies (or, for that matter, the study of data activism): what specific power relations do groups enact and contest?

Still from Incoming (2017), by Richard Mosse – http://www.richardmosse.com/projects/incoming

The critical in Critical Community Studies thus asks tough questions about race, class, gender, and other lines of inequality and marginalization. It asks how these lines intersect both in the community under study (in the quality of interactions, the kinds of capital required to join the collective, language, prejudices, etc.) and beyond it (in wider patterns of inequality, exclusion, and institutionalized domination). We see examples of such questioning happening, outside academia, through now widespread feminist critiques calling out pervasive gender inequalities in the tech industry, or through Data for Black Lives’ efforts to firmly center race as a concern for digital platforms’ diversity and accountability. Within the university, Seda Gürses, Arun Kundnani and Joris Van Hoboken’s (2016) paper “Crypto and Empire,” which could be said to examine the “crypto community” (however diffuse), offers some brilliant avenues to think data/tech communities critically, and thereby “re-politicize” data itself. More broadly, a wealth of feminist, post/decolonial (e.g. Mignolo 2011; Bhambra 2014; or Flavia Dzodan’s stellar Twitter feed) and critical race theory (see for example Browne 2015) can help us think through the histories from which civic tech communities arise, their positions in a complex landscape of power and inequality, and the ways in which they see their place in the world.

There is always a risk, when researchers consider community critically, that they put certain communities under strain; that they are seen to hinder positive work through their (our) critical discourse. Certainly, challenging a community’s inclusiveness is hard (and researchers are very bad at challenging their “own” community). But I think this is a limited view of critique as “not constructive” (a crime in certain circles where “getting things done” is a primary imperative). I would argue that collectives are strengthened through critique. As Charlotte Ryan beautifully puts it, Critical Community Studies can be instrumental in forming “a real ‘we’.” She adds: “an aggregate of individuals, even if they share common values, does not constitute ‘us’.” Building a “we” requires, at every step, asking difficult questions about who that “we” is (“we” the social movement, or “we” the civic tech community), who doesn’t fall under “we’s” embrace, and why.

Bibliography

Bhambra, Gurminder K. (2014) Connected Sociologies. London: Bloomsbury Press

Browne, Simone (2015) Dark Matters: On the Surveillance of Blackness. Durham, NC & London: Duke University Press

Gürses, Seda, Kundnani, Arun & Joris Van Hoboken (2016) “Crypto and Empire: The Contradictions of Counter-Surveillance Advocacy” Media, Culture & Society 38 (4), 576-590

Mignolo, Walter (2011) The Darker Side of Western Modernity: Global Futures, Decolonial Options. Durham, NC & London: Duke University Press

Postill, John (2008) “Localizing the Internet beyond Communities and Networks” New Media & Society 10 (3), 413-431