
[BigDataSur-COVID] Digital Social Protection during COVID-19: The Shifted Meaning of Data during the Pandemic

Silvia Masiero reflects on changes in digital social protection during the pandemic, outlines the implications of such changes for data justice, and co-proposes an initiative to discuss them.

by Silvia Masiero

One year ago today was my last time leaving a field site. Leaving friends and colleagues in India, promising to return as usual for the Easter break, it was hard to imagine being suddenly plunged into the world we live in today. As a researcher of social protection schemes, little did I know that my research universe – digital anti-poverty programmes across the Global South – would change as it has over the last 12 months. As I have recently argued in an open commentary, COVID-19 has had manifold implications for social protection systems, implications that require reflection as conditions of pandemic exceptionalism persist over time and across regions.

The Shifted Meaning of Beneficiary Data

My latest study was on a farmer subsidy programme based on the datafication of recipients – a term used in previous work to indicate the conversion of human beneficiaries into machine-readable data. The programme epitomises the larger global trend of matching demographic and, increasingly, biometric credentials of individuals with data on eligibility for anti-poverty schemes, such as poverty status, family size and membership of protected groups. Seeding social protection databases with biometric details, a practice exemplified by India’s Aadhaar, is supposed to combat exclusion and inclusion errors alike, assigning benefits to all entitled subjects while screening out the non-entitled. At the same time, quantitative and qualitative research has shown the limits of datafication, especially its role in reinforcing exclusions of entitled subjects whose ability to authenticate is reduced by failures in recognition, sometimes resulting in denial of vital schemes.
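To make the mechanism concrete, here is a minimal sketch (in Python) of the matching logic described above, in which entitlement on paper only translates into a benefit if a person is both seeded into the ID database and able to authenticate. The names, fields and threshold are invented for illustration and do not describe any actual scheme.

```python
# Hypothetical illustration of datafied benefit allocation: entitlement on
# paper is paid out only if the record is seeded into the ID database AND
# biometric authentication succeeds. All names and values are invented.
from dataclasses import dataclass

@dataclass
class Beneficiary:
    name: str
    entitled: bool              # eligibility on paper (poverty status, family size, ...)
    id_seeded: bool             # record linked to the biometric ID database
    fingerprint_quality: float  # 0..1 proxy for whether authentication succeeds

AUTH_THRESHOLD = 0.6  # illustrative cutoff

def receives_benefit(b: Beneficiary) -> bool:
    return b.entitled and b.id_seeded and b.fingerprint_quality >= AUTH_THRESHOLD

roll = [
    Beneficiary("A", entitled=True,  id_seeded=True,  fingerprint_quality=0.9),
    Beneficiary("B", entitled=True,  id_seeded=True,  fingerprint_quality=0.3),  # worn fingertips
    Beneficiary("C", entitled=True,  id_seeded=False, fingerprint_quality=0.8),  # never enrolled
    Beneficiary("D", entitled=False, id_seeded=True,  fingerprint_quality=0.9),
]

excluded = [b.name for b in roll if b.entitled and not receives_benefit(b)]
print("Entitled but excluded:", excluded)  # ['B', 'C'] -> exclusion errors
```

The point of the sketch is simply that tightening the authentication condition removes the non-entitled (D) while producing exactly the exclusion errors (B and C) that the research cited above documents.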

During the pandemic, as numerous contributions to this blog have illustrated, existing vulnerabilities have become deeper and new ones have emerged, expanding the pool of people in need of social protection. Instances of the former are daily-wage and gig workers, who have seen their existing subalternities deepen during the pandemic through loss of income or severely heightened risks at work. Instances of new vulnerabilities, instead, are associated with the “new poor” of the pandemic, affected in many ways by the backlash of economic paralysis across the globe. The result is a heightened global need for social protection to work smoothly, making the affordance of inclusiveness – being able to cover the (old and new) needy – arguably a higher priority than that of exclusiveness, aimed at “curbing fraud” through secure biometric identification.

Since its launch in May 2020, this blog has hosted contributions on social protection schemes from countries including Colombia, Peru, India, Brazil and Spain, all highlighting the heightened need for social protection under COVID-19. While describing different world realities, all contributions note how the vulnerabilities brought by COVID-19 call for means to combat wrongful exclusions, for example using excess stocks of commodities to expand scheme coverage. Against the backdrop of a world in which the priority was “curbing fraud” through the most up-to-date biometrics, the pandemic threw us into a world in which including the needy takes priority over excluding the fraudulent among the roles of anti-poverty scheme datafication. The first implication, for researchers of digital social protection, is the need to devise ways to learn from examples of expanded coverage in social protection, of which India’s National Food Security Act has offered an important instance over the last decade.

Social Protection in the Pandemic: New Data Injustices

As the edited book “Data Justice and COVID-19: Global Perspectives” notes, the hybrid public-private architectures that emerged during COVID-19 have generated new forms of data injustice, detailed in the volume through 33 country cases. Alongside important debates on the meaning of data in a post-pandemic world, the book opens the question of the data justice implications of COVID-19 for digital social protection. Drawing on contributions published in this blog, as well as reports of social protection initiatives taken during the pandemic, I have recently highlighted three forms of data injustice – legal, informational and design-related – that need monitoring as the pandemic scenario persists.

From a legal perspective, injustice is afforded by the subordination of entitlements to the registration of users in biometric databases, which becomes a condition for access – leading to scenarios of forced trading of data for entitlements, widely explored in the literature before COVID-19. The heightened need for social protection in the pandemic deepens the adverse implications of exclusion, exacerbating the consequences of injustice for those left out of the biometric datasets. Stories from urban poor contexts ranging from Nebraska, US to São Paulo, Brazil, underscore the same point: the legal data injustice of exclusion was problematic before, and it has become only more so amid the economic backlash of the pandemic on the poor.

From an informational perspective, the way entitlements are determined – specifically, the use of citizens’ information across databases to determine entitlements – has become crucial during the pandemic. Two cases from this blog detail this point especially well. In Colombia, information to determine eligibility for the Ingreso Solidario (Solidarity Income) programme was combined from existing data repositories, but without detail on how the algorithm combined information and thus on how eligibility was determined. In Peru, subsidies have leveraged information gathered through databases such as the census, the property registry and electricity consumption records, again without further light on how the information was combined. Uncertainty about eligibility criteria, beyond deepening pandemic distress, arguably limits beneficiaries’ ability to contest eligibility decisions, due to the lack of clarity on the very grounds on which these are taken.
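The informational problem can be sketched in a few lines: eligibility is a function of fields pulled from several databases, combined with weights and a cutoff the applicant never sees, so the only output visible to them is the final yes or no. The data sources, weights and cutoff below are invented for illustration; no resemblance to the actual Colombian or Peruvian algorithms is implied.

```python
# Hypothetical eligibility scoring across databases; all weights and the
# cutoff are invented and, as in the cases described above, undisclosed.
WEIGHTS = {
    "census_poverty_index": 0.5,     # from the census
    "owns_property": -0.3,           # from the property registry
    "electricity_kwh_month": -0.2,   # from utility records
}
CUTOFF = 0.35

def eligible(record: dict) -> bool:
    score = (WEIGHTS["census_poverty_index"] * record["census_poverty_index"]
             + WEIGHTS["owns_property"] * float(record["owns_property"])
             + WEIGHTS["electricity_kwh_month"] * record["electricity_kwh_month"] / 300)
    return score >= CUTOFF

applicant = {"census_poverty_index": 0.8, "owns_property": False, "electricity_kwh_month": 90}
print(eligible(applicant))  # prints False: the applicant only ever sees this boolean
```

In this invented example the applicant misses the cutoff by 0.01, a margin they have no way of seeing, which is exactly the contestability problem raised above.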

Finally, design-related data injustices arise from the misalignment of datafied social protection schemes with the effective needs of beneficiaries. In the pandemic, the trade-off brought by biometric social protection – increased accuracy of identification at the cost of greater exclusions – has been pushed to its extreme, as extreme are the consequences of denying subsidies to households left out of social protection schemes. This brings to light a trade-off that was already known to be problematic well before the pandemic started, and that has been further questioned by studies doubting the effective ability of biometrics to increase the accuracy of targeting. As a result, a third, design-related form of data injustice needs monitoring as we trace the evolution of social protection systems through COVID-19.

Ways Forward: A Roundtable to Discuss

As the pandemic and its consequences persist, new ways are needed to appraise the shifts in datafied social protection that the crisis has brought. Unsurprisingly, my promise to return to the field for Easter 2020 could not be kept, and established ways of conducting research on global social protection needed reinvention. It is against this backdrop that a current initiative, launched in the context of the TILTing Perspectives Conference 2021, may make a substantial contribution to knowledge on the theme.

The initiative, a Roundtable on COVID-19 and Biometric ID: Implications for Social Protection, invites presentations on how social protection systems have transformed during the pandemic, with a focus on biometric social protection and the evolution of its roles and systems. Abstracts (150-200 words) are invited as submissions to the TILTing Perspectives Conference, with the objective of gathering presentations from diverse world regions and drawing conclusions together. Proposals for the role of discussant – to take part in the roundtable and formulate questions for debate – are also invited through the system. At a time when established ways of doing fieldwork are no longer practicable, we want the roundtable to be an occasion to advance collective knowledge, together deepening our awareness of how social protection has changed in the first pandemic of the datafied society.

Submissions to the Roundtable are invited at: https://easychair.org/cfp/Tilting2021

 

 

[BigDataSur-COVID] COVID-19 and the Stripping of Power from the Edges

By Niels ten Oever

At the start of the COVID-19 pandemic, people wondered whether the internet infrastructure would be capable of handling the increase in data traffic. When many people started working, streaming, and following the rapidly unfolding news on social media from home, many expected this would strain the internet infrastructure. Some European politicians were so concerned that they called on Netflix to lower the resolution of its video streams. Why did the internet infrastructure turn out to be able to cope with the increasing demand? The answer is that the internet no longer works as most people think it does. An extra layer of control was added to the internet by Content Delivery Networks. This chapter will discuss how pressure on the infrastructural margins of the internet is strengthening the center of the network, and examine how COVID-19 has exacerbated this trend.

In 2011, the Tunisian government started heavily censoring the internet in response to popular uprisings in the country. In response, many internet users engaged in what is commonly called a Distributed Denial of Service (DDoS) attack on the Tunisian government’s websites. In a DDoS attack, hundreds or even thousands of computers try to reach a website at the same time. This can overload the website’s server, or the connection to the server, and thus render the website unavailable to internet users. Something similar can happen when a website suddenly becomes very popular: when many users try to connect at the same time, the traffic effectively renders the site or service unavailable. Eight Tunisian government websites were forced offline.

In response to DDoS attacks, and to prevent down-time of servers due to sheer popularity, Content Delivery Networks (CDNs) came to be used more and more. CDNs are globally distributed proxy servers, often placed in data centers close to Internet eXchange Points (IXPs). While users think they are connecting to a popular website far away, they are in fact connecting to a CDN server located near them. While you think you are streaming a video from a jurisdiction you consider safe, the video is more likely to be stored close to the network controlled by your Internet Service Provider (ISP) or your telecommunications operator.
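One way to see this redirection at work is to look at how a popular website’s name resolves. The sketch below is a simple illustration, not a definitive diagnostic: it uses Python’s standard resolver, and for many popular sites the canonical name it prints belongs to a CDN domain (for instance something under fastly.net or akamaiedge.net) rather than to the publisher, while the addresses returned depend on where the query is made. The hostnames are arbitrary examples.

```python
# Minimal sketch: resolve a few example hostnames and print the canonical
# name and addresses the local resolver returns. A canonical name under a
# CDN domain suggests you are being served by a nearby edge server, not
# by the publisher's own infrastructure. Results vary by resolver/location.
import socket

for host in ["www.bbc.co.uk", "www.theguardian.com"]:  # arbitrary examples
    try:
        canonical, aliases, addresses = socket.gethostbyname_ex(host)
    except socket.gaierror as err:
        print(f"{host}: lookup failed ({err})")
        continue
    print(host)
    print(f"  canonical name: {canonical}")
    print(f"  aliases:        {aliases}")
    print(f"  addresses:      {addresses}")
```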

When the internet was designed, its engineers adopted the end-to-end principle as their central motto. This was included in the mission statement of the Internet Engineering Task Force (IETF), the institution responsible for co-developing and standardizing the internet infrastructure:

The Internet isn’t value-neutral, and neither is the IETF. We want the Internet to be useful for communities that share our commitment to openness and fairness. We embrace technical concepts such as decentralized control, edge-user empowerment and sharing of resources, because those concepts resonate with the core values of the IETF community. These concepts have little to do with the technology that’s possible, and much to do with the technology that we choose to create (RFC3935).

When users connected to the internet during the COVID-19 pandemic, it may have seemed they were edge-users connecting to another endpoint over “dumb pipes”—leveraging the powers of decentralized control. The truth is quite the opposite. The internet infrastructure held up during the COVID-19 pandemic not because people were getting their content from the global internet, but because they were getting it from a data center near them. You may think this is actually a good thing, since it kept the internet from collapsing? Maybe. CDNs are merely the latest cause and consequence of centralization on the internet. The difference between dedicated CDNs and other large players such as Google and Facebook (who run their own CDNs) is that dedicated CDNs remain largely invisible. Some of you might have heard of Cloudflare, but what about Akamai, Fastly, and Limelight?

In 2017, Cloudflare unilaterally removed the neo-nazi forum and website Daily Stormer from its services. In 2019, it similarly removed the imageboard 8chan after two shootings in the United States. The company cited the following reason for removal: “In the case of the El Paso shooting, the suspected terrorist gunman appears to have been inspired by the forum website known as 8chan. Based on evidence we’ve seen, it appears that he posted a screed to the site immediately before beginning his terrifying attack on the El Paso Walmart killing 20 people”. The interesting point is that no one asked Cloudflare to do this; it removed the content of its own volition, without a clear process in place. Many critical internet scholars, such as Suzanne van Geuns, Corinne Cath, and Kate Klonick, have reported on this. While such decisions show the concrete impact these companies can have, it is perhaps even more telling that one hears very little about them.

CDNs are perhaps the piece of internet infrastructure from which companies benefitted most during the COVID-19 pandemic, because traffic to the websites they serve increased. But what about the people who requested information from these websites? Technically, they were served by a different server than the one they thought they were connecting to. They might have received something other than what they asked for, because CDNs allow for particularly fine-grained, geography-based adaptation of content. The CDN server that served a user in Senegal might hold different data than the one that served a user in Brisbane. And there is almost no way of knowing which particular CDN server served you, or of bypassing the CDN. In this way, the opacity of internet infrastructure was exacerbated by the COVID-19 pandemic. In other words, the COVID-19 pandemic led to further black-boxing of the internet infrastructure, making it harder for users to understand how it works. While this might make the internet faster and more available, it does not make the internet more reliable. Arguably, it makes the internet a better tool for control, because it increases power asymmetries between users and transnational corporations.

In 2011, Tunisian internet users were able to use the internet infrastructure against their own government. In 2020, it is nearly impossible for users around the world to even know where the websites they are accessing are located, let alone take them down. The internet is no longer a bazaar. The COVID-19 pandemic helped fortify the industrial zone that the internet has become, one that only allows users to connect from the outside, without a view of, or control over, the inside. The internet has become a smart network, with not-so-smart edges.

 

Niels ten Oever is a post-doctoral researcher at the University of Amsterdam (The Netherlands) and Texas A&M University (USA), associated also with the Centro de Tecnologia e Sociedade at the Fundação Getúlio Vargas, Brazil. His research focuses on how norms such as human rights get inscribed, resisted, and subverted in the Internet infrastructure through transnational governance. Previously, Niels worked as Head of Digital for ARTICLE19 and served as programme coordinator for Free Press Unlimited. He holds a cum laude MA in Philosophy and a PhD in Media Studies from the University of Amsterdam.

 

Talking media literacy: Jeroen on @DikkeDataShow to discuss TikTok’s personalisation algorithm

Our own Jeroen de Vos starred in the first episode of the new VPRO series ‘de dikke data show’ (the thick data show). Each episode looks into another aspect of digital culture, trying to explain it and increase literacy among a younger audience. Jeroen was invited to explain how personalisation algorithms function on TikTok, with the work of Algorithms Exposed serving as the backbone of a small experiment looking into this algorithm.

Find the episode here (in Dutch)

About the episode: Who decides whether you go viral on TikTok? Jard Struik and media scholar Jeroen de Vos discover what an algorithm is and how that algorithm determines which videos you see. What is the downside of this invisible regulator? Together with successful and well-known TikTokkers such as Lorenzo Dinatelle, Emma Keuven and Bo Beljaars, they check whether the algorithm really knows us that well. And Jard also dresses up as an e-boy to increase the chance of going viral.

 

[BigDataSur-COVID] Alternative Perspectives on Relationality, People and Technology During a Pandemic: Zenzeleni Networks in South Africa

By Nic Bidwell & Sol Luca de Tena

Many rural communities in Africa have characteristics that are neither represented by data about COVID-19, nor addressed by public health information designed to help people protect themselves. This is not to say that rural inhabitants are unaffected by information designed for different populations; grassroots initiatives have been vital in countering the impacts of this mismatch. Here, we reflect on the role of community networks in customising information and services for rural inhabitants during the pandemic, and how they reveal constructs embedded in data representation and aggregation. Community networks (CNs) are telecommunications initiatives that are installed, maintained, and operated by local inhabitants to meet their own communication needs. Rey-Moreno’s 2017 survey identified 37 community networks in 12 African countries, and with the success of four Annual African CN Summits, more are emerging every year. Our account focuses on Zenzeleni Networks in South Africa. We begin by introducing its response to COVID-19 and its work to ensure that health information suited local circumstances. We end by arguing that such examples of contextualisation reveal logics of personhood that are vital to tackling the disease, but that are not represented by the individualist models embedded in datafication.

Zenzeleni’s Response to COVID-19

Zenzeleni is a community-owned wireless internet service provider that has connected more than 13,000 people and 10 organisations to the internet in South Africa’s Eastern Cape province. The network is owned by amaXhosa inhabitants (including 40% women) and is run by two local cooperatives. The cooperative approach keeps internet access costs up to 20 times lower than services offered by existing telecommunications operators, and expenditure is retained locally. The non-profit organisation Zenzeleni Networks NPC was established through the cooperative, and provides vital connections with regulatory authorities and telecommunications expertise. Zenzeleni was seeded in Mankosi, a remote district of 12 villages, by PhD researchers at the University of the Western Cape in Cape Town, following prolonged collaborations on solar electricity and media sharing technologies. Over the past eight years, the community network has evolved into a social innovation ecosystem in which rural communities own their telecommunication businesses. Like other community networks in the global south, Zenzeleni has created employment and developed technical skills in one of the most disadvantaged areas in South Africa.

As well as providing more affordable and higher quality network services than the alternatives, Zenzeleni’s embeddedness directly links technology and media considerations to local life. As the COVID-19 lockdown ensued, inhabitants working, studying or seeking work in cities returned to their rural family homes. Zenzeleni played a vital role in providing continuity to residents’ urban lives, adding network infrastructure to extend the community access points and ensuring free and open access to education websites, including those of all of the nation’s universities and further education colleges. Indeed, usage of the access points has tripled since the pandemic began.

Not only are health services difficult to access, but the local populations served by Zenzeleni are particularly vulnerable; they have a high incidence of HIV, tuberculosis, and child and maternal health issues. Thus, Zenzeleni sourced funding to connect the District Hospital. Just as importantly, however, from the pandemic’s onset the network started to address health information needs. Like other groups across Africa, Zenzeleni immediately recognised the mismatch between the health information issued by the WHO and South Africa’s national government and local circumstances. Not only was information initially unavailable in most of Africa’s 2,000 languages, but even when advice was in a home language it was ill-suited to many rural contexts. Recommending regular handwashing, for instance, is inappropriate for Mankosi’s inhabitants, who share a few unreliable taps in their villages because water is not supplied to households. Similarly, guidelines on shared transport are irrelevant when only one bus a day connects villages on a five-hour round trip to the nearest supermarket. Zenzeleni ensured free and open access to official health websites. Drawing on its understanding of the local context, it also launched projects that increased access to relevant information resources and raised awareness of health strategies matched to local circumstances.

My Mask Protects You, and Yours Protects Me: Accounting for Personhood in the Datafied Society

Providing health information in home languages and suited to local constraints is vital, but efficacy in managing a socially-spread disease requires integrating deeper insights about the nuances of local social practices and relations. For instance, people returning to villages from cities bring information of varying legitimacy, from sound recommendations to outright falsehoods. Locally, this information was interpreted through the assumption that information in cities is inherently more credible because cities are highly connected. The valorisation of information associated with electronic media has been discussed elsewhere in rural southern Africa. An implicit part of Zenzeleni’s role has been to foster critical approaches to disinformation by directing inhabitants to legitimate information and ensuring information was properly contextualised. At the same time, however, promoting information access must account for sharing practices. While internet hotspots safely offer socially-distanced access, many inhabitants group around tablets and phones.

Device-sharing practices in Mankosi are not merely about limited access to devices. They also involve a cultural construct of relationality. Devices like smartphones are embedded with the logic that personhood exists prior to interpersonal relationships (Bidwell, 2016). This individualist logic contrasts with the philosophy of Ubuntu, an isiXhosa word often translated as “I am because we are.” This collective logic assumes that neither community nor individual exists prior to the other, and that being human depends on the mutual and dynamic constitution of other humans. As Eze explains:

We create each other and need to sustain this otherness creation. And if we belong to each other, we participate in our creations: we are because you are, and since you are, definitely I am.

The importance of the construct of Ubuntu to effective contextualisation is illustrated by Zenzeleni’s local volunteers’ observations that community members assisted each other in putting on face-masks. Senses of mutual responsibility are straightforward in communities such as Mankosi. However, routinely performing that responsibility involves physical help and, since none of the guidelines explicitly address how to combine social distancing with helping someone put on a mask, this creates an ambiguity.

The challenge of translating a guideline such as “wear it for me” reveals an important role for community networks in COVID-19 times, and in datafication more generally. Much like the assumption that a person puts on their mask themselves, prevalent models of data extraction, representation, and personalisation cultivate and amplify an individualist logic. Yet, as many commentators have suggested, the best protection we have against the virus is Ubuntu. Zenzeleni and other community networks around the world offer an alternative perspective on relationality, people, and technology.

 

Nicola Bidwell is an adjunct professor at the International University of Management, Namibia, and a researcher at University College Cork, Ireland. She has applied her expertise in community-based action research for technology design in the Global South for the past 15 years, and has catalysed thought about indigenous-led digital design and decoloniality. Nic is an associate editor of the journal AI & Society: Knowledge, Culture and Communication.

Sol Luca de Tena has over a decade of experience in strategic project management within technology development, capacity building, social impact, and policy, with a focus on utilising technologies to address environmental and social challenges. She is currently the acting CEO of Zenzeleni Networks Not for Profit Company, supporting the operation and seeding of community networks in rural communities in South Africa. She also leads various projects that seek to address the digital divide through a human-centred approach, and collaborates on various working groups and forums on Community Networks in Africa and around the world.

DATACTIVE 2020 year-in-review

2020 has been an intense year from many points of view. As you know, the DATACTIVE project was supposed to end in August 2020, but due to COVID-19 we negotiated a so-called no-cost extension with our funder, the European Research Council, which extended the life span of the project until June 30th, 2021.

Over these months, we have kept busy despite the many uncertainties and logistical problems imposed by the pandemic. We would love to share the good news and our main accomplishments, together with our best wishes for the new year.

What are we most proud of? The first of the four DATACTIVE PhD students successfully defended his PhD in October 2020! The dissertation, entitled ‘Wired Norms: Inscription, resistance, and subversion in the governance of the Internet infrastructure’, can be found here [0]. Watch out for the other PhD candidates…

We gave many talks, mostly on Zoom! But we also managed to host the workshop ‘Contentious Data: The Social Movement Society in the Age of Datafication’, organized by Davide Beraldo and Stefania on November 12-13, 2020, which contributes towards the Special Issue of the same title in preparation for the journal Social Movement Studies.

We completed data collection and analysis of over 250 interviews with civil society actors from all corners of the globe. Our developer Christo has finalized (and will soon release on GitHub) an open-source infrastructure that allows us to collaboratively analyze and manage qualitative data outside the corporate environment of mainstream data analysis software, while safeguarding the privacy and safety of our informants. We are now busy making sense of all these beautiful ‘thick’ data and writing up articles and chapters.

Stefania has been particularly busy with the spin-off blog of the Big Data from the South Research Initiative, dedicated to exploring the first pandemic of the datafied society seen from… communities and individuals left at the margins of media coverage, public concern and government response. You can access the many contributions published since May in COVID-19 from the Margins [1]. We are happy to announce that the blog has resulted in an open-access multilingual book edited by Stefania, Emiliano Treré and Silvia Masiero for the Amsterdam-based Institute of Network Cultures. The book will be released in both digital and printed form in January 2021. Book your copy if you want to receive one!

Collectively, we published four articles and three book chapters, listed below. Four more articles—for New Media & Society, Globalizations, and Big Data & Society—will be released in early 2021, alongside three book chapters. A co-edited special issue on media innovation and social change was released in early 2020, while three co-edited special issues, respectively for the peer-reviewed international journals Internet Policy Review, Social Movement Studies and Palabra Clave, are in the works and will be released in the course of 2021.

Many people worked in the background alongside PI Stefania, in particular our tireless project manager Jeroen de Vos, our developer Christo, our PhD candidates Guillén Torres and Niels ten Oever, and postdoc Davide Beraldo. To them goes our gratitude.

We wish you happy holidays and a peaceful and healthy 2021!

Best regards, Stefania for the DATACTIVE team

[0] https://nielstenoever.net/wp-content/uploads/2020/09/WiredNorms-NielstenOever.pdf
[1] https://data-activism.net/blog-covid-19-from-the-margins/

OUR PUBLICATIONS IN 2020

PhD DISSERTATION

ten Oever, N. (2020). Wired Norms: Inscription, resistance, and subversion in the governance of the Internet infrastructure. PhD thesis, University of Amsterdam.

ARTICLES

Milan, S., & Treré, E. (2020). The rise of the data poor: The COVID-19 pandemic seen from the margins. Social Media + Society, July. https://doi.org/10.1177/2056305120948233

Milan, S., & Barbosa, S. (2020). Enter the WhatsApper: Reinventing digital activism at the time of chat apps. First Monday, 25(1). https://doi.org/10.5210/fm.v25i12.10414

Tanczer, L. M., Deibert, R. J., Bigo, D., Franklin, M. I., Melgaço, L., Lyon, D., Kazansky, B., & Milan, S. (2020). Online Surveillance, Censorship, and Encryption in Academia. International Studies Perspectives, 21(1), 1–36. https://doi.org/10.1093/isp/ekz016

Milan, S. (2020). Techno-solutionism and the standard human in the making of the COVID-19 pandemic. Big Data & Society. https://doi.org/10.1177/2053951720966781

SPECIAL ISSUES

Ni Bhroin, N., & Milan, S. (Eds.). (2020). Special issue: Media Innovation and Social Change. Journal of Media Innovations, 6(1). https://journals.uio.no/TJMI

BOOK CHAPTERS

ten Oever, N., Milan, S., & Beraldo, D. (2020). Studying Discourse in Internet Governance through Mailing-list Analysis. In D. L. Cogburn, L. DeNardis, N. S. Levinson, & F. Musiani (Eds.), Research Methods in Internet Governance (pp. 213–229). MIT Press. https://doi.org/10.7551/mitpress/12400.003.0011

Milan, S. (2020a). Big Data. In B. Blaagaard, L. Pérez-González, & M. Baker (Eds.), Routledge Encyclopedia of Citizen Media (pp. 37–42). Routledge.

Milan, S., & Treré, E. (2020b). Una brecha de datos cada vez mayor: La Covid-19 y el Sur Global. In B. M. Bringel & G. Pleyers (Eds.), Alerta global. Políticas, movimientos sociales y futuros en disputa en tiempos de pandemia (pp. 95–100). CLACSO and ALAS. http://biblioteca.clacso.edu.ar/clacso/se/20200826014541/Alerta-global.pdf

OTHER

Milan, S., & Treré, E. (2020c, April 3). A widening data divide: COVID-19 and the Global South. OpenDemocracy. https://www.opendemocracy.net/en/openmovements/widening-data-divide-covid-19-and-global-south/

ten Oever, N. (2020). Cybernetica, dataficatie en surveillance in de polder. In Ni Dieu, Ni Maitre: Festschrift for Ruud Kaulingfrek. Waardenwerk, Journal for Humanistic Studies, SWP.

Di Salvo, P., & Milan, S. (2020, April 24). I quattro nemici (quasi) invisibili nella prima pandemia dell’era della società dei dati. Il Manifesto. https://ilmanifesto.it/i-quattro-nemici-quasi-invisibili-nella-prima-pandemia-dellera-della-societa-dei-dati/

Milan, S., & Di Salvo, P. (2020, June 8). Four invisible enemies in the first pandemic of a “datafied society.” Open Democracy. https://www.opendemocracy.net/en/can-europe-make-it/four-invisible-enemies-in-the-first-pandemic-of-a-datafied-society/

Milan, S., Pelizza, A., & Lausberg, Y. (2020, April 28). Making migrants visible to COVID-19 counting: The dilemma. OpenDemocracy. https://www.opendemocracy.net/en/can-europe-make-it/making-migrants-visible-covid-19-counting-dilemma/

Pelizza, A., Lausberg, Y., & Milan, S. (2020, May 14). Come e perché rendere visibili i migranti nei dati della pandemia. Internazionale. https://www.internazionale.it/opinione/annalisa-pelizza/2020/05/14/migranti-dati-pandemia

IN PRESS

BOOKS

Milan, S., Treré, E., & Masiero, S. (2021). COVID-19 from the Margins: Pandemic Invisibilities, Policies and Resistance in the Datafied Society. Institute of Network Cultures.

ARTICLES

Kazansky, B. (2021). “It depends on your threat model”: Understanding the anticipatory dimensions of resistance to datafication harms. Big Data & Society.

Kazansky, B., & Milan, S. (2021). Bodies Not Templates: Contesting Mainstream Algorithmic Imaginaries. New Media & Society.

ten Oever, N. (2021). ‘This is not how we imagined it’ – Technological Affordances, Economic Drivers and the Internet Architecture Imaginary. New Media & Society.

ten Oever, N. (2021). Norm conflict in the governance of transnational and distributed infrastructures: The case of Internet routing. Globalizations.

CHAPTERS

Milan, S., & Treré, E. (2021). Big Data From the South(s): An Analytical Matrix to Investigate Data at the Margins. In D. Rohlinger & S. Sobieraj (Eds.), The Oxford Handbook of Sociology and Digital Media. Oxford University Press.

Milan, S., & Treré, E. (2021). Latin American visions for a Digital New Deal: Learning from critical ecology, liberation pedagogy and autonomous design. In IT for Change (Ed.), Digital New Deal. IT for Change.

ten Oever, N. (2021). The metagovernance of internet governance. In B. Haggart, N. Tusikov, & J. A. Scholte (Eds.), Power and Authority in Internet Governance: Return of the State? Routledge Global Cooperation Series.

SPECIAL ISSUES IN THE WORKING

Three special issues we are very excited about

Milan, S., Beraldo, D., & Flesher Fominaya, C. Contentious Data: The Social Movement Society in the Age of Datafication, Social Movement Studies

Treré, E., & Milan, S., Latin American Perspectives on Datafication and Artificial Intelligence, Palabra Clave

Burri, M., Irion, K., Milan, S., & Kolk, A. Governing European values inside data flows, Internet Policy Review

ALSO FROM THE TEAM….

Beraldo, D. (2020). Movements as multiplicities and contentious branding: Lessons from the digital exploration of #Occupy and #Anonymous. Information, Communication & Society. DOI: 10.1080/1369118X.2020.1847164

Grover, G., & ten Oever, N. (2021). Guidelines for Human Rights Protocol and Architecture Considerations. RFC series, Internet Research Task Force.

Knodel, M., Uhlig, U., ten Oever, N., & Cath, C. (2020). How the Internet Really Works: An Illustrated Guide to Protocols, Privacy, Censorship, and Governance. No Starch Press, San Francisco, United States.

Milan, C., & Milan, S. (2020). Fighting Gentrification from the Boxing Ring: How Community Gyms reclaim the Right to the City. Social Movement Studies. https://doi.org/10.1080/14742837.2020.1839406.

BigDataSur 2020 year-in-review

by Stefania Milan, Emiliano Treré and Silvia Masiero

December 18, 2020

2020 has been a tough year for many reasons. The COVID-19 global health emergency has claimed lives, exposed our dependence on the digital infrastructure, and impoverished many communities even further. We were forced to change plans, subvert our lifestyles, and distance ourselves from our loved ones. The first pandemic of the datafied society has exposed the weakness of people and communities at the margins. Not only has the Global South been severely hit, but gig workers, impoverished families, domestic violence survivors, LGBTQ+, indigenous, migrant, racialized and rural communities have also paid an even higher price in terms of lowered income, loneliness, violence, and death. If anything, this pandemic has made clear the need for an initiative like Big Data from the South, tasked with interrogating and exposing impending forms of inequality, injustice and poverty as they intersect with the datafied society.

Against this backdrop, Big Data Sur has not remained quiet. Our network has produced a number of critical, cutting-edge reflections on the main challenges of the pandemic. The thematic, multilingual blog ‘COVID-19 from the margins’, launched in May 2020, has given voice to the many fringes left in the dark by the mainstream coverage of the pandemic. It has offered, and continues to offer, precious food for thought on the challenges of the pandemic for the disempowered.

To date, we have published contributions from over 80 authors, in five languages, reporting from some 25 countries. We covered all continents–from Indonesia to Mexico, from Peru to New Zealand, from Namibia to China to Spain. Among others, we ran a special on Brazil when the controversial president Jair Bolsonaro dismissed the pandemic as a mere ‘gripezinha’ (light flu). Lately, a group of astronomers contributed their experience of working with indigenous communities in rural areas of Brazil.

We worked in the shadows (we even designed the logo ourselves!), we worked nights. We fundraised to be able to pay a small contributor fee to authors in need, and provided editorial support in several languages to empower less experienced writers to share their stories with a global audience. This was only possible thanks to the new team members who joined us. Silvia Masiero, Associate Professor of Information Systems at the University of Oslo, joined Emiliano Treré and Stefania Milan in the editorial team. Nicolás Fuster, Guillén Torres, Zhen Ye, Jeroen de Vos, and Yiran Zhao provided key support in the background. Volunteer proof-readers like Sergio Barbosa (Portuguese) and Giulia Polettini (Chinese) helped us occasionally. To this splendid team goes our gratitude and appreciation: without their precious help, we would not have been able to publish in so many languages and with such high frequency.

Unfortunately, our project is chronically underfunded. But some enlightened organizations believed in the urgency of the BigDataSur agenda. In particular, the COVID-19 blog was supported by the Amsterdam School for Cultural Analysis at the University of Amsterdam (The Netherlands), the School of Journalism, Media and Culture at Cardiff University (UK), and the European Research Council via the DATACTIVE project: thank you!

Besides the blog, in 2020 BigDataSur’s work and values have been featured in public talks and lectures, and in a sizable number of academic writings. An analytical matrix to study ‘data from the margins’ will soon appear as part of the Oxford Handbook of Sociology and Digital Media edited by Rohlinger and Sobieraj for Oxford University Press. A special issue of the multilingual journal Palabra Clave exploring ‘Latin American perspectives on datafication and Artificial Intelligence’ will be released in August 2021. And more is in store, including plans for a course on ‘Decolonizing Datafication’ to be added to the teaching curriculum at the University of Amsterdam—for a start.

What’s next?

Due to lack of funding, the COVID-19 blog will progressively wind down. So hurry up and send us your posts if you want to join the conversation! 

But we also have great news in store for you: the blog has given birth to the multi-vocal book COVID-19 from the Margins: Pandemic Invisibilities, Policies and Resistance in the Datafied Society. The book—proudly multilingual and rigorously open access—will be released in early January by the Amsterdam-based Institute of Network Cultures, as part of their edgy series ‘Theory on Demand’. As some of you know, the December release date had to be postponed because COVID-19 hit the publisher, too. We wish to extend our heartfelt thanks to our amazing copy-editor Andrew Schrock from Indelible Voice, who worked against the clock to deliver the final manuscript. Thanks to funding by two of the University of Amsterdam’s Research Priority Areas, namely ‘Global Digital Culture’ and ‘Amsterdam Center for European Studies’, as well as the DATACTIVE project, we will print a sizable number of copies for free distribution. Let us know if we should reserve a copy for you! We can mail it anywhere.

 

 

[BigDataSur-COVID] Towards Civic Data Policies: Participatory Safeguards in COVID-19 Times

By Arne Hintz

The pervasive tracing, tracking, and analysing of citizens and populations has emerged as the trade-off of an increasingly datafied world. Citizens are becoming more transparent to the major data-collecting institutions of the platform economy and the state, while they have limited possibilities to intervene in processes of data governance, control the data that is collected about them, and affect how they are profiled and assessed through data assemblages. The COVID-19 pandemic has highlighted the centrality of these dynamics. Contact tracing and detailed identification of outbreak clusters have been essential responses to COVID-19. Yet, detailed data about our movements, interactions and pastimes is now tracked, stored, and analysed, both “online” through the use of contact-tracing apps and “offline” (e.g., when we fill in a form at a bar or restaurant). The rise of tracking raises the question of how exactly data is collected and analysed, by whom, for what purposes, and with what limitations. Essentially, it signals the necessity of legal safeguards to ensure that data analytics fulfil their purpose while preventing privacy infringements, discrimination, and the misuse of data. The COVID-19 pandemic thus alerts us to the importance of effective regulatory frameworks that protect the rights and freedoms of digital citizens. It also demands public involvement in a debate that affects our lives during the pandemic and beyond.

The wider context of data policy in the wake of major data controversies involving both public and commercial institutions—from the Snowden revelations to Cambridge Analytica—is currently ambiguous. On the one hand, it reflects a deeply entrenched commitment to expansive data collection. On the other hand, it increasingly recognises the need for enhanced data protection and citizens’ data rights. In many countries, the possibilities for monitoring people’s data traces (particularly by state agencies) have significantly expanded. The UK Investigatory Powers Act of 2016 serves as a stark example, because it legalised a broad range of measures, including the “bulk collection” of people’s data and communications; the retention of “internet connection records” (i.e., people’s web browsing habits); and “computer network exploitation” (i.e., state-sponsored hacking into the networks of companies and other governments as well as the devices of individual citizens).

At the same time as these encroachments, we have also seen the strengthening of data protection rules, most prominently by the European Union General Data Protection Regulation (GDPR) in 2018. The GDPR enhances citizen control over data by providing rights to access and withdraw personal data, request an explanation for data use, and deny consent to data tracking by platforms. It requires that data be collected only for specific purposes to reduce indiscriminate data sharing and trading. The GDPR also limits the processing of sensitive personal data. While some elements of the GDPR have been controversial and the regulation overall is often described as insufficient, it has been recognised as an important building block towards a citizen-oriented data policy framework. The emerging policy environment of data collection and data use has been significant in societies that are increasingly governed through data analysis and processes of automated decision-making. Profiling citizens and segmenting populations through detailed analysis of personal and behavioural data are now at the core of governance processes and shape state-citizen relations.

What does the shifting data environment mean in COVID-19 times? How should regulatory frameworks enable and constrain the tracking and tracing of virus outbreaks, and what boundaries should exist? If we accept that some data collection and analysis is useful for addressing the pandemic and its serious health implications, the purpose limitation of this data (as highlighted by the GDPR) becomes crucial. In some countries, contact-tracing apps were designed to track a much wider range of data than initially necessary for tracing infection chains and to enable government agencies to use that data for non-medical tracking purposes. In order to avoid contact-tracing becoming a Trojan Horse for widespread citizen surveillance, strict purpose limitation would be an essential cornerstone of a robust regulatory framework. Similarly, limitations on the collection of sensitive data and the deletion of all data at fixed times during or after the pandemic would be core components of such a framework. While it may be debatable whether wider data collection and sharing would be acceptable as long as the affected individuals give their consent, a consent model often leads to pressures and incentives for citizens to hand over data against their will and interest, which makes strict prohibitions seem a more appropriate mechanism. The COVID-19 contact-tracing case thus points to some of the elements that are increasingly discussed and regulated as part of policy reforms such as the GDPR, and it highlights the challenges of indiscriminate data collection.

Indiscriminate data collection also poses questions about who should develop such policy, and whether broader public involvement would be desirable or even necessary. The COVID-19 pandemic helps us explore the role of citizens as policy actors. Contributions to the regulatory and legislative environment by civic actors outside the realm of traditional “policymakers” have received increased attention in recent years. These range from the role of civil society in multi-stakeholder policy processes, to policy influences by social movements, to the development of specific legislation by citizens in the form of what has been called “crowd law” and “policy hacking”. The COVID-19 case demonstrates multiple dimensions of these kinds of public engagement. It shows the strong normative role of technical developers arguing for decentralised data storage options in contact-tracing apps (e.g., the Decentralised Privacy-Preserving Proximity Tracing project), who have prevailed in many cases over the initial government intention to centralise data handling. Further, we have seen legal scholars taking the lead in proposing relevant legislative frameworks, for example by developing a dedicated Coronavirus Safeguards Bill for the UK (which has not, so far, been adopted by the UK government but has still influenced the debate on contact-tracing). The public discourse on COVID-19 responses in many countries has also considered the problem of data collection and possible privacy infringements, thus placing data analytics firmly on the public agenda.
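The decentralised option these developers argued for can be sketched roughly as follows: each phone derives short-lived ephemeral identifiers from a secret daily key, remembers the identifiers it observes nearby, and checks for exposure locally once diagnosed users publish their daily keys, so no central server learns who met whom. The sketch below captures only this general shape; it is not the actual DP-3T protocol, and the key derivation and parameters are invented for illustration.

```python
# Toy sketch of decentralised proximity tracing (DP-3T-like in spirit only).
import hashlib
import os

def ephemeral_ids(daily_key: bytes, slots: int = 96) -> list[bytes]:
    """Derive rotating ephemeral IDs (e.g. one per 15-minute slot) from a daily key."""
    return [hashlib.sha256(daily_key + slot.to_bytes(2, "big")).digest()[:16]
            for slot in range(slots)]

class Phone:
    def __init__(self):
        self.daily_key = os.urandom(32)   # stays on the device unless the user is diagnosed
        self.broadcast = ephemeral_ids(self.daily_key)
        self.observed: set[bytes] = set() # ephemeral IDs heard over Bluetooth

    def hear(self, other: "Phone") -> None:
        self.observed.update(other.broadcast)

    def exposed(self, published_keys: list[bytes]) -> bool:
        """Matching happens on the device: no central server learns who met whom."""
        return any(self.observed & set(ephemeral_ids(k)) for k in published_keys)

alice, bob, carol = Phone(), Phone(), Phone()
alice.hear(bob)                  # Alice and Bob were near each other
published = [bob.daily_key]      # Bob tests positive and uploads only his daily key
print(alice.exposed(published))  # True
print(carol.exposed(published))  # False
```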

The current pandemic has shown that emergency situations require the rapid adoption of legal safeguards, and a wider public debate on what data analyses are acceptable and where the boundaries lie. Policy components from recent regulatory frameworks such as the GDPR can be an important part of this endeavour, as should critical reflection on data extraction laws such as the Investigatory Powers Act. Expert proposals from civil society have promoted rules that address problems raised by the pandemic while protecting civic rights. At the “margins” of established policy processes, these interventions by civil society and the public play a significant role in building normative pressure towards civic data policies.

 

About the author

Arne Hintz is Reader at Cardiff University’s School of Journalism, Media and Culture and Co-Director of its Data Justice Lab. His research focuses on digital citizenship and the future of democracy and participation in the age of datafication. He is Co-Chair of the Global Media Policy Working Group of the International Association for Media and Communication Research and co-author of Digital Citizenship in a Datafied Society (Polity, 2019).

[BigDataSur-COVID] Africa’s Responses to COVID-19: An Early Data Science View

By Vukosi Marivate, Elaine Nsoesie & Herkulaas MVE Combrink

 

COVID-19 is a unique event that has shaken the world. It has disrupted the way we live, how we work, and what we think. Across Africa, the arrival of COVID-19 also drew attention to the continent. We have had to live through grim forecasts of how “badly” the continent was going to respond to the virus, or claims that the continent was different and would not feel the impact. Given that we are still in the midst of the pandemic, we face the hard task of sifting through opinions and reports to reach a better understanding of what has happened. We have had to deal both with trying to measure impact better and with contemplating whether natural remedies would prevent spread. As data scientists, we believe that what is measured can obscure shortcomings that might otherwise enlighten us on how to deal with such situations better in the future.

Africa has significant experience dealing with infectious disease epidemics. For example, countries in West and Central Africa have responded to Ebola outbreaks over the decades, and Southern Africa has had HIV/AIDS to deal with. The experience gained from these epidemics has prepared African health systems to respond to the pandemic. We are likely to see many research papers in the coming years dissecting what impact this preparedness may have had. In this article, we focus on how Africa worked to track COVID-19 and what that might mean for data scientists in the future. What should we learn? Where did things go well? Where did things fail? How do we improve?

When we Measure the Spread

As the pandemic spread across the Northern Hemisphere, questions formed throughout the African continent about the potential impact of COVID-19 on different African countries. In many countries, COVID-19 working groups were set up. These working groups were typically made up of government and external experts who planned to look at different factors in the responses to COVID-19. In many instances, these groups used data to track COVID-19 and to assist in modelling and data-driven decision making. One would have noticed the proliferation of country-led dashboards and infographics on the spread of COVID-19. In some countries, numbers were difficult to track and understand because of low testing numbers. Tracking the spread of COVID-19 required a pipeline that could test, report, and aggregate information in a meaningful way for epidemiological and clinical surveillance.
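A minimal sketch of the reporting end of such a pipeline might look like the following: individual test results come in, malformed records are skipped, and positives are aggregated into daily counts per region for dashboards and models. The field names and CSV layout are illustrative assumptions, not a description of any country’s actual system.

```python
# Hypothetical aggregation step of a COVID-19 surveillance pipeline:
# read individual test results, skip malformed rows, count positives
# per (date, region). The input file and its columns are invented.
import csv
from collections import defaultdict
from datetime import date

def aggregate_daily_counts(rows):
    counts = defaultdict(int)
    for row in rows:
        try:
            day = date.fromisoformat(row["report_date"])
            region = row["region"].strip()
            result = row["result"].strip().lower()
        except (KeyError, ValueError, AttributeError):
            continue  # incomplete records are the norm, not an edge case
        if result == "positive":
            counts[(day, region)] += 1
    return counts

with open("test_results.csv", newline="") as f:  # hypothetical input file
    daily = aggregate_daily_counts(csv.DictReader(f))

for (day, region), n in sorted(daily.items()):
    print(f"{day} {region}: {n} confirmed cases")
```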

Challenges in Reporting

We have seen international challenges to free, transparent, and open reporting on the severity of COVID-19. Some African countries faced these challenges as well, from denying that the pandemic existed to refusing to release information on testing and confirmed cases. These challenges cannot be explained away by simplistic reasons such as political pandering; they more likely indicate shortfalls in the resources available to respond to the pandemic. Countries have been stretched thin in a short period of time, and systems may not have the capacity to change direction this quickly. In this environment, how do you compile statistics and share meaningful information with both the public and policy stakeholders?

COVID-19 Will Still be With us

No one should underplay how COVID-19 will ultimately impact African countries. Its impact will not be limited to healthcare; many sectors of society will likely be reeling from the sustained effects of the pandemic. There is already mounting evidence of adverse, secondary damage to other sectors such as education, crime, healthcare, and the economy. Decisions on border and business closures made during the early stages of the outbreak may also have lasting effects on countries in Africa.

Tracking More than Health

COVID-19 has affected more than just health, and the effects will be with us for some time. As we move into second waves in some countries, we are now deciding how to rehabilitate economies, education systems, and tourism. All of these decisions require data that flows between national statistics offices and other stakeholders. To better plan recoveries and interventions, organisations and states are working to use data to make choices about which interventions might be best. This process extends the need for data beyond the healthcare system toward a coordinated response driven by public, private and non-governmental institutions. Data and data-related issues are the ultimate reflection of the people and capacity issues present within a system. If we are to combat negative outcomes, we should all work toward capacitating our nations to prepare for the future.

Lessons we Must Learn

Counting is hard. It requires will, cooperation and resources that together improve policy. We need to learn how to set up data infrastructure so that counting can catalyze better data practices in the future. Yet, setting up a data infrastructure requires money and human capacity. Across the global population, we will have more emergencies to deal with. As such, governments must prepare adequately during “peace time.” If we do not prepare, we will not get ahead and manage future crises better. Investing in capacity and building the skills required to disseminate information reliably helps prepare us for the future. We should never shy away from training, innovation and incentivising education for the purpose of growth and improvement. Technical skills across all sectors—especially within healthcare—have served vital roles during the pandemic and will continue to do so. Capacitating the healthcare system with the technical skills to manage information, actively strive for excellence, and innovate remains the foundation of preparedness, and drives the proactive strategies we need to be successful as a society.

Vukosi Marivate (https://dsfsi.github.io/) is the ABSA UP Chair of Data Science at the University of Pretoria. A large part of his work over the last few years has been in the intersection of Machine Learning and Natural Language Processing. Vukosi is interested in Data Science for Social Impact, and uses local challenges as a springboard for research. Vukosi is a co-founder of the Deep Learning Indaba, the largest Artificial Intelligence grassroots organisation on the African continent, aiming to strengthen African Machine Learning. He tweets at @vukosi.

Elaine Nsoesie is an Assistant Professor at the Boston University School of Public Health. She has a PhD in Computational Epidemiology, an MS in Statistics, and a BS in Mathematics. Her research is focused on the use of digital data and technology to improve health in global communities. Her work has also addressed bias in digital data. She is on the advisory boards of Data Science Africa and Data Science Nigeria. She is also the founder of Rethé (rethe.org), an initiative that provides scientific writing tools and resources to student communities in Africa to increase representation in scientific publications.

Herkulaas Michael Combrink is a medical biological scientist with more than six years of data science experience working with “Big Data” from institutional databases. Over the past seven years, he has been active in both healthcare and education. Herkulaas has won several awards for his work in data science, data management and healthcare. During the COVID-19 outbreak in the Free State, he was seconded to assist the Free State Department of Health with data science and surveillance support. Additionally, Herkulaas is a PhD candidate in computer science at the University of Pretoria, South Africa.

 

[BigDataSur-COVID] Solutionism, Surveillance, Borders and Infrastructures in the “Datafied Pandemic”

By Philip Di Salvo

The COVID-19 pandemic has been a prism and an amplifier for anything data. It has exposed underlying issues that require the attention of academics, activists, journalists, and policy makers. Health emergencies are enormous stress tests for civil rights and freedoms, and for the platforms through which societies come together. With most of the world population under lockdown or subjected to monitoring, digital platforms and internet infrastructures have become leading spaces where social life takes place. This may sound obvious now, but as Franco “Bifo” Berardi wrote in his pandemic-influenced book Fenomenologia della fine, COVID-19 globally recodified the assumptions of our societies, so we must consider their datafied sides. While we live on the internet more than ever, access to tools, basic services, and social environments is becoming increasingly unequal. Such inequalities have increased due to the uneven distribution of opportunities, resources, and the exclusive design of socially-impactful technologies.

In a piece for Open Democracy written from the Dutch and Italian lockdowns last spring, Stefania Milan and I tried to identify “four enemies” emerging from the pandemic in the context of the “datafied society.” Back then, we claimed that the pandemic was accelerating “potentially dangerous dynamics” capable of causing huge collective damage. In the fall of 2020, those dynamics apparently exploded in plain sight, exacerbated by the long-anticipated “second wave” of the virus and by political interventions worldwide. As we expected, the pandemic reformulated the relationships between tech, power, and justice, as claimed by Linnet Taylor, Gargi Sharma, Aaron Martin, and Shazade Jameson in their book Data Justice and COVID-19: Global Perspectives. The outcomes of this reformulation have not yet manifested clearly, but they are already visible in some domains, especially among the most marginalised communities. In this essay, I will discuss four keywords: solutionism, surveillance, borders, and infrastructures.

Solutionism

The pandemic has been accompanied by a new wave of solutionism in policy making, healthcare, and beyond. Solutionism has been described by Evgeny Morozov as the “idea that given the right code, algorithms and robots, technology can solve all of mankind’s problems.” We heard many such calls during the pandemic, especially when contact-tracing apps were heralded as the “silver bullet” against the spread of the virus. In Italy, the government adopted privacy-respectful solutions and frameworks for its national app Immuni (“the immune ones”). However, this sensitivity came only after weeks of pressure from privacy activists, academics, and journalists to avoid more invasive software solutions. Even in an established democracy like Italy, China was frequently described as a model to follow, especially with regard to the tracking of citizens during the pandemic. Although that pressure led to better decisions and an improved app, privacy and surveillance are not the only potential problems with these apps. While they are undoubtedly useful for tracing cases and are one more solution that states can adopt in the battle against COVID-19, they are not the most fundamental one.

Even when privacy-respectful, contact-tracing apps may exclude enormous segments of the population. Singapore has been an interesting and dramatic case study in this regard, not least because the city has frequently been held up as an excellent example of pandemic response, especially where technology is concerned. As the BBC reports, though, “success crumbled when the virus reached its many foreign worker dormitories,” which are home to over 300,000 low-wage foreign workers living in inadequate conditions where social distancing is impossible and contact-tracing apps fail in their mission. As the number of cases in the dorms skyrocketed, the Singapore authorities started releasing separate statistics about the contagion: one for the city community, and one for the population in the dorms. Excluded from assistance and prevention, foreign workers were even hidden from the main data, ending up in dedicated statistics that highlight a clear pattern of inequality. Stories of exclusion and blatant inequality related to technological responses to the pandemic have emerged from all over the world, including from developed and fully democratic countries. In Canada, for instance, it has been reported that the national contact-tracing app was released in French and English only, another sign of exclusion for the four million Canadians who are not proficient in those languages. In the UK, an expert board reporting to the government highlighted that some 21% of UK adults do not use a smartphone, de facto excluding them from access to contact-tracing apps. In Italy, the national contact-tracing app does not run on an array of older Android and Apple phones (and has shown bugs on more recent models too), making income and familiarity with consumer electronics decisive factors in the spread of the app among the Italian population. The predominance of older smartphones in Italy has been indicated as a driver of the app’s low adoption, as Wired reports. Although the stories of Singapore’s (largely Bangladeshi) migrant workers and those from Western countries cannot be put on the same level in terms of severity, it is clear that at every latitude technological determinism, when pushed with too much emphasis on “smart” and “shiny” digital technologies, may lead to forms of inequality and exclusion. Furthermore, evidence about the effectiveness of contact-tracing apps remains limited, as The Lancet reported in August.

Surveillance

Whereas much of the debate about privacy in the context of the COVID-19 pandemic has focused on contact-tracing apps, they are certainly not the only potentially harmful technology revitalized in recent months. Surveillance studies scholars Martin French and Torin Monahan have pointed out that there is “evidence of surveillance dynamics at play with how bodies and pathogens are being measured, tracked, predicted, and regulated.” Controlling the spread of a pandemic, in other words, inevitably involves forms of surveillance. According to an AlgorithmWatch report, the pandemic has accelerated the adoption of various monitoring technologies and automated decision-making systems, including bracelets, selfie apps, thermal scanners, facial recognition systems, and programs for digital data collection and analysis. As AlgorithmWatch asks, are these technologies becoming the “new normal”? Their implementation has frequently been supported by a deterministic approach, raising critical questions about informed consent and the impact of such technologies on our fundamental rights. As noted at the beginning of this essay, global emergencies are also stress tests for societies and democracies at large, since they are forced to cope with extraordinary situations. As Elise Racine, a research associate at A Path for Europe (PfEU), argues, the “risk for function creep means that these tools may be co-opted by other security initiatives.” In this way, data-driven technologies may endanger the fundamental rights of the most vulnerable, who are more exposed to abusive forms of monitoring and surveillance.

The pandemic has revitalized the appetite for surveillance all around the world, with facial recognition and other controversial technologies leading the way. As the Centre for Security Studies at ETH Zürich reports, the market for surveillance cameras is expected to grow substantially in 2021, with 300,000 new cameras installed every day globally and a billion cameras in place by the end of that year. Democratic institutions are at stake, since intrusive technologies undermine democratic values and have been shown to be disproportionately used to target minorities and exacerbate racial biases.

Examples of facial recognition being used to enforce COVID-19 restrictions have already emerged from Russia, where Moscow’s enormous network of cameras has been used to control residents during the lockdown. Even in democratic contexts like Italy, facial recognition is making its way into public spaces, often pushed as a migration-containment strategy, as happened in the Italian city of Como, another sign that the most vulnerable communities of our societies are also the most exposed to constant monitoring. Crises set new standards. Are we slowly moving into a surveillance state, where immediate health measures pave the way for overreaching forms of surveillance that are here to stay? Without proper testing, clear frameworks, and guidelines, we risk endorsing a normalization of surveillance with effects that could be difficult to assess and could take years to reverse.

Borders

Borders have traditionally been surveilled. Unsurprisingly, technologies for monitoring them are also being adopted more rapidly across the world, riding on promises to make life easier and safer during the pandemic. Whereas boarding a plane without touching any surface may sound like a viable way to prevent the further spread of the virus, boarding a plane only through facial recognition raises obvious privacy concerns. The datafied “immunity passports” now being discussed in various countries pose serious threats to various segments of the population. They have been sold as another “crisis-response that depends on technology, as we saw with contact-tracing apps,” writes Privacy International. As the London-based NGO argues, these technical solutions are currently being hyped and pushed by private actors involved in travel and border services, but their adoption may have serious impacts on citizens’ right to movement and on the lives of those most discriminated against. These tools may also become useful for profiling, as they may give “the police and security services more powers to not only know information about our health, but also to stop people and demand proof of immunity in certain situations,” as Privacy International again argues. The global lockdown has also deeply changed the nature of geographical borders and their political meanings, and migrants have been disproportionately victimized by this new status quo. Migrants and refugees have frequently been left out of COVID-19 statistics and figures, given their invisibility, and refugees are usually the first targets of the datafied surveillance practices discussed here. In April, the Bureau of Investigative Journalism reported how the digital monitoring and surveillance practices now being adopted during the pandemic were originally tested on refugees and migrants during the 2015 migration crisis in Europe. In Singapore, migrant workers have been forced to download a contact-tracing app, and Russia is reportedly considering following suit. Vulnerable communities, such as migrants on the move, who already suffer from weaker safeguards for their rights and freedoms, are increasingly becoming a testing ground for datafied monitoring practices that may end up becoming standardized practices in a post-pandemic world.

Infrastructures

Digital infrastructures and platforms have gained new centrality in our daily lives because of the pandemic. Remote working, remote teaching, and public services were forced to migrate online and still rely on digital tools to function. This evolution has profound implications in a society pushing for more datafication. It is time to ask: what are the long-term implications of making private services the de facto infrastructure of social life, citizenship, and agency? Coming back to contact-tracing apps as an example, there is little doubt that the framework provided by the Apple-Google alliance made a privacy-respectful structure readily available. Yet we should demand greater transparency when such powerful companies become official suppliers of the digital infrastructures used for health services. The balance of power between nation states and private entities is at stake. Most urgently, as David Lyon urges, the pandemic should be the moment when we start considering the implications of surveillance beyond the privacy issue alone. More is at stake, because surveillance has become a structural element of today’s societies. With most of our lives moving online, we are also moving into spaces where what Shoshana Zuboff calls “surveillance capitalism” is the ruling economic, political, and social structure. Surveillance capitalism is increasingly exposing all of society’s activities to extended datafication: the constant monitoring, sorting, and profiling of people for profit. It is time to build exit strategies and new forms of resistance; the datafied society is now an established reality and is already affected by global issues such as the pandemic. The view from inside this crisis indicates that, in its current shape, the datafied society is increasingly working against its own citizens.

Philip Di Salvo is a post-doc and professor at the Media and Journalism Institute of Università della Svizzera italiana in Lugano, Switzerland. His research areas include leaks, the relationship between journalism and hacking, and internet surveillance. His latest books are Leaks. Whistleblowing e hacking nell’età senza segreti (LUISS University Press, 2019) and Digital Whistleblowing Platforms in Journalism. Encrypting Leaks (Palgrave Macmillan, 2020). He tweets at @philipdisalvo.

 

Stefania’s talks in November-December 2020

In November-December, DATACTIVE PI Stefania Milan will participate in a number of public events in which DATACTIVE’s work will feature in various ways.

In November:

  • Internet Governance Forum (3 November)
  • FallingWalls Fireside chat (6 November)
  • Jean Monnet Network Wageningen workshop (9 and 16 November)
  • ASCA Summit (24 November)

In December:

  • Roundtable “AI and Bias in Translation”, organised by the Goethe-Institut (2 December)
  • “Stories of the pandemic: voices and data from communities and the South”, part of the Speaker Series of the Institute for Media and Creative Industries at Loughborough University London (3 December). Stefania will join Vinod Pavarala (University of Hyderabad, India) and Emiliano Treré (Cardiff University, UK) in a two-hour discussion on the role of community media during the COVID-19 crisis.
  • “Digital methods for social change” workshops for the Swedish International Centre for Local Democracy (15-15 December)