Author: Stefania

“Contentious data” paper development workshop hosted by Davide and Stefania (November 12-13)

On November 12-13, Davide and Stefania will host a paper development workshop where the prospective authors of the special issue Contentious Data: The Social Movement Society in the Age of Datafication will have the chance to discuss their draft papers. The issue is edited by Stefania Milan, Davide Beraldo and Cristina Flesher Fominaya, and will be submitted to the international peer-reviewed journal Social Movement Studies. Due to the pandemic, the workshop has been moved to Zoom.




NEW! “Techno-solutionism and the standard human in the making of the COVID-19 pandemic” (Big Data & Society)

The commentary “Techno-solutionism and the standard human in the making of the COVID-19 pandemic” by Stefania Milan has appeared in the Special Section “Viral Data” of the journal Big Data & Society. You can read it here. A video abstract is available on the journal website and YouTube as well. “Viral Data” is edited by Agnieszka Leszczynski and Matthew Zook and includes a number of important reflections, which can be accessed here.

Cite as: Milan, S. (2020). Techno-solutionism and the standard human in the making of the COVID-19 pandemic. Big Data & Society, 7(2). doi:10.1177/2053951720966781

October 28th at 14:00: PhD defence of Niels ten Oever: How to align the Internet infrastructure with human rights?

On October 28th at 14:00, Niels ten Oever will defend his dissertation titled ‘Wired Norms: Inscription, resistance, and subversion in the governance of the Internet infrastructure’. In his dissertation Niels analyzes the governance of the Internet infrastructure and the role norms play in it. While the governance of earlier information networks, such as the telephone and the telegraph, was done by nation states, the Internet is governed in so-called private multistakeholder bodies. This research analyzes how social and legal norms evolve, are introduced, subverted, and resisted by participants in Internet governance processes in order to develop policies, technologies, and standards to produce an interconnected Internet. The research leverages notions and insights from science and technology studies and international relations and combines quantitative and qualitative methods to show that the private multistakeholder Internet governance regime is designed and optimized for the narrow and limited role of increasing interconnection. As a result, the governance regime resists aligning Internet infrastructure with social or legal norms that might limit or hamper increasing interconnection.

The defence will be streamed online on October 28th at 14:00; the streaming URL will be announced on Niels' Twitter account, and his PhD dissertation and other writings are available on his website.

[BigDataSur-COVID] When Health Code becomes Health Gradient: Safety or Social Control?

“Please show your Health Code.” Almost all public places in China now post such requests at their entrances. The Health Code, a three-color application, has been rolled out to control people's movements and curb the spread of the coronavirus. A local government has since proposed a Gradient Health Code that would rank citizens based on smoking, sleeping, and medical records.

English translation by Giulia Polettini – Read in Chinese

by Yiran Zhao

“Please show your health code.” A few months after the spread of COVID-19, this request is posted at the entrance of almost all public places in China. The health code is available through Alipay [a payment platform widely used in China], through mini-programs in WeChat [China's number-one app, literally used for anything, from blogging and instant messaging to ordering food, booking flights, and paying], and through local apps and other platforms. It can only be obtained after the government service platform has acquired your full name, ID number, and phone number. A green code means you are free to go; a yellow or red code signals different levels of risk of having contracted the new coronavirus.

Authority means ownership

After your identity is validated, your personal health code appears within a few seconds. The underlying logic draws on factors such as your movement history, the time spent in risky areas, and your relationships with potential carriers in order to evaluate each person's level of hazard, although the specific algorithm has not been published.
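Since the specific algorithm has not been published, any reconstruction is pure guesswork. As an illustration only, the factors described above could feed a simple rule-based classifier along these lines (all thresholds, field names, and rules below are hypothetical, not the actual system):

```python
# Hypothetical sketch of a traffic-light health-code classifier.
# The real algorithm is unpublished; the factors come from the post,
# but thresholds and names here are illustrative assumptions only.

GREEN, YELLOW, RED = "green", "yellow", "red"

def classify(days_in_risk_area: int,
             close_contact_with_case: bool,
             confirmed_or_symptomatic: bool) -> str:
    """Map the factors described above to a three-color code."""
    if confirmed_or_symptomatic:
        return RED          # highest risk: confirmed or symptomatic
    if close_contact_with_case or days_in_risk_area >= 1:
        return YELLOW       # some risk: movement or contact history
    return GREEN            # free to go
```

A call such as `classify(0, False, False)` would return `"green"`, the code required at every entrance; any time spent in a risk area or contact with a carrier downgrades the result.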

While the classification performed by social media platforms is based on information filled in by individuals themselves (interests and precise personal behaviours that are constantly fed into the system), the health code emerges as a new type of control system: one that requires no autonomy to fill out or engage with. The birth of the health code cannot be located in the moment you authorize it, but before that. Moreover, the environment it operates in does not follow your habitual actions; it is instead a civic obligation of a mandatory nature: “Without a green code you cannot enter.”

Foucault called discipline the politics of power: by establishing the relationship between discipline and rules, he showed how the value of power is created. In this sense, the debate over individual privacy and safety takes the right to health security and transfers its target onto the individual person, steering it towards a panoptic operational model.

Apart from hospitals and public transportation, some big companies and even small private groups and forums have started to use the green health code to perform their own social duty and to safeguard the security of their event attendees. The green code symbolizes a decentralized, panoptic model of power.

Data can be wrong too

When access to the network can be viewed as a kind of human right, it is very difficult to accurately identify the position of marginalized people drifting outside the data. Those who live in impoverished areas, or the elderly, who do not know how to obtain and generate a health code, will be turned away from public transportation and other buildings. Moreover, regional display and tracking mainly rely on phone numbers, but real-name registration of phone numbers began earlier than the Internet era, so the person tracked via a phone number sometimes does not correspond to the originally registered person. The rating of risk levels and the division of regions is even harder to handle: when Beijing is divided into orange areas, locating people living in adjacent areas depends closely on the accuracy of positioning based on cellular base stations, Wi-Fi coverage, GPS, and Bluetooth.

What is more costly than wrong data is that there is no manual to guide you in changing your status. After being classified as orange, all you can do is spend the quarantine period at home. The inability to contest wrong data by yourself compounds the authoritative status of these data.

In this picture, the individual health code shows a score of 88/100, corresponding to a green health code, calculated from the number of steps taken, the number of cigarettes smoked, the amount of alcohol consumed, and the hours of sleep reached in one day by its owner.


In this picture, the company/group health code shows a score of 78/100, corresponding to a light-green health code, calculated from the behaviour of the company's employees, such as the number of steps taken, the number of hours slept, the rate of yearly health check-ups taken, and the rate at which employees' chronic diseases were kept under control.
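The captions above suggest a weighted score over daily behaviours. As a toy reconstruction (the weights, caps, and function name below are invented for illustration, not the app's actual formula), such a personal gradient might be computed as:

```python
# Toy reconstruction of a personal "health gradient" score (0-100).
# The input factors come from the caption above; the weights are
# invented assumptions and do not reflect the real system.

def health_gradient(steps: int, cigarettes: int,
                    alcohol_units: float, sleep_hours: float) -> int:
    score = 100.0
    score += min(steps / 1000, 10)     # reward walking, capped at +10
    score -= cigarettes * 2            # penalise each cigarette
    score -= alcohol_units * 3         # penalise each unit of alcohol
    score -= abs(8 - sleep_hours) * 2  # penalise deviation from 8h sleep
    return max(0, min(100, round(score)))

# With these toy weights: 8000 steps, 5 cigarettes, 2 drinks,
# 6 hours of sleep -> 100 + 8 - 10 - 6 - 4 = 88
print(health_gradient(8000, 5, 2, 6))  # 88
```

The point of the sketch is not the numbers but the mechanism: once everyday behaviours are fed into a single 0-100 index, any threshold drawn on that index becomes a gate.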

Health lies beyond three colors

On May 22nd, the Health Commission of Hangzhou municipality, where Alibaba is headquartered, proposed to use electronic medical records, health check-ups, and other related data to set up a personal health gradient index, ranked from 0 to 100, and to release a collective appraisal of healthy groups within corridors, communities, and enterprises. Although this was only an official plan, it was enough to astonish people already accustomed to the health code.
After that, once the right to privacy has been crossed in order to manage public health and people have been labelled with “three colors” for ease of management, the trend may well continue with “progress”, “competitiveness”, and “performance” gradients. Will the “chromatic gradient” of private health data be taken up by a wider commercial capitalism? What kinds of “inequality” will employee health data cause?
We have to admit the possibility that, after COVID-19, people have begun to adapt to this state of exception and to habitually hand over personal privacy for big data to take control. It may even be that the existence of “people” itself will come to lie on the spectrum of the chromatic gradient, labelled with an (R, G, B) pattern [the color model used by most devices].


About the author

Yiran Zhao lived in China for almost twenty years and then moved to Taiwan to get a Bachelor’s degree. Now she is in the Netherlands as a Research Master student in Media Studies at the University of Amsterdam.

[BigDataSur-COVID] 走向渐变的健康码:这是安全,还是控制?

“Please show your Health Code.” Almost all public places in China have posted such requests at the entrances nowadays. Health Code, a three color-based application, is rolled out to control people’s movements and curb the coronavirus’s spread. A local government then proposed a Gradient Health Code to rank citizens based on smoking, sleeping, and medical records.

Read in English

by Yiran Zhao

About the author

Yiran Zhao lived in China for almost twenty years and then moved to Taiwan to get a Bachelor’s degree. Now she is in the Netherlands as a Research Master student in Media Studies at the University of Amsterdam.

Stefania on surveillance at DIG Awards (Modena, 9 October)

On October 9th, Stefania is in Modena (Italy) to talk about society and surveillance at the investigative journalism festival DIG (Documentari Inchieste Giornalismi) Awards. She joins on stage Philip Di Salvo (Università della Svizzera italiana), Veronica Barassi (University of St. Gallen), Riccardo Coluccini, and Gabriella “Biella” Coleman (McGill University) in a panel asking “Can we live a life without surveillance?”. DIG is the biggest European investigative journalism festival.

[BigDataSur-COVID] COVID Data on the Fringes: the Scottish story

by Angela Daly


COVID hit at a febrile time for the United Kingdom, in the context of its exit from the European Union and ongoing issues over the devolved nations, particularly Northern Ireland, with its land border with the Republic, and Scotland, both of which had voted to remain in the EU in the 2016 referendum. Scotland had held its own referendum on independence from the UK in 2014, which was won, albeit fairly narrowly, by the ‘No’ side. While a pro-Brexit right-wing Conservative government rules in London, the devolved administration in Edinburgh is led by the centre-left Scottish National Party (SNP) and First Minister Nicola Sturgeon.

However, when the pandemic first hit the UK in the early months of 2020 there was no discernible difference in approach between the Scottish Government and the UK Government. In March 2020 both Scotland and the wider UK imposed lockdowns later than other European countries, and in mid-March both abandoned manual contact tracing, around the same time that big tech firms such as Palantir were invited to meetings with the UK government. Later that month, NHSX (the English public health service unit tasked with setting policy and best practice for digital technologies and data in health) started developing a contact tracing app amid cries of digital triumphalism and technodeterminism from the Johnson administration in London, which claimed we could digitise our way out of the pandemic.

Health is a devolved power in the UK, so the Scottish Government has full responsibility for health policy in Scotland. In May we began to see divergence between Scotland and the wider UK on pathways out of lockdown (Scotland has generally taken a more cautious approach than the UK government) and also on data, with the publication of the Test, Trace, Isolate, Support policy. It signalled the relaunch of Scotland's own contact tracing scheme, foregrounding manual contact tracing which may then be supplemented by a ‘web-based’ digital ‘tool’, pointedly not an app.

But data in the context of COVID is not just contact tracing and apps, even though they have been the focus for significant debate and advocacy. The data which government releases and restrains about COVID infections and prevalence is also key to informing political debates and personal choices, and the situation in Scotland presents a complex picture of the tensions between health, the economy and politics both at the local level and as a snapshot of more global tensions in the pandemic response.

Contact tracing and the app

Scotland's (belated) approach to contact tracing is one of the most prominent examples of its divergence with the UK central government on COVID data policy. From May, Scotland set up its own contact tracing system, building capacity in its public healthcare service (NHS), in contrast to the outsourcing of this service to private companies that has occurred in England. The Scottish Government also expressed its reservations about the NHSX app and the lack of consultation with devolved administrations. However, it still came as a surprise in August when the Scottish Government announced that it was launching a contact tracing app and would be adopting the Republic of Ireland's model and software, developed by Irish company Nearform. The Northern Irish administration has also adopted this model, which makes sense for political and geographical reasons, principally the land border with the Republic. The Scottish Government's decision to adopt the app is more overtly political, inasmuch as its land border is with England rather than Ireland. However, the RoI app is reasonably privacy-protecting through its adoption of the Google-Apple protocol and decentralised design, is purpose-limited, and already has a track record of functioning reasonably well; the same cannot be said of the original NHSX app. Even the NHSX app's current incarnation, released after the Scottish app, still seems to be suffering from malfunctions.

The Scottish Government may have adopted the RoI app for politically pragmatic reasons, but it leaves the nation in a position where it has followed the lead of another nation-state (Republic of Ireland) rather than its own central government in London, leading to a ‘Gaelic Fringe’ approach to apps and contact tracing across the (contested) borders of nation-states in the islands of Britain and Ireland. The outcome of this approach may be de facto the establishment of Scotland’s digital sovereignty in a similar way to the movement in Catalonia, another separatist region in Spain. This is all the more significant given Scottish Parliament elections in 2021, which the SNP are tipped to win by a landslide, and calls for another independence referendum, with polls consistently showing a pro-independence vote in the lead.

Yet the need to adhere to the Google-Apple protocol in order to create functioning apps does limit political entities’ digital sovereignty, both of Scotland and full nation-states which have had to use this protocol for their own apps. The Google-Apple protocol has promoted a measure of privacy protection lacking, for instance, from the UK Government’s initial NHSX app, but the need to adopt this protocol for a successful app demonstrates and reinforces the power of big tech firms.

Photo credits: Stephen McLeod Blythe (@stephenemm)

Government transparency

The Scottish Government has undoubtedly been more transparent about its COVID app than its counterparts in London have been about the NHSX app and the involvement of big tech firms in providing digital infrastructure for the pandemic, as a series of openDemocracy investigations have demonstrated.

However the Scottish Government does not have a flawless record on its own transparency during this period. In Scotland freedom of information (FoI) laws were ‘relaxed’ at the outbreak of the pandemic in April, allowing government agencies a threefold extension to their deadlines for responding to freedom of information requests. These measures were strongly criticised at the time. Even the UK government did not relax FoI laws to the same extent. The Index on Free Expression criticised the Scottish Government, comparing it to Bolsonaro’s Brazil for its restrictions of freedom of information rights during the pandemic.

Access to public data and information extends beyond FoI. Who is infected with COVID, who has died from COVID, and where, have been key questions for understanding whether certain groups have been more impacted than others. In England, people from Black and Minority Ethnic (BAME) backgrounds have been more susceptible to infection and death from COVID for a number of reasons, including socio-economic circumstances, structural racism and pre-existing health inequalities. Scotland has a significant minority population of South Asian origin, and there was anecdotal evidence in spring 2020 that this community was experiencing a disproportionate number of COVID deaths. Scottish NGO the Coalition for Racial Equality and Rights (CRER) raised concerns about the lack of data on this issue and the poor quality of the data that did exist. Finally, in July the National Records of Scotland published a study on ethnicity and COVID in Scotland which found that South Asian people were 1.9 times more likely to die of COVID. This is in line with outcomes in other parts of the UK, but the Scottish data was made available later than elsewhere. CRER is still calling for more and better data to be generated and released on COVID and ethnicity in Scotland.

Data and marketization

For contact tracing the Scottish Government has followed a less neoliberal and privatised approach to England, where these functions have been outsourced to private companies (ironically including some located in Scotland, one of which itself experienced a COVID outbreak). However marketization and privatisation of other public functions have had an obfuscating impact on what data is available to the public in Scotland.

Like elsewhere in the UK, and in other western countries, care homes for the elderly and disabled have been severely impacted by COVID, with many residents dying of the disease. One notorious example was the private Home Farm care home on the Isle of Skye, run by HC-One, one of the UK's largest care home providers, where ten residents died of the virus. Care home regulatory bodies in both England and Scotland have refused to make public the numbers of deaths in specific care homes, with part of the justification being that this would negatively affect providers' commercial interests.

While so far not as deadly, marketised universities in Scotland, like those in the rest of the UK, brought students back to campus for the start of the new academic year (in some cases with all teaching still online) and have experienced COVID outbreaks in shared student accommodation since September 2020. Information about COVID cases among campus communities has been patchy, with some institutions releasing this data and others not, leading two University of Sussex academics to set up the UniCOVID site to track developments. Universities seem to be becoming more forthcoming about tracking their own COVID outbreaks and releasing data publicly; however, there is no systematic way in which this is being done, and not every institution readily provides this data. Marketisation of this public service has led to students returning prematurely to campuses and may have contributed to institutions' reticence in compiling and publicising data about COVID cases.

There are aspects of the Scottish digital story which demonstrate a clearly different path from that of the UK central government, notably the approach to contact tracing, which remains within the public health service rather than being outsourced to private providers, yet one which also represents a radical alignment with Dublin on the app. Along with the Belfast administration's embrace of the Nearform software, we see a Gaelic Fringe approach to contact tracing apps emerging, one which is also in line with European standards more generally and thus represents a further cleavage with the pro-Brexit London government. While the Scottish Government may have adopted this approach for pragmatic reasons, in outcome it may be seen as a further step towards Scotland's digital sovereignty in some senses, but it also shows the limits of this sovereignty inasmuch as the Google-Apple protocol is respected.

The worst excesses of the UK government's privatised and digitised COVID response are not replicated in Scotland, but equally things have not been perfect either. Transparency, who is counted in data, and what data is available to the public have all been influenced negatively by logics of privatisation and marketization in public functions, particularly in care homes. The needs of ethnic minorities to be counted and visible in data, when COVID has disproportionately affected them, were not adequately addressed by the Scottish Government.

Scotland shows the potential for the margins to forge different paths on data than the cores, but also the limits of doing so in a world of big tech, neoliberal logics and inequalities. Groups such as CRER, demanding more and better data on COVID and ethnicity in Scotland, and the wider UK initiative UniCOVID, providing data on university outbreaks including in Scotland, show the kinds of bottom-up data activism emerging in the COVID context. With COVID, data is power, data is political, and this is as true in Scotland as it is elsewhere.


About the author

Dr Angela Daly is Senior Lecturer in Strathclyde Law School and Co-Director of the Strathclyde Centre for Internet Law & Policy in Glasgow, Scotland. As a ‘critical friend’ she has advised the Scottish Government on data in its COVID-19 response as a member of the COVID-19 Data Taskforce and a board member of Research Data Scotland. She co-edited the open access book Good Data in 2019.



Stefania at the kick-off of “Global Digital Cultures” (2 October)

The University of Amsterdam has a new Research Priority Area dedicated to exploring “Global Digital Cultures”. Global Digital Cultures is an interdisciplinary research community for comparing and analyzing the profound changes brought about by digitization around the globe. Read more here.

The kick-off event of the new Research Priority Area featured a keynote by Prof. Louise Amoore (Durham University) in conversation with our PI Stefania Milan, along the lines of Amoore’s latest book on “Cloud Ethics” (Duke University Press, 2020).

[BigDataSur-COVID] Contact Tracing Apps: Friend or Foe?

by Alexandra Elliott

A man decides to go grocery shopping during the COVID-19 pandemic. Despite his best efforts to be cautious, he comes within 1.5 metres of another shopper while reaching for a basket, another when selecting his milk, two more when squeezing through the crowded cereal aisle, and the cashier as he pays for his groceries. He then returns home to hugs from his wife and three kids. A few days later the man tests positive for the coronavirus, and all those he came into contact with may potentially also be sick. Each of these people has their own web of contacts, of which every person has another web, and so on. And these are only the contacts made within one hour.

Contact tracing is essential for detecting cases of COVID-19, enabling early treatment, reducing further contamination, and ultimately overcoming the pandemic. However, it is clear that doing so is no mean feat. With total cases worldwide exceeding 7 million, it seems reasonable to enlist the assistance of technology in contact tracing efforts. So why is there so much contention over the implementation of contact tracing apps? Consider this a summary guide.

In an attempt to assess whether contact-tracing technology should be met with approval, I will position it within the academic notion of Good Data. By explaining how contact tracing works, through a case study of Australia's COVIDSafe app, I hope to reach an understanding of why this technique is an essential tool in flattening the curve of the coronavirus, “a strategy that goes hand-in-hand with economic recovery and reducing the isolation recommendations that are currently in place”. We will then explore many of the concerns and controversies preventing unanimous enthusiasm over the process, presenting both the arguments and their rebuttals to deliver a comprehensive portrait of the matter.

With the rise in suspicion over Big Tech and its manipulative and invasive data practices, a counteractive field of academia has developed, focusing on ethical uses of data. There are discrepancies in the terminology and definitions found within the discourse (responsible data, good data, data justice), and many scholars have called for a unified understanding in order to accelerate the field's ideas and implications. I have chosen to umbrella these concepts under the label Good Data.

One account of the central ideas of the field can be found in the work of Taylor and Purtova (2019). They divide data justice into data responsibility and data sustainability: the former covering the impact of data on the user (for example, matters of privacy and bias), and the latter referring to utilising data for the benefit of society, i.e. data for humanitarian, not capitalist, purposes. We will continue this piece by exploring how contact tracing apps fit into this model as an example of Good Data.

Big Data as Public Good

To illustrate how contact tracing apps are a sustainable data practice they can be understood as an implementation of Big Data as public good.

The information big data provides can be utilised for the benefit of society. Within relevant fields of academia this action is referred to as ‘data as public good’. However the reality of data access for humanitarian purposes is difficult to achieve due to the clashing responsibilities and ambitions of the various actors involved. Taylor (2016) and Ritchie and Welpton (2011) have both attempted to navigate these relationships and assess the likelihood of the exposure of personal datasets to benefit humanity.

Many papers dissect the data collection and analysis ecosystem of mobile operators and other Big Tech companies. If, upon release, data held by corporations can “promote social good”, such as alerting emergency responders, then it should be made available, but this may not be in the best interests of the data's private owners. The responsibility for any breach of privacy lies with the data owner, who becomes hesitant to release information for fear it soils their reputation. We therefore encounter a block in sharing data for humanitarian endeavours.

COVIDSafe provides an alternative, more harmonious, model of data as public good by eliminating private ownership. The data is collected and analysed by the Australian government for the benefit of the Australian people. The government accepts the responsibility of individuals’ privacy. Unlike other cases involving numerous differing parties who collect the data and who analyse and use the data, the government’s goals align with the goals of the research – protecting the Australian population. There is no longer a need for repurposing. Through COVIDSafe an entirely new dataset is being collected, designed for the purpose of contact tracing and therefore facilitating the process of data for the public good.

Contact Tracing

Contact tracing involves identifying those who have been in contact with an infectious person so that they can isolate themselves and halt the spread. The process ultimately seeks to control the spread of a disease or virus and can be automated by smartphone tracking apps.

This tracking can be conducted over either Bluetooth or GPS. Bluetooth options offer more privacy, as they do not record the location at which contact occurred. Alternatively, others argue for GPS and its ability to identify hot spots. Until recently, Apple's iOS software blocked Bluetooth from running in the background of apps. This would have rendered contact tracing apps ineffective, as the app needed to be open at all times to detect contacts. Apple has now removed that restriction, thus supporting the development and use of such applications.

Once in operation a phone with a contact tracing app will send out a code through Bluetooth to any other phone, also with the app, which comes within a detectable distance. An example developed alongside the Australian government is COVIDSafe.


Australia's government and health authorities have adopted the COVIDSafe app as a tool to contain and hopefully overcome the coronavirus in the country. Its endorsement has been strong, with widespread advertisements encouraging Australians to download the app and Prime Minister Scott Morrison appealing to the public with assurances that the more people use the app, the more quickly the pubs can reopen.

COVIDSafe works by recognising other devices in its proximity with the app installed: “it notes the date, time, distance and duration of the contact and the other user's reference code”. The reference code is anonymous and refreshed every two hours, the data collected is encrypted, and the information is deleted after 21 days (a period which covers both incubation and testing).
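Based only on this public description, the app's core bookkeeping (a rotating anonymous reference code plus a contact log purged after 21 days) can be sketched as follows. The class and method names are ours for illustration, not COVIDSafe's actual code:

```python
# Sketch of COVIDSafe-style contact logging, based only on the public
# description: anonymous reference codes rotate every two hours and
# contact records are deleted after 21 days. All names are illustrative.
import secrets
from datetime import datetime, timedelta

ROTATION = timedelta(hours=2)
RETENTION = timedelta(days=21)   # covers both incubation and testing

class ContactLog:
    def __init__(self):
        self._code = secrets.token_hex(8)        # anonymous reference code
        self._code_issued = datetime.utcnow()
        self._contacts = []  # (timestamp, other_code, distance_m, duration_s)

    def current_code(self, now=None):
        """Return the reference code, rotating it every two hours."""
        now = now or datetime.utcnow()
        if now - self._code_issued >= ROTATION:
            self._code = secrets.token_hex(8)
            self._code_issued = now
        return self._code

    def record(self, other_code, distance_m, duration_s, now=None):
        """Note date/time, distance and duration of a nearby contact."""
        self._contacts.append((now or datetime.utcnow(),
                               other_code, distance_m, duration_s))

    def purge(self, now=None):
        """Drop records older than the 21-day retention window."""
        now = now or datetime.utcnow()
        self._contacts = [c for c in self._contacts
                          if now - c[0] < RETENTION]
```

The privacy properties discussed below hinge on exactly these two mechanisms: rotation limits how long any single code can be linked to a person, and the retention window bounds how much of the "social contacts map" exists at any time.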

From both the COVIDSafe website and statements by the government, it is clear that those involved are aware of users' apprehension about infringements of privacy and wish to resolve such concerns. The Guardian recently conducted a survey which found 57% of respondents to be anxious about the security protecting their personal information.

In an attempt to quell concerns, both a Privacy Policy and a Privacy Impact Assessment Report are available to read, and users may opt out at any time and request the immediate deletion of their records. Furthermore, it is a criminal offence, punishable by a five-year jail sentence, to use the data collected for any purpose other than contact tracing (including law or isolation enforcement) or by any actors other than those delegated.

Regardless of these protections, COVIDSafe is not open-source software, prompting critics to argue that it “is not subject to audit or oversight”. The reason privacy protection is so critical is that the data collected constructs a “comprehensible social contacts map of the nation”. A dataset of Australians' behavioural patterns could become a valuable resource for a range of purposes, from marketing opportunities to more malicious regimes.

Since its appearance in the app store, COVIDSafe has experienced a number of setbacks, including hoax texts distributed to users reading “the COVIDSafe app has detected you are now +20km from your nominated home address” and the revelation that users' phone make and model was communicated unencrypted. There was also backlash in the media over the choice to store the data with the American-owned Amazon Web Services (AWS) rather than Australian providers fit for the purpose. As well as the lost opportunity to support local businesses (particularly necessary during the pandemic), concerns were raised over information being accessed by American entities, due to legislation approving US government access to data held by any US-owned company. However, there is some ambiguity surrounding the matter, as AWS is already used for a range of Australian federal operations and the transferring of COVIDSafe data to any other country is prohibited under the Biosecurity Act.

Research has confirmed that certain user numbers must be attained before contact tracing apps can be labelled as effective. The University of Oxford conducted an experiment on a simulated city, finding that 80 per cent of smartphone users in the UK, or 56 per cent of the population, must use the app if it is to succeed in curbing the spread of the coronavirus. Unfortunately, such uptake cannot be enforced: keeping the download voluntary is important to maintaining civil liberties.
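The two Oxford figures imply a UK smartphone-ownership rate of roughly 70 per cent, since 56 is 80 per cent of 70. A quick back-of-the-envelope check makes the arithmetic explicit (the 70 per cent ownership rate is inferred here, not stated in the study):

```python
# Back-of-the-envelope check of the Oxford uptake figures.
# Assumption (inferred, not taken from the study): ~70% of the
# UK population owns a smartphone, since 0.56 / 0.80 = 0.70.
smartphone_ownership = 0.70
required_share_of_smartphone_users = 0.80

# Share of the *whole population* that must install the app.
required_share_of_population = smartphone_ownership * required_share_of_smartphone_users
print(f"{required_share_of_population:.0%}")  # → 56%
```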

User numbers may be inhibited by scepticism throughout society towards both the government's and Big Tech's use of surveillance to monitor our daily routines, and a consequent reluctance to participate. Furthermore, there is a high correlation between those without a smartphone and those at high risk of contracting COVID-19 – particularly the older generation and those from low-income brackets. Contact tracing apps therefore fail to detect and protect many, potentially severe, cases.

Another issue is the limitations of the technology infrastructure. The Bluetooth range extends beyond 1.5 metres and also permeates through walls, creating false positives. Numbers may also be inaccurately inflated through “self-diagnosing incorrectly or worse, trolls spamming the system”. False positives need to be avoided not only for the efficiency of the operation but also so as not to lose the faith of its users.
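COVIDSafe's exact proximity logic is not public, but Bluetooth apps commonly infer distance from received signal strength (RSSI). A minimal sketch of the standard log-distance path-loss model (all parameter values here are illustrative, not taken from COVIDSafe) shows why walls and reflections distort such estimates:

```python
def estimate_distance(rssi_dbm: float,
                      tx_power_dbm: float = -59.0,
                      path_loss_exponent: float = 2.0) -> float:
    """Estimate distance in metres from RSSI via the log-distance path-loss model.

    tx_power_dbm: calibrated RSSI at 1 metre (illustrative; varies per device).
    path_loss_exponent: ~2.0 in free space; walls, bodies and reflections
    change the real exponent, skewing the distance estimate either way.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# At the calibrated power, the model returns exactly 1 metre.
print(round(estimate_distance(-59.0), 2))  # → 1.0

# A 6 dB weaker signal reads as ~2 metres: a wall can attenuate the signal
# of a nearby phone (missed contact), while a strong reflection can make a
# distant phone appear within 1.5 metres (false positive).
print(round(estimate_distance(-65.0), 2))  # → 2.0
```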

In conclusion

If contact tracing apps are assessed on their ethical purpose, COVIDSafe, with its intention of helping to end a pandemic, would be considered a golden example of Good Data. However, following Taylor and Purtova (2019), it is not only sustainability but also responsibility that must be met to attain a holistic Good Data practice. Concerns over confidentiality and inaccuracies prevent contact tracing apps from easily being categorised as Good.

However, what if the equal weighting of responsibility and sustainability is not fixed? Extenuating circumstances often mean we must prioritise and compromise. Contact tracing apps are an example of forgoing responsibility towards the individual for the sustainability of the whole.

Additionally, incorporating decentralised storage, allowing people to choose from a pool of suppliers that align with their values, providing an exit strategy so that data is not stored after the virus has passed, and inviting collaboration to spur innovation could construct a more trustworthy model of contact tracing. Trust has become particularly salient: do we trust our government and health services to utilise this data for the benefit of the public? It is important to maintain perspective and remember what is at stake: “it sounds like a dystopian surveillance nightmare that could also save millions of lives and rescue the global economy”. In times of crisis we may comply with conditions otherwise worth challenging. Sacrifices and personal discomfort may be necessary and worthwhile if they lead to overcoming and healing from COVID-19.


About the Author

Alexandra grew up in Sydney, Australia before moving to England to complete her Bachelor's degree at Warwick University. She is currently undertaking a Research Master's in Media Studies at the University of Amsterdam. It is through this course that she became involved with the Good Data tutorial and the DATACTIVE project.


NEW chapter on mailing-list analysis by Niels, Stefania & Davide (LIVE presentation on September 22!)

Niels ten Oever, Stefania Milan and Davide Beraldo co-authored the chapter “Studying Discourse in Internet Governance through Mailing-List Analysis” for the book Researching Internet Governance: Methods, Frameworks, Futures, edited by Laura DeNardis, Derrick Cogburn, Nanette S. Levinson and Francesca Musiani (MIT Press, 2020). The volume is open access, and you can read it all by following this link.

To celebrate the release of our new book on September 22, 2020 at 12pm EST on Zoom, some of the authors will present their chapters, including Niels. Register here to attend. Anriette Esterhuysen (Chair of the IGF’s Multistakeholder Advisory Group and Senior Advisor for Global and Regional Internet Governance at the Association for Progressive Communications) will be moderating the discussion. Presenters include Sandra Braman, Milton Mueller, Ron Deibert and Jeannette Hoffman.