Author: Stefania

Stefania at AlgoSov Summit in Copenhagen, 8 September

On the 8th of September Stefania will give a talk at the Algorithmic Sovereignty Summit in Copenhagen, in the framework of the TechFestival, a festival “to find human answers to the big questions of technology”.

The Summit is an initiative of Jaromil Rojo from Dyne.org, who also sits on DATACTIVE’s Ethics Advisory Board. The summit kickstarts the European Observatory on Algorithmic Sovereignty.

[BigDataSur] My experience in training women on digital safety

by Cecilia Maundu

I remember it was September 2015 when I was invited for a two-day workshop on digital safety by the Association of Media Women in Kenya. At first I was very curious because I had not heard much about digital security. The two-day workshop was an eye opener. After the workshop I found myself hungry for more information on this issue.

Naturally, I went online to find more information. I must say I was shocked at the statistics I came across on the number of women who have been abused online, and continue to suffer. Women were being subjected to sexist attacks. They were attacked because of their identity as women, not because of their opinions. I asked myself what can I do? I am well aware that I am just a drop in the ocean, but any little change I can bring will help in some way. That was a light bulb moment for me.

It was in that moment that I knew I wanted to be a digital safety trainer. I wanted to learn how to train people, especially women, on how to stay safe online. The internet is the vehicle of the future. This future is now, and we cannot afford for women to be left behind.

Online violence eventually pushes victims to stay offline. It is censorship hidden behind the veil of freedom of expression.

After this realization, I embarked on the quest to become a digital safety trainer. As fate would have it, my mentor Grace Githaiga came across the SafeSister fellowship opportunity and sent it to me. I applied and got into the program. The training took place at Ngaruga Lodge in Uganda. The venue was beyond serene. The calm lake waters next to the hotel signified what we want the internet to be for women: a calm place and a safe space where women can express themselves freely without fear of being victimized or having their voices invalidated.

On arrival we were met by one of the facilitators, Helen, who gave us a warm welcome. The training was conducted by five facilitators, all of whom were women.

The training was student-friendly. The topics were broken down in a way that allowed everyone to understand what was being discussed. Each facilitator had her own way and style of delivering the different topics, from using charts to PowerPoint presentations. I must say they did an exceptional job. I got to learn more about online gender violence and how deeply rooted it is in our society, and hence the importance of digital security trainings.

Being a trainer is not only about having digital safety skills; it also requires you to be a well-rounded person. While giving a training you are bound to meet different types of people with different personalities, and it is your duty to make them feel comfortable and to make sure the environment around them is safe. It is in this safe space that they will be able to talk and express their fears and desires, and, most importantly, be willing to learn. As a digital security trainer, you should first learn more about your participants and how much they know about digital security. This will enable you to package your material according to their learning needs.

Being a trainer requires you to read a lot on digital security, because this keeps you updated and allows you to relay accurate information to your trainees. As a trainer, it is also necessary to understand the value of hands-on training, because it gives participants the opportunity to put into practice what they have learnt. For example, when you are teaching about privacy settings on Facebook, you don’t just talk about them: you should ask the participants to open their Facebook accounts – if they have any – and go through the instructions step by step with them until they are able to complete the task. As a trainer, you may also meet a participant who does not give the rest of the group the opportunity to express their views, because they want to be the one talking throughout. The trainer needs to take charge and make sure that each participant is given an equal opportunity to talk.

Before the training we had each been given a topic to present, and mine was encryption; VeraCrypt, to be more specific. At first it sounded Greek to me, but I resorted to my friend Google for details (which begs the question: how was life before the internet?). By the time I was leaving Kenya for Uganda I had mastered VeraCrypt. We kept discussing our topics with the rest of the group, to the point where they started calling me Vera. To my surprise, my presentation went very well. The week went by so fast. Before we realized it, it was over and it was time to go back home and start implementing what we had learnt.

We continued receiving very informative material online from the trainers. In September 2017 they opened up a pool of funding where we could apply to fund a training in our different home countries. I got the funding, and chose to hold the training at Multimedia University where I lecture part time. The reason behind my choice was that this was home for upcoming young media women, and we needed to train them on how to stay safe online, especially since media women in Kenya form the majority of victims of gender based violence. They needed to know what awaits them out there and the mechanisms they needed to have to protect themselves from the attacks. The training was a success, and the young ladies walked away happy and strong.

The second, and last, part of SafeSister (I am barely holding back my tears here, because the end did come) took place in Uganda at the end of March 2018. It was such a nice reunion, meeting the other participants and our trainers after a year. This time the training was more relaxed. We were each given a chance to talk about the trainings we had conducted, the challenges we encountered, the lessons learnt and what we would have done differently. For me, the main challenge was time management: the trainers had prepared quite informative materials, so we ran over time, compounded by a 30-minute delayed start.

This was my first training, and one take-home for me as a digital safety trainer was that not all participants will be enthusiastic about the training, but one shouldn’t be discouraged or feel like they are not doing enough. The trainer just needs to make sure that no participant is left out, keep asking the participants questions, and invite their opinions on different issues regarding digital safety. As time progresses, they gradually become more enthusiastic and start feeling more at ease.

One thing I have learnt since I became a digital security trainer is that people are quite ignorant of digital security matters. People go to cybercafés and forget to sign out of their email accounts, or use the same password for more than one account, and then they ask you: “why would someone want to hack into my account or abuse me when I am not famous?” Such questions should not discourage you; on the contrary, they should motivate you to give more trainings, because people don’t know how vulnerable they are online when their accounts and data are not protected. Also, as a trainer, when you can and when the need arises, give as many free trainings as you can, since not everyone can afford to pay you. It is through these trainings that you continue to sharpen your skills and become an excellent trainer.

After the training we were each awarded a certificate. It felt so good to know that I am now a certified digital security trainer; nothing can beat that feeling. As they say, all good things must come to an end. Long live Internews, long live DefendDefenders. Asante sana (thank you very much). I will forever be grateful.

Cecilia Mwende Maundu is a broadcast journalist in Kenya, and a digital security trainer and consultant with a focus on teaching women how to stay safe online. She is also a user experience (UX) trainer, collecting user feedback and sharing it with developers.

Stefania keynotes at ‘The Digital Self’ workshop, King’s College London, July 6

On July 6, Stefania will deliver a keynote at the workshop ‘The Digital Self’, organized by the Departments of Digital Humanities and of Culture, Media & Creative Industries of King’s College London. This workshop focuses on how digital technology influences our daily lives, how it reshapes culture, and, as a result, how our identities as workers, consumers and media and cultural producers are changing. Stefania’s keynote is entitled ‘Identity and data infrastructure’. Read more.

Advisory Board Workshop, July 4-5

On July 4-5, DATACTIVE gathered its Advisory Board members for a sharing & feedback workshop.

Participants included Anita Say Chan (University of Illinois, Urbana-Champaign), Chris Csikszentmihályi (Madeira Interactive Technologies Institute), Ronald Deibert (University of Toronto), Seda Gürses (KU Leuven), Evelyn Ruppert (Goldsmiths, University of London), Nishant Shah (ArtEZ) … and the DATACTIVE team. Day 1 only: Hisham Al-Miraat (Justice & Peace Netherlands), Julia Hoffmann (Hivos).

*Day 1*
Fireside chat from 4pm @ Terre Lente, followed by light dinner [CLOSED]
Public event at 8pm @ SPUI25: ‘Democracy Under Siege: Digital Espionage and Civil Society Resistance’, with Ronald Deibert (Citizen Lab – University of Toronto), Seda Gürses (COSIC/ESAT – KU Leuven), and Nishant Shah (ArtEZ / Leuphana University)

*Day 2* @ the University Library, Singel 425 [BY INVITATION ONLY]
9.30 Welcome & coffee
9.45 Intro (Stefania)
Session 1 (10-11am): The framework: Concepts and Infrastructure (Presenters: Stefania & Davide)
Session 2 (11.10-12.15): Data as stakes (Presenters: Becky & Niels)
Lunch break (12.15-1.30)
Session 3 (1.30-2.35): Data as tools (Presenters: Guillen & Kersti)
Session 4 (2.40-3.45): Next in line: Emerging work (Presenters: Fabien, Jeroen, Quynn)
Session 5 (4-4.30) Wrap-up (Stefania, all)


Stefania at the Future Forum of IG Metall, Berlin

On June 19, Stefania will deliver a talk at the Zukunftsforum (‘Future Forum’) of IG Metall, the dominant metalworkers’ union in Germany and Europe’s largest industrial union. Stefania has been asked to reflect on how digitalisation and datafication change the dynamics of solidarity today. Check out the program of the day.

AGENDA Future Forum IG Metall 19.6.2018_final_eng

[BigDataSur] blog 3/3. Good Data: Challenging the Colonial Politics of Knowledge

In the previous instalments of this three-part series on the possibilities of “good data” (here and here), Anna Carlson concluded that notions of decentralisation and autonomy cannot do without an understanding of global inequalities and their politics. Moving away from Northern, individualist visions of digital utopia, she considers what can be learned from Indigenous data sovereignty initiatives as they address, from the South, the colonial legacies of global knowledge production.

The gathering of data has long been a strategy of colonialism. It is part of a set of practices designed to “standardise and simplify the indigenous ‘social hieroglyph into a legible and administratively more convenient format’ (Scott 1999, 3)” (Smith 2016, 120). Making something legible is always already political, and it always begs the question: legible for what, and to whom?

In Australia, making legible meant enumerating Indigenous “‘peoples’ into ‘populations’ (Taylor 2009); their domestic arrangements and wellbeing […] constrained within quantitative datasets and indicators that reflected colonial preoccupations and values” (Smith 2016, 120). No matter how ‘big’ the data set, data is never neutral. As Smith points out, Indigenous self-determination is increasingly linked to “the need to also reassert Indigenous peoples’ control and interpretation into the colonial data archives, and to produce alternative sources of data that are fit for their contemporary purposes” (Smith 2016, 120). Data is knowledge, and knowledge is power.

No matter how it is collected, data has long been used in ways that reinforce and sustain this colonial status quo. Data – often quantitative, often raw numbers – is used strategically. As Maggie Walter argues in a chapter in Indigenous Data Sovereignty: Towards an Agenda (2016): “social and population statistics are better understood as human artefacts, imbued with meaning. And in their current configurations, the meanings reflected in statistics are primarily drawn from the dominant social norms, values and racial hierarchy of the society in which they are created” (Walter 2016, 79).

Dianne E Smith expands in the same volume:

“Data constitute a point-in-time intervention into a flow of information of behaviour – an attempt to inject certainty and meaning into uncertainty. As such, data can be useful for generalising from a particular sample to a wider population […] testing hypotheses […] choosing between options (etc). […]  However, when derived from ethnocentric criteria and definitions, data can also impose erroneous causal connections and simplify social complexity, thereby freezing what may be fluid formations in the real world. In their unadorned quantitative form, data are hard-pressed to cope with social and cultural intangibles.” (2016, 119-120)

As such, argues Smith, questions about data governance – “who has the power and authority to make rules and decisions about the design, interpretation, validation, ownership, access to and use of data” – are increasingly emerging as key “sites of contestation” between Indigenous communities and the state (Smith 2016, 119).

One of the more interesting responses to these data challenges comes from the movements for Indigenous Data Sovereignty. C. Matthew Snipp (2016) outlines three basic preconditions for data decolonisation: “that Indigenous peoples have the power to determine who should be counted among them; that data must reflect the interests and priorities of Indigenous peoples; and that tribal communities must not only dictate the content of data collected about them, but also have the power to determine who has access to these data.” In practice, then, it means Indigenous communities having meaningful say over how information about them is collected, stored, used and managed. In Indigenous Data Sovereignty: Towards an Agenda, the editors have brought together a set of interesting case studies reflecting on different examples of Indigenous data sovereignty. These case studies are by no means exhaustive, but they do point to a set of emerging protocols around relationality and control that may prove instructive in ongoing attempts to imagine good data practices in the future.

The First Nations Information Governance Centre (FNIGC)

FNIGC in Canada is an explicitly political response to the role of knowledge production in maintaining colonial power relationships. The FNIGC articulates a set of principles that should be respected when gathering data about First Nations communities. These principles are “ownership, control, access and possession” (OCAP), and they are used by the FNIGC as part of a certification process for research projects, surveys and other data-gathering mechanisms. In some ways, they operate as a kind of “right of reply”, pointing to the inadequacies or successes of data-gathering practices. They aim, at least in part, to address the heavily skewed power relationship that continues to exist between Indigenous communities and the researchers who study them.

“By Maori for Maori” healthcare

This initiative attempts to incorporate Maori knowledge protocols into primary health care provision. Maori knowledge protocols are incorporated (to some extent) into the data collection, analysis and reporting tools used by Maori-run primary health care services to develop knowledge about, and responses to, health issues in Maori communities. These protocols are also used to develop and enable processes for data sharing across related networks – for example, with housing providers.

Yawuru data sovereignty

The Yawuru, traditional owners of the area around Broome in Western Australia, recognised that gaining control of the information that existed about them (by virtue of the extensive native title data gathering process) was crucial to maintaining their sovereignty. They also recognised the value of a data archive produced by and for Yawuru peoples. They undertook their own data collection processes, including a “survey of all Indigenous people and dwellings in the town to create a unit-record baseline.” They sought to create an instrument to “measure local understandings of Yawuru wellbeing (mabu liyan).” They digitally mapped their lands in order to “map places of cultural, social and environmental significance to inform a cultural and environmental management plan.” Finally, they sought to incorporate Yawuru knowledge protocols into the development of a “documentation project […] to collate and store all relevant legal records, historical information, genealogies and cultural information” in line with Yawuru law.

Beyond these discrete examples, data sovereignty is exercised in a variety of other forms. Much Indigenous knowledge exists entirely outside the colonial state, and is held and protected in line with law. Many of the knowledge keepers in Australia retain this data sovereignty, despite the increasing pressure to make such information public in order to access legal recognition. This speaks to precisely the dilemma of contemporary data politics, as “opting out” of the systems that collect our data is increasingly difficult in a world where giving up our data rights is too often a precondition to accessing the things that make life liveable.

“error 404. social life not found” at TEDxYouth AICS, June 4

On June 4, Stefania will give a TEDx talk at the TEDx Youth event of the Amsterdam International Community School. Entitled ‘error 404. social life not found. (but you can take it back)’, Stefania’s talk will contribute to this year’s theme ‘Next Nature’: what is the next nature of the human experience as we enter the technological age of big data, consumerism and automation? Read more on the TED website. You can also download the presentation.

[BigDataSur] blog 2/3: Digital Utopianism and Decentralisation: Is Decentralised Data Good Data?

In this second of three blog posts on the challenges of imagining ‘good data’ globally (read the first episode here), Anna Carlson considers a particular strand of technology-driven utopianism in the Global North: the idea of radically decentralised data. She writes that its latest instantiations – exemplified by blockchain – tend to forget the dire and unequally distributed ecological impact of digital technologies.

I was born in 1989, just a handful of years before mobile phones and laptop computers became the ubiquitous symbols of urban late modernity.  It is also the year that the terms “cybersecurity” and “hypertext markup language” (HTML) first appeared in print. In the heady, early years of the World Wide Web, the internet and the data it produced (and contained and distributed) offered a utopian vision of equitable globalisation, the promise to equalise access to knowledge, the possibility of a new kind of commons. Early adopters across the political spectrum celebrated the possibilities of decentralisation and autonomy, the opportunities for community-building and collectivity, for sharing economies outside the control of the state.

These notions of the common good mostly originate in the Global North, and their promises have never quite been fulfilled. They continue to re-emerge, however, and provide interesting food for thought in the process of imagining good (or at least, better) data practices for the future.

Blockchain is probably the most prominent technology to revive utopias of radical decentralisation. At its most basic, a blockchain is a peer-to-peer, decentralised technology that allows digital information to be seen and shared but not copied. Described by some as the “backbone of a new type of internet,” a blockchain is essentially a very complex, constantly updating, decentralised titling system: a shared database, tied to the object or site of interest, which the network constantly cross-checks and updates.
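The core mechanism behind this “titling system” – each new record embedding a cryptographic fingerprint of the previous one, so that past entries cannot be quietly rewritten – can be sketched in a few lines of Python. This is a toy illustration of the hash-chaining idea only, not any real blockchain implementation; the field names and transactions are invented for the example:

```python
import hashlib
import json

def hash_block(block: dict) -> str:
    """SHA-256 digest of a block's canonical JSON form."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(data: str, prev_hash: str) -> dict:
    """Create a block that embeds the hash of its predecessor."""
    return {"data": data, "prev_hash": prev_hash}

def chain_is_valid(chain: list) -> bool:
    """Every block's prev_hash must match the actual hash of the block before it."""
    return all(chain[i + 1]["prev_hash"] == hash_block(chain[i])
               for i in range(len(chain) - 1))

# Build a tiny three-block chain.
genesis = make_block("genesis", prev_hash="0" * 64)
block2 = make_block("Alice pays Bob 5", prev_hash=hash_block(genesis))
block3 = make_block("Bob pays Carol 2", prev_hash=hash_block(block2))
chain = [genesis, block2, block3]
assert chain_is_valid(chain)

# Tampering with an earlier block breaks every later link.
genesis["data"] = "tampered"
assert not chain_is_valid(chain)
```

What the sketch leaves out is exactly what the essay goes on to discuss: in a real network, many mutually distrustful peers must agree on which chain is authoritative, and that consensus work is where the complexity (and the energy) lies.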

The technology originally emerged as a way of keeping track of cryptocurrencies like Bitcoin, and proponents Don and Alex Tapscott describe it as “an incorruptible digital ledger of economic transactions that can be programmed to record not just financial transactions but virtually everything of value.” The blockchain is inherently decentralised in that it cannot be controlled by any single entity, but requires constant validation and authentication from the network.

Blockchain is often represented optimistically, if vaguely, as capable of supporting new economies, free from the shackles of corporate control. In Melanie Swan’s Blockchain: Blueprint for a New Economy, she goes even further: “the potential benefits of the blockchain are more than just economic – they extend into political, humanitarian, social, and scientific domains […] For example, to counter repressive political regimes, blockchain technology can be used to enact in a decentralised cloud functions that previously needed administration… (e.g.) for organisations like WikiLeaks (where national governments prevented credit card processors from accepting donations in the sensitive Edward Snowden situation).” She goes on to describe the possibility of blockchain technology as a basis for new economies dominated by (ahem) platforms like Uber and AirBnB.

So far, so good, right? The problem is that blockchains are incredibly unwieldy and immensely energy inefficient. To use the currency bitcoin as an example, the energy required for a single transaction far exceeds the amount needed for a more traditional transfer.  This is at least in part because the system is premised on hyper-individualistic, libertarian ideals.  In a recent article on Motherboard, Christopher Malmo writes that, with prices at their current level, “it would be profitable for Bitcoin miners to burn through over 24 terawatt-hours of electricity annually as they compete to solve increasingly difficult cryptographic puzzles to “mine” more Bitcoins. That’s about as much as Nigeria, a country of 186 million people, uses in a year.” At least in part, this energy is required because, as Alex de Vries suggests: “Blockchain is inefficient tech by design, as we create trust by building a system based on distrust. If you only trust yourself and a set of rules (the software), then you have to validate everything that happens against these rules yourself. That is the life of a blockchain node.”
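The “increasingly difficult cryptographic puzzles” Malmo refers to are proof-of-work searches: miners race to find a number (a nonce) that makes a block’s hash fall below a network-set target, and the only known strategy is brute-force guessing. A minimal Python sketch of the idea follows – the difficulty and block data are arbitrary choices for illustration, far below what any real cryptocurrency uses:

```python
import hashlib
from itertools import count

def proof_of_work(block_data: str, difficulty: int) -> int:
    """Brute-force a nonce whose SHA-256 hash starts with `difficulty` zero hex digits.

    Each additional required digit multiplies the expected number of
    guesses by 16 - this exponential growth is why mining energy use
    scales with network difficulty.
    """
    target = "0" * difficulty
    for nonce in count():
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce

# A low difficulty solves almost instantly on a laptop; real networks
# set targets so hard that specialised hardware races for them nonstop.
nonce = proof_of_work("example block", difficulty=4)
digest = hashlib.sha256(f"example block{nonce}".encode()).hexdigest()
assert digest.startswith("0000")
```

The point of the sketch is that the work is wasted by design: every hash that misses the target is discarded, and the whole network repeats this discarded work in parallel, which is the source of the Nigeria-scale electricity figures quoted above.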

An interesting example of the tensions of decentralised cryptocurrencies emerged recently when The New Inquiry released their newest project, Bail Bloc. Bail Bloc is a “cryptocurrency scheme against bail,” that delivers much-needed funds to the excellent Bronx Freedom Fund in order to help provide bail fees for defendants. They outline its function as follows: “When you download the app, a small part of your computer’s unused processing power is redirected toward mining a popular cryptocurrency called Monero, which is secure, private, and untraceable. At the end of every month, we exchange the Monero for US dollars and donate the earnings to the Bronx Freedom Fund.”

On the surface, taking “unused” resources and redistributing them equitably sounds like a good idea. Unfortunately, the missing information here is that the processing power is not really “unused”: what they mean is that it is unused by you, and that using it for this cause won’t inconvenience your personal use.

But there is a slippage here that seems important to clarify. Mining Monero requires a huge amount of electricity – much like the Bitcoin example above. As a result, this charity structure requires that we burn huge amounts of energy to produce a fairly minimal value, which is then redistributed. In many such cases, it would be easier and simpler to donate the money directly to the Bronx Freedom Fund. The flip side, of course, is that electricity doesn’t feel like a scarce resource in the way that money does, so generating money out of electricity feels like generating money out of nowhere.

It is easy to see what the New Inquiry designers are appealing to: the desire to effect positive change without having to really do anything at all. And it seemed to work: in the first 24 hours of the initiative’s launch, I saw numerous friends and colleagues sharing the page with suggestions that we establish similar schemes for causes closer to home.

The sense of practically and elegantly re-distributing otherwise unused resources through the magic of technology might be appealing, but it should not come at the expense of a broader understanding of global inequalities and planetary sustainability. Decentralised technology cannot sustain “good data” if the values that are encoded in it do not account for our shared stake in the world.

The question, then, is not whether decentralised data is good data, but under what terms and conditions it might be – and what politics of the collective (or commons) goes into shaping decentralised technology.

((to be continued. Next episode will be online on June 8th, 2018))