
Ration Shop, Trivandrum, India. Photo by the author

[BigDataSur] India’s Aadhaar: The datafication of anti-poverty programmes and its implications

By Silvia Masiero, Loughborough University

The notion of datafication implies rendering existing objects, actions and processes into data. Widely studied in the field of business intelligence, datafication is known to restructure consumer behaviour and the functioning of markets in multiple ways. But a less-widely researched aspect pertains to the datafication of public welfare and social protection programmes, on which the livelihoods of many poor and vulnerable people worldwide are based. The field of information and communication technology for development (ICT4D), which for more than thirty years has focused on the roles of informatics in development processes, is coming to realize the growing importance of datafication in the enactment of social policies.

Datafication acquires a particular meaning when referring to anti-poverty programmes, which are social protection schemes designed specifically for the poor. In such schemes, what is converted into machine-readable data is, first of all, the population of entitled users. This restructures two core functions of anti-poverty schemes. The first is the recognition of beneficiaries, automating the process that distinguishes entitled individuals and households from non-entitled ones. The second is the correct assignment of entitlements, based on the availability of machine-readable data for their determination. While both functions were previously paper-based or only partially digitized, datafication affords the power to automate them, with a view to infusing greater effectiveness and accountability into programme design.

Against this backdrop, my research focuses on two concomitant aspects: the effects of datafication on the architecture of anti-poverty programmes, and its consequences for the entitlements that beneficiaries receive through them. My PhD thesis focused on the digitalization of the Public Distribution System (PDS), India’s largest food security programme, which centres on distributing primary necessity items (mainly rice, wheat, sugar and kerosene) at subsidized prices to the nation’s poor. The back-end digitalization of the scheme, started at the state level in the early 2000s, is now culminating in the datafication of the programme through the Unique Identity Project (Aadhaar), an identity scheme that constitutes the biggest biometric identification database in the world. Built with the declared purpose of facilitating the socioeconomic inclusion of India’s poor, Aadhaar provides all enrolees with a 12-digit number linked to their biometric details, to ensure, among other aspects, that each enrolee obtains their social benefits through a simple operation of biometric recognition.

Datafication contributes to deep transformation of anti-poverty programmes, with mixed effects on programme architecture and entitlements of beneficiaries

My data collection on the datafied PDS has taken place in the two southern Indian states of Kerala and Karnataka, and also comprises a review of the state-level cases of Aadhaar-based PDS currently operating in India. Over the years, my research has developed three lines of reflection, which I synoptically illustrate below.

First, datafication is constructed by the Indian central government as a tool for simplifying access and improving users’ capability to obtain their entitlements under existing schemes. The Aadhaar-based PDS is indeed constructed to reduce both the inclusion error, meaning access to the programme by non-entitled people, and the exclusion error (Swaminathan 2002), meaning the negation of the subsidy to the entitled. In doing so, the biometric system traces sales from PDS ration shops to reduce diversion (the ‘rice mafia’), an illegal network through which foodgrains aimed at the poor are diverted to the market for higher margins. What emerges from my research is a strong governmental narrative portraying Aadhaar as a problem-solver for the PDS: government officials depict technology as a simplifier of the existing system, facilitating a better and more accountable functioning of a leakage-prone anti-poverty scheme that has been in operation for a long time.

Second, recipients’ view of the datafied PDS is mixed: it reveals some positive changes, but also a set of issues that were not in place before the advent of the biometric system. One, by making access conditional on enrolment in the Aadhaar database, the new system subordinates the universal right to food to enrolment in a biometric database, leading the poor to ‘trade’ their data for the food rations needed for their livelihoods. Two, while the programme is designed to combat the inclusion error, new forms of exclusion are caused by system malfunctions that lead to failures in user recognition, which in turn result in families being denied their food rations, sometimes for several months in a row. Three, the system is not built to act on back-end diversion (PDS commodities being diverted before they reach the ration shops where users buy them), which, according to existing studies of the PDS supply chain, is where the greatest part of goods is diverted (Khera 2011; Drèze & Khera 2015).

Third, there is a specific restructuring intent behind the creation of an Aadhaar-based PDS. From documents and narratives released by the central government, a clear teleology emerges: Aadhaar is conceived not simply to streamline the PDS, but to substitute it, in the longer run, with a system of cash transfers to the bank accounts of beneficiaries. As government officials declare, this serves the purpose of reducing the distortion caused by subsidies and creating a more effective system in which existing leakages cannot take place. A large majority of beneficiaries, however, are suspicious of cash transfers (Drèze et al. 2017): a prominent argument is that these are more complex to collect and handle than the secure materiality of PDS food rations. What is sure, beyond points of view on the appropriateness of cash transfers, is that the teleology behind the Aadhaar-based PDS is not that of streamlining the system, but that of creating a new one in which the logic of buying goods on the market replaces the existing logic of subsidies.

Aadhaar concurs to enable a shift from in-kind subsidies to cash transfers, with uncertain consequences on poor people’s entitlements

Rooted in field research on datafied anti-poverty systems, these reflections offer two main contributions to extant theorizations of datafication in the Global South. First, they highlight the role of state governments in using datafied systems to construct a positive image of themselves, portraying datafication as a problem-solving tool adopted to tackle the most pressing issues affecting existing programmes. The power of datafication, embodied by large biometric infrastructures such as Aadhaar, is used to project an image of accountability and effectiveness, relied upon at electoral times and in the construction of public consensus. At the same time, citizens’ perspectives reveal forms of data injustice (Heeks & Renken 2018) that did not exist before datafication, such as the denial of subsidies following the failure of user recognition by point-of-sale machines, or the subordination of the right to food to enrolment in a national biometric database.

Second, datafication is often portrayed by governments and public entities as a means to streamline anti-poverty programmes, improving the mechanisms at the basis of their functioning. By contrast, my research suggests a more pervasive role for datafication, capable of transforming the very basis on which existing social protection systems are grounded (Masiero 2015). The Aadhaar case is revealing in this respect: as it is incorporated into extant subsidy systems, Aadhaar does not aim simply to improve their functioning, but to substitute the logic of in-kind subsidies with a market-based architecture of cash transfers. By moving the drivers of governance of anti-poverty systems from the state to the market, datafication is hence implicated in a deep reformative effort, which may have massive consequences for programme architecture and the entitlements of the poor.

Entrenched in the Indian system of social protection, Aadhaar is today the greatest datafier of anti-poverty programmes in the world. Here we have outlined its primary effects, and especially its ability to reshape existing anti-poverty policies at their very basis. Ongoing research across ICT4D, data ethics and development studies examines the ways datafication will affect anti-poverty programme entitlements, for the many people whose livelihoods are predicated on them.

 

Silvia Masiero is a lecturer in International Development at the School of Business and Economics, Loughborough University. Her research concerns the role of information and communication technologies (ICTs) in socio-economic development, with a focus on the participation of ICT artefacts in the politics of anti-poverty programmes and emergency management.

 

References:

Drèze, J., & Khera, R. (2015). Understanding leakages in the Public Distribution System. Economic and Political Weekly, 50(7), 39-42.

Drèze, J., Khalid, N., Khera, R., & Somanchi, A. (2017). Aadhaar and food security in Jharkhand. Economic and Political Weekly, 52(50), 50-60.

Heeks, R., & Renken, J. (2018). Data justice for development: What would it mean? Information Development, 34(1), 90-102.

Khera, R. (2011). India’s Public Distribution System: utilisation and impact. Journal of Development Studies, 47(7), 1038-1060.

Masiero, S. (2015). Redesigning the Indian food security system through e-governance: The case of Kerala. World Development, 67, 126-137.

Swaminathan, M. (2002). Excluding the needy: The public provisioning of food in India. Social Scientist, 30(3-4), 34-58.


[BigDataSur] My experience in training women on digital safety

by Cecilia Maundu

I remember it was September 2015 when I was invited to a two-day workshop on digital safety by the Association of Media Women in Kenya. At first I was very curious, because I had not heard much about digital security. The two-day workshop was an eye-opener, and afterwards I found myself hungry for more information on the issue.

Naturally, I went online to find more information. I must say I was shocked at the statistics I came across on the number of women who have been abused online, and continue to suffer. Women were being subjected to sexist attacks. They were attacked because of their identity as women, not because of their opinions. I asked myself what can I do? I am well aware that I am just a drop in the ocean, but any little change I can bring will help in some way. That was a light bulb moment for me.

It was in that moment that I knew I wanted to be a digital safety trainer. I wanted to learn how to train people, especially women, on how to stay safe online. The internet is the vehicle of the future. This future is now, and we cannot afford for women to be left behind.

Online violence eventually pushes victims to stay offline. It is censorship hidden behind the veil of freedom of expression.

After this realization, I embarked on the quest to become a digital safety trainer. As fate would have it, my mentor Grace Githaiga came across the SafeSister fellowship opportunity and sent it to me. I applied and got into the program. The training was taking place in Ngaruga lodge, Uganda. The venue of the training was beyond serene. The calm lake waters next to the hotel signified how we want the internet to be for women: a calm place and a safe space where women can express themselves freely without fear of being victimized, or their voices being invalidated.

On arrival we were met by one of the facilitators, Helen, who gave us a warm welcome. The training was conducted by five facilitators, all of whom were women.

The training was student-friendly. The topics were broken down in a way that allowed everyone to understand what was being discussed. Each facilitator had her own style of delivering the different topics, from charts to PowerPoint presentations. I must say they did an exceptional job. I got to learn more about online gender violence and how deeply rooted it is in our society, and hence the importance of digital security trainings.

Being a trainer is not only about having digital safety skills; it also requires you to be a well-rounded person. While giving a training you are bound to meet different types of people with different personalities, and it is your duty to make them feel comfortable and make sure the environment around them is safe. It is in this safe space that they will be able to talk and express their fears and desires, and, most importantly, they will be willing to learn. As a digital security trainer, you should first learn about your participants and how much they know about digital security. This will enable you to package your material according to their learning needs.

Being a trainer requires you to read a lot on digital security, because this keeps you updated and allows you to relay accurate information to your trainees. As a trainer, it is also necessary to understand the value of hands-on training, because it gives the participants the opportunity to put into practice what they have learnt. For example, when you are teaching about privacy settings on Facebook, you don’t just talk about them: you should ask the participants to open their Facebook accounts – that is, if they have any – and go through the instructions step by step with them until they are able to achieve the task. As a trainer there is also the possibility of meeting a participant who does not give the rest of the group the opportunity to express their views, as they want to be the one talking throughout. However, the trainer needs to take charge and make sure that each participant is given an equal opportunity to talk.

Before the training we had each been given a topic to present on, and mine was encryption; VeraCrypt, to be more specific. At first it was all Greek to me, but then I resorted to my friend Google for more details (which raises the question: what was life like before the internet?). By the time I was leaving Kenya for Uganda I had mastered VeraCrypt. We kept discussing our topics with the rest of the group, to the point where they started calling me Vera. To my surprise, my presentation went very well. The week went by so fast: before we realized it, it was over and it was time to go back home and start implementing what we had learnt.

We continued receiving very informative material online from the trainers. In September 2017 they opened a pool of funding to which we could apply to fund a training in our home countries. I got the funding, and chose to hold the training at Multimedia University, where I lecture part time. The reason behind my choice was that this is home to upcoming young media women, and we needed to train them on how to stay safe online, especially since media women in Kenya form the majority of victims of gender-based violence. They needed to know what awaits them out there and the mechanisms they need in order to protect themselves from attacks. The training was a success, and the young ladies walked away happy and strong.

The second, and last, part of SafeSister (I am barely holding back my tears here, because the end did come) took place in Uganda at the end of March 2018. It was such a nice reunion, meeting the other participants and our trainers after a year. This time the training was more relaxed. We were each given a chance to talk about the trainings we had conducted, the challenges we encountered, the lessons learnt and what we would have done differently. For me, the challenge was time management: the trainers had prepared quite informative materials, so we ran over time, on top of a 30-minute delayed start.

This was my first training, and one take-home for me as a digital safety trainer was that not all participants will be enthusiastic about the training, but one shouldn’t be discouraged or feel like they are not doing enough. The trainer just needs to make sure that no participant is left out: not just throwing questions at the participants, but also asking for their opinions on different issues regarding digital safety. As time progresses, they gradually become more enthusiastic and start feeling more at ease.

One thing I have learnt since I became a digital security trainer is that people are quite ignorant of digital security matters. People go to cybercafés and forget to sign out of their email accounts, or use the same password for more than one account, and then ask you: “Why would someone want to hack into my account or abuse me when I am not famous?” However, such questions should not discourage you; on the contrary, they should motivate you to give more trainings, because people don’t know how vulnerable they are online when their accounts and data are not protected. Also, as a trainer, when you can and when the need arises, give as many free trainings as possible, since not everyone can afford to pay you. It is through these trainings that you continue to sharpen your skills and become an excellent trainer.

After the training we were each awarded a certificate. It felt so good to know that I am now a certified digital security trainer; nothing can beat that feeling. As they say, all good things must come to an end. Long live Internews, long live DefendDefenders. Asante sana! I will forever be grateful.

 

Cecilia Mwende Maundu is a broadcast journalist in Kenya, a digital security trainer and consultant with a focus on teaching women how to stay safe online. She is also a user experience (UX) trainer, collecting user information feedback and sharing it with developers.


[BigDataSur] blog 3/3. Good Data: Challenging the Colonial Politics of Knowledge

In the previous instalments of this three-part series on the possibilities of “good data” (here and here), Anna Carlson concluded that notions of decentralisation and autonomy could not do without an understanding of global inequalities and their politics. Moving away from Northern, individualist visions of digital utopia, she considers what can be learned from Indigenous data sovereignty initiatives as they address, from the South, the colonial legacies of global knowledge production.

The gathering of data has long been a strategy of colonialism. It is part of a set of practices designed to “standardise and simplify the indigenous ‘social hieroglyph into a legible and administratively more convenient format’ (Scott 1999, 3)” (Smith 2016, 120). Making something legible is always already political, and it always begs the question: legible for what, and to whom?

In Australia, making legible meant enumerating Indigenous “‘peoples’ into ‘populations’ (Taylor 2009); their domestic arrangements and wellbeing […] constrained within quantitative datasets and indicators that reflected colonial preoccupations and values” (Smith 2016, 120). No matter how ‘big’ the data set, data is never neutral. As Smith points out, Indigenous self-determination is increasingly linked to “the need to also reassert Indigenous peoples’ control and interpretation into the colonial data archives, and to produce alternative sources of data that are fit for their contemporary purposes” (Smith 2016, 120). Data is knowledge, and knowledge is power.

No matter how it is collected, data has long been used in ways that reinforce and sustain this colonial status quo. Data – often quantitative, often raw numbers – is used strategically. As Maggie Walter argues in a chapter in Indigenous Data Sovereignty: Towards an Agenda (2016): “social and population statistics are better understood as human artefacts, imbued with meaning. And in their current configurations, the meanings reflected in statistics are primarily drawn from the dominant social norms, values and racial hierarchy of the society in which they are created” (Walter 2016, 79).

Dianne E Smith expands in the same volume:

“Data constitute a point-in-time intervention into a flow of information of behaviour – an attempt to inject certainty and meaning into uncertainty. As such, data can be useful for generalising from a particular sample to a wider population […] testing hypotheses […] choosing between options (etc). […]  However, when derived from ethnocentric criteria and definitions, data can also impose erroneous causal connections and simplify social complexity, thereby freezing what may be fluid formations in the real world. In their unadorned quantitative form, data are hard-pressed to cope with social and cultural intangibles.” (2016, 119-120)

As such, argues Smith, questions about data governance – “who has the power and authority to make rules and decisions about the design, interpretation, validation, ownership, access to and use of data” – are increasingly emerging as key “sites of contestation” between Indigenous communities and the state (Smith 2016, 119).

One of the more interesting responses to these data challenges comes from the movements for Indigenous Data Sovereignty. C. Matthew Snipp (2016) outlines three basic preconditions for data decolonisation: “that Indigenous peoples have the power to determine who should be counted among them; that data must reflect the interests and priorities of Indigenous peoples; and that tribal communities must not only dictate the content of data collected about them, but also have the power to determine who has access to these data.” In practice, then, it means Indigenous communities having meaningful say over how information about them is collected, stored, used and managed. In Indigenous Data Sovereignty: Towards an Agenda, the editors have brought together a set of interesting case studies reflecting on different examples of Indigenous data sovereignty. These case studies are by no means exhaustive, but they do point to a set of emerging protocols around relationality and control that may prove instructive in ongoing attempts to imagine good data practices in the future.

The First Nations Information Governance Centre (FNIGC)

FNIGC in Canada is an explicitly political response to the role of knowledge production in maintaining colonial power relationships. The FNIGC articulates a set of principles that should be respected when gathering data about First Nations communities. These principles – ownership, control, access and possession (OCAP) – are used by the FNIGC as part of a certification process for research projects, surveys and other data-gathering mechanisms. In some ways, they operate as a “right of reply”, pointing to the inadequacies or successes of data-gathering practices. They aim, at least in part, to address the heavily skewed power relationship that continues to exist between Indigenous communities and the researchers who study them.

“By Maori for Maori” healthcare

This initiative attempts to incorporate Maori knowledge protocols into primary health care provision. In this case, Maori knowledge protocols are incorporated (to some extent) into data collection, analysis and reporting tools which are used by primary health care services (Maori run) in developing knowledge about and responses to health issues in Maori communities. These protocols are also used to develop and enable processes for data sharing across related networks – for example, with housing providers.

Yawuru data sovereignty

The Yawuru, traditional owners of the area around Broome in Western Australia, recognised that gaining control of the information that existed about them (by virtue of the extensive native title data-gathering process) was crucial to maintaining their sovereignty. They also recognised the value of a data archive produced by and for Yawuru peoples. They undertook their own data collection processes, including a “survey of all Indigenous people and dwellings in the town to create a unit-record baseline.” They sought to create an instrument to “measure local understandings of Yawuru wellbeing (mabu liyan).” They digitally mapped their lands in order to “map places of cultural, social and environmental significance to inform a cultural and environmental management plan.” Finally, they sought to incorporate Yawuru knowledge protocols into the development of a “documentation project […] to collate and store all relevant legal records, historical information, genealogies and cultural information” in line with Yawuru law.

Beyond these discrete examples, data sovereignty is exercised in a variety of other forms. Much Indigenous knowledge exists entirely outside the colonial state, and is held and protected in line with law. Many of the knowledge keepers in Australia retain this data sovereignty, despite the increasing pressure to make such information public in order to access legal recognition. This speaks precisely to the dilemma of contemporary data politics, as “opting out” of the systems that collect our data is increasingly difficult in a world where giving up our data rights is too often a precondition to accessing the things that make life liveable.


[BigDataSur] blog 2/3: Digital Utopianism and Decentralisation: Is Decentralised Data Good Data?

In this second of three blog posts on the challenges of imagining ‘good data’ globally (read the first episode here), Anna Carlson considers a particular strand of technology-driven utopianism in the Global North: the idea of radically decentralised data. She writes that its latest instantiations – exemplified by blockchain – tend to forget the dire and unequally distributed ecological impact of digital technologies.

I was born in 1989, just a handful of years before mobile phones and laptop computers became the ubiquitous symbols of urban late modernity.  It is also the year that the terms “cybersecurity” and “hypertext markup language” (HTML) first appeared in print. In the heady, early years of the World Wide Web, the internet and the data it produced (and contained and distributed) offered a utopian vision of equitable globalisation, the promise to equalise access to knowledge, the possibility of a new kind of commons. Early adopters across the political spectrum celebrated the possibilities of decentralisation and autonomy, the opportunities for community-building and collectivity, for sharing economies outside the control of the state.

These notions of the common good mostly originate in the Global North, and their promises have never quite been fulfilled. They continue to re-emerge, however, and provide interesting food for thought in the process of imagining good (or at least, better) data practices for the future.

Blockchain is probably the most prominent technology to revive utopias of radical decentralisation. At its most basic, a blockchain is a massive peer-to-peer, decentralised technology that allows digital information to be seen and shared but not copied. Described by some as the “backbone of a new type of internet,” blockchain is basically a very complex, constantly updating, decentralised titling system. A blockchain is like a database, tied to the object or site of interest, constantly cross-checking and updating.

The technology originally emerged as a way of keeping track of cryptocurrencies like Bitcoin, and proponents Don and Alex Tapscott describe it as “an incorruptible digital ledger of economic transactions that can be programmed to record not just financial transactions but virtually everything of value.” The blockchain is inherently decentralised in that it cannot be controlled by any single entity, but requires constant validation and authentication from the network.
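The “constantly cross-checking” ledger described above can be illustrated with a minimal sketch. This is not how Bitcoin or any real client is implemented, and all field names are illustrative; it only shows the core idea that each block commits to the hash of its predecessor, so that tampering with any record breaks every subsequent link when a node re-validates the chain.

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents deterministically (sorted keys give stable JSON).
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, data):
    # Each new block commits to the hash of the previous one
    # (an all-zero "genesis" hash starts the chain).
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": prev})
    return chain

def validate(chain):
    # A node trusts nothing: it re-checks every link against the rules itself.
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

chain = []
for record in ["Alice pays Bob 5", "Bob pays Carol 2"]:
    add_block(chain, record)

assert validate(chain)       # the untampered chain passes
chain[0]["data"] = "Alice pays Bob 500"
assert not validate(chain)   # any edit breaks every later link
```

This is what makes the ledger “incorruptible” in practice: altering history means recomputing and re-propagating every subsequent block faster than the rest of the network, which is computationally prohibitive.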

Blockchain is often represented optimistically, if vaguely, as capable of supporting new economies, free from the shackles of corporate control. In Melanie Swan’s Blockchain: Blueprint for a New Economy, she goes even further: “the potential benefits of the blockchain are more than just economic – they extend into political, humanitarian, social, and scientific domains […] For example, to counter repressive political regimes, blockchain technology can be used to enact in a decentralised cloud functions that previously needed administration… (e.g.) for organisations like WikiLeaks (where national governments prevented credit card processors from accepting donations in the sensitive Edward Snowden situation).” She goes on to describe the possibility of blockchain technology as a basis for new economies dominated by (ahem) platforms like Uber and AirBnB.

So far, so good, right? The problem is that blockchains are incredibly unwieldy and immensely energy inefficient. To use the currency Bitcoin as an example, the energy required for a single transaction far exceeds the amount needed for a more traditional transfer. This is at least in part because the system is premised on hyper-individualistic, libertarian ideals. In a recent article on Motherboard, Christopher Malmo writes that, with prices at their current level, “it would be profitable for Bitcoin miners to burn through over 24 terawatt-hours of electricity annually as they compete to solve increasingly difficult cryptographic puzzles to “mine” more Bitcoins. That’s about as much as Nigeria, a country of 186 million people, uses in a year.” At least in part, this energy is required because, as Alex de Vries suggests: “Blockchain is inefficient tech by design, as we create trust by building a system based on distrust. If you only trust yourself and a set of rules (the software), then you have to validate everything that happens against these rules yourself. That is the life of a blockchain node.”
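The “increasingly difficult cryptographic puzzles” Malmo mentions can be sketched in a few lines, under the simplifying assumption of a toy proof-of-work scheme (real Bitcoin mining targets a 256-bit threshold, not hex prefixes, and the function name here is invented). A miner searches for a nonce whose hash starts with a required number of zeros; each extra zero multiplies the expected number of attempts, and hence the electricity burned, by sixteen.

```python
import hashlib

def mine(data, difficulty):
    """Find a nonce so sha256(data + nonce) starts with `difficulty` hex zeros.
    Returns the nonce and the number of hashes tried (a proxy for energy spent)."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, nonce + 1
        nonce += 1

# Expected attempts grow 16x with each extra leading zero required.
for difficulty in range(1, 5):
    _, attempts = mine("block payload", difficulty)
    print(f"difficulty {difficulty}: {attempts} hashes tried")
```

The work is tunable and, crucially, wasted by design: only the winning hash matters, and every losing attempt across the whole network is pure energy expenditure, which is the root of the consumption figures quoted above.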

An interesting example of the tensions of decentralised cryptocurrencies emerged recently when The New Inquiry released their newest project, Bail Bloc. Bail Bloc is a “cryptocurrency scheme against bail,” that delivers much-needed funds to the excellent Bronx Freedom Fund in order to help provide bail fees for defendants. They outline its function as follows: “When you download the app, a small part of your computer’s unused processing power is redirected toward mining a popular cryptocurrency called Monero, which is secure, private, and untraceable. At the end of every month, we exchange the Monero for US dollars and donate the earnings to the Bronx Freedom Fund.”

On the surface, taking “unused” resources and redistributing them equitably sounds like a good idea. Unfortunately, the missing information here is that the processing power is not really “unused”: what they mean is that it’s unused by you, and that using it for this cause won’t inconvenience your personal use.

But there is a slippage here that seems important to clarify.  Mining Monero requires a huge amount of electricity – much the same as the Bitcoin example above. As a result, this charity structure requires that we burn huge amounts of energy to produce a fairly minimal value, which is then redistributed. Like many such examples, it would be much easier and simpler to simply donate the money directly to the Bronx Freedom Fund.  The flip side, of course, is that electricity doesn’t feel like a scarce resource in the way that money does, so generating money out of electricity feels like generating money out of nowhere.

It is easy to see what the New Inquiry designers are appealing to: the desire to effect positive change without really having to do anything at all. And it seemed to work: in the first 24 hours of the initiative’s launch, I saw numerous friends and colleagues sharing the page with suggestions that we establish similar schemes for causes closer to home.

The sense of practically and elegantly re-distributing otherwise unused resources through the magic of technology might be appealing, but it should not come at the expense of a broader understanding of global inequalities and planetary sustainability. Decentralised technology cannot sustain “good data” if the values that are encoded in it do not account for our shared stake in the world.

The question, then, is not whether decentralised data is good data, but under what terms and conditions it might become so, and what politics of the collective (or commons) goes into shaping decentralised technology.

((to be continued. Next episode will be online on June 8th, 2018))

 

 


[BigDataSur] blog 1/3: Imagining ‘Good’ Data: Northern Utopias, Southern Lessons

by Anna Carlson

What might ‘good data’ look like? Where to look for models, past and emerging? In this series of three blog posts, Anna Carlson highlights that we need to understand data issues as part of a politics of planetary sustainability and a live history of colonial knowledge production. In this first instalment, she introduces her own search for guiding principles in an age of ubiquitous data extraction and often dubious utopianism.           

I’m sitting by Cataract Gorge in Launceston, northern Tasmania. I’ve just climbed a couple of hundred metres through bushland to a relatively secluded look-out. It feels a long way from the city below, despite the fact that I can still hear distant traffic noise and human chatter. I pull out my laptop, perhaps by instinct. At around the same moment, a tiny native lizard dashes from the undergrowth and hovers uncertainly by my bare foot. I think briefly about pulling out my cracked and battered (second-hand) iPhone 4 to archive this moment, perhaps uploading it to Instagram with the hashtag #humansoflatecapitalism and a witty comment. Instead, I start writing.

I have been thinking a lot lately about the politics and ethics of data and digital technologies. My brief scramble up the hill was spent ruminating on the particular question of what “good data” might look like. I know what not-so-good data looks like. Already today I’ve generated a wealth of it. I paid online for a hostel bunk, receiving almost immediate cross-marketing from AirBnB and Hostelworld through social media sites as well as through Google. I logged into my Couchsurfing account and immediately received a barrage of new “couch requests” (based, I presume, on an algorithm that lets potential couch surfers know when their prospective hosts log in). I’ve turned on location services on my phone and used Google Maps to navigate a new city. I’ve searched for information about art galleries and hiking trails. I used a plant identification site to find out what tree I was looking at. Data, it seems, is the “digital air that I breathe.”

Writing in the Guardian, journalist Paul Lewis interviews tech consultant and author Nir Eyal, who claims that “the technologies we use have turned into compulsions, if not full-fledged addictions. […] It’s the impulse to check a message notification. It’s the pull to visit YouTube, Facebook, or Twitter for just a few minutes, only to find yourself still tapping and scrolling an hour later.” And this addictive quality is powerful: digital technologies are credited with altering everything from election results to consumer behaviour and our ability to empathise. Indeed, there’s money to be made from a digitally-addicted populace. Encompassing everything from social media platforms to wearable devices, smart cities and the Internet of Things, almost every action we take in the world produces data of some form, and this data represents value for the corporations, governments and marketers who buy it.

This is the phenomenon commonly referred to as Big Data, which describes sets of data so big that they must be analysed computationally. Usually stored in digital form, this data encompasses everything from consumer trends to live emotional insights, and it is routinely gathered by most companies with an online presence.

The not-goodness of this data isn’t intrinsic, however. There is nothing inherently wrong with creating knowledge about the activities we undertake online. Rather, the not-goodness is a characteristic of the murky processes through which data is gathered, consolidated, analysed, sold-on and redistributed. It’s to do with the fact that few of us really know what data is being gathered about us, and even fewer know what that data will be used for. And it’s to do with the lineages that have structured processes of mass data collection, as well as their unequally distributed impacts.

Many of us know that the technologies on which we rely have dark underbellies; that the convenience and comfort of our digital lives is contingent on these intersecting forms of violence. But we live in a world where access to these technologies increasingly operates as a precondition to entering the workforce, to social life, to connection and leisure and knowledge. More and more workers (even in the global north) are experiencing precarity, worklessness and insecurity; experiences that are often enabled by digital technologies and which, doubly cruelly, often render us further reliant on them.

The ubiquity of the digital realm provokes new ethical conundrums. The technologies on which we are increasingly reliant are themselves reliant on exploitative and often oppressive labour regimes. They are responsible for vast ecological footprints. Data is often represented as immaterial, ‘virtual,’ and yet its impact on environments across the world is pushing us ever closer to global ecological disaster. Further, these violent environmental and labour relations are unequally distributed: the negative impacts of the digital age are disproportionately focused on communities in the Global South, while the wealth generated is largely trapped in a few Northern hands.

Gathering data means producing knowledge within a particular set of parameters. In the new and emerging conversations around Big Data and its impact on our social worlds, much focus is placed on the scale of it, its bigness, the sheer possibility of having so much information at our fingertips. It is tempting to think of this as a new phenomenon – as an unprecedented moment brought about by new technologies. But as technologist Genevieve Bell reminds us, the “logics of our digital world – fast, smart and connected – have histories that precede their digital turns.” Every new technological advance carries its legacies, and the colonial legacy is one that does not receive enough attention.

So, when we imagine what “good data” and good tech might look like now, we need to contend with the ethical quagmire of tech in its global and historical dimensions. To illustrate this point, I examine the limits of contemporary digital utopianism (exemplified by blockchain) as envisioned in the Global North (Episode 2), before delving into the principles guiding “good data” from the point of view of Indigenous communities (Episode 3).

Acknowledgments: These blog posts have been produced as part of the Good Data project (@Good__Data), an interdisciplinary research initiative funded by the Queensland University of Technology Faculty of Law, which is located on unceded Meanjin, Turrbal and Jagera Land (also known as Brisbane, Australia). The project examines ‘good’ and ‘ethical’ data practices with a view to developing policy recommendations and software design standards for programs and services that embody good data practices, in order to start conceptualising and implementing a more positive and ethical vision of the digital society and economy. In late 2018, an open access edited book entitled Good Data, comprising contributions from authors from different disciplines located in different parts of the world, will be published by the Amsterdam University of Applied Sciences Institute of Network Cultures.


Big Data from the South at the LASA conference in Barcelona

Stefania Milan is the co-organizer, with Emiliano Trere (Cardiff University) and Anita Say Chan (University of Illinois, Urbana-Champaign), of three panels at the forthcoming conference of the Latin American Studies Association (LASA), in Barcelona on May 23-26.

The (slightly revised) lineup:

Big Data from the Global South Part I (1033 // COL – Panel – Friday, 2:15pm – 3:45pm, Sala CCIB M211 – M2)
PROJECT FRAMING + GROUP BRAINSTORMING
- Stefania Milan, Emiliano Trere, Anita Chan: From Media to Mediations, from Datafication to Data Activism

Big Data from the Global South Part II: Archive Power (1101 // COL – Panel – Friday, 4:00pm – 5:30pm, Sala CCIB M211 – M2)
- Inteligencia Artificial y Campañas Electorales en la Era PostPolítica: Seguidores, Bots, Apps: Paola Ricaurte Quijano, Eloy Caloca Lafont
- Open Government, APPs and Citizen Participation in Argentina, Chile, Colombia, Costa Rica and Mexico: Luisa Ochoa, Fernando J Martinez de Lemos
- Cryptography, Subjectivity and Spyware: From PGP Source Code and Internals to Pegasus: Zac Zimmer
- Engineering Data Conduct: Prediction, Precarity, and Data-fied Talent in Latin America’s Start Up Ecology: Anita J Chan

Big Data from the Global South Part III: Data Incorporations (1167 // COL – Panel – Friday, 5:45pm – 7:15pm, Sala CCIB M211 – M2)
- Evidence of Structural Ageism in Intelligent Systems: Mireia Fernandez
- Doing Things with Code: Opening Access through Hacktivism: Bernardo Caycedo
- Decolonizing Data: Monika Halkort
- Maputopias: Miren Gutierrez

 

About LASA 2018

Latin American studies today is experiencing a surprising dynamism. The expansion of this field defies the pessimistic projections of the 1990s about the fate of area studies in general and offers new opportunities for collaboration among scholars, practitioners, artists, and activists around the world. This can be seen in the expansion of LASA itself, which since the beginning of this century has grown from 5,000 members living primarily in the United States to nearly 12,000 members in 2016, 45 percent of whom reside outside of the United States (36 percent in Latin America and the Caribbean). And while the majority of us reside in the Americas, there are also an increasing number of Latin American studies associations and programs in Europe and Asia, most of which have their own publications and annual seminars and congresses.

Several factors explain this dynamism. Perhaps the most important is the very maturity of our field. Various generations of Latin Americanists have produced an enormous, diverse, and sophisticated body of research, with a strong commitment to interdisciplinarity and to teaching about this important part of the world. Latin American studies has produced concepts and comparative knowledge that have helped people around the world to understand processes and problematics that go well beyond this region. For example, Latin Americanists have been at the forefront of debates about the difficult relationship between democracy, development, and dependence on natural resource exports—challenges faced around the globe. Migration, immigration, and the displacement of people due to political violence, war, and economic need are also deeply rooted phenomena in our region, and pioneering work from Latin America can shed light on comparable experiences in other regions today. Needless to say, Latin American studies also has much to contribute to discussions about populism and authoritarianism in their various forms in Europe and even the United States today.

With these contributions in mind, we propose that the overarching theme of the Barcelona LASA Congress be “Latin American Studies in a Globalized World”, and that we examine both how people in other regions study and perceive Latin America and how Latin American studies contributes to the understanding of comparable processes and issues around the globe.


[blog 3/3] Designing the city by numbers? KAPPO: More than a game 1

 

 

This is the last blog post of the series ‘Designing the city by numbers? Bottom-up initiatives for data-driven urbanism in Santiago de Chile’, by Martín Tironi and Matías Valderrama Barragán. Please find here the first post, ‘Hope for the data-driven city‘, and the second, ‘Digital quantification regimes of cycling mobility‘.

 

For our final post in this series, we explore the case of KAPPO, a social game for smartphones. It was developed in early 2014 by four Chilean entrepreneurs with the goal of engaging citizens with cycling. The game is structured around levels: each bike trip gives the player experience points, virtual coins and rewards, and the highest level of the game is the “Capo”2. It also offers a series of challenges and rankings for competing with friends or other KAPPO players. Through this gamified design, KAPPO builds a narrative focused on its ability to “provoke”, “motivate” and “create the habit” of regularly using the bicycle, improving the user’s health along the way. As one of its promotional ads suggests, the app promises to “show the cyclist inside of you”.
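The reward loop described above, where each trip yields experience points and coins that accumulate toward levels, can be sketched schematically. This is a hypothetical reconstruction for illustration only: the reward rates, level thresholds, and class names are assumptions, not KAPPO's actual code or point values.

```python
from dataclasses import dataclass, field

@dataclass
class Rider:
    """Hypothetical player state mirroring the trip -> XP -> level loop."""
    xp: int = 0
    coins: int = 0
    trips: list = field(default_factory=list)

    def log_trip(self, km: float) -> None:
        # Assumed reward rule: 10 XP and 2 coins per kilometre ridden.
        self.trips.append(km)
        self.xp += int(km * 10)
        self.coins += int(km * 2)

    @property
    def level(self) -> int:
        # Assumed thresholds: a new level every 100 XP, capped at "Capo" (10).
        return min(self.xp // 100 + 1, 10)

rider = Rider()
rider.log_trip(5.0)   # 50 XP, 10 coins
rider.log_trip(7.5)   # 75 XP, 15 coins
print(rider.xp, rider.coins, rider.level)  # 125 25 2
```

The sketch makes visible the design logic the post goes on to criticise: every unit of ordinary mobility is converted into a countable reward, so the act of cycling doubles as an act of data production.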

Although KAPPO did not initially have great success in Chile, it began to gain adopters abroad, receiving funds and forming public-private partnerships with municipalities in Denmark. Since then, its founders have sought to position the app as “more than a game” for smartphones, seeking out different ways of capitalizing on the data generated through its use. For example, KAPPO developed a competition called “Cool Places to Bike”, in which organizations compete over which best encourages the use of the bicycle, as measured by KAPPO. It also developed “Health and Well-being Programs” for companies, promising to improve the productivity and well-being of workers by increasing bike use through the app. Local governments have also become KAPPO clients through “KAPPO Insights”, a web platform that allows public planners and officials to process and visualize anonymized routes tracked by the app to support decision-making.

The data obtained by KAPPO, however, are biased and not representative of Santiago’s cyclists. Instead of emphasizing the scientific narrative of the RUBI regime, the narrative deployed by KAPPO aims to convince city officials on three grounds: it is an inexpensive method, it captures data in real time, and it allows a “strong” form of participatory citizen involvement that encourages bicycle use. Through such aspects, KAPPO’s analytics and flow maps acquire value and foster decision-making that modifies the city in a rapid, experimental cycle, guided by the “real” movements of cyclists gathered in non-declarative ways. KAPPO thus does not seek to measure and quantify cyclists’ mobility representatively, as RUBI does, but to intervene directly by encouraging greater bike use, presenting biannual increases in cycling statistics in order to legitimate this digital quantification regime.

The politics of digital quantification: some points for an open debate

With these very brief vignettes of digital quantification regimes developed in Latin America, it is interesting to note how initiatives like KAPPO and RUBI, born in the South and adopting a grammar of citizen participation, also try to differentiate themselves from competing foreign technologies. They nonetheless end up replicating the rationalities and logics of nudging and automation when they scale up to the global market in order to survive as technology ventures, diminishing in the process the capacity for activism or citizen engagement in planning processes. This opens up a debate about the actual political capacity of sensors and tracking technologies to enhance citizen participation, and about the agendas of their developers.

Second, it is relevant to consider the specificities of each regime of digital quantification. Each regime designs and mobilises materialities, narratives and economic interests in order to justify its databases, algorithms and analytics as the most convenient, objective or useful for knowing, planning and governing the city in a data-driven way. As a result, the ecology of quantification regimes is always heterogeneous, diverging in and relating to its contexts and interests, and combining various technologies of justification (beyond the device or app). From this perspective, we found interesting elements in the goals of each regime and its capacities. For example, KAPPO played up the participatory or citizen nature of the app under a commercial logic from its inception. By contrast, the RUBI regime initially emphasized participatory and bottom-up elements, but the agency of cyclists was gradually displaced by more automated designs to obtain “scientifically correct” data. Both also try to differentiate themselves from other methods, such as surveys and devices both digital and analogue, by invoking their limitations and biases. In short, capitalizing on digital data requires various strategies of justification (Boltanski and Thévenot, 2006) that should not be taken for granted and that go beyond the generation of data alone. Before settling on a priori definitions of digital data, users or urban space, it is crucial to delve into these strategies and interests, as well as the reasons why some regimes of digital quantification end up prevailing while others are ignored.

Third, despite the discrepancies between the cases analysed, we note that both start from a shared imaginary of data-driven city governance inspired by the North. In this imaginary, opening and sharing data on the mundane practice of riding a bicycle is invoked as a means of empowering citizen involvement, with the capacity to make the city smarter and more bike-friendly. However, this imaginary can lead to a reconfiguration of governments as “clients of data”, and of citizen participation towards more passive, invisible versions that are free of real effort, in which the exchange of data is used for the benefit of certain stakeholders with interests other than democratic ends. Rather than becoming “co-designers” or “participants” in city planning, cyclists are situated in practice as data producers, without ever being informed of the actual use of the data they generate in government decisions or by third parties. The automation of the devices and the gamification of their design are directly connected to these forms of participation. This point leaves us with the question of which other responses to everyday breakdowns and idiotic data could be enacted to promote an active digital activism, and which modalities of experimentation allow for the consideration of those imperceptible murmurs that tend to be marginalized from the prevailing canons of smart culture3.

Fourth, data-driven planning and governance initiatives open up the discussion of how notions of “expertise” and “politics” are reconfigured. These regimes of digital quantification promote the belief that the decision-maker, without necessarily being an expert on the topic, can make decisions in a primarily technical manner, driven by the “real” behaviour of citizens and not by opinions, ideological differences or party pressures. Political factors are framed as something that must be eradicated through the gathering and processing of data on people’s behaviour. This politics of technifying decision-making is nothing new. As Morozov (2014) has written, the idea of algorithmic regulation evokes the old technocratic utopia of politics without politics: “Disagreement and conflict, under this model, are seen as unfortunate byproducts of the analog era – to be solved through data collection – and not as inevitable results of economic or ideological conflicts.”

In this sense, data-driven urbanism carries the risk of belief not only in the neutrality or immediacy of data, but also in a depoliticization of urban planning and government in favour of technocratic and automated decision-making systems. Behind the apparent technical reduction of discretion in decision-making promised by these regimes of digital quantification, in practice we can see how many political or discretionary decisions are made in how these regimes are enacted and made public.

 

References

Boltanski, L., & Thévenot, L. (2006). On justification: Economies of worth. Princeton University Press.

van Dijck, J. (2014). Datafication, dataism and dataveillance: Big Data between scientific paradigm and secular belief. Surveillance & Society, 12(2), 197-208.

Espeland, W. N., & Stevens, M. L. (2008). A sociology of quantification. European Journal of Sociology/Archives Européennes de Sociologie, 49(3), 401-436.

Esty, D. C., & Rushing, R. (2007). Governing by the numbers: The promise of data-driven policymaking in the information age. Center for American Progress, 5, 21.

Gabrys, J. (2016). Citizen sensing: Recasting digital ontologies through proliferating practices. Theorizing the Contemporary, Cultural Anthropology website, March 24, 2016.

Goldsmith, S., & Crawford, S. (2014). The responsive city: Engaging communities through data-smart governance. San Francisco, CA: Jossey-Bass.

Goodchild, M. F. (2007). Citizens as sensors: The world of volunteered geography. GeoJournal, 69(4), 211-221.

Kitchin, R. (2014). The data revolution: Big data, open data, data infrastructures and their consequences. London: Sage.

Marres, N. (2018). What if nothing happens? Street trials of intelligent cars as experiments in participation. In S. Maasen, S. Dickel, & C. Schneider (Eds.), TechnoScience in Society, Sociology of Knowledge Yearbook. Nijmegen: Springer/Kluwer.

Mayer-Schönberger, V., & Cukier, K. (2013). Big data: A revolution that will transform how we live, work, and think. New York: Houghton Mifflin Harcourt.

Morozov, E. (2014). The rise of data and the death of politics. The Guardian. https://www.theguardian.com/technology/2014/jul/20/rise-of-data-death-of-politics-evgeny-morozov-algorithmic-regulation

 

1. This text is based on a presentation at the workshop “Designing people by numbers”, held at the Pontificia Universidad Católica in November 2017.
2. A colloquial term in Spanish for a person with a great deal of expertise or knowledge about a topic or activity.
3. On this point, see the controversies and public issues generated by street testing of driverless cars (Marres, 2018).

 

About the authors: Martín Tironi is Associate Professor at the School of Design of the Pontifical Catholic University of Chile. He holds a PhD from the Centre de Sociologie de l’Innovation (CSI), École des Mines de Paris, where he also carried out post-doctoral studies. He received his Master’s degree in Sociology from the Université Paris Sorbonne V and his BA in Sociology from the Pontifical Catholic University of Chile. He is currently (2018) a Visiting Fellow at the Centre for Invention and Social Process, Goldsmiths, University of London. [email: martin.tironi@uc.cl]

Matías Valderrama Barragán is a sociologist with a Master’s in Sociology from the Pontifical Catholic University of Chile. He is currently working on research projects about the digital transformation of organizations and the datafication of individuals and environments in Chile. [email: mbvalder@uc.cl]


[blog 2/3] Designing the city by numbers? Digital quantification regimes of cycling mobility 1

 

 

This is the second of three blog posts in the series ‘Designing the city by numbers? Bottom-up initiatives for data-driven urbanism in Santiago de Chile’, by Martín Tironi and Matías Valderrama Barragán. Find the first post here. Stay tuned: the next episode will appear next Friday, May 4th!

 

Over the past two years, we have been studying cases that involve the digital quantification of urban cycling in the city of Santiago. Because of its multiple benefits for the environment, urban congestion and citizens’ health, urban cycling has been characterized as a “green” and “sustainable” form of mobility, highly attractive for smart city initiatives. Under this trend, various digital devices and self-tracking apps have been developed to quantify and expand urban cycling. The numbers and data generated by this array of technologies have more recently been reframed as valuable crowdsourced information to inform and guide urban planning decisions and to promote citizen demands. In this sense, data-driven initiatives seem to promote a spirit in which citizens themselves appear as the central actors of urban planning, thanks to the development of these civic technologies. In contrast, we explore why we should remain sceptical of how such data-driven initiatives adopt what can appear to be bottom-up approaches, and critically vigilant of how such moves can instead be used to promote market-driven technological adoption and low-effort forms of citizenship.

RUBI: Let the bikes speak for themselves

Our first example is the case of RUBI, an urban bike-tracking device whose acronym comes from its Spanish name, which we examined in more detail in a recently published paper in the journal Environment and Planning D. The device anonymously records the routes taken by cyclists in a georeferenced database that is later processed on a web platform (RubiApp) to obtain numbers, metrics and visualizations of the users. It was developed in 2014 by a young engineering student as his undergraduate thesis. At that time, he started a bottom-up project called Stgo2020 to invite the cyclists of Santiago to participate voluntarily in the collection of data about their everyday trips and, with that, to challenge the status quo in urban planning and allow cyclists to act as “co-designers” of their own city. The project collected data from a hundred volunteer cyclists, generating graphics, tables and heat maps about urban cycling. This information was later shared with the Transportation Office in the hope that it would inform data-driven decisions about future cycle lanes, but the developer never learned whether the data was used in any way.
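The pipeline described here, anonymized georeferenced traces aggregated into metrics and heat maps, can be sketched in a few lines. The grid size, function names, and the sample coordinates are illustrative assumptions, not RubiApp's actual schema or data.

```python
from collections import Counter

def to_cell(lat: float, lon: float, cell_deg: float = 0.001) -> tuple:
    """Snap a GPS fix to a coarse grid cell (~100 m); discarding precision
    also acts as a crude anonymization step."""
    return (round(lat / cell_deg), round(lon / cell_deg))

def heatmap(trips: list) -> Counter:
    """Aggregate trips into per-cell counts: the raw material for the
    kind of heat maps shared with planners."""
    counts = Counter()
    for trip in trips:                 # each trip is a list of (lat, lon) fixes
        for lat, lon in trip:
            counts[to_cell(lat, lon)] += 1
    return counts

# Two hypothetical trips through central Santiago (illustrative coordinates).
trips = [
    [(-33.4372, -70.6506), (-33.4390, -70.6530)],
    [(-33.4372, -70.6506), (-33.4400, -70.6550)],
]
hot = heatmap(trips)
print(hot.most_common(1))  # the cell shared by both trips has the highest count
```

Even this toy version shows why aggregation decisions are political rather than neutral: the choice of cell size determines both how anonymous the cyclists are and how fine-grained the planners' picture of the city becomes.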

Because of RUBI’s academic origin, the entire development of the device was based on a strongly scientific narrative about how to achieve a “representative” and “clean” sample of cyclists’ mobility. The developer thus decided to design a hardware device that could be differentiated from apps like STRAVA and from wearable technologies, which depend on expensive equipment and data plans and which, in his opinion, present strong biases. This scientific narrative marked the whole design and materiality of RUBI. The first prototypes were large, fragile and very dependent on the human user in several respects; in fact, the engineer behind the device playfully drew a human face on the first prototype. But several problems emerged with these early versions. Users continually forgot to turn the device on or off when necessary, some users subverted and appropriated the technology in unexpected ways, and particular problems emerged from the GPS of the device itself. These breakdowns, emerging from the everyday entanglement of cyclists, devices, bicycles and urban spaces, produced data that the engineer regarded as “erroneous”, “stupid” or “absurd”, which we call “idiotic data” in our paper, drawing on Isabelle Stengers’s conceptual character of the idiot: data that slowed down and called into question the “clean” collection intended for the project. To confront the emergence of idiotic data, new sensors, algorithms and automated functions were added to give the device greater “smartness”, so that it could operate as an autonomous and independent entity outside human control. In the process, the device turned into a literal “black box”, ensuring as little interaction as possible with the cyclists and the environment; as a result, the practice of quantifying urban cycling became more unnoticed and effortless for cyclists.

During 2016, the RUBI device scaled up to other cities under new business models, losing its bottom-up nature. The company RubiCo was created and reached agreements with local governments and international agencies like the Inter-American Development Bank to map the use of public bicycle rental systems, in some cases even without users’ knowledge. Giving the device “true intelligence” was not only a precautionary “solution” to idiotic data; it was also mobilized to add value and solidity to the regime compared to the competition. In contrast to other self-tracking technologies (apps, wearables, etc.), RubiCo focuses on controlling the biases and noise of the sample of cyclists’ mobility, constituting RUBI’s interaction with the bike as an authentic “moving laboratory” that captures georeferenced data precisely and objectively, in the words of RUBI’s developer.

Stay tuned for the final post in this series for more on the development of the KAPPO pro-cycling smartphone game and its outcomes in Santiago.

 

 

1. This text is based on a presentation at the workshop “Designing people by numbers”, held at the Pontificia Universidad Católica in November 2017 with the participation of Celia Lury.

 

About the authors

Martín Tironi is Associate Professor at the School of Design of the Pontifical Catholic University of Chile. He holds a PhD from the Centre de Sociologie de l’Innovation (CSI), École des Mines de Paris, where he also carried out post-doctoral studies. He received his Master’s degree in Sociology from the Université Paris Sorbonne V and his BA in Sociology from the Pontifical Catholic University of Chile. He is currently (2018) a Visiting Fellow at the Centre for Invention and Social Process, Goldsmiths, University of London. [email: martin.tironi@uc.cl]

Matías Valderrama Barragán is a sociologist with a Master’s in Sociology from the Pontifical Catholic University of Chile. He is currently working on research projects about the digital transformation of organizations and the datafication of individuals and environments in Chile. [email: mbvalder@uc.cl]


[blog 1/3] Designing the city by numbers? Introduction: Hope for the data-driven city

This is the first of three blog posts of the series ‘Designing the city by numbers? Bottom-up initiatives for data-driven urbanism in Santiago de Chile’, by Martín Tironi and Matías Valderrama Barragán. Stay tuned: the next episode will appear next Friday, April 27!

The digital has invaded contemporary cities in Latin America, transforming the ways urban life is known, planned and governed. Parallel to the spread of sensors, networks and digital devices of all kinds in cities of the Global North, these technologies are increasingly becoming part of urban landscapes in the Global South under Smart City initiatives. As a result, vast quantities of digital data are produced in increasingly ubiquitous and invisible ways. “Datafication”, the growing translation of multiple phenomena into the format of computable data, has been proclaimed by various scholars in the North as propelling a revolution or large-scale epochal change in contemporary life (Mayer-Schönberger and Cukier, 2013; Kitchin, 2014), in which digital devices and data collection would allow better self-knowledge and “smarter” decision-making across varied domains.

To examine the impact of such hyped expectations and promises in Chile, we at the Smart Citizen Project have been studying different Smart City and data-driven initiatives, focusing on how the idea of designing the city by digital numbers has permeated local governments in Santiago de Chile. Public officials and urban planners are increasingly convinced that planning and governance will improve by quantifying urban variables and promoting decision-making that is not only guided or informed but driven by digital data, algorithms and automated analytics, instead of by prejudices, emotions or ideologies. In this “dataism” (van Dijck, 2014), it is believed that the data simply “speak for themselves”, in a fantasy of immediacy and neutrality.

But perhaps the most innovative aspect of the data-driven Smart City initiatives we have observed is the way they also promise to open a new era of experimentation and testing for citizen participation, amplifying notions like "urban laboratory", "living lab", "pilot project", "open innovation", and so on. Thanks to digital technologies, the assumption goes, a "democratization of policymaking" that might reduce the state's monopoly on government decision-making (Esty, 2004; Esty & Rushing, 2007) might at last be realized, producing a greater "symmetry" or "horizontalization" between governors and the governed (Goldsmith & Crawford, 2014). This, however, depends on citizens' willingness to function as sensors of their own cities, generating and "sharing" relevant, real-time geographic information about their behaviours and needs, which urban planners and public officials would then use in their decisions (Goldsmith & Crawford, 2014; Goodchild, 2007).

Our work at the Smart Citizen Project at the Pontifical Catholic University of Chile underscores the importance of not taking for granted any homogeneous or universal process of "datafication", and of problematizing how data-driven and smart governance are enacted, not without problems and breakdowns, in each location. This series of three blog posts therefore stresses that we must instead begin by considering how multiple quantification practices run at the same time, and how each can carry multiple purposes and meanings that can only be addressed in their heterogeneous contexts of materialization. Moreover, we explore how we are witnessing a growing diversity of what we call "digital quantification regimes" produced from the South, which aim to position themselves in the market above existing technologies of the North, and to win agreement that their data records are the most "participatory", "representative" or "accurate" bases for decision-making. We must therefore begin to explore the various suppositions, designs, political rationalities and scripts that these regimes establish in their diverse spheres of action under such growing "citizen"-driven data initiatives in the South. What kinds of practice-ontologies (Gabrys, 2016) might be produced through "citizen"-driven data initiatives? At the same time, we believe that the "experimental" and "citizen" grammar increasingly infused into Smart City and data-driven initiatives in the South must be critically examined, both in its actual development and in its forms of involvement. How is the experimental grammar of smart projects reconfiguring the idea of participation and government in urban space?

So stay tuned for the next posts in this series, for more on the RUBI urban bike tracker project and the KAPPO pro-cycling smartphone game in Santiago.

 

Cited works

Espeland, W. N., & Stevens, M. L. (2008). A sociology of quantification. European Journal of Sociology/Archives Européennes de Sociologie, 49(3), 401-436.

Esty, D. C., & Rushing, R. (2007). Governing by the Numbers: The Promise of Data-Driven Policymaking in the Information Age. Center for American Progress, 5, 21.

Gabrys, J. (2016). Citizen sensing: Recasting digital ontologies through proliferating practices. Theorizing the Contemporary, Cultural Anthropology website, March 24, 2016.

Goldsmith, S. & Crawford, S. (2014). The responsive city: engaging communities through data-smart governance. San Francisco, CA: Jossey-Bass, a Wiley Brand.

Goodchild, M. F. (2007). Citizens as sensors: The world of volunteered geography. GeoJournal, 69(4), 211-221.

Kitchin, R. (2014). The Data Revolution: Big Data, Open Data, Data Infrastructures and Their Consequences. London: Sage.

Mayer-Schönberger, V., & Cukier, K. (2013). Big Data: A Revolution That Will Transform How We Live, Work, and Think. New York: Houghton Mifflin Harcourt.

van Dijck, J. (2014). Datafication, dataism and dataveillance: Big Data between scientific paradigm and secular belief. Surveillance & Society, 12(2), 197-208.

 

About the authors

Martín Tironi is Associate Professor at the School of Design, Pontifical Catholic University of Chile. He holds a PhD from the Centre de Sociologie de l'Innovation (CSI), École des Mines de Paris, where he also completed post-doctoral studies. He received his Master's degree in Sociology from the Université Paris Sorbonne V and his BA in Sociology from the Pontifical Catholic University of Chile. He is currently a Visiting Fellow (2018) at the Centre for Invention and Social Process, Goldsmiths, University of London. [email: martin.tironi@uc.cl]

Matías Valderrama Barragán is a sociologist with a Master's in Sociology from the Pontifical Catholic University of Chile. He is currently working on research projects about the digital transformation of organizations and the datafication of individuals and environments in Chile. [email: mbvalder@uc.cl]

 


A data-situated agenda from the South: Linking movements and competencies

by Virginia Brussa

English abstract. In Latin America, within the government, science, education and software development sectors, various epistemic spaces related to data are emerging, dealing with issues such as openness, use, reuse, impact and privacy. But what do the agendas of each movement look like? Can we conceive of a situated agenda, and of practices common to our territory? If we go through the "content" of the most recent data-intensive meetings, conferences and laboratories, we can visualize a vast thematic agenda. But what about the potential impact of sharing a common fabric of skills and competencies? This blog post argues that we have to combine these distinct practices in order to think about the benefits of building bridges between movements and actors. And what about using the humanitarian data agenda to test a shared common framework based on a "circular fabric" of competencies? It would be a great step forward to imagine a future situated agenda free of the disciplinary silos within existing practices and actions.

 

Latin America seems to be driving a distinctive and challenging data agenda, fuelled by the maturing of gatherings belonging to the open data and open government movement. Evidence of this is, on the one hand, the community of actors who keep alive the debate, the critique and the challenges specific to each territorial setting, and on the other, the fact that open access, open science and open education are also timidly reshaping their spaces through regulations, data portals, networks and labs connected to the openness initiatives that originated in academia1. Added to this scenario, and functional to that agenda (though not always duly made visible), humanitarian and citizen innovation is also arriving in the region at varying speeds.

Merely as examples of the diversity of data-related activities, we can mention the first MSF Scientific Day in Latin America; the initiatives of Codeando México after the September earthquakes, where civic technology stressed the importance of using open data (while also rightly denouncing its absence in areas vital to an effective emergency response); and the relevance of identifying fake news during disasters. Festivals of citizen and open innovation have also taken place, reformulating the paradigms of international organizations by integrating proposals from citizen innovation laboratories, for example the labs for Peace, the open hardware and citizen science meetings, the Semana de la Evidencia (Week of Evidence), and urban intervention encounters. All of these reflect the mixture of actors, competencies and interests that could converge in assembling a data agenda in the regional space2.

Despite the eloquent scenario that emerges from so many efforts, the different speeds, funding streams and policies that shape the course of this "data agenda" at times give rise to "silos" or sub-agendas that resist the building of bridges between the underlying movements. This (under)connection leads not only to collaboration gaps, but also delays the possibility of deepening shared guidelines (facilitated by the definition of regional standards and policies) and of strengthening action through and with data (on the basis of relevant competencies). That said, we can agree that we are in the presence of a data-situated agenda that is fairly well developed at the thematic level, but whose component of competencies, spaces and actors has not yet managed to pool much energy.

Following Milan & van der Velden (2016), we could say that ties between regional movements such as those for innovation and openness could be one of the elements needed to strengthen citizens' competencies (digital, informational, data-related), as well as to establish advocacy processes within governments, educational institutions and civil society organizations. But is the practice of this advocacy and agenda-building actually "situated"? That is, are we really taking into account our own processes, technological sovereignty and narratives about data? In this respect, we might briefly ask whether, in order to arrive at this kind of data-situated agenda, we have considered retiring certain "isms". We can go deeper into the debate later, but in principle we can argue that when data-centrism3, datafication or technocentrism are put under review, the agenda is reformulated as well. Yet when we think about deepening it, the absence of joint frameworks for action and for the transfer of competencies slows down its critical momentum.

A circular fabric of competencies

Competencies, or the skills to work with data, are one of the elements that would foster a circular fabric facilitating conversation between movements, deepening the localization of the agenda and ultimately consolidating practices for both proactive and reactive data activism. One condition to bear in mind for guiding and illuminating the path could be the link between data standardization, promoted in the region by ILDA, and the identification of competencies, since "[s]tandardization processes force all organizations to think about what kind of data they need, how they collect it, how it is stored and, eventually, the processes through which it is used" (ILDA, 2016). From standardization, then, it is possible to derive in parallel those competencies, especially those referring to "data literacies", related to promoting forms of access, critical citizenship and equity.

As ILDA's assessment suggests, data standardization is bound up with action, and therefore, to go beyond mere discourse, it is necessary to democratize capabilities within the framework provided by standards. Practices of co-creating open educational resources, and of exchanging skills and knowledge through awareness-raising and "training", would enable the circular character provided by modes of collaboration between movements, thereby activating the proposed fabric.


Source: Brussa, V. (2017). First MSF Scientific Day Latin America. UNR

Within this data-situated agenda, the humanitarian topic may require particular attention as an emerging theme. Although it is an item that deserves closer analysis, it is one of the issues that, by its very nature, has not yet taken the regional form proposed in this short article. Nevertheless, its potential to bring humanitarian tools into the guidelines of the open and the collaborative, as presented in the image, cannot be denied. Thus the movements of political, humanitarian, citizen, open and academic innovation (all part of the data agenda) offer instruments that theoretically and empirically forge the regionalization-localization discussed here, insofar as existing common ties are recognized.

Some examples that directly or indirectly point to a future "circular" fabric of competencies, learning and experiences in Latin America are: the work of Médecins Sans Frontières on ethical frameworks for the use of sensitive data; the monitoring of the Sustainable Development Goals within the open government framework; the environmental indicators added to the Open Data Barometer; open access repositories and health lab notes; Open Trials; open public procurement initiatives and those related to open budgets and/or Follow the Money; data sprints on fake news; the IATI and HDX data standards; the Missing Maps project; the creation of open educational resources for migrant and refugee populations; and the management of social media and the standardization of hashtag use in emergencies. The convergence of these processes could strengthen citizens' capacities for the use and reuse of data. And why not, then, have many more of us coding during an emergency, mapping in OSM, reusing collaborative processes from other spheres, drawing on the methodologies and prototyping of the "labs", and making transparent and co-producing various inputs for civic hacking and/or new forms of citizen participation in a Latin American key.

* The examples cited represent only a small core of everything that could be listed. Accordingly, I am building a platform that attempts to systematize initiatives related to the topics noted here, in order to bring experiences together and enable other modes of analysis through selected variables.

Virginia Brussa is coordinator of the Argentine Open Education Network (AREA) and +Datalab. Together with MSF Argentina she co-organized the first MSF Scientific Day in Latin America. She is an SBTF alumna and participates in CIM, AAHD and Santalab, among others.

About the author. Internationalist, interested in collaborative practices, digital research methodologies and spaces of innovation. Co-organized the first Latin American MSF Scientific Day with MSF Argentina. Coordinator of the Open Education Network of Argentina. Works closely with citizen laboratories, open data and gender issues. Member of CIM and AAHD.

1. Recall the OpenCon Latam conferences in Mexico, OpenCon Santiago, the activities in the field of Digital Humanities in Mexico, Argentina, Colombia and Uruguay, the ECLAC (CEPAL) guides on research data management, etc.
2. It would be interesting to create a comprehensive, collaborative agenda of 2018 gatherings!
3. We can point to the references on the end of theory, on the data revolution as a naïve contribution to decision-making, or to the fundamentalist currents backing big data for the prediction of all kinds of situations.