Author: Stefania

Advisory Board Workshop, July 4-5

On July 4-5, DATACTIVE gathered the Advisory Board members for a sharing & feedback workshop.

Participants included Anita Say Chan (University of Illinois, Urbana-Champaign), Chris Csikszentmihályi (Madeira Interactive Technologies Institute), Ronald Deibert (University of Toronto), Seda Gürses (KU Leuven), Evelyn Ruppert (Goldsmiths, University of London), Nishant Shah (ArtEZ) … and the DATACTIVE team. Day 1 only: Hisham Al-Miraat (Justice & Peace Netherlands), Julia Hoffmann (Hivos).

*Day 1*
Fireside chat from 4pm @ Terre Lente, followed by light dinner [CLOSED]
Public event at 8pm @ SPUI25: ‘Democracy Under Siege: Digital Espionage and Civil Society Resistance’, with Ronald Deibert (Citizen Lab – University of Toronto), Seda Gürses (COSIC/ESAT – KU Leuven), and Nishant Shah (ArtEZ / Leuphana University)

*Day 2* @ the University Library, Singel 425 [BY INVITATION ONLY]
9.30 Welcome & coffee
9.45 Intro (Stefania)
Session 1 (10-11am): The framework: Concepts and Infrastructure (Presenters: Stefania & Davide)
Session 2 (11.10-12.15): Data as stakes (Presenters: Becky & Niels)
Lunch break (12.15-1.30)
Session 3 (1.30-2.35): Data as tools (Presenters: Guillen & Kersti)
Session 4 (2.40-3.45): Next in line: Emerging work (Presenters: Fabien, Jeroen, Quynn)
Session 5 (4-4.30) Wrap-up (Stefania, all)

 

Stefania at the Future Forum of IG Metall, Berlin

On June 19, Stefania will deliver a talk at the Zukunftsforum (‘Future Forum’) of IG Metall, the dominant metalworkers’ union in Germany and Europe’s largest industrial union. Stefania has been asked to reflect on how digitalisation and datafication change the dynamics of solidarity today. Check out the program of the day.

AGENDA Future Forum IG Metall 19.6.2018_final_eng

[BigDataSur] blog 3/3. Good Data: Challenging the Colonial Politics of Knowledge

In the previous instalments of this three-part series on the possibilities of “good data” (here and here), Anna Carlson concluded that notions of decentralisation and autonomy could not do without an understanding of global inequalities and their politics. Moving away from Northern, individualist visions of digital utopia, she now considers what can be learned from Indigenous data sovereignty initiatives as they address, from the South, the colonial legacies of global knowledge production.

The gathering of data has long been a strategy of colonialism. It is part of a set of practices designed to “standardise and simplify the indigenous ‘social hieroglyph into a legible and administratively more convenient format’ (Scott 1999, 3)” (Smith 2016, 120). Making something legible is always already political, and it always begs the question: legible for what, and to whom?

In Australia, making legible meant enumerating Indigenous “‘peoples’ into ‘populations’ (Taylor 2009); their domestic arrangements and wellbeing […] constrained within quantitative datasets and indicators that reflected colonial preoccupations and values” (Smith 2016, 120). No matter how ‘big’ the data set, data is never neutral. As Smith points out, Indigenous self-determination is increasingly linked to “the need to also reassert Indigenous peoples’ control and interpretation into the colonial data archives, and to produce alternative sources of data that are fit for their contemporary purposes” (Smith 2016, 120). Data is knowledge, and knowledge is power.

No matter how it is collected, data has long been used in ways that reinforce and sustain this colonial status quo. Data – often quantitative, often raw numbers – is used strategically. As Maggie Walter argues in a chapter in Indigenous Data Sovereignty: Towards an Agenda (2016): “social and population statistics are better understood as human artefacts, imbued with meaning. And in their current configurations, the meanings reflected in statistics are primarily drawn from the dominant social norms, values and racial hierarchy of the society in which they are created” (Walter 2016, 79).

Diane E. Smith expands in the same volume:

“Data constitute a point-in-time intervention into a flow of information or behaviour – an attempt to inject certainty and meaning into uncertainty. As such, data can be useful for generalising from a particular sample to a wider population […] testing hypotheses […] choosing between options (etc). […] However, when derived from ethnocentric criteria and definitions, data can also impose erroneous causal connections and simplify social complexity, thereby freezing what may be fluid formations in the real world. In their unadorned quantitative form, data are hard-pressed to cope with social and cultural intangibles.” (2016, 119-120)

As such, argues Smith, questions about data governance – “who has the power and authority to make rules and decisions about the design, interpretation, validation, ownership, access to and use of data” – are increasingly emerging as key “sites of contestation” between Indigenous communities and the state (Smith 2016, 119).

One of the more interesting responses to these data challenges comes from the movements for Indigenous Data Sovereignty. C. Matthew Snipp (2016) outlines three basic preconditions for data decolonisation: “that Indigenous peoples have the power to determine who should be counted among them; that data must reflect the interests and priorities of Indigenous peoples; and that tribal communities must not only dictate the content of data collected about them, but also have the power to determine who has access to these data.” In practice, then, it means Indigenous communities having meaningful say over how information about them is collected, stored, used and managed. In Indigenous Data Sovereignty: Towards an Agenda, the editors have brought together a set of interesting case studies reflecting on different examples of Indigenous data sovereignty. These case studies are by no means exhaustive, but they do point to a set of emerging protocols around relationality and control that may prove instructive in ongoing attempts to imagine good data practices in the future.

The First Nations Information Governance Centre (FNIGC)

FNIGC in Canada is an explicitly political response to the role of knowledge production in maintaining colonial power relationships. The FNIGC articulates a set of principles that should be respected when gathering data about First Nations communities. These principles – “ownership, control, access and possession” (OCAP) – are used by the FNIGC as part of a certification process for research projects, surveys and other data-gathering mechanisms. In some ways, they operate as a “right of reply”, pointing to the inadequacies or successes of data-gathering practices. They aim, at least in part, to address the heavily skewed power relationship that continues to exist between Indigenous communities and the researchers who study them.

“By Maori for Maori” healthcare

This initiative attempts to incorporate Maori knowledge protocols into primary health care provision. In this case, Maori knowledge protocols are incorporated (to some extent) into data collection, analysis and reporting tools which are used by primary health care services (Maori run) in developing knowledge about and responses to health issues in Maori communities. These protocols are also used to develop and enable processes for data sharing across related networks – for example, with housing providers.

Yawuru data sovereignty

The Yawuru, traditional owners of the area around Broome in Western Australia, recognised that gaining control of the information that existed about them (by virtue of the extensive native title data gathering process) was crucial to maintaining their sovereignty. They also recognised the value of a data archive that was produced by and for Yawuru peoples. They undertook their own data collection processes, including a “survey of all Indigenous people and dwellings in the town to create a unit-record baseline.” They sought to create an instrument to “measure local understandings of Yawuru wellbeing (mabu liyan).” They digitally mapped their lands in order to “map places of cultural, social and environmental significance to inform a cultural and environmental management plan.” Finally, they sought to incorporate Yawuru knowledge protocols into the development of a “documentation project […] to collate and store all relevant legal records, historical information, genealogies and cultural information” in line with Yawuru law.

Beyond these discrete examples, data sovereignty is exercised in a variety of other forms. Much Indigenous knowledge exists entirely outside the colonial state, and is held and protected in line with law. Many of the knowledge keepers in Australia retain this data sovereignty, despite the increasing pressure to make such information public in order to access legal recognition. This speaks to precisely the dilemma of contemporary data politics, as “opting out” of the systems that collect our data is increasingly difficult in a world where giving up our data rights is too often a precondition to accessing the things that make life liveable.

“error 404. social life not found” at TEDxYouth AICS, June 4

On June 4, Stefania will give a TEDx talk at the TEDx Youth event of the Amsterdam International Community School. Entitled ‘error 404. social life not found. (but you can take it back)’, Stefania’s talk will contribute to this year’s theme ‘Next Nature’: what is the next nature of the human experience as we enter the technological age of big data, consumerism and automation? Read more on the TED website. You can also download the presentation.

[BigDataSur] blog 2/3: Digital Utopianism and Decentralisation: Is Decentralised Data Good Data?

In this second of three blog posts on the challenges of imagining ‘good data’ globally (read the first episode here), Anna Carlson considers a particular strand of technology-driven utopianism in the Global North: the idea of radically decentralised data. She writes that its latest instantiations – exemplified by blockchain – tend to forget the dire and unequally distributed ecological impact of digital technologies.

I was born in 1989, just a handful of years before mobile phones and laptop computers became the ubiquitous symbols of urban late modernity.  It is also the year that the terms “cybersecurity” and “hypertext markup language” (HTML) first appeared in print. In the heady, early years of the World Wide Web, the internet and the data it produced (and contained and distributed) offered a utopian vision of equitable globalisation, the promise to equalise access to knowledge, the possibility of a new kind of commons. Early adopters across the political spectrum celebrated the possibilities of decentralisation and autonomy, the opportunities for community-building and collectivity, for sharing economies outside the control of the state.

These notions of the common good mostly originate in the Global North, and their promises have never quite been fulfilled. They continue to re-emerge, however, and provide interesting food for thought in the process of imagining good (or at least, better) data practices for the future.

Blockchain is probably the most prominent technology to revive utopias of radical decentralisation. At its most basic, a blockchain is a massive peer-to-peer, decentralised technology that allows digital information to be seen and shared but not copied. Described by some as the “backbone of a new type of internet,” blockchain is basically a very complex, constantly updating, decentralised titling system. A blockchain is like a database, tied to the object or site of interest, constantly cross-checking and updating.

The technology originally emerged as a way of keeping track of cryptocurrencies like Bitcoin, and proponents Don and Alex Tapscott describe it as “an incorruptible digital ledger of economic transactions that can be programmed to record not just financial transactions but virtually everything of value.” The blockchain is inherently decentralised in that it cannot be controlled by any single entity, but requires constant validation and authentication from the network.
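The core mechanism behind this tamper-resistance is simpler than it sounds: each block stores a cryptographic hash of the block before it, so rewriting any past entry invalidates every later link in the chain. A minimal sketch in Python illustrates the idea (illustrative only; real systems like Bitcoin add proof-of-work, peer-to-peer consensus and Merkle trees on top of this):

```python
import hashlib
import json

def block_hash(block):
    # Deterministic SHA-256 hash of a block's full contents
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain, data):
    # Each new block commits to the hash of its predecessor
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": prev})

def is_valid(chain):
    # Re-derive every link: altering one past block breaks all later ones
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
add_block(chain, "genesis")
add_block(chain, "Alice pays Bob 5")
add_block(chain, "Bob pays Carol 2")
print(is_valid(chain))                   # True
chain[1]["data"] = "Alice pays Bob 500"  # tamper with history
print(is_valid(chain))                   # False
```

The point is that no single node’s copy is authoritative: anyone holding the chain can verify its integrity from the hashes alone, which is what makes the ledger “incorruptible” without a central gatekeeper.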

Blockchain is often represented optimistically, if vaguely, as capable of supporting new economies, free from the shackles of corporate control. Melanie Swan, in Blockchain: Blueprint for a New Economy, goes even further: “the potential benefits of the blockchain are more than just economic – they extend into political, humanitarian, social, and scientific domains […] For example, to counter repressive political regimes, blockchain technology can be used to enact in a decentralised cloud functions that previously needed administration… (e.g.) for organisations like WikiLeaks (where national governments prevented credit card processors from accepting donations in the sensitive Edward Snowden situation).” She goes on to describe the possibility of blockchain technology as a basis for new economies dominated by (ahem) platforms like Uber and AirBnB.

So far, so good, right? The problem is that blockchains are incredibly unwieldy and immensely energy inefficient. To use the currency bitcoin as an example, the energy required for a single transaction far exceeds the amount needed for a more traditional transfer.  This is at least in part because the system is premised on hyper-individualistic, libertarian ideals.  In a recent article on Motherboard, Christopher Malmo writes that, with prices at their current level, “it would be profitable for Bitcoin miners to burn through over 24 terawatt-hours of electricity annually as they compete to solve increasingly difficult cryptographic puzzles to “mine” more Bitcoins. That’s about as much as Nigeria, a country of 186 million people, uses in a year.” At least in part, this energy is required because, as Alex de Vries suggests: “Blockchain is inefficient tech by design, as we create trust by building a system based on distrust. If you only trust yourself and a set of rules (the software), then you have to validate everything that happens against these rules yourself. That is the life of a blockchain node.”
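The energy cost de Vries describes comes from proof-of-work: miners race to find a nonce that gives a block’s hash a required number of leading zeros, and brute force is the only strategy. A toy sketch (with hypothetical parameters, nowhere near Bitcoin’s real difficulty) shows how each extra zero in the target multiplies the expected number of hash attempts – and hence the electricity burned – by sixteen:

```python
import hashlib
from itertools import count

def mine(data, difficulty):
    """Brute-force a nonce so that SHA-256(data:nonce) starts with
    `difficulty` hex zeros; expected attempts grow as 16**difficulty."""
    target = "0" * difficulty
    for nonce in count():
        digest = hashlib.sha256(f"{data}:{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest

# Each step up in difficulty multiplies the expected work by 16
for d in range(1, 5):
    nonce, digest = mine("block payload", d)
    print(f"difficulty {d}: {nonce + 1} attempts, hash {digest[:12]}...")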

An interesting example of the tensions of decentralised cryptocurrencies emerged recently when The New Inquiry released their newest project, Bail Bloc. Bail Bloc is a “cryptocurrency scheme against bail,” that delivers much-needed funds to the excellent Bronx Freedom Fund in order to help provide bail fees for defendants. They outline its function as follows: “When you download the app, a small part of your computer’s unused processing power is redirected toward mining a popular cryptocurrency called Monero, which is secure, private, and untraceable. At the end of every month, we exchange the Monero for US dollars and donate the earnings to the Bronx Freedom Fund.”

On the surface, taking “unused” resources and redistributing them equitably sounds like a good idea. Unfortunately, the missing information here is that the processing power is not really “unused” – what they mean is that it’s unused by you, and that using it for this cause won’t inconvenience your personal use.

But there is a slippage here that seems important to clarify.  Mining Monero requires a huge amount of electricity – much the same as the Bitcoin example above. As a result, this charity structure requires that we burn huge amounts of energy to produce a fairly minimal value, which is then redistributed. Like many such examples, it would be much easier and simpler to simply donate the money directly to the Bronx Freedom Fund.  The flip side, of course, is that electricity doesn’t feel like a scarce resource in the way that money does, so generating money out of electricity feels like generating money out of nowhere.

It is easy to see what the New Inquiry designers are appealing to: the desire to effect positive change without having to really do anything at all. And it seemed to work: in the first 24 hours of the initiative’s launch, I saw numerous friends and colleagues sharing the page with suggestions that we establish similar schemes for causes closer to home.

The sense of practically and elegantly re-distributing otherwise unused resources through the magic of technology might be appealing, but it should not come at the expense of a broader understanding of global inequalities and planetary sustainability. Decentralised technology cannot sustain “good data” if the values that are encoded in it do not account for our shared stake in the world.

The question, then, is not whether decentralised data is good data, but under what terms and conditions – and what politics of the collective (or commons) goes into shaping decentralised technology.

((to be continued. Next episode will be online on June 8th, 2018))

 

 

[BigDataSur] blog 1/3: Imagining ‘Good’ Data: Northern Utopias, Southern Lessons

by Anna Carlson

What might ‘good data’ look like? Where to look for models, past and emerging? In this series of three blog posts, Anna Carlson highlights that we need to understand data issues as part of a politics of planetary sustainability and a live history of colonial knowledge production. In this first instalment, she introduces her own search for guiding principles in an age of ubiquitous data extraction and often dubious utopianism.           

I’m sitting by Cataract Gorge in Launceston, northern Tasmania. I’ve just climbed a couple of hundred metres through bushland to a relatively secluded look-out. It feels a long way from the city below, despite the fact that I can still hear distant traffic noise and human chatter. I pull out my laptop, perhaps by instinct. At around the same moment, a tiny native lizard dashes from the undergrowth and hovers uncertainly by my bare foot. I think briefly about pulling out my cracked and battered (second-hand) iPhone 4 to archive this moment, perhaps uploading it to Instagram with the hashtag #humansoflatecapitalism and a witty comment. Instead, I start writing.

I have been thinking a lot lately about the politics and ethics of data and digital technologies. My brief scramble up the hill was spent ruminating on the particular question of what “good data” might look like. I know what not-so-good data looks like. Already today I’ve generated a wealth of it. I paid online for a hostel bunk, receiving almost immediate cross-marketing from AirBnB and Hostelworld through social media sites as well as through Google. I logged into my Couchsurfing account, and immediately received a barrage of new “couch requests” (based, I presume, on an algorithm that lets potential couch surfers know when their prospective hosts log in). I’ve turned location services on my phone on, and used Google Maps to navigate a new city. I’ve searched for information about art galleries and hiking trails. I used a plant identification site to find out what tree I was looking at. Data, it seems, is the “digital air that I breathe.”

Writing in the Guardian, journalist Paul Lewis interviews tech consultant and author Nir Eyal, who claims that “the technologies we use have turned into compulsions, if not full-fledged addictions. […] It’s the impulse to check a message notification. It’s the pull to visit YouTube, Facebook, or Twitter for just a few minutes, only to find yourself still tapping and scrolling an hour later.” And this addictive quality is powerful: digital technologies are credited with altering everything from election results to consumer behaviour and our ability to empathise. Indeed, there’s money to be made from a digitally-addicted populace. Encompassing everything from social media platforms to wearable devices, smart cities and the Internet of Things, almost every action we take in the world produces data of some form, and this data represents value for the corporations, governments and marketers who buy it.

This is the phenomenon commonly referred to as Big Data, which describes sets of data so big that they must be analysed computationally. Usually stored in digital form, this data encompasses everything from consumer trends to live emotional insights, and it is routinely gathered by most companies with an online presence.

The not-goodness of this data isn’t intrinsic, however. There is nothing inherently wrong with creating knowledge about the activities we undertake online. Rather, the not-goodness is a characteristic of the murky processes through which data is gathered, consolidated, analysed, sold-on and redistributed. It’s to do with the fact that few of us really know what data is being gathered about us, and even fewer know what that data will be used for. And it’s to do with the lineages that have structured processes of mass data collection, as well as their unequally distributed impacts.

Many of us know that the technologies on which we rely have dark underbellies; that the convenience and comfort of our digital lives is contingent on these intersecting forms of violence. But we live in a world where access to these technologies increasingly operates as a precondition to entering the workforce, to social life, to connection and leisure and knowledge. More and more workers (even in the global north) are experiencing precarity, worklessness and insecurity; experiences that are often enabled by digital technologies and which, doubly cruelly, often render us further reliant on them.

The ubiquity of the digital realm provokes new ethical conundrums. The technologies on which we are increasingly reliant are themselves reliant on exploitative and often oppressive labour regimes. They are responsible for vast ecological footprints. Data is often represented as immaterial, ‘virtual,’ and yet its impact on environments across the world is pushing us ever closer to global ecological disaster. Further, these violent environmental and labour relations are unequally distributed: the negative impacts of the digital age are disproportionately focused on communities in the Global South, while the wealth generated is largely trapped in a few Northern hands.

Gathering data means producing knowledge within a particular set of parameters. In the new and emerging conversations around Big Data and its impact on our social worlds, much focus is placed on the scale of it, its bigness, the sheer possibility of having so much information at our fingertips. It is tempting to think of this as a new phenomenon – as an unprecedented moment brought about by new technologies. But as technologist Genevieve Bell reminds us, the “logics of our digital world – fast, smart and connected – have histories that precede their digital turns.” Every new technological advance carries its legacies, and the colonial legacy is one that does not receive enough attention.

So, when we imagine what “good data” and good tech might look like now, we need to contend with the ethical quagmire of tech in its global and historical dimensions. To illustrate this point, I examine the limits of contemporary digital utopianism (exemplified by blockchain) as envisioned in the Global North (Episode 2), before delving into the principles guiding “good data” from the point of view of Indigenous communities (Episode 3).

Acknowledgments: These blogposts have been produced as part of the Good Data project (@Good__Data), an interdisciplinary research initiative funded by Queensland University of Technology Faculty of Law, which is located in unceded Meanjin, Turrbal and Jagera Land (also known as Brisbane, Australia). The project examines ‘good’ and ‘ethical’ data practices with a view to developing policy recommendations and software design standards for programs and services that embody good data practices, in order to start conceptualising and implementing a more positive and ethical vision of the digital society and economy. In late 2018, an open access edited book entitled Good Data, comprising contributions from authors from different disciplines located in different parts of the world, will be published by the Amsterdam University of Applied Sciences Institute of Network Cultures.

NOW OUT! Special issue on ‘data activism’ of Krisis: Journal for Contemporary Philosophy

DATACTIVE is proud to announce the publication of the special issue on ‘data activism’ of Krisis: Journal for Contemporary Philosophy. Edited by Stefania Milan and Lonneke van der Velden, the special issue features six articles by Jonathan Gray, Helen Kennedy, Lina Dencik, Stefan Baack, Miren Gutierrez, and Leah Horgan and Paul Dourish; an interview with Boris Groys by Thijs Lijster; and three book reviews. The journal is open access; you can read and download the articles from http://krisis.eu.

Issue 1, 2018: Data Activism
Digital data increasingly plays a central role in contemporary politics and public life. Citizen voices are increasingly mediated by proprietary social media platforms and are shaped by algorithmic ranking and re-ordering, but data informs how states act, too. This special issue wants to shift the focus of the conversation. Non-governmental organizations, hackers, and activists of all kinds provide a myriad of ‘alternative’ interventions, interpretations, and imaginaries of what data stands for and what can be done with it.

Jonathan Gray starts off this special issue by suggesting how data can be involved in providing horizons of intelligibility and organising social and political life. Helen Kennedy’s contribution advocates for a focus on emotions and everyday lived experiences with data. Lina Dencik puts forward the notion of ‘surveillance realism’ to explore the pervasiveness of contemporary surveillance and the emergence of alternative imaginaries. Stefan Baack investigates how data are used to facilitate civic engagement. Miren Gutiérrez explores how activists can make use of data infrastructures such as databases, servers, and algorithms. Finally, Leah Horgan and Paul Dourish critically engage with the notion of data activism by looking at everyday data work in a local administration. Further, this issue features an interview with Boris Groys by Thijs Lijster, whose work Über das Neue celebrated its 25th anniversary last year. Lastly, three book reviews illuminate key aspects of datafication. Patricia de Vries reviews Metahaven’s Black Transparency; Niels van Doorn writes on Platform Capitalism by Nick Srnicek; and Jan Overwijk comments on The Entrepreneurial Self by Ulrich Bröckling.

[blog 1/3] Designing the city by numbers? Introduction: Hope for the data-driven city

This is the first of three blog posts of the series ‘Designing the city by numbers? Bottom-up initiatives for data-driven urbanism in Santiago de Chile’, by Martín Tironi and Matías Valderrama Barragán. Stay tuned: the next episode will appear next Friday, April 27!

The digital has invaded contemporary cities in Latin America, transforming ways of knowing, planning and governing urban life. Parallel to the spread of sensors, networks and digital devices of all kinds in cities in the Global North, such technologies are increasingly becoming part of urban landscapes in cities in the Global South under Smart City initiatives. Because of this, vast quantities of digital data are produced in increasingly ubiquitous and invisible ways. The “datafication”, or the growing translation of multiple phenomena into the format of computable data, has been proclaimed by various scholars in the North as propelling a revolution or large-scale epochal change in contemporary life (Mayer-Schönberger and Cukier, 2013; Kitchin, 2014), in which digital devices and data collection would allow better self-knowledge and “smarter” decision-making across varied domains.

To examine the impacts of such hyped expectations and promises in Chile, we at the Smart Citizen Project have been studying different cases of Smart City and data-driven initiatives, focusing on how the idea of designing the city by digital numbers has permeated local governments in Santiago de Chile. Public officials and urban planners are increasingly convinced that planning and governance will be better by quantifying urban variables and promoting decision-making not only guided or informed but driven by digital data, algorithms and automated analytics – instead of prejudices, emotions or ideologies. In this “dataism” (van Dijck, 2014), it is believed that the data simply “speak for themselves” in a fantasy of immediacy and neutrality.

But perhaps the most innovative part of the data-driven Smart City initiatives we’ve observed is the means by which they also promise to open a new era of experimentation and testing for citizen participation, amplifying notions like ‘urban laboratory,’ ‘living lab,’ ‘pilot projects,’ ‘open innovation,’ and so on. Thanks to digital technologies, the assumption goes, a “democratization of policymaking” that might reduce the state’s monopoly on government decision-making (Esty, 2004; Esty & Rushing, 2007) might at last be realized, producing a greater “symmetry” or “horizontalization” between governors and the governed (Crawford & Goldsmith, 2014). This, however, depends on citizens’ willingness to function as sensors of their own cities, generating and “sharing” relevant and real-time geographic information about their behaviours and needs, which would be used by urban planners and public officials for their decisions (Goldsmith & Crawford, 2014; Goodchild, 2007).

Our work from the Smart Citizen Project at the Pontifical Catholic University of Chile underscores the importance of not taking as given any homogeneous or universal “datafication” process, and of problematising how data-driven and smart governance are enacted – not without problems and breakdowns – in each location. This series of three blog posts therefore stresses that we must start by considering how multiple quantification practices run at the same time, each with multiple purposes and meanings that can only be addressed on the basis of their heterogeneous contexts of materialisation. Moreover, we explore how we are witnessing an increasing diversity of what we call “digital quantification regimes” produced from the South, which aim to position themselves above existing technologies of the North in the market and to establish their data records as the most “participatory”, “representative” or “accurate” bases for decision-making. We must therefore begin to explore the various suppositions, designs, political rationalities and scripts that these regimes establish in their diverse spheres of action under such growing “citizen”-driven data initiatives in the South. What kinds of practice-ontologies (Gabrys, 2016) might be produced through “citizen”-driven data initiatives? At the same time, we believe that the “experimental” and “citizen” grammar increasingly infused into Smart City and data-driven initiatives in the South must be critically examined, both in its actual development and in its forms of involvement. How is the experimental grammar of smart projects reconfiguring the idea of participation and government in urban space?

So stay tuned for the next posts in this series for more on the RUBI urban bike tracker project and the KAPPO pro-cycling smartphone game in Santiago.

 

Cited works

Espeland, W. N., & Stevens, M. L. (2008). A sociology of quantification. European Journal of Sociology/Archives Européennes de Sociologie, 49(3), 401-436.

Esty, D. C. & Rushing, R. (2007). Governing by the Numbers: The Promise of Data-Driven Policymaking in the Information Age. Center for American Progress, 5, 21.

Gabrys, J. (2016). Citizen Sensing: Recasting Digital Ontologies through Proliferating Practices. Theorizing the Contemporary, Cultural Anthropology website, March 24, 2016.

Goldsmith, S. & Crawford, S. (2014). The responsive city: engaging communities through data-smart governance. San Francisco, CA: Jossey-Bass.

Goodchild, M. F. (2007). Citizens as sensors: The world of volunteered geography. GeoJournal, 69(4), 211-221.

Kitchin, R. (2014). The Data Revolution: Big Data, Open Data, Data Infrastructures and Their Consequences. London: Sage.

Mayer-Schönberger, V. and Cukier, K. (2013). Big Data: A revolution that will transform how we live, work, and think. New York: Houghton Mifflin Harcourt.

van Dijck, J. (2014). Datafication, dataism and dataveillance: Big Data between scientific paradigm and secular belief. Surveillance & Society, 12(2), 197-208.

 

About the authors

Martín Tironi is Associate Professor at the School of Design, Pontifical Catholic University of Chile. He holds a PhD from the Centre de Sociologie de l’Innovation (CSI), École des Mines de Paris, where he also completed post-doctoral studies. He received his Master’s degree in Sociology from the Université Paris Sorbonne V and his BA in Sociology from the Pontifical Catholic University of Chile. He is currently a Visiting Fellow (2018) at the Centre for Invention and Social Process, Goldsmiths, University of London. [email: martin.tironi@uc.cl]

Matías Valderrama Barragán is a sociologist with a Master’s in Sociology from the Pontifical Catholic University of Chile. He is currently working on research projects about the digital transformation of organisations and the datafication of individuals and environments in Chile. [email: mbvalder@uc.cl]