Niels ten Oever gave a talk on the subversion of equality and freedom of users in the Internet architecture at the Privacy and Sustainable Computing Lab of the Vienna University of Economics and Business. The talk built on a mixed-methods analysis of the Internet architecture and its technical governance, and showed how the Internet protocol community structurally fails to uphold the values it professes, such as the end-to-end principle, permissionless innovation, and openness. This opened up a discussion of whether such governance institutions can then be expected to safeguard external principles, such as public values, which might run counter to the interests of many of the represented stakeholders. It sparked a lively exchange among the attending academics and policy makers about value frameworks, and about analytical tools and approaches that traverse the fields of technology, policy making, law, and academia.
The municipality of Amsterdam published its Digital Agenda, in which it presents its ambitions to become a free, inclusive, and creative digital city. Amsterdam is grounding its ambitions in the early experiences with networking technology in the Netherlands, after which the agenda was named: The Digital City (De Digitale Stad). At DATACTIVE, Niels ten Oever thoroughly analyzed the agenda and commended the city for its ambition to base the digital city of the future on digital rights. The approach of the city is anchored in concepts such as privacy, inclusivity, transparency, sovereignty, autonomy, universal access, openness, participation, and security, which are to be developed in a ‘Cities Coalition for Digital Rights’ and through interviews and co-creation sessions.
While the ambition is very laudable, the agenda contains few if any references to existing legal, ethical, or technical standards on which this work could build. This could lead to a duplication of efforts, and a repetition of (expensive) mistakes that have already been made. This is why Jeroen de Vos represented DATACTIVE at the meeting of the Amsterdam Municipal Commission on Art, Diversity, and Democratization and offered suggestions that could assist the successful design, implementation, and evaluation of the Digital Agenda. The full text of the contribution can be read below. In the contribution DATACTIVE reiterates the importance, in the implementation of technical infrastructures, of human rights in general, and of the United Nations Guiding Principles on Business and Human Rights, the international standard for state and corporate accountability, in particular.
We hope and trust that our analysis of the report, which we shared with the municipality, will be our first contribution to the construction of a public information infrastructure in Amsterdam.
The oral contribution as read by Jeroen de Vos (translated from the original Dutch):
Dear members of the Council Committee on Art, Diversity and Democratisation,
I speak here on behalf of DATACTIVE, a research group at the University of Amsterdam that studies the social and democratic consequences of data flows. We therefore took note with great appreciation of the municipality’s ambition to develop a digital strategy in which human rights occupy an important place. With this, the municipality places itself at the centre of a debate about the technological infrastructure of our society.
We hope that the municipality will not try to reinvent the wheel: many digital freedom declarations, human rights treaties, and technical standards have already been developed in this area that the municipality can readily adopt. The digital agenda says nothing about this. That would be a missed opportunity.
We warmly recommend that the municipality map out which treaties, implementation frameworks, and technical standards it will build upon. This can be done as a supplement to the agenda and taken along in its execution. It will also benefit the implementation of the agenda, because there will then be clear indicators and evaluation criteria, making it plain what the digital agenda will mean for the people of Amsterdam.
One of the important frameworks that already exist is the United Nations Guiding Principles on Business and Human Rights. Worldwide, among both governments and companies, these are the gold standard for the implementation of digital rights. It would be good if they were drawn upon in the execution of the Digital City Agenda.
If the municipality does not do so, it will lack knowledge of best practices, lag behind in its implementation from the start, and duplicate work and repeat mistakes that have already been made.
We are of course willing to contribute our thinking, and to help position Amsterdam at the forefront of the discussion on infrastructural digital human rights.
Fresh from the DATACTIVE press: Kersti Wissenbach introduces the concept of ‘acting within’ to contemporary media practice approaches at the intersection of communication and social movement studies. She argues for the need to move away from pre-assigned indications of exclusion, and builds on the de-westernisation discourse in communication scholarship in her effort to provide a framework that allows the roots of power to surface in diverse country and intra-country contexts. The article examines the need for an explicit conceptualisation of communication in the field of social movement research in order to grasp power dynamics within transnational civic tech activism communities. Civic tech activism is an instance of organised collective action that acts on institutionally regulated governance processes through the crafting of technologies and tactics supporting citizens’ direct political participation.
This theoretical discussion forms the foundation of Kersti’s research into transnational data activist collectives, nurtured by her background in critical development, post-colonialism, and communication for social change.
How to cite it:
Wissenbach, K. R. (2019). Accounting for power in transnational civic tech activism: A communication-based analytical framework for media practice. International Communication Gazette. https://doi.org/10.1177/1748048519832779
DATACTIVE PI Stefania Milan will participate in the event “We, creators of AI” at Science Park on March 14. The event is organized by the University of Amsterdam, in the frame of ERC=Science², an EU-funded campaign promoting research funded by the European Research Council. Read more.
Happy to announce the publication of a stellar special issue of the journal ‘Policy & Internet’, dedicated to ‘Internet Architecture & Human Rights’. The special issue, edited by DATACTIVE PI Stefania Milan with Monika Zalnieriute (Faculty of Law, UNSW Sydney, Australia), features articles by a number of key authors in the Internet governance field, such as Laura DeNardis and Samantha Bradshaw, Milton L. Mueller and Farzaneh Badiei, Nicolas Suzor and colleagues, Ben Wagner, and our very own Niels ten Oever. Read it online!
Zalnieriute and Milan (2019). “Internet Architecture and Human Rights: Beyond the Human Rights Gap”, Policy & Internet, 11(1): 6-15, https://doi.org/10.1002/poi3.200
Internet architecture and infrastructure are generally not at the top of the concerns of end users, and the overlying logical arrangements of root services, domain names, and protocols remain largely invisible to its users. Recent developments, however—including massive user data leakages, hacks targeting social networking service providers, and behavioral micro‐targeting—have turned a spotlight on Internet governance defined broadly, and its relationship with civil liberties and human rights. The articles in this special issue examine the policymaking role of influential private intermediaries and private actors such as ICANN in enacting global governance via Internet architecture, exploring the implications of such a mode of governance for human rights. They consider: to what extent are human rights standards mediated and set via technical infrastructure, such as the DNS and platform policies, rather than by governmental structures? What are the implications of governance via Internet architecture for individual human rights? And what frameworks—be they legal, technological or policy‐related—are needed to address the contemporary privatization of human rights online, in order to ensure the effective protection of human rights in the digital age?
ten Oever (2019). “Productive Contestation, Civil Society, and Global Governance: Human Rights as a Boundary Object in ICANN”, Policy & Internet, 11(1): 37-60, https://doi.org/10.1002/poi3.172
Human rights have long been discussed in relation to global governance processes, but there has been disagreement about whether (and how) a consideration for human rights should be incorporated into the workings of the Internet Corporation for Assigned Names and Numbers (ICANN), one of the main bodies of Internet governance. Internet governance is generally regarded as a site of innovation in global governance; one in which civil society can, in theory, contribute equally with government and industry. This article uses the lens of boundary object theory to examine how civil society actors succeeded in inscribing human rights as a Core Value in ICANN’s bylaws. As a “boundary object” in the negotiations, the concept of human rights provided enough interpretive flexibility to translate to the social realities of the various stakeholder groups, including government and industry. This consensus‐building process was bound by the organizing structure of the boundary object (human rights), and its ability to accommodate the interests of the different parties. The presence of civil society at the negotiating table demanded a shift in strategy from the usual “outsider” tactics of issue framing and agenda setting, to a more complex and iterative process of “productive contestation,” a consensus‐building process fueled by the differences of experience and interests of parties, bound together by the organizing structure of the boundary object. This article describes how this process ultimately resulted in the successful adoption of human rights in ICANN’s bylaws.
Cite as Zalnieriute and Milan (eds.) (2019). Special issue ‘Internet Architecture & Human Rights’, Policy & Internet, 11(1)
On February 20-22, DATACTIVE PI Stefania Milan is in Milan, Italy, for the MilanoDesign PhD Festival at the Politecnico di Milano. In particular, she will sit in the PhD defences of Maria Briones de los Angeles and Camilo Ayala Garcia. On Friday afternoon, she will also deliver a talk as part of the “design pills” program. Check out the program of the event.
We are very excited that “Disclose to tell. A data design framework for alternative narratives” by Maria Briones de los Angeles features DATACTIVE as well. What’s more, this amazing dissertation contributes to understanding data activism, in particular when it comes to data visualisation and its role in the creation of empowering narratives for social change. Visit alternative-narratives-vis-archive.com to learn more. And congrats to Maria Briones de los Angeles (@angelesbriones)!
On February 21, Becky Kazansky will be responding to Jamie Susskind during an event for the presentation of his book Future Politics. Living Together in a World Transformed by Tech. During the evening, the author will present his insights on how digital technology is transforming, and will further transform, our society and political system. Becky will comment on Susskind’s book based on her experience as a researcher on topics related to technology and social justice.
21 February 2019, 8pm @SPUI25
Lonneke van der Velden, Guillén Torres, Becky Kazansky, Kersti Wissenbach, and Stefania Milan have co-authored a new chapter in the newly published Good Data book, edited by Angela Daly, S. Kate Devitt and Monique Mann.
‘Big data’ is a hyped buzzword – or rather, it has been for a while, before being supplanted by ‘newer’ acclaimed concepts such as artificial intelligence. The popularity of the term says something about the widespread fascination with the seemingly infinite possibilities of automatized data collection and analysis. This enchantment affects the corporate sector, where many technology companies have centered their business model on data mining, and governments, whose intelligence agencies have adopted sophisticated machinery to monitor citizens. Many civil society organizations, too, are increasingly trying to take advantage of the opportunities brought about by datafication, using data to improve society. From crowdsourced maps about gender-based violence (‘feminicide’) in Latin America, to the analysis of audio-visual footage to map drone attacks in conflict zones, individuals and groups regularly produce, collect, process and repurpose data to fuel research for the social good. Problematizing the mainstream connotations of big data, these examples of ‘data activism’ take a critical stance towards massive data collection and represent the new frontier of citizens’ engagement with information and technological innovation.
In this chapter we survey diverse experiences and methodologies of what we call ‘data-activist research’ – an approach to research that combines embeddedness in the social world with the research methods typical of academia and the innovative repertoires of data activists. We argue that such an approach to knowledge production fosters community building and knowledge sharing, while providing a way to fruitfully interrogate datafication and democratic participation. By exploring what we can learn from data-activist projects and investigating the conditions for collaboration between activist communities and academia, we aim to lay the groundwork for a data-activist research agenda whose dynamics are socially responsible and empowering for all the parties involved.
On February 7, 2019 the Internet Policy Review published an op-ed by Stefania Milan and Claudio Agosti. We reflect on personalization algorithms and elections, and share some ideas about algorithmic sovereignty and literacy. Thanks to Frédéric Dubois for the invitation.
“Personalisation algorithms allow platforms to carefully target web content to the tastes and interests of their users. They are at the core of social media platforms, dating apps, shopping and news sites. They make us see the world as we want to see it. By forging a specific reality for each user, they silently and subtly shape customised “information diets”, including around our voting preferences. We still remember Facebook’s CEO Mark Zuckerberg testifying before the US Congress (in April 2018) about the many vulnerabilities of his platform during election campaigns. With the elections for the European Parliament scheduled for May 2019, it is about time to look at our information diets and take seriously the role of platforms in shaping our worldviews. But how? Personalisation algorithms are kept a closely guarded secret by social media platform companies. The few experiments auditing these algorithms rely on data provided by platform companies themselves. Researchers are sometimes subject to legal challenges by social media companies who accuse them of violating the Terms of Services of their utility. As we speak, technological fencing-offs are emerging as the newest challenge to third-party accountability. Generally, auditing algorithms fail to involve ordinary users, missing out on a crucial opportunity for awareness raising and behavioural change.
The Algorithms Exposed (ALEX) project, funded by a Proof of Concept grant of the European Research Council, intervenes in this space by promoting an approach to algorithms auditing that empowers and educates users. ALEX stabilises and expands the functionalities of a browser extension – fbtrex – an original idea of lead developer Claudio Agosti. Analysing the outcomes of Facebook’s news feed algorithm, our software enables users to monitor their own social media consumption, and to volunteer their data for scientific or advocacy projects of their choosing. It also empowers advanced users, including researchers and journalists, to produce sophisticated investigations of algorithmic biases. Taking Facebook and the forthcoming EU elections as a test case, ALEX unmasks the functioning of personalisation algorithms on social media platforms.”
Continue reading on the website of the Internet Policy Review.