
Bot-based collective blocklists in Twitter: the counterpublic moderation of harassment in a networked public space

Pages 787-803 | Received 08 Sep 2015, Accepted 08 Feb 2016, Published online: 21 Mar 2016
 

ABSTRACT

This article introduces and discusses bot-based collective blocklists (or blockbots) in Twitter, which have been developed by volunteers to combat harassment in the social networking site. Blockbots support the curation of a shared blocklist of accounts, where subscribers to a blockbot will not receive any notifications or messages from accounts on the blocklist. Blockbots support counterpublic communities, helping people moderate their own experiences of a site. This article provides an introduction and overview of blockbots and the issues that they raise about networked publics and platform governance, extending an intersecting literature on online harassment, platform governance, and the politics of algorithms. Such projects involve a more reflective, intentional, transparent, collaborative, and decentralized way of using algorithmic systems to respond to issues of platform governance like harassment. I argue that blockbots are not just technical solutions but social ones as well, a notable exception to common technologically determinist solutions that often push responsibility for issues like harassment to the individual user. Beyond the case of Twitter, blockbots call our attention to collective, bottom-up modes of computationally assisted moderation that can be deployed by counterpublic groups who want to participate in networked publics where hegemonic and exclusionary practices are increasingly prevalent.
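To make the blocking mechanism described above concrete, the following is a minimal, illustrative sketch of how a blockbot could apply a community-curated blocklist on behalf of one subscriber. It is not the code of any actual blockbot; it assumes Twitter's REST API v1.1 blocks/create endpoint, which tools of this kind relied on at the time of writing, and the blocklist URL, its JSON format, and all credentials are hypothetical placeholders.

```python
# Illustrative blockbot sketch: apply a shared blocklist to a subscriber's
# account via Twitter's REST API v1.1 "blocks/create" endpoint.
# The blocklist URL, JSON format, and credentials below are placeholders.

import requests
from requests_oauthlib import OAuth1

BLOCKS_CREATE_URL = "https://api.twitter.com/1.1/blocks/create.json"
SHARED_BLOCKLIST_URL = "https://example.org/shared-blocklist.json"  # hypothetical


def load_shared_blocklist(url: str) -> list[str]:
    """Fetch the community-curated list of user IDs to block."""
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    return response.json()["user_ids"]  # assumed JSON shape


def apply_blocklist(subscriber_auth: OAuth1, user_ids: list[str]) -> None:
    """Block each listed account on behalf of a single subscriber."""
    for user_id in user_ids:
        resp = requests.post(
            BLOCKS_CREATE_URL,
            params={"user_id": user_id, "skip_status": "true"},
            auth=subscriber_auth,
            timeout=10,
        )
        if resp.status_code != 200:
            # A real blockbot would also handle rate limits, retries, and
            # accounts that are already blocked or protected.
            print(f"Could not block {user_id}: HTTP {resp.status_code}")


if __name__ == "__main__":
    # Per-subscriber OAuth tokens, obtained when the user authorizes the app.
    auth = OAuth1(
        "CONSUMER_KEY", "CONSUMER_SECRET",        # placeholder app credentials
        "SUBSCRIBER_TOKEN", "SUBSCRIBER_SECRET",  # placeholder user tokens
    )
    apply_blocklist(auth, load_shared_blocklist(SHARED_BLOCKLIST_URL))
```

In this sketch the shared blocklist is curated collectively and fetched from a central location, while the blocks themselves are applied account by account for each subscriber, which mirrors the decentralized, opt-in subscription model the article describes.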

Acknowledgements

I would like to thank Jenna Burrell and Paul Duguid, who have helped supervise this work as part of my Ph.D., providing invaluable support, feedback, and mentorship. I am grateful to Nathan Matias for his generous work in helping me investigate and conceptualize this topic. I would also like to thank many other people for helping me research and revise this work, including: Aaron Halfaker, Amanda Menking, Amy Johnson, Ben Light, Gina Neff, Megan Finn, Nick Doty, Norah Abokhodair, Philip Howard, Randi Harper, Richmond Wong, Samuel Woolley, Whitney Phillips, the members of the UC-Berkeley School of Information seminar on Technology and Delegation, and the anonymous reviewers.

Disclosure statement

No potential conflict of interest was reported by the author.

Notes on contributor

R. S. Geiger is an ethnographer and post-doctoral scholar at the Berkeley Institute for Data Science at UC-Berkeley, where he studies the infrastructures and institutions that support the production of knowledge. His Ph.D. research at the UC-Berkeley School of Information focused on the governance and operation of user-generated content platforms. He has studied topics including newcomer socialization, moderation and quality control, specialization and professionalization, cooperation and conflict, the roles of support staff and technicians, and diversity and inclusion.

Additional information

Funding

This work was supported by a doctoral completion fellowship at UC-Berkeley and a pre-doctoral fellowship at the Center for Media, Data, and Society at the Central European University in Budapest, Hungary.

