
Bot-based collective blocklists in Twitter: the counterpublic moderation of harassment in a networked public space

Pages 787-803 | Received 08 Sep 2015, Accepted 08 Feb 2016, Published online: 21 Mar 2016
 

ABSTRACT

This article introduces and discusses bot-based collective blocklists (or blockbots) on Twitter, which have been developed by volunteers to combat harassment on the social networking site. Blockbots support the curation of a shared blocklist of accounts; subscribers to a blockbot will not receive any notifications or messages from accounts on that list. Blockbots support counterpublic communities, helping people moderate their own experiences of the site. This article provides an introduction and overview of blockbots and the issues they raise about networked publics and platform governance, extending intersecting literatures on online harassment, platform governance, and the politics of algorithms. Such projects involve a more reflective, intentional, transparent, collaborative, and decentralized way of using algorithmic systems to respond to issues of platform governance like harassment. I argue that blockbots are not just technical solutions but social ones as well, a notable exception to common technologically determinist solutions that push responsibility for issues like harassment onto the individual user. Beyond the case of Twitter, blockbots call our attention to collective, bottom-up modes of computationally assisted moderation that can be deployed by counterpublic groups who want to participate in networked publics where hegemonic and exclusionary practices are increasingly prevalent.
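For readers unfamiliar with the mechanism, the sketch below illustrates the basic pattern the abstract describes: a subscriber's client fetches a community-curated blocklist and applies it to the subscriber's own account through Twitter's block API. This is only a minimal illustration, not any particular blockbot's implementation; the blocklist URL, the credential placeholders, and the JSON list format are hypothetical, and the calls assume the third-party Tweepy library (v4.x) wrapping Twitter's v1.1 blocks endpoints.

```python
# Minimal sketch of the blockbot pattern: fetch a community-curated blocklist
# and apply it to the subscribing user's account. BLOCKLIST_URL, the credential
# placeholders, and the list format are hypothetical; Twitter calls assume the
# Tweepy library (v4.x) and Twitter's v1.1 blocks endpoints.
import requests
import tweepy

BLOCKLIST_URL = "https://example.org/shared-blocklist.json"  # hypothetical curated list of user IDs


def sync_blocklist(api: tweepy.API) -> None:
    """Block every account on the shared list that the subscriber has not already blocked."""
    shared_ids = set(requests.get(BLOCKLIST_URL, timeout=10).json())  # e.g. [1234, 5678, ...]
    already_blocked = set(api.get_blocked_ids())  # IDs this account currently blocks
    for user_id in shared_ids - already_blocked:
        # After this call, the subscriber no longer receives notifications
        # or messages from the blocked account.
        api.create_block(user_id=user_id)


if __name__ == "__main__":
    # Placeholder OAuth credentials for the subscribing user's account.
    auth = tweepy.OAuth1UserHandler(
        "CONSUMER_KEY", "CONSUMER_SECRET", "ACCESS_TOKEN", "ACCESS_TOKEN_SECRET"
    )
    sync_blocklist(tweepy.API(auth))
```

In practice, blockbot services typically perform such calls on behalf of subscribers who authorize them via OAuth, so the shared list can be applied and kept in sync without manual intervention by each subscriber.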

Acknowledgements

I would like to thank Jenna Burrell and Paul Duguid, who have helped supervise this work as part of my Ph.D., providing invaluable support, feedback, and mentorship. I am grateful to Nathan Matias for his generous work in helping me investigate and conceptualize this topic. I would also like to thank many other people for helping me research and revise this work, including: Aaron Halfaker, Amanda Menking, Amy Johnson, Ben Light, Gina Neff, Megan Finn, Nick Doty, Norah Abokhodair, Philip Howard, Randi Harper, Richmond Wong, Samuel Woolley, Whitney Phillips, the members of the UC-Berkeley School of Information seminar on Technology and Delegation, and the anonymous reviewers.

Disclosure statement

No potential conflict of interest was reported by the author.

Notes on contributor

R. S. Geiger is an ethnographer and post-doctoral scholar at the Berkeley Institute for Data Science at UC-Berkeley, where he studies the infrastructures and institutions that support the production of knowledge. His Ph.D. research at the UC-Berkeley School of Information focused on the governance and operation of user-generated content platforms. He has studied topics including newcomer socialization, moderation and quality control, specialization and professionalization, cooperation and conflict, the roles of support staff and technicians, and diversity and inclusion.

Additional information

Funding

This work was supported by a doctoral completion fellowship at UC-Berkeley and a pre-doctoral fellowship at the Center for Media, Data, and Society at the Central European University in Budapest, Hungary.
