Abstract
Approaches to content moderation online draw from models used to manage behavior in the offline world—undesirable content and behaviors are identified, and sanctions are issued as a means of deterrence. Recent discussions in both offline and online contexts have emphasized the limits of a sanction-based approach and highlighted the gains that would flow from building self-regulatory models within which users are encouraged to take personal responsibility for following rules. Our concern is whether a procedural justice model—a model that has increasingly been adopted in offline legal settings—can be used for content moderation online and, furthermore, whether the benefits of this self-regulatory model persist in an online setting where algorithms play a more central role. We review recent studies demonstrating that it is possible to promote self-governance by having platforms employ enforcement procedures that users experience as procedurally just. The challenge of such procedures is that at least some of their features—having voice, receiving an explanation, being treated with respect—appear to be in conflict with the reliance on algorithms common to many online platforms. This review of the literature suggests that there is not necessarily an inherent conflict between the use of algorithms and the user experience of procedural justice. Drawing upon findings from recent empirical work in this space, we argue that the necessary antecedents for procedural justice can be built into the algorithmic decision making used in platforms’ content moderation efforts. Doing so, however, requires a nuanced understanding of how algorithms are viewed—both positively and negatively—in building trust during these decision-making processes.
Disclosure statement
No potential conflict of interest was reported by the author(s).
Additional information
Notes on contributors
Matthew Katsaros
Matthew Katsaros is the Director of the Social Media Governance Initiative at the Justice Collaboratory at Yale Law School. His research explores ways in which the design and architecture of platforms can support prosocial outcomes online.
Jisu Kim
Jisu Kim is an Assistant Professor at the Singapore Institute of Technology, where she teaches in the Digital Communications and Interactive Media program. Her research focuses on how digital technology influences audience engagement, journalism practices, and the business model of news organizations, with an emphasis on computational methods.
Tom Tyler
Tom Tyler is the Macklin Fleming Professor of Law and Professor of Psychology at Yale University. His research concerns authority dynamics in groups, organizations, communities, and societies. He has studied these issues in legal, political, and managerial settings. His books include Why People Obey the Law.