Research Articles

Online Content Moderation: Does Justice Need a Human Face?

Matthew Katsaros, Jisu Kim & Tom Tyler
Pages 66-77 | Received 01 Jul 2022, Accepted 02 May 2023, Published online: 22 May 2023

Abstract

Approaches to content moderation online draw from models used to manage behavior in the offline world: undesirable content and behaviors are identified, and sanctions are issued as a means of deterrence. Recent discussions in both offline and online contexts have emphasized the limits of a sanction-based approach and highlighted the gains that would flow from building self-regulatory models within which users are encouraged to take personal responsibility for following rules. Our concern is whether a procedural justice model, one that has increasingly been adopted in offline legal settings, can be used for content moderation online, and whether the benefits of this self-regulatory model persist in an online setting where algorithms play a more central role. We review recent studies demonstrating that it is possible to promote self-governance by having platforms employ enforcement procedures that users experience as procedurally just. The challenge is that at least some features of such procedures—having voice, receiving an explanation, being treated with respect—appear to conflict with the reliance on algorithms common to many online platforms. This review of the literature suggests that there is not necessarily an inherent conflict between the use of algorithms and the user experience of procedural justice. Drawing upon findings from recent empirical work in this space, we argue that the necessary antecedents for procedural justice can be built into the algorithmic decision making used in platforms' content moderation efforts. Doing so, however, requires a nuanced understanding of how algorithms are viewed—both positively and negatively—in building trust during these decision making processes.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Notes on contributors

Matthew Katsaros

Matthew Katsaros is the Director of the Social Media Governance Initiative at the Justice Collaboratory at Yale Law School. His research explores ways in which the design and architecture of platforms can support prosocial outcomes online.

Jisu Kim

Jisu Kim is an Assistant Professor at the Singapore Institute of Technology, where she teaches in the Digital Communications and Interactive Media program. Her research focuses on how digital technology influences audience engagement, journalism practices, and the business model of news organizations, with an emphasis on computational methods.

Tom Tyler

Tom Tyler is the Macklin Fleming Professor of Law and Professor of Psychology at Yale University. His research concerns authority dynamics in groups, organizations, communities, and societies. He has studied these issues in legal, political, and managerial settings. His books include Why People Obey the Law.
