
Splintering and centralizing platform governance: how Facebook adapted its content moderation practices to the political and legal contexts in the United States, Germany, and South Korea

Pages 2843-2862 | Received 29 Aug 2021, Accepted 08 Aug 2022, Published online: 06 Sep 2022
 

ABSTRACT

The proliferation of hate speech and disinformation on social media has prompted democratic countries around the world to discuss adequate regulations to limit the power exerted by platforms over national politics. As a result, the once ostensibly uniform content moderation practices of social media companies are becoming increasingly territorialized, and the governance of online political speech is constantly negotiated between global social media platforms and national governments. To comprehend the evolving landscape of online political speech governance, this paper scrutinizes how Facebook has adapted its content moderation practices to the political and legal contexts of three democratic nations: the United States, Germany, and South Korea. We assessed national laws and governmental documents to explain the regulatory landscapes of the three countries, and used VPNs and corporate PR materials to see how Facebook’s platform design and public communication diverge by location. The findings suggest that the seemingly ‘splintering’ regulatory frameworks still have a ‘centralizing’ effect: Facebook formally complies with national laws, but its platform interface and communication activities steer users away from the local systems and towards its centralized operations. We discuss future implications for the regulation of online political speech in democratic nations.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Notes

1 The others include data and privacy, technology and innovation, safety and expression, combating misinformation, economic opportunity, and strengthening communities.

2 Despite the lack of an overarching federal statute, such instances can be covered by the Federal Election Campaign Act (FECA), the Foreign Agents Registration Act (FARA), and the Homeland Security Act (Garrett et al., Citation2020).

3 The widely quoted 24-hour deadline applies only to ‘manifestly unlawful content’; however, the Act neither defines this term nor indicates how to recognize such content.

4 Bundesverfassungsgericht (2021), 1 BvR 1073/20, Rn. 1-53, http://www.bverfg.de/e/rk20211219_1bvr107320.html

5 Bundesgerichtshof (2021), III ZR 179/20 and III ZR 192/20, https://www.bundesgerichtshof.de/SharedDocs/Pressemitteilungen/DE/2021/2021149.html

6 For example, a partial evaluation of the application of NetzDG by Liesching et al. (Citation2021) warns about overblocking, while an analysis of two million tweets by Andres and Slivko (Citation2021) does not find any evidence for this practice.

7 In 2021, the German government passed the Gesetz zur Änderung des Netzwerkdurchsetzungsgesetzes (Law to amend NetzDG), which specifies that the NetzDG reporting mechanism must be ‘intuitively usable and located directly next to the content in question.’ Moreover, social media companies shall not ‘discourage’ people from using the mechanism with ‘unnecessary’ warnings or legal jargon. Facebook has since included the option to report content under NetzDG in its flagging tool. When users click on this option, they are redirected to a simplified NetzDG reporting form, in which making a legal classification of the content in question is optional rather than required. Nonetheless, reporting content under the Community Standards remains the most visible and easily accessible option. Reporting content under NetzDG is listed as a third and final option under ‘what else you can do,’ after the options to block or unfollow the author of the post.

8 In April 2022, EU Member States reached a political agreement with the European Parliament on the European Commission’s proposal for the Digital Services Act, which includes EU-wide due diligence obligations for ‘very large online platforms’ that resemble obligations in Germany’s NetzDG, e.g., to provide a mechanism for users to flag what they consider to be illegal content under Union/national law and to report on their content moderation decisions (European Commission, Citation2022b). Once the Act enters into force, platforms will have to adapt their content moderation practices in Germany and the other European Member States to these new, EU-wide rules.

9 The amendments introduced in NetzDGÄndG require platform companies to include data relating to the enforcement of both NetzDG and internal platform policies in their legally mandated transparency reports. Similar provisions are included in the European Commission’s proposal for a Digital Services Act.

Additional information

Notes on contributors

Soyun Ahn

Soyun Ahn is a doctoral candidate at the Annenberg School for Communication and Journalism at the University of Southern California, where she is conducting her dissertation research examining platforms and their societal implications from a comparative and critical perspective. She earned her MA in Journalism and Mass Communication from the University of Wisconsin-Madison and holds a law degree from Yonsei University in South Korea.

Jeeyun (Sophia) Baik

Jeeyun (Sophia) Baik, Ph.D. is an assistant professor in the Department of Communication at the University of San Diego (USD). She studies the politics of communication technology policy across stakeholders and borders. Her research has covered issues including privacy, surveillance, and data-oriented business models, looking at the impacts on marginalized communities in particular.

Clara Sol Krause

Clara Sol Krause is a digital policy assistant at the European Commission's Directorate General for Communications Networks, Content and Technology. She holds a Double Master in Global Media and Communications from the London School of Economics and Political Science and the Annenberg School for Communication and Journalism at the University of Southern California. Her research explores the sociotechnical dynamics of online speech regulation and alternative approaches to platform governance in the US and the EU. The information and views presented in this article are those of the author and cannot under any circumstances be regarded as the views of the European Commission or its services.
