Original Article

The “Arbiters of What Our Voters See”: Facebook and Google’s Struggle with Policy, Process, and Enforcement around Political Advertising

 

Abstract

The question of how Facebook and Google make and justify decisions regarding permissible political advertising on their platforms is increasingly important. In this paper, we focus on the U.S. case and present findings from interviews with 17 former social media firm employees (n = 7) and political practitioners (n = 11). We also analyze emails (n = 45) exchanged between Facebook government and elections staffers and two campaigns, a U.S. gubernatorial campaign (2017) and a presidential campaign (2016), regarding the platform’s policies in the context of paid speech. In addressing questions about Facebook’s and Google’s processes and policies regarding paid political content, the rationales for them, and the ability of campaigns to contest decisions, this study shows how, while Facebook and Google resist being arbiters of political discourse, they actively vet paid content on their platforms. These platforms differ with respect to how and what decisions they make in the context of paid speech, and within each company there are active and ongoing debates among staffers about speech. These debates at times take place in consultation with political practitioners and often occur in the context of external events. Across these firms, policies regarding speech evolve through these internal debates, appeals by practitioners, and outside pressure. At the same time, both Facebook and Google make decisions in often opaque ways, according to policies that are not transparent, and without clear justifications to campaigns or the public as to how they are applied or enforced. This limits the options political practitioners have to contest regulation decisions. Finally, we conclude by arguing for the need for expanded capacities for political practitioners and the public to exercise voice around the content decisions that these firms make, and for firms to create more robust institutional mechanisms for incorporating that voice.

Acknowledgement

The authors wish to thank Bridget Barrett for her comments on an earlier draft of the study.

Disclosure statement

No potential conflict of interest was reported by the authors.

Notes

1. For Google, we focus here on Google’s ad network and advertising policies, not YouTube or its other properties. The Google Ads Policies can be found here: https://support.google.com/adspolicy/answer/6008942?visit_id=636856675945721223-1772300979&rd=1 (accessed 2/13/19).

2. Since the 2016 U.S. presidential election, both Facebook and Google have implemented verification processes to ensure the authenticity of those buying political advertisements. For Google, accounts purchasing political ads must be verified through the FEC or a tax ID (see https://support.google.com/adspolicy/answer/9002729?hl=en&ref_topic=1316596, accessed 2/13/19). Facebook requires verification through a postal address to run ads related to a list of political topics that Facebook provides (see https://www.facebook.com/business/help/208949576550051, accessed 2/13/19).

3. This includes Facebook’s political ads archive, launched in June 2018, and a new political ads verification process, launched in April 2018.

4. For Facebook, these advertising policies apply to Instagram, Facebook Messenger, and the company’s broader audience network.

5. Google, Targeting Your Ads. Available online at: https://support.google.com/google-ads/answer/1704368?hl=en (accessed 2/11/19).

6. Investigations by ProPublica have found, however, that these tools can be used to target users whose “interests” include white nationalism or to exclude users of minority racial identities. See https://www.propublica.org/article/facebook-advertising-discrimination-housing-race-sex-national-origin.

7. Google Ads Policies. Available online at: https://support.google.com/adspolicy/answer/6008942?visit_id=1-636682172203706782-1677536111&rd=1 (accessed 2/13/19).

8. Google, Inappropriate Content. Available online at: https://support.google.com/adspolicy/answer/6015406?hl=en (accessed 10/12/18).

9. https://www.facebook.com/communitystandards (accessed 2/12/19). For a review of the update of its community standards and appeals process in April 2018 see: https://www.npr.org/2018/04/24/605107093/facebook-updates-community-standards-expands-appeals-process.

10. In the wake of revelations that Russian actors created fake pages and accounts to post political content about the 2016 U.S. presidential election, Facebook created an authorization process that applies to anyone wanting to post ads that relate to candidates running for office, political parties, PACs, or advocacy for specific electoral outcomes; relate to elections, referenda, or ballot initiatives (including get-out-the-vote ads); are regulated as political advertising; or relate to any “national legislative issue of public importance.” In the U.S., these issues are broad and far-ranging, including abortion, the budget, crime, the economy, guns, energy, infrastructure, health, taxes, and values. We have heard from our informants, and there are accounts in the press of this as well, that Facebook’s issue-related ad policy has been applied to all sorts of non-political ads. Ads relating to these issues must, per Facebook’s terms, include a paid-for-by disclaimer that appears to users targeted with the post. However, Facebook does not verify that these paid-for-by disclaimers are accurate. After the requirement took effect, news organizations showed that nearly anything can be entered into the self-regulated paid-for-by field, and ads with false disclaimers were approved to run on the platform. Facebook recently announced that it is rolling out these stricter rules in countries with upcoming elections, including India, Nigeria, Ukraine, and the EU. At the time of this writing, the loophole in disclaimer requirements in the U.S. had not been closed, and it is unclear what impact the varying rules across countries have had on attempts to undermine disclosure requirements.

11. The Google elections team, which is made up of account managers and executives organized under “sales,” does not make decisions about which ads should be approved and which should not. Former staffers were at pains to draw distinctions between the elections team, as a sales team, and the ad policy team.

14. Facebook Terms of Service. Available online at: https://www.facebook.com/terms.php.

15. Facebook puts this in the category of regulating content based not on what an actor is saying but, in essence, on how they are saying it. This is similar to how Facebook regulates pages based on behavior (such as taking down pages during the midterms that evidenced spamming behavior or used fake accounts).

16. It appears this policy change received public attention only months after this email; it cannot be independently verified when the change actually occurred.

17. Facebook, Addressing Hoaxes and Fake News. Available online at: https://newsroom.fb.com/news/2016/12/news-feed-fyi-addressing-hoaxes-and-fake-news/.

18. Facebook, Addressing Hoaxes and Fake News. Available online at: https://newsroom.fb.com/news/2016/12/news-feed-fyi-addressing-hoaxes-and-fake-news/. Going further, in 2018 Facebook began taking action based on content rather than just behavior (such as behaving in ways similar to spam profiles), no longer permitting accounts that produced content fact checkers deemed false to run advertising.

19. Facebook updated its community standards and expanded its appeals process in April 2018, see: https://www.npr.org/2018/04/24/605107093/facebook-updates-community-standards-expands-appeals-process.

20. See “The Impossible Job: Inside Facebook’s Struggle to Moderate Two Billion People.” Motherboard. 2018, August 23. Available online at: https://motherboard.vice.com/en_us/article/xwk9zd/how-facebook-content-moderation-works.

21. See “Charting a Course for an Oversight Board for Content Decisions,” available online at: https://newsroom.fb.com/news/2019/01/oversight-board/ (accessed 2/12/19).

Additional information

Notes on contributors

Daniel Kreiss

Daniel Kreiss is Associate Professor in the School of Media and Journalism at the University of North Carolina at Chapel Hill.

Shannon C. McGregor

Shannon C. McGregor is an Assistant Professor in the Department of Journalism at the University of Utah.
