
Making sense of algorithmic profiling: user perceptions on Facebook

Pages 809-825 | Received 04 Feb 2021, Accepted 24 Sep 2021, Published online: 20 Oct 2021

References

  • Adam, A. (2005). Delegating and distributing morality: Can we inscribe privacy protection in a machine? Ethics and Information Technology, 7(4), 233–242. https://doi.org/10.1007/s10676-006-0013-3
  • Brevini, B., & Pasquale, F. (2020). Revisiting the black box society by rethinking the political economy of big data. Big Data & Society, 7(2), 1–4. https://doi.org/10.1177/2053951720935146
  • Bucher, T. (2017). The algorithmic imaginary: Exploring the ordinary affects of Facebook algorithms. Information, Communication & Society, 20(1), 30–44. https://doi.org/10.1080/1369118X.2016.1154086
  • Bucher, T. (2018). If … then: Algorithmic power and politics. Oxford University Press.
  • Büchi, M., Fosch-Villaronga, E., Lutz, C., Tamò-Larrieux, A., Velidi, S., & Viljoen, S. (2020). The chilling effects of algorithmic profiling: Mapping the issues. Computer Law & Security Review, 36, 105367. https://doi.org/10.1016/j.clsr.2019.105367
  • Carretero Gomez, S., Vuorikari, R., & Punie, Y. (2017). DigComp 2.1: The digital competence framework for citizens with eight proficiency levels and examples of use (JRC106281). Publications Office of the European Union. https://doi.org/10.2760/00963
  • Christin, A. (2020). The ethnographer and the algorithm: Beyond the black box. Theory and Society, 1–22. https://doi.org/10.1007/s11186-020-09411-3
  • Cotter, K., & Reisdorf, B. C. (2020). Algorithmic knowledge gaps: A new dimension of (digital) inequality. International Journal of Communication, 14, 745–765. https://ijoc.org/index.php/ijoc/article/view/12450
  • De Graaf, M., Allouch, S. B., & Van Dijk, J. (2017, March 6–9). Why do they refuse to use my robot? Reasons for non-use derived from a long-term home study. In 2017 12th ACM/IEEE international conference on Human-Robot Interaction (HRI) (pp. 224–233). ACM.
  • Dencik, L., & Cable, J. (2017). The advent of surveillance realism: Public opinion and activist responses to the Snowden leaks. International Journal of Communication, 11, 763–781. https://ijoc.org/index.php/ijoc/article/view/5524
  • DeVito, M. A., Gergle, D., & Birnholtz, J. (2017, May). “Algorithms ruin everything”: #RIPTwitter, folk theories, and resistance to algorithmic change in social media. In Proceedings of the 2017 CHI conference on human factors in computing systems (pp. 3163–3174). ACM. https://doi.org/10.1145/3025453.3025659
  • Draper, N. A., & Turow, J. (2019). The corporate cultivation of digital resignation. New Media & Society, 21(8), 1824–1839. https://doi.org/10.1177/1461444819833331
  • Eslami, M., Karahalios, K., Sandvig, C., Vaccaro, K., Rickman, A., Hamilton, K., & Kirlik, A. (2016). First I “like” it, then I hide it: Folk theories of social feeds. In Proceedings of the 2016 CHI conference on human factors in computing systems (pp. 2371–2382). ACM. https://doi.org/10.1145/2858036.2858494
  • Eslami, M., Rickman, A., Vaccaro, K., Aleyasen, A., Vuong, A., Karahalios, K., Hamilton, K., & Sandvig, C. (2015, April, 18–23). I always assumed that I wasn’t really that close to [her]: Reasoning about invisible algorithms in news feeds. In CHI’15: Proceedings of the 33rd annual ACM conference on human factors in computing systems (pp. 153–162). ACM. https://doi.org/10.1145/2702123.2702556
  • European Commission’s Directorate-General for Communications Networks, Content and Technology (EC DG Communication Network, Content, and Technology). (2018). Algo:Aware. Raising awareness on algorithms. European Commission. https://platformobservatory.eu/app/uploads/2019/06/AlgoAware-State-of-the-Art-Report.pdf
  • Facebook. (2016). Facebook does not use your phone’s microphone for ads or news feed stories. https://about.fb.com/news/h/facebook-does-not-use-your-phones-microphone-for-ads-or-news-feed-stories/
  • Felzmann, H., Villaronga, E. F., Lutz, C., & Tamò-Larrieux, A. (2019). Transparency you can trust: Transparency requirements for artificial intelligence between legal norms and contextual concerns. Big Data & Society, 6(1), 1–14. https://doi.org/10.1177/2053951719860542
  • Festic, N. (2020). Same, same, but different! Qualitative evidence on how algorithmic selection applications govern different life domains. Regulation & Governance, 1–17. https://doi.org/10.1111/rego.12333
  • Fosch-Villaronga, E., Poulsen, A., Søraa, R. A., & Custers, B. H. M. (2021). A little bird told me your gender: Gender inferences in social media. Information Processing & Management, 58(3), 102541. https://doi.org/10.1016/j.ipm.2021.102541
  • Hargittai, E., & Marwick, A. (2016). “What can I really do?” Explaining the privacy paradox with online apathy. International Journal of Communication, 10, 3737–3757. https://ijoc.org/index.php/ijoc/article/view/4655
  • Hautea, S., Munasinghe, A., & Rader, E. (2020). ‘That’s not me’: Surprising algorithmic inferences. In Extended abstracts of the 2020 CHI conference on human factors in computing systems (pp. 1–7). ACM. https://doi.org/10.1145/3334480.3382816
  • Hitlin, P., & Rainie, L. (2019, January 16). Facebook algorithms and personal data. Pew Research Center: Internet & Technology Report. https://www.pewresearch.org/internet/2019/01/16/facebook-algorithms-and-personal-data/
  • Hoffmann, C. P., Lutz, C., & Ranzini, G. (2016). Privacy cynicism: A new approach to the privacy paradox. Cyberpsychology: Journal of Psychosocial Research on Cyberspace, 10(4). https://doi.org/10.5817/CP2016-4-7
  • Kennedy, H., Elgesem, D., & Miguel, C. (2017). On fairness: User perspectives on social media data mining. Convergence: The International Journal of Research into New Media Technologies, 23(3), 270–288. https://doi.org/10.1177/1354856515592507
  • Kitchin, R., & Lauriault, T. (2014). Towards critical data studies: Charting and unpacking data assemblages and their work. SSRN Electronic Journal. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2474112
  • Kolkman, D. (2020). The (in)credibility of algorithmic models to non-experts. Information, Communication & Society, 1–17. https://doi.org/10.1080/1369118X.2020.1761860
  • Lomborg, S., & Kapsch, P. H. (2020). Decoding algorithms. Media, Culture & Society, 42(5), 745–761. https://doi.org/10.1177/0163443719855301
  • Lupton, D. (2020). Thinking with care about personal data profiling: A more-than-human approach. International Journal of Communication, 14, 3165–3183. https://ijoc.org/index.php/ijoc/article/view/13540
  • Lusoli, W., Bacigalupo, M., & Lupianez, F. (2012). Pan-European survey of practices, attitudes and policy preferences as regards personal identity data management. http://is.jrc.ec.europa.eu/pages/TFS/documents/EIDSURVEY_Web_001.pdf
  • Lutz, C., Hoffmann, C. P., & Ranzini, G. (2020). Data capitalism and the user: An exploration of privacy cynicism in Germany. New Media & Society, 22(7), 1168–1187. https://doi.org/10.1177/1461444820912544
  • Madden, M., & Smith, A. (2010). Reputation management and social media: How people monitor their identity and search for others online. Pew Research Center. http://pewinternet.org/~/media/Files/Reports/2010/PIP_Reputation_Management.pdf
  • Nissenbaum, H. (2010). Privacy in context: Technology, policy, and the integrity of social life. Stanford University Press.
  • Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York University Press.
  • Palan, S., & Schitter, C. (2018). Prolific.ac – A subject pool for online experiments. Journal of Behavioral and Experimental Finance, 17, 22–27. https://doi.org/10.1016/j.jbef.2017.12.004
  • Pew. (2019). Social media fact sheet. Pew Research Center: Internet & Technology. https://www.pewresearch.org/internet/fact-sheet/social-media/
  • Powers, E. (2017). My news feed is filtered? Awareness of news personalization among college students. Digital Journalism, 5(10), 1315–1335. https://doi.org/10.1080/21670811.2017.1286943
  • Rader, E., Cotter, K., & Cho, J. (2018). Explanations as mechanisms for supporting algorithmic transparency. In Proceedings of the ACM 2018 CHI conference on human factors in computing systems (pp. 1–13). ACM. https://doi.org/10.1145/3173574.3173677
  • Ruckenstein, M., & Granroth, J. (2020). Algorithms, advertising and the intimacy of surveillance. Journal of Cultural Economy, 13(1), 12–24. https://doi.org/10.1080/17530350.2019.1574866
  • Sax, M. (2016). Big data: Finders keepers, losers weepers? Ethics and Information Technology, 18(1), 25–31. https://doi.org/10.1007/s10676-016-9394-0
  • Schwartz, S. A., & Mahnke, M. S. (2021). Facebook use as a communicative relation: Exploring the relation between Facebook users and the algorithmic news feed. Information, Communication & Society, 24(7), 1041–1056. https://doi.org/10.1080/1369118X.2020.1718179
  • Siles, I., Segura-Castillo, A., Solís, R., & Sancho, M. (2020). Folk theories of algorithmic recommendations on Spotify: Enacting data assemblages in the global South. Big Data & Society, 7(1), 1–15. https://doi.org/10.1177/2053951720923377
  • Statista. (2020a). Distribution of Facebook users in the United States as of November 2020, by gender. Statista.com. https://www.statista.com/statistics/266879/facebook-users-in-the-us-by-gender/
  • Statista. (2020b). Distribution of Facebook users in the United States as of November 2020, by age group and gender. Statista.com. https://www.statista.com/statistics/187041/us-user-age-distribution-on-facebook/
  • Statista. (2021). Median age of the resident population of the United States from 1960 to 2019. Statista.com. https://www.statista.com/statistics/241494/median-age-of-the-us-population/
  • Stoycheff, E., Liu, J., Xu, K., & Wibowo, K. (2019). Privacy and the Panopticon: Online mass surveillance’s deterrence and chilling effects. New Media & Society, 21(3), 602–619. https://doi.org/10.1177/1461444818801317
  • Sundar, S. S., & Kim, J. (2019). Machine heuristic: When we trust computers more than humans with our personal information. In Proceedings of the 2019 CHI conference on human factors in computing systems (paper no. 538). ACM. https://doi.org/10.1145/3290605.3300768
  • Sundar, S. S., & Marathe, S. S. (2010). Personalization versus customization: The importance of agency, privacy, and power usage. Human Communication Research, 36(3), 298–322. https://doi.org/10.1111/j.1468-2958.2010.01377.x
  • Swart, J. (2021). Experiencing algorithms: How young people understand, feel about, and engage with algorithmic news selection on social media. Social Media + Society, 7(2). https://doi.org/10.1177/20563051211008828
  • Turow, J., Feldman, L., & Meltzer, K. (2005). Open to exploitation: American shoppers, online and offline. http://www.annenbergpublicpolicycenter.org/Downloads/Information_And_Society/Turow_APPC_Report_WEB_FINAL.pdf
  • Turow, J., Hoofnagle, C. J., & King, J. (2009). Americans reject tailored advertising and three activities that enable it. http://www.ftc.gov/bcp/workshops/privacyroundtables/Turow.pdf
  • Van Dijck, J. (2014). Datafication, dataism and dataveillance: Big Data between scientific paradigm and ideology. Surveillance & Society, 12(2), 197–208. https://doi.org/10.24908/ss.v12i2.4776
  • Van Dijck, J., Poell, T., & De Waal, M. (2018). The platform society: Public values in a connective world. Oxford University Press.
  • Wachter, S. (2020). Affinity profiling and discrimination by association in online behavioural advertising. Berkeley Technology Law Journal, 35(2), 1–74. https://doi.org/10.15779/Z38JS9H82M
  • Wagner, A. R., Borenstein, J., & Howard, A. (2018). Overtrust in the robotic age. Communications of the ACM, 61(9), 22–24. https://doi.org/10.1145/3241365
  • West, S. M. (2019). Data capitalism: Redefining the logics of surveillance and privacy. Business & Society, 58(1), 20–41. https://doi.org/10.1177/0007650317718185
  • Ytre-Arne, B., & Moe, H. (2021). Folk theories of algorithms: Understanding digital irritation. Media, Culture & Society, 43(5), 807–824. https://doi.org/10.1177/0163443720972314
  • Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. Profile Books.
  • Zuckerman, E. (2021). Why study media ecosystems? Information, Communication & Society, 24(10), 1495–1513. https://doi.org/10.1080/1369118X.2021.1942513