Research Article

Contesting personalized recommender systems: a cross-country analysis of user preferences

Received 23 Aug 2023, Accepted 14 May 2024, Published online: 03 Jul 2024
