Essay

Can Algorithm Knowledge Stop Women from Being Targeted by Algorithm Bias? The New Digital Divide on Weibo


References

  • Abubakar, F. M., & Ahmad, H. B. (2013). The moderating effect of technology awareness on the relationship between UTAUT constructs and behavioural intention to use technology. Australian Journal of Business and Management Research, 3(2), 14–23. https://doi.org/10.52283/NSWRCA.AJBMR.20130302A02
  • Barnidge, M., & Xenos, M. A. (2021). Social media news deserts: Digital inequalities and incidental news exposure on social media platforms. New Media & Society. Advance online publication. https://doi.org/10.1177/14614448211059529
  • Blank, G., & Lutz, C. (2018). Benefits and harms from Internet use: A differentiated analysis of Great Britain. New Media & Society, 20(2), 618–640. https://doi.org/10.1177/1461444816667135
  • Bucher, T. (2015). Neither black nor box: Ways of knowing algorithms. In S. Kubitschko & A. Kaun (Eds.), Innovative methods in media and communication research (pp. 81–98). Palgrave Macmillan. https://doi.org/10.1007/978-3-319-40700-5_5
  • Bucher, T. (2017). The algorithmic imaginary: Exploring the ordinary affects of Facebook algorithms. Information, Communication & Society, 20(1), 30–44. https://doi.org/10.1080/1369118X.2016.1154086
  • Büchi, M., Fosch-Villaronga, E., Lutz, C., Tamò-Larrieux, A., & Velidi, S. (2021). Making sense of algorithmic profiling: User perceptions on Facebook. Information, Communication & Society, 26(4), 1–17. https://doi.org/10.1080/1369118X.2021.1989011
  • Burrell, J. (2016). How the machine ‘thinks’: Understanding opacity in machine learning algorithms. Big Data & Society, 3(1), 1–12. https://doi.org/10.1177/2053951715622512
  • Cotter, K. (2021). “Shadowbanning is not a thing”: Black box gaslighting and the power to independently know and credibly critique algorithms. Information, Communication & Society, 26(6), 1–18. https://doi.org/10.1080/1369118X.2021.1994624
  • Cotter, K., & Reisdorf, B. C. (2020). Algorithmic knowledge gaps: A new horizon of (digital) inequality. International Journal of Communication, 14, 21. https://ijoc.org/index.php/ijoc/article/view/12450
  • Courtois, C., & Timmermans, E. (2018). Cracking the Tinder code. Journal of Computer-Mediated Communication, 23(1), 1–16. https://doi.org/10.1093/jcmc/zmx001
  • Eubanks, V. (2018). Automating inequality: How high-tech tools profile, police, and punish the poor. St Martin’s Press.
  • Favaretto, M., De Clercq, E., & Elger, B. S. (2019). Big Data and discrimination: Perils, promises and solutions. A systematic review. Journal of Big Data, 6(1), 12. https://doi.org/10.1186/s40537-019-0177-4
  • Fosch-Villaronga, E., Poulsen, A., Søraa, R. A., & Custers, B. H. M. (2021). A little bird told me your gender: Gender inferences in social media. Information Processing & Management, 58(3), 102541. https://doi.org/10.1016/j.ipm.2021.102541
  • Gal, U., Jensen, T. B., & Stein, M.-K. (2017). People analytics in the age of big data: An agenda for IS research. ICIS 2017: Transforming Society with Digital Innovation. Proceedings of the 38th International Conference on Information Systems, Seoul, South Korea.
  • Gillespie, T. (2013). The relevance of algorithms. In T. Gillespie, P. Boczkowski, & K. Foot (Eds.), Media technologies: Essays on communication, materiality, and society (pp. 167–194). MIT Press.
  • Gran, A.-B., Booth, P., & Bucher, T. (2020). To be or not to be algorithm aware: A question of a new digital divide? Information, Communication & Society, 24(12), 1779–1796. https://doi.org/10.1080/1369118X.2020.1736124
  • Hargittai, E., Gruber, J., Djukaric, T., Fuchs, J., & Brombach, L. (2020). Black box measures? How to study people’s algorithm skills. Information, Communication & Society, 23(5), 764–775. https://doi.org/10.1080/1369118X.2020.1713846
  • Helberger, N., Karppinen, K., & D’Acunto, L. (2018). Exposure diversity as a design principle for recommender systems. Information, Communication & Society, 21(2), 191–207. https://doi.org/10.1080/1369118X.2016.1271900
  • Hong, J.-W., Choi, S., & Williams, D. (2020). Sexist AI: An experiment integrating CASA and ELM. International Journal of Human-Computer Interaction, 36(20), 1928–1941. https://doi.org/10.1080/10447318.2020.1801226
  • Kee, K., & Shin, E. (2022). Algorithm awareness: Why user awareness is critical for personal privacy in the adoption of algorithmic platforms? International Journal of Information Management, 65, 102494. https://doi.org/10.1016/j.ijinfomgt.2022.102494
  • Kim, P. T. (2017). Data-driven discrimination at work. William & Mary Law Review, 58(3). https://scholarship.law.wm.edu/wmlr/vol58/iss3/4
  • Klawitter, E., & Hargittai, E. (2018). “It’s like learning a whole other language”: The role of algorithmic skills in the curation of creative goods. International Journal of Communication, 12, 3490–3510. https://ijoc.org/index.php/ijoc/article/view/7864
  • Knobloch-Westerwick, S., Mothes, C., & Polavin, N. (2020). Confirmation bias, ingroup bias, and negativity bias in selective exposure to political information. Communication Research, 47(1), 104–124. https://doi.org/10.1177/0093650217719596
  • Kordzadeh, N., & Ghasemaghaei, M. (2022). Algorithmic bias: Review, synthesis, and future research directions. European Journal of Information Systems, 31(3), 388–409. https://doi.org/10.1080/0960085X.2021.1927212
  • Kotliar, D. M. (2020). The return of the social: Algorithmic identity in an age of symbolic demise. New Media & Society, 22(7), 1152–1167. https://doi.org/10.1177/1461444820912535
  • Kotras, B. (2020). Mass personalization: Predictive marketing algorithms and the reshaping of consumer knowledge. Big Data & Society, 7(2), 205395172095158. https://doi.org/10.1177/2053951720951581
  • Kümpel, A. S. (2020). The Matthew Effect in social media news use: Assessing inequalities in news exposure and news engagement on social network sites (SNS). Journalism, 21(8), 1083–1098. https://doi.org/10.1177/1464884920915374
  • Lambrecht, A., & Tucker, C. (2019). Algorithmic Bias? An empirical study of apparent gender-based discrimination in the display of STEM career ads. Management Science, 65(7), 2966–2981. https://doi.org/10.1287/mnsc.2018.3093
  • Latzer, M., & Festic, N. (2019). A guideline for understanding and measuring algorithmic governance in everyday life. Internet Policy Review, 8(2). https://doi.org/10.14763/2019.2.1415
  • Mohamed, S., Png, M.-T., & Isaac, W. (2020). Decolonial AI: Decolonial theory as sociotechnical foresight in artificial intelligence. Philosophy & Technology, 33(4), 659–684. https://doi.org/10.1007/s13347-020-00405-8
  • Möller, J., Trilling, D., Helberger, N., & van Es, B. (2018). Do not blame it on the algorithm: An empirical assessment of multiple recommender systems and their impact on content diversity. Information, Communication & Society, 21(7), 959–977. https://doi.org/10.1080/1369118X.2018.1444076
  • Shen, E., & Li, Z. (2020). The algorithm mechanism of content recommendations and AI involution on Weibo (微博推荐算法实践与机器学习平台演进). Retrieved July 1, 2021, from https://www.infoq.cn/article/xutw5wbtixpbdeyqiivn
  • Shin, D. (2020). User perceptions of algorithmic decisions in the personalized AI system: Perceptual evaluation of fairness, accountability, transparency, and explainability. Journal of Broadcasting & Electronic Media, 64(4), 541–565. https://doi.org/10.1080/08838151.2020.1843357
  • Shin, D., Rasul, A., & Fotiadis, A. (2021). Why am I seeing this? Deconstructing algorithm literacy through the lens of users. Internet Research, 32(4), 1214–1234. https://doi.org/10.1108/INTR-02-2021-0087
  • Shrestha, Y. R., & Yang, Y. (2019). Fairness in algorithmic decision-making: Applications in multi-winner voting, machine learning, and recommender systems. Algorithms, 12(9), 199. https://doi.org/10.3390/a12090199
  • Yang, S., Korayem, M., AlJadda, K., Grainger, T., & Natarajan, S. (2017). Combining content-based and collaborative filtering for job recommendation system: A cost-sensitive statistical relational learning approach. Knowledge-Based Systems, 136, 37–45. https://doi.org/10.1016/j.knosys.2017.08.017
  • Tarpley, M. (2020). Review of Invisible women: Exposing data bias in a world designed for men, by Caroline Criado Perez (Abrams Press, 2019). ISBN 978-1-4197-2907-2.
  • Thorson, K., Cotter, K., Medeiros, M., & Pak, C. (2021). Algorithmic inference, political interest, and exposure to news and politics on Facebook. Information, Communication & Society, 24(2), 183–200. https://doi.org/10.1080/1369118X.2019.1642934
  • Tu. (2022). The brave new world is now real: Living with media surveillance and algorithm-driven personalized content [Paper presentation]. International Communication Association 2022.
  • Van Deursen, A. J., & Helsper, E. J. (2018). Collateral benefits of Internet use: Explaining the diverse outcomes of engaging with the Internet. New Media & Society, 20(7), 2333–2351. https://doi.org/10.1177/1461444817715282
  • Wang, R., Harper, F. M., & Zhu, H. (2020). Factors influencing perceived fairness in algorithmic decision-making: Algorithm outcomes, development procedures, and individual differences. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1–14). https://doi.org/10.1145/3313831.3376813
  • Zliobaite, I., & Custers, B. (2016). Using sensitive personal data may be necessary for avoiding discrimination in data-driven decision models. Artificial Intelligence and Law, 24(2), 183–201. https://doi.org/10.1007/s10506-016-9182-5
