References
- AI Now Institute. (2018, November 25). AI Now 2018 Symposium [Video recording]. Retrieved from https://www.youtube.com/watch?v=NmdAtfcmTNg&feature=youtu.be&t=2219
- Ajunwa, I. (2019). Algorithms at work: Productivity monitoring platforms and wearable technology as the new data-centric research agenda for employment and labor law. St. Louis University Law Journal, 63(47), 1–47.
- Alexander, M. (2012). The new Jim Crow: Mass incarceration in the age of colorblindness. New York, NY: The New Press.
- Ananny, M. (2016). Toward an ethics of algorithms: Convening, observation, probability, and timeliness. Science, Technology, & Human Values, 41(1), 93–117.
- Angwin, J., Larson, J., Mattu, S., & Kirchner, L. (2016, May 23). Machine bias. ProPublica. Retrieved from https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
- Bagenstos, S. R. (2006). The structural turn and the limits of antidiscrimination law. California Law Review, 94(1), 1–47.
- Barocas, S., & Selbst, A. D. (2016). Big data’s disparate impact. California Law Review, 104, 671–732.
- Bell, C. (2010). Is disability studies actually white disability studies? In L. J. Davis (Ed.), The disability studies reader (3rd ed., pp. 266–273). New York, NY: Routledge.
- Berk, R., Heidari, H., Jabbari, S., Kearns, M., & Roth, A. (2017). Fairness in criminal justice risk assessments: The state of the art. ArXiv:1703.09207 [stat]. Retrieved from http://arxiv.org/abs/1703.09207
- Bivens, R., & Hoque, A. S. (2018). Programming sex, gender, and sexuality: Infrastructural failures in the “feminist” dating app Bumble. Canadian Journal of Communication, 43(3), 441–459.
- boyd, D., & Crawford, K. (2012). Critical questions for big data: Provocations for a cultural, technological, and scholarly phenomenon. Information, Communication & Society, 15(5), 662–679.
- Brock, A. (2018). Critical technocultural discourse analysis. New Media & Society, 20(3), 1012–1030.
- Buolamwini, J. (2018, June 22). When the robot doesn’t see dark skin. The New York Times. Retrieved from https://www.nytimes.com/2018/06/21/opinion/facial-analysis-technology-bias.html
- Buolamwini, J., & Gebru, T. (2018). Gender shades: Intersectional accuracy disparities in commercial gender classification. Proceedings of Machine Learning Research, 81, 77–91.
- Calders, T., & Verwer, S. (2010). Three naive Bayes approaches for discrimination-free classification. Data Mining and Knowledge Discovery, 21(2), 277–292.
- Calo, R. (2013). Consumer subject review boards: A thought experiment. Stanford Law Review, 66, 97–102.
- Carastathis, A. (2016). Intersectionality: Origins, contestations, horizons. Lincoln, NE: University of Nebraska Press.
- Cinnamon, J. (2017). Social injustice in surveillance capitalism. Surveillance & Society, 15(5), 609–625.
- Citron, D. K. (2007). Technological due process. Washington University Law Review, 85, 1249–1314.
- Citron, D. K., & Pasquale, F. (2014). The scored society: Due process for automated predictions. Washington Law Review, 89, 1–34.
- Cohen, J. E. (2018). The biopolitical public domain: The legal construction of the surveillance economy. Philosophy & Technology, 31(2), 213–233.
- Costanza-Chock, S. (2018). Design justice, A.I., and escape from the matrix of domination. Journal of Design and Science, 3.5. Retrieved from https://jods.mitpress.mit.edu/pub/costanza-chock
- Crawford, K., & Joler, V. (2018). Anatomy of an AI system: The Amazon Echo as an anatomical map of human labor, data and planetary resources [Diagram]. AI Now Institute and Share Lab. Retrieved from http://anatomyof.ai
- Crawford, K., & Schultz, J. (2014). Big data and due process: Toward a framework to redress predictive privacy harms. Boston College Law Review, 55, 93–128.
- Crenshaw, K. (1988). Race, reform, and retrenchment: Transformation and legitimation in antidiscrimination law. Harvard Law Review, 101(7), 1331–1387.
- Crenshaw, K. (1989). Demarginalizing the intersection of race and sex: A Black feminist critique of antidiscrimination doctrine, feminist theory and antiracist politics. University of Chicago Legal Forum, 1(8), 139–167.
- Crenshaw, K. (1991). Mapping the margins: Intersectionality, identity politics, and violence against Women of Color. Stanford Law Review, 43(6), 1241–1299.
- Data & Civil Rights. (2014, October 30). Why “big data” is a civil rights issue. Retrieved from http://www.datacivilrights.org/2014/
- Dencik, L., Jansen, F., & Metcalfe, P. (2018, August 30). A conceptual framework for approaching social justice in an age of datafication. DATAJUSTICE project. Retrieved from https://datajusticeproject.net/2018/08/30/a-conceptual-framework-for-approaching-social-justice-in-an-age-of-datafication/
- Dwork, C., Hardt, M., Pitassi, T., Reingold, O., & Zemel, R. (2012). Fairness through awareness. In Proceedings of the 3rd Innovations in Theoretical Computer Science Conference (pp. 214–226). New York, NY: ACM.
- Eubanks, V. (2018, January). The digital poorhouse. Harper’s Magazine. Retrieved from https://harpers.org/archive/2018/01/the-digital-poorhouse/
- European Commission. (2012, January 25). Commission proposes a comprehensive reform of data protection rules to increase users’ control of their data and to cut costs for businesses. Retrieved from http://europa.eu/rapid/press-release_IP-12-46_en.htm
- Feldman, M., Friedler, S. A., Moeller, J., Scheidegger, C., & Venkatasubramanian, S. (2015). Certifying and removing disparate impact. In KDD 2015. Proceedings of the 21st ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (pp. 259–268). Sydney, NSW, Australia: ACM Press.
- Flagg, B. J. (1993). “Was blind, but now I see”: White race consciousness and the requirement of discriminatory intent. Michigan Law Review, 91(5), 953–1017.
- Fontaine, C. (2016, August 8). The myth of accountability: How data (mis)use is reinforcing the problems of public education. Data & Society. Retrieved from https://datasociety.net/output/the-myth-of-accountability-how-data-misuse-is-reinforcing-the-problems-of-public-education/
- Freeman, A. D. (1978). Legitimizing racial discrimination through antidiscrimination law: A critical review of Supreme Court doctrine. Minnesota Law Review, 62, 1049–1120.
- Friedler, S. A., Scheidegger, C., & Venkatasubramanian, S. (2016). On the (im)possibility of fairness. ArXiv:1609.07236 [cs, stat]. Retrieved from http://arxiv.org/abs/1609.07236
- Friedman, B., & Nissenbaum, H. (1996). Bias in computer systems. ACM Transactions on Information Systems (TOIS), 14(3), 330–347.
- Gandy, O. H., Jr. (1993). The panoptic sort: A political economy of personal information. Critical studies in communication and in the cultural industries. Boulder, CO: Westview Press.
- Gandy, O. H., Jr. (1995). It’s discrimination, stupid. In J. Brook & I. Boal (Eds.), Resisting the virtual life: The culture and politics of information (pp. 35–47). San Francisco, CA: City Lights.
- Gandy, O. H., Jr. (2010). Engaging rational discrimination: Exploring reasons for placing regulatory constraints on decision support systems. Ethics and Information Technology, 12(1), 29–42.
- Gangadharan, S. P. (2014). Data-based discrimination. In S. P. Gangadharan (Ed.), Data and discrimination: Collected essays (pp. 2–4). Washington, DC: Open Technology Institute.
- Garland-Thomson, R. (2006). Ways of staring. Journal of Visual Culture, 5(2), 173–192.
- Gillespie, T. (2012). Can an algorithm be wrong? Limn, 1(2). Retrieved from https://escholarship.org/uc/item/0jk9k4hj
- Gillespie, T. (2017). Algorithmically recognizable: Santorum’s Google problem, and Google’s Santorum problem. Information, Communication & Society, 20(1), 63–80.
- Gotanda, N. (1991). A critique of “Our Constitution is color-blind”. Stanford Law Review, 44(1), 1–68.
- Gray, J. (2011, September 21). Delusions of peace. Prospect Magazine. Retrieved from https://www.prospectmagazine.co.uk/magazine/john-gray-steven-pinker-violence-review
- Hajian, S., & Domingo-Ferrer, J. (2013). A methodology for direct and indirect discrimination prevention in data mining. IEEE Transactions on Knowledge and Data Engineering, 25(7), 1445–1459.
- Harcourt, B. E. (2006). Against prediction: Profiling, policing, and punishing in an actuarial age (Reprint ed.). Chicago: University of Chicago Press.
- Hardt, M., Price, E., & Srebro, N. (2016, December). Equality of opportunity in supervised learning. Paper presented at the 30th Conference on Neural Information Processing Systems (NIPS 2016), Barcelona, Spain.
- Hoffmann, A. L. (2016). Google books, libraries, and self-respect: Information justice beyond distributions. The Library Quarterly, 86(1), 76–92.
- Hoffmann, A. L. (2017). Beyond distributions and primary goods: Assessing applications of Rawls in information science and technology literature since 1990. Journal of the Association for Information Science and Technology, 68(7), 1601–1618.
- Hoofnagle, C. J. (2015, April 11). The origin of fair information practices [Essay]. Retrieved from https://www.law.berkeley.edu
- Johnson, D. G., & Mulvey, J. M. (1993). Computer decisions: Ethical issues of responsibility and bias (Report No. SOR-93-11). Statistics and Operations Research Series. Princeton, NJ: Princeton University.
- Kamiran, F., & Calders, T. (2012). Data preprocessing techniques for classification without discrimination. Knowledge and Information Systems, 33(1), 1–33.
- Kearns, M., Neel, S., Roth, A., & Wu, Z. S. (2017). Preventing fairness gerrymandering: Auditing and learning for subgroup fairness. ArXiv:1711.05144 [cs.LG]. Retrieved from http://arxiv.org/abs/1711.05144
- Kilbertus, N., Carulla, M. R., Parascandolo, G., Hardt, M., Janzing, D., & Schölkopf, B. (2017, December). Avoiding discrimination through causal reasoning. Paper presented at the 31st Conference on Neural Information Processing Systems (NIPS 2017), Long Beach, California.
- Kleinberg, J., Mullainathan, S., & Raghavan, M. (2016). Inherent trade-offs in the fair determination of risk scores. ArXiv:1609.05807 [cs.LG]. Retrieved from http://arxiv.org/abs/1609.05807
- Krieger, L. H. (1995). The content of our categories: A cognitive bias approach to discrimination and equal employment opportunity. Stanford Law Review, 47(6), 1161–1248.
- Levy, K. E. C. (2015). The contexts of control: Information, power, and truck-driving work. The Information Society, 31(2), 160–174.
- May, V. M. (2015). Pursuing intersectionality, unsettling dominant imaginaries. New York: Routledge.
- McKittrick, K. (2006). Demonic grounds: Black women and the cartographies of struggle. Minneapolis, MN: University of Minnesota Press.
- McNamara, R. M., Jr. (1973). The Fair Credit Reporting Act: A legislative overview. Journal of Public Law, 22(1), 67–102.
- Miller, C. C. (2015, July 9). When algorithms discriminate. The New York Times. Retrieved from https://www.nytimes.com/2015/07/10/upshot/when-algorithms-discriminate.html
- Mills, C. W. (2017). Black rights/white wrongs: The critique of racial liberalism. Oxford, UK: Oxford University Press.
- Mitchell, S., Potash, E., & Barocas, S. (2018). Prediction-based decisions and fairness: A catalogue of choices, assumptions, and definitions. ArXiv:1811.07867 [stat.AP]. Retrieved from http://arxiv.org/abs/1811.07867
- Moor, J. H. (1985). What is computer ethics? Metaphilosophy, 16(4), 266–275.
- Muñoz, C., Smith, M., & Patil, D. J. (2016). Big data: A report on algorithmic systems, opportunity, and civil rights. Washington, DC: Executive Office of the President.
- Mutua, A. D. (2006). The rise, development and future directions of critical race theory and related scholarship. Denver University Law Review, 84, 329–394.
- Nash, J. C. (2017). Intersectionality and its discontents. American Quarterly, 69(1), 117–129.
- Noble, S. U. (2016). A future for intersectional Black feminist technology studies. The Scholar & Feminist Online, 13.3–14.1. Retrieved from http://sfonline.barnard.edu/traversing-technologies/safiya-umoja-noble-a-futurefor-
- Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York, NY: New York University Press.
- O’Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. New York, NY: Crown Publishing Group.
- Onuoha, M. (2018, February 7). On algorithmic violence: Attempts at fleshing out the concept of algorithmic violence [Essay]. Retrieved from https://github.com/MimiOnuoha/On-Algorithmic-Violence
- Overdorf, R., Kulynych, B., Balsa, E., Troncoso, C., & Gürses, S. (2018, December). Questioning the assumptions behind fairness solutions. Paper presented at the 32nd Conference on Neural Information Processing Systems (NIPS 2018), Montreal, Canada.
- Pasquale, F. (2016). The black box society: The secret algorithms that control money and information (Reprint ed.). Cambridge, MA: Harvard University Press.
- Podesta, J., Pritzker, P., Moniz, E. J., Holdren, J., & Zients, J. (2014). Big data: Seizing opportunities, preserving values. Washington, DC: Executive Office of the President.
- Rosenblat, A., Randhava, R., boyd, D., Gangadharan, S. P., & Yu, C. (2014a). Data & civil rights: Consumer finance primer [Report]. Data & Society Research Institute. Retrieved from http://www.datacivilrights.org/pubs/2014-1030/Finance.pdf
- Rosenblat, A., Wikelius, K., boyd, D., Gangadharan, S. P., & Yu, C. (2014b). Data & civil rights: Criminal justice primer [Report]. Data & Society Research Institute. Retrieved from http://www.datacivilrights.org/pubs/2014-1030/CriminalJustice.pdf
- Seaver, N. (2017). Algorithms as culture: Some tactics for the ethnography of algorithmic systems. Big Data & Society, 4(2), 1–12.
- Shew, A. (2017, November 11). Technoableism, cyborg bodies, and Mars [Blog post]. Retrieved from https://techanddisability.com/2017/11/11/technoableism-cyborg-bodies-and-mars/
- Snow, J. (2018, February 14). “We’re in a diversity crisis”: Cofounder of Black in AI on what’s poisoning algorithms in our lives. MIT Technology Review. Retrieved from https://www.technologyreview.com/s/610192/were-in-a-diversity-crisis-black-in-ais-founder-on-whats-poisoning-the-algorithms-in-our/
- Spade, D. (2015). Normal life: Administrative violence, critical trans politics, and the limits of law (Revised and expanded ed.). Durham, NC: Duke University Press.
- Sweeney, L. (2013). Discrimination in online ad delivery. Queue, 11(3), 10:10–10:29.
- Sweeney, M. E. (2016). The intersectional interface. In S. U. Noble & B. M. Tynes (Eds.), The intersectional internet: Race, sex, class, and culture online (pp. 215–228). Switzerland: Peter Lang International Academic Publishers.
- University of Bath. (2017, April 13). Biased bots: Human prejudices sneak into AI systems [Press release]. Retrieved from http://www.bath.ac.uk/research/news/2017/04/13/biased-bots-artificial-intelligence/
- Willson, M. (2017). Algorithms (and the) everyday. Information, Communication & Society, 20(1), 137–150.
- Young, I. M. (2006). Taking the basic structure seriously. Perspectives on Politics, 4(1), 91–97.
- Zafar, M. B., Valera, I., Gomez Rodriguez, M., & Gummadi, K. P. (2017). Fairness beyond disparate treatment & disparate impact: Learning classification without disparate mistreatment. In Proceedings of the 26th International Conference on World Wide Web (pp. 1171–1180). Perth, Australia: ACM Press.