
Transparent to whom? No algorithmic accountability without a critical audience

Pages 2081-2096 | Received 04 Oct 2017, Accepted 11 May 2018, Published online: 18 Jun 2018
