
Algorithms as regulatory objects

Pages 1542-1558 | Received 01 Sep 2019, Accepted 17 Dec 2020, Published online: 17 Jan 2021
ABSTRACT

The recent dispersion of algorithms throughout a large part of social life makes them valid analytical objects for sociology in the twenty-first century. The ubiquity of algorithms has led to increased public attention, scrutiny and, consequently, regulation. That is the focus of this paper. I will show that such regulatory processes are not just aimed at preventing certain algorithmic activities, but that they are also co-producing algorithms. They determine, in specific settings, what an algorithm is and what it ought to do. I will illustrate this by comparing two different European regulations aimed at algorithmic practices: the regulation of trading algorithms in the German High Frequency Trading Act and in the Markets in Financial Instruments Directive (MiFID II), and the regulation of personal data processing in the General Data Protection Regulation (GDPR).

Disclosure statement

No potential conflict of interest was reported by the author(s).

Notes

1 In jurisprudence, a natural person refers to an individual human being. A legal person, in turn, also refers to organizations such as private firms and agencies.

2 I thank one of the reviewers for pointing out that the GDPR also contains paternalistic rules that assume a non-digitally-literate subject, e.g., the ‘vulnerable data subject’ (Malgieri & Niklas, 2020).

3 A philosophical summary of this imaginary subject, implicit in the legal text, can be found in Anthony Giddens’ Magna Carta for the Digital Age, which he formulated during his time on the House of Lords Select Committee on Artificial Intelligence in the UK (Giddens, 2018).

4 That does not mean traditional aspects of the natural person are unrelated to the anonymous identification process. They can be reconstructed, for instance by inference. In fact, some researchers claim ‘that Google has the ability to connect the anonymous data collected through passive means with the personal information of the user’ (Schmidt, 2018, p. 4). Thus, individual attributes such as gender, race, political orientation etc. can be inferred from presumably ‘anonymous’ data (Kuner et al., 2012).

Additional information

Notes on contributors

Robert Seyfert

Robert Seyfert is Senior Lecturer / Senior Researcher (Akademischer Rat) at the Institute of Sociology at Universität Duisburg-Essen, Germany. He works at the intersection of algorithmic cultures, algorithmic sociality and affect theory. His research focuses on the societal implications of emergent digital technologies, including algorithmic trading, connected and assisted driving, and the regulation and governance of algorithms.

