Algorithms as regulatory objects

Pages 1542-1558 | Received 01 Sep 2019, Accepted 17 Dec 2020, Published online: 17 Jan 2021
ABSTRACT

The recent dispersion of algorithms throughout a large part of social life makes them valid analytical objects for sociology in the twenty-first century. The ubiquity of algorithms has led to increased public attention, scrutiny and, consequently, regulation. That is the focus of this paper. I will show that such regulatory processes are not just aimed at preventing certain algorithmic activities, but that they are also co-producing algorithms. They determine, in specific settings, what an algorithm is and what it ought to do. I will illustrate this by comparing two different European regulations aimed at algorithmic practices: the regulation of trading algorithms in the German High Frequency Trading Act and in the Markets in Financial Instruments Directive (MiFID II), and the regulation of personal data processing in the General Data Protection Regulation (GDPR).

Disclosure statement

No potential conflict of interest was reported by the author(s).

Notes

1 In jurisprudence, a natural person is an individual human being. A legal person, in turn, also refers to organizations such as private firms and organizational agencies.

2 I thank one of the reviewers for pointing out that the GDPR also contains paternalistic rules that assume a non-digitally-literate subject, e.g., the ‘vulnerable data subject’ (Malgieri & Niklas, 2020).

3 A philosophical summary of this imaginary subject, implicit in the legal text, can be found in Anthony Giddens’ Magna Carta for the Digital Age, which he formulated during his time on the House of Lords Select Committee on Artificial Intelligence in the UK (Giddens, 2018).

4 That does not mean traditional aspects of the natural person are unrelated to the anonymous identification process. They can be reconstructed, for instance, by inference. In fact, some researchers claim ‘that Google has the ability to connect the anonymous data collected through passive means with the personal information of the user’ (Schmidt, 2018, p. 4). Thus, individual attributes such as gender, race, and political orientation can be inferred from presumably ‘anonymous’ data (Kuner et al., 2012).

Additional information

Notes on contributors

Robert Seyfert

Robert Seyfert is Senior Lecturer / Senior Researcher (Akademischer Rat) at the Institute of Sociology at Universität Duisburg-Essen, Germany. He works at the intersection of algorithmic cultures, algorithmic sociality and affect theory. His research focuses on the societal implications of emergent digital technologies, including algorithmic trading, connected and assisted driving, and the regulation and governance of algorithms.
