
Probability and Agency: Introduction

Contemporary technologies and media practices have irrevocably shifted the ways in which agency is exerted in social contexts. On the one hand, we are faced with the loss of faith in human intuitive knowledge, caused largely by dataism and its neoplatonic tendencies. On the other hand, there is the impasse of overdetermination and indeterminacy. Consider, for instance, the 2017 Grenfell disaster. For years, the inhabitants of Grenfell Tower in London – where a fire broke out in June 2017, killing 72 people – had been reporting electricity oscillations and frequent explosions of household appliances. They saw this as a clear sign that larger-scale incidents were not only possible but highly probable. However, their reports were ignored as the data about the tower contradicted these ‘impressions’, and the Council chose to trust the data.Footnote1 Data science’s derivation of actionable conclusions from opaque algorithmic processes has neoplatonic tendencies in that it treats these processes as revelatory of a hidden mathematical order that is superior to human knowledge and experience.Footnote2

The so-called ‘objective truth’ promise of dataism has been severely compromised by private tech companies, as the by now classic example of Cambridge Analytica shows: personal data from over 87 million Facebook users, harvested without their consent, was used to target American voters with individually tailored political advertising in the 2016 presidential election. But dataism is not limited to the manipulative use of misappropriated data alone. A much bigger problem is its relation to knowledge production: the fact that dataism relies primarily on correlation. As Byung-Chul Han has argued, drawing on G. W. F. Hegel’s Science of Logic, correlation represents the most primitive level of knowledge; a strong correlation between A and B means that if A changes, B will change, too, but it doesn’t explain why that is the case.Footnote3 Causality, by contrast – A causing B – is a relation of necessity. However, the connection between causality and knowledge is not as strong as the connection between reciprocity and knowledge. Reciprocity, where A and B condition each other mutually, is more complex than causality.Footnote4 Yet reciprocity is not the most complex level of knowledge either, as the ‘overarching context for the connection between A and B’ is not present here; only at the level of the ‘Concept’, which ‘comprehends within itself A and B and their context’, does knowledge proper become possible.Footnote5 Importantly, this does not occur in human heads alone: ‘the Concept dwells within the things themselves, it is that through which they become what they are’.Footnote6 For Hegel, as for Han, the Concept is an ‘intrinsic conclusion in which everything is comprised and comprehended’.Footnote7
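
To make Han’s point concrete, here is a minimal, hypothetical sketch in Python (the coefficients and the hidden factor are invented for illustration): A and B correlate strongly only because both are driven by an unmodelled common cause, and the correlation coefficient registers the covariation without saying anything about why it obtains.

    import numpy as np

    # Hypothetical illustration: A and B are both driven by a hidden factor C
    # (a confounder). Neither causes the other, yet they correlate strongly --
    # the coefficient registers covariation, not 'why'.
    rng = np.random.default_rng(seed=0)

    c = rng.normal(size=10_000)             # hidden common cause
    a = 2.0 * c + rng.normal(size=10_000)   # A depends on C plus noise
    b = -1.5 * c + rng.normal(size=10_000)  # B depends on C plus noise

    r = np.corrcoef(a, b)[0, 1]
    print(f"Pearson correlation between A and B: {r:.2f}")  # strongly negative
    # The same r could arise from A causing B, B causing A, or a shared cause C.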

Clearly, this is very different from data science, which produces what we could call rudimentary knowledge. Although Han equates this form of knowledge with absolute ignorance – because it stands in stark contrast with the dataist promise of absolute knowledge – correlation is far from useless in a world much more complicated than Hegel’s.Footnote8 A world where low-level cognitive processes have long been outsourced to automated agents, where organisational complexity transcends the human cognitive horizon and where the mediation of techno-social relations – through which the whole world is entangled in ways that are not subject to conscious presentation – has given rise to a new regime of complexity.Footnote9 As Nick Srnicek has observed in reference to Fredric Jameson, this ‘properly cognitive complexity’ consists of opaque relations of production, diffusely spread in variable processes across the surface of the earth, and multiple non-linearly dynamic processes.Footnote10

Let us for a moment go back to a very familiar, pre-1990s (pre-mass-use of digital technologies) example: 73 seconds after launch at Cape Canaveral, Florida, on 28 January 1986, the Challenger disappears in a cloud of smoke, killing all seven crew members. In her analysis of the accident, Diane Vaughan highlights the micro-causal influences of complex organisational structures, the ‘inevitability of mistake in organisations’ and the interconnectedness of systems’ complexities.Footnote11 Although she acknowledges that management could have acted differently, Vaughan bypasses individual responsibility altogether. For her, the effects of ‘unacknowledged and invisible social forces on information, interpretation, knowledge, and – ultimately – action’ are not only ‘very difficult’ to ‘control’ but also very difficult to ‘identify’.Footnote12

However, this is by no means the only possible reaction to complexity, as we will see if we turn to a different ‘accident’ – the 2008 financial crash – and the financial instrument at its centre: the derivative. The derivative is a form of insurance against continuously fluctuating rates of exchange. It is heterogeneous and typically outsourced to artificial intelligence, which breaks a complex task like ‘provide ten million cell phones to a Brazilian subsidiary of a South-African firm’ into several smaller tasks: outsource the interior architecture of the device to a German-Italian enterprise; outsource casings to a Mexican manufacturer; outsource the manufacture of all other components to a Japanese firm; and, finally, underwrite the different currencies and their perpetually fluctuating exchange rates.Footnote13 In other words, the derivative is both a complex and a ‘one-off’ solution that has – surprisingly – played a role ‘parallel to that played by gold in the nineteenth century’.Footnote14 This is because in a system of continually oscillating currencies, each derivative is a ‘unique and momentarily definitive combination of those currency values’ and acts as a ‘new standard of value and thereby the new Absolute’.Footnote15 As is plain to see, the derivative’s ability to contain micro-volatility, based on probability calculations, and produce (at least a form of) equilibrium is quite remarkable.Footnote16 But would this mean that all we need to cope with increasing levels of social, environmental and technological complexity are better, more agile probability calculations?
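
As a rough illustration of the underwriting step, here is a minimal, hypothetical Monte Carlo sketch in Python; the spot rate, volatility, horizon and the forward-contract simplification are all invented for illustration. The point is only that the derivative does not remove volatility from the world: it prices it and assigns it to a counterparty, producing a momentary equilibrium.

    import numpy as np

    # Invented figures throughout: a 10m USD invoice payable in 90 days,
    # exposed to a fluctuating BRL/USD rate unless a contract locks the
    # rate in today.
    rng = np.random.default_rng(seed=1)

    spot = 5.0                 # assumed BRL per USD today
    daily_vol = 0.01           # assumed 1% daily volatility
    days = 90                  # settlement horizon
    invoice_usd = 10_000_000   # cost of the phone order in USD

    # Simulate many possible exchange-rate paths (geometric random walk).
    shocks = rng.normal(0.0, daily_vol, size=(100_000, days))
    final_rates = spot * np.exp(shocks.sum(axis=1))

    unhedged = invoice_usd * final_rates   # exposed to the rate at settlement
    hedged = invoice_usd * spot            # locked in by the contract

    print(f"unhedged: mean {unhedged.mean():,.0f} BRL, std {unhedged.std():,.0f} BRL")
    print(f"hedged:   {hedged:,.0f} BRL, std 0")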

Probability is most often associated with the mathematical framework for analysing unforeseeable events, where the probability of an event is a number between 0 and 1, 0 indicating impossibility and 1 indicating certainty. A classic example is a coin toss. If the coin is fair, the probability of flipping heads or tails is 50% each, and the observed frequency will approach this value as the number of tosses increases. If the coin is unfair – if it’s, say, incorrectly minted – the two outcomes will not be equally likely. In situations with not one but several hundred influencing factors, probability calculations require a correlation between past and present data. This was common practice in economic modelling from the nineteenth century until John Maynard Keynes’ 1921 Treatise on Probability changed the temporal direction of data analysis, suggesting that data about past events should be complemented by future propositions.Footnote17 This futural view was also espoused by John von Neumann and Oskar Morgenstern in their 1944 Theory of Games and Economic Behavior. As the title suggests, the book proposed probabilistic scenarios based on games where players/decision makers were encouraged to speculate about potential scenarios (what their opponents would do to win) and formulate their strategies accordingly.Footnote18 The influence of game theory on what today is probably best called ‘artificial milintelligence’ – a form of AI that weaponises forecasting techniques to create a field of strategic advantage, and is essentially a target machine – is difficult to overestimate.Footnote19
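
The coin-toss example can be run directly. Below is a minimal Python sketch of the law of large numbers at work; the bias value of 0.6 is an arbitrary stand-in for an ‘incorrectly minted’ coin.

    import random

    # Observed frequency of heads converges on the true probability as the
    # number of tosses grows (the law of large numbers).
    def observed_frequency(p_heads: float, tosses: int, seed: int = 42) -> float:
        rng = random.Random(seed)
        heads = sum(rng.random() < p_heads for _ in range(tosses))
        return heads / tosses

    for n in (10, 1_000, 100_000):
        fair = observed_frequency(0.5, n)
        biased = observed_frequency(0.6, n)
        print(f"{n:>7} tosses: fair coin {fair:.3f}, biased coin {biased:.3f}")
    # With 10 tosses the frequencies scatter widely; by 100,000 they sit
    # close to 0.5 and 0.6 respectively.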

But despite the fact that most of us will find AI applications like the Wekinator – a piece of software that connects to dozens of sensors and creative coding tools to create new musical instruments and/or gesture-responsive choreographic applications – more attractive, it is naïve, perhaps also dangerous, to think that we can do away with future-orientated thinking.Footnote20 We need probability calculations and modelling to enable a cognitive mapping of the future. Given the number of accidents (past events) and risks (potential future accidents), increasing our chances of a surprise-free future by predictively analysing the less obvious consequences of past, present and future actions with the aid of methods such as Karl Popper’s fallibilism,Footnote21 computer modelling, gaming and contextual mapping, cross-impact analysis and scenario modelling is key.Footnote22 Equally important are the effects of prediction itself, such as self-fulfilling and self-denying predictions.Footnote23 In fact, in a reverse move, prediction has also become a paradigm for intelligence, not only in machine learningFootnote24 but also in the ‘predictive mind theory’, which argues that human brains are ‘prediction machines’ that anticipate sensory input from the environment through ‘hierarchical predictive coding’.Footnote25 However, this doesn’t mean that forecasting, which relies on short and long ‘temporal windows’, as used in econometric and climate change models to create abstract futures derived from system dynamics modelling – and is, for this reason, necessarily reductive – is the only option.Footnote26
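
For readers unfamiliar with the ‘prediction machine’ idea, the following is a drastically simplified, non-hierarchical Python sketch of error-driven updating. Real predictive-coding models are hierarchical and probabilistic; the input sequence and learning rate here are invented.

    # A toy 'prediction machine': hold an expectation, compare it with the
    # incoming signal, update in proportion to the prediction error.
    signal = [0.0, 0.1, 0.3, 0.8, 1.0, 1.0, 1.0]  # invented sensory input
    expectation = 0.0
    learning_rate = 0.5

    for observation in signal:
        error = observation - expectation     # prediction error
        expectation += learning_rate * error  # minimise the error
        print(f"observed {observation:.1f}, error {error:+.2f}, "
              f"new expectation {expectation:.2f}")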

It is true that beyond the purely organisational realm, the future, as Shoshana Zuboff has persuasively argued, has been hijacked and monopolised through the discourse of inevitabilism and instrumentarianism.Footnote27 For Zuboff, the inevitabilistic reading of the future, part of what she calls ‘surveillance capitalism’, has created an enclosure where human temporality is subsumed by the temporality of machines – a condition that may, after Giorgio Agamben, be termed automated sacerisation. Derived from the Schmittian state of exception and the spatio-temporal dispositif of the World War II concentration camp – which Agamben interprets as the blueprint of modernity – the denigration created in and by the camp is, for Agamben, not a discrete historical event. It is a performative calibration that continues to operate in contemporary modes of neoliberal coercion. The reason why Agamben insists that ‘we are all virtually homines sacri’ is paradoxical, like the ancient Roman law according to which homo sacer may be killed but not sacrificed.Footnote28 Excluded from the polis, homo sacer is condemned to bare life. Yet the very power that excludes her – the sovereign ban – also includes her by way of exclusion, as an essentially expendable creature. The only difference, according to Han’s reading of Agamben, is that contemporary homines sacri are not excluded but rather forcibly included – ‘shut into the system’ – that is, constrained in the protentional and interactional sense of the word and, for Han as for Zuboff, bereft of the future.Footnote29

To a significant degree, sacerisation relies on the appropriation of media technologies and on the exploitation of the social and affective needs of present-day humans. Although Zuboff’s mythical vision of ‘good old pre-digital capitalism’ is bizarre, to say the least – it’s as if the Frankfurt School had never existedFootnote30 – she correctly identifies that ‘the prediction imperative’ performs integrally with the ‘economies of action’,Footnote31 which turn ‘code’ into ‘law’ by allowing actors such as corporations to embed their values in programmes and protocols, effectively constraining and, in some cases, dictating human actions.Footnote32 Consider the widespread application of shortcutting algorithmic procedures to complex decision processes, which turn possibilities into probabilities and probabilities into (what appear to be) mathematical-logical necessities – such as algorithmic rejections of health insurance claims or loan applications that, based on too few steps and insufficiently nuanced questions, are irrevocable despite being erroneous.Footnote33 Overdetermination is here a reductive protentional-interactional operation that robs humans of agency.
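
By way of a deliberately crude, invented Python example of such a shortcutting procedure: a loan rule that reduces an applicant to two coarse numbers and returns an unappealable verdict. The fields and thresholds are hypothetical; what matters is how few steps separate a possibility from an apparent necessity.

    # Invented toy rule: a nuanced life situation disappears into two numbers.
    def loan_decision(credit_score: int, postcode_risk: float) -> str:
        if credit_score < 650:   # one blunt cutoff
            return "REJECTED"
        if postcode_risk > 0.5:  # a proxy variable doing silent work
            return "REJECTED"
        return "APPROVED"

    print(loan_decision(credit_score=640, postcode_risk=0.2))  # REJECTED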

Yet, at the other end of the spectrum, processes such as unsupervised machine learning and backpropagation produce perpetual change. New forms and scales of indeterminacy are unleashed by recursive machinic operations and the production of new temporalities in and by technical systems.Footnote34 In order to understand the process of machinic or algorithmic self-creation, so to speak, it’s important to understand the medial aspect of inscription, which is anything but invariant. As Cornelia Vismann has observed, inscription – any kind of inscription – is inseparable from self-praxis, regardless of agent (human or other-than-human).Footnote35 All media, understood as mediatic relations as well as gadgets, programmes and protocols, engage in what she calls ‘auto-praxis’ [Eigenpraxis].Footnote36 This means that no thing or process is ever independent of its particular context and conditions of coming into being – space, time and environmental factors.Footnote37 As a result of these complex relations, the agent-thing iteratively steers emergent processes in new and, for humans, unpredictable and imperceptible directions,Footnote38 which is why Vismann concludes that while humans act de jure – in the sense that they are legally responsible for a course of action – things often act de facto: they steer emergent processes in new directions.Footnote39

Zuboff, however, contrasts datafied futures with human futures.Footnote40 Echoing the existentialist tradition, she suggests that projecting ourselves into the future is what makes us human.Footnote41 Quoting Jean-Paul Sartre, Zuboff goes on to conclude that ‘in addition to the need to will, it is also necessary to will to will’.Footnote42 The problem, however, is that the ‘will to will’ is too narrowly related to the outdated separation of humans from machines. It is, without doubt, both correct and necessary to pose questions about human agency in the age of relentless datafication. However, we cannot ignore how far removed the current notion of agency is from the ‘sovereign individual’ model implied by Agamben and Zuboff. For decades now, the term ‘posthuman agency’ has been used as a variation on anthropocentric, will-centric, (neo)liberal and, often also, male-centric agency. Most elaborations of the term come from feminist, non-humanist theories of subjectivity and from post-1980s sociology, itself in many ways a response to the increasing complexities of global (and, in particular, technological) interconnectedness.

For example, an emergent theory of agency based on fluids, rather than solids and self-identity, was already present in Luce Irigaray’s 1970s writing.Footnote43 Similarly, in her 1985 Cyborg Manifesto, Donna Haraway argued for multiple, not always manifest ways of being that crossed the human, animal, machine, physical and non-physical divide.Footnote44 In her more recent Companion Species Manifesto, Haraway expands on the initial claim that the cyborg is ‘our ontology’, suggesting that both ‘[c]yborgs and companion species’ bring together ‘carbon and silicon, freedom and structure […] diversity and depletion […] in unexpected ways’.Footnote45 Elsewhere she suggests that agency encompasses diverse agents of interpretation, recording and directing that multiply relational action, whether of humans, machines or entrained compounds.Footnote46 For Katherine Hayles, too, the posthuman is ‘a material-informational entity whose boundaries undergo continuous construction and reconstruction’.Footnote47 Instead of thinking in terms of discrete bodies, Hayles advocates the privileging of ‘informational pattern over material instantiation’, since ‘embodiment in a biological substrate is […] an accident of history rather than an inevitability of life’.Footnote48 Similarly, Rosi Braidotti’s plea for ‘an enlarged community’ is based on ‘environmental inter-connections’ and emphasises the inseparability of the posthumanist subject from an ‘eco-philosophy of multiple belongings’.Footnote49

In social theory, Bruno Latour’s well-known Actor-Network Theory (ANT) has advocated a relation between networks and assemblages of humans and non-humans in which ‘actors’ – or, better said, ‘actants’ – have equal agency. Having emerged as a response to increasing quantities of post-industrial accidents, of the sort mentioned above, ANT tackled the role of (technological) objects: ‘microbes, scallops, rocks, and ships presented themselves to social theory in a new way’;Footnote50 seatbelts and door stops became the ‘missing masses’ that ‘stand in for human actors’ to embed specific socio-political values and ‘enrol human actors into certain programs of action’.Footnote51 Importantly, Latour’s use of the term ‘objectivity’ doesn’t refer to some sort of ‘inner state of justice and fairness, but to the presence of objects which have been rendered able […] to object to what is told about them’.Footnote52 In a more recent work, Latour abandons the idea of programmed actions and their consequences, evident in such phrases as ‘standing in for human actors’, and argues for an ontologically flat account of agency that crosses the animate-inanimate divide.Footnote53

But how precisely do multi-agentic behaviours play out in contemporary media cultures? Despite the significance of such notions as ‘distributed agency’ in fields like digital humanities and anthropology, the relation of distributed agency to human and machinic knowledges, overdetermination, indeterminacy, and future-casting has been less clear.Footnote54 In fact, most probability calculations link the moves of pre-formed, pre-existing agents to sequences of likelihood, conditioning, and causation. In alternative accounts of agency, such as Latour’s and Karen Barad’s (which Barad terms agential realism), agentic compounds emerge from intra-actions with phenomena, discourses, and apparatuses (rather than the other way around), galvanising configurations that govern relations of likelihood, conditioning, and causation.Footnote55

Relying on Latour’s, Barad’s and other posthuman agential theories, this transdisciplinary issue takes a critical perspective on the ideology of determinism, statistical procedures and political representation, trans-human augmentation, techno-immunological governance, data visualisation and multi-species agency in an attempt to show how probability and agency co-articulate in contemporary contexts. The gathered articles fall into three thematic strands: a) probability as a cultural, mathematical and political category, in particular its relationship to statistics and information, which both create and dissolve specific agential positions; b) technologies that claim to improve the security and/or ability of the user but have the obverse effect; and c) a speculative approach to present and future forms of agency, based on artistic, or art-science, strategies and propositions.

In the first strand, Vladimir Tasić analyses the increasingly mathematised framework of contemporary biopolitics (e.g. biometrics and predictive policing as based on the production and surveillance of digital transmission, compression and encryption). Questioning their mathematical backdrop, Tasić traces the many definitions of determinism, asking questions such as: do the familiar critiques of the ‘mechanism ideal’, loosely based on nonlinear dynamics yet lush in unexamined political subtext, present an even more troubling appropriation of complex technical developments by the dominant ideologies? Tasić queries the current delineation of territories known as ‘academic disciplines’ – an archipelago that, he argues, is itself subject to governance by numbers – suggesting a different approach in which mathematics, philosophy and sociology may address the political challenges posed by probabilistic-deterministic governance by numbers.

Lukas Griessl’s article ‘From Skopein to Scraping: Probability, Agency, and the Politics of Public Opinion Research’ explores probabilistic reasoning in survey statistics. Applying Jacques Rancière’s theorisation of politics (as essentially pre-cognitive and aesthetic) to public opinion research, he argues that the impression that public opinion research makes it possible for everyone’s voice to be heard, thereby enabling political agency, is in fact incorrect: public opinion research dissolves the very political sphere that political agency requires in order to exist. In a similar vein, Joel White, in ‘Intropy, Sintropy and the Rise of Monopolies of Information’, interrogates the correlation between information overload and Bernard Stiegler’s concept of societal informational entropy, seen as a decrease in the reliability of informational sources, which results in a constraint of agency in daily praxis. Given that monopolies of information are monopolies of telecommunication, the attempt to solve the entropy of information by reducing the number of sources has the adverse effect of monopolising knowledge, resulting in control over the dissemination of information and overdetermining how an agent can act.

The second thematic strand includes essays by Btihaj Ajana and Stephanie Polsky. In ‘The Immunopolitics of Covid-19 Technologies’ Ajana shows how, in the last two and a half years, the concept of immunity has become a central theme in the geopolitical struggle over vaccines and immunity passports. Examining immunity as biopolitical rationality – as that which regiments perceptions of self and other, inside and outside – Ajana analyses discourses circulating in the governmental and public realms and, in particular, their entwinement with the technologies mobilised to immunise the population. Investigating the use of immunity to legitimise governmental strategies in the light of ‘immunitary biopolitics’, she argues that the ongoing unpredictability of the Covid-19 crisis is an opportunity to rethink the function of immunity – beyond its traditional definitions of self/non-self, exemption and defence – in order to reconceptualise human-material agency within media-ecological dimensions, rather than outside them. In ‘Notes on a State of Extrusion’ Polsky focuses on cognitive augmentation devices. Her premise is that ‘humanity’, like its predecessor the colonial subject, is seen as fundamentally imperfect, incomplete and dependent, which is why its development, or ‘progress’, requires enhancement devices. Polsky argues that we should understand this approach as one of applied indigeneity, because it undermines the intelligence native to humans with the aim of replacing it with reductive and, often, malfunctioning technologies.

In the third strand, Francesca Cavallo, in ‘Who and What Connects the Dots: Emma Kunz’s Method as Infographics and the Politics of Probability’, reappraises contemporary probabilistic decision-making models, arguing that the renewed interest in divinatory art is not just a reaction to mathematical overdetermination but the effect of ever-more voracious probabilistic reasoning in search of new forms of sensing the future. Using Kunz’s drawings, which respond to the urge to communicate with the invisible and translate this communication into data-driven visualisations, Cavallo shows how Kunz’s oracular methodology probes probability. In ‘Multi-Agential Situations’ Iain Campbell turns to two of John Cage’s percussion works that use plant materials as instrumentation – Child of Tree (1975) and Branches (1976) – to question the agency of instruments and plants, on the one hand, and indeterminacy, on the other. Drawing on Michael Marder’s ‘plant-thinking’ and Eduardo Viveiros de Castro’s anthropological notion of ‘multinaturalism’, Campbell discusses multi-species encounters and the transformation of conceptual schemes that present an onto-epistemological formulation of the musical situation. The last article in this strand, by Natasha Lushetich, uses examples of art-science work – that of Stelarc, Eduardo Kac and Maja Smrekar. Approaching these works in Andrew Pickering’s vein – as ‘strange objects’ – Lushetich senses-imagines the qualitative variety of posthuman actants and their agentic processes. The focus on prosthetics, transgenics and cross-species (AI-canine) learning cycles, as well as on tertiary retention, plasticity and a reverse use of Skinnerian reinforced learning techniques, enables a tentative formulation of multi-species (phenomenologically and socially felt) posthuman agency.

Together, the eight articles reflect on the relationship of automated management processes to their mathematical, cultural-historical, political and, more broadly, technological backdrops. By considering agency in its more-than-human forms and exploring how theoretical advancements in the study of agency and probability shift our understanding of the social and cultural effects of emerging media practices, this issue further develops ongoing discourse on the entwinement of technology, embodiment and otherness, offering a multi-scalar perspective on the workings of (inter-)subjectivity, intra-activity and governance in our highly mediatised contemporary life.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Correction Statement

This article has been republished with minor changes. These changes do not impact the academic content of the article.

Additional information

Notes on contributors

Natasha Lushetich

Natasha Lushetich is Professor of Contemporary Art, Media & Theory at the University of Dundee and Arts and Humanities Research Leadership Fellow. Her research is interdisciplinary and focuses on intermedia and critical mediality; global art; the status of sensory experience in cultural knowledge; biopolitics; performativity; datafication and complexity. Email: [email protected]

Notes

1 See, for instance: ‘Grenfell Tower Inquiry’; Davies, ‘What Can We Learn From the Grenfell Tower Disaster?: Priorities for Sustainable Change’; Elmer, ‘The Truth about Grenfell Tower: A Report by Architects for Social Housing’.

2 McQuillan, “Data Science as Machinic Neoplatonism.”

3 Han, Psychopolitics, 68.

4 Ibid.

5 Ibid., 69, emphasis mine.

6 G. W. F. Hegel quoted in Han, Ibid.

7 Ibid.

8 Ibid., 68.

9 See, for example, Hayles, Unthought: The Power of the Cognitive Unconscious.

10 Srnicek, “Navigating Neoliberalism,” para 26.

11 Vaughan, The Challenger Launch Decision, xv.

12 Ibid., 416.

13 LiPuma and Lee quoted in Jameson, “The Aesthetics of Singularity,” 117–118.

14 Jameson, “The Aesthetics of Singularity,” 122.

15 Ibid.

16 In micro-temporal contexts, derivatives, as written contracts that treat the future as having already happened, will fail to create an absolute equilibrium, due to the discrepancy between the flow of information and writing as an inscriptive practice. See Appadurai, Banking on Words.

17 Keynes, Treatise on Probability.

18 Von Neumann and Morgenstern, Theory of Games and Economic Behavior.

19 See Lushetich, “Stupidity: Human and Artificial.”

20 Wekinator. Accessed April 18, 2023. http://www.wekinator.org/

21 Fallibilism suggests that empirical knowledge propositions can be accepted even though they cannot be proven. See Perkinson, “Popper’s Fallibilism”.

22 Bell, Foundations of Futures Studies.

23 The classic example of a self-fulfilling prophecy is the rumour that a bank will go bankrupt, which makes major shareholders withdraw their shares, which, in turn, leads to bankruptcy, making the initially false rumour true. In a self-denying prophecy, a probabilistic calculation that there will not be enough trains for any form of ‘public transport’ by 2030 prompts the transport companies to acquire more trains; in 2030, there is no shortage of trains, which proves the initial supposition false.

24 See Heikkilä and Heaven. “Yann LeCun Has a Bold New Vision.”

25 Clark, Surfing Uncertainty.

26 Poli, Introduction to Anticipation Studies, 34.

27 For Zuboff, instrumentarianism refers to the fact that users are no longer ends in themselves but have become “a means to profits in new behavioral futures markets” [original emphasis]. See Zuboff, “Surveillance Capitalism and the Challenge of Collective Action,” 13.

28 Agamben, Homo Sacer, 123. It should be emphasised that Agamben does not conflate neoliberal coercion with the Nazi concentration camp.

29 Han, Psychopolitics, 24.

30 In The Age of Surveillance Capitalism, Zuboff reads the current iteration of informational capitalism as a deviant phenomenon apparently neglecting an entire history of critique of soft technologies of power, embedded in the entertainment industry, easification and likefication, from the Frankfurt School to contemporary critics like Franco ‘Bifo’ Berardi.

31 Zuboff, Surveillance Capitalism, 200.

32 Lessig, Code and Other Laws of Cyberspace.

33 See for example, O’Neil, Weapons of Math Destruction; Eubanks, Automating Inequality: How High Tech Tools Profile, Police and Punish the Poor.

34 See, for example, Hayles, Unthought.

35 Vismann, “Cultural Techniques and Sovereignty.”

36 Ibid., 84.

37 All technical contexts have morphological capacities, as the Black Swan phenomenon, known in High Frequency Trading, shows. Elie Ayache defines this phenomenon as the ‘non-probability-bearing event of change of context or, in other words, a truly contingent event’. Ayache, The Blank Swan, 11.

38 Ibid.

39 Ibid., 84 –86.

40 Zuboff’s theoretical lineage – social philosophy and labour relations – is also very different to Vismann’s – law and media philosophy.

41 Zuboff, Surveillance Capitalism, 330.

42 Ibid., 290, emphasis original.

43 Irigaray, This Sex Which is Not One, 110.

44 Haraway, “Cyborg Manifesto.”

45 Haraway, The Companion Species Manifesto.

46 Haraway, “Compoundings.”

47 Hayles, How We Became Posthuman, 2.

48 Ibid.

49 Braidotti, The Posthuman, 53.

50 Latour, Reassembling the Social, 10.

51 Latour, “Where are the Missing Masses?,” 64.

52 Latour. Reassembling The Social, 3.

53 Latour, An Inquiry into Modes of Existence.

54 See, for example, Verbeek, Moralizing Technology, and Ingold, Correspondences.

55 Barad, Meeting the Universe Halfway.

Bibliography

  • Agamben, Giorgio. Homo Sacer: Sovereign Power and Bare Life. Translated by Daniel Heller-Roazen. Stanford: Stanford University Press, 1998.
  • Appadurai, Arjun. Banking on Words: The Failure of Language in the Age of Derivative Finance. Chicago: University of Chicago Press, 2016.
  • Ayache, Elie. The Blank Swan: The End of Probability. Hoboken, NJ: John Wiley, 2010.
  • Barad, Karen. Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning. Durham: Duke University Press, 2007.
  • Bell, Wendell. Foundations of Futures Studies. Vol. 1. New Brunswick: Transaction, 1997.
  • Braidotti, Rosi. The Posthuman. Cambridge: Polity, 2013.
  • Clark, Andy. Surfing Uncertainty: Prediction, Action and the Embodied Mind. Oxford: Oxford University Press, 2016.
  • Davies, Hywel. “What Can We Learn From the Grenfell Tower Disaster?: Priorities for Sustainable Change.” SDAR* Journal of Sustainable Design & Applied Research 6, no. 1, Article 6 (2018). Accessed April 18, 2023.
  • Elmer, Simon. “The Truth about Grenfell Tower: A Report by Architects for Social Housing.” Architects for Social Housing, July 21, 2017. https://architectsforsocialhousing.co.uk/2017/07/21/the-truth-about-grenfell-tower-a-report-by-architects-for-social-housing/
  • Eubanks, Virginia. Automating Inequality: How High Tech Tools Profile, Police and Punish the Poor. New York: Macmillan, 2018.
  • Grenfell Tower Inquiry. Accessed April 18, 2023. https://www.grenfelltowerinquiry.org.uk/
  • Han, Byung-Chul. Psychopolitics: New Technologies of Power. Translated by Erik Butler. London: Verso, 2017.
  • Haraway, Donna. “Compoundings.” In Sensorium: Embodied Experience, Technology, and Contemporary Art, edited by Caroline Jones, 119–124. Cambridge, MA: The MIT Press, 2006.
  • Haraway, Donna. “Cyborg Manifesto: Science, Technology, and Socialist-Feminism in the Late 20th Century.” In Simians, Cyborgs and Women: The Reinvention of Nature, 149–181. New York: Routledge, 1991.
  • Haraway, Donna. The Companion Species Manifesto: Dogs, People, and Significant Otherness. Chicago: University of Chicago Press, 2003.
  • Hayles, N. Katherine. How We Became Posthuman: Virtual Bodies in Cybernetics, Literature and Informatics. Chicago: University of Chicago Press, 1999.
  • Hayles, N. Katherine. Unthought: The Power of the Cognitive Unconscious. Chicago: University of Chicago Press, 2017.
  • Heikkilä, Melissa, and Will Heaven. “Yann LeCun Has a Bold New Vision for the Future of AI.” MIT Technology Review, June 24, 2022.
  • Ingold, Tim. Correspondences. Cambridge: Polity, 2020.
  • Irigaray, Luce. This Sex Which is Not One. Translated by Catherine Porter. Ithaca: Cornell University Press, 1985.
  • Jameson, Fredric. “The Aesthetics of Singularity.” New Left Review 92 (March–April 2015): 101–132.
  • Keynes, John Maynard. A Treatise on Probability. 1921. Reprint, TheClassics.us, 2013.
  • Latour, Bruno. An Inquiry into Modes of Existence: An Anthropology of the Moderns. Translated by Catherine Porter. Cambridge, MA: Harvard University Press, 2013.
  • Latour, Bruno. Reassembling the Social: An Introduction to Actor-Network Theory. Oxford: Oxford University Press, 2005.
  • Latour, Bruno. “Where Are the Missing Masses? The Sociology of a Few Mundane Artifacts.” In Shaping Technology/Building Society: Studies in Sociotechnical Change, edited by Wiebe Bijker and John Law, 225–258. Cambridge, MA: MIT Press, 1992.
  • Lessig, Lawrence. Code and Other Laws of Cyberspace. New York: Basic Books, 1999.
  • Lushetich, Natasha. “Stupidity: Human and Artificial.” Media Theory 6, no. 1 (2022): 113–126.
  • McQuillan, Dan. “Data Science as Machinic Neoplatonism.” Philosophy & Technology 31 (2018): 253–272.
  • Von Neumann, John, and Oskar Morgenstern. Theory of Games and Economic Behavior. Princeton: Princeton University Press, 2007 [1944].
  • O’Neil, Cathy. Weapons of Math Destruction. New York: Broadway Books, 2016.
  • Perkinson, Henry. “Popper’s Fallibilism.” ETC.: A Review of General Semantics 35, no. 1 (1978): 5–19.
  • Poli, Roberto. Introduction to Anticipation Studies. Springer, 2017.
  • Srnicek, Nick. “Navigating Neoliberalism: Political Aesthetics in an Age of Crisis.” After Us: Art-Science-Politics (2015): https://medium.com/after-us/navigating-neoliberalism-f9fae2405488
  • Vaughan, Diane. The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. Chicago: University of Chicago Press, 1996.
  • Verbeek, Peter-Paul. Moralizing Technology: Understanding and Designing the Morality of Things. Chicago: University of Chicago Press, 2011.
  • Vismann, Cornelia. “Cultural Techniques and Sovereignty.” Theory, Culture & Society 30, no. 6 (2013): 83–93.
  • Wekinator. Accessed April 18, 2023. http://www.wekinator.org/
  • Zuboff, Shoshana. The Age of Surveillance Capitalism. London: Profile Books, 2019a.
  • Zuboff, Shoshana. “Surveillance Capitalism and the Challenge of Collective Action.” New Labor Forum 28, no. 1 (2019b): 10–29.
