
Fissures in algorithmic power: platforms, code, and contestation

Fabian Ferrari & Mark Graham

ABSTRACT

Digital labour platforms do not attempt to build trust between worker, client, and platform on the basis of strong and durable ties. Instead, platforms utilize a double articulation of algorithmic power to govern spatially dispersed workforces in both material and discursive ways. However, algorithms do not have hegemonic outcomes, and they do not entirely strip away agency from platform workers. Through manipulation, subversion, and disruption, workers bring fissures in algorithmic power into being. Fissures in algorithmic power are moments in which algorithms do not govern as intended. While these moments do not simply result in positive outcomes for workers, they show that algorithmic power is inherently only ever partial.

Introduction: what are fissures in algorithmic power?

Ostensibly, algorithms ‘rule our world’ (Steiner Citation2012). More than a decade ago, Lash (Citation2007, p. 71) argued that ‘a society of ubiquitous media means a society in which power is increasingly in the algorithm’. Algorithms have become actors in society that shape how people live, love, and work. They direct and redirect economic processes, affect high school grades and college admissions, and influence how people get jobs and how those jobs are managed. Algorithms, in other words, exert and mediate power, enabling and constraining social action in a myriad of realms.

In recent years, a body of literature has asserted that algorithms can usher in a new era of efficiency, transparency, and good governance (Schildt Citation2017). In opposition to this work, there is a deep vein of scholarship that is critical of the social benefits of algorithms. Such research has exposed how search engines reinforce racial biases (Noble Citation2018), the impacts of automated decision-making on working-class communities (Eubanks Citation2018), and the relationship between quantification and precarious labour (Moore Citation2017). While these two strands of work clearly stand in opposition to one another, they have one thing in common. Namely, they agree on the transformative power that algorithms can have in society. Therefore, a sense of algorithmic hegemony characterizes contemporary analyses of power relations in the ‘black box society’ (Pasquale Citation2015).

In this paper, we contend that digital labour platforms are essential objects of analysis to grasp the contested nature of algorithmic power. Digital labour platforms can be thought of as a set of digital infrastructures that mediate interactions between consumers and workers: bringing together the supply of and demand for labour. In most cases, a single company controls that digital infrastructure as proprietary resources. However, there are also examples in which control is exerted by multiple economic actors. Following Woodcock and Graham (Citation2019), we distinguish between two types of platform work. Geographically-tethered work requires a job to be done in a particular place (e.g. delivering food from a restaurant to an apartment or driving a person from one part of town to another). Cloud work, in contrast, is work that can, in theory, be requested and conducted from anywhere. Requesters, or clients, can use digital labour platforms to find workers that may be located anywhere on the planet. In both cases, the use of algorithms to govern spatially dispersed workforces is a defining feature of the labour process.

However, we argue that algorithms do not have hegemonic outcomes, and they do not entirely strip away agency from platform workers. Through manipulation, subversion, and disruption, workers bring fissures in algorithmic power into being. We define fissures in algorithmic power as moments in which algorithms do not govern as intended. These fissures are often intentionally created by sociotechnical practices of platform workers.Footnote1 Practices of manipulation, subversion, and disruption, however, do not simply equate with liberation or autonomy for workers. In fact, fissures in algorithmic power entail serious risks: deactivation from the platform, a deepened normalization of gamification and quantification, and the erosion of solidarity and trust between workers. Nonetheless, they underscore the fact that workers have various tactics at their disposal to interact with the digital infrastructure of platforms in ways that work in their favour.

Merriam-Webster (Citation2020) offers two non-medical definitions of a ‘fissure’: First, a ‘narrow opening or crack of considerable length and depth usually occurring from some breaking or parting’. Second, a ‘separation or disagreement in thought or viewpoint’. We mobilize this analogy precisely due to the term’s ambiguity. Given its double entendre of material connotations as in geological fissures and discursive connotations as in ideological fissures, the analogy enables us to relate the contested technological materiality of platforms to their disputed discursive politics.

Before introducing the notion of fissures in algorithmic power, we first outline the value of applying the analysis of the politics of infrastructure to digital labour platforms. Drawing on a multi-year action research project, we then discuss the sociotechnical practices in which workers manipulate, subvert, and disrupt the operations of platforms. We conclude by reflecting on the wider implications of fissures in algorithmic power beyond the realm of digital labour platforms.

Digital labour platforms and the politics of infrastructure

It has long been a key focus of infrastructure studies and related disciplines to understand the ‘materialities of things, sites, people, and processes that locate media distribution within systems of power’ (Parks and Starosielski Citation2015, p. 5). More recently, a rich body of literature has grown around the relatively novel tendency of digital platforms to acquire a range of features that were traditionally in the realm of infrastructure (Plantin et al. Citation2018). Plantin and Punathambekar (Citation2019, p. 2) argue that some platforms have reached such a high level of indispensability that they now ‘seem to function as vital infrastructures in the world at large […] such that living without them shackles social and cultural life’, for instance by governing urban transportation ecosystems (Van Dijck et al. Citation2018) or by bringing a ‘planetary labour market’ into being (Graham and Anwar Citation2019). Van Doorn (Citation2017, p. 5) argues that platforms are ‘active “infrastructural” agents in the reconstitution of labour relations and the nature of work’.

As several contributions in this special issue of Cultural Studies aptly demonstrate, infrastructures are by no means neutral mediators without politics or distinct modes of political contestation. Instead, infrastructures actively shape and influence ‘the nature of a network, the speed and direction of its movement, its temporalities, and its vulnerability to breakdown’ (Larkin Citation2013, p. 328). Along similar lines, Graham (Citation2010, p. 9) contends that studying moments of infrastructural breakdown, dissolution and change is ‘perhaps the most powerful way of really penetrating and problematizing those very normalities of flow and circulation’. In short, the analytical prism of the politics of infrastructure prioritizes the contested vis-à-vis the consensual.

When it comes to conceptualizing the contested politics of digital labour platforms, it is paramount to consider the strain of workers internalizing entrepreneurial risks as an inextricable feature of platform-mediated work, echoing general patterns of the ‘fissured workplace’ (Weil Citation2014). Similar to innovative industries associated with the ‘dot-com bubble’ (Neff Citation2012) and myriad ‘alternative work arrangements’, platform workers bear the burden of enforcing employment rights in order to challenge their legal misclassification as self-employed contractors (Prassl and Risak Citation2015). This status quo is amplified by the fact that most digital platforms classify both workers and clients as customers of their service. As Rosenblat (Citation2018, p. 9) argues, ‘by muddying the bright red line that defines traditionally distinct roles, like those of the worker, entrepreneur, and consumer, [platforms] rewrite the rules of work surrounding algorithmic technology’.

In this regard, it is worth stressing that some digital labour platforms deploy machine learning systems based on real-time data from workers and clients. A key implication of this is the non-compensation of algorithm training. A navigation algorithm of a ride-hailing platform like Uber ‘can guide most traffic along the best performing route, while routing a small fraction of the traffic via an alternate, undertested, possibly sub-optimal route, merely to test the route and generate data about road conditions’ (Choudary Citation2018, p. 12). By capturing valuable data through A/B testingFootnote2 without compensating or informing workers about these experiments, platforms may amplify pre-existing information asymmetries. Given the real-time experimentation that shapes the ontogenetic materiality of algorithmic systems, the affordances that they entail for workers’ modes of transgressing a platform’s code and affecting its operations in unintended ways are dynamic, contingent, and malleable as well. In other words, the digital infrastructure of labour platforms can be described as being in a ‘permanently beta status’ (Neff and Stark Citation2003), perpetually adapting to always-unfolding sociospatial relations and formations of workers and clients.
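To make the routing experiment Choudary describes concrete, the sketch below shows a minimal exploration routine of the kind a navigation system might use. It is a sketch under stated assumptions: the function name, the explore_rate parameter, and the route labels are hypothetical illustrations, not drawn from any platform’s actual code.

```python
import random

def assign_route(best_route, candidate_routes, explore_rate=0.05):
    """Route most trips along the best-performing option, but divert a small,
    random fraction to an undertested alternative to generate fresh data."""
    if candidate_routes and random.random() < explore_rate:
        # The driver absorbs the cost of this experiment: the alternate route
        # may be slower, and the trip is not compensated any differently.
        return random.choice(candidate_routes), "exploration"
    return best_route, "exploitation"

# Hypothetical usage: most calls return route_A; roughly 5% test an alternative.
route, mode = assign_route("route_A", ["route_B", "route_C"])
print(route, mode)
```

The design choice that matters here is that the cost of exploration falls on the worker, while the data generated accrues to the platform.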

This section has shown that the politics of infrastructure with respect to digital labour platforms are not located in a single or stable entity, practice or hierarchy. Instead, they are continually negotiated within complex sociotechnical relations that directly affect the livelihood of platform workers. Digital labour platforms do not attempt to build trust between worker, client, and platform on the basis of strong and durable ties. Therefore, we argue that the algorithmic power utilized by platforms is best understood as a double articulationFootnote3 of both material and discursive practices.

The double articulation of algorithmic power

Inspired by Silverstone’s (Citation1999) dialectical concept of mediation, we theorize algorithmic technologies that govern the operations of platform labour as being ‘located within the flows of particular socio-cultural discourses’ (Livingstone Citation2007, p. 18). In describing the interactions of the materiality of technology and socio-cultural discourses as dialectical, we posit that ‘human action simultaneously creates structures as social systems and is shaped by such structures’ (Jackson et al. Citation2002, p. 6). As Bucher (Citation2017, p. 42) writes, ‘while algorithms certainly do things to people, people also do things to algorithms’. Hence, the creation and re-creation of an algorithmic order for platform labour is an open-ended process shaped by the duality of structure, yet in different ways for different settings. The organization of labour processes, then, is never fixed in nature; it is not just shaped by the decisions of platform designers and managers, but crucially by the deviant actions and formations of workers. Algorithmic power enables and constrains social action, it entails domination and counteractions, and it is practised – not possessed.

Scholars have long shown that digital forms of labour control tend to be imperfect rather than all-encompassing. Bain and Taylor (Citation2000, p. 16) unveil call centre workers’ agency to disrupt the ‘electronic panopticon’, individually and collectively. Shestakofsky (Citation2017, p. 393) asserts that ‘when software systems are designed to defend against algorithmic efforts to game them, human variability and imperfection may be able to circumvent software’s strictures’. Amin and Thrift (Citation2002, p. 128) build on Latour’s oligopticon, arguing that ‘the networks of control that snake their way through cities are necessarily oligoptic, not panoptic: they do not fit together’. Kitchin and Dodge (Citation2011, p. 75) write that ‘while the system strives for perfection in terms of regulating and producing code/space, it continues to have cracks that allow unintentional sociospatial relations and formations’. Likewise, scholars in the tradition of the social shaping of technology (Mackay and Gillespie Citation1992, p. 698) hold that ‘people may reject technologies, redefine their functional purpose, customize or even invest idiosyncratic symbolic meanings in them’.

In terms of algorithmic modes of managerial control in the context of global outsourcing, Aneesh (Citation2009, p. 367) notes that ‘code may not always succeed in organizing work as intended’. Based on ethnographic fieldwork with web journalists and criminal justice practitioners, Christin (Citation2017) similarly describes discrepancies between the rhetoric of employers and the everyday role of algorithms as processes of ‘decoupling’. She argues that the ‘descriptions [managers] provide of algorithmic use often diverge from what takes place among their employees and many algorithms are either ignored or actively resisted’ (Christin Citation2017, p. 9). All these examples show that the malleability of managerial control to workers’ practices of ‘vertical and horizontal disruption’ (Evans and Kitchin Citation2018, p. 44) is by no means a novel phenomenon that emerged with the rise of labour platforms.

Despite these similarities, we argue that the discontinuity and historical distinctiveness of digital labour platforms lies precisely in their use of algorithms as mediators between the material and the discursive. Algorithms serve not only as a purely technological instrument to control, track and supervise spatially dispersed workforces in ever-more efficient ways. Platform companies also make use of their proprietary technology to devolve responsibility for their own misbehaviour, offloading liability for instances such as wage theft and union busting onto glitches of their algorithmic systems (Captain Citation2019). It would thus be insufficient to restrict the analysis of fissures in algorithmic power to either the discursive or the material. Rather, it is crucial to analyse those analytical dimensions as two sides of the same coin.

On the one hand, platforms carefully deploy narratives of flexibility, freedom, and entrepreneurship to systematically devolve social and economic risks to workers. Platforms position themselves as technology companies and neutral intermediaries rather than as, say, transportation companies. The rationale for this specific framing is that it allows platform businesses to tactically avoid labour laws and sector-based regulations addressing key issues like minimum wage, health and security of workers, or freedom of association. Deliveroo, a food delivery platform, categorizes its couriers as self-employed contractors rather than employees, providing managers with a document that outlines the adequate vocabulary: ‘Do say: Supplier agreement, e.g.: Your supplier agreement may be terminated if you continue to fail to meet the service delivery standards. Don’t say: Employment contract, e.g.: You are obliged by your employment contract to hit certain performance targets’ (Butler Citation2017). As independent contractors, couriers do not receive sick pay, holiday pay, or minimum wages.Footnote4 Platforms put effort into highlighting the ephemeral temporalities of the labour relationship; workers are seen as an anonymous, replaceable group that can be managed seamlessly (Gray and Suri Citation2019).

Platforms also carefully frame what does and does not count as working time. Delivery and taxi drivers, for instance, spend significant amounts of time waiting on their bikes, and in their cars, for jobs to come in. Those workers create value for their respective platforms by allowing them to elastically scale up their orders. However, those workers are not paid for any of that time: it is carefully classified and articulated as non-work time. Cloud work platforms are similarly characterized by large amounts of unpaid labour. Indeed, one study found that online freelancers spend an average of 16 hours a week looking for new jobs (Wood et al. Citation2019). At the same time, those same freelancer platforms are careful to frame payment for platform workers as essentially optional. Guru.com prominently notes on its website: ‘Pay only for a job well done’. Upwork.com similarly boasts: ‘Upwork … helps ensure that an hour paid is an hour worked’.
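A simple worked example illustrates how unpaid search time dilutes effective earnings; all figures below are hypothetical except the 16-hour weekly search average reported by Wood et al. (Citation2019).

```python
# All figures are hypothetical except the 16-hour weekly search average
# reported by Wood et al. (2019).
paid_hours = 25        # hours billed to clients per week (hypothetical)
hourly_rate = 10.0     # advertised rate in USD (hypothetical)
search_hours = 16      # average unpaid weekly job-search time (Wood et al. 2019)

weekly_income = paid_hours * hourly_rate
effective_rate = weekly_income / (paid_hours + search_hours)
print(f"nominal rate:   ${hourly_rate:.2f}/hour")
print(f"effective rate: ${effective_rate:.2f}/hour")  # roughly $6.10 once search time counts
```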

On the other hand, when it comes to the dimension of technological materiality, workers face the challenge of making sense of the changing modes and operations of algorithmic forms of management. Our understanding of materiality entails workers’ interpretative practices and the construction of folk theories on how algorithms work, as highlighted by a number of empirical studies (Chan and Humphreys Citation2018; Rosenblat Citation2018). Importantly, the ways in which workers interact with, and are able to make sense of, platforms vary fundamentally across sectors, geographies, and degrees of labour fragmentation. For example, some workers come into physical contact with each other, while contact between domestic workers is limited and freelancers usually work from home. Yet, what unites these examples is that all digital labour platforms utilize ‘a diverse set of technological tools and techniques that structure the conditions of work and remotely manage workforces’ (Mateescu and Nguyen Citation2019).

Although digital labour platforms prescribe an algorithmic order – one that is constantly being created and re-created – for the organization of work, research shows that workers interact with algorithms that govern the labour process in ways not intended by platform designers and managers (Chen Citation2018a; Sun Citation2019; Kellogg et al. Citation2020). However, the relationship between such deviant practices and the power exerted and mediated by algorithms remains under-theorized. In particular, it is pivotal to recognize that algorithms deployed by platforms to govern workforces are not isolated or fixed computational models. Instead, they are embedded in ‘massive, networked [systems] with hundreds of hands reaching into them, tweaking and tuning, swapping out parts and experimenting with new arrangements’ (Seaver Citation2013, p. 10). For instance, there is not just one Uber algorithm that manages all drivers on the planet in the exact same way. Instead, workers are being governed by, and interact with, different permutations of code that are ‘only knowable in their becoming as opposed to their being’ (Bucher Citation2018, p. 49). This does not necessarily mean that platforms develop multiple algorithms. The point is that there are multiple realities of how workers are ‘perceiving, feeling, acting, and knowing’ (Bucher Citation2018, p. 18) algorithms in practice.

As we have argued in this section, the double articulation of algorithmic power requires taking seriously the contested discursive politics of platforms without ignoring their complex materiality. However, while platforms wield algorithmic power discursively as well as materially, fissures in the operations of platforms are also characterized by this double articulation.

Fissures in algorithmic power: manipulation, subversion, disruption

This paper draws primarily on a multi-year action research project in order to support its core arguments. We have been conducting research on platform work and platform workers in Europe, Africa, and Asia over the last decade. This research allowed us to conduct one-on-one interviews, surveys, focus groups, and action research workshops in which we convened platform workers and platform managers in the same rooms. For our theoretical contribution, we also build on desk research and other studies. This strategy of drawing on a broad base of evidence is intended not to make representative claims, but rather to point to actually-existing examples from the lives of platform workers. We present a theoretically grounded categorization of how workers ‘make use of technology to fight against the imperfect panopticon and […] determine their own conditions’ (Chen Citation2018b, p. 234). Our claims are necessarily about the existence of practices in which workers affect the operations of algorithmic power, rather than the frequency of their occurrence.

Table 1 provides a categorization of fissures in algorithmic power in the context of digital labour platforms. It is important to note that these categories may dynamically overlap with each other and by no means claim analytical exclusivity. For example, subverting a platform’s rules and operations through creating an artificial rise in surge pricing might coincide with practices of manipulation. Yet, each category also dovetails with risks for workers, which range from deactivation to the normalization of quantified rankings and behavioural nudges as forms of control.

Table 1. A typology of fissures in algorithmic power.

Manipulation

Manipulation points to the circumvention of algorithmic power. Workers break the rules of the platforms, often through digitally-mediated practices. Examples of the manipulation of geographically-tethered platform work include the use of third-party software to automatically reject or accept job requests, place fake bookings, or inflate fares through the modification of GPS data flows. Although qualitative and quantitative research on such practices is still in its infancy, burgeoning research with a focus on ride-hailing platforms suggests that manipulation is not a marginal phenomenon. Chen (Citation2018a, p. 2705) finds that ‘about 40% [of ride-hailing drivers in China] in total have reported either installing bot apps (N = 1719) or registering their vehicles on multiple devices (N = 505)’. Those alternative user interfaces allow drivers to compare several requests and choose the best option. As a result, workers complicate ‘the temporal politics within the vehicle, as drivers are able to expose themselves to multiple ride requests at the same time’ (Chen Citation2018a, p. 2702). The affordances of bot apps range from rejecting ride requests without sanctions to catching the request with the highest fare, but they do not necessarily guarantee a higher income for workers.
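The comparison-and-selection behaviour described above can be sketched minimally as follows. This is an illustration under assumed data fields and thresholds, not a reconstruction of any actual bot app.

```python
from dataclasses import dataclass

@dataclass
class RideRequest:
    request_id: str
    estimated_fare: float
    pickup_distance_km: float

def pick_best_request(requests, max_pickup_km=3.0):
    """Expose the driver to several requests at once and keep only the
    highest-fare job within an acceptable pickup distance."""
    nearby = [r for r in requests if r.pickup_distance_km <= max_pickup_km]
    if not nearby:
        return None  # let all requests lapse rather than accept a poor one
    return max(nearby, key=lambda r: r.estimated_fare)

# Hypothetical usage
requests = [
    RideRequest("a1", estimated_fare=6.5, pickup_distance_km=1.2),
    RideRequest("b2", estimated_fare=11.0, pickup_distance_km=2.8),
    RideRequest("c3", estimated_fare=14.0, pickup_distance_km=5.5),
]
print(pick_best_request(requests))  # selects the 11.0 fare within 3 km
```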

It is crucial to situate manipulation as part of the precarious situation of workers. When interviewing platform managers about the working conditions of their workers, the most common response from platforms is either ‘they are not our workers’ or even ‘they are not workers’. What is happening here is not a denial that the work is being carried out, or that it is being carried out by people, but rather a fundamental reframing of the organization of that work. During an interview about working conditions, the representative of one large transportation platform, when asked about payment, quickly noted that ‘we don’t pay our workers’. This is not an assertion that workers do not receive any income for their labour. Rather, it is a discursive inversion of the direction in which that income flows. To the manager, it is clients who pay workers, and workers who pay the platform a fee for connecting them. Platforms, in other words, present themselves as neutral mediators providing a digital infrastructure for economic activity – one that is vulnerable to manipulation in various ways.

Drawing upon a survey of 516 platform workers, the Institute for Development of Economics and Finance in Jakarta claims that 81 percent of motorcycle and ride-hailing drivers ‘manipulated orders to reach their targets’ (The Straits Times Citation2018), with 61 percent being ‘aware that their colleagues manipulated orders to get incentives’ (ibid.). This includes the use of fake accounts and modified apps to bypass root and mock location detections and to cancel rides without sanctions. According to Yuniar (Citation2018), ‘on the dark web, there are at least 1,300 unofficial versions of Go-Jek’s app [an Indonesian platform company]’. As a reaction, platform companies design and implement tracking systems to detect modified apps and run whistle-blower campaigns that encourage drivers to report others engaged in practices of manipulation.

A creative example is the use of mock GPS software by Uber drivers (Adegoke Citation2017). In those cases, platform workers modify the operations of platforms by bringing fake spatial data flows into being that algorithmic systems may misinterpret as the genuine movement of cars. Interestingly, as Uber Engineering (Citation2018) admits, ‘location integrity as a defence strategy is a complex task and suffers from limitations in regions with few Uber trips’. Uber also acknowledges that the ‘fraud black market itself is very sophisticated and adapts to new products and new services over time’. These subtle changes in platform design and the key role of training data for machine learning systems used to automatically detect GPS spoofing software highlight the contingency of fissures in algorithmic power: an ongoing cat-and-mouse game without a clear winner.
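As the Uber Engineering quote suggests, actual defences rest on machine learning over trip data; the sketch below shows only a toy plausibility heuristic of the kind such systems might complement, flagging GPS fixes whose implied speed no car could plausibly reach. The function names, threshold, and trace are illustrative assumptions.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS points, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def flag_implausible_jumps(points, max_speed_kmh=150.0):
    """Toy heuristic, not any platform's actual detector: flag consecutive
    GPS fixes whose implied speed exceeds what a car could plausibly do."""
    flags = []
    for (t1, lat1, lon1), (t2, lat2, lon2) in zip(points, points[1:]):
        hours = max((t2 - t1) / 3600.0, 1e-9)
        speed = haversine_km(lat1, lon1, lat2, lon2) / hours
        flags.append(speed > max_speed_kmh)
    return flags

# Hypothetical trace: the second hop jumps roughly 60 km in one minute.
trace = [(0, 52.52, 13.40), (60, 52.53, 13.41), (120, 52.90, 14.10)]
print(flag_implausible_jumps(trace))  # [False, True]
```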

However, it is not only the economic precarity of the platform-mediated labour relationship that leads workers to practices of manipulation. The structural vulnerability of migrant workers, who provide a large share of the labour-power behind many digital labour platforms, also complicates the analysis of manipulation. In the past, a range of platform companies made little effort to check who uses their accounts, giving those lacking a visa, work permit, or social security number new income opportunities (Van Doorn et al. Citation2020). For instance, Bryan (Citation2019) reports on food delivery workers across the United Kingdom renting their Deliveroo and UberEATS accounts to others who have not passed official requirements such as right-to-work checks. Similar examples of couriers subcontracting platform work to migrant workers have taken place in other countries. Given that workers are incentivized to report other workers engaged in such practices, it is important to stress that breaking the rules of platforms through manipulation entails the danger of decomposing solidarity and trust within informal support networks.

Subversion

The second category, subversion, addresses the attempts of workers to subvert platforms’ rules without breaking them. The difference between manipulation and subversion is that the latter occurs through opportunities built in the labour process itself. Therefore, it is worth stressing that practices of subversion do not necessarily entail the same level of deviance as practices of manipulation or disruption.

With regard to location-based platforms, subversion encompasses the intentional cancellation of ‘rides in the system to avoid negative ratings from angry customers, since negative ratings lead to automatic sanctions’ (Möhlmann and Zalmanson Citation2017, p. 12). Other examples include the general avoidance of UberPOOL rides, logging off to prevent a software time-out after letting a job request go, triggering false demand in certain areas to subvert mechanisms of surge pricing, and using multiple accounts and devices. Another example is Uber and Lyft drivers at Reagan National Airport in Washington DC simultaneously turning off their ride-hailing apps for a minute or two to trick the app into thinking there are no drivers available, collectively causing a temporary price surge (Mamiit Citation2019). In reaction to media reports, workers warned others: ‘don’t talk about Surge Club’. Those moments in which algorithms do not work as intended by platform managers are collectively brought into being by platform workers (i.e. affecting surge price systems through orchestrated log-offs), but workers also benefit from loopholes that enable deviant subversive practices in the first place (i.e. the technical design of Uber’s surge price system).
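The dynamic that coordinated log-offs exploit can be illustrated with a toy surge model; the formula, cap, and numbers below are assumptions for illustration, not Uber’s or Lyft’s actual pricing algorithm.

```python
def surge_multiplier(open_requests, online_drivers, base=1.0, cap=3.0):
    """Toy surge model: the price multiplier rises as demand outstrips
    the number of drivers currently online, up to a fixed cap."""
    if online_drivers == 0:
        return cap
    ratio = open_requests / online_drivers
    return round(min(cap, max(base, base * ratio)), 2)

# Hypothetical airport queue: 60 waiting requests.
print(surge_multiplier(60, 80))  # 1.0 -> plenty of drivers, no surge
print(surge_multiplier(60, 25))  # 2.4 -> coordinated log-offs shrink supply
print(surge_multiplier(60, 15))  # 3.0 -> surge hits the cap; drivers log back in
```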

As part of fieldwork that we conducted in Berlin, a food delivery courier stated: ‘What we are concerned about is that it is not clear how the fee for each order is determined, but we have different strategies to deal with the app’. For example, couriers try to make sense of the app’s zone-based order distribution system by exchanging their experience within chat groups. Given the opaque allocation of orders, it is important to underline that processes of collective sense-making amongst platform workers shape not just their beliefs and expectations vis-à-vis the labour process but also the ways in which they may find loopholes in the operations of platforms.

As workers share those practices with each other, both in person and through digitally-mediated means, it is important to theorize fissures in algorithmic power as products of collective, rather than individual, sense-making. A focus on the relational nature of fissures in algorithmic power enables a recognition that workers need certain skills, knowledge, and other immaterial and material resources to make use of the digital infrastructure of platforms in ways that work in their favour. At the same time, the ways in which workers interact with, and can make sense of, platforms vary fundamentally across sectors, geographies, and degrees of labour fragmentation.

The cloud work platform Upwork, for instance, is sensitive to the fact that their clients need some mechanism to be able to trust that workers – who are often on the other side of the planet – are doing the work that they are paying for. Their solution to this trust issue is an automated surveillance system. The platform captures a screenshot from workers’ computers at a random moment within every ten-minute block. The randomness is built in to stop workers from gaming the system. However, some workers have found ways to use that monitoring system to their own advantage. One worker revealed during an interview: ‘It comes once every 10 min. Once it has shown up in a 10-minute block you leave … you have nine minutes to do everything that is totally non-related to work’. The system could also be bypassed simply by setting up a second monitor: ‘Since I’m technical I connect my laptop to my TV … so I have two screens, I’m watching YouTube while I’m working on the platform … because the screenshot is only for the main [monitor]’.
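A minimal sketch of the monitoring pattern the worker describes, assuming one capture at an unpredictable moment within each ten-minute block (the function and parameters are hypothetical, not Upwork’s actual implementation):

```python
import random

def schedule_screenshots(blocks=6, block_minutes=10, seed=None):
    """One screenshot at an unpredictable minute within each ten-minute
    block. Once the capture for a block has fired, the remainder of that
    block goes unobserved -- the loophole the interviewed worker exploits."""
    rng = random.Random(seed)
    return [block * block_minutes + rng.uniform(0, block_minutes)
            for block in range(blocks)]

# Hypothetical hour of monitoring (minutes from the start of the session).
for minute in schedule_screenshots(seed=42):
    print(f"screenshot at minute {minute:5.1f}")
```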

This is not to dismiss the serious risks for platform workers associated with the normalization of algorithmic management, gamification, and metric-based rankings. One food delivery courier stressed in our workshops that he ‘rides very fast in order to reach the performance targets, and cuts the lights from time to time, which can be dangerous’. As platform workers conform to algorithmic power by trying to make sense of it, they also normalize those mechanisms of labour control. Coping with the automated performance targets by platforms can result in serious physical danger for workers, exemplified by several fatal accidents involving workers (Pskowski Citation2019).

Disruption

This mode of counteraction refers to the constitution of digital picket lines to disrupt the operating procedure of platforms. In other words, workers attempt to collectively rewrite the rules of platforms. By taking modes of disruption seriously, we contest Ettlinger’s (Citation2018, p. 5) distinction between ‘productive’ modes of algorithmic resistance and those that ‘avoid or disrupt or obfuscate the digital environment’. Disrupting the seamless operating procedure of platforms through enacting digital picket lines is a key way for workers to gain leverage over the platforms that they work for. In contrast to the invisible efforts of manipulating and subverting platforms, disruption exposes the relevance of spatial visibility. In our interviews that brought together platform managers and workers, some workers told us that, in years of working for their platform, this was the first time they had ever seen a representative of the company.

At the same time as they reconfigure labour relations, digital labour platforms also reshape the rules of how collective action emerges. In order to coordinate disruptions, workers may use the affordances of the platform to their advantage. For instance, one particularly relevant affordance of Deliveroo’s platform, algorithmically determined physical meeting points located near popular restaurants, was appropriated and reconfigured by Deliveroo riders in August 2016 to push back against the platform company’s decision to change its payment structure to piecework. Although these meeting points were intentionally designed and implemented by the platform company to boost the efficiency of deliveries, a labour organizer stressed in an interview with us that

those meeting points helped workers not just to get to know each other, or talk about working conditions, but to build a collective identity. You can create the idea of a space, within which you can talk about those things and consider your relationship with management and what they’re offering you.

However, it is important to highlight that the tensions between platforms and workers do not play out in the same way across different countries and settings. Instead, the nature of such disruptions is always shaped by contextual and geographical particularities. For example, in our fieldwork in South Africa, some platform food-delivery workers noted that on strike days, some of them travel around town and vandalize the bikes of other platform workers who refused to participate in the strike. Nastiti (Citation2017, p. 31) argues that ride-hailing drivers in Indonesia ‘relate more to a restorative demand (i.e. to go back to the old system without the performance factor) rather than an aggressive demand (i.e. demanding employment status and rights)’. While the question of which demands lead to strike action remains a task for comparative research, case studies on platform worker mobilizations in multiple European countries (Vandaele Citation2018), Italy (Briziarelli Citation2018), India (Surie Citation2018) and China (Chen Citation2018b) demonstrate that workers around the globe are able to defy circumstances that complicate collective action. While such networked mobilizations occur rapidly, often without the involvement of traditional labour unions, they also tend to break apart quickly, partly due to high levels of labour turnover.

It is vital to note here that while most platforms refuse to negotiate with or even acknowledge the collective mobilizations of their workers, collective action can pose an existential threat to the platform model. Slipped into the hundreds of pages of Uber’s (Citation2019) S-1 filing with the United States Securities and Exchange Commission was the following admission:

We have previously received a high degree of negative media coverage around the world, which has adversely affected our brand and reputation and fuelled distrust of our company. In 2017, the #DeleteUber campaign prompted hundreds of thousands of consumers to stop using our platform within days. If a campaign similar to #DeleteUber occurs, if we fail to provide high-quality support, or if we cannot otherwise attract and retain a large number of Drivers, consumers, restaurants, shippers, and carriers, our revenue would decline, and our business would suffer.

As such, a campaign built around a hashtag could not just cause the platform to haemorrhage users, but also prompt the company to change and adapt its business practices to be less objectionable to its users. Although the material successes of collective actions by platform workers have been limited, their symbolic implications must not be underestimated. Platforms profit from being seemingly ephemeral organizations – but their weightless digital infrastructure renders them very vulnerable to alternatives (Graham Citation2020). They are highly sensitive to public opinion, and they exist only insofar as we pay attention to them.

Conclusion: beyond algorithmic hegemony

Those with power in the village are not, however, in total control of the stage. They may write the basic script for the play but, within its confines, truculent or disaffected actors find sufficient room for maneuver to suggest subtly their disdain for the proceedings. (Scott Citation1985, p. 26)

In Weapons of the Weak: Everyday Forms of Peasant Resistance, James C. Scott encapsulates the omnipresence of mundane resistance practices in the wake of the Green Revolution in Malaysia. Drawing on ethnographic fieldwork, he unmasks the attempts of disenfranchized peasants to regain self-determination over agricultural labour processes and their everyday life. Applying Scott’s mode of analysis to the focus of this paper, platform workers’ utilization of the weak spots of ‘the basic script for the play’ (Scott Citation1985, p. 26) exhibits the fact that algorithmic power is inherently only ever partial. Through manipulation, subversion, and disruption, workers defy information asymmetries engendered by black-boxed systems, thereby challenging the discursive self-portrayal of digital labour platforms as seamless mediators of society’s datafication.

We have introduced the notion of fissures in algorithmic power in order to theorize the ways in which platform workers interact with technologies that control the labour process in ways not intended by platform companies. These moments in which algorithms do not govern as intended underline the necessity to challenge metaphors that ascribe an abstract sense of unidirectional power to algorithmic systems. From this point of view, algorithms are not fixed or stable technological artefacts with embedded politics. Algorithms are rather part of sociotechnical relations that are marked by certain forms of everyday contestation, which, in turn, dialectically affect the operations of platforms. Taking algorithmic fissures seriously enables us to challenge a culture of immunity and neutrality favoured by platform companies. Far from merely providing a neutral technical service to workers and clients, platforms bring reshaped modes of social contestation into being. Nonetheless, a key point of our paper has been that workers’ practices of gaining an advantage over platforms do not simply result in liberation or autonomy from algorithmic power. Fissures in algorithmic power are by no means equivalent to worker power. In fact, they entail serious risks for workers: legal consequences and deactivation, a normalization of quantified and gamified modes of labour control, and potential failures to enact long-term social infrastructures of solidarity.

As the ‘platformisation of labour and society’ (Casilli and Posada Citation2019) expands, we can do better than simply accepting the narrative that algorithms wield a hegemonic power that strips agency from workers. The rise of platform-based and algorithmically-managed work does not only bring about a ‘fundamental cultural shift in what it means to be employed’ (Rosenblat Citation2018, p. 4). What is more, digital platforms also engender a veritable cultural shift of what it means to contest this distinct organization of work. Through manipulation, subversion, and disruption, platform workers are showing that code alone cannot determine the futures of work.

Algorithms shape our societies, mould our politics, and direct our economies. As they are infused into the governance of ever-more life practices, we need to ensure that accountability, transparency, and user-participation are regulated into those systems. Until then, it is worth remembering that the fissures described in this paper show that the power that algorithms exert and mediate is far from hegemonic. Fissures, and the creative, playful, and powerful forms of resistance that bring them into being, rarely render algorithms impotent, but they do open up possibilities for agency, freedom and control to be exerted beyond their reach. While we cannot go back to a world without algorithms, we can overcome algorithmic hegemony in our modes of cultural analysis.

Acknowledgments

The authors would like to thank Gemma Newlands, Julian Posada and Adam Badger for their feedback on an early draft of the paper. The paper has also benefited from the conversations, debates, and political engagements that the authors have engaged in as part of the Fairwork and Geonet projects.

Disclosure statement

No potential conflict of interest was reported by the authors.

Additional information

Funding

This work was supported by the Economic and Social Research Council (ESRC) [grant ES/P000649/1], studentship number 2094254, and the University of Oxford (Scatcherd European Scholarship). Support also came from The Alan Turing Institute under the EPSRC [grant EP/N510129/1], the Federal Ministry for Economic Development (commissioned by GIZ) (BMZ), the ESRC (ES/S00081X/1), the Leverhulme Trust (PLP-2016-155), and the European Research Council (ERC-2013-StG335716-GeoNet).

Notes on contributors

Fabian Ferrari

Fabian Ferrari is a doctoral student at the Oxford Internet Institute, University of Oxford.

Mark Graham

Mark Graham is Professor of Internet Geography at the Oxford Internet Institute, and an Alan Turing Institute Faculty Fellow.

Notes

1 We intentionally deploy a broad definition in order to encompass a variety of practices and platforms and to highlight their implications for the wider analysis of ‘algorithmic culture’ (Striphas Citation2015).

2 A/B testing is a commonly used randomized experiment with two variants, comparing two versions of a single variable to determine which of the two is more effective. For instance, the Beijing-based ride-hailing platform Didi Chuxing has been testing a new algorithm that relies on techniques of reinforcement learning, utilizing A/B tests for different locations (Hao Citation2018).

3 While Langlois et al. (Citation2009, p. 417) point to Facebook’s double articulation of code and politics to show ‘how politics mobilize code at the same time as code formalizes politics according to specific informational logics’, we focus on the contested nature of labour processes in platform-based work.

4 This is not to dismiss that some courts – for example, in California, Poland, the Netherlands, and Spain – have recently ruled that platform workers are indeed employees rather than independent contractors. A key element of legal discussions is the dependence of workers on the algorithmic matching system of the platform. In other words, the existence of an authority relationship is essential.

References