Perspectives

Rediscovering a risky ideology: technocracy and its effects on technology governance

Pages 112-116 | Received 07 Jul 2020, Accepted 08 Jul 2020, Published online: 14 Sep 2020

ABSTRACT

Despite longstanding critiques, the dominant method of governing technology remains based on processes that abstract away the complexity of human and social factors. These processes frame political decision-making in terms of risk assessment and cost–benefit analyses. This article argues that the well-known issues arising from a technocratic method of governance cannot be addressed solely through procedural fixes (i.e. injecting more public participation and layperson perspectives into scientific or policy processes). That approach is, ironically, a technocratic one that views problems and solutions in terms of applying the right method. Rather, there is also a pressing need for research and practice directed not at the citizen side, but at understanding and confronting the politics and power of contemporary technocrats. Put simply, the questions of who governs technology and how they do it are answered through political contest.

When it comes to managing and governing emerging technologies in society, people tend to rely on some conception of ‘risk’ to make assessments: whether it is the environmental risks of some new nanotechnological application, the health risks of genetically modified dinners, or the socio-ecological risks of de-extinction (Valdez et al. 2019). Risk seems to work well as a common metric, in large part, because the concept is flexible enough to be applied to a wide range of technologies and situations.

As is well known, however, there is quite a difference between how scientists and regulatory officers conceive of risk and how the general public does (Rayner and Cantor 1987). For the former group, risk is an objective, quantitative measure of probabilities and percentages. The latter group, by contrast, is more concerned with relating risk to larger, qualitative concepts – what Rayner calls the ‘ordinary language category’ of risk (2010, 2620). This difference is no small thing; it is an important disparity that should be reflected in technology policy and practice.

As Rayner (2010) explains, when the general public talks about risk, they are primarily concerned with three factors: (1) can we trust the ‘institutions responsible for the commissioning, design, implementation, management, and regulation of the technology’ (Rayner 2010, 2620); (2) if something goes wrong, is there adequate liability in place to ensure that the costs are distributed and that any reparations will be paid out; and (3) do people believe that their right to consent was respected with regard to accepting or rejecting a technology?

This difference between the professional definition and the ordinary use of ‘risk’ is illustrative. With a cost–benefit analysis or risk assessment, analysts can plug their variables – a necessarily incomplete and reductionist set – into a handy algorithm. The outcome – a discrete number or group of numbers – presents a simple tool for guiding a host of otherwise complex decisions about the types of research and innovation that are integrated into society. In the world of scientists, assessors, and policy analysts, this veneer of numerical neutrality is often adequate. For the general public, though – the people who have to live with the risks and consequences – this process is clearly incomplete.
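
To make that reductionism concrete, here is a minimal sketch (in Python, with entirely hypothetical hazards, probabilities, and dollar figures) of the probability-times-consequence arithmetic that such assessments ultimately rest on:

    # Illustrative sketch only: a toy cost-benefit calculation of the
    # reductionist kind described above. All values are hypothetical.
    hazards = {
        # hazard: (probability of occurrence, estimated cost if it occurs)
        "containment_failure": (0.001, 50_000_000),
        "worker_exposure": (0.02, 250_000),
        "supply_disruption": (0.10, 1_000_000),
    }
    projected_benefit = 12_000_000  # assumed annual benefit of deployment

    # Expected cost is the sum of probability-weighted losses
    expected_cost = sum(p * cost for p, cost in hazards.values())
    net_benefit = projected_benefit - expected_cost

    print(f"Expected cost: ${expected_cost:,.0f}")
    print(f"Net benefit:   ${net_benefit:,.0f}")
    # The single output number says nothing about who bears the losses,
    # whether the responsible institutions can be trusted, whether liability
    # is in place, or whether those exposed to the risk ever consented to it.

However sophisticated the model behind it, the output is a figure of this kind – and it is precisely the questions noted in the final comment that such a figure cannot answer.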

As decades of scholarship have noted, the domination of risk analysis as the primary framework for assessing technologies overlooks ‘more fundamental questions around ownership, control and the social ends to which the technology would be directed … ’ (Wilsdon and Willis 2004, 22). A utilitarian ethic does not speak to the finer details of actual implementation: What types of things might go wrong? Who will be responsible if they do? Can we trust them? Would we even consent to these risks and possible consequences in the first place?

Despite such critiques, the dominant method of governing technology remains based on processes that abstract away the complexity of human and social factors. It frames political decision-making in terms of risk assessment and cost–benefit analyses. Thus, the ineffectiveness of many democratizing engagement efforts cannot merely be blamed on poor implementation. Porter provides insight into the deeper problem when he notes that ‘There is clearly an element of technocracy, or administration by experts, in the growing influence of quantitative methods’ (2003, 106). The thread running through this method of governance is a technocratic ideology.

Technocracy has a rich history, one that can be traced from Plato’s philosopher-kings through Francis Bacon’s conception of a scientific utopia to Thorstein Veblen’s more familiar vision of an engineered society (Gunnell 1982). In short, technocracy is an ideology founded on the authority of techno-scientific expertise over other ways of knowing and doing, on appeals to optimization and objectivity as primary values, and on the conviction that any and all problems in the world can be solved through engineering (Sadowski and Selinger 2014).

In both Science and Technology Studies (STS) and Responsible Innovation (RI), the ready response to the problems of the risk paradigm and technocratic ideology is to call for more citizen participation in governing processes (cf. Jasanoff 2003; Ketzer et al. 2019; Sadowski 2015). Much effort and funding, therefore, have been put towards developing public engagement methods that incorporate public views on science and innovation into technology development and policy making – leading some to call this the ‘age of engagement’ (Delgado, Kjølberg, and Wickson 2011). Nevertheless, engagement exercises have continued to face tough criticisms about their practical efficacy and normative assumptions (Nielsen and Boenink 2019; Selin et al. 2017).

The problem is not with the commitment to public engagement, per se. To be sure, it remains an important element of robust democratic politics in a technological society. And while RI has attempted to answer the STS call for novel, innovative methods that move beyond the so-called ‘deficit model’ (Davies et al. 2012), that model continues to present ‘a formidable barrier’ (Hartley et al. 2019).

The issues arising from a technocratic method of governance cannot be addressed solely through procedural fixes (i.e. injecting more public participation and layperson perspectives into scientific or policy processes). This approach is, ironically, a technocratic one that views problems and solutions in terms of applying the right method. Rather, there is also a pressing need for research and practice directed not at the citizen side, but at understanding and confronting the politics and power of contemporary technocrats (Campolo and Crawford 2020; Sadowski and Bendor 2019). That means asking questions like: why do technocrats neglect deeper social concerns – trust, liability, and consent, among others – in favor of, say, data-driven analytics and algorithmic decisions? While this type of ‘collaborative integration’ is indeed an explicit element of RI engaged scholarship (Fisher et al. 2015), that scholarship makes little if any reference to technocracy as a feature of routine technical practice. Understanding this ideology is key to ensuring that science and technology bolster democratic principles instead of resisting them. This, then, should be a major focus for descriptive, critical, and prescriptive studies of responsible innovation and technology in society.

Oftentimes, engagement exercises assume that they are acting as a bridge between a curious or concerned public and open, accepting experts. But this view ignores or misunderstands the political schism that separates the democratic ideal from technocratic practice: there is no need for outside opinions when scientifically sanctioned methods provide the solutions. In such cases, as Centeno states, ‘the technocratic model of objective necessity replaces the decisionistic model of politics, which leads to the “scientification of politics” and inevitably produces an authoritarian political framework’ (Centeno 1993, 312). The power of experts to decide how, in this instance, to govern technologies does not require – and actively resists – democratic participation. There are starkly different cultures and perspectives, and thus fragmentation and conflict, about central concepts like risk and knowledge (Douglas and Wildavsky 1982). Put simply, the questions of who governs technology and how they do it are answered through political contest.

Lest we think that talking about the power of technocracy is anachronistic in an age of populism, we can easily point to an endless stream of examples from the entrepreneurs, executives, and engineers who populate Silicon Valley (Sadowski and Selinger 2014) – one of the world’s centers of power and wealth. Yet the ideology is also dominant in other, more august sources of advice and authority. For example, the influential Grand Challenges for Engineering report by the US National Academy of Engineering, which shapes how government, industry, and academia understand how to solve problems in the world, ‘adopts a technocratic view, sometimes explicitly and at other times more subtly. Early in the introduction … the document sings the praises of technology and strongly implies that social progress has followed suit’ (Herkert and Banks 2012, 112). These technocratic attitudes have long been part of formal, legislative mechanisms of governing technology. The well-known Congressional agency, the Office of Technology Assessment (OTA) – established in 1972 and shuttered in 1995 – did a lot of good, useful work and set the stage for future efforts around the world. However, its practices were exclusionary (Sadowski 2015); ‘more diverse and plural forms of public knowledge were [marginalized]’ (Wilsdon and Willis 2004, 22).

There’s no question that technical experts play an important role in political decision-making – figuring out where and how they should supplement deliberative and participatory processes is a necessary project (Durant 2011). But first, understanding the multiple facets, causes, and consequences of this overlooked, if not forgotten, technocratic ideology is crucial to overcoming its impediments to democratic principles.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Notes on contributor

Jathan Sadowski is a research fellow in the Emerging Technologies Research Lab, Faculty of Information Technology at Monash University. His work focuses on the political economy and digital geographies of smart technologies. He is the author of Too Smart: How Digital Capitalism is Extracting Data, Controlling Our Lives, and Taking Over the World (The MIT Press, 2020).

References

  • Campolo, A., and K. Crawford. 2020. “Enchanted Determinism: Power Without Responsibility in Artificial Intelligence.” Engaging Science, Technology, and Society 6: 1–19. doi: 10.17351/ests2020.277
  • Centeno, M. A. 1993. “The New Leviathan: The Dynamics and Limits of Technocracy.” Theory and Society 22 (3): 307–335. doi: 10.1007/BF00993531
  • Davies, S. R., C. Selin, G. Gano, and A. G. Pereira. 2012. “Citizen Engagement and Urban Change: Three Case Studies of Material Deliberation.” Cities 29 (6): 351–357. doi: 10.1016/j.cities.2011.11.012
  • Delgado, A., K. L. Kjølberg, and F. Wickson. 2011. “Public Engagement Coming of Age: From Theory to Practice in STS Encounters with Nanotechnology.” Public Understanding of Science 20 (6): 826–845. doi: 10.1177/0963662510363054
  • Douglas, M., and A. B. Wildavsky. 1982. Risk and Culture: An Essay on the Selection of Technical and Environmental Dangers. Berkeley: University of California Press.
  • Durant, D. 2011. “Models of Democracy in Social Studies of Science.” Social Studies of Science 41 (5): 691–714. doi: 10.1177/0306312711414759
  • Fisher, E., M. O'Rourke, R. Evans, E. B. Kennedy, M. E. Gorman, and T. P. Seager. 2015. “Mapping the Integrative Field: Taking Stock of Socio-Technical Collaborations.” Journal of Responsible Innovation 2 (1): 39–61. doi: 10.1080/23299460.2014.1001671
  • Gunnell, J. G. 1982. “The Technocratic Image and the Theory of Technocracy.” Technology and Culture 23 (3): 392–416. doi: 10.2307/3104485
  • Hartley, S., C. McLeod, M. Clifford, S. Jewitt, and C. Ray. 2019. “A Retrospective Analysis of Responsible Innovation for Low-Technology Innovation in the Global South.” Journal of Responsible Innovation 6 (2): 143–162. doi: 10.1080/23299460.2019.1575682
  • Herkert, J. R., and D. A. Banks. 2012. “I Have Seen the Future! Ethics, Progress, and the Grand Challenges for Engineering.” International Journal of Engineering, Social Justice, and Peace 1 (2): 109–122. doi: 10.24908/ijesjp.v1i2.4306
  • Jasanoff, S. 2003. “Technologies of Humility: Citizen Participation in Governing Science.” Minerva 41: 223–244. doi: 10.1023/A:1025557512320
  • Ketzer, D., N. Weinberger, C. Rösch, and S. B. Seitz. 2019. “Land Use Conflicts between Biomass and Power Production – Citizens’ Participation in the Technology Development of Agrophotovoltaics.” Journal of Responsible Innovation: 193–216.
  • Nielsen, K. D., and M. Boenink. 2019. “Subtle Voices, Distant Futures: A Critical Look at Conditions for Patient Involvement in Alzheimer’s Biomarker Research and Beyond.” Journal of Responsible Innovation 7 (2): 170–192. doi: 10.1080/23299460.2019.1676687
  • Porter, T. M. 2003. “The Management of Society by Numbers.” In Companion to Science in the Twentieth Century, edited by John Krige and Dominique Pestre, 97–110. London: Routledge.
  • Rayner, S. 2010. “Trust and the Transformation of Energy Systems.” Energy Policy 38 (6): 2617–2623. doi: 10.1016/j.enpol.2009.05.035
  • Rayner, S., and R. Cantor. 1987. “How Fair is Safe Enough? The Cultural Approach to Technology Choice.” Risk Analysis 7 (1): 3–9. doi: 10.1111/j.1539-6924.1987.tb00963.x
  • Sadowski, J. 2015. “Office of Technology Assessment: History, Implementation, and Participatory Critique.” Technology in Society 42: 9–20. doi: 10.1016/j.techsoc.2015.01.002
  • Sadowski, J., and R. Bendor. 2019. “Selling Smartness: Corporate Narratives and the Smart City as a Sociotechnical Imaginary.” Science, Technology, & Human Values 44 (3): 540–563. doi: 10.1177/0162243918806061
  • Sadowski, J., and E. Selinger. 2014. “Creating a Taxonomic Tool for Technocracy and Applying it to Silicon Valley.” Technology in Society 38: 161–168. doi: 10.1016/j.techsoc.2014.05.001
  • Selin, C., K. C. Rawlings, K. de Ridder-Vignone, J. Sadowski, C. A. Allende, G. Gano, S. Davies, and D. H. Guston. 2017. “Experiments in Engagement: Designing Public Engagement with Science and Technology for Capacity Building.” Public Understanding of Science 26 (6): 634–649. doi: 10.1177/0963662515620970
  • Valdez, R. X., J. Kuzma, C. L. Cummings, and M. Nils Peterson. 2019. “Anticipating Risks, Governance Needs, and Public Perceptions of De-Extinction.” Journal of Responsible Innovation 6 (2): 211–231. doi: 10.1080/23299460.2019.1591145
  • Wilsdon, J., and R. Willis. 2004. See-Through Science: Why Public Engagement Needs to Move Upstream. London: Demos.
