
A critical, analytical framework for the digital machine


ABSTRACT

The Faculty of Digital and Computational Studies (DCS) at Bowdoin College proposes a critical, analytical framework, referred to as the ‘4As,’ as an interdisciplinary means to interpret, evaluate, and create the data, operations, and devices of computing across all domains of knowledge production. Following other disciplines that have developed in symbiotic relationships to one another, DCS puts computation in conversation with fields from across the arts, humanities, physical, and social sciences. Our foundational premise is the bidirectional influence between these disciplines and digital artifacts and computation. The 4As (artifact, architecture, abstraction, and agency) benefit from both the scepticism of the liberal arts in the face of ubiquitous digital processes and the analytical opening for examining questions pertaining to creative and imaginative alternatives to the digital and computational status quo. We provide an ultra-contemporary case study to demonstrate the framework in use.

Introduction

Academic epistemological fields have to evolve dynamically in order to cope with the ways that new forms of understanding and criticism shape human experiences. For example, in 1609, Galileo looked through a telescope and redefined the Earth’s place in the universe. One could still study the stars without a telescope, but to do so would be to willfully handicap what one could accomplish. The interesting story is not about the telescope itself but about the data that it made available and the models that shifted subsequently. As another example, architecture requires the technologies fostered by advances in civil engineering, but each field is separate, with its own aesthetics, philosophies, and intellectual foundations. A final example: the internet was born in 1969 when a computer at UCLA was connected to another at the Stanford Research Institute. This was followed in short order by the era of personal computing. The technologies involved – networks and computers – are interesting in and of themselves and are well studied within the field of computer science, but more importantly, like the telescope, they are contingent upon and enabling of the cultural imaginaries within which they are embedded. Just as civil engineering enables the creation of new types of buildings, so too do networked computers allow for the creation of new types of digital artifacts.

This article describes an intellectual framework, which we are calling ‘4As,’ that has arisen from networked computers and, similar to architecture’s relationship to civil engineering, has its own field of inquiry separate from computer science. Like architecture, our new programme, Digital and Computational Studies (DCS), is both about creating new artifacts and about critically examining the artifacts that have come before – in this case, the artifacts are digital. The intellectual foundations of our framework have been shaped in collaboration with faculty from across the curriculum at our home institution, Bowdoin College, an American residential liberal arts college.

Across academia, several things are happening simultaneously: (1) computer science departments are growing rapidly, (2) Digital Humanities (DH) programmes are gaining prominence, (3) as are Computational Social Sciences (CSS) programmes, and (4) computational X subfields are growing in all of the natural sciences, with a recent proliferation of centres and initiatives centred on computation. In short, colleges and universities, historically slow-moving institutions, are scrambling to keep up with the increasing importance of computation. In many places, this is being done in an ad hoc fashion, or within the scope of Science and Technology Studies. We offer an alternative: a framework that is integrative and privileges neither computation nor any other discipline. We are neither aiming to bring computer science to the rest of the liberal arts nor aiming to bring the liberal arts to computer science. This framework arises from a liberal arts setting but is not an argument that the liberal arts are a panacea for complex problems, any more than computer science or other academic structures may be. Our goal is an invitation: to bring together disciplinary priorities so that individual fields can enrich each other, rather than hoping that their mere adjacency in an academic setting can prompt critical and creative thinking about digital and computational technologies. We provide a common framework and language so that conversations between fields can be more productive.

This new framework comes partly from the recognition that in computer science, digital artifacts are ends, and computational methods are the means to those ends, but within academia writ large, such artifacts can also be means and objects of study for new forms of critical engagement. We argue that while gaining a basic understanding of computational concepts is unequivocally a good, and increasingly necessary, addition to any education, many of the skills and concepts taught even in an introductory computer science course are simply not going to be useful for every student. Previously, students had very few options for learning computational thinking or acquiring basic programming skills, and thus have been flocking to computer science to gain them. However, there are many students who are not ready to unravel the mysteries of recursion but who would still benefit from exploring the impact of conditional statements on user experiences and opportunities.
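The impact of a conditional statement on opportunity is easy to make concrete. The following minimal Python sketch is illustrative only: the eligibility rule, threshold, and function names are invented for this example, not drawn from any real system. It shows how a single `if` encodes a policy decision that shapes what a user is ever allowed to see.

```python
# A hypothetical loan pre-screening rule: one conditional decides which
# users ever see the "apply" button. The threshold is a modelling
# choice, not a fact of the world.
CREDIT_SCORE_CUTOFF = 650  # illustrative value, not a real lending rule

def can_apply(credit_score: int) -> bool:
    # The branch taken here determines the user's experience:
    # one path opens an opportunity, the other silently closes it.
    if credit_score >= CREDIT_SCORE_CUTOFF:
        return True
    return False

print(can_apply(700))  # True: this user sees the application
print(can_apply(600))  # False: this user never sees it
```

A student need not master recursion to read this code critically: the questions of who chose 650, on what data, and with what consequences are exactly the kind of inquiry the framework invites.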

We thus propose the 4As as an interdisciplinary means to interpret, evaluate, and create the data and operations of computing used across domains of knowledge production. The foundational premise of the 4As framework is twofold: algorithmic processes and the tools for quantification and visualization affect not only the world itself (the objects of study) but also how scholars and students must use and critique computation in order to make sense of this digital world. The so-called wicked problems that we face are not easily reducible to physical, behavioural, or expressive sources but rather are the product of their interaction, potentially at a global scale.

To accomplish this bidirectional movement, DCS organizes its epistemological sphere around four aspects of our interaction with the technological world that emerged from initial coursework and research: artifacts, architecture, abstractions, and agency. By artifact, we mean anything that contains digital or computational components, with emphasis on the historical, cultural, and economic contexts of its creation and use. In turn, these artifacts communicate among themselves and with society through architectures that are combinations of digital, material, environmental, geopolitical, and commercial sites of labour. All artifacts and architectures are based on abstractions of certain functionalities or information. In order to understand the implications of abstraction, the scholar, artist, or ethical entrepreneur must identify the real-world phenomena modelled by the artifact and how these phenomena are impacted by the modelling choices and decisions behind a given technology. Finally, agency recognizes that artifacts and architectures impact users and entire societies, changing their capacities to act in and impact the world.

In this sense, the 4As are in dialogue with other frameworks that investigate the areas of intersection between the digital and non-digital. The liberal arts ethos manifests itself in the complementarity of each A, inviting complementary perspectives and the use of parallel frameworks. Thus, the analysis of artifact dialogues with, and benefits from, the critical, historical, and comparative approach of Media Archaeology (Huhtamo and Parikka Citation2011). The analysis of agency is in conversation with concepts from Actor–Network Theory (Latour Citation2005) but places them explicitly in relation to the descriptive and interpretive dimensions of the other three As. The scope of application of the 4As framework is also intended to be broader than the humanities, thus transcending the horizon of frameworks explicitly aimed at the digital humanities to encompass all liberal arts disciplines.

What separates the current moment from the examples of Galileo and architecture is that the changes wrought by computer and network technology cut across every discipline. Loose confederations, such as the Digital Humanities and the Computational Social Sciences, have arisen, but thus far there has been no organizing framework to draw all of the parts together. The 4As, by contrast, do not arise from the perspective of an individual discipline but rather draw on the strengths of fields spanning the academy. In turn, the framework unites these disciplines in experimenting with, and evaluating the impact of, digital and computational artifacts with respect to behaviour, expression, knowledge, and the physical environment.

We begin with the genesis of our approach, set at an American liberal arts college. We recount how institutional resistance, centred on a suspicion of technology, was instrumental to the importance of synthesis in our epistemology. Next, we turn to a description of the 4As framework. Finally, we give an extended example of the framework in action, exploring contact tracing in the context of Covid-19.

Why is this happening in a liberal arts setting?

According to the American Association of Colleges and Universities, a liberal education ‘is an approach to undergraduate education that promotes the integration of learning across the curriculum and co-curriculum, and between academic and experiential learning, in order to develop specific learning outcomes that are essential for work, citizenship, and life’ (“Advocacy for Liberal Education” Citation2020). Our own institution, Bowdoin College, frames it this way: ‘Graduates should leave Bowdoin with the ability to engage competing views critically, to make principled judgments that inform their practice, and to work effectively with others as informed citizens, committed to constructing a just and sustainable world’ (“The Mission of the College,” Citationn.d.). Inherent in each of these formulations is the idea that a liberal education should make one an informed and ethical citizen of the world. Over the centuries, the liberal arts have therefore necessarily changed as the world has changed, reflecting the new knowledge that has become ‘essential.’

It is not surprising that the bar for declaring an area of knowledge ‘essential’ has been relatively high in the liberal arts. Few liberal arts colleges, for example, have any sort of engineering programme: a graduate, the reasoning goes, does not need to understand engineering to be an effective and informed citizen. The apparent exception to this resistance, computer science, managed to gain entrance to liberal arts colleges mainly through departments of mathematics. Nevertheless, computer science has long been viewed with suspicion in liberal arts circles, and some liberal arts schools still do not have computer science programmes. Bowdoin’s response to essential digital and computational knowledge was shaped in collaboration with computer science, while many other similar schools explored Digital Liberal Arts more closely aligned with Digital Humanities (Alexander and Frost Davis Citation2012; Mauro Citation2016).

For a long time, understanding technology was not ‘essential.’ It is increasingly clear, however, that in the era of personal computers, smartphones, and the internet, not having a basic understanding of technology means not understanding the world around you. Choosing to download an app on a smartphone, for example, is not simply about the basic functionality of the app; it also has implications for privacy and personal security. Later in this article, we will use contact tracing to examine this idea in detail. Further, algorithms and artificial intelligence systems are increasingly making decisions that impact day-to-day life.

Concurrently, within academia, a significant driver of change has been the tools developed by computer scientists. Text analysis, network analysis, mapping, and other tools have revolutionized how academics are able to work with data. In response, nascent fields have sprung up, e.g. the Digital Humanities and Computational Social Sciences. Many scholars within these fields have faced the same resistance that computer scientists faced in gaining entry to the liberal arts and sometimes even more, as other scholars were fearful that their fields would suffer the same fate as businesses disrupted by the digital revolution or that neoliberal priorities would displace the values of liberal education. It is increasingly the case that to eschew these tools is to wilfully handicap oneself as a scholar, ignoring new sources of data and analysis. But uncritical adoption carries other risks.

When DCS began in 2011, it was clear that the liberal arts could either be disrupted, perhaps by Massive Open Online Courses (MOOCs) or online courses, or could help lead a different kind of disruption by taking advantage of its connected structure. DCS started with a process. The process involved getting scholars together from across the campus to talk about how Bowdoin might help lead the transition into new kinds of scholarship and to do it in a way that was true to the liberal arts. DCS started with the knowledge that one of the greatest obstacles to change in academia is acceptance by the faculty at large, and so central to the process was getting buy-in by a faculty presumed to be sceptical.

A crucial turning point in the development of our programme was the recognition that faculty scepticism was not an obstacle to overcome but a strength. Scepticism of computer science in the liberal arts, scepticism of new forms of scholarship, and scepticism of change generally were foundational to the programme that emerged because they ensured that DCS was not founded on a belief that computational techniques were going to save the liberal arts. Faculty were asking a simple question – ‘why?’ – and we realized that if we could not answer that question, then we could not justify our programme. And, in finding an answer to that question, we hit upon the unifying framework of all of our work. On the one hand, digital tools offered new opportunities and possibilities; on the other, any tool has strengths and weaknesses and is not appropriate for every situation. Critically, not understanding the limitations of these tools is tantamount to being held hostage by them. Running a text analysis programme such as Voyant (Sinclair and Rockwell Citation2016) or using a particular technique such as topic modelling (Blei, Carin, and Dunson Citation2010) on a corpus does not solve research questions; it simply adds new data to consider. Further, a careful scholar will also want to know whether better results are possible rather than blindly accepting output in good faith. Such a scholar has two choices: they can learn enough about the tool to interrogate it critically, or they can ask an expert. The second of these choices is problematic, as it requires not only an expert but often an expert who is conversant with the scholarly domain in question. Embedded in all of this is the idea of a two-way relationship. Digital tools can help us interrogate data and explain things in new ways, but any particular scholarly challenge can also reveal the limitations of the tools and suggest new and better approaches.
The heart of this approach is the recognition that every discipline is inherently trying to teach critical thinking and that each discipline essentially works with its own data in a hermeneutic spiral back and forth between models and data. This spiral can be traversed using digital tools, traditional scholarly methods, or a combination of both. A kind of meta-spiral is also possible for the tools themselves. For example, a scholar looking at large texts might grow frustrated at the limitations of the available digital tools, and that frustration is more likely to be productive if they have a working understanding of how such tools work and can offer practical suggestions to developers. It was at this point that we recognized not only that scholars using digital tools should be conversant with basic elements of computational thinking but also that it was essential to apply analytical methods to the digital tools themselves.

The two-way flow of ideas between computational scholars on the one side and more traditional scholars on the other has numerous other benefits. For one, the 4As framework becomes a kind of connective tissue between scholars in different disciplines. This plants the seeds of a kind of interdisciplinarity that campuses often talk about but rarely achieve. Topics that are too large for a single discipline, such as the contact tracing example we discuss in this article, can be discussed, researched, and/or taught together more effectively when taken on by groups with different disciplinary backgrounds but shared experiences and a common language for discussion. Indeed, digital tools are most useful precisely for problems at large scale – finding one article among thousands, analysing thousands of books simultaneously, analysing a network with tens of thousands of nodes, and so on.

Can a framework developed at a liberal arts institution translate to other institutions, particularly those where programmes are more ‘siloed,’ with a greater emphasis on depth in a field than the breadth prized in liberal arts schools? We argue that the answer is ‘yes’ and, further, that breaking through such silos is increasingly necessary. To the first point, there is already a precedent for this kind of movement. In 1986, the Liberal Arts Computer Science group published a model curriculum for the liberal arts that was in turn highly influential on curriculum development in American research universities. To the second point, there are many problems facing the world today, many of them enabled or exacerbated by the digital environment, that are simply too large for a single disciplinary approach. We touch on one such example, contact tracing, later in this article. Tackling such problems will necessarily involve experts from multiple fields, each of which will have its own terminology and approaches. The framework described in the next section can help with that problem, providing a common language and a methodology rich enough to accommodate different fields in a way that lets scholars readily find commonalities with their own experiences. Our goal is not to champion the liberal arts as a solution to these problems but rather to synthesize some of the unique lessons gained from our work in that environment.

Digital and computational studies analytical framework

Digital artifacts are the central phenomenon explored in the two directions that guide DCS. By artifacts, we mean anything that contains digital or computational components; for instance, a personal computer, the Facebook application, a car with an onboard computer, a smartphone, the Firefox browser, a smartwatch, a robot, or the Google search application. Artifacts can themselves be digital content, but they can also create or transform other digital content; the Facebook application, for instance, creates and modifies posts. Because the specific identities of artifacts change over time, this ‘A’ engenders a set of attributes based on an artifact’s embeddedness in our world: the problems it tries to solve; the values it fosters or inhibits; and the historical, social, and economic contexts of its creation and use. We have benefitted from past and ongoing scholarship in this area to shape our categories of concern (D’Ignazio and Klein Citation2020; McPherson Citation2012; Noble Citation2018; Tufekci Citation2017). Artifacts interact among themselves and with society through architectures, from which DCS proposes a critical analysis of their assumptions, premises, intentionalities, stakeholders, and consequences. Architectures, such as the internet, can create possibilities, in part by structuring and orchestrating how artifacts work in conjunction with each other. But these same architectures also impose limits and generate tangible implications for technological societies. For example, Duarte (Citation2017) and Christie (Citation2005) have demonstrated how architectures developed with indigenous priorities and collaboration can push back against these limitations.

The artifact-architecture relationship is intentionally flexible so that many objects of study in DCS can be understood as artifacts and as architectures. This flexibility allows the interpretive focus to change depending on the artifact chosen as the centre of the analysis, without worrying about technical details of implementations that interfere with the heuristic potential of the framework. Thus, in one analysis, a mobile phone can be considered as an artifact that uses cellular communication networks as architecture or in another as an architecture that mediates communication between applications considered as artifacts.

This allows analysts to define the level of abstraction that is best suited to investigating certain aspects of the technology’s impact on the world. Wing asserted that the essence of computational thinking is abstraction (Citation2008, 3717), and in his article ‘Is Abstraction the Key to Computing?’ (Citation2007, 38–39), Kramer explores the hypothesis that the ability to make abstractions is fundamental to computation. He approaches abstraction as the result of a process of leaving some properties of objects out of consideration while focusing on others. For Kramer, and for DCS, abstraction retains much of its etymological connotation: to abstract (ἀφαιρέω) is the act of ‘taking away’ from, and the resulting abstraction (ἀφαίρεμα) is ‘that which is taken away as the choice part’ (Liddell and Scott Citation1940).

Since all digital and computational artifacts and architectures are based on abstractions of certain functionalities or data, in order to understand the implications of abstraction, the scholar (or the students in the class) must discuss the real-world phenomena modelled and how these phenomena are in turn impacted by the modelling choices and decisions. The model is either, as McCarty suggests, a representation for purposes of study or a design for realizing something new (Citation2004). Such models make their way into digital and computational artifacts as algorithms, data structures, communication protocols, and user interfaces. The richness and complexity of analysing digital abstractions go hand in hand. Each digital artifact contains multiple interrelated abstractions that are organized hierarchically, so that lower-level abstractions are implemented in models that become building blocks for the implementation of higher-level abstractions. Thus, an abstraction of the temporal order of waiting to enter a concert hall is implemented, using digital mechanisms for accessing computer memory positions, through the model of a data structure called a ‘queue.’ This abstraction, once available as a model, can become part of the implementation of higher-level abstractions, such as fulfilling requests for a pizza delivery application. And, on the other hand, the very concept of access to a memory location already indicates a new series of lower-level abstractions and models that go all the way down to the presence or absence of electrical impulses. It is important to note that models impose restrictions on abstractions. What can be represented or designed by models filters out abstractions and limits the way they are expressed in artifacts and architectures.
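The queue example above can be sketched directly in code. The following minimal Python sketch (names and the scenarios are illustrative, not from any particular system) shows the same abstraction, a first-in-first-out queue, serving as a model for both the concert-hall line and the pizza-delivery orders, while the lower-level abstraction of memory access stays hidden inside the `deque` implementation.

```python
from collections import deque

# A queue abstracts the "temporal order of arrival" (first in, first
# out), hiding the lower-level abstraction of how memory positions
# are actually accessed.
concert_line = deque()
concert_line.append("Ada")    # Ada arrives first
concert_line.append("Grace")  # Grace arrives second
first_admitted = concert_line.popleft()  # Ada, who arrived first

# The same abstraction is reused, unchanged, as a building block for
# a higher-level abstraction: fulfilling pizza orders in order.
orders = deque()
orders.append({"customer": "Ada", "pizza": "margherita"})
orders.append({"customer": "Grace", "pizza": "quattro stagioni"})
next_order = orders.popleft()

print(first_admitted)          # Ada
print(next_order["customer"])  # Ada
```

Note what the abstraction filters out: a queue model has no place for someone holding a friend’s spot in line or for a delivery prioritized by distance; those characteristics were deemed non-essential by the modelling choice.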

Given a particular object or process, the possible abstractions associated with it are neither unique nor neutral. They are driven by intentionality; they have a purpose. There is a risk of simplifying the process of abstraction and its result as merely the recognition of what is essential. The problem is that the ‘essential’ depends on the premises of whoever analyses it, on the context in which the abstraction is applied, and on the intentions connected to it. In the same way that a ‘like’ button is an abstraction for the process of appreciating a post, it is conceivable that a ‘dislike’ button shares the same level of ‘essentiality.’ However, the context and objectives of Facebook guided the abstraction choices. We are living through the consequences of building social media on the premise of homophily, liking people who are like you, which Chun has traced to segregationist roots (Citation2018, 59–97). The consequences and responsibilities of the interpretative task of abstraction always haunt the pseudo-objective, neutral technical design of digital and computational artifacts (Gitelman and Jackson Citation2013, 3). Instead of a Platonic, objective collection of essential attributes that can be mined from things and processes, in the bidirectional DCS movement we propose an investigation of abstractions both as tools to understand processes and entities in the world and as objects of critique of their intentionalities, priorities, values, and perspectives.

Understanding these abstractions also requires an exploration of the data used by artifacts: how this data is collected, what it represents, and how it affects the outcome of the artifacts or architectures that use and transform it. Drucker’s concept of capta (Citation2011), an alternative to data that is presumed to be given, suggests that there is an active and therefore interpretative process in the ways abstractions are created and implemented in artifacts and architectures. Considering data abstractions this way is a recognition of what Downey eloquently suggested: that data is always already ‘cooked’ and never entirely ‘raw’ (Citation2007).

Analysing abstractions allows us to consider not only what is evident through observed functionality but also the attributes not captured by the abstraction. Looking back at the etymology of the word, abstraction names both what we take away in terms of materiality and the individual, particular characteristics we move away from as we attempt to create reusable models by capturing essential attributes through generalization. Analysing abstractions inverts the path of detachment and generalization intrinsic to abstraction, enabling an examination of aspects that were deemed non-essential and were not generalized. It is a journey from the general back to the particular – to the individuals, relationships, and social processes that may be misrepresented by abstractions.

The fourth aspect, agency, recognizes the need to explore how artifacts and architectures impact users and entire societies, changing their capacities to act in and impact the world. Understanding agency requires asking a number of questions, including: who uses the artifact and for what purposes? Who is unable to use the artifact and why? What is the relationship between the artifacts and the physical, emotional, social, civic, and economic aspects of the people who interact with them? What can people do with the artifact, or with the product of its use, based on its architecture? How are the impacts (in)consistent across populations? Amrute recently captured these principles in her keynote address at EPIC2019, ‘Tech Colonialism Today,’ by emphasizing that investigations of agency are not just about identifying limitations or victims but about exploring counterconduct and the ability to imagine or create alternatives to colonialist realities (Citation2019). Notably, it is the lack of attention to agency that has led to so much criticism of Silicon Valley.

In On Technical Mediation (Citation1994), Latour explores four types of mediation of techniques – translation, composition, reversible black-boxing, and delegation – that provide heuristics, though not necessarily methodologies, for considering agency in the 4As. The second and third senses explored by Latour are particularly appealing for an analysis of digital artifacts. Composition suggests that action is a property of the association of what Latour calls ‘actants,’ a concept borrowed from semiotics. Each actant contributes to the programme of action in different ways, with its own goals and functions. So when asked, ‘who cooked this cake?’ the answer is not just the cook but a network of actants that involves the cook, the kitchen, the oven, the electricity, and so on. Decomposing and analysing this network of actants is fundamental to the critique of digital technologies. Latour’s third type of mediation is reversible black-boxing. In Latour’s example, a presentation is a joint action between the presenter and the overhead projector; within this composition, however, the projector becomes opaque, a black box, as does all the software involved in the projection process. The components and intentions within the projector become accessible only through an intentional analytical effort, which, without any particular ties to actor–network theory or methodology, is precisely what the 4As framework proposes in the agency dimension.

In a liberal arts context, the primary agency of concern is making informed decisions in service of a just and sustainable society. Kaplan (Citation2003) emphasizes the relationship between technologies and the production of meaning in a way that aligns with this final A of the framework. Technologies are not only artifacts that we use; they are intrinsically integrated into the foundational layers of fundamental issues such as identity, interpersonal relationships, and social and political structures. Therefore, they demand a reflection that goes beyond a purely technical description and advances to more in-depth questions concerning the meanings being created and modified by new technologies. In particular, there are critical questions as to how such meanings affect agency in a world transformed by these technologies. Kaplan captures this analytical dimension by proposing a series of ‘who-questions’: who made it, who uses it, who paid for it, who benefits from it, and who suffers from it. The systematic and methodological exploration of such questions again points to the necessary interdisciplinarity of DCS and its identity as a new offshoot on the ever-evolving tree of the liberal arts, as they touch on philosophical, sociological, psychological, economic, political, environmental aspects, and so on.

The DCS 4As serve as an analytical model used to critically examine how technologies are impacting personal, social, and environmental contexts. The goal of the model is to provide guidance towards a deeper understanding of how technologies affect our lives by exploring multiple angles in a methodological yet flexible and creative way. By focusing on contexts, multiple disciplines’ priorities, and processes, we have designed a preliminary interpretative structure that is responsive to the rapidly moving target of digital and computational development. In turn, this high-level analytical framework is capacious enough to operate in the ongoing conditions of rapidly accelerating change.

The 4As at work – contact tracing

Contact tracing as artifact

Covid-19 has revealed both the great promise of digital technologies as well as their current limitations and problems and thus is ripe for an examination using the 4As framework, especially since the issues involved span multiple disciplines. Our goal is neither to propose a one-size-fits-all solution to this complex problem nor to offer a comprehensive study of all forms of contact tracing. Rather, we want to demonstrate how the 4As operate as a critical, analytical framework, even for a phenomenon that is emergent rather than developed. We begin with artifact questions that examine the problem(s) being solved, value systems at play, and contexts of creation and use.

The proposed technological solutions for identifying individuals who might have been in contact with someone who tests positive for COVID-19 must address the problem of both tracing the spread of the virus and notifying individuals who are at risk of being infected. Superficially these would seem to be straightforward questions, easily solvable by technologists. Yet, we will see that choices around abstraction in some apps disproportionately reflect the historical, social, and economic contexts of this technology’s creation and use. The populations being asked to use (or be subjected to) these apps are in the midst of reflections on and reactions to a confluence of several political, intellectual, and cultural debates. This has become a moment of global protest against systemic injustices and racism, accelerated by the high-visibility murders of several Black individuals at the hands of white American police officers, necessitating a focus on privacy to protect those who are politically, economically, geographically, and culturally vulnerable. Wealth and basic-need disparities have been exacerbated by the ubiquitous economic downturn that has accompanied the pandemic (e.g. fissures in internet-access equity were exposed as schools moved instruction online, as was the lack of clean water for hand washing), such that digital literacy, device ownership, connectivity, charging, and the possibility of theft are potential obstacles to widespread use. This global public health emergency is occurring at a time when the expertise of scientists and other intellectuals is increasingly undermined by platforms designed to capitalize on (if not provoke) post-truth behaviours, requiring solutions that shift more agency into the hands of the individual. Thus an app that requires a smartphone, data logging, notifications, and trust in corporations and institutions that have contributed to these conditions might not be effective in all contexts. Indeed, emerging research evaluates the success of national and regional technological interventions in terms of their alignment with value systems (Wnuk, Oleksy, and Maison Citation2020), political contexts (Raposo Citation2020; Barbieri and Darnis Citation2020), and the framing of the problem as surveillance rather than tracing (McCall, Skutsch, and Honey-Roses Citation2021).

By synthesizing priorities of different fields in order to shape the underlying questions for analysing a digital artifact, the 4As bring together design concerns typically reflected by compartmentalized specialist research output like these examples. Bringing specialists together around a complex problem, particularly an urgent and emergent one, provides more than opportunities for critique: the specialists become agents of potential change. That connection between the artifact and agency rests in decisions made about the architecture to use and the abstraction to instantiate.

Architecture considerations

DCS architecture questions prompt us to consider the human and technological infrastructures that connect devices. The overall human, cultural environment is one of technological solutionism: in the US, this has meant that the unicorn startup or the multinational monolith will sell (or, hopefully, donate) the artifact that will solve the problem through the design of a universal application. The situation is ripe for promises supported by the origin myths, hype, and rhetoric of magical solutions to complex problems pervasive in certain spheres of North American technocultural production (Elish and Boyd Citation2018). That hype in turn is perpetuated by media coverage that overlooks the significant architectural barriers to app design and use (Alkhatib Citation2020; Gruber Citation2020). Within that structure are other architectures that determine how the artifacts interact with one another. For starters, 4G and 5G technologies are insufficient because of coverage concerns (let alone the conspiracy theories that prevent widespread adoption). Bluetooth is a possible solution: it is present in most smartphones and can be deployed in ways that protect privacy, but its accuracy is limited, necessitating the collection of more data. Security expert Bruce Schneier has bluntly articulated how accuracy issues will provoke false positives and false negatives for contacts, rendering such an app effectively useless (Citation2020). Other proposed solutions require more sensors in the device: accelerometers, gyroscopes, or light detectors (O’Neill Citation2020). These immediately run up against the contextual values described above.
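The accuracy problem Schneier describes can be made concrete with a minimal sketch. Bluetooth-based apps typically infer distance from received signal strength (RSSI); the sketch below uses the textbook log-distance path-loss model with illustrative parameter values (the `tx_power_dbm` calibration constant and the ±10 dBm perturbations are assumptions for demonstration, not measurements from any real app):

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-59, path_loss_exponent=2.0):
    """Estimate distance in metres from signal strength using the
    log-distance path-loss model: rssi = tx_power - 10*n*log10(d)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

def classify_contact(rssi_dbm, threshold_m=2.0):
    """Flag a 'contact' when the estimated distance is under the threshold."""
    return rssi_to_distance(rssi_dbm) <= threshold_m

# Under this model, two phones ~2 m apart read about -65 dBm, but bodies,
# pockets, bags, and walls routinely shift readings by roughly +/- 10 dBm.
print(round(rssi_to_distance(-65), 1))  # 2.0 m: a genuine borderline contact
print(classify_contact(-75))            # False: same pair, phone in a bag
                                        #   (reads as ~6.3 m, a false negative)
print(classify_contact(-55))            # True: a pair 4 m apart with clear
                                        #   line of sight (a false positive)
```

The point of the sketch is that the classification flips on signal perturbations entirely unrelated to epidemiological risk, which is exactly why limited Bluetooth accuracy pressures designers toward collecting more data or adding more sensors.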

By exploring the ways in which the intentionalities of architectures result in different consequences for stakeholder groups, the 4As bring together disciplines to evaluate the status quo and indicate alternatives. Social scientists would propose a more relational and collaborative understanding of this scenario; literary scholars would exploit the implicit dichotomies to create an opportunity for speculation. That is, what if the solution is not having a team of technologists (even in collaboration across monoliths like Apple and Google) adding complexity to a single device but having a team of stakeholders coordinate multiple simple artifacts across several contexts? Even optimistic articles about such apps recognize the broader systems of power, incentives, and technology in which an ambitious tracking programme needs to be enacted (Economist Citation2020). Charlton D. McIlwain, founder of the Center for Critical Race and Digital Studies, reminds us that the Black community has recognized the need for a multi-pronged approach for generations. His description of Anita Brown’s technological revolution against white racial dominance outlines that it ‘needed to be digitized and networked – not monetized – and that it must include citizens, government, educational institutions, and corporations’ (Citation2019, 167). The 4As prompt us to ask: what if the way to achieve the goal is not a single artifact, but an architecture unto itself, and one that blends the technical with human agency in the contexts in which contact is likely to occur?

Abstraction

The answers to many of these analytical and speculative questions hinge on which abstraction is embodied by the artifact. The 4As framework suggests evaluating the differences between an app designed to track users (surveillance) and an app designed to model the transfer of virus droplets (epidemiology). To a certain extent, industry conversations about the use of Bluetooth and encryption are attempting to mitigate the consequences of a surveillance-based model because that framework is already in place: disproportionate power structures, large-scale collection of data with uncertain permanence or secondary use, and loss of context for the sake of universal implementation and decision-making. Moral cautionary tales abound in history. When coupled with the war rhetoric being used by some to frame the challenges of protecting a population from a disease, proposed abstractions risk perpetuating social injustices by making a social problem a people problem, with emphasis on phrasing that echoes the abstractions behind algorithmic ‘solutions’ to crime. As Selbst et al. articulate in the context of fairness in machine learning models, unless the frame for the abstraction includes social elements such as ‘decisions made by humans and human institutions,’ the resulting model risks unintended and unjust outcomes (Citation2019, 3). Add to this the lack of representation and diversity in Silicon Valley, and the opportunities for correcting, or at least foreclosing, certain inequities, let alone creating an inclusive, fair model of the phenomenon, are practically eliminated.

Emphasizing abstraction fosters consideration of the differences in what gets extracted in order to track people versus to track opportunities for viral transmission. The first abstraction is hyper-individualized, focusing on establishing the length of a hypothetical line separating two people, with the varieties of devices and ways of carrying them interfering with the accuracy of data collection. Such a model does not reflect the roles of space and placemaking in interactions or transmission, something captured, yet problematized, by Google’s mobility reports and cellphone tracking data (Google Citation2020; Thompson and Warzel Citation2019). The second abstraction emphasizes the setting. Environmental geographers would remind us that in addition to the people using those spaces, individuals and groups have varying responsibilities associated with commercial, non-commercial, and public areas. Adding a third actor could allow for triangulation, other systems for data collection, and yet other models for determining when to alert someone about possible transmission. Conversations with specialists in other fields could lead to different frames for the abstraction, thus suggesting alternative sensors, data to collect, and mechanisms for alerting device owners of potential contact with the virus.
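The contrast between the two abstractions can be sketched as two data models. The records below are hypothetical and deliberately simplified; every field name is illustrative rather than drawn from any real contact-tracing system. The sketch shows how the choice of abstraction determines what gets collected: the first record centres individuals and their pairwise distances, while the second centres the space itself, letting a venue’s steward contribute risk data without logging any person-to-person graph.

```python
from dataclasses import dataclass

@dataclass
class ProximityEvent:
    """Surveillance-style abstraction: who was near whom, and how close."""
    ephemeral_id_a: str        # rotating identifier for one device
    ephemeral_id_b: str        # rotating identifier for the other device
    estimated_distance_m: float
    duration_s: int

@dataclass
class SettingEvent:
    """Epidemiological abstraction: where transmission could occur."""
    venue_id: str              # identifies the space, not its occupants
    occupancy: int             # how many people shared the space
    ventilation_rating: int    # coarse air-quality bin, e.g. 1 (poor) to 5
    dwell_time_s: int          # typical time spent in the space

# The same afternoon in a cafe, seen through each abstraction:
pairwise = ProximityEvent("anon-7f2", "anon-c41", 1.4, 900)
setting = SettingEvent("cafe-17", occupancy=40,
                       ventilation_rating=2, dwell_time_s=2700)
print(pairwise.estimated_distance_m)  # 1.4
print(setting.occupancy)              # 40
```

Neither schema is offered as the correct one; the point is that each encodes a different frame for the problem, and each implies different sensors, different custodians of the data, and different mechanisms for alerting people.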

In principle, the abstraction indicates the data to be collected, which then informs the model. In practice, the data that can be collected is often a proxy for the abstraction, since it relies on the modelling choices of the technology behind the collection. Co-opting a smartphone as a disease tracking device both transfers the original intentionalities of the phone’s development into a public health context and fixes the technical landscape in which the abstraction can occur. As such, any solution risks being part of what Benjamin has labelled the New Jim Code: ‘the employment of new technologies that reflect and reproduce existing inequities but that are promoted and perceived as more objective or progressive than the discriminatory systems of a previous era’ (Citation2019, 5–6). To break from this cycle, in addition to acknowledging the historical warnings offered by McIlwain, the multi-disciplinary DCS approach advocates for the investigation of cultural, social, and technical models beyond the mainstream or the main problem: examples of large-scale cultural adoption of low-cost accessories, structures of successful community collaboration, and minimal technology applications that are accessible and implementable in diverse environmental and socioeconomic conditions.

The ideal abstraction for a delimited context versus the one that gets implemented can thus vary widely, particularly if pre-existing artifacts and architectures drive the abstraction process rather than vice versa, as demonstrated above. By approaching the question of abstraction from multiple disciplinary perspectives, DCS creates analytical avenues to imagine alternatives to problematic status quos. The chief outcome of such a realignment of the relationship between abstraction, artifact, and architecture can then be the reclaiming of agency.

Agency

If DCS were to declare an ethos, it would likely follow Benjamin’s counter-declaration of the prevailing mantra of Silicon Valley: ‘Move slower and empower people’ (Citation2019, 17). The 4As framework drives questioning of disparities of use and accessibility, as well as of how an artifact’s affordances or limitations impact a user’s ability to act or to be acted upon. In the case of a COVID-19 tracking artifact, the consequences of abstractions on agency are notable. Even free apps run on devices that are not free, and even apps that rely on Bluetooth to detect proximity still require a cellular or other internet connection to a central database in order to receive alerts when a positive test is recorded for someone in a user’s proximity. Inequities of access in all of these areas exacerbate vulnerabilities for populations already struggling disproportionately with being on the front line of exposure, at high risk for job loss, and living in under-resourced areas. Moreover, subsequent models of disease transmission built from this data would exclude the experiences of those populations, further widening the gulf of understanding, access, and equity.

A primary question of agency is what can be done with the artifact. A surveillance model limits opportunities for taking action to: have a device, install an app, and set requisite permissions to enable data collection. Knowledge, and the subsequent power to act on it, rests in the privileged hands of individuals in a space, not the person or organization responsible for that space. What opportunities exist for a user to disrupt the system other than opting out? In addition to reports of false positives that would trigger unnecessary alerts, artist Simon Weckert’s virtual traffic jam in Google Maps, created with a hand-pulled wagon filled with 99 cell phones, is a reminder of vulnerabilities in such systems.

In a moment of global protest against racism, the agency of users wary of being identified by governments or counter-protesters is paramount. Given the problems in data collection, the resulting interpretation of COVID-19 transmission will reflect, at best, transmission in a subset of relatively affluent users who are willing to opt in and self-report a diagnosis. Such concerns have prompted scholars at Oxford and the Turing Institute to propose an ethical framework in which to evaluate such apps (Morley et al. Citation2020). We propose the 4As as a mechanism for thinking through such artifact creation in order to produce outcomes that are more just and sustainable, offer more nuanced understanding of a phenomenon, and engage with global perspectives on technology use and development.

Concluding remarks

We recognize the primacy of process over product in these bidirectional disciplinary encounters. Computers and network technology are changing scholarship by providing new ways of obtaining and sharing data as well as new objects of study, while computer science affords new tools to analyse data as well as new challenges for scholarship to confront. In a liberal arts context, this directly impacts our work as educators and researchers. In the past, at such moments when technologies fundamentally changed how we analyse or experience the world, other disciplines have co-evolved to provide an organized framework for interpreting and responding to those changes. In this article, we proposed a new academic discipline, which we call DCS, and an associated framework, the 4As, that can serve to integrate work that currently spans the academy. At a moment when the ubiquity of devices and apps would suggest that they are a given, like data in its etymological root, datum, the 4As provide a way to treat these artifacts similarly to Drucker’s capta: things removed from a context, subject to sceptical scrutiny for their partiality, and one of many alternatives yet to be imagined.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Notes on contributors

Crystal Hall

Crystal Hall is Associate Professor of Digital Humanities at Bowdoin College where she teaches courses on the relationship between technology and scholarly practice. Her research specialization is Renaissance and Early Modern Italy, and her digital project on Galileo's library builds on both the research completed and the questions raised by her first book, Galileo's Reading (2013). As a founding member of Digital and Computational Studies, she contributes to several campus committees, regularly conducts outreach to colleagues from all academic disciplines, and acts as a liaison with the offices of Information & Technology, Academic Technology & Consulting, and Bowdoin Libraries.

Eric Chown

Eric Chown is the Sarah and James Bowdoin Professor of Digital and Computational Studies at Bowdoin College, a program that he helped found. Chown won an NSF CAREER grant in 2001 for his work on computational models of space, and an NSF RUI grant in 2010 for his work in robotics. His general areas of research also include computational models of human learning, the human emotion system, cognitive robotics, and most recently how people create and understand metaphors. He spent 12 years as the team leader of the Bowdoin College Standard Platform League (SPL) RoboCup team, the Northern Bites, which won the World Championship in 2007 and had several other top-three finishes in the world.

Fernando Nascimento

Fernando Nascimento is Assistant Professor in Digital and Computational Studies at Bowdoin College teaching courses on philosophy of technology and hermeneutics. His research is organized in three interconnected academic axes of ethics, hermeneutics, and digital technologies and has the French philosopher Paul Ricoeur as its main theoretical reference. Prior to his academic positions, he worked for almost 20 years in the telecommunication industry developing software for mobile devices worldwide. He is currently co-director of the Digital Ricoeur project and director of the Society for Ricoeur Studies.

References