Circulation of conspiracy theories in the attention factory

Pages 162-177 | Received 09 Dec 2021, Accepted 20 Feb 2022, Published online: 03 Apr 2022

ABSTRACT

The article argues that the hybrid media environment contributes to contemporary epistemic contestations. Framing the argument within the historical and social scientific contexts of our present media landscape, it discusses the logic governing the content confusion that permeates this landscape in relation to the construction of worldviews and social reality. It then examines the notion of an attention factory. The QAnon phenomenon is presented as an example of how the attention factory works and how conspiracy theories are circulated. Finally, the article considers whether and how aspects of today’s media environment can be considered responsible for, or a contributory factor to, the high public exposure and visibility of conspiracy theories. The article concludes with a brief discussion of some factors that deter the spread of conspiracy theories.

Introduction

In our social media dominated landscape, genre boundaries and the motives of content producers have become increasingly and irrevocably obscured. A text or video that looks like news may well be genuine, bona fide news, but it may also be satire or parody or the product of trolling, provocation for provocation’s sake. This intertwining of genres and motives is addressed by Mara Einstein in her book Black ops advertising (2016). The term she uses to describe the boundary-busting phenomenon is content confusion.

In this article I set out to show how our current media environment contributes to the prominence of conspiracy theories because of the febrile combination of increased epistemic instability and the deregulated attention factory. With epistemic instability (Harambam, 2020) I refer to a situation where collective imaginaries of the world, i.e. “truths” about how the world is and how we can find information about it, are being challenged. Traditional knowledge-producing authorities such as science and universities, media and cultural organizations, governmental bodies and decision makers, as well as established religious institutions, are also being contested, undermined, and challenged for a variety of reasons and motivations.

In relation to the media environment and its content confusion, Mara Einstein’s idea of a “black ops operation” is in many ways useful here. On the one hand, the concept underlines the obscurity of the meaning of contents and the difficulty of establishing producers’ motivations in the present media environment. On the other hand, it reminds us that we face quite intentional manipulation every day – whether in the form of advertising, malign information operations, propaganda or provocation.

Another concept I discuss in this article is that of the attention factory. This is an extension and elaboration of the concept of the “attention machine,” developed with my colleague Niina Uusitalo (Uusitalo & Valaskivi, 2020). The concept of the attention machine evolved from our study of how newsrooms directed and controlled the audience’s attention in their coverage of a terror attack. Following Swiss cultural theorist Yves Citton (2017), we take the view that attention in today’s media environment requires an analysis of its epistemic dimension, not just its economic value. Social reality in human communities is always determined by the current focus of collective attention, and our communication and media technologies play a key role in directing and controlling attention. Therefore, the type of logic that dominates and governs the communication environment is critical in managing collective attention.

Always outrageous and shocking, conspiracy theories evoke emotions, attract attention and extract reactions, and for these reasons, they make for particularly “useful” content in the attention factory. Given the landscape of content confusion, it is hard for media users to identify conspiracy theories, and harder still to know when those theories are possibly being used for purposes of manipulation.

I begin by describing the historical and social scientific context of our present media landscape. I then discuss the logic governing the content confusion that permeates this landscape in relation to the construction of worldviews and social reality, before presenting the notion of the attention factory. As an example of how the attention factory works to circulate conspiracy theories, I refer to the spread of the QAnon phenomenon. I then discuss whether and how aspects of today’s media environment can be considered solely responsible for, or a contributory factor to, the high public exposure and visibility of conspiracy theories. Finally, I briefly address some factors deterring the spread of conspiracy theories.

The secularization thesis

Modern Western social theory and sociology have considered society to be divided into specialized functional sectors, each with its own institutions to safeguard its operation (see e.g., Beyer, 1994). The theory is that in secularized society, the state, economy, science, and law have diverged into their own separate segments with their corresponding institutions (Casanova, 1994). The secularization argument furthermore posits that with modernization and rising education levels, religion will gradually cease to have an impact on society. Modernization theory and the secularization thesis have been widely challenged, but they continue to shape and influence our everyday understanding of the world, as concretely represented by the newspaper (and news in general). Journalism is an epistemic authority that divides the world into seemingly clear-cut segments: politics, economy, culture, sports, domestic and foreign events.

In the theory of modern society, epistemic authorities had their own established institutions. These authorities, including the family, the school system, religious institutions (in the Nordic context especially Lutheran churches), science (including the university system), journalism and public authorities, have largely assumed responsibility for knowledge and meaning production in society. At the same time, these institutions of knowledge production have steered the search for a common understanding of society, life and the world and structured the understanding of history and the future. This production of meanings and understandings has depended importantly on religious and media institutions as well as political, educational and science institutions.

While the secularization thesis has been called into question in many ways, the biases of modernization theory have become increasingly apparent with the expansion of education, rising living standards, the growth of individualization and advances in communication technologies. Sociology, anthropology and religious studies are constantly having to revisit and reassess a past in which the narrative of modernization was developed without any consideration of global inequality and exploitation (Bhambra, 2014), the interweaving of different segments of society or the biases of secularization theory. Furthermore, large bodies of social theory have been blind to the role of communications, and especially media technology, in the construction of communities and belief systems. Another factor that long stood in the way of recognizing the religious aspects of other ideological or worldview-based phenomena, such as conspiracy thinking, was a narrow conception of religion. In this Eurocentric view, religion is seen through its established and institutionalized forms and is mainly defined with explicit or implicit reference to Christianity. Although there is a growing body of literature that “blur[s] distinctions between media spaces and complicate[s] definitions of religion” (Peterson, 2020, p. 1), this way of defining religion still has a strong hold in the study of religion and media.

Modernization theory also continues to enjoy wide circulation and to inform many attempts to explain the current situation. For example, mediatization theory (see Hjarvard, 2008), which predates the rise of social media, outlined a situation where the “logic of the media” penetrates other specialized sectors, changing their respective logics and the mutual relations between those sectors.

Yet the expansion of internet use, and especially the establishment of social media, has made increasingly clear the insufficiency of modernization theory alone to explain the current situation. The spaces of public debate have diversified, conceptions of private and public have become dramatically intertwined, and processes of knowledge and meaning production have become democratized, so that epistemic authorities are no longer in a position to control the public space of debate or to set the agenda (Peterson, 2003). At the same time, questions have been raised as to whether they ever had that control and to what extent modernization theory and its narrative of secularization have in fact steered research toward recognizing some things and ignoring others (see e.g. De Vries, 2001; Hoover, 2011; Peterson, 2020; Promey, 2014; Stout, 2012).

Journalism and social media, equally dependent on attention

Amid growing contestation of epistemic institutions, often labelled “the post-truth era,” media research, and journalism studies in particular, has developed a growing interest in epistemic questions, proposing a new research field of “epistemologies of digital journalism” (Ekström & Westlund, 2019; Ekström et al., 2022) and suggesting social epistemology as a useful new paradigm for journalism and media studies (Godler, Reich, & Miller, 2020). Characteristically for journalism studies, this research is mostly normative: it offers suggestions on how journalism could and should do better and how audiences should be educated so that the epistemic stability of society can be sustained and strengthened. Because of its journalistic frame of reference, this research, although pointing out the challenges that the digital environment poses to journalism, tends to ignore the fact that journalism itself is part of the same operating logic of the media environment and just as dependent on user attention as all other media forms.

In the era of social media, discussions that used to take place in everyday settings now leave traces. The social contexts of these discussions are also different: social media communities can now be formed around a leisure interest, fandom or peer support, with no limitations of physical distance. The translocal circulation of influences is more possible than ever before. On the one hand, this development challenges the taken-for-granted status of national epistemic institutions and creates a public space of debate with more voices – in other words, it creates democratization. On the other hand, epistemic institutions are gaining a stronger role in directing attention, one reflection of which is the surge of conspiracy debates during the pandemic. The Covid-19 pandemic prompted an immediate 30–50% rise in the use of instant messaging, social media, and streaming services (Aral, 2020), which of course also brought an increase in the volume of circulating contents, including conspiracy materials. But journalistic media played a pivotal role in drawing public attention to the phenomenon.

Since a prime driver of our media environment today is attention, many of the machines in the attention factory are specifically designed to attract maximum audience attention. The ability to direct attention is recognized as central to the social power of journalism (Kunelius & Reunanen, 2014). Wide-ranging discussions about the attention economy in the social media context have addressed the ways in which social media companies constantly adjust and fine-tune their algorithms to ensure the maximum spread of content that effectively captures the attention of users. Aral (2020) uses the term “hype machine” to describe this algorithmic feedback loop that relies on artificial intelligence engineering as well as human intervention. Within this hype machine, it seems that journalistic gatekeeper organizations have irreversibly lost control over the attention of audiences. But journalism still plays a significant part in how the attention factory works. It serves as the factory’s attention apparatus, which among other things effectively mainstreams marginal phenomena, such as the circulation of conspiracy theories.

The internet, and later social media, were long developed (or at least marketed and lobbied for) on the premise, by now considered rather naïve and idealistic, that technology enabling polyphonic participation would in itself produce equal freedom of speech, expand opportunities for participation and strengthen democracy. In the social sciences, even critical research long preoccupied itself with the opportunities offered by new technology, such as online activism. One instance of this was the celebration of the 2011 Arab Spring as the first major revolution of the social media age (see e.g., Wolfsfeld, Segev, & Sheafer, 2013). Many of the engineers and consultants who had key roles in social media companies have since distanced themselves from their earlier work and acknowledged the adverse social consequences of algorithmic social media (for a cavalcade of these mea culpa interviews, see Orlowski, 2020). Many of these technology developers have gone on to establish projects and start-ups to develop “better” technology, most notably former Google executive Tristan Harris’ Center for Humane Technology. Likewise, media studies have increasingly begun to emphasize unintended social and environmental consequences (see e.g. Andrejevic, 2013; Hintz, Dencik, & Wahl-Jorgensen, 2019; Kuntsman & Rattle, 2019; van Dijck, Poell, & de Waal, 2018; Velkova, 2021).

The thinking has gradually gained ground that the crux is not technology itself but rather the way in which technology has been taken into use. In particular, the ways in which technology is harnessed for profit are liable to have undesirable social consequences (Couldry & Mejias, 2019; Zuboff, 2019). Whenever the use of current media technology is driven solely by profit motives, the social consequences are severe. In other words, the combination of naïve technological determinism and belief in the blessings of unregulated free markets has led to a situation where the lives of social media users serve as the fuel for the attention factory, whose interest is guaranteed by an algorithmic emphasis on maximally provocative content. Recently, political decision-makers have also become wise to the unintended consequences of the structures of our media environment. Examples are provided by European GDPR legislation, which regulates data collection and users’ privacy protection, and by increasing debate over how global social media platforms should be regulated and restricted in order to minimize damage.

In the situation today, content confusion and the contestation of epistemic authorities both serve to erode social trust and, paradoxically, undermine polyphony and democratization because of hate speech, for example. The features of the media environment that have potentially given a voice to minorities and the oppressed are the very same features that have facilitated manipulation and information operations.

To condense the above, I see the whole media environment – not just journalism – as an intertwined system of epistemological production and reproduction (see Hemmer, 2016). The way this system enables collective knowledge creation and socio-cultural sensemaking depends on its technological affordances as well as on its operating and business logics. In what follows, I argue in more detail that some of the core logics of our contemporary commodified media environment are contributing to the epistemic instability we are currently experiencing.

Epistemic instability in the age of content confusion

Jaron Harambam (2020) uses the concept of epistemic instability in his ethnographic study. With it he refers to the experience and the circumstances of media users exploring conspiracy theories in the jungle of different information sources. For me, the concept of epistemic instability also offers a significant elaboration on the rather tedious debate about “fake media” and “the post-truth era,” and furthermore serves the purpose of describing the erosion of the power of epistemic authorities and the democratization of knowledge production.

The concept of epistemic instability illustrates how, in the present-day media environment, knowledge and meaning production has become an increasingly disruptive battlefield. For some time now, both everyday and political discourse have been trying to make sense of this battlefield by referring to fake news or fake media and the post-truth age. The former notion has become a political weapon that is used to question the value of news that one does not want to hear and even to attack professional journalistic media. Debate on the post-truth era, for its part, tends to get bogged down in the question of when, if ever, we lived in an age of truth, if we have now landed in a post-truth era. It seems that the concept of epistemic instability offers different angles on how the power of epistemic authorities has today become visible and how challenging these authorities has become an everyday occurrence. The contest between different modes of knowledge production and different ways of seeing the world, and the debate on what is true, are now taking place in the same spaces, on the same social media platforms, where news meets opinion and where, say, the medical specialist meets the experiential expert. At the same time, the tone of political debate has been infiltrated by populist “bad manners” (Moffitt, 2016), resulting in an erosion of common courtesy in discussions.

Television news in Finland offers an example that well illustrates the magnitude of the changes we have seen in the media environment. When one “official” broadcasting company aired the news at the same time on both of the available channels, there could be no question about the genre of the programming or about who had produced the news. This kind of genre clarity upheld and consolidated the genre contract between programme producers and viewers, that is, shared conceptions of textual characteristics and of the conditions of production and reception (Fiske, 1987; Gledhill, 1997; Morley, 1981; Ridell, 1998; Valaskivi, 1999). The historical strength of the television news genre in Finland is illustrated by the satirical television show Iltalypsy (Haggling), which made its debut in the early 1990s and initially caused deep public concern. As the programme was first hosted by newsreaders and used materials left over from the newsroom, critics feared that it would be impossible for the audience to distinguish news from satire. Indeed, the producers eventually had to make a cleaner break from the television news and to underline the programme’s satirical nature by means of various visual cues to make sure the genre boundary remained clear (Valaskivi, 2002; Wilska, 1997). But at least it was still possible to curb and even prevent content confusion by changing the ways in which the programme was made.

Today this remedy is no longer possible because the genre contract has been thoroughly eroded by the diversification of media channels, the fragmentation of audiences and the business models of algorithmic social media. This process is illustrated by the concept of content confusion discussed earlier (Einstein, 2016; Noppari, Haara, Nelimarkka, Toivanen, & Valaskivi, n.d.; Valaskivi, 2017). In a media environment permeated by content confusion, it is often impossible to know – at least without thorough investigation – who has produced a specific media content and for what purposes: a text that has the appearance of a news article may be just a joke or propaganda, and a documentary may turn out to be an advertisement. Under content confusion, audiences also choose content differently: mostly not on the basis of genre but through algorithmic curation that seeks to catch user attention and induce reactions. Often the motivation for producing a text will never become known to the recipient, who may decide to share the content without that knowledge and so make it even harder for the next recipients to understand that motivation.

At the same time, the attention factory seeks to maximize the attention and reactions it receives from users, since these are what generate its profits and data. The different parts of the attention factory are all interconnected, and content is circulated between them. In this process of circulation, content changes and mutates, and it is framed and reframed in multiple different ways (Pyrhönen & Bauvois, 2020; Toivanen, Nelimarkka, & Valaskivi, 2021). Different agents’ motives for framing and content production become blurred along the way, and the motives of users involved in the circulation are layered on top of them.

Content confusion and the difficulty of identifying text genres and the motives for producing texts are key sources of epistemic instability. Not only are epistemic authorities and their position purposely challenged and undermined through the circulation of conspiracy theories, through attacks on and critiques of elites and through the questioning of their legitimacy; it is also difficult to tell for what purposes content is produced in the attention factory and who has produced that content. A text that popularizes scientific results, for instance, can have the appearance of pseudoscience, and vice versa. This confusion paves the way for propaganda and manipulation, but also facilitates the spread of misinformation.

The democratization of knowledge production means that different ways of organizing knowledge or seeing the world appear in the same arenas to compete for the attention of media users. Scientific knowledge and the traditions on the highest rungs of the epistemic hierarchy, typically the domain of universities or churches, are challenged by experiential knowledge, by intuitive approaches or by the advocacy of channeling as a way of acquiring more authentic knowledge (Robertson, 2021). The key content of conspiracy theories often stresses that epistemic institutions or authorities are corrupt or evil liars. Their proper replacement, conspiracy theories have it, would be the community of the “enlightened” who have seen the truth. Under conditions of content confusion, social media users must make a special effort to ascertain the origin and veracity of the arguments presented. In most cases this effort will be rewarded and the lies uncovered, but the average social media user will not always have the means to recognize deliberate manipulation, for example – whether its motives stem from financial or political interests or just pure amusement. A user who shares the worldview of the presented arguments will not even try.

Attention factory, emotions, and algorithms

It is nothing new to point out that attention is pivotal in our media environment today. This is well established in the debate and research on the public sphere, marketing and branding, most particularly by virtue of the concept of the attention economy. In this article, however, I follow the ideas of Swiss cultural theorist Yves Citton (2017) regarding the “ecology of attention.” The concept of ecology conveys that we are dealing with something much wider than just “economy”: attention and the collective direction of attention contribute to the construction of social reality. The media environment in its entirety – that is, the attention factory – makes up an “ecological” system whose principles have a key influence on how conceptions of reality, of good and bad or of “us” and “others” evolve and take shape in our digitalized and datafied society. In other words, it is the media ecology that defines how our media functions as a system of epistemological production and reproduction. To put it most simply, the logic of the media environment, its technological affordances and the ways in which people act in the environment determine what becomes social reality and what does not. Issues and subjects that receive no attention in public debate do not become part of our shared reality and social imaginaries. In essence, then, the attention factory is ultimately a matter of power: the direction and tone of attention determine our understandings of the meaning of ourselves and others in our community and society.

The attention factory is a highly complex system that comprises multiple different platforms, channels, genres, content producers, circulators and other actors. Although the attention machines within this system – journalism, algorithmic social media platforms, anonymous imageboards, and so on – have different logics, cultures and ethical premises, they all share the same dependence on user attention. Attention machines therefore aim aggressively to gather as many pairs of eyes and clicking fingers as possible. Recent research is broadly unanimous that the problem lies precisely in the complete disinterest of most attention machines in what kind of content is harnessed to the end of garnering user attention (e.g., Aral, 2020; Zuboff, 2019). As a result, the question of whose truth is the right one becomes a subjective matter and an issue of identity politics. On the one hand, conspiracy theories are used as tools of identity politics; on the other hand, various kinds of communities emerge in the social media field that reinforce themselves and their worldview by collecting, sharing, editing and commenting on conspiracy materials.

All this communicative activity provides fuel for the attention factory, which applies a range of attention-garnering tools that were not available to any earlier technology. Attention machines collect data on users’ actions and movements and on this basis infer what kind of content those users are interested in or with what kinds of other users they are networked. Media users, with their human reactions, are the workers of the attention factory, producing for media platforms and media houses both their profits and the data used for the further development of artificial intelligence.

Contents that evoke an emotional response are particularly valuable in the attention factory because, as a rule, they first attract people’s attention and then prompt a reaction (Knuutila, 2019; Wahl-Jorgensen, 2019). From recent Facebook leaks we know that the platform for this reason designed its algorithm to prioritize contents that triggered the “angry” emoji reaction (The Wall Street Journal, 2021). Conspiracy theories are thus profitable content, as they elicit both attention and emotions for various, even contrasting, reasons. First, the element of surprise almost always attracts attention, and conspiracy theories are usually surprising because they do not have to obey the laws of physics or any other established facts. For the same reason, a lie tends to gain more traction on social media than a sound factual argument, because it is probably more surprising than the truth. Secondly, conspiracy theories make use of well-known and existing mythical narratives and stereotypes that give some people a sense of group membership and provoke in others a need to contest these narratives. With both supporters and opponents disseminating the lie, the algorithm-strengthened message spreads even further. Thirdly, conspiracy theories are effective tools for creating enemy images and for drawing distinctions and boundary lines.
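The reaction-weighting logic described above can be sketched as a deliberately simplified toy model. The weights, reaction types and example posts below are hypothetical illustrations, not any platform’s actual ranking code or parameters; the point is only to show how ranking by weighted reactions favours content that provokes strong responses, whatever their valence.

```python
# Toy model of reaction-weighted feed ranking (hypothetical weights and data).
REACTION_WEIGHTS = {"like": 1.0, "love": 1.5, "angry": 5.0, "share": 3.0}

def engagement_score(reactions):
    """Sum reactions weighted by how strongly the ranker values each type."""
    return sum(REACTION_WEIGHTS.get(kind, 1.0) * count
               for kind, count in reactions.items())

posts = {
    "measured news report": {"like": 120, "love": 10, "angry": 2, "share": 5},
    "conspiracy claim": {"like": 40, "love": 5, "angry": 90, "share": 60},
}

# Rank purely by engagement: the post that provokes the most (and angriest)
# reactions rises to the top of the feed, regardless of its accuracy.
for name in sorted(posts, key=lambda p: engagement_score(posts[p]), reverse=True):
    print(f"{name}: score={engagement_score(posts[name]):.0f}")
```

In this toy ranking, the provocative post outranks the measured one by a wide margin even though most of its reactions are negative; this is the sense in which reactions from supporters and opponents alike “reward” the same content.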

Attention factory, QAnon and epistemic instability in crises

It has been noted that the appeal of conspiracy theories increases at times of crisis, in the same way as interest in religion and religious practice increases with growing social instability and feelings of insecurity. Crises may be acute and sudden, resulting from natural disasters, terror attacks or war, but they may also be longer-term, slowly evolving situations with no definite start or end point. In modern life, crises and the collective understanding of their meaning, reasons, depth and consequences are all formed via the media. In media society, crises and catastrophes become hybrid media events (Sumiala, Valaskivi, Tikka, & Huhtamäki, 2018) whose reasons and explanations are sought on different kinds of platforms and in different kinds of genres.

In a crisis situation, conspiracy theories provide clear and simple explanations for complex questions that epistemic authorities are unable to answer with any conviction. The scientific process is slow and based on the best evidence currently available, which may be called into question as new knowledge emerges. Journalism, too, often produces imperfect, incomplete knowledge, especially in conditions of unfolding crises. In this situation, conspiracy theories can answer the need for certainty and security, a need that is particularly pronounced when people are concerned, angry and fearful – emotions that are commonplace in crisis situations. Conspiracy theories also offer a community of like-minded people, a potential source of social support – something that is rarely forthcoming from either science or journalistic texts. In this sense, conspiracy theories provide an answer to epistemic instability: they are a new authority in which people can believe. This happens even though conspiracy theories at the same time heighten the sense of crisis and foment a sense of threat with suggestions of evil and deceitful actors working behind the scenes.

As I noted above, the attention factory accelerates and expands the circulation of conspiracy theories, among other reasons because they elicit reactions both from those who are interested in and support them and from those who are bemused by and firmly reject them (Knuutila, 2019). Contents that elicit widespread public reaction will be algorithmically interpreted as holding great interest for large numbers of people, and they will therefore be spread to ever new users. It is now acknowledged that the algorithms used by YouTube have prioritized far-right content (Helin, 2021) and, in Finland for instance, mainly offered anti-vaccine disinformation and conspiracy theories to people looking for Covid vaccine information.

Conspiracy theories elicit reactions because they elicit emotions. The ideal raw material for the attention factory is content that draws reactions from as many users as possible. Conspiracy theories baffle people who think the stories are ridiculous and who think the people peddling them are “crazy.” At the same time, others are interested and inspired. With reactions coming from both the baffled and the enthused, algorithms will tag conspiracy theories as ideal, optimal content. That content will consequently be shared and spread ever more widely. Once the phenomenon reaches critical mass on social media platforms, or when journalists learn about it through social media updates by critics of the phenomenon, journalism will begin to show an interest, spreading the conspiracy narratives to millions of recipients. In other words, mainstreaming is the result of journalistic media giving news coverage to the social media phenomenon (see e.g. Chadwick, 2013 on the relationship between journalism and social media). By this stage at the latest, politicians will begin to take note and seize the opportunities presented. It is just as likely, however, that some politicians have jumped on the bandwagon much earlier. Journalism readily reports on politicians who circulate these narratives, further amplifying conspiratorial contents.

To illustrate how the attention factory works, I refer to QAnon, which concretely demonstrates the interactions and mutual relations between different media platforms, channels and genres. QAnon was perhaps not initially deliberate propaganda, but it was used as such by Trump’s presidential campaign, as well as by Russian and Chinese actors (The Soufan Center, 2021).

Nevertheless, the origins of QAnon have been traced back to the 4chan imageboard and later to the 8chan imageboard. Central to imageboard culture are satirical and ironic “trolling” and “shitposting,” pranking and joking with the aim of getting other users to believe malicious arguments about any subject that arouses speculation and emotions: political opinions, religious views, current events, and identity issues. In this environment, conspiracy theories and posing as an “insider” of some organization of interest are part of the culture of interaction. For true fans and devotees, the imageboard culture and the climate of speculation are nothing out of the ordinary. Fooling “naïve” users and people from outside the imageboard culture is part of the inner circle’s fun (QAnon Anonymous, 2018).

There are many theories, but it is still not known who decided to start spreading cryptic messages under the pseudonym of Q on 4chan in October 2017. Q claimed to be a US government insider with Q-level security clearance. As it turned out, Q was a diehard Trump supporter whose online prophecies hinted at a global satanic pedophile cult running a secret war to overthrow the US president. Q predicted that Trump would prevail in this war between good and evil and that subsequently the world would be a better place. Q was by no means the only insider appearing on 4chan at the time; others included FBIAnon and WhiteHouseAnon (Ellmer, 2021; QAnon Anonymous, 2018).

There are several reasons why QAnon took off the way it did. My focus here is on the factors related to the current media environment. The first steps came with social media influencers, who intentionally spread “crumbs” of Q’s messages on YouTube and Reddit. Over time, QAnon began to appear on well-known far-right channels, such as Alex Jones’s radio show and Breitbart online news, which were already known for spreading conspiracy materials. Q supporters then began to turn up at Donald Trump’s rallies ahead of the 2018 midterm elections, and the campaign organization started actively to exploit the phenomenon. By this stage Q’s “crumbs” had moved from 4chan to 8chan (later 8kun), but most of the growing number of followers were picking up the message via other platforms. When Reddit banned all QAnon content, the material began to seep through to Facebook and Twitter. The FBI announced in 2019 that the growing extremist movement associated with QAnon had brought an increased threat of domestic terrorism in the United States (FBI, 2019) and has repeated this warning several times, most recently in June 2021 (Barr & Pecorin, 2021).

QAnon entered the international mainstream with the Covid-19 pandemic. In March 2020, when most countries across the world were closing their borders and ordering citizens into lockdown, the use of social media and instant messaging services jumped by almost 50% overnight (Aral, 2020). Thousands of QAnon-related accounts were set up on Facebook and Twitter, and the number of members in these groups skyrocketed. Various conspiracy theories related to the pandemic became attached to the original Q narratives. By this point the phenomenon had grown to such an extent that journalistic media around the world were covering the story and academic researchers were studying it continuously. The public spotlight prompted a new wave of interest among new audiences. Political point scoring over QAnon inevitably ensued, and at the same time critics of the phenomenon were also spreading the message through their involvement in public debate.

QAnon helps to shed light not only on how the attention factory works, but also on how epistemic conflicts emerge and expand under the conditions of the current media environment. QAnon is the embodiment of challenging epistemic institutions and authorities. QAnon and its supporters are suggesting that scientists are biased and lying for financial gain, or just out of malice; that Democrats are lying to satisfy their desire for influence and power; and that journalists and media organizations are “fake news,” spreading lies to promote an elite conspiracy. Religious leaders, who have condemned the dissemination of conspiracy theories and urged people to get vaccinated against Covid-19, for instance, are immediately considered part of this evil elite conspiracy. QAnon is interwoven with historical and religious conspiracy theories and among other things propagates an antisemitic narrative of elite secret societies of global bankers.

In challenging epistemic authorities, Q effectively takes their place, while the QAnon community takes the same advice as is given in critical media literacy guides: do your own research. The QAnon belief system slots in perfectly with an individualist culture where everyone can, and should, form their own convictions and believe they have done so all by themselves. Paradoxically, any source or content that supports the core messages of the belief system lends itself to the construction of the narrative, whether that is a film, a scientific report or even a critical journalistic story. Following Q’s crumbs becomes a game played by a growing number of followers. Unwittingly, the opponents of the phenomenon also become involved in the game, spreading it to people who would otherwise never come across QAnon materials.

Attention factory, conspiracy theories and counterforces to circulation

Our media environment serves as an attention factory that produces and directs human attention. All attention machines within that factory are interconnected. Platforms, channels, and genres work in the same factory and contribute to mainstreaming conspiracy theories. Understanding this requires multidisciplinary approaches that not only combine media research and religious studies but also overcome the still prevalent divide between journalism studies, cultural studies and the study of popular communication. The epistemic justifications of conspiracy theorizing such as QAnon are typically based on combinations of various genres, drawing together a vast array of popular culture references, reframings of news reports from different sources, academic articles, religious references and so forth. In this syncretism QAnon is emblematic of content confusion.

The more emotions a piece of content elicits, the more it attracts attention and reactions, and the more widely it spreads. In other words, the social media business model is based on provocation. Journalistic media are equally dependent on readers’ and viewers’ attention. The focus of journalism, too, is more and more often based on user data analysis. Even when journalistic guidelines are followed, the main priority is always to reach the audience. That is why, in the end, the attention factory is always governed by this same logic: the whole media ecology aims at attracting users. This drive is fueled by both algorithmic and human prioritizing of affective – particularly hateful (Wahl-Jorgensen, 2019) – contents, which conspiracy theories readily provide. Ironically, journalistic media publishing about conspiracy theories may end up not only spreading the theories but also, through “debunking” practices, undermining their own reliability (see e.g. Dentith, 2020).

It is commonplace to claim that the implications of a business model based on provocation and profiling have come as a surprise to technology companies and legislators alike. However, research shows that internet scholars and developers recognized early on the potential distorting effects of data and the problems with privacy protection, and urged that these be taken into account (Zuboff, 2019). Another indication that the problems were known from an early stage is that Google’s developers started out from the idealistic position that they would refuse to use the search engine for advertising purposes, recognizing that the application of commercial principles would deeply change the nature of the search engine. But that idealism soon fell by the wayside. Global faith in innovation (Valaskivi, 2012, 2020), coupled with faith in start-up entrepreneurship (Valaskivi & Sumiala, 2013) and excitement about the revolutionary potential of the “platform economy,” led most countries in the advanced world to encourage technology development based on a profit motive. Given these premises, Facebook and other social media platforms have started too late and done too little to curb the spread of misinformation and disinformation and the explosion of conspiracy materials.

Yet it would be an exaggeration to suggest that technology itself is to blame for epistemic instability and the spread of conspiracy theories. Conspiracy theories and their use for propaganda purposes have no doubt existed throughout human history, and they have always been spread using the communication technology available at the time (Butter & Knight, 2020; Fay, 2019; Hofstadter, 1964). Conspiracy theories have been developed for various reasons and used for various purposes, and there is nothing inherently dangerous or condemnable about speculating about possibilities or believing in theories that are incompatible with generally accepted knowledge, nor are the people who entertain these ideas insane or irrational. In the current media environment, however, the spread of conspiracy theories takes on certain features that are related to the mechanisms of the attention factory, but also to the dynamics of social action. It is quite common for affective communities to form around conspiracy theories, with inner logics closely akin to those of various fan communities. For instance, the accumulation of social capital and appreciation in these communities takes place through active content production and circulation. This inner logic is therefore also liable to increase the circulation of conspiracy materials.

To simplify, one might suggest that the attention factory benefits from content confusion. Ultimately it makes no difference to the social media “hype machine” what kind of content is being circulated. This is the very feature of the attention factory that tends to undermine the stability of society’s epistemic foundation. The transnational and cross-platform circulation of contents further adds to content confusion and makes it impossible to ascertain the motives for producing the contents in circulation. This in turn creates fertile ground for the spread of propaganda and information operations.

So why have conspiracy theories not taken over completely and replaced all knowledge production, despite the mechanisms of the attention factory and content confusion? Because of human beings. Even before the Covid-19 pandemic there was growing recognition of the problems of the digital media environment, and not just in narrow scholarly circles. Information on data leaks, government surveillance of individual citizens and the proliferation of lies had caused mounting public concern about the role of social media giants and the possible need to rein them in. The expansion of QAnon – and the change of power in the United States – also led to the wide-scale removal of QAnon materials from virtually all major algorithmic social media platforms, although much of that material and the related communities soon reappeared on other channels such as Telegram and Parler. And Q has stopped posting; there have been no new messages since December 2020. Some of Q’s followers have given up their faith, and people who occasionally spread QAnon materials have perhaps moved on to other subjects. The phenomenon itself and the political dividing lines that made QAnon so attractive to so many people, especially in the US, have not gone away, even though the audiences have shrunk considerably since Donald Trump faded from the public eye. Conspiracy myths and narratives also remain in circulation; it is just that they have become attached to other topical themes. For example, a Finnish group opposing coronavirus restrictions decided – after serious and heated exchanges – to close discussion in its Telegram group after far-right attempts to hijack the agenda for debate.

The circulation and possible reinforcement of contents on social media platforms happens algorithmically. It is this circulation that leads to an accumulation of attention on the items that evoke the most reactions. Sometimes there is a conscious effort to accelerate this accumulation by means of bots or trolling, for example. Cross-platform circulation, on the other hand, usually requires a human agent, someone who actively transfers the content or idea from one place to another, in the way that QAnon was transferred from an imageboard to wider social media audiences. The main driver of mainstreaming is thus the human individual (together with others): the media user who believes in the conspiracy theory or who thinks it is a big joke, the politician who thinks there are points to be scored by spreading the theory – or the journalist who thinks the audience will be interested in conspiracy theories and will therefore want to pay attention. Humans and human-created organizations and institutions are also the agents that can change the logic of the attention factory – if they want to. It is not easy, but nor is it impossible.
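The two circulation mechanisms distinguished here, algorithmic amplification within a platform and human-driven transfer between platforms, can be illustrated with a minimal toy simulation. All numbers, platform names and probabilities below are hypothetical assumptions chosen only to make the dynamic visible, not empirical estimates or any platform’s actual behaviour.

```python
import random

# Toy simulation of the two circulation mechanisms discussed above
# (all parameters are hypothetical illustrations, not empirical estimates).
random.seed(1)

def simulate(steps=10, transfer_chance=0.2):
    # Estimated reach of one conspiracy item on two hypothetical platforms.
    reach = {"imageboard": 100.0, "mainstream platform": 0.0}
    for _ in range(steps):
        # Within-platform loop: supporters AND opponents react, and the
        # algorithm boosts whatever gets reacted to.
        for platform, audience in reach.items():
            if audience > 0:
                reactions = 0.10 * audience + 0.05 * audience
                reach[platform] = audience + 2.0 * reactions
        # Cross-platform transfer needs a human agent (fan, critic, journalist)
        # who carries the item to a new platform and seeds a new loop there.
        if reach["mainstream platform"] == 0 and random.random() < transfer_chance:
            reach["mainstream platform"] = 50.0
    return reach

print(simulate())
```

In the sketch, nothing spreads beyond the original board until a human carries it over; once that happens, the same reaction-driven loop begins amplifying it for a much larger audience, which is, in highly simplified form, the mainstreaming dynamic described above.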

The article is intended for the special issue on Epistemic Contestations in the Hybrid Media Environment.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This work was supported by the Academy of Finland under Grant 329343; and The Helsingin Sanomat Foundation under Grant “The Politics of Conspiracy Theories.”

Notes on contributors

Katja Valaskivi

Associate Professor Katja Valaskivi heads the Helsinki Research Hub on Religion, Media and Social Change (Heremes) and is a Research Programme Director at the Helsinki Institute for Social Sciences and Humanities (HSSH) at the University of Helsinki. Her research focuses on the circulation of belief systems, worldviews and ideologies from the perspectives of media research and study of religion.

References

  • Andrejevic, M. (2013). Infoglut. How too much information is changing the way we think and know. Routledge.
  • Aral, S. (2020). The hype machine: How social media disrupts our elections, our economy, and our health – And how we must adapt. Currency.
  • Barr, L., & Pecorin, A. (2021). FBI warns lawmakers frustrated QAnon conspiracy followers could again turn violent. ABC News. Retrieved from https://abcnews.go.com/Politics/fbi-warns-lawmakers-frustrated-qanon-conspiracy-followers-turn/story?id=78288191
  • Beyer, P. (1994). Religion and globalization. Sage.
  • Bhambra, G. K. (2014). Connected sociologies. Bloomsbury.
  • Butter, M., & Knight, P. (2020). Conspiracy theory in historical, cultural and literary studies. In M. Butter & P. Knight (Eds.), Routledge Handbook of conspiracy theories. Routledge.
  • Casanova, J. (1994). Public Religions in the Modern World. University of Chicago Press.
  • Chadwick, A. (2013). Hybrid media system. Politics and power. Oxford University Press.
  • Citton, Y. (2017). The ecology of attention (B. Norman, Trans.). Polity Press. (Original work published 2014).
  • Couldry, N., & Mejias, U. A. (2019). The costs of connection. Stanford University Press.
  • De Vries, H. (2001). In media res: Global religion, public spheres, and the task of contemporary comparative religious studies. In H. de Vries & S. Weber (Eds.), Religion and media (pp. 3–42). Stanford University Press.
  • Dentith, M. R. X. (2020). Debunking conspiracy theories. Synthese, 198(10), 9897–9911. doi:10.1007/s11229-020-02694-0
  • Einstein, M. (2016). Black ops advertising: Native ads, content marketing, and the covert world of the digital sell. OR Books.
  • Ekström, M., Lewis, S., & Westlund, O. (2022). Epistemologies of digital journalism and the study of misinformation. New Media & Society, 22(2), 205–212. doi:10.1177/1461444819856914
  • Ekström, M., & Westlund, O. (2019). Epistemology and Journalism. Oxford Research Encyclopedia of Communication.
  • Ellmer, M. (2021, April 15). QAnon & its origins. Grey Dynamics. https://www.greydynamics.com/qanon-and-its-origins/
  • Fay, B. (2019). The Nazi Conspiracy Theory: German Fantasies and Jewish Power in the Third Reich. Library & Information History, 35(2), 75–97. doi:10.1080/17583489.2019.1632574
  • FBI. (2019) Anti-government, identity-based, and fringe conspiracy theories very likely motivate some domestic extremists to commit criminal, sometimes violent activity. Retrieved from https://www.scribd.com/document/420379775/FBI-Conspiracy-Theory-Redacted#from_embed
  • Fiske, J. (1987). Television culture. Methuen.
  • Gledhill, C. (1997). Genre and gender: The case of soap opera. In S. Hall (Ed.), Representation: Cultural representations and signifying practices (pp. xx–yy). Sage.
  • Godler, Y., Reich, Z., & Miller, B. (2020). Social epistemology as a new paradigm for journalism and media studies. New Media & Society, 22(2), 213–229. doi:10.1177/1461444819856922
  • Harambam, J. (2020). Contemporary conspiracy culture: Truth and knowledge in an era of epistemic instability. Routledge.
  • Helin, S. (2021) Salaliittoja ja Raamattua. Youtube tarjoaa salaliittoteorioita ja Raamatun tulkintoja koronarokotteista tietoja hakeville suomalaisille. Ylen selvitys paljastaa somejätin tekemän virheen. (Conspiracies and Bible. Youtube offers conspiracy theories and Bible studies for Finns looking for Covid-19 vaccination information. Yle’s report reveals the errors of the social media giant.) National Broadcasting Company Yle report May 30, 2021. Retrieved from https://yle.fi/uutiset/3-11951146
  • Hemmer, N. (2016). Messengers of the right: Conservative media and the transformation of American politics. University of Pennsylvania Press.
  • Hintz, A., Dencik, L., & Wahl-Jorgensen, K. (2019). Digital citizenship in a datafied society. Polity.
  • Hjarvard, S. (2008). The mediatization of religion: A theory of the media as agents of religious change. Northern Lights: Yearbook of Film & Media Studies, 6(1), 9–24. doi:10.1386/nl.6.1.9_1
  • Hofstadter, R. (1964). The paranoid style in American politics. Alfred A. Knopf.
  • Hoover, S. M. (2011). Media and the imagination of religion in contemporary global culture. European Journal of Cultural Studies, 14(6), 610–625.
  • Knuutila, A. (2019). Närkästyksen kone. Miksi uusoikeiston ääni kuuluu verkossa muita vahvemmin. (The indignation machine. Why the far right gets a louder voice in the net than others) Politiikasta.fi May 9, 2019. Retrieved from https://politiikasta.fi/narkastyksen-kone-miksi-uusoikeiston-aani-kuuluu-verkossa-muita-vahvemmin/
  • Kunelius, R., & Reunanen, E. (2014). The medium of the media: Journalism, politics and the theory of “mediatisation.” Javnost – The Public, 19(4), 5–24. doi:10.1080/13183222.2012.11009093
  • Kuntsman, A., & Rattle, I. (2019). Towards a paradigmatic shift in sustainability studies: A systematic review of peer reviewed literature and future agenda setting to consider environmental (un)sustainability of digital communication. Environmental Communication, 13(5), 567–581. doi:10.1080/17524032.2019.1596144
  • Moffitt, B. (2016). The global rise of populism. Performance, political style, and representation. Stanford University Press.
  • Morley, D. (1981). ‘The nationwide audience’: A critical postscript. Screen Education, 39, 3–14.
  • Noppari, E., Haara, P., Nelimarkka, M., Toivanen, P., & Valaskivi, K. (n.d.). Sisältösekaannuksen selviytymisopas: Selviä verkon infokaaoksesta [A survival guide to content confusion: How to overcome online information chaos]. Tampereen yliopisto: Tutkimuskeskus Comet. https://sisaltosekaannus.fi
  • Orlowski, J. (2020). The social dilemma [Documentary film]. Netflix.
  • Peterson, M. A. (2003). Anthropology and communication: Media and myth in the new Millenium. Berghahn Books.
  • Peterson, K. M. (2020). Pushing boundaries and blurring categories in digital media and religion research. Sociology Compass, 14(3). doi:10.1111/soc4.12769
  • Promey, S. M. (2014). Sensational religion: Sensory cultures in material practice. Yale University Press.
  • Pyrhönen, N., & Bauvois, G. (2020). Conspiracies beyond fake news: Producing reinformation on presidential elections in the transnational hybrid media system. Sociological Inquiry, 90(4), 705–731. doi:10.1111/soin.12339
  • QAnon Anonymous. (2018, August 18). Introduction to QAnon [Podcast episode]. Retrieved from https://soundcloud.com/qanonanonymous/episode-1-introduction-to-qanon
  • Ridell, S. (1998). Tolkullistamisen politiikkaa. Tv-uutisten vastaanotto kriittisestä genrenäkökulmasta [Politics of sensibility: Tv-news reception from a critical genre perspective]. Acta Universitatis Tamperensis 617. Tampereen yliopisto.
  • Robertson, D. G. (2021). Legitimizing claims of special knowledge: Towards an epistemic turn in religious studies. Temenos – Nordic Journal of Comparative Religion, 57(1), 17–34. doi:10.33356/temenos.107773
  • The Soufan Center. (2021). Quantifying the Q conspiracy: A data-driven approach to understanding the threat posed by QAnon. Limbik. https://thesoufancenter.org/wp-content/uploads/2021/04/TSC-White-Paper_QAnon_16April2021-final-1.pdf
  • Stout, D. A. (2012). Media and religion: Foundations of an emerging field. Routledge.
  • Sumiala, J., Valaskivi, K., Tikka, M., & Huhtamäki, J. (2018). Hybrid media events: The Charlie Hebdo attacks and the global circulation of terrorist violence. Emerald.
  • Toivanen, P., Nelimarkka, M., & Valaskivi, K. (2021). Remediation in the hybrid media environment: Understanding countermedia in context. New Media & Society, 1–26. doi:10.1177/1461444821992701
  • Uusitalo, N., & Valaskivi, K. (2020). The attention apparatus: Conditions and affordances of news reporting in hybrid media events of terrorist violence. Journalism Practice, 1–19. doi:10.1080/17512786.2020.1854052
  • Valaskivi, K. (1999). Relations of television. Genre and gender in the production, reception and text of Japanese family drama [Doctoral dissertation, University of Tampere]. Trepo. https://trepo.tuni.fi/handle/10024/66421
  • Valaskivi, K. (2002). Leipää ja rinkeliä: Johdatus asian ja viihteen suhteeseen suomalaisessa televisiossa [Bread and pretzel: Introduction to the relation of fact and entertainment in Finnish television]. Tampereen yliopiston tiedotusopin laitoksen julkaisuja B43. Tampereen yliopisto.
  • Valaskivi, K. (2012). Dimensions of innovationism. In P. Nynäs, M. Lassander, & T. Utriainen (Eds.), Post-Secular Society (pp. 129–156). Transaction Publishers.
  • Valaskivi, K. (2017). Valheita, sopimuksia ja sisältösekaannusta [Lies, contracts, and content confusion]. Aikalainen. http://aikalainen.uta.fi/2017/12/01/valheita-sopimuksia-ja-sisaltosekaannusta/
  • Valaskivi, K. (2020). The contemporary faith of innovationism. In E. Bell, S. Gog, A. Simionca, & S. Taylor (Eds.), Spirituality, organization and neoliberalism: Understanding lived experiences (pp. 171–193). Edward Elgar.
  • Valaskivi, K., & Sumiala, J. (2013). Yhteisöt liikkeessä: Innovaatiouskon kiertoa jäljittämässä [Communities on the move: Tracing the circulation of innovation belief]. In M. Lehtonen (Ed.), Liikkuva maailma: Liike, raja, tieto [The moving world: Movement, border, knowledge] (pp. XX–YY). Vastapaino.
  • van Dijck, J., Poell, T., & de Waal, M. (2018). The platform society. Public values in a connective world. Oxford University Press.
  • Velkova, J. (2021). Thermopolitics of data: Cloud infrastructures and energy futures. Cultural Studies, 35(4–5), 663–683. doi:10.1080/09502386.2021.1895243
  • Wahl-Jorgensen, K. (2019). Emotions, media and politics. Polity.
  • The Wall Street Journal. (2021, October 1). The Facebook files. https://www.wsj.com/articles/the-facebook-files-11631713039?page=3
  • Wilska, K. (1997). Iltalypsy - faktan ja fiktion kaksihintajärjestelmä [Evening milking – The dual price system of fact and fiction] [Master’s thesis, University of Tampere]. Trepo. https://trepo.tuni.fi/handle/10024/89262
  • Wolfsfeld, G., Segev, E., & Sheafer, T. (2013). Social media and the arab spring: Politics comes first. The International Journal of Press/Politics, 18(2), 115–137. doi:10.1177/1940161212471716
  • Zuboff, S. (2019). The age of surveillance capitalism. Profile Books.