Research Article

Harnessing Distrust: News, Credibility Heuristics, and War in an Authoritarian Regime


ABSTRACT

To evaluate the credibility of political information, citizens rely on simple logical rules-of-thumb or heuristics based on various resources, such as personal experience and popular wisdom. It is often assumed that, in contrast to dependence on the media, personal experience and popular wisdom help citizens to build alternative understandings of political events. However, little is known about how citizens use heuristics in authoritarian settings. Relying on focus groups, this study uses Russian citizens’ reception of the regime propaganda regarding Ukraine in 2016–17 as a case study to investigate the credibility heuristics of citizens living in an autocratic state during war. Deploying both qualitative and quantitative analysis of citizens’ discourse, I identify the main heuristics used to evaluate the credibility of propaganda. I show that citizens perceive regime propaganda with distrust and often rely on popular wisdom and personal experience to identify bias. However, this does not necessarily guarantee a critical attitude toward regime propaganda. Citizens use these resources to evaluate propaganda’s credibility selectively depending on their political alignment. Indeed, their reliance on personal experience and popular wisdom undermines the authority of state media in general. However, propaganda resonates with the distrust toward media and politics that permeates citizens’ experiences. As a result, the reliance on these resources for interpreting political information can amplify, rather than erode, the credibility of specific news stories. These results contribute to the understanding of both how propaganda is received and how credibility heuristics are used in an authoritarian environment.

Introduction

When Russia invaded Ukraine in 2022, scholars and journalists alike emphasized the importance of delivering facts about the war to the Russian public. Several weeks into the invasion, both anti-war activists in Russia and the international public encountered a puzzling phenomenon: many Russian citizens supporting the war refused to believe not only the accounts of reputable media but also their own relatives in Ukraine, who encountered an “almost surreal backlash from family members in Russia, who refuse to believe that Russian soldiers could bomb innocent people” (Hopkins, Citation2022).

The relationship between propaganda and information obtained elsewhere – from personal experience or the accounts of close others – is important for understanding the nature of the Russian public’s reaction to the war as well as authoritarian political communication more generally. Scholars argue that citizens rely on heuristics or simple rules-of-thumb to make sense of the flood of information in modern, saturated media environments. They use relevant personal experience and popular wisdom to grasp the message, decide which heuristic to use, and evaluate political information critically (e.g., Gamson, Citation1992; Just et al., Citation1996). Since contemporary authoritarian regimes are characterized by profound media distrust (e.g., Tsfati & Ariely, Citation2014), these resources are even more important in autocracies.

Nevertheless, there is no consensus on whether these resources actually help citizens to evaluate media messages critically. Some scholars argue that in such environments, popular wisdom reminds citizens of the manipulative nature of the media, while available experience functions as an alternative source of information (e.g., Mickiewicz, Citation2005, Citation2008). Others show that citizens often end up consuming propaganda despite postulated skepticism (e.g., Szostek, Citation2018a). This paper asks the following research questions: How do citizens evaluate the credibility of regime communication in an authoritarian environment? Does reliance on personal experience and popular wisdom help citizens evaluate regime propaganda critically?

Putin’s regime propaganda regarding Ukraine presents a good opportunity to address these questions. Russia’s political regime is an autocracy witnessing consistent democratic backsliding (Alizada et al., Citation2021). Since 2014, Russia has been involved in the Donbas war, which led to a full-scale invasion of Ukraine in 2022. State-controlled media played a key role in shaping citizens’ attitudes toward the regime and Russia’s policies in Ukraine (Alyukov, Citation2022b). At the same time, paradoxically, Russia is characterized by profound media distrust. While some media distrust results from the turbulent political legacies of the 1990s, the propagandistic nature of the reporting on the Ukraine conflict has left many citizens even more skeptical of state media (Alyukov, Citation2022a; Shields, Citation2021).

The study is based on focus groups with Russian citizens, conducted in 2016–2017 and structured around watching news episodes from Russian state television about Ukraine’s Euromaidan protest and the Donbas war. Relying on both qualitative and quantitative analysis of citizens’ narratives, I explicate the mechanisms underlying the use of personal experience and popular wisdom in evaluating the credibility of regime propaganda regarding Ukraine in 2016–17. I argue that underlying distrust toward state media drives citizens to rely on personal experience and popular wisdom as alternative sources of information for political thinking. However, this does not necessarily guarantee that they evaluate propaganda critically. Participants use these resources to evaluate propaganda as credible or untrustworthy selectively depending on political alignment. In addition, regime propaganda capitalizes on popular distrust toward politics and the media. The reliance on personal experience undermines the authority of state media in general. However, regime propaganda resonates with the distrust toward media and politics permeating citizens’ personal experience and popular wisdom. As a result, the reliance on these resources for interpreting political information amplifies the credibility of specific news stories.

The article is structured as follows. First, I discuss research on propaganda in autocracies and credibility heuristics. A methodological section then outlines data collection, coding, and analysis. Then I present both qualitative and quantitative analysis of how Russian TV viewers use heuristics to evaluate the credibility of regime propaganda regarding Ukraine in 2016–17. The conclusion discusses the implications of my study for research on propaganda, credibility, and heuristics in autocracies.

Authoritarian Propaganda and Credibility Heuristics

The rise of authoritarian regimes across the world has drawn increased scholarly attention to how media can shape political attitudes in autocracies. Scholars have generated a wealth of evidence on how different types of propagandistic content affect attitudes and social relations (e.g., Asmolov, Citation2018; Stockmann & Gallagher, Citation2011; Mattingly & Yao, Citation2022), how propaganda distributed through different types of media affects attitudes (e.g., Alyukov, Citation2021; Pearce & Kendzior, Citation2012), and the factors making regime messages resonate with the public (e.g., Greene & Robertson, Citation2017; Sharafutdinova, Citation2020; Shirikov, Citation2022). However, the body of research on propaganda remains largely disconnected from the core approaches to credibility in political communication.

The concept of a cognitive heuristic has been central in explaining how citizens evaluate the credibility of information in Western democracies. Heuristics are simple rules-of-thumb that help to reduce the complexity of information, infer missing information, and allow individuals to make up their minds on an issue despite limited engagement. They originate from the mismatch between the overwhelming flow of information that individuals routinely encounter and the limited capabilities of the human mind (Fiske & Taylor, Citation2017). To “tame the information tide” (Graber, Citation1988), most citizens attend only to selected messages, focus on familiar cues, and remember only general meanings (Graber, Citation2001). Stereotypical representations of the world, or schemata, and rapidly executed decision-making rules based on schemata, or heuristics, play a key role in this process. They help individuals to build new information into preexisting knowledge and quickly make up their minds on political issues.

Several factors can influence the formation and use of heuristics. 1) Domain type. Heuristics are domain-specific. While the deservingness heuristic may help individuals evaluate welfare policies by reducing the complexity of political processes to the moral character of the recipient (Petersen, Citation2015), it is unlikely to be used to evaluate news credibility. 2) Usefulness. A heuristic is used when it is useful and helps one to predict changes in the environment (Gigerenzer & Todd, Citation1999). For instance, party agenda is used as a heuristic to form opinions in more stable party systems, but it is less relevant in less stable party systems (Brader & Tucker, Citation2012). 3) Issue salience. Individuals are more likely to rely on heuristics rather than engage in in-depth information processing when they perceive the issue as less important (Ciuk & Yost, Citation2016). 4) Political awareness. Less politically aware individuals are more likely to rely on heuristics (Kam, Citation2005). While some heuristics are learned from the environment and can be abandoned quickly (Gigerenzer & Todd, Citation1999), others are more resilient as they are learned during formative years (Graber, Citation2001) or even rooted in evolutionary adaptations (Petersen, Citation2015).

Heuristics provide easy-to-use guidelines for processing new information, recall, and forming judgments in saturated media and political environments (Metzger et al., Citation2010). Instead of thoroughly evaluating information, citizens routinely rely on the reputation of media organizations (Hovland & Weiss, Citation1951), ideological identifications of sources (Baum & Gussin, Citation2008), recency of information (Sundar et al., Citation2007), and endorsements of others (Son et al., Citation2020). Exploiting these features of the environment, they make quick inferences about credibility of information. Heuristics are constructed from the information received from the environment. The number of channels through which an individual can experience the environment is limited to direct personal experience, the media, or other people. Gamson (Citation1992) defines these three sources of information as cultural resources. Trying to make sense of an issue, people draw on something they already know – media discourse, situations they experienced firsthand (experiential knowledge), or shared cultural knowledge that consists of widely used analogies, cliches, and logic (popular wisdom).

Putin’s regime propaganda regarding Ukraine in 2016–17 provides a perfect case for analyzing credibility heuristics. When an issue is not salient, citizens are more likely to rely on heuristics (Ciuk & Yost, Citation2016). The Donbas war had generated an emotional response in 2014, but the continuous stream of information about Ukraine in the media soon led to disengagement (Alyukov, Citation2022a). Even citizens’ enthusiastic reaction to the annexation of Crimea, which was driven by strong emotional resonance (e.g., Greene & Robertson, Citation2022), was, to a significant degree, based on dissembling (Hale, Citation2022). Despite the dominance of stories about Ukraine in state media, the war was not a salient topic for the Russian public several years after it began. After 2015, two-thirds of Russians did not pay much attention to news about Ukraine (Levada Center, Citation2017), and citizens’ attitudes toward the conflict could be characterized as “non-attitudes” (Converse, Citation1970).

In the context of low media trust and conflicting ideological narratives, audiences use heuristics based on these resources in a compensatory way. For instance, Mickiewicz (Citation2005, Citation2008) finds that citizens rely on personal experience as a heuristic to compensate for the inability to rely on the authority of media organizations in Russia. Similarly, Szostek (Citation2018b) and Alyukov (Citation2021) find that in a context of low trust and conflicting media narratives, citizens in Russia and Ukraine rely on popular wisdom, including the persuasive intent and consistency heuristics. Likewise, Vihalemm and Juzefovičs (Citation2022) find that Russian-speaking audiences in the Baltic states exhibit a greater level of distrust toward media organizations in the context of conflicting narratives in the Western and Russian media. Not being able to rely on the reputation of sources, they compensate for this lack by using personal experience or cultural norms as heuristics.

When information provided by media organizations does not appear trustworthy, it is logical to seek alternative sources. However, there is disagreement over whether the use of personal experience and popular wisdom to evaluate credibility actually makes citizens more critical news consumers. Some scholars argue that available personal experience and popular wisdom allow citizens to successfully identify bias (Mickiewicz, Citation2005, Citation2008). However, other scholars demonstrate that media distrust can co-exist with the reproduction of the regime’s narratives. One explanation for this paradox is the superior reach of regime propaganda traveling across different media (Alyukov, Citation2021). Regime narratives shape online “eyewitness” discourse which is perceived as more credible than state media (Szostek, Citation2018a). Another explanation is the difference between reported attitudes toward the media and actual perceptions and practices of verification. Skepticism toward the media can be seen as a socially desirable trait making citizens report skeptical attitudes rather than perform critical analysis (Flanagin & Metzger, Citation2007).

Hypotheses

The unclear role of personal experience and popular wisdom in evaluating political information in authoritarian environments constitutes an important research puzzle. To explore how personal experience and popular wisdom affect perceived credibility of propaganda, this study looks at the relationships between these resources and credibility heuristics. Based on prior research, I expect three heuristics to be relevant for the study.

  1. The authority of state television is an important cue which increases perceived credibility of information in Russia (Shirikov, Citation2022). Hence, I expect the authority heuristic to play an important role in citizens’ evaluation of credibility. The authority heuristic is based on the idea that information is credible because the source of content is an expert or official authority (Sundar, Citation2008). It is a rule of thumb which dictates: media organizations controlled by the state or professional journalists are more likely to provide credible information.

  2. Due to the totalitarian legacies and intensive intra-elite struggle in the 1990s, politics in Russia is associated with hypocrisy, corruption, and violence, as opposed to “the honest, sincere individual life, dominated by personal relationships, the pursuit of prosperity and/or a career, and self-distancing from the political realm” (Zhuravlev et al., Citation2019, p. 166). Russian state media have been relying on this cultural apoliticism and framing the conflict as a confrontation between the Ukrainian government, driven by political calculations, and ordinary citizens in Eastern Ukraine, driven by apolitical and authentic needs. Hence, I expect the authenticity heuristic to play an important role in citizens’ evaluation of credibility. It is a rule of thumb which dictates: If a message does not involve political ideas or politicians, it does not manipulate the viewer and should be trusted.

  3. Russians are often aware of media bias and look for signs of manipulative intent (e.g., Mickiewicz, Citation2008; Szostek, Citation2018b). Hence, I expect the persuasive intent heuristic to play an important role in citizens’ evaluation of credibility. The persuasive intent heuristic is based on the idea that “there is some sort of manipulation or ulterior motive on the part of the information provider” (Metzger et al., Citation2010, p. 432). It is a simple rule of thumb which dictates: if there are signs suggesting that the story attempts to change the viewer’s opinion, the message should not be trusted.

I consider the perception of a news story as credible or untrustworthy as coinciding with the use of a heuristic. Some heuristics lead to the perception of information as credible (authority, authenticity), while others lead to the perception of information as untrustworthy (persuasive intent). My central theoretical assumption is that heuristics are rules-of-thumb which are stored in memory, retrieved, and applied to information to evaluate its credibility. However, their retrieval and application depend on prior knowledge. This prior knowledge can be acquired via personal interactions with the environment (personal experience) or derived from shared culture that consists of widely used analogies, cliches, and logic (popular wisdom). Relevant personal experience and popular wisdom can help an individual to grasp the message and decide which heuristic to use.

I assume that the discrepancy between personal experience with Ukraine and the way events in Ukraine are portrayed by propaganda undermines credibility of propaganda. Hence, personal experience will make the authority heuristic less likely to be used (the source does not represent reality accurately despite its authority), the authenticity heuristic less likely to be used (the source does not represent reality accurately despite seemingly authentic apolitical nature of information), and the persuasive intent heuristic more likely to be used (the mismatch between experience and reality suggests that the source attempts to manipulate the viewer):

H1a. The use of personal experience is negatively associated with the authority heuristic.

H1b. The use of personal experience is negatively associated with the authenticity heuristic.

H1c. The use of personal experience is positively associated with the persuasive intent heuristic.

I assume that popular wisdom undermines the credibility of propaganda. As media distrust deeply permeates culture in Russia (e.g., Mickiewicz, Citation2008), popular wisdom contains a multitude of cliches suggesting that any information in the media is a form of manipulation. Hence, popular wisdom will make the authority heuristic less likely to be used, the authenticity heuristic less likely to be used, and the persuasive intent heuristic more likely to be used (there is no relevant personal experience, but everybody knows that media are manipulative):

H2a. The use of popular wisdom is negatively associated with the authority heuristic.

H2b. The use of popular wisdom is negatively associated with the authenticity heuristic.

H2c. The use of popular wisdom is positively associated with the persuasive intent heuristic.

Research Design

This study was based on eight focus groups which exposed all participants to the same sequence of news reports.Footnote1 Four focus groups were conducted in St. Petersburg in 2016 and four in Moscow in 2017. The news reports were selected from Channel One – the most watched Russian TV channel (Volkov et al., Citation2021). I chose stories representing the key ideas underlying the Putin regime’s narrative on Ukraine, such as the portrayal of the overthrow of the pro-Russian government in Ukraine as organized by the US and the framing of the Donbas war as a confrontation between ordinary pro-Russian citizens and an illegitimate Ukrainian government (Lankina & Watanabe, Citation2017). The following broadcasts were used during the focus groups: 1) a report about the police crackdown on Euromaidan protesters in Kyiv in November 2013 (30/11/2013), 2) a report about the Russia-backed referendum in Donetsk in May 2014 (5/5/2014), and 3) a report about clashes between pro-Russian forces and the Ukrainian National Guard in Eastern Ukraine in June 2014 (1/06/2014). The first report can be found in Appendix Section 1.

These broadcasts highlight the key elements of the regime propaganda regarding Ukraine. Similar rhetorical tropes have continued to underpin regime propaganda up until the full-scale invasion in 2022 (e.g., Alyukov, Citation2022b). Each focus group had three rounds. After watching each broadcast, participants were asked about their opinions on the story. Participants were also asked to reflect on Russian politics and media in general as well as their media consumption. The focus group scenario can be found in Appendix Section 2. Before the start of the focus groups, participants were also asked to fill out short questionnaires about their media consumption and political knowledge. The questionnaire can be found in Appendix Sections 3 and 4. A total of 56 participants took part in the study. Participants were recruited by the pollster Public Opinion Foundation. Groups had seven to eight participants each. Each focus group’s length was around two and a half hours. Participants received compensation for participation. All discussions were moderated by the author. All participants provided informed consent. The data was anonymized.

Fear and self-censorship represent a potential challenge to the validity of data in a repressive autocracy. However, the data was collected in 2016–17, a moment when Putin’s regime had not yet reached its full repressive potential. For instance, list experiments – a technique designed to identify untruthful responses – suggest that a significant share of Russians falsified their preferences in response to sensitive political questions after the 2022 invasion (Chapkovski & Schaub, Citation2022), but did not do so in 2015 (Frye et al., Citation2017). In addition, while preference falsification is a significant challenge for surveys, it is not as challenging for qualitative research. While participants can omit some information, in-depth interviews and focus groups last from one to several hours. As participants make hundreds of statements, the researcher can determine a participant’s political views based on the overall narrative even if some information is omitted.

Participants were sorted into groups to minimize potential discomfort. As pollsters do not provide data on respondents’ political views, groups were based on age. As age is a strong predictor of political views in Russia (e.g., Greene & Robertson, Citation2017), people of similar age are likely to have similar views. The mean age of groups varied from 22.6 (the youngest group) to 51.8 (the oldest group). The number of statements made by participants is indirect evidence suggesting that participants felt comfortable with the discussed issues (Lunt & Livingstone, Citation1996). Participants were quite active – the mean and median number of statements made by a participant were 35 and 30.6, respectively. As participants discussed the news stories with interest and at length, it is unlikely that self-censorship was a significant factor undermining validity. Still, it is not possible to eliminate the possibility of self-censorship fully. This, however, equally applies to all social science research in autocracies. Even list experiments designed to tackle the issue of self-censorship cannot eliminate preference falsification fully (Frye et al., Citation2022).

Participants’ characteristics can be found in Table 1. The sample is balanced in terms of age and gender, but slightly skewed toward people with higher education. Most participants watch television news daily or weekly, but half report receiving news from online media and social media too.

Table 1. Participants’ characteristics and media use.

While this study explicates the mechanisms underlying credibility evaluation rather than making inferences about the general population, this media consumption breakdown is fairly similar to nationwide trends (see Appendix Section 5).

While researchers most often opt for reading focus group transcripts and reporting recurring themes in the data (e.g., Mickiewicz, Citation2008), some researchers conduct quantitative analysis on their transcripts (e.g., Gamson, Citation1992; Graber, Citation2001; Just et al., Citation1996; Perrin, Citation2005). When used together, these two approaches can complement each other. While in-depth qualitative reading can give insights into how participants perceive credibility, quantitative analysis can help to trace the co-occurrence of discursive elements. This article is based on both a qualitative and quantitative analysis of the transcripts.

Focus groups were recorded, converted into text, and coded. ATLAS.ti was used for qualitative coding. Depending on research questions, scholars opt for various units of coding, such as conversational exchange (Gamson, Citation1992; Just et al., Citation1996) or conversational turn (Graber, Citation2001; Perrin, Citation2005). As the study focuses on individual judgment, I opted for conversational turns (CTs) or single uninterrupted individual statements as a unit of coding (Sacks et al., Citation1974). At the first stage, qualitative thematic analysis was used to code each CT for the absence or presence of specific categories. Following Boyatzis’s (1988) approach to deductive thematic analysis, I developed a code sheet for each code.

Heuristics in participants’ discourse were operationalized as simple rules-of-thumb used to infer credibility. Heuristics co-occurred with different resources. Resources in participants’ discourse were operationalized as references to specific knowledge external to the discussed story, but incorporated in the discussion to build an argument. Following the constructionist model of political communication (e.g., Gamson, Citation1992; Just et al., Citation1996; Neuman et al., Citation1992), I coded each instance of a heuristic for the presence or absence of references to personal experience or popular wisdom. Each category included specific domains – for example, references to knowledge acquired through travel, personal experience with the economy, or facts learnt through close others were all coded as personal experience. In addition, each statement was coded for political orientation in terms of whether it was supportive of the regime, critical of the regime, or critical of both the regime and the opposition. Examples of my coding sheets and codebooks can be found in Appendix Section 6. Two researchers coded the focus group data independently. To account for chance agreement, Krippendorff’s alpha (αK) was used to assess reliability. Variables with αK reaching .8 are usually considered reliable in established areas (Riffe et al., Citation2019), but coefficients as low as .667 can be acceptable for drawing tentative conclusions in quantitative content analysis (Krippendorff, Citation2004). The results are quite low due to the highly unstructured nature of the data, but are above or close to accepted standards: αK(mean) = .68; αK(heuristics) = .69; αK(resources) = .71; αK(orientations) = .64. Tetrachoric correlations were used to assess consistency. Correlations between elements are negative or weak, suggesting that categories do not overlap too much and do not code for the same pattern (rtet(heuristics) = −0.03 to 0.04; rtet(resources) = 0.3; rtet(orientations) = −0.35 to −0.04) (see Appendix Section 7).
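
To make the reliability procedure concrete, the sketch below shows how such checks are commonly computed in R, the environment used for the quantitative analysis. It is a minimal illustration rather than the study’s actual script: the data frame ct_codes and its column names are hypothetical, and it assumes each CT carries dichotomous codes assigned independently by two coders.

```r
# Minimal, illustrative R sketch of the reliability checks described above.
# `ct_codes` is a hypothetical data frame with one row per conversational turn
# (CT) and 0/1 codes assigned independently by two coders.
library(irr)    # kripp.alpha()
library(psych)  # tetrachoric()

# Krippendorff's alpha for one category (e.g., the persuasive intent heuristic).
# kripp.alpha() expects a coders-by-units matrix.
ratings <- rbind(ct_codes$persuasive_coder1, ct_codes$persuasive_coder2)
kripp.alpha(ratings, method = "nominal")

# Tetrachoric correlations between dichotomous categories, used to check that
# codes do not capture the same underlying pattern.
tetrachoric(ct_codes[, c("authority", "authenticity", "persuasive_intent")])$rho
```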

Based on the qualitative thematic analysis, I performed a quantitative analysis. The results of coding were exported into R. The resulting data set includes 2,192 CTs as observations. All variables are dichotomous: each CT takes the value of 1 or 0 for each category. While this article relies on statistical analysis, it retains the limitations of qualitative research. Focus groups are recruited based on a quota sample and might not be representative of the general population. Nor can they be used to infer causal relationships in the way experiments can.

Data Analysis

Heuristics and Resources

As predicted, the authority and authenticity heuristics were frequently used to evaluate messages as credible, while the persuasive intent heuristic was used to challenge credibility. The examples below demonstrate how these heuristics were used by participants.

  1. The authority heuristic emerged in conversations in response to cues read by participants as signs of credibility, particularly association with the government and signs of journalistic professionalism. For instance, a participant explains that “You believe television because these are state channels […]. The government is supposed to be responsible for the credibility of information on state channels.” Similarly, participants associated visible attributes of journalistic professionalism with credibility. For instance, a participant argues that “The way they [journalists] speak [suggests that] they are professionals. [It invokes] trust.”

  2. The authenticity heuristic emerged in conversations in response to cues read by participants as signs of credibility, such as the presence of ordinary people who are not in a position of power or violence which is difficult to stage. For instance, a participant argues that people in news reports about protests in Donbas look credible because “they lack any ideological motives.” Another participant argues that “It [the news] seems credible because you are on the same level with these people [protesters in Donbas].” When the moderator asks the participant to clarify what “on the same level” means, the participant responds by saying that “they are not political figures. They are [ordinary] people just like us.”

  3. The persuasive intent heuristic emerged in conversations in response to cues read by participants as signs of manipulation, such as repetition of the same information or presence of loaded labels. For instance, a participant argues that “They [journalists] added dramatic music [to the news episode about the Donbas war] and heavily edited it. It seems like they deliberately want to affect [the viewer].” Another participant explains: “I see constant repetition of the same phrases which they [the government] use to change my opinion. When I see that they are repeating, I don’t believe this information.”

CTs were also coded for references to two resources: personal experience and popular wisdom. Following Gamson (Citation1992), these resources were operationalized as references embedded in participants’ judgments about the news reports screened during discussions. Personal experience included references to personal experience with travel, the economy, and indirect personal experience – participants’ references to the opinions of close others. While information acquired from others is not first-hand experience stricto sensu, it is a type of indirect personal experience alternative to the media. For instance, a participant refers to her own experience with the economy to interpret a news story: “I thought it was a credible story. Because we have a problem with salaries – I am paid very little despite [the fact that the country has rich] natural resources (R3).” Popular wisdom included references to widely shared cliches, logical rules, and analogies, as well as the shared Soviet and post-Soviet past. For instance, a participant relies on logic to argue that the news story about military engagements in Eastern Ukraine is untrustworthy. The conversation between Ukrainian pilots over the radio, presented by Russian TV as evidence of a deliberate attack on civilians, “sounds too banal. I am not convinced that this is a [real] conversation between pilots.” Another participant refers to the Soviet past: “We all lived during the Soviet Union. Are there any people who believe what they are told [on TV] one hundred percent?”

Constructing Credibility Judgments

Participants’ speech varied in terms of heuristics and resources used. Figure 1 represents the proportion of each code to the total number of CTs. As many CTs do not contain a heuristic or a resource, the proportions do not add up to 1. The most common heuristic is persuasive intent, followed by authenticity and authority. The most common resource is popular wisdom, followed by personal experience.

Figure 1. Frequencies of heuristics and resources.

While these frequencies represent participants’ discourse as a whole, they do not tell us much about the relationships between resources and heuristics. Table 2 presents logistic regression models estimating the probability that a CT contains a heuristic (outcome) given the presence of a resource in the CT (predictor).

Table 2. Resources and heuristics.Footnote2
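
As an illustration of how such CT-level models can be specified (see Note 2 for the full set of controls and the clustering of standard errors), the sketch below fits one model in R with cluster-robust standard errors at the speaker level. The data frame cts and its variable names are assumptions introduced for illustration, not the study’s actual code.

```r
# Illustrative R sketch of one CT-level logistic model: the outcome is a
# heuristic, the predictors are resources plus the controls listed in Note 2.
# All object and variable names are hypothetical.
library(sandwich)  # vcovCL(): cluster-robust variance estimator
library(lmtest)    # coeftest(): coefficient tests with a supplied vcov

m_authority <- glm(
  authority ~ personal_experience + popular_wisdom +
    gender + education + political_knowledge + omu + n_cts,
  family = binomial, data = cts
)

# CTs within a participant are highly correlated, so standard errors are
# clustered at the level of the individual speaker.
coeftest(m_authority, vcov. = vcovCL(m_authority, cluster = cts$speaker_id))
```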

The reliance on personal experience indeed undermines the credibility of propaganda by making the authority heuristic less likely to be used, supporting H1a. Interestingly, personal experience with the economy and information from others undermine the use of the authority heuristic, while the effect of the experience of traveling to Ukraine is not significant. The reliance on popular wisdom also undermines the credibility of propaganda by making the authority heuristic less likely to be used, supporting H2a. References to cliches and logic are weak but significant predictors.

The negative effect of personal experience on the authority heuristic is based on the discrepancy between personal experience and the reality presented in state-controlled media. For instance, one participant specifically refers to “federal channels” and criticizes the credibility of state media: “I don’t watch the news (…) I don’t want to brainwash myself.” Instead, he mostly receives information from “relatives in Donetsk and Luhansk [cities in the Russia-controlled part of Eastern Ukraine] (R31).” Another participant compares the information received from state television and his experience with the economy: “I’m not sure about censorship, but they [TV channels] surely do not provide accurate information. It’s just ridiculous – they talk about unemployment. They say that 2–3% are unemployed in Russia. What? 25% (R32).” Similarly, the negative effect of popular wisdom on the authority heuristic is based on media skepticism deeply permeating popular wisdom in Russia. For instance, a participant refers to the cliché “the truth is always in the middle” to question the credibility of state media. He explains that “I am inclined to believe something in the middle. I can’t trust [state] channels.”

Surprisingly, while reliance on personal experience undermines the credibility of state media in general by making the authority heuristic less likely to be used, it reinforces the credibility of specific messages by making the authenticity heuristic more likely to be used. Thus, H1b is not supported. All three types of personal experience contribute to the authenticity heuristic, with travel to Ukraine being the strongest predictor, followed by personal experience with the economy and information acquired from others. The positive effect of personal experience on the authenticity heuristic is based on resonance between the apolitical, authentic framing of events in state media and participants’ personal experience. Regime propaganda often frames the conflict as a confrontation between the Ukrainian government, driven by political calculations, and ordinary citizens in Eastern Ukraine, driven by apolitical and authentic needs. This framing makes characters in stories look more trustworthy because participants find them similar to themselves. For instance, a participant finds the news episode about protests in Donbas credible because his own personal experience with the economy resonates with the characters’ nonpolitical concerns as opposed to political protests in Kyiv: “It [the news report featuring Donbas citizens protesting against the Kyiv government] is about improving social guarantees. Their demand is clear and understandable for everyone – for us and for you. We all want to earn more money for our work. It is not political [as opposed to the political nature of the Maidan protests in Kyiv shown earlier] (R35).” Another participant argues that “It [the news] seems credible because you are on the same level with these people [protesting citizens of Donbas].” After being asked to clarify the phrase “on the same level,” the participant explains: “Well, they are not political figures. They are [ordinary] people just like us (R8).”

References to popular wisdom also make the authenticity heuristic more likely to be used. Thus, H2b is not supported. However, this effect is a combination of a strong positive effect of references to logic and a weaker negative effect of references to the Soviet past. For instance, a participant uses logic and the idea that real emotions are difficult to fabricate to infer credibility: “Maybe it’s good acting, but it [the story] shows emotions of real people. It looks like they are sincere and act of their own accord (R19).”

Finally, somewhat counter-intuitively, the models suggest that personal experience makes the persuasive intent heuristic – the main rule of thumb used by participants to question the credibility of regime propaganda – less likely to be used. Thus, H1c is not supported. Rather, the persuasive intent heuristic is associated with popular wisdom – primarily with logic – thus supporting H2c. This, however, is not surprising given that the persuasive intent heuristic is rather abstract in nature. It is based on the use of logic to identify formal features of the message which suggest the presence of manipulative intent. For instance, a participant questions the credibility of state-controlled Channel One because “they like to criticize one side and whitewash the other side.” There is no personal experience involved – it is based on a general logical inference that “it does not work like this in reality (R30).”

Beyond Credibility: The Role of Political Alignment

Another way to explore the relationships between resources and heuristics is to look at which specific heuristics and resources are preferred by different audiences. To identify the preferred heuristics and resources of audiences with different political alignments, I coded CTs for political orientations. Based on the dominant political orientation of each participant, participants were divided into three audiences. I then both disaggregated the use of heuristics and resources by audience and modeled the relationships between political orientations, heuristics, and resources.

Political orientation is understood as an ideological aspect of the CT. Three political orientations were identified. 1) A pro-regime orientation: statements with a pro-regime political orientation sympathize with the actions of the Russian government. For instance, a participant sympathizes with Putin and criticizes Ukraine’s policies: “I want some kind of a resolution [of the war]. I want a good leader [for Ukraine] – just like our Putin […] But they [Ukrainians] turned it into something absurd – they blame Russia for everything (R42).” 2) An anti-regime orientation: statements with an anti-regime political orientation criticize Putin’s regime as an illegitimate authoritarian government based on violence and propaganda. For instance, a participant criticizes the propagandistic nature of state television: “[TV channels] represent everything in a black-and-white way. If you compare it with alternative sources, their evaluations are not objective. You don’t trust them because you don’t want to brainwash yourself (R30).” 3) An apathetic orientation: statements with an apathetic orientation are critical of both the government and the opposition, portray any media as manipulative, politics as difficult to relate to everyday life, and ordinary citizens as unable to understand political processes. For instance, a participant discusses Donbas citizens as victims of both Ukraine and Russia and complains about her inability to comprehend political processes: “These people are just used for propaganda. Nobody is interested in their interests. Neither Russia, although they are presented in the positive way, nor Ukrainian authorities, nor local authorities (…) It does not make any sense to discuss it. What we see is like the tip of the iceberg. Why [it happened], the relationships between politicians and countries – it is all in some other universe [inaccessible for ordinary people] (R1).”

Participants’ speech varied in terms of political orientation. Figure 2 represents the proportion of CTs with a specific orientation to the total number of CTs. As many CTs do not have a clear political orientation, the proportions do not add up to 1. Most CTs are apathetic, followed by pro-regime and anti-regime CTs.

Figure 2. Proportions of political orientations.

Scholars argue that political apathy formed by an authoritarian environment plays a crucial role in shaping the attitudes of the Russian public (e.g., Shields, Citation2021). In line with these findings, most CTs were apathetic. Most participants did not believe that they could comprehend political processes, complained about the manipulative nature of state and oppositional media, and perceived the Donbas war as a distant, arcane realm which was difficult to relate to their everyday life. A participant was assigned to an audience if the proportion of an orientation in her discourse was larger than the proportion of the two other orientations. An example of this assignment process and the characteristics of each audience can be found in Appendix Sections 8 and 9. The apathetic audience is the largest (26 participants), followed by the pro-regime audience (16); the anti-regime audience is the smallest (5).
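
This assignment rule can be illustrated with a short sketch, assuming a hypothetical CT-level data frame cts with 0/1 orientation codes and a speaker identifier; for each participant it simply selects the orientation with the largest share of her CTs.

```r
# Illustrative sketch of the audience assignment rule (hypothetical names).
# For each speaker, compute the share of CTs carrying each orientation and
# assign the audience whose share exceeds those of the other two orientations.
shares <- aggregate(cbind(pro_regime, anti_regime, apathetic) ~ speaker_id,
                    data = cts, FUN = mean)
shares$audience <- c("pro-regime", "anti-regime", "apathetic")[
  apply(shares[, c("pro_regime", "anti_regime", "apathetic")], 1, which.max)
]
# Ties between shares would need manual resolution; which.max() simply returns
# the first maximum.
```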

How do political orientations mediate the use of resources to apply heuristics? Figure 3 represents the preferred heuristics and resources disaggregated by audience type. The shares are calculated relative to the number of CTs made by an audience to compensate for differences in audience size and activity.

Figure 3. Heuristics and resources disaggregated by audience.

Table 3 presents models estimating the probability that a CT contains a heuristic or a resource (outcome) depending on the political orientation of the CT (predictor), controlling for a number of variables.

Table 3. Orientations, heuristics, resources.Footnote3

The analysis suggests that audiences use the same resources to apply heuristics differently depending on their political orientation. The pro-regime audience’s preferred heuristic is the authenticity heuristic followed by the authority heuristic. The anti-regime and apathetic audiences’ preferred heuristic is the persuasive intent heuristic. The pro-regime and anti-regime audiences’ preferred resource is popular wisdom followed by personal experience. The apathetic audience is the only group relying on personal experience more than other groups. The analysis allows me to further explicate how the reliance on personal experience and popular wisdom can reinforce the credibility of regime propaganda.

While the pro-regime audience relies on personal experience less than other audiences, the apathetic audience relies on personal experience more than other audiences. As the authenticity heuristic is based on the idea that one should trust one’s own personal experience rather than manipulative media, it is reasonable to expect the apathetic audience to use the authenticity heuristic. However, it is a preferred heuristic of the pro-regime audience. This happens because the pro-regime and apathetic audiences use personal experience selectively depending on political orientation. For instance, a pro-regime participant refers to her personal experience of traveling to Crimea: “I visited it [Crimea]. People there really run after Russian passports. They were dissatisfied with many things – prices, laws, politics.” She then compares her memories about people in Crimea to the news story about protests in Donbas, uses the authenticity heuristic, and finds the story credible because, just like in Crimea, people presented in the story are “ordinary people” driven by apolitical motives: “But they are ordinary people rather than politicians. Simple physical safety was a priority for them (R1).” To the contrary, an apathetic participant uses personal experience in a completely different way. The participant refers to her experience of travel and life in Ukraine: “I myself am from Kharkiv. 17 years in Kharkiv, 40 years in Moscow. But I visit my relatives [in Ukraine] often and such […] I was shocked when I saw [how the Donbas war started in 2014]. The [Russian] media says one thing, people [in Ukraine] say another thing. I don’t know what the truth is, and what propaganda is anymore (R40).” Instead of using personal experience to construct the authenticity heuristic, this participant refrains from judgment. As she is politically disengaged, alternative perspectives accessed through her relatives confuse her and do not allow her to convert her experience into a heuristic.

Similarly, all audiences used popular wisdom. However, the selective use of the same elements of popular wisdom could lead to different heuristics depending on political alignment. For instance, two participants focus on the same element of the same story: the emotions of ordinary people surviving shelling in Donbas. Following logic, an anti-regime participant identifies emotions as a sign of manipulation: “It’s too emotional (…) It has it all: emotional distress, facial expressions, trembling face (…) It’s made to make people cry (…) It makes me feel that they attempt to influence my opinion (R16).” In this context, emotions are seen as a sign of persuasive intent: if emotions are present in the story, the story is designed to influence the viewer’s opinion. Relying on logic, a pro-regime participant comes to a different conclusion. He identifies emotions as a sign of authenticity and concludes that the story is trustworthy: “They really show emotions of real people. It feels like they are sincere. They act of their own conviction rather than being controlled (R4).” In this context, emotions are seen as a sign of authenticity. If emotions are present in the story, people in the story act independently rather than being used to affect the viewer’s opinion.

Conclusion

How do citizens evaluate the credibility of regime communication in an authoritarian environment? Does reliance on personal experience and popular wisdom help them to evaluate regime propaganda critically? My findings offer some tentative answers to these questions. Participants exhibited a high level of distrust of regime propaganda and attempted to rely on personal experience and popular wisdom as alternative sources of frames for political thinking. However, this did not have a uniform effect on the perceived credibility of regime propaganda. Personal experience and popular wisdom made participants less likely to rely on the authority heuristic, but more likely to rely on the authenticity heuristic. The use of these resources undermined the credibility of propaganda in general, but reinforced the credibility of specific stories. In addition, participants used these resources selectively depending on their political alignment. Since the apathetic audience relied on personal experience more than other audiences, it would have been reasonable to expect it to rely on the authenticity heuristic, which is based on the idea that one should trust one’s own personal experience rather than manipulative media. However, this heuristic was used primarily by the pro-regime audience, further confirming that regime propaganda capitalizes on media distrust. Similarly, while popular wisdom led pro-regime participants to use the authenticity heuristic and evaluate regime propaganda as credible, it led anti-regime and apathetic participants to use the persuasive intent heuristic and evaluate regime propaganda as untrustworthy.

These findings question the idea that personal experience serves as an alternative source of information which can help citizens to resist frames imposed by the media (e.g., Gamson, Citation1992; Mickiewicz, Citation2008). While the discrepancy between personal experience with events and the way these events are portrayed by propaganda undermines credibility of state media in general, my findings demonstrate that personal experience is not a solid and unambiguous entity which can be compared to the content of propaganda to discard it. The way it is activated depends on the post factum use of schemas or “mental maps, which serve as the general guidelines for reconstructing specific types of events” (Graber, Citation2001, Ch. 2, para. 1), which, in turn, depends on frames offered by the media. Similarly, popular wisdom is often perceived as a source of counter-establishment themes (e.g., Billig et al., Citation1988; Gamson, Citation1992; Neuman et al., Citation1992), especially in countries where media distrust deeply permeates the culture (Mickiewicz, Citation2008). However, my findings suggest that the use of popular wisdom has no uniform effect on the perceptions of the credibility of regime propaganda. Citizens are likely to select those elements of popular wisdom which confirm their political views and evaluate regime propaganda depending on their political alignment.

Moreover, a number of scholars have suggested that one of the strategies of the Russian state media is to instill distrust toward any form of media and political information (e.g., Alyukov, Citation2022b; Shields, Citation2021). My analysis lends further evidence to the efficacy of this strategy. Regime propaganda often frames the Donbas war as a confrontation between the political intentions of the Ukrainian government and the apolitical, authentic needs of ordinary citizens in Eastern Ukraine. This apolitical framing appeals to citizens’ personal experience, triggers the authenticity heuristic, and lends more credibility to regime propaganda. In an analysis of Russian citizens’ reception of the regime propaganda regarding Ukraine in 2014, Szostek (Citation2018a) finds that regime narratives resonated even with those who preferred alternative sources of information because state television “shapes ‘eyewitness’ discourse on the forums and blogs that certain individuals trust more than ‘official’ sources” (Szostek, Citation2018a). My findings demonstrate that the connection between media distrust and the effectiveness of regime propaganda can be more intimate. Paradoxically, regime propaganda can harness media distrust to amplify its credibility.

These results highlight several important patterns which are relevant for democratic contexts as well. The connection between low media trust and belief in untrustworthy information is not unique to authoritarian contexts. In democracies, the formation of partisan echo chambers is driven partly by low trust in mainstream media (e.g., Allcott & Gentzkow, Citation2017; Ognyanova et al., Citation2020). As citizens grow increasingly skeptical of mainstream media, they seek alternative sources of information which can contain untrustworthy information. My findings point to similar mechanisms, but in a different environment. Most of the research on echo chambers assumes that exposure to untrustworthy information is a bottom-up phenomenon driven by partisans’ desire for opinion reinforcement. Echo chambers tend to be formed by small but active and polarized minorities (Guess et al., Citation2020). The participants of this study demonstrate a similar connection between media distrust and belief in propaganda with one exception: unlike partisans in democracies, they do not have extremely narrow media diets and are not at the fringes of the political spectrum. As most of the mainstream political content in Russian state media is extremely biased and contains untrustworthy information, one could say that the Russian state media sphere is akin to an echo chamber at the fringes of a democracy, but scaled to the level of a national media system.

In addition, my findings point to the need to rethink some conventional assumptions about exposure to untrustworthy information. Critical attitudes toward the media are often seen as a cornerstone of a democratic citizenry. They allow citizens to identify bias and gain a more objective understanding of political processes (e.g., Toepfl, Citation2014). However, my findings suggest that media skepticism can backfire, amplifying the credibility of untrustworthy information. After the start of Russia’s full-scale invasion of Ukraine in 2022, scholars and journalists alike emphasized the importance of “puncturing Russia’s disinformation bubble” (Financial Times, Citation2022), proposing measures which range from delivering information about the war to the Russian public via targeted advertising to enabling international audiences to reach Russian citizens by phone. While it is important to circumvent the Kremlin’s information blockade, the effectiveness of these measures might be limited because they target citizens who are already deeply skeptical about the very possibility of trustworthy information.

Acknowledgments

I am thankful to Svetlana Erpyleva, Andrei Semenov, Joanna Szostek, Samuel Greene, Gregory Asmolov, participants and organizers of the Politics of Russia and Eurasia Series workshop (Florian Toepfl, Noah Buckley, Andrey Tkachenko, John Ora Reuter, Margarita Zavadskaya), two anonymous reviewers, and editors for providing insightful feedback on the article, to Gregory Asmolov for assistance with coding, and to Zachary Reyna for his editorial assistance.

Disclosure Statement

No potential conflict of interest was reported by the author.

Data Availability Statement

Focus group transcripts are not publicly available as they can potentially contain information that can compromise the privacy of research participants, but can be provided by the author upon reasonable request. Data treatments (news episodes) are available online at the Channel One website via the following links:

1) Euromaidan protests in Kyiv: https://www.1tv.ru/news/2013-11-30/54959-v_kieve_storonniki_kursa_na_evrointegratsiyu_vytesnennye_s_maydana_sobirayutsya_na_drugoy_ploschadi

2) The Russia-backed referendum in Donetsk: https://www.1tv.ru/news/2014-05-05/40994-v_donetskoy_oblasti_idyot_podgotovka_k_referendumu_kotoryy_naznachen_na_11_e_maya

3) The clashes in Eastern Ukraine: https://www.1tv.ru/news/2014-05-05/40994-v_donetskoy_oblasti_idyot_podgotovka_k_referendumu_kotoryy_naznachen_na_11_e_maya

Additional information

Notes on contributors

Maxim Alyukov

Maxim Alyukov (PhD, University of Helsinki, 2021) is a postdoctoral fellow at King’s Russia Institute, King’s College London, and a researcher with an independent research group Public Sociology Laboratory. His research focuses on media, political communication, and political cognition in autocracies with a particular focus on Russia.

Notes

1. Earlier articles rely on this data to explore two other aspects of news reception in an authoritarian environment: cross-media consumption (Alyukov, 2021) and the mechanisms underlying opinion formation (Alyukov, 2022a).

2. Logistic regression models. N: 7 Groups; 49 speakers; 2192 CTs. *p < .05 **p < .01 ***p < .001. Predictor: resources. Outcome: heuristics. Robust standard errors in parentheses. Controls: gender (dichotomous; reference level – female); education (ordinal; high school educated < college students < college educated); political knowledge (calculated as a sum of correct answers to political knowledge items); OMU (online media use index; various online sources are assigned values reflecting the degree of engagement they require: 5 (online newspaper), 4 (online news website), 3 (online channel), 2 (news aggregator), 1 (website of a TV channel). Online media use index is calculated as a sum of these values for each participant); N of CTs – a number of statements made by a participant. As CTs within each individual are highly correlated, standard errors are clustered at the level of the individual.
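
As a small illustration of the online media use index described in this note, the sketch below sums the stated weights over a hypothetical participant-by-source indicator matrix; the object and column names are assumptions introduced for illustration.

```r
# Illustrative sketch of the online media use (OMU) index from this note.
# `uses` is a hypothetical 0/1 matrix (participants x online source types).
omu_weights <- c(online_newspaper = 5, news_website = 4, online_channel = 3,
                 news_aggregator = 2, tv_channel_website = 1)
omu_index <- as.vector(uses[, names(omu_weights)] %*% omu_weights)
```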

3. Logistic regression models. N: 7 groups; 49 speakers; 2192 CTs. *p < .05 **p < .01 ***p < .001. Predictor: orientations. Outcomes: heuristics and resources. Robust standard errors in parentheses. Controls: see Note 2.
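
To make the model specification in Notes 2 and 3 concrete, the sketch below shows how the online media use (OMU) index and a logistic regression with speaker-clustered standard errors could be set up in Python with pandas and statsmodels. The variable names, weight keys, and synthetic data are illustrative assumptions, not the study’s actual data or code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Online media use (OMU) index, as described in Note 2.
# Hypothetical source labels; each source type carries an engagement weight.
OMU_WEIGHTS = {
    "online_newspaper": 5,
    "online_news_website": 4,
    "online_channel": 3,
    "news_aggregator": 2,
    "tv_channel_website": 1,
}

def omu_index(sources_used):
    """Sum the engagement weights of the online sources a participant reports using."""
    return sum(OMU_WEIGHTS[s] for s in sources_used)

# Synthetic CT-level data, for illustration only (the real data are not reproduced here).
rng = np.random.default_rng(0)
n_rows = 400
cts = pd.DataFrame({
    "speaker_id": rng.integers(0, 20, n_rows),          # 20 hypothetical speakers
    "resource_experience": rng.integers(0, 2, n_rows),  # 1 = CT draws on personal experience
    "female": rng.integers(0, 2, n_rows),
    "education": rng.integers(1, 4, n_rows),             # 1 = high school < 2 = student < 3 = college
    "pol_knowledge": rng.integers(0, 10, n_rows),
    "omu": rng.integers(1, 16, n_rows),                   # plausible range of OMU index values
})
cts["n_cts"] = cts.groupby("speaker_id")["speaker_id"].transform("size")
# Invented outcome: whether a CT applies a given heuristic with a distrust outcome.
p = 1 / (1 + np.exp(-(-0.5 + 0.8 * cts["resource_experience"])))
cts["heuristic_distrust"] = rng.binomial(1, p)

# Logistic regression with controls, clustering standard errors at the individual level.
model = smf.logit(
    "heuristic_distrust ~ resource_experience + female + education"
    " + pol_knowledge + omu + n_cts",
    data=cts,
).fit(cov_type="cluster", cov_kwds={"groups": cts["speaker_id"]})
print(model.summary())
print("OMU example:", omu_index(["online_newspaper", "news_aggregator"]))  # -> 7
```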

References

  • Alizada, N., Cole, R., Gastaldi, L., Grahn, S., Hellmeier, S., Kolvani, P., Lachapelle, J., Lührmann, A., Maerz, S. F., Pillai, S., & Lindberg, S. I. (2021). Autocratization turns viral. Democracy report 2021. V-Dem Institute.
  • Allcott, H., & Gentzkow, M. (2017). Social media and fake news in the 2016 election. Journal of Economic Perspectives, 31(2), 211–236. https://doi.org/10.1257/jep.31.2.211
  • Alyukov, M. (2021). News reception and authoritarian control in a hybrid media system: Russian TV viewers and the Russia-Ukraine conflict. Politics. Advance online publication. https://doi.org/10.1177/02633957211041440
  • Alyukov, M. (2022a). Making sense of the news in an authoritarian regime: Russian Television Viewers’ reception of the Russia–Ukraine conflict. Europe-Asia Studies, 74(3), 337–359. https://doi.org/10.1080/09668136.2021.2016633
  • Alyukov, M. (2022b). Propaganda, authoritarianism and Russia’s invasion of Ukraine. Nature Human Behaviour, 6(6), 763–765. https://doi.org/10.1038/s41562-022-01375-x
  • Asmolov, G. (2018). The disconnective power of disinformation campaigns. Journal of International Affairs, 71(1.5), 69–76.
  • Baum, M., & Gussin, P. (2008). In the eye of the beholder: How information shortcuts shape individual perceptions of bias in the media. Quarterly Journal of Political Science, 3(1), 1–31. https://doi.org/10.1561/100.00007010
  • Billig, M., Condor, S., Edwards, D., & Gane, M. (1988). Ideological dilemmas: A social psychology of everyday thinking. Sage Publications.
  • Brader, T., & Tucker, J. (2012). Following the party’s lead: Party cues, policy opinion, and the power of partisanship in three multiparty systems. Comparative Politics, 44(4), 403–420. https://doi.org/10.5129/001041512801283004
  • Chapkovski, P., & Schaub, M. (2022). Solid support or secret dissent? A list experiment on preference falsification during the Russian war against Ukraine. Research & Politics, 9(2). https://doi.org/10.1177/20531680221108328
  • Ciuk, D. J., & Yost, B. A. (2016). The effects of issue salience, elite influence, and policy content on public opinion. Political Communication, 33(2), 328–345. https://doi.org/10.1080/10584609.2015.1017629
  • Converse, P. E. (1970). Attitudes and non-attitudes: Continuation of a dialogue. In E. R. Tufte (Ed.), The quantitative analysis of social problems (pp. 168–189). Addison Wesley.
  • Financial Times. (2022). How to Puncture Russia’s disinformation bubble. March 13, 2022. https://www.ft.com/content/c29fab4f-d386-406d-a5c1-e6119c7a9177
  • Fiske, S., & Taylor, S. (2017). Social cognition: From brains to culture. Sage Publications.
  • Flanagin, A., & Metzger, M. (2007). The role of site features, user attributes, and information verification behaviors on the perceived credibility of web-based information. New Media Society, 9(2), 319–342. https://doi.org/10.1177/1461444807075015
  • Frye, T., Gehlbach, S., Marquardt, K. L., & Reuter, O. J. (2017). Is Putin’s popularity real? Post-Soviet Affairs, 33(1), 1–15. https://doi.org/10.1080/1060586X.2016.1144334
  • Frye, T., Gehlbach, S., Marquardt, K., & Reuter, O. J. (2022). Is Putin’s popularity (still) real? A cautionary note on using list experiments to measure popularity in authoritarian regimes. PONARS Eurasia Policy Memo No. 773, 1–10. https://doi.org/10.1080/1060586X.2023.2187195
  • Gamson, W. (1992). Talking politics. Cambridge University Press.
  • Gigerenzer, G., & Todd, P. M. (1999). Simple heuristics that make us smart. Oxford University Press.
  • Graber, D. (1988). Processing the news: How people tame the information tide. Guilford Publications.
  • Graber, D. (2001). Processing politics: Learning from television in the internet age. University of Chicago Press.
  • Greene, S., & Robertson, G. (2017). Agreeable authoritarians: Personality and politics in contemporary Russia. Comparative Political Studies, 50(13), 1802–1834. https://doi.org/10.1177/0010414016688005
  • Greene, S. A., & Robertson, G. (2022). Affect and autocracy: Emotions and attitudes in Russia after Crimea. Perspectives on Politics, 20(1), 38–52. https://doi.org/10.1017/S1537592720002339
  • Guess, A., Nyhan, B., & Reifler, J. (2020). Exposure to untrustworthy websites in the 2016 US election. Nature Human Behaviour, 4(5), 472–480. https://doi.org/10.1038/s41562-020-0833-x
  • Hale, H. E. (2022). Authoritarian rallying as reputational cascade? Evidence from Putin’s popularity surge after Crimea. The American Political Science Review, 116(2), 580–594. https://doi.org/10.1017/S0003055421001052
  • Hopkins, V. (2022). Ukrainians find that relatives in Russia don’t believe it’s a war. New York Times. March 6. https://www.nytimes.com/2022/03/06/world/europe/ukraine-russia-families.html
  • Hovland, C., & Weiss, W. (1951). The influence of source credibility on communication effectiveness. Public Opinion Quarterly, 15(4), 635–650. https://doi.org/10.1086/266350
  • Just, M., Crigler, A., Alger, D., Cook, T., Kern, M., & West, D. (1996). Crosstalk: Citizens, candidates, and the media in a presidential campaign. University of Chicago Press.
  • Kam, C. D. (2005). Who toes the party line? Cues, values, and individual differences. Political Behavior, 27(2), 163–182. https://doi.org/10.1007/s11109-005-1764-y
  • Krippendorff, K. (2004). Content analysis. An introduction to its methodology (2nd ed.). Sage.
  • Lankina, T., & Watanabe, K. (2017). ‘Russian spring’ or ‘spring betrayal’? The media as a mirror of Putin’s evolving strategy in Ukraine. Europe-Asia Studies, 69(10), 1526–1556. https://doi.org/10.1080/09668136.2017.1397603
  • Levada Center. (2017). Rossiysko-ukrainskie otnosheniya [Russian-Ukrainian relations]. Levada Center. https://www.levada.ru/2017/06/23/rossijsko-ukrainskie-otnosheniya/
  • Lunt, P., & Livingstone, S. (1996). Rethinking the focus group in media and communications research. Journal of Communication, 46(2), 79–98. https://doi.org/10.1111/j.1460-2466.1996.tb01475.x
  • Mattingly, D., & Yao, E. (2022). How soft propaganda persuades. Comparative Political Studies, 55(9), 1569–1594. https://doi.org/10.1177/00104140211047403
  • Metzger, M. J., Flanagin, A. J., & Medders, R. B. (2010). Social and heuristic approaches to credibility evaluation online. Journal of Communication, 60(3), 413–439. https://doi.org/10.1111/j.1460-2466.2010.01488.x
  • Mickiewicz, E. (2005). Excavating concealed tradeoffs: How Russians watch the news. Political Communication, 22(3), 355–380. https://doi.org/10.1080/10584600591006636
  • Mickiewicz, E. (2008). Television, power, and the public in Russia. Cambridge University Press.
  • Neuman, R., Just, M., & Crigler, A. (1992). Common knowledge: News and the construction of political meaning. University of Chicago Press.
  • Ognyanova, K., Lazer, D., Robertson, R., & Wilson, C. (2020). Misinformation in action: Fake news exposure is linked to lower trust in media, higher trust in government when your side is in power. Harvard Kennedy School Misinformation Review, 1(4). https://doi.org/10.37016/mr-2020-024
  • Pearce, K., & Kendzior, S. (2012). Networked authoritarianism and social media in Azerbaijan. Journal of Communication, 62(2), 283–298. https://doi.org/10.1111/j.1460-2466.2012.01633.x
  • Perrin, A. (2005). Political microcultures: Linking civic life and democratic discourse. Social Forces, 84(2), 1049–1082. https://doi.org/10.1353/sof.2006.0028
  • Petersen, M. B. (2015). Evolutionary political psychology: On the origin and structure of heuristics and biases in politics. Political Psychology, 36, 45–78. https://doi.org/10.1111/pops.12237
  • Riffe, D., Lacy, S., Watson, B. R., & Fico, F. (2019). Analyzing media messages: Using quantitative content analysis in research (4th ed.). Routledge.
  • Sacks, H., Schegloff, E., & Jefferson, G. (1974). A simplest systematics for the organization of turn-taking in conversation. Language, 50(4), 696–735. https://doi.org/10.1353/lan.1974.0010
  • Sharafutdinova, G. (2020). The red mirror: Putin’s leadership and Russia’s insecure identity. Cambridge University Press.
  • Shields, P. (2021). Killing politics softly: Unconvincing propaganda and political cynicism in Russia. Communist and Post-Communist Studies, 54(4), 54–73. https://doi.org/10.1525/j.postcomstud.2021.54.4.54
  • Shirikov, A. (2022). How propaganda works: Political biases and news credibility in autocracies [Doctoral dissertation, University of Wisconsin-Madison]. ProQuest Dissertations and Theses database.
  • Son, J., Lee, J., Oh, O., Lee, H., & Woo, J. (2020). Using a heuristic-systematic model to assess the Twitter user profile’s impact on disaster tweet credibility. International Journal of Information Management, 54, 102176. https://doi.org/10.1016/j.ijinfomgt.2020.102176
  • Stockmann, D., & Gallagher, M. (2011). Remote control: How the media sustain authoritarian rule in China. Comparative Political Studies, 44(4), 436–467. https://doi.org/10.1177/0010414010394773
  • Sundar, S. (2008). The MAIN model: A heuristic approach to understanding technology effects on credibility. In M. Metzger & A. Flanagin (Eds.), Digital media, youth, and credibility (pp. 73–100). The MIT Press.
  • Sundar, S., Knobloch-Westerwick, S., & Hastall, M. (2007). News cues: Information scent and cognitive heuristics. Journal of the American Society for Information Science and Technology, 58(3), 366–378. https://doi.org/10.1002/asi.20511
  • Szostek, J. (2018a). News media repertoires and strategic narrative reception: A paradox of dis/belief in authoritarian Russia. New Media Society, 20(1), 68–87. https://doi.org/10.1177/1461444816656638
  • Szostek, J. (2018b). Nothing is true? The credibility of news and conflicting narratives during ‘information war’ in Ukraine. The International Journal of Press/Politics, 23(1), 116–135. https://doi.org/10.1177/1940161217743258
  • Toepfl, F. (2014). Four facets of critical news literacy in a non-democratic regime: How young Russians navigate their news. European Journal of Communication, 29(1), 68–82. https://doi.org/10.1177/0267323113511183
  • Tsfati, Y., & Ariely, G. (2014). Individual and contextual correlates of trust in media across 44 countries. Communication Research, 41(6), 760–782. https://doi.org/10.1177/0093650213485972
  • Vihalemm, T., & Juzefovics, J. (2022). Sense-making of conflicting political news among Baltic Russian-speaking audiences. National Identities, 23(3), 253–275. https://doi.org/10.1080/14608944.2020.1723512
  • Volkov, D., Goncharov, S., Paramonova, A., & Leven, D. (2021). Rossiyskiy medialandshaft-2021 [Russian media landscape 2021]. Levada Center. https://www.levada.ru/2021/08/05/rossijskij-medialandshaft-2021/
  • Zhuravlev, O., Savelyeva, S., & Erpyleva, S. (2019). The cultural pragmatics of an event: The politicization of local activism in Russia. International Journal of Politics, Culture and Society, 33(2), 163–180. https://doi.org/10.1007/s10767-019-9321-6

Appendix Section 1: An Example of a News Story

Maidan Protests in Kyiv (4 minutes 38 seconds, aired on November 30, 2013)

News Anchor:

Now news from Kyiv. The supporters of Eurointegration, who were pushed out of Maidan square early this morning, are gathering at another square – Mikhaylovskaya – close to the territory of the monastery where 100–200 participants of the dispersed demonstration are sheltered. The officials are making statements about the use of force by the police. According to official data, 35 people were injured and 35 people were detained. Our correspondent Vitaliy Kadchenko will provide more details.

Correspondent:

The riots on Euromaidan started at 4 p.m. When the truck that was supposed to install the New Year tree entered the square, police demanded that protesters leave. All protest actions in the city center were officially banned two weeks ago, but the ban has not worked. People in masks began spraying an unknown gas, and there were several sounds resembling gunshots. People started to throw rocks, bottles, and burning logs at the riot police. Nobody was distinguishing between friends and foes. People climbing onto the roof of the trade center were literally pulled off it. Batons were used, and people were even kicked. About 40 people were injured, including two Polish citizens who also found themselves at the protest against Ukrainian officials’ decision to postpone Eurointegration. Seven people with various injuries were brought to the hospital. Police officials say that 12 officers were injured as well. Criminal investigations were initiated under the articles “Resistance to police officers” and “Hooliganism.” After detention, the police let the detainees go. About 200 participants were sheltered from detention by the Mikhaylovskiy monastery of the Kyiv Patriarchate, the so-called “heretic church.” A demonstration started to gather in front of the monastery – people were given hot tea and warm clothes. They were asked not to use swear words and not to smoke inside the monastery.

Polish citizens:

What makes us Europeans are not agreements and associations signed by one person. My heart, not my hand and head, hurts when I look at what is happening.

Correspondent:

Despite it being the weekend, the ambassadors of the Netherlands and Finland and the chair of the representative office of the European Union visited the protesters. Remembering the chief of the Lithuanian Seimas agitating for Eurointegration before the summit in Vilnius, they refused to speak from the stage. This is how they explained their reasons for being at the square.

Arja Makkonen, Ambassador of Finland:

We have come here because we want to know what has happened – to see the situation with our own eyes.

Kess Klompenhouwer, Ambassador of Netherlands:

We decisively condemn violence against protesters.

Correspondent:

The Prime Minister of Ukraine, Nikolay Azarov, demanded that those responsible for the violent dispersal of the demonstration be found. He even posted about it on his Facebook page. The chair of the Cabinet of Ministers assured citizens that the ruling party is not interested in escalating the situation and asked Ukrainians not to succumb to provocations.

Azarov Facebook page:

I am infuriated and concerned by the events that happened on Maidan square last night. The data I have do not allow definite conclusions about who is responsible for this provocation. That is why an investigative group was created, including officials from the Prosecutor’s Office. It will give a qualified legal assessment of all parties involved in the conflict. This group has a clear task: to carry out an investigation in a very limited time and give society a clear answer as to who should be punished, how, and for what.

Correspondent:

The Minister of the Interior promised to find and punish the culprits. The leaders of the opposition parties claimed that they are starting to prepare nationwide strikes to demand the resignation of the government and early elections to the Rada [Ukrainian parliament]. Arseniy Yatsenyuk called the police assault on Euromaidan “a special operation of the ruling party.”

Arseniy Yatsenyuk:

We – three opposition parties – have made a decision about the formation of the Headquarters of the national resistance. We are starting to prepare a nationwide strike. We expect a reaction from our Western partners. Not only words but actions.

Correspondent:

Tomorrow the rally will move to a new place. The announced “Narodnoye veche” will take place in Shevchenko Park, which is also in the city center. The opposition announced the mobilization of supporters across the country. The deputies of three factions — “Batkivshchyna,” “Udar,” and “Svoboda” — will try to enforce compliance with their demands in the Verkhovna Rada [Ukrainian parliament] next week. We all know how that goes. In the meantime, a huge rally in support of President Yanukovych’s decision not to sign the agreement with the EU took place in Kharkiv on the Square of Freedom, the biggest square in Europe. The directors of big businesses gave speeches and reiterated that the main market for Ukraine is Russia and the CIS countries. Deteriorating relationships with neighbors can lead to economic collapse. Vitaliy Kadchenko, Evgeniy Krivonosov, Sergey Titenko and Grigoriy Yemelyanov, Channel One, Kyiv.

Visuals:

Map of Ukraine; clashes between police and protesters; protesters throwing rocks and bottles; building barricades; rally in front of the monastery; interview with injured Polish citizens; interviews with the Ambassadors of Finland and the Netherlands; Facebook page of Prime Minister Nikolay Azarov; Arseniy Yatsenyuk speaking at the press conference of the opposition; peaceful pro-Russian rally in Kharkiv supporting President Yanukovych’s decision not to sign the EU-Ukraine agreement.

Section 2: Focus Group Scenario

  • First Impressions. Please describe your first impressions of the news report. What associations come to your mind? Which elements in the video attracted your attention?

  • Emotions. What emotions does this news report provoke? Have there been situations when your emotions or mood made you avoid watching TV or watch it more?

  • Credibility. Do you think that this news report is credible? Or is it rather questionable? Why?

  • Negative effects. Do you watch news about Ukraine? Can you say that you started watching news more after the beginning of the conflict? Less? Why?

  • Practices of watching. Do you watch news alone or with friends/family? Do you discuss the news with them? Do you have arguments or disagreements? Do you tend to agree with them, or do you rather stick to your own opinion? Do you try to convince others? Whose opinions are important for you in evaluating news and politics?

  • Internet usage. Do you read or watch news online? What kinds of online sources do you use: blogs, news websites, news aggregators, social networks, YouTube? Is there any difference between the same news on TV and online? How do you determine which is more credible if and when the stories on TV and online contradict each other?

Section 3: Media Consumption Questions

  1. How often do you watch TV?

  2. Do you usually watch TV alone or with friends/relatives?

  3. What channels do you have access to?

  4. What type of content do you watch most?

  5. What type of content comes second?

  6. How often do you watch the news?

  7. If you watch the news, what news programs do you prefer?

  8. How often do you use the Internet?

  9. Do you use the Internet at home or at work?

  10. For what purposes do you use the Internet?

  11. What social networks do you use?

  12. If you watch the news on the Internet, what websites do you prefer?

Section 4: Political Knowledge Questions

Following the advice of political psychologists, the pre-focus group questionnaire included neutral factual questions dealing with political institutions and recent major political events. Since this is not a quantitative study, the questions were designed to obtain additional contextual information about participants rather than to measure their political knowledge and draw reliable conclusions. The questionnaire included items related to domestic politics, international politics, and major political events discussed intensively in the media during the period of the study:

  1. Who is the current speaker of the Russian State Duma?

  2. What kind of elections took place in Russia in 2016?

  3. Which countries are members of NATO?

  4. Is the UK a member of the European Union?

  5. Which country accepted the highest number of Syrian refugees?

  6. Where was the Admiral Kuznetsov carrier group going in 2016?

  7. How many aircraft carriers does Russia have?

Each question had one or several correct answers. Correct answers were summed to create a political knowledge index for each participant. Participants’ scores ranged from 1 to 10, with a mean of 4.6. Mean indices were 4.1 for the apathetic audience, 5.8 for the pro-regime audience, and 6.19 for the anti-regime audience.
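
For illustration, a minimal sketch of how such an index could be computed is given below, assuming each item maps to a set of accepted answers. The item keys and answers are placeholders, not the actual scoring key used in the study.

```python
from typing import Dict, Set

def knowledge_index(answers: Dict[str, Set[str]],
                    accepted: Dict[str, Set[str]]) -> int:
    """Count every accepted answer a participant gives; items with several
    correct answers (e.g., listing NATO members) can contribute more than one point."""
    return sum(len(answers.get(item, set()) & accepted_set)
               for item, accepted_set in accepted.items())

# Placeholder scoring key and participant answers, for illustration only.
accepted = {
    "nato_members": {"country_a", "country_b", "country_c"},
    "uk_in_eu": {"yes"},
}
participant = {
    "nato_members": {"country_a", "country_c", "country_x"},
    "uk_in_eu": {"yes"},
}
print(knowledge_index(participant, accepted))  # -> 3
```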

Section 5: Media Consumption

Table A1. Participants’ media use.

Table A2. Russians’ media use (Volkov et al., 2021).

Participants’ media use is similar to nationwide trends, with slightly lower television news consumption. According to a 2021 survey by the independent pollster Levada Center, 54% of Russian citizens watch television news every day and 20% almost every day. Among focus group participants, 42% report watching television news daily. Online news consumption is also fairly similar to nationwide trends: according to the same survey, 55% of Russians report using online news daily, which is close to the share among focus group participants (60%). Furthermore, 37% of Russians report using social media for news, which is similar to social media news use among participants (40%) (Volkov et al., 2021).

Section 6: Codebook

Approaches to coding can be represented as a continuum running from theory-driven deductive content analysis, in which patterns in the text are assigned to predefined categories, to data-driven coding, in which patterns in the text are identified inductively. I used deductive thematic analysis. Based on prior research and following Boyatzis’s (1998) approach to deductive thematic analysis, I developed a code sheet for each code which included: 1) a label for the code, 2) a definition of what is coded, 3) a list of specific indicators suggesting that a pattern in the text should be assigned to the code, and 4) an example. Examples of sheets for two heuristics, two resources, and two orientations can be found in the Tables below.

Table A3. Codes for authenticity (presence of violence) and persuasive intent (repetition).

Table A4. Codes for popular wisdom (analogy) and personal experience (travel).

Table A5. Apathetic orientation (arcane realm of politics) and anti-regime orientation (Putin’s regime is illegitimate).

Table A6. Heuristics: full list of codes.

Table A7. Resources: full list of codes.

Table A8. Orientations: full list of codes.

The full codebooks can be found below. For heuristics, the first number designates the type of discursive element (1 – heuristic), the second number designates the heuristic (e.g., 2 – authority), the third number designates the outcome (1 – trust, 2 – distrust), and the fourth number designates the cue a participant reacts to (e.g., 1 – emotional nature of the report). For resources, the first number designates the type of discursive element (2 – resource), the second number designates the resource type (1 – personal experience, 2 – common sense), and the third number designates the specific type of reference (e.g., 2 – reference to experience with the economy). For orientations, the first number designates the type of discursive element (3 – orientation), the second number designates the orientation type (1 – pro-regime schema, 2 – anti-regime schema, 3 – apathetic schema), and the third number designates specific political ideas (e.g., 3 – patriotism).
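
Because the numbering scheme is hierarchical, codes can be decoded mechanically. The sketch below shows one possible decoding of a heuristic code, assuming codes are stored as dot-separated strings; the lookup tables are restricted to the examples mentioned above and do not reproduce the full codebook.

```python
# Lookup tables limited to the examples given in the text; the full codebook has more entries.
ELEMENT_TYPES = {1: "heuristic", 2: "resource", 3: "orientation"}
HEURISTICS = {2: "authority"}
OUTCOMES = {1: "trust", 2: "distrust"}

def decode_heuristic(code: str) -> dict:
    """Decode a dotted heuristic code such as '1.2.2.1':
    element type . heuristic . outcome . cue reacted to."""
    element, heuristic, outcome, cue = (int(part) for part in code.split("."))
    return {
        "element": ELEMENT_TYPES.get(element, element),
        "heuristic": HEURISTICS.get(heuristic, heuristic),
        "outcome": OUTCOMES.get(outcome, outcome),
        "cue": cue,  # e.g., 1 = emotional nature of the report
    }

print(decode_heuristic("1.2.2.1"))
# {'element': 'heuristic', 'heuristic': 'authority', 'outcome': 'distrust', 'cue': 1}
```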

Section 7: Consistency and Reliability of Coding

In order to calculate inter-coder reliability, two researchers coded the focus group data independently. To account for chance agreement, Krippendorff’s Alpha was used to assess reliability. The resulting values are relatively low due to the highly unstructured nature of the data, but they remain above accepted standards. In addition, tetrachoric correlations (rt) were used to assess the consistency of coding. Correlations between orientations are negative and weak, correlations between different heuristics are weak and negative, and the correlation between resources is positive but weak, suggesting in each case that the categories do not overlap and do not code the same pattern.
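
As an illustration of this chance-corrected agreement measure, the snippet below computes Krippendorff’s Alpha for two coders on a handful of invented binary coding decisions, assuming the third-party Python package krippendorff. The data are placeholders and this is not the study’s actual reliability calculation.

```python
import numpy as np
import krippendorff  # third-party package: pip install krippendorff

# Rows = coders, columns = coding units (e.g., whether a CT was assigned a given code).
# np.nan marks a unit one coder did not rate.
reliability_data = np.array([
    [1, 0, 0, 1, 1, 0, 1, 0, np.nan, 1],
    [1, 0, 1, 1, 1, 0, 1, 0, 0,      1],
])

alpha = krippendorff.alpha(reliability_data=reliability_data,
                           level_of_measurement="nominal")
print(f"Krippendorff's alpha: {alpha:.2f}")
```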

Table A9. Consistency.

Section 8: An Example of Assigning Participants to Audiences

A participant was assigned to an audience if the proportion of that orientation in her discourse was larger than the proportions of the other two orientations. For example, if the proportion of the pro-regime orientation is higher than the proportions of both the anti-regime and the apathetic orientations in a participant’s speech, she was assigned to the pro-regime audience (pro-regime orientation > anti-regime orientation & pro-regime orientation > apathetic orientation).
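
A minimal sketch of this assignment rule is given below, assuming each participant’s coded orientations are available as counts; the tie-handling behavior (“unassigned”) is an added assumption not specified in the text.

```python
from collections import Counter

def assign_audience(orientation_counts: Counter) -> str:
    """Assign a participant to the audience whose orientation has the strictly
    largest share of her coded statements; ties are left unassigned (an assumption)."""
    total = sum(orientation_counts.values())
    shares = {o: c / total for o, c in orientation_counts.items()}
    top = max(shares, key=shares.get)
    others = [s for o, s in shares.items() if o != top]
    return top if all(shares[top] > s for s in others) else "unassigned"

# Example: 12 pro-regime, 5 anti-regime, and 3 apathetic orientation codes.
print(assign_audience(Counter({"pro-regime": 12, "anti-regime": 5, "apathetic": 3})))
# -> pro-regime
```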

Table A10. Assignment of participants.

Section 9: Audiences

In addition to socio-demographic characteristics, a survey including questions on media consumption and political knowledge was administered prior to the focus groups (see Appendix Sections 3 and 4). The anti-regime audience is younger, better educated, the most politically knowledgeable, and seeks more news online. The pro-regime audience is older, less educated, moderately politically knowledgeable, and moderately active online. The apathetic audience is in between in terms of age and education; it is the least politically knowledgeable and the least active in online news consumption.

Table A11. Audiences.