
ABSTRACT

An experiment (N = 209) investigated whether terms about the scientific method impacted processing fluency similarly to other forms of jargon. Political ideology emerged as a significant moderator of the effects of method jargon on processing fluency. Specifically, liberals reported a more negative processing experience after exposure to the jargon condition, in line with expected general effects of jargon, while conservatives did not report a similarly negative experience. This preliminary result adds to accumulating evidence that suggests political ideology poses a challenge for science communication.

The rise of partisan polarization poses a challenge to effective science communication (Chinn & Hasell, 2023). This study examines how scientific method jargon affects processing fluency, and how this relationship is moderated by political ideology. Processing fluency, defined as the ease or difficulty of processing new information (Schwarz, 2010), is crucial in determining how effectively scientific messages are communicated to the public. Understanding how these dynamics differ by political ideology can inform tailored communication strategies for different ideological groups.

Public-facing science communication aims to convey expertise, relevance, and rigor while navigating the challenge of potentially unfamiliar terms and ideas. Jargon, technical terminology recognized by specific groups, expresses complicated ideas concisely (Sharon & Baram-Tsabari, 2014). This precise communication is essential for advancing scientific knowledge (Jucks et al., 2007).

Previous research finds that encountering jargon can hinder understanding, leading to increased message resistance and lower perceptions of message credibility (Bullock et al., 2019; Riggs et al., 2022; Shulman et al., 2020). Scientific method jargon, a subset of specialized terms describing study methodologies and scientific strategies, is a potential bridge for public science communication. Unlike domain-specific jargon, which may require advanced topic knowledge, terms like “validity,” “reliability,” and “causality” (see Golumbic et al., 2023) are introduced in early science education (Rudolph, 2019), making them more familiar to general audiences. Given this potential familiarity with method jargon, this study investigates whether scientific method jargon functions similarly to other jargon or avoids degrading science message processing.

Based on this prior work, we expect that participants in the scientific method jargon condition will report lower processing fluency than participants in the no-jargon condition (H1).

Recent work has found that political ideology impacts the processing of scientific information (Dixon & Hubner, 2018). Conservatives have shown less desire to engage with new information, more anti-expert opinions, and a greater desire to “do one’s own research” compared with liberals (Chinn & Hasell, 2023). How these ideological differences might affect the processing of scientific method jargon motivates our research question: will the relationship between the presence or absence of scientific method jargon and processing fluency be moderated by political ideology (RQ1)?

Method

Participants

Participants in this survey experiment (N = 209) were recruited via the CloudResearch MTurk Toolkit. The sample was 54.1% male, the average age was 40.41 years (SD = 11.43, range = 23–77), and 75.8% of participants identified as White, 11.6% Black, 9.1% Asian, 0.5% American Indian or Alaska Native, 0.5% Native Hawaiian or Pacific Islander, and 2.5% other.

Procedure

All participants read a short article describing a clinical trial for a new lung cancer treatment drug, adapted from an actual news story (Kingsland, 2023). The difference between the two conditions was the presence or absence of methodological jargon terms (see Supplement). These terms were selected based on the Golumbic et al. (2023) study of everyday scientific reasoning. Examples include ecological validity, random assignment, and causality.

Measures

Processing Fluency

Processing fluency was measured with a scale adapted from a validated instrument (Kostyk et al., 2021), using a 7-point Likert-type scale where higher values indicate more fluent processing (M = 5.41, SD = 1.14, α = .87). A confirmatory factor analysis (CFA) on the six-item scale resulted in the removal of one item with a low factor loading. Otherwise, this analysis suggested a good fit for a two-factor structure that is second-order unidimensional (r = .55). Scale items and the CFA are provided in the supplement.
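For readers unfamiliar with the reliability coefficient reported above, Cronbach’s α can be computed directly from an item-response matrix. The sketch below uses simulated Likert-type data (not the study’s data) purely for illustration:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Simulated data for illustration only: 5 correlated 7-point Likert items
rng = np.random.default_rng(0)
latent = rng.normal(5.4, 1.0, size=(200, 1))            # shared "true" fluency
items = np.clip(np.round(latent + rng.normal(0, 0.8, size=(200, 5))), 1, 7)

print(round(cronbach_alpha(items), 2))  # high internal consistency expected
```

Because the five items share a common latent component, α lands well above conventional reliability thresholds, mirroring the pattern (though not the exact value) reported for the study’s scale.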

Political Ideology

Participants’ political ideology was measured with a single item, with responses ranging from 1 = “Very Liberal” to 7 = “Very Conservative” (M = 3.53, SD = 1.96). More participants identified as liberal (53.1%) than conservative (31.8%) or moderate (15.2%).

Results

Hypothesis one predicted that the presence of jargon would negatively impact processing fluency. A bivariate regression testing this relationship (no jargon = 0, jargon = 1) was not significant, F(1, 200) = 0.66, p = .419, r = −.06, so H1 was not supported. Notably, and unlike much prior research (e.g., Bullock et al., 2019), there was no relationship between jargon exposure and processing fluency.

Research question one asked whether participants’ political ideology would moderate the relationship between jargon and processing fluency. This was tested with PROCESS model 1 (simple moderation; Hayes, 2013). The overall model predicting processing fluency from message condition and political ideology was statistically significant, F(3, 194) = 2.83, p < .05, R² = .04. In this model, both political ideology, B = −0.15, SE = 0.05, t = −2.80, p < .05, and the interaction between condition and political ideology, B = 0.17, SE = 0.08, t = 2.00, p < .05, were significant. Using the standard simple slopes analysis in PROCESS, for participants who identified as liberal (1 SD below the mean on the ideology scale), the presence of method jargon was associated with lower processing fluency (as predicted in H1; B = −0.56, SE = 0.26), whereas for participants who identified as conservative (1 SD above the mean on the ideology scale), there was a weak (yet positive) relationship between the presence of method jargon and processing fluency (B = 0.27, SE = 0.26). Thus, processing differences were apparent based on political ideology.
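The moderation analysis above can also be estimated as an ordinary regression with an interaction term, probing the simple slopes at ±1 SD of the moderator. The sketch below, using hypothetical simulated data (variable names, coefficients, and sample values are illustrative assumptions, not the study’s data), shows the general form of such an analysis:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data mimicking the design: binary jargon condition (0/1)
# and a 7-point ideology scale; coefficients chosen for illustration only.
rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "jargon": rng.integers(0, 2, n),
    "ideology": rng.integers(1, 8, n).astype(float),
})
df["fluency"] = (5.8 - 0.15 * df["ideology"]
                 - 0.56 * df["jargon"]
                 + 0.17 * df["jargon"] * df["ideology"]
                 + rng.normal(0, 1.0, n))

# Moderation model (the regression equivalent of PROCESS model 1):
# fluency ~ jargon + ideology + jargon:ideology
model = smf.ols("fluency ~ jargon * ideology", data=df).fit()

# Simple slopes: conditional effect of jargon at -1 SD and +1 SD of ideology
b = model.params
for label, x in [("liberal (-1 SD)", df["ideology"].mean() - df["ideology"].std()),
                 ("conservative (+1 SD)", df["ideology"].mean() + df["ideology"].std())]:
    slope = b["jargon"] + b["jargon:ideology"] * x
    print(f"{label}: conditional jargon effect = {slope:.2f}")
```

A positive interaction coefficient means the (negative) jargon effect weakens as ideology scores increase, which is the crossover pattern the study reports between liberal and conservative participants.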

Discussion

This experiment investigated the effects of method jargon on processing fluency and considered political ideology as a moderator. To test these effects, we constructed a set of jargon stimuli that differed from previous operationalizations by including only scientific method jargon rather than a wider variety of jargon. The results suggest that method jargon, as a subset of jargon, may evade some of the negative impacts of jargon terms in general. That said, exposure to method jargon did trigger ideological differences in self-described information processing.

In H1, we argued that if method jargon functioned similarly to jargon in prior research (e.g., Bullock et al., 2019; Riggs et al., 2022), exposure to these terms would lower processing fluency. Despite this strong empirical precedent, this hypothesis was not supported in our experiment. The relationship trended in the expected direction but did not reach conventional significance (α = .05).

Although the possibility that method jargon can enhance public science communication is important, our findings regarding political ideology as a moderator are noteworthy as well. Liberal participants demonstrated the hypothesized negative relationship between method jargon and processing fluency. By contrast, conservative participants did not appear to be meaningfully affected by method jargon; for them, jargon condition and processing fluency scores were only weakly (and positively) related.

This opposing relationship may explain the null results in our test of message condition’s impact on processing fluency (H1). Further, this pattern of results corroborates mounting evidence (e.g., Dixon & Hubner, 2018) that political ideology impacts audiences’ responses to scientific messaging even when the science topic is politically neutral. Future research should seek to identify the root cause of the unexpected effects observed for conservatives, since doing so is likely necessary to determine what language features might improve fluency for this group. It is possible that the value conservatives generally place on “doing their own research” (see Chinn & Hasell, 2023) led them to approach these messages with more overconfidence than liberals, guided by their perceived research ability, prompting them to report higher processing fluency without actually comprehending the message better. Perhaps method jargon, by describing specific elements of research, triggers this thought process among conservatives in a way that more obscure jargon does not. These potential explanations are speculative but merit further investigation.

Despite some intriguing and atypical results, it is possible that the null results obtained were due to a lack of statistical power rather than theory. Indeed, a manipulation that varies only one dimension of jargon (i.e., strictly methodological terms) may produce a weaker induction than one that evokes jargon in many ways. Manipulating scientific method jargon also meant that some terms had no seamless colloquial replacement; consequently, in some instances, filler words were used to preserve comparable message length. This was necessary for our focus on scientific method jargon but produced some minor semantic inconsistencies between message conditions. Additionally, because we tested these ideas with a one-shot experiment using a single message manipulation, the broader generalizability and replicability of these effects is uncertain, as is true of most experimentation. Thus, further research is needed. Although the null results were unexpected, it is provocative to consider that ideological differences may be affecting scientific information processing in less obvious, but potentially impactful, ways.

In sum, the goal of this research was to consider how communicating about the scientific method, through jargon, could impact audiences’ information processing. Our results provide initial evidence that using terms associated with the scientific method does not hinder message processing. This experiment also offers new evidence for specific mechanisms through which political ideology impacts scientific discourse. Although we hope that practitioners will not need to produce different messages for political liberals versus conservatives, our findings suggest non-trivial differences in the way these audiences approach the processing of scientific information. Thus, we encourage future research to remain attentive to the subtle ways that political ideology may impact the processing of scientific information, even when the topic itself is nonpolitical.


Disclosure statement

No potential conflict of interest was reported by the author(s).

Supplemental material

Supplemental data for this article can be accessed online at https://doi.org/10.1080/08824096.2024.2382743

Additional information

Notes on contributors

Blue Lerner

Blue Lerner is a doctoral student in the School of Communication at The Ohio State University. Her research focuses on understanding the ways that polarization can impact responses to science communication.

Hillary C. Shulman

Hillary C. Shulman is an Associate Professor in the School of Communication at The Ohio State University. Her work explores how to communicate about complex or polarizing topics in ways that increase audience engagement with the topic.

References

  • Bullock, O. M., Colón Amill, D., Shulman, H. C., & Dixon, G. N. (2019). Jargon as a barrier to effective science communication: Evidence from metacognition. Public Understanding of Science, 28(7), 845–853. https://doi.org/10.1177/0963662519865687
  • Chinn, S., & Hasell, A. (2023). Support for “doing your own research” is associated with COVID-19 misperceptions and scientific mistrust. Harvard Kennedy School Misinformation Review. https://doi.org/10.37016/mr-2020-117
  • Dixon, G., & Hubner, A. (2018). Neutralizing the effect of political worldviews by communicating scientific agreement: A thought-listing study. Science Communication, 40(3), 393–415. https://doi.org/10.1177/1075547018769907
  • Golumbic, Y. N., Dalyot, K., Barel Ben David, Y., & Keller, M. (2023). Establishing an everyday scientific reasoning scale to learn how non-scientists reason with science. Public Understanding of Science, 32(1), 40–55. https://doi.org/10.1177/09636625221098539
  • Hayes, A. F. (2013). Introduction to mediation, moderation, and conditional process analysis: A regression-based approach. The Guilford Press.
  • Jucks, R., Schulte-Löbbert, P., & Bromme, R. (2007). Supporting experts’ written knowledge communication through reflective prompts on the use of specialist concepts. Zeitschrift für Psychologie/Journal of Psychology, 215(4), 237–247. https://doi.org/10.1027/0044-3409.215.4.237
  • Kingsland, J. (2023, March 3). New drug boosts chances of survival after lung cancer surgery, trial confirms. Medical News Today. https://www.medicalnewstoday.com/articles/new-drug-boosts-chances-of-survival-after-lung-cancer-surgery-trial-confirms#Lower-risk-of-cancer-spreading
  • Kostyk, A., Leonhardt, J. M., & Niculescu, M. (2021). Processing fluency scale development for consumer research. International Journal of Market Research, 63(3), 353–367. https://doi.org/10.1177/1470785319877137
  • Riggs, E. E., Shulman, H. C., & Lopez, R. (2022). Using infographics to reduce the negative effects of jargon on intentions to vaccinate against COVID-19. Public Understanding of Science, 31(6), 751–765. https://doi.org/10.1177/09636625221077385
  • Rudolph, J. L. (2019). How we teach science: What’s changed, and why it matters. Harvard University Press.
  • Schwarz, N. (2010). Feelings-as-information theory. In P. Van Lange, A. W. Kruglanski, & E. T. Higgins (Eds.), Handbook of theories of social psychology (pp. 289–308). Sage.
  • Sharon, A. J., & Baram-Tsabari, A. (2014). Measuring mumbo jumbo: A preliminary quantification of the use of jargon in science communication. Public Understanding of Science, 23(5), 528–546. https://doi.org/10.1177/0963662512469916
  • Shulman, H. C., Dixon, G. N., Bullock, O. M., & Colón Amill, D. (2020). The effects of jargon on processing fluency, self-perceptions, and scientific engagement. Journal of Language & Social Psychology, 39(5–6), 579–597. https://doi.org/10.1177/0261927X20902177