A review of educational responses to the “post-truth” condition: Four lenses on “post-truth” problems

Abstract

Educators have been increasingly concerned with what can be done about “post-truth” problems—that is, threats to people's abilities to know what is true—such as the spread of misinformation and denial of well-established scientific claims. The articles and commentaries in this special issue present diverse perspectives on how “post-truth” problems related to scientific and socio-scientific issues might be educationally addressed. The goal of this introductory article is to review and analyze the educational responses to the “post-truth” condition that are reflected in this special issue and in the literature at large. We argue that these responses have employed four lenses that focus on different underlying factors related to people's ways of knowing: not knowing how to know, fallible ways of knowing, not caring about truth (enough), and disagreeing about how to know. Each of these lenses offers different explanations of how education might aggravate or mitigate “post-truth” troubles.

“We’re not just fighting an epidemic; we’re fighting an infodemic,” said WHO Director-General Tedros Adhanom Ghebreyesus at a conference on the COVID-19 outbreak. “Fake news spreads faster and more easily than this virus, and is just as dangerous” (WHO, 2020). Indeed, misinformation and conspiracy theories about COVID-19 have spread widely online (Brennan et al., 2020). Although misinformation and rumors about diseases were common as far back as the Middle Ages, social media has amplified and accelerated this phenomenon, creating new challenges for public health communicators (Zarocostas, 2020).

In many ways, this “infodemic” comes as no surprise. In recent years, so-called “post-truth” phenomena—including massive spread and acceptance of misinformation, denial of scientific claims, and more—have been a focus of growing public discussion and concern (McIntyre, 2018; Oxford English Dictionary, n.d.; Prado, 2018). In this article, we use the term “post-truth” condition to refer to a range of current threats to people's abilities to know what is true or most accurate in media- and information-rich societies. Like others before us, we use the term “post-truth”[1] not in any literal sense (truth is not a thing of the past, we would argue), but rather in a critical and sarcastic sense (Buckingham, 2019). That is, this term is used to decry social trends that reflect a disregard for truth and for reliable ways of knowing what is true. Post-truth phenomena are troubling because inaccurate information, rumors, and conspiracy theories can severely harm people’s abilities to make wise health, environmental, political, social, and economic decisions, at a time when the stakes involved in these decisions are becoming increasingly high.

Post-truth challenges are palpable in science. Scientific misinformation and disinformation on topics such as vaccinations, climate change, and pollution are rife online (Lewandowsky et al., 2017; Scheufele & Krause, 2019). Well-established scientific claims are challenged and disputed, including by powerful politicians, and various forms of science denial and pseudoscience are promoted online (Hansson, 2017; Lewandowsky et al., 2017; Scheufele & Krause, 2019). Among some groups, confidence in science is decreasing, particularly on socially and politically sensitive topics (Funk et al., 2019; Gauchat, 2012). Rejection of science on topics such as climate change, vaccinations, and pandemics poses real threats to people’s lives and well-being. This raises pressing questions: What can education do in the face of these challenges? How can education help prepare the next generation to better engage with science in a post-truth world?

This special issue

The goal of this special issue is to reflect on the role of education in addressing post-truth problems related to scientific and socio-scientific issues. All of the authors in this special issue are deeply involved in promoting student and lay engagement with science through diverse theoretical perspectives. The specific aims that we posed to the authors were to: (a) analyze the nature of current post-truth thinking problems about scientific issues, (b) examine why education appears to be ineffectual in mitigating these problems, and (c) develop new proposals for addressing these problems that take into account what is known about the nature of thinking, about the challenges of teaching better thinking, and about instructional research to date.

The articles in this special issue focus on different post-truth challenges and raise diverse proposals about how these might be addressed. Sinatra and Lombardi (2020) tackle the challenge of fostering students' abilities to evaluate scientific claims and sources through teaching students to reappraise plausibility judgments. Lapsley and Chaloner (2020) focus on cultivating intellectual virtues to counteract the pull of epistemic bubbles and echo-chambers. Kienhues et al. (2020) propose new approaches for countering science denialism by sealing gateways to rejection of science. Feinstein and Waddington (2020) focus on helping people work together to make appropriate use of science to address real-world concerns. Chinn et al. (2020a) address the problem of pervasive disagreements about how to know, and how students might learn to negotiate these. In their commentaries, Tabak (2020) reflects on the articles’ deliberation-oriented proposals, on their qualified view of science, and on the need to attend more to issues of appropriation; Duschl (2020) situates the articles in relation to past and current efforts to enable students to learn about the struggles inherent in scientific inquiry.

In light of the great diversity of post-truth problems and proposals for addressing them, the goal of this introductory article is to review the range of current educational discourse about addressing the post-truth condition, including the contributions to this special issue, and to try to make sense of the variety of approaches that have been adopted. Doing so, we hope, might provide the field with a roadmap for navigating this terrain and empower educators to design and critique educational responses more effectively.

In the next section, we briefly present several trends associated with the post-truth condition, followed by some of their imputed causes. Then, given the undeniable difficulties of achieving truth, we ask why educators should nonetheless care about pursuing it. After this, we sketch a roadmap of educational discourse about post-truth problems. In a nutshell, we contend that this discourse has employed four lenses that focus on different underlying factors related to people's ways of knowing. We have labeled these lenses: not knowing how to know, fallible ways of knowing, not caring about truth (enough), and disagreeing about how to know. We present each lens, explaining how it construes the focal challenge and the ensuing educational remedies. To conclude, we reflect on the value of these approaches and on where to go next. Although our primary disciplinary focus is scientific and socio-scientific issues, we hope that this review will be relevant to educators working in other disciplines, as well.

The post-truth condition: Trends and factors

The post-truth condition is associated with several interconnected trends that jointly undercut the possibilities of gaining accurate knowledge. We describe these very briefly, in no particular order of severity:

  1. Increasing prevalence and influence of misinformation and disinformation. Communication in the public sphere is increasingly subject to varied forms of misinformation and disinformation. Misinformation is partially or entirely incorrect information that does not necessarily reflect an intention to mislead, whereas disinformation is inaccurate information shared with an intent to manipulate or harm (Wardle & Derakhshan, 2017). People are frequently exposed to misinformation and disinformation (Allcott & Gentzkow, 2017; European Commission, 2018). International data indicate that many people think that news and social media do not do a good enough job separating fact from fiction (Newman et al., 2017) and view fake news as a serious problem in their country and for democracy in general (European Commission, 2018).

  2. Increasing rejection of well-established claims. A RAND Corporation report identified increasing disagreement about facts and analytical interpretations of facts and data as one hallmark of “truth decay” (Kavanagh & Rich, 2018). To be sure, disagreement, critique, and skepticism are healthy phenomena. The problem lies in increasing disagreements about verifiable facts and claims that are strongly supported by evidence. Examples in science include the public’s disagreement with the scientific consensus on matters such as vaccine safety, evolution, and anthropogenic climate change (Funk et al., 2015; Kavanagh & Rich, 2018).

  3. Placing personal belief and experience above facts and evidence. The Oxford English Dictionary famously defined the term post-truth as “relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief” (Oxford English Dictionary, n.d.). In addition to prioritizing emotions and personal beliefs (Prado, 2018), people may also base their judgments on personal experiences or on anecdotal reports when these support their beliefs, rather than on systematic data (Kavanagh & Rich, 2018).

  4. Declining trust in institutional providers of information such as journalism and science. The post-truth condition is also characterized by increasing distrust of traditional sources of information. In many countries, trust in news media has been continuously declining (Newman et al., 2019). In parallel, consumption of news from social media and partisan sources has increased (Shearer & Klein, 2019), resulting in an increasingly splintered information landscape. Whether trust in science and scientists is declining is a more complex topic. Levels of trust remain generally high in the US and in Europe (Krause et al., 2019). In the US, trust in science has declined among political conservatives but not among liberals (Gauchat, 2012). Trust in science and scientists is more polarized on socially or politically controversial scientific topics (AAAS, 2018; Funk et al., 2019). Thus, there are contexts in which the trustworthiness of science is in dispute, leading to doubt and uncertainty about scientific information.

  5. Increasing fragmentation and polarization of information consumption. Some associate the post-truth condition with the rise of “echo chambers”—reliance on a closed ecology of information sources that cater to one's own preferences, sentiments, or worldviews (Wardle & Derakhshan, 2017). This phenomenon is not pervasive; people are still exposed to diverse ideas and opinions online (Flaxman et al., 2016). However, current media may make it easy to limit one's exposure to alternative ideas and viewpoints (Bakshy et al., 2015; Benkler et al., 2018; Flaxman et al., 2016). This can undermine the possibility of developing shared knowledge and engaging in public discourse.

These trends jointly create a climate in which intellectual values such as truth, accuracy, fair-mindedness, and open-mindedness have become harder to pursue and achieve. Although none of these phenomena are historically new, their current scope and scale may be more extreme than before (Kavanagh & Rich, 2018).

The post-truth condition has been attributed to multiple interacting factors. It is beyond the scope of this manuscript to review these factors in detail. Instead, we mention several briefly and not in order of importance:

  • Socio-technological factors. Information and communication technologies, particularly social media, have dramatically increased the volume of information, the ease of accessing it, and the speed with which it spreads (Kavanagh & Rich, 2018; Wardle & Derakhshan, 2017). Much of this information is not professionally vetted or curated. Although the increase in availability of information has many positive consequences, it also increases the scope and speed of the spread of inaccurate information (Vosoughi et al., 2018). Furthermore, the filtering and personalization algorithms of search engines and social media can limit exposure to diverse information and sources (Bakshy et al., 2015; Flaxman et al., 2016).

  • Political factors. Misinformation and disinformation are often spread to advance the political and ideological agendas of parties, countries, and movements (Benkler et al., 2018; Wardle & Derakhshan, 2017). The information landscape is increasingly dominated by politically partisan websites that cater to political agendas and identities and provide content that confirms these identities (Benkler et al., 2018). Growing political polarization may drive post-truth phenomena by intensifying dissent regarding topics such as climate change (Lewandowsky et al., 2017).

  • Economic factors. Misinformation and disinformation are also propagated in order to achieve economic gains and serve corporate interests (Oreskes & Conway, 2010; Wardle & Derakhshan, 2017). Further, media companies seek to maximize profits for themselves and their advertisers by promoting information based on its emotional appeal and individuals' personal preferences (Lazer et al., 2018). This kind of information environment discourages veritistic values by design. Societally, growing economic inequality may increase political polarization and thereby fuel the spread of misinformation (Lewandowsky et al., 2017).

  • Scientific factors. Some have suggested that aspects of contemporary science and science communication may reduce trust in science (Krief et al., 2017; Scheufele & Krause, 2019). The increased role of the private sector in sponsoring research, conferences, and journals increases threats of financial bias (Oreskes, 2019). News of replication crises in some fields and reversals of scientific findings may exacerbate concerns about the reliability of science (Hendriks et al., 2020). Trust in science may also be undermined by science communication practices such as overstating scientific findings or creating a “false balance” (Scheufele & Krause, 2019).

Of course, post-truth trends have also been attributed to psychological and educational factors and their interactions with the aforementioned factors. We discuss these educational and psychological factors at length in what follows. But before we do so, we would like to reflect on why educators should care about these issues at all.

Why care about the truth?

In a special issue devoted to post-truth, an account of truth is in order. Briefly, the most prevalent conceptualization of truth is based on the common-sense intuition that beliefs or propositions are true if they correspond to some part or aspect of reality (BonJour, 2009; Bourget & Chalmers, 2014). This need not be exact correspondence; it could involve, for example, relationships of resemblance. Explaining the link between propositions and reality, however, has proven to be very difficult—some argue impossible (BonJour, 2009). This has led to the emergence of alternative theories of truth such as the coherence theory of truth (briefly, that true propositions fit consistently with other believed propositions) and the pragmatist theory of truth (briefly, that true propositions are successful in practice) (Glanzberg, 2018).

Despite ongoing debates about the nature of truth, most philosophers consider true beliefs to be valuable (Pritchard et al., 2018). True beliefs have an instrumental or practical value for individual and social well-being (Frankfurt, 2006; Horwich, 2006). Engineers rely on accurate information to plan robust buildings, and people need accurate information about contagious diseases so that they can keep safe and avoid quack remedies. Accurate facts and data are also necessary for rational societal decision-making and for democratic deliberation and debate (Kavanagh & Rich, 2018).

Furthermore, people's capabilities to find out the truth underlie the capacity for social critique and the ability to stand up to power (Lynch, 2004). Some of the deepest sources of concern surrounding the post-truth condition are that by surrendering to the never-ending stream of misinformation, disinformation, and bullshit[2] (Frankfurt, 1986), people are unwittingly making themselves vulnerable to political manipulation, domination, and authoritarianism (Illouz, 2019; McIntyre, 2018; Prado, 2018; Tesich, 1992).

True beliefs can also be valuable intrinsically, for their own sake (Horwich, 2006; Lynch, 2004). For example, people wish to find out the truth in domains that do not necessarily have practical utility, such as the history of human evolution and the origins of the universe.

However, knowing what is true can be hard. People may be misled by their senses, their judgments might be biased, and their socio-cultural contexts might shape what they perceive as true. The growth of knowledge has often revealed that claims once considered to be true were not actually so. Scientific claims, in particular, are tentative and open to revision (see discussions of the tentativeness and fallibility of science in Duschl, 2020; Feinstein & Waddington, 2020; Kienhues et al., 2020). Thus, some have argued that the aims of science should be more modest, such as achieving empirical adequacy (Godfrey-Smith, 2003). Along these lines, Feinstein and Waddington (2020) argue that science education should not focus on achieving truth but rather on gaining “pretty good knowledge.”

Nevertheless, the truth of some matters can be known with a relatively high degree of certainty (for example, the IPCC (2018) states with “very high confidence” that global mean surface temperature is higher now than the average between 1850 and 1900). Furthermore, even if it can sometimes be hard to know whether the truth has been found, truth still remains a guiding ideal (BonJour, 2009). That is, although it is often hard to know if specific claims are true, the idea that truth matters guides inquiry, underlies the evaluation of scientific claims, evidence, and methods, motivates open-minded consideration of alternatives, and so forth (Lynch, 2004; McIntyre, 2015).

The pursuit of truth, in science and elsewhere, is facilitated by the social practices used to develop knowledge, such as shared practices of producing and critiquing evidence (Duschl, 2020; Longino, 1990). Scientists strive to achieve objectivity and validity through mechanisms of social critique and corroboration (Douglas, 2007; Kienhues et al., 2020; Longino, 1990). The resulting scientific knowledge, although fallible and imperfect, is the most reliable knowledge there is on some matters (de Ridder, 2020).

Four educational lenses on the post-truth condition

Since the post-truth condition burst into public awareness, there has been a wave of writing about the role of education in addressing post-truth phenomena (e.g., Britt et al., 2019; Buckingham, 2019; Darner, 2019; Journell, 2019; Kendeou et al., 2019). Educators have been interested in how well students and teachers are prepared for dealing with post-truth phenomena and how to boost their preparedness. Of course, this does not suggest that the sole responsibility for addressing these problems lies with education. On the contrary, an effective response is likely to require combined regulatory, social, technological, and educational measures (Feinstein & Waddington, 2020; Lewandowsky et al., 2017; Wardle & Derakhshan, 2017).

Educators’ proposals for addressing the post-truth condition have been diverse. On our reading, these proposals can be categorized into four broad approaches, each of which applies a different lens to analyze post-truth thinking challenges. These lenses naturally focus on educationally tractable factors that contribute to post-truth trends. Each lens offers (a) an explanation of how people's ways of knowing can drive post-truth trends, (b) an analysis of how education might aggravate the problem, and (c) corresponding proposals for how education could mitigate the problem. A summary of the four lenses is provided in Table 1.

Table 1. Four educational lenses on the post-truth condition.

These lenses are context-sensitive; that is, their applicability varies across different domains, topics, and tasks, and under different social and pragmatic conditions and constraints. In addition, these lenses are by no means mutually exclusive. Indeed, some researchers employ more than one lens to explain post-truth phenomena, and each of the five post-truth trends that we have described can be explained using multiple lenses.

Our analysis is informed by the AIR model of epistemic thinking (Chinn et al., 2014; Chinn & Rinehart, 2016). This model identifies three components of epistemic thinking—that is, thinking that is geared toward achieving epistemic ends such as knowledge and truth: (a) Epistemic Aims that people strive for, such as wanting to know something, and the value of these aims; (b) Epistemic Ideals (i.e., norms or criteria) for evaluating whether epistemic aims have been achieved and appraising the quality of epistemic products; and (c) Reliable epistemic processes—procedures, strategies, and methods that have a good likelihood of resulting in successful epistemic outcomes. We argue that each of the four lenses focuses on different kinds of problems with respect to people's epistemic aims, ideals, and processes.

Lens #1: Not knowing how to know

What is the problem?

Many educational accounts attribute post-truth phenomena to gaps in knowledge and reasoning skills needed to evaluate information in the digital sphere (e.g., Breakstone et al., 2018). On this analysis, people's epistemic aims are not the cause of the problem: People generally prefer accurate information and reliably formed knowledge. However, they fail to achieve their epistemic aims because they are unfamiliar with appropriate epistemic criteria and processes for evaluating current scientific information and sources, or because their current criteria and processes are inadequate.

Some locate the problem in insufficient media and digital literacy skills (which can be broadly defined as skills for participation in media- and information-rich societies, Hobbs, 2010). For example, students might not know how to fact-check online claims, how to evaluate memes and YouTube videos, how to identify media bias, and more (Hobbs, 2017; Journell, 2019; Wardle & Derakhshan, 2017).

Furthermore, students and laypersons typically lack sufficiently deep and accurate knowledge about science to directly evaluate many science claims and evidence (first-hand evaluation; Bromme & Goldman, 2014), especially when it comes to complex and controversial issues. Hence, people need skills for indirectly evaluating science claims and evidence (second-hand evaluation; Bromme & Goldman, 2014) by evaluating the trustworthiness of their sources or the degree to which claims are accepted within the scientific community (Bromme et al., 2018; Duncan et al., 2018).

However, people can find it difficult to identify reliable sources of science information and avoid questionable ones (Bråten et al., 2018). For example, they might find it hard to identify markers of scientific expertise (Macedo-Rouet et al., 2019), especially when questionable scientific information promoted by splinter groups or pseudoscientists is presented with scientific trappings (Wineburg & McGrew, 2019). They might also be unfamiliar with effective evaluation strategies such as lateral reading (i.e., leaving the site to search for information about its source; Wineburg & McGrew, 2019) or searching for the scientific consensus (Cook et al., 2017). People may be unaware of misleading science communication techniques such as employing fake experts or creating an impression of false balance in the scientific community (Cook et al., 2017; Hansson, 2017; Oreskes & Conway, 2010). These difficulties may increase susceptibility to scientific misinformation and disinformation.

How might current education aggravate this problem?

Current education may be aggravating this problem by not providing students with sufficient opportunities for developing relevant competencies. For example, Kavanagh and Rich (2018) argued that insufficient civics, media literacy, and critical thinking instruction in the United States is a main cause of “truth decay.” Indeed, students who receive media literacy instruction are more likely to distinguish between evidence-based information and misinformation (Kahne & Bowyer, 2017).

Furthermore, current science education might not provide students with a good enough understanding of how authentic scientific inquiry is conducted, in all its complexity and messiness, and how scientific evidence is produced and evaluated (Chinn et al., 2020b; Duncan et al., 2018; Duschl, 2020; Kienhues et al., 2020). Science education may also be neglecting competencies that nonscientists can employ in their everyday interactions with science, such as identifying whom to trust (Bromme et al., 2018; Sinatra & Lombardi, 2020) and collaboratively using science to address complex real-world problems (Feinstein & Waddington, 2020; Tabak, 2020).

How might education mitigate this problem?

A focus on inadequate ways of knowing suggests that education should help students develop new or better ways of knowing. There are several proposals regarding how to achieve this.

Developing students’ civic media and digital literacy competencies

Many have argued that education needs to provide more opportunities for developing civic media and digital literacy skills that can help students critically analyze, evaluate, interpret, and deliberate about online information and sources (Hobbs, 2017; Journell, 2019; Kahne & Bowyer, 2017). Further, instruction in these skills needs to evolve in order to provide students with more robust tools for dealing with current communication and media (e.g., Breakstone et al., 2018; Hobbs, 2017; Sinatra & Lombardi, 2020). This involves acquiring fact-checking skills (Caulfield, 2017), a better understanding of media bias, including the role of economic incentives (Buckingham, 2019), tools for evaluating memes (Hobbs, 2017; Journell & Clark, 2019), and more. Particularly important for dealing with scientific information are second-hand evaluation competencies for judging sources of scientific claims and evidence (Brante & Strømsø, 2018; Bromme et al., 2018; Sinatra & Lombardi, 2020).

Promoting scientific literacy that addresses the public's everyday needs

An underlying assumption of the articles and commentaries in this special issue is that science education should devote more attention to ways of knowing that are relevant to the everyday lives of nonscientists. This entails rethinking which science practices are prioritized by science education. Feinstein and Waddington (2020) strongly argue that students should not try to emulate the “internalist” evaluation practices of scientists but rather learn to engage with science to address goals that have everyday personal, social, and civic relevance. For example, science education could address lay competencies for evaluating scientific claims and sources in the media (Bromme & Goldman, 2014; Duncan et al., 2018; Sinatra & Lombardi, 2020) as well as evaluating forms of scientific communication (Höttecke & Allchin, 2020). Because interactions with science often occur in social settings, it is important to foster students' abilities to engage with science in collective and collaborative ways (Feinstein & Waddington, 2020; Tabak, 2020).

Inoculating students against misleading scientific communication techniques

To achieve these goals, students need to become more familiar with the myriad ways in which science is communicated in the digital sphere (Höttecke & Allchin, 2020). This involves learning about the nature of current science communication, including misleading communication techniques and how to identify these (Cook et al., 2017; Scheufele & Krause, 2019; Sinatra & Lombardi, 2020). A promising instructional approach is “prebunking”—warning in advance of misleading communication techniques (e.g., false controversy or false balance)—which can help inoculate against the effects of misinformation (Cook et al., 2017; van der Linden et al., 2017).

Lens #2: Fallible ways of knowing

What is the problem?

Lens #1 reflects an optimistically rational view of human reasoning: Educators hope that once better epistemic ideals and processes are taught, students will be able and disposed to employ them. However, psychological research has extensively documented that human thinking often fails to conform to standards of rationality (e.g., Kahneman, 2011; Nickerson, 1998; Stanovich, 2011). Thus, a dominant alternative lens on the causes of post-truth trends is that these result from fundamental cognitive fallibilities—biases and limitations in cognitive processing that can produce erroneous judgments (Britt et al., 2019; Greene et al., 2019; Lazer et al., 2018; Lewandowsky et al., 2017; Rapp & Salovich, 2018). Although these cognitive biases and limitations are by no means new, they can be amplified by the current information environment as inaccurate and partisan information proliferates. In terms of the AIR model, on this analysis, post-truth problems occur not because people are unaware of appropriate epistemic aims, ideals, or processes, but because cognitive proclivities and failings severely limit their abilities to successfully engage with such aims, ideals, or processes.

Consider, for example, the epistemic aim of forming true beliefs. Although people are generally inclined to favor true claims over false ones (Pennycook et al., 2019), they are still highly susceptible to the illusory-truth effect, that is, to the cognitive tendency to perceive repeated (mis)information as truer than information that has not been heard before (Dechêne et al., 2010; Pennycook et al., 2018). Such fallible cognitive processes undermine achieving accuracy.

People's best efforts to evaluate claims and evidence can also be undermined by confirmation bias, which is the tendency to evaluate claims and evidence that confirm prior beliefs more favorably than claims and evidence that disconfirm prior beliefs (Chinn & Brewer, 1993; Lord et al., 1979; Nickerson, 1998). It is important to note that confirmation bias need not imply a disregard for truth, nor is it always unreliable. As Lord et al. (1979) noted, “the same bias leads most of us to be skeptical about reports of miraculous virgin births or herbal cures for cancer” (p. 2106). Nonetheless, confirmation bias also has negative repercussions on people's ability to consider alternative claims and evidence, can impair decision making, and may increase attitude polarization (Nickerson, 1998; Taber & Lodge, 2006).

Fallible thinking also results from the lazy or miserly nature of human thinking (Kahneman, 2011). Dual-process theorists have argued that people tend to engage in intuitive, quick, and automatic thinking that is overridden by more reflective and deliberate thinking only in some conditions (Kahneman, 2011; Stanovich, 2011). Some research has suggested that failure to detect misinformation can be driven more by “lazy” reasoning—that is, a disinclination to engage in analytical thinking—than by motivated reasoning (Pennycook & Rand, 2019).

Problems arising from lazy reasoning can be exacerbated by people's tendencies to overestimate the extent or depth of their own knowledge (Dunning, 2011) and understanding (Rozenblit & Keil, 2002); inflated self-estimates of knowledge and understanding may lead to overreliance on independent judgment in situations when it would be preferable to suspend judgment or defer to experts (Scharrer et al., 2017). (For further discussions of the role of cognitive biases and limitations in post-truth problems, see Britt et al., 2019; Greene et al., 2019; Lewandowsky et al., 2017; Sinatra & Lombardi, 2020.)

How might current education aggravate this problem?

One would hope that education would help people overcome their cognitive biases and limitations. However, evidence suggests that education may sometimes even exacerbate these problems. For example, people who are politically knowledgeable can be even more susceptible to confirmation bias because they have more tools with which to counterargue belief-inconsistent claims and evidence (Taber & Lodge, 2006). Similarly, greater science comprehension can sometimes result in increasingly polarized attitudes regarding some politicized science claims (Kahan et al., 2012). The fear is that science education may provide students with just enough knowledge and skills to rationalize their beliefs and increase confirmation biases, without instilling in them a sufficient awareness of the limitations of their lay knowledge and reasoning. Furthermore, Kienhues et al. (2020) suggest that science education's emphasis on independent understanding of complicated scientific issues may downplay the difficulties and limitations that nonexperts can face in understanding science.

How might education mitigate this problem?

The direct instructional approach of simply teaching students about cognitive biases has had mixed results, because people find it hard to detect biases in their own judgments and might not make the effort to do so even after being told (Greene et al., 2019). However, research suggests other ways of reducing fallible thinking.

Promoting epistemic vigilance

Because people have a tendency to assume that everyday communication is accurate (Brashier & Marsh, 2020), some have suggested that education should increase students' epistemic vigilance regarding the dangers of online misinformation and disinformation (Britt et al., 2019). Epistemic vigilance involves monitoring the trustworthiness of information and sources while taking into account the risk of being misinformed (Sperber et al., 2010). For example, epistemic vigilance regarding sources with vested interests may increase tendencies to critically evaluate data (Gierth & Bromme, 2020). Indeed, alerting people to potential inaccuracies can reduce the tendency for lazy reasoning, improve discernment of fake news, and reduce intentions to share it on social media (Pennycook et al., 2019). Recent survey data suggest that growing awareness of misinformation has led news consumers to become more discerning about what they read and share (Newman et al., 2019). Thus, cultivating students' awareness and vigilance regarding misinformation and disinformation online may promote more careful and attentive information use.

Learning to cope with cognitive biases and limitations

Instruction might also focus on enabling students to learn processes that can help people cope with biases (Greene et al., 2019). For example, students might be taught to stop and consider arguments, explanations, and evidence on both sides of an issue (e.g., McCrudden & Sparks, 2014).

Dealing with biases is likely to require developing metacognitive skills and dispositions for reflecting on one's judgments. Greene et al. (2019) proposed combining self-regulated learning (SRL) and social-psychological interventions to address cognitive biases. Sinatra and Lombardi (2020) focus on the potentially biasing effect of students' plausibility judgments on their evaluations of the credibility of sources and claims. To address this challenge, Sinatra and Lombardi (2020) propose that students be taught to metacognitively reappraise their plausibility judgments—to step back and ask themselves “Is this explanation plausible, and how do I know?” (p. 120, emphasis in the original).

Lens #3: Not caring about truth (enough)

What is the problem?

Both lens #1 and lens #2 assume that people are basically motivated to acquire accurate information, so that if they are provided with better tools to evaluate information and to overcome the negative consequences of fallible reasoning, they will respond by attempting to think in more rational and impartial ways. However, what if the problem is that people sometimes do not care enough about accuracy and impartiality? An alternative explanation of post-truth trends is that these stem from a waning commitment to epistemic aims, ideals, and processes. People may not care enough about truth.

This waning commitment to truth can take two distinguishable forms. One is that people may sometimes care more deeply about personal, social, economic, or political goals and values than about epistemic aims, ideals, and processes. As McIntyre (2018) put it, “post-truth is not so much a claim that truth does not exist as that facts are subordinate to our political point of view” (p. 11, original emphases). Viewed in this way, post-truth problems occur because people’s social, economic, political, and other commitments can outweigh their commitment to epistemic aims, to rational epistemic ideals, and to the use of reliable epistemic processes and sources. For example, the phenomenon that prominent political leaders do not pay a price for uttering mistruths may reflect a diminished regard for the value of truth in the public sphere (Illouz, 2019; Swire et al., 2017). A disregard for truth may also be apparent in everyday communication. In a recent Amazon Mechanical Turk study by Pennycook et al. (2019), about one fifth of the respondents reported that they did not think it was important to share only accurate news on social media. In some situations, accuracy may be less important than goals such as pleasing followers and friends, signaling group membership, or promoting political interests.

A second form of diminished caring about truth can arise when people “give up” on seeking accurate information because they do not believe that they can find it in an information environment inundated with misinformation and disinformation. According to Expectancy-Value Theory (Eccles & Wigfield, 2002), the subjective value of engaging in a task can be reduced by lowered expectancies of success and by greater relative costs. This is exactly what happens in the post-truth condition: The confusing information environment makes it more difficult to find out what is true, and the costs of trying to do so are much higher, thereby reducing the value of pursuing the truth. Confusion may be deliberately sown by economic and political agents. For example, energy companies have worked to create a misleading impression that scientific conclusions about the causes of global warming are controversial and uncertain (Oreskes & Conway, 2010). Political leaders may disseminate inaccurate and misleading claims because this helps them garner power or distract the public from unpopular actions (Lewandowsky et al., 2017; McIntyre, 2018). When even seemingly mundane facts are disputed, people may conclude that there is little value in pursuing the truth, because it is too difficult to attain it.
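To make this dynamic concrete, the following is a minimal schematic of the expectancy-value relationship. It is our simplified sketch of a common presentation of the theory, not a formula given by Eccles and Wigfield (2002) or used elsewhere in this article; the symbols M (motivation), E (expectancy of success), V (subjective task value), and C (perceived cost) are labels we introduce here:

\[
M \;\propto\; E \times V, \qquad V = V_{\mathrm{attainment}} + V_{\mathrm{intrinsic}} + V_{\mathrm{utility}} - C
\]

On this rendering, a confusing information environment lowers E (finding out what is true seems less likely to succeed) and raises C (verification becomes more effortful), so the motivation M to pursue accuracy declines even when the positive value components remain unchanged.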

Whether people do not care much about the truth because they think that it is impossible to achieve, or because they care more about other things, the result might be the same: not making the effort to validate claims by scrutinizing sources, corroborating information, and considering alternative arguments and evidence.

How might current education aggravate this problem?

Current education systems might not be doing enough to foster caring about truth and knowledge. Lapsley and Chaloner (2020) suggest that schools and higher-education institutions may be contributing to this problem by putting a stronger emphasis on grades and personal advancement than on desiring knowledge and truth. School culture might also dampen students' inclinations to validate information and seek justification by creating sheltered epistemic environments in which students are not provided with sufficient opportunities for encountering disputed, unvetted, or unreliable information (Chinn et al., 2020b) and have too few opportunities to engage in authentic scientific inquiry (Chinn & Malhotra, 2002; Duschl, 2020). Thus, students might not have enough opportunities to develop a sense of the importance of seeking truth and of the urgency of critically examining claims and sources.

How might education mitigate this problem?

Cultivating dispositional intellectual virtues

Lapsley and Chaloner (2020) argue that science education should do more to cultivate dispositional intellectual virtues, that is, habits of mind that dispose people to good thinking. Lapsley and Chaloner list several important intellectual virtues that education should promote, including intellectual humility, honesty, curiosity, open-mindedness, perseverance, fair-mindedness, impartiality, and objectivity. They discuss how such virtues might be fostered through direct instruction, modeling, practice, self-assessment, metacognitive instruction, and supporting relatedness, competence, and autonomy. Other educators have discussed additional approaches to fostering dispositional intellectual virtues (e.g., Baehr, 2016; Sharon & Baram-Tsabari, 2020).

Fostering students' intellectual identities and agency

A critical question is how to foster students' motivation so that they will be inclined to pursue epistemic aims, adopt virtuous habits of mind, and engage in reliable epistemic processes. Lapsley and Chaloner (2020) posit that fostering students' science identity is vital for motivating intellectual virtues and bridging the gap between knowing the right thing to do and doing it. Students may be more likely to adopt epistemic practices when they see these as part of their identity (Walsh & Tsurusaki, 2018) or when they can identify with role models who engage in such practices (Darner, 2019). Cultivating students' intellectual agency in the classroom may be essential for developing a sense of commitment to epistemic practices (Chinn et al., 2020a; Sinatra & Lombardi, 2020). In a similar vein, Tabak (2020) calls for greater attention to students' appropriation of epistemic practices.

Connecting epistemic pursuits to what matters to people

Rather than ignoring students' everyday experiences and socio-cultural affinities in the science classroom, it might be wiser to think of these as resources for deepening engagement with science and fostering caring about truth and knowledge. Feinstein and Waddington (2020) argue that pursuing truth is less important than adapting science to make sense in people's lives and enabling them to engage with science in personally and socially meaningful ways. Chinn et al. (2020a) propose that to motivate understanding and evaluation of epistemic aims, ideals, and processes, instruction should build on students' personal and social experiences and encounters with science and media in their everyday lives.

Lens #4: Disagreeing about how to know

What is the problem?

The foregoing lenses share the assumption that post-truth problems arise from some sort of failure to engage in appropriate ways of knowing—through using inappropriate ideals or processes, or through not caring about epistemic aims sufficiently. But what if people fundamentally disagree about which ways of knowing are better than others? And what if students do not buy into the ways of knowing that educators wish to promote because their communities value other ways of knowing?

A fourth explanation of the origins of post-truth trends is that these reflect an increased prevalence of “alternative epistemologies” (Lewandowsky et al., 2017) or “deep epistemic disagreements” (Chinn et al., 2020a). For example, different knowledge communities can hold different ideals for evaluating news sources and news, and they can disagree about which news-producing processes are most reliable. What is fake news according to the ways of knowing of one epistemic community may be accurate knowledge according to the ways of knowing of another community, and vice versa (Druckman & McGrath, 2019). Viewed from this lens, the post-truth condition emerges from the loss of a shared epistemology and the increasing visibility and impact of competing epistemologies.

For example, Chinn et al. (2020a) describe how public discourse about vaccine safety and climate change can involve disagreements about what counts as valuable knowledge, about which criteria should be used to evaluate scientific claims and evidence, and about the reliability and acceptability of various methodologies and sources of scientific knowledge. These alternative epistemologies can result in disagreements about what is true. Some alternative epistemologies may contest the epistemic authority of science because they see science as tied to strong financial interests or because they see science as geared toward promoting particular ideological values and agendas that are in conflict with their own values and worldviews (Hansson, 2017; Scheufele & Krause, 2019; Waisbord, 2018).

A danger in misconstruing differences in epistemology as inadequacies in knowledge or reasoning skills is that this may lead to intellectual arrogance and to ignoring important social and cultural divides in how people know (Latour, 2018; Lynch, 2019). As Latour (2018, p. 25) has critically commented:

It is not a matter of learning how to repair cognitive deficiencies, but rather of how to live in the same world, share the same culture, face up to the same stakes, perceive a landscape that can be explored in concert. Here we find the habitual vice of epistemology, which consists in attributing to intellectual deficits something that is quite simply a deficit in shared practice.

Furthermore, in democratic societies, as well as in science, there can be multiple legitimate ways of knowing. (This of course does not mean that all ways of knowing are similarly reliable.) Hence, ignoring disagreements about how to know creates a risk of trying to impose an epistemic hierarchy based on authority, rather than engaging in deliberation and debate about how to know (boyd, 2018; Marres, 2018). Disagreements about ways of knowing cannot be addressed by simply dismissing the other side as wrong. But neither can they be addressed by saying that anything goes.

How might current education aggravate this problem?

Current science education frequently treats the epistemology of science too simplistically, often focusing on simple observations or on controlling variables in simple experiments (Chinn & Malhotra, 2002; Duncan et al., 2018), far from the complexities of the studies that get debated in online forums. Further, too little attention is paid to the social practices of critique and construction that contribute to the reliability of science (Ford, 2008; Höttecke & Allchin, 2020; Kienhues et al., 2020).

In addition, students may not have sufficient opportunities to reflect on the justification of epistemic practices and to consider when different ways of knowing can have merit and when they might fail to produce reliable knowledge (Chinn et al., 2020a). Science education has also been critiqued for ignoring the epistemologies of students' home communities and presenting Western science as a universal way of knowing while failing to consider its socio-cultural underpinnings or to emphasize that science itself embraces great diversity in ways of knowing (Bang & Medin, 2010; National Research Council, 2009).

How might education mitigate this problem?

Various researchers, including the writers of this special issue, take different approaches to addressing this problem.

Reestablishing the epistemic authority of science

Kienhues et al. (2020) seek to reestablish the epistemic authority of science in the face of its deniers by promoting better understanding of scientific practice. They argue that people’s understanding of science as a rational, evidence-based practice is currently undermined by misunderstandings and misperceptions of the value-laden, social, and tentative nature of scientific inquiry. By helping students gain a more realistic and nuanced understanding of these aspects, and of how scientific practice nonetheless enables gradual development of reliable knowledge, Kienhues et al. hope that students might be better able to develop informed (rather than blind) trust in science.

Learning to discuss and evaluate disagreements about how to know

Chinn et al. (2020a) take a different pedagogical approach to addressing disagreements about how to know. They propose that students need to learn to discuss and negotiate their disagreements—to reflect on the justifications for different ways of knowing by collaboratively examining reasons and evidence for preferring particular epistemic aims, ideals, and processes over others. They argue that engaging in explorations into ways of knowing can promote better understanding of the reliability of scientific ways of knowing, a better grasp of when different ways of knowing are appropriate and why, as well as greater competence in socially negotiating disagreements about how to know.

Acknowledging and coordinating multiple epistemologies

An alternative suggestion for addressing epistemic tensions is to give a voice to multiple epistemologies and to help students and teachers navigate these epistemologies. For example, Bang and Medin (2010) argued that science education should acknowledge the epistemologies of students' indigenous communities and integrate these in the science curriculum, alongside Western ways of knowing. Taking a different approach, Hawley and Sinatra (2019) addressed Christian teachers' anxieties about teaching evolution by discussing the differences between religious and scientific ways of knowing and considering when each of these might apply.

On the value of four lenses and the risks of choosing just one

Educational and psychological writing addressing the post-truth condition often (although by no means always) predominantly employs one of the four lenses. We believe that each of these lenses warrants dedicated theoretical and empirical attention. Indeed, because of the scope and complexity of these lenses, addressing them adequately will require in-depth exploration of each lens. However, in this section, we will argue that an effective educational response to the post-truth condition must ultimately take all four lenses into account.

As we documented throughout the previous sections, the explanations provided by each of the four lenses are supported by diverse empirical evidence. Thus, there are empirical grounds to consider the factors described by each of the lenses as contributing to the persistence and pervasiveness of post-truth trends. It is possible that, depending on individuals, topics, tasks, and contexts, different lenses might come to the fore. For example, not caring enough about truth may be especially problematic in contexts that evoke strong ideological or economic motivations. Not knowing how to know could be especially problematic for people who want to know what is true but are unaware of the many ways in which deceptive information can be transmitted online. However, across people's diverse encounters with scientific information, all four factors are likely to come into play at some point or other.

Accordingly, there are good educational reasons to take all four lenses into account. Indeed, we would argue that educational reforms that do not do so will have limited impact. This is because the complexity of the post-truth condition, across diverse people, topics, and settings, cannot be fully grasped by any single lens. Although there are tensions between the lenses, these tensions can protect against overly narrow solutions (Sfard, 1998).[3]

To illustrate, consider the prevalent assumption that post-truth problems can be solved by promoting students' media and digital literacy skills (e.g., Breakstone et al., 2018; Hobbs, 2017; Lewandowsky et al., 2017). Various scholars have expressed concerns about this approach (boyd, 2018; Lazer et al., 2018; Sinatra & Lombardi, 2020; Vraga & Bode, 2017). On our analysis, all of these concerns stem from perceptions that a narrow focus on fostering students' skills (lens #1), without taking additional lenses into account, could be ineffective and might even lead to undesirable outcomes.

To start, arguments have been made that media and digital evaluation skills are susceptible to evaluation biases (lens #2) and that instruction might even exacerbate these biases by providing students with tools for rejecting uncongenial information (boyd, 2018; Sinatra & Lombardi, 2020). Furthermore, teaching students media and digital literacy skills does not ensure that those skills will be used in actual media environments (Vraga & Bode, 2017), because too little attention might be devoted to cultivating the value of pursuing accuracy (lens #3). A focus on fake news and misleading communication may also unintentionally induce wholesale skepticism about the possibility of finding reliable sources of knowledge (boyd, 2018; Lazer et al., 2018). Finally, media and digital literacy skills instruction may be perceived as promoting an “elitist” or “liberal” epistemology that conflicts with the epistemologies of students' communities (lens #4; boyd, 2018; Vraga & Bode, 2017). This “assertion of authority over epistemology” could potentially backfire and lead to a rejection of media literacy and critical thinking instruction (boyd, 2018).

We have used the challenge of fostering media and digital literacy skills to illustrate the value of applying multiple lenses. A similar critical analysis can be applied to the other lenses as well. For example, learning how to cope with cognitive biases or how to respond to disagreements about how to know might be futile if students do not care enough about truth and justification. Fortunately, by applying multiple lenses to post-truth problems, the articles and commentaries in this special issue, individually and collectively, suggest new paths for improving digital and media literacy instruction, as well as for rethinking how science and the nature of science are taught in school.

The objective of employing all four lenses is consistent with our previous arguments that epistemic growth can be promoted only by taking into account the multi-faceted nature of epistemic performance (Barzilai & Chinn, 2018). Specifically, we have argued that efforts to promote epistemic growth should address students' abilities and motivations to adaptively engage with epistemic aims, ideals, and processes, in varied social configurations, and to metacognitively understand and regulate their epistemic performance (Barzilai & Chinn, 2018). Correspondingly, addressing post-truth challenges will require fostering students' abilities to pursue truth and accuracy in adaptive ways, their dispositions to care about doing so, their metacognitive understanding and regulation of their own and others' fallible ways of knowing, and their capacities to engage in such pursuits in concert with other people, both lay and expert, despite possible disagreements about how to know.

Moving forward

One way to advance research on these issues is to continue exploring the instructional proposals suggested by each of the lenses. Lens #1 has received the most empirical attention so far, but the rapidly changing information environment continues to raise new educational challenges. The other lenses have received less empirical attention, and there is a pressing need to develop and test effective instruction guided by each lens.

A second way to move forward is to start examining approaches that address the intersections of multiple lenses. For example, Darner (Citation2019) proposed an instructional approach that attempts to foster students' abilities to evaluate scientific information while simultaneously addressing their motivation to do so. Elsewhere, we have proposed five instructional design principles for promoting successful epistemic performance in a post-truth world that jointly address all four underlying causes of post-truth problems (Chinn et al., Citation2020b). For instance, we discuss how designing epistemically authentic learning environments may help foster evaluation skills while developing a commitment to accuracy (Chinn et al., Citation2020b).

In all of these endeavors, it may be crucial to situate instruction in students' current social, cultural, and political contexts. Post-truth problems emerge in these contexts and often derive their force from conflicts among social, cultural, and moral goals and values. Rather than ignoring these contexts, instruction may need to be designed in ways that address social and political issues and encourage civic participation (Feinstein & Waddington, Citation2020). Doing so may require greater collaboration among science, social science, and civics educators to design interdisciplinary instruction that can address the post-truth condition. Thus, another important goal for future research is to develop productive ways of engaging teachers, teacher educators, and policy makers in interdisciplinary and disciplinary efforts to promote better thinking about scientific and socio-scientific issues in a post-truth world.

Acknowledgments

This article benefited from the thoughtful comments of Na'ama Av-Shalom, Rainer Bromme, Ravit Duncan, Mikko Kainulainen, Dorothe Kienhues, Toshio Mochizuki, Danielle Murphy, Alexander Renkl, Gale Sinatra, Lars Sorensen, Kathryn Wentzel, Randi Zimmerman, and an anonymous reviewer. We are also very grateful to Kathryn Wentzel for her support in preparing this special issue.

Notes

1 Although we use “post-truth” not in its literal sense, we henceforth drop the quotation marks for ease of reading.

2 Bullshit is defined by Frankfurt (Citation1986) as a type of discourse that is neither true nor false but rather reflects an indifference to truth: It is unconcerned with correctly representing the way things are.

3 The title of this section and our arguments are inspired by Anna Sfard's (Citation1998) seminal article, “On two metaphors for learning and the dangers of choosing just one.”

References

  • AAAS. (2018). Perceptions of science in America. American Academy of Arts & Sciences. https://www.amacad.org/publication/perceptions-science-america
  • Allcott, H., & Gentzkow, M. (2017). Social media and fake news in the 2016 election. Journal of Economic Perspectives, 31(2), 211–236. https://doi.org/10.1257/jep.31.2.211
  • Baehr, J. (2016). Intellectual virtues and education: Essays in applied virtue epistemology. Routledge.
  • Bakshy, E., Messing, S., & Adamic, L. A. (2015). Exposure to ideologically diverse news and opinion on Facebook. Science, 348(6239), 1130–1132. https://doi.org/10.1126/science.aaa1160
  • Bang, M., & Medin, D. (2010). Cultural processes in science education: Supporting the navigation of multiple epistemologies. Science Education, 94(6), 1008–1026. https://doi.org/10.1002/sce.20392
  • Barzilai, S., & Chinn, C. A. (2018). On the goals of epistemic education: Promoting apt epistemic performance. Journal of the Learning Sciences, 27(3), 353–389. https://doi.org/10.1080/10508406.2017.1392968
  • Benkler, Y., Faris, R., & Roberts, H. (2018). Network propaganda: Manipulation, disinformation, and radicalization in American politics. Oxford University Press.
  • BonJour, L. (2009). Epistemology: Classic problems and contemporary responses (2nd ed.). Rowman & Littlefield Publishers.
  • Bourget, D., & Chalmers, D. J. (2014). What do philosophers believe? Philosophical Studies, 170(3), 465–500. https://doi.org/10.1007/s11098-013-0259-7
  • boyd, D. (2018). You think you want media literacy… do you? Data & Society: Points. https://points.datasociety.net/you-think-you-want-media-literacy-do-you-7cad6af18ec2
  • Brante, E. W., & Strømsø, H. I. (2018). Sourcing in text comprehension: A review of interventions targeting sourcing skills. Educational Psychology Review, 30(3), 773–799. https://doi.org/10.1007/s10648-017-9421-7
  • Brashier, N. M., & Marsh, E. J. (2020). Judging truth. Annual Review of Psychology, 71(1), 499–515. https://doi.org/10.1146/annurev-psych-010419-050807
  • Bråten, I., Stadtler, M., & Salmerón, L. (2018). The role of sourcing in discourse comprehension. In M. F. Schober, D. N. Rapp, & M. A. Britt (Eds.), The Routledge handbook of discourse processes (2nd ed., pp. 141–166). Routledge.
  • Breakstone, J., McGrew, S., Smith, M., Ortega, T., & Wineburg, S. (2018). Why we need a new approach to teaching digital literacy. Phi Delta Kappan, 99(6), 27–32. https://doi.org/10.1177/0031721718762419
  • Brennan, J. S., Simon, F. M., Howard, P. N., & Nielsen, R. K. (2020). Types, sources, and claims of COVID-19 misinformation. Reuters Institute for the Study of Journalism. https://reutersinstitute.politics.ox.ac.uk/types-sources-and-claims-covid-19-misinformation
  • Britt, M. A., Rouet, J.-F., Blaum, D., & Millis, K. (2019). A reasoned approach to dealing with fake news. Policy Insights from the Behavioral and Brain Sciences, 6(1), 94–101. https://doi.org/10.1177/2372732218814855
  • Bromme, R., & Goldman, S. R. (2014). The public's bounded understanding of science. Educational Psychologist, 49(2), 59–69. https://doi.org/10.1080/00461520.2014.921572
  • Bromme, R., Stadtler, M., & Scharrer, L. (2018). The provenance of certainty: Multiple source use and the public engagement with science. In J. L. G. Braasch, I. Bråten, & M. T. McCrudden (Eds.), Handbook of multiple source use (pp. 269–284). Routledge.
  • Buckingham, D. (2019). Teaching media in a ‘post-truth’ age: Fake news, media bias and the challenge for media/digital literacy education. Cultura y Educación, 31(2), 213–231. https://doi.org/10.1080/11356405.2019.1603814
  • Caulfield, M. A. (2017). Web literacy for student fact checkers. https://webliteracy.pressbooks.com/
  • Chinn, C. A., Barzilai, S., & Duncan, R. G. (2020a/this issue). Disagreeing about how to know: The instructional value of explorations into knowing. Educational Psychologist, 55(3), 167–180. https://doi.org/10.1080/00461520.2020.1786387
  • Chinn, C. A., Barzilai, S., & Duncan, R. G. (2020b). Education for a “post-truth” world: New directions for research and practice. Educational Researcher. Advance online publication. https://doi.org/10.3102/0013189X20940683
  • Chinn, C. A., & Brewer, W. F. (1993). The role of anomalous data in knowledge acquisition: A theoretical framework and implications for science instruction. Review of Educational Research, 63(1), 1–49. https://doi.org/10.2307/1170558
  • Chinn, C. A., & Malhotra, B. A. (2002). Epistemologically authentic inquiry in schools: A theoretical framework for evaluating inquiry tasks. Science Education, 86(2), 175–218. https://doi.org/10.1002/sce.10001
  • Chinn, C. A., & Rinehart, R. W. (2016). Epistemic cognition and philosophy: Developing a new framework for epistemic cognition. In J. A. Greene, W. A. Sandoval, & I. Bråten (Eds.), Handbook of epistemic cognition (pp. 460–478). Routledge.
  • Chinn, C. A., Rinehart, R. W., & Buckland, L. A. (2014). Epistemic cognition and evaluating information: Applying the AIR model of epistemic cognition. In D. Rapp & J. Braasch (Eds.), Processing inaccurate information (pp. 425–454). MIT Press.
  • Cook, J., Lewandowsky, S., & Ecker, U. K. H. (2017). Neutralizing misinformation through inoculation: Exposing misleading argumentation techniques reduces their influence. PLoS One, 12(5), e0175799. https://doi.org/10.1371/journal.pone.0175799
  • Darner, R. (2019). How can educators confront science denial? Educational Researcher, 48(4), 229–238. https://doi.org/10.3102/0013189X19849415
  • de Ridder, J. (2020). How many scientists does it take to have knowledge? In K. McCain & K. Kampourakis (Eds.), What is scientific knowledge? (pp. 3–17). Routledge.
  • Dechêne, A., Stahl, C., Hansen, J., & Wänke, M. (2010). The truth about the truth: A meta-analytic review of the truth effect. Personality and Social Psychology Review, 14(2), 238–257. https://doi.org/10.1177/1088868309352251.
  • Douglas, H. (2007). Rejecting the ideal of value-free science. In H. Kincaid, J. Dupré, & A. Wylie (Eds.), Value-free science? Ideals and illusions (pp. 120–139). Oxford University Press.
  • Druckman, J. N., & McGrath, M. C. (2019). The evidence for motivated reasoning in climate change preference formation. Nature Climate Change, 9(2), 111–119. https://doi.org/10.1038/s41558-018-0360-1.
  • Duncan, R. G., Chinn, C. A., & Barzilai, S. (2018). Grasp of evidence: Problematizing and expanding the next generation science standards’ conceptualization of evidence. Journal of Research in Science Teaching, 55(7), 907–937. https://doi.org/10.1002/tea.21468.
  • Dunning, D. (2011). The Dunning–Kruger effect: On being ignorant of one's own ignorance. In J. M. Olson & M. P. Zanna (Eds.), Advances in experimental social psychology (vol. 44, pp. 247–296). Academic Press.
  • Duschl, R. A. (2020/this issue). Practical reasoning and decision making in science: Struggles for truth. Educational Psychologist, 55(3), 187–192. https://doi.org/10.1080/00461520.2020.1784735
  • Eccles, J. S., & Wigfield, A. (2002). Motivational beliefs, values, and goals. Annual Review of Psychology, 53(1), 109–132. https://doi.org/10.1146/annurev.psych.53.100901.135153
  • European Commission. (2018). Flash Eurobarometer 464: Fake news and disinformation online. https://op.europa.eu/en/publication-detail/-/publication/2d79b85a-4cea-11e8-be1d-01aa75ed71a1/language-en.
  • Feinstein, N. W., & Waddington, D. I. (2020/this issue). Individual truth judgments or purposeful, collective sensemaking? Rethinking science education's response to the post-truth era. Educational Psychologist, 55(3), 155–166. https://doi.org/10.1080/00461520.2020.1780130
  • Flaxman, S., Goel, S., & Rao, J. M. (2016). Filter bubbles, echo chambers, and online news consumption. Public Opinion Quarterly, 80(S1), 298–320. https://doi.org/10.1093/poq/nfw006.
  • Ford, M. (2008). ‘Grasp of practice’ as a reasoning resource for inquiry and nature of science understanding. Science & Education, 17(2–3), 147–177. https://doi.org/10.1007/s11191-006-9045-7.
  • Frankfurt, H. G. (1986). On bullshit. Princeton University Press.
  • Frankfurt, H. G. (2006). On truth. Alfred A. Knopf.
  • Funk, C., Hefferon, M., Kennedy, B., & Johnson, C. (2019). Trust and mistrust in Americans’ views of scientific experts. Pew Research Center. https://www.pewresearch.org/science/2019/08/02/trust-and-mistrust-in-americans-views-of-scientific-experts/.
  • Funk, C., Rainie, L., Smith, A., Olmstead, K., Duggan, M., & Page, D. (2015). Public and scientists’ views on science and society. Pew Research Center. https://www.pewresearch.org/science/2015/01/29/public-and-scientists-views-on-science-and-society/.
  • Gauchat, G. (2012). Politicization of science in the public sphere: A study of public trust in the United States, 1974 to 2010. American Sociological Review, 77(2), 167–187. https://doi.org/10.1177/0003122412438225.
  • Gierth, L., & Bromme, R. (2020). Beware of vested interests: Epistemic vigilance improves reasoning about scientific evidence (for some people). PLoS One, 15(4), e0231387. https://doi.org/10.1371/journal.pone.0231387.
  • Glanzberg, M. (2018). Truth. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy. https://plato.stanford.edu/entries/truth/
  • Godfrey-Smith, P. (2003). Theory and reality: An introduction to the philosophy of science. University of Chicago Press.
  • Greene, J. A., Cartiff, B. M., Duke, R. F., & Deekens, V. M. (2019). A nation of curators: Educating students to be critical consumers and users of online information. In P. Kendeou, D. H. Robinson, & M. T. McCrudden (Eds.), Misinformation and fake news in education (pp. 187–206). Information Age Publishing.
  • Hansson, S. O. (2017). Science denial as a form of pseudoscience. Studies in History and Philosophy of Science, 63, 39–47. https://doi.org/10.1016/j.shpsa.2017.05.002.
  • Hawley, P. H., & Sinatra, G. M. (2019). Declawing the dinosaurs in the science classroom: Reducing Christian teachers’ anxiety and increasing their efficacy for teaching evolution. Journal of Research in Science Teaching, 56(4), 375–401. https://doi.org/10.1002/tea.21479.
  • Hendriks, F., Kienhues, D., & Bromme, R. (2020). Replication crisis = trust crisis? The effect of successful vs failed replications on laypeople’s trust in researchers and research. Public Understanding of Science, 29(3), 270–288. https://doi.org/10.1177/0963662520902383.
  • Hobbs, R. (2010). Digital and media literacy: A plan of action. Knight Commission on the Information Needs of Communities in a Democracy. The Aspen Institute.
  • Hobbs, R. (2017). Teaching and learning in a post-truth world. Educational Leadership, 75(3), 26–31.
  • Horwich, P. (2006). The value of truth. Nous, 40(2), 347–360. https://doi.org/10.1111/j.0029-4624.2006.00613.x
  • Höttecke, D., & Allchin, D. (2020). Re-conceptualizing nature-of-science education in the age of social media. Science Education, 104(4), 641–666. https://doi.org/10.1002/sce.21575.
  • Illouz, E. (2019). A brief history of bullshit: Why we've learned to ignore truth. Haaretz. https://www.haaretz.com/israel-news/.premium.MAGAZINE-a-brief-history-of-bullshit-why-we-ve-learned-to-ignore-truth-1.7837206
  • IPCC. (2018). Summary for policymakers. In V. Masson-Delmotte, P. Zhai, H.-O. Pörtner, D. Roberts, J. Skea, P. R. Shukla, A. Pirani, W. Moufouma-Okia, C. Péan, R. Pidcock, S. Connors, J. B. R. Matthews, Y. Chen, X. Zhou, M. I. Gomis, E. Lonnoy, T. Maycock, M. Tignor, & T. Waterfield (Eds.), Global warming of 1.5 °C: An IPCC special report (pp. 1–24). World Meteorological Organization.
  • Journell, W. (Ed.). (2019). Unpacking fake news: An educator's guide to navigating the media with students. Teachers College Press.
  • Journell, W., & Clark, C. H. (2019). Political memes and the limits of media literacy. In W. Journell (Ed.), Unpacking fake news: An educator's guide to navigating the media with students (pp. 109–125). Teachers College Press.
  • Kahan, D. M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L. L., Braman, D., & Mandel, G. (2012). The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change, 2(10), 732–735. https://doi.org/10.1038/nclimate1547.
  • Kahne, J., & Bowyer, B. (2017). Educating for democracy in a partisan age: Confronting the challenges of motivated reasoning and misinformation. American Educational Research Journal, 54(1), 3–34. https://doi.org/10.3102/0002831216679817.
  • Kahneman, D. (2011). Thinking, fast and slow. Penguin Books.
  • Kavanagh, J., & Rich, M. D. (2018). Truth decay: An initial exploration of the diminishing role of facts and analysis in American public life. Rand Corporation.
  • Kendeou, P., Robinson, D. H., & McCrudden, M. T. (Eds.). (2019). Misinformation and fake news in education. Information Age Publishing.
  • Kienhues, D., Jucks, R., & Bromme, R. (2020/this issue). Sealing the gateways for post-truthism: Re-establishing the epistemic authority of science. Educational Psychologist, 55(3), 144–154. https://doi.org/10.1080/00461520.2020.1784012
  • Krause, N. M., Brossard, D., Scheufele, D. A., Xenos, M. A., & Franke, K. (2019). Trends: Americans’ trust in science and scientists. Public Opinion Quarterly, 83(4), 817–836. https://doi.org/10.1093/poq/nfz041
  • Krief, A., Hopf, H., Mehta, G., & Matlin, S. A. (2017). Science in the post-truth era. Current Science, 112(11), 2173–2174.
  • Lapsley, D., & Chaloner, D. (2020/this issue). Post-truth and science identity: A virtue-based approach to science education. Educational Psychologist, 55(3), 132–143. https://doi.org/10.1080/00461520.2020.1778480
  • Latour, B. (2018). Down to earth: Politics in the new climatic regime. Polity Press.
  • Lazer, D. M. J., Baum, M. A., Benkler, Y., Berinsky, A. J., Greenhill, K. M., Menczer, F., Metzger, M. J., Nyhan, B., Pennycook, G., Rothschild, D., Schudson, M., Sloman, S. A., Sunstein, C. R., Thorson, E. A., Watts, D. J., & Zittrain, J. L. (2018). The science of fake news. Science, 359(6380), 1094–1096. https://doi.org/10.1126/science.aao2998.
  • Lewandowsky, S., Ecker, U. K. H., & Cook, J. (2017). Beyond misinformation: Understanding and coping with the “post-truth” era. Journal of Applied Research in Memory and Cognition, 6(4), 353–369. https://doi.org/10.1016/j.jarmac.2017.07.008.
  • Longino, H. E. (1990). Science as social knowledge: Values and objectivity in scientific inquiry. Princeton University Press.
  • Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37(11), 2098–2109. https://doi.org/10.1037/0022-3514.37.11.2098.
  • Lynch, M. P. (2004). True to life: Why truth matters. MIT Press.
  • Lynch, M. P. (2019). Know-it-all society: Truth and arrogance in political culture. Liveright Publishing.
  • Macedo-Rouet, M., Potocki, A., Scharrer, L., Ros, C., Stadtler, M., Salmerón, L., & Rouet, J.-F. (2019). How good is this page? Benefits and limits of prompting on adolescents’ evaluation of web information quality. Reading Research Quarterly, 54(3), 299–321. https://doi.org/10.1002/rrq.241.
  • Marres, N. (2018). Why we can't have our facts back. Engaging Science, Technology, and Society, 4, 423–443. https://doi.org/10.17351/ests2018.188.
  • McCrudden, M. T., & Sparks, P. C. (2014). Exploring the effect of task instructions on topic beliefs and topic belief justifications: A mixed methods study. Contemporary Educational Psychology, 39(1), 1–11. https://doi.org/10.1016/j.cedpsych.2013.10.001.
  • McIntyre, L. (2015). Respecting truth: Willful ignorance in the Internet age. Routledge.
  • McIntyre, L. (2018). Post-truth. MIT Press.
  • National Research Council. (2009). Learning science in informal environments: People, places, and pursuits. National Academies Press.
  • Newman, N., Fletcher, R., Kalogeropoulos, A., Levy, D., & Nielsen, R. K. (2017). Reuters Institute digital news report 2017. Reuters Institute for the Study of Journalism. https://reutersinstitute.politics.ox.ac.uk/sites/default/files/Digital%20News%20Report%202017%20web_0.pdf.
  • Newman, N., Fletcher, R., Kalogeropoulos, A., & Nielsen, R. K. (2019). Reuters Institute digital news report 2019. Reuters Institute for the Study of Journalism. https://reutersinstitute.politics.ox.ac.uk/sites/default/files/2019-06/DNR_2019_FINAL_0.pdf.
  • Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175–220. https://doi.org/10.1037/1089-2680.2.2.175.
  • Oreskes, N. (2019). Why trust science? Princeton University Press.
  • Oreskes, N., & Conway, E. M. (2010). Merchants of doubt: How a handful of scientists obscured the truth on issues from tobacco smoke to global warming. Bloomsbury Publishing.
  • Oxford English Dictionary. (n.d.). Post-truth. In Oxford English Dictionary. Oxford University Press. https://www.oed.com/view/Entry/58609044?redirectedFrom=post-truth
  • Pennycook, G., Cannon, T. D., & Rand, D. G. (2018). Prior exposure increases perceived accuracy of fake news. Journal of Experimental Psychology General, 147(12), 1865–1880. https://doi.org/10.1037/xge0000465.
  • Pennycook, G., Epstein, Z., Mosleh, M., Arechar, A. A., Eckles, D., & Rand, D. G. (2019). Understanding and reducing the spread of misinformation online. PsyArXiv Preprints. https://doi.org/10.31234/osf.io/3n9u8
  • Pennycook, G., & Rand, D. G. (2019). Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. Cognition, 188, 39–50. https://doi.org/10.1016/j.cognition.2018.06.011.
  • Prado, C. G. (Ed.). (2018). America’s post-truth phenomenon: When feelings and opinions trump facts and evidence. Praeger.
  • Pritchard, D., Turri, J., & Carter, J. A. (2018). The value of knowledge. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy. https://plato.stanford.edu/entries/knowledge-value/
  • Rapp, D. N., & Salovich, N. A. (2018). Can’t we just disregard fake news? The consequences of exposure to inaccurate information. Policy Insights from the Behavioral and Brain Sciences, 5(2), 232–239. https://doi.org/10.1177/2372732218785193.
  • Rozenblit, L., & Keil, F. (2002). The misunderstood limits of folk science: An illusion of explanatory depth. Cognitive Science, 26(5), 521–562. https://doi.org/10.1207/s15516709cog2605_1.
  • Scharrer, L., Rupieper, Y., Stadtler, M., & Bromme, R. (2017). When science becomes too easy: Science popularization inclines laypeople to underrate their dependence on experts. Public Understanding of Science, 26(8), 1003–1018. https://doi.org/10.1177/0963662516680311.
  • Scheufele, D. A., & Krause, N. M. (2019). Science audiences, misinformation, and fake news. Proceedings of the National Academy of Sciences, 116(16), 7662–7669. https://doi.org/10.1073/pnas.1805871115.
  • Sfard, A. (1998). On two metaphors for learning and the dangers of choosing just one. Educational Researcher, 27(2), 4–13. https://doi.org/10.3102/0013189x027002004.
  • Sharon, A. J., & Baram-Tsabari, A. (2020). Can science literacy help individuals identify misinformation in everyday life? Science Education. Advance online publication. https://doi.org/10.1002/sce.21581
  • Shearer, E., & Klein, H. (2019). Americans are wary of the role social media sites play in delivering the news. Pew Research Center. https://www.journalism.org/2019/10/02/americans-are-wary-of-the-role-social-media-sites-play-in-delivering-the-news/
  • Sinatra, G. M., & Lombardi, D. (2020/this issue). Evaluating sources of scientific evidence and claims in the post-truth era may require plausibility judgments. Educational Psychologist, 55(3), 120–131. https://doi.org/10.1080/00461520.2020.1730181
  • Sperber, D., Clément, F., Heintz, C., Mascaro, O., Mercier, H., Origgi, G., & Wilson, D. (2010). Epistemic vigilance. Mind & Language, 25(4), 359–393. https://doi.org/10.1111/j.1468-0017.2010.01394.x.
  • Stanovich, K. E. (2011). Rationality and the reflective mind. Oxford University Press.
  • Swire, B., Berinsky, A. J., Lewandowsky, S., & Ecker, U. K. H. (2017). Processing political misinformation: Comprehending the Trump phenomenon. Royal Society Open Science, 4(3), 160802. https://doi.org/10.1098/rsos.160802.
  • Tabak, I. (2020/this issue). Post-truth GPS: Detour at truth, take the long route to useful knowledge. Educational Psychologist, 55(3), 181–186. https://doi.org/10.1080/00461520.2020.1784734.
  • Taber, C. S., & Lodge, M. (2006). Motivated skepticism in the evaluation of political beliefs. American Journal of Political Science, 50(3), 755–769. https://doi.org/10.1111/j.1540-5907.2006.00214.x.
  • Tesich, S. (1992). A government of lies. Nation, 254(1), 12–14.
  • van der Linden, S., Leiserowitz, A., Rosenthal, S., & Maibach, E. (2017). Inoculating the public against misinformation about climate change. Global Challenges, 1(2), 1600008. https://doi.org/10.1002/gch2.201600008.
  • Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146–1151. https://doi.org/10.1126/science.aap9559.
  • Vraga, E. K., & Bode, L. (2017). Leveraging institutions, educators, and networks to correct misinformation: A commentary on Lewandowsky, Ecker, and Cook. Journal of Applied Research in Memory and Cognition, 6(4), 382–388. https://doi.org/10.1016/j.jarmac.2017.09.008.
  • Waisbord, S. (2018). Truth is what happens to news. Journalism Studies, 19(13), 1866–1878. https://doi.org/10.1080/1461670X.2018.1492881.
  • Walsh, E. M., & Tsurusaki, B. K. (2018). “Thank you for being Republican”: Negotiating science and political identities in climate change learning. Journal of the Learning Sciences, 27(1), 8–48. https://doi.org/10.1080/10508406.2017.1362563.
  • Wardle, C., & Derakhshan, H. (2017). Information disorder: Toward an interdisciplinary framework for research and policy making. Council of Europe policy report DGI(2017)09. Council of Europe. https://firstdraftnews.com/wp-content/uploads/2017/11/PREMS-162317-GBR-2018-Report-de%CC%81sinformation-1.pdf?x29719.
  • WHO. (2020). Munich security conference (WHO director-general speech). https://www.who.int/dg/speeches/detail/munich-security-conference
  • Wineburg, S., & McGrew, S. (2019). Lateral reading and the nature of expertise: Reading less and learning more when evaluating digital information. Teachers College Record, 121(11), Article 2. https://www.tcrecord.org/Content.asp?ContentId=22806
  • Zarocostas, J. (2020). How to fight an infodemic. The Lancet, 395(10225), 676. https://doi.org/10.1016/S0140-6736(20)30461-X.
