Research Article

Critical literacy in the PISA 2018 reading literacy assessment

Received 08 Feb 2023, Accepted 23 Oct 2023, Published online: 30 Nov 2023

Abstract

This study examined which aspects of critical literacy are addressed in the reading literacy assessment of the Programme for International Student Assessment (PISA) 2018 and what kinds of texts are associated with the critical literacy items in the test. A theory-oriented qualitative content analysis showed that the critical literacy items in PISA involved, for example, determining the purpose of a text or separating fact and opinion. Most of the items were related to genre approaches to critical literacy, whereas social justice aspects were absent. Furthermore, the test consisted mostly of informational texts and included only a few fictional texts. The findings indicate a need to develop a more comprehensive critical literacy assessment tool.

Introduction

The importance of critical literacy has increased in modern society. Critical literacy skills are listed among key competencies for lifelong learning and twenty-first-century skills (see Bellanca & Brandt, 2010; European Commission, 2019), and in many countries, educational aims include advancing students’ critical thinking (Lombardi et al., 2021). Critical literacy is emphasized especially in online environments, where the spread of fake news and disinformation increases the need to evaluate the quality and credibility of information sources critically. However, research shows that many adults and students have poor skills in evaluating information (e.g., Hämäläinen et al., 2020; Kiili et al., 2018; Kiili et al., 2019; OECD, 2019b; OECD, 2021; Sulkunen & Malin, 2014). While several studies focus specifically on online reading and source reliability, they ignore the social justice and political aspects of critical literacy.

In large-scale, comparative international reading literacy assessments, the importance of critical reading literacy is acknowledged, because one area of assessment is respondents’ ability to evaluate information (see Mullis & Martin, 2019; OECD, 2019a). The Organisation for Economic Co-operation and Development (OECD) created the Programme for International Student Assessment (PISA) as a literacy assessment and education policy tool that evaluates the reading literacy skills of 15-year-olds in several countries (Note 1). In Finland, the national curricula have reflected the academic debate (Kauppinen, 2010), and research has thus had an impact on education policy. For example, in 2021, partly fueled by PISA results, Finland formed its first national literacy strategy, which aims toward a society in which the importance of literacy is widely recognized (Finnish National Agency for Education, 2021). One of the goals of this strategy is to strengthen multiliteracy skills, which also include critical literacy (see also Finnish National Agency for Education, 2014). In addition, the national media education policy aims to develop everyone’s media literacy, including critical literacy, as media literacy skills are diversifying along with the change and broadening of media (Ministry of Education and Culture, 2019). Media education is recognized as a way to review representations of groups of people critically and to produce more equal and diverse imagery (Ministry of Education and Culture, 2019).

Traditionally, PISA has approached reading literacy mostly as a functional skill for real-life purposes (Kauppinen, 2010). It addresses different cognitive reading processes that reach beyond the most basic reading skills (OECD, 2019a). Students’ proficiency is assessed in locating information (i.e., accessing and retrieving information within a text and searching for and selecting relevant text), understanding text (i.e., understanding the literal meaning and constructing an integrated representation of the text), and evaluating and reflecting on text. This third process contains aspects of critical evaluation, such as assessing the quality and credibility of the text(s), reflecting on their content and form, and detecting and handling conflict between texts or sources (OECD, 2019a, pp. 34–42). Each of the reading processes is assessed across different genres and text types.

However, although the PISA reading literacy assessment includes this critical stance on reading, it is unclear whether the reading test covers all aspects of critical literacy and whether the process of evaluating and reflecting on text is the only process with critical literacy items (Note 2). Since critical literacy is such a crucial element of reading literacy in the current textual landscape, and PISA results are used to inform literacy policies, it is important to determine how PISA covers critical literacy in the reading literacy assessment. When considering students’ critical literacy skills in the context of PISA, it is important to clarify which aspects of critical literacy PISA assesses and which aspects students master. In addition, a comprehensive conceptualization of critical literacy is essential for the development of a valid assessment. Previous studies concerning the validity of PISA have examined, among other things, text authenticity (Sulkunen, 2007), equivalence of translations (Arffman, 2007), response-process-based evidence for PISA’s self-report questionnaire scales (Hopfenbeck & Maul, 2011), open-ended items (Arffman, 2016), and anchored scales (Stankov et al., 2018). However, critical literacy has not yet been addressed in such studies. This study fills the research gap by examining critical literacy in the PISA 2018 reading literacy test and how the critical literacy aspect of PISA relates to a wider critical literacy framework by posing the following research questions:

  1. Which aspects of critical literacy do the reading literacy items in the PISA 2018 reading test focus on?

  2. What kinds of reading processes, text types, and genres are related to the critical literacy items in the reading test?

Theoretical framework

Critical literacy

Theories and pedagogies present various definitions of and approaches to critical literacy due to the term’s diverse nature. The critical aspect of reading is most fully addressed in sociocultural approaches, which highlight the importance of power, agency, identity, communities and cultures, and the desire for change (Lewis et al., 2007; Perry, 2012), but due to evolving technologies and digital reading practices, it is also often understood as part of multiliteracies, which include media and information literacy, with an emphasis on critical evaluation (Anstey & Bull, 2006; New London Group, 1996). In many definitions of literacy and reading, critical literacy is mentioned as a separate aspect, but it is difficult to distinguish it from other aspects of literacy or from literacy as a whole (see, e.g., Luke & Freebody, 1999, for their four resources model). Thus, it is not useful to compartmentalize critical literacy into one category as a process separate from other textual activities; instead, the interfaces and interactions of critical literacy in all textual activities should be considered (see Frau-Meigs, 2013; Kupiainen et al., 2015). Next, we introduce three possible approaches to critical literacy, i.e., social justice approaches, genre approaches, and multiple literacy approaches (see Rogers, 2013; Table 1).

Table 1. Approaches to critical literacy.

First, the critical literacy perspective, and the idea of linking reading the word to reading the world, originates in Freire’s (1970) critical pedagogy approach, which has its roots in Marxist and phenomenological philosophies. Rogers (2013, 2014) introduces Freire’s notion of “critical” as a means for the political and economic emancipation of marginalized groups through language and literacy and as one of the social justice approaches to critical literacy. Janks (2013) adds that the ethical commitment to equity and social justice is the essence of the critical literacy perspective. Pitkänen-Huhta and Holm (2012) recognize that, in addition to the Freirean approach, critical literacy relates to equality in education. This tradition emphasizing equality in education differs from the Freirean approach, as it focuses on how to provide equal access to existing literacies in education (Edelsky, 2006; Ladson-Billings, 1992). In addition, critical literacy relates to power relations at work in the classroom and focuses on literacy values and how they are produced and reproduced in classroom practices (Martin-Jones, 2007).

Second, critical literacy can be closely linked to critical text analysis or critical discourse analytical approaches (Luke, 2014; Louloudi, 2022), whereas the Freirean approach lacks specificity on how students can engage with the complex structures of texts (Luke, 2014). Rogers (2013) introduces genre approaches to critical literacy, which consider genre, lexical, and grammatical choices (Luke, 2014). Genre approaches stem from Halliday’s (1994) systemic-functional theory and focus on “acquiring competence in the linguistic structures of dominant discourses through the analysis of the patterns of texts and the way these structures carry out social functions” (Rogers, 2013, p. 11). The presupposition is that all representation is mediated and affected by value systems in language (Fowler, 1996). The aim of these approaches is to change the world by exposing misinterpretation and discrimination in a variety of modes of public discourse in genres such as newspapers, propaganda, and interviews. Fowler (1996) also acknowledges that the topics examined are rather sensitive by nature. These include inequality in education, racism, sexism, and several other social and political issues. Hence, the social justice aspect and the ideological functions of texts are apparent in genre approaches (see Luke, 2014), but the emphasis is on the critical analysis of language and discourse, textual features, uses, purposes for use, and the organization of genres (Perry, 2009, 2012; Rogers, 2013, 2014).

Third, critical literacy can be approached through multiple literacy approaches (Rogers, 2013), which are grounded in the concept of multiliteracies, commonly attributed to the researchers and theorists of the New London Group (1996). Multiliteracies refer to the skills of receiving, evaluating, and producing different forms of spoken, written, audiovisual, and digital texts in different media. Multiliteracies is therefore an umbrella concept for many different skills, knowledge, and strategies used when working with texts (Kupiainen et al., 2015), for example, in cultural, media, and internet literacy, including critical literacy. However, many studies using the multiliteracy framework highlight critical literacy as a part of online reading comprehension (e.g., Hämäläinen et al., 2020), which includes the ability to evaluate the accuracy, reliability, relevance, and bias of information at all stages of comprehension, even predictively (Leu et al., 2004, 2017; Rieh, 2002).

Many scholars who work within the theory of multiliteracies focus on the changing social, political, and economic world (Cope & Kalantzis, 2000), but a critical stance is especially apparent in implications for practice (Perry, 2012). Multiliteracies pedagogy and critical literacy intersect through critical framing, which involves the process of denaturalizing, critiquing, and re-evaluating (New London Group, 1996). This process leads to transformed practice through multiple or multimodal meaning-making (linguistic, audiovisual, digital, and spatial; Cope & Kalantzis, 2000; Kim et al., 2020; Kress, 2000; Perry, 2012), relating to Freirean aims for empowerment. In this context, critical literacy could be defined as “the use of the technologies of print and other media of communication to analyze, critique, and transform the norms, rule systems, and practices governing the social fields of everyday life” (Luke, 2012, p. 5).

In addition to intersections with Freirean social justice, the concept of critical literacy in multiliteracies is related to critical thinking and evaluation skills and the acquisition of information (Barton, 2007; Leu et al., 2004). In fact, some scholars approach critical reading as an act of critical thinking (Lewis, 1991), which also includes the ability to assess and present arguments (Facione, 1990; see also the argumentation theory by Walton, 2006).

From the perspective of our second research question, it is also relevant to consider the role of texts in the three approaches. Social justice approaches do not seem to take a stand on the nature of text. In genre approaches, the emphasis seems to be on linguistic texts, while multiple literacy approaches seem to keep multimodal and digital texts separate from other texts, which appears to be an outdated distinction, as language is one of the modalities used in the increasingly digital textual landscape. For example, in Finland, all literacy education involves a wide selection of texts that include print and digital texts (Finnish National Agency for Education, 2014).

In the PISA reading assessment, the texts included in the test have been categorized by genre and text type. According to Bakhtin (1986, p. 60), genres are relatively stable thematic, compositional, and stylistic types of expression. Genre is primarily characterized by the communicative purpose(s) it is intended to fulfill (Bhatia, 1993, p. 13), whereas text type refers to different kinds of linguistic realizations in texts (Werlich, 1983). PISA’s classification of text types is based on the work of Werlich (1983, pp. 39–41), whose text typology aims to describe universal text types in terms of typical surface structure, i.e., sentence types and the relations between sentences in the text. The descriptive text type is related to space and consists of phenomenon-registering sentences. The narrative text type deals with phenomena in time and consists of action-recording sentences. The expository text type explains the relations of concepts and elements, and phenomenon-linking sentences are typical of it. The argumentative text type is related to the validity of relations among concepts, as well as to the cognitive process of judging, and consists of quality-attributing sentences. The instructive text type consists of action-demanding sentences instructing what to do. In PISA, these text types have been complemented with transaction, which usually aims to achieve a personal purpose; transactional texts “build on the possibly private knowledge and understanding common to those involved in the transaction” (OECD, 2019a, p. 48), for example, through text message exchanges.

Assessing reading literacy in PISA 2018

Many countries participate in large-scale comparative reading literacy assessments to monitor students’ progress in reading over time and to consider the factors associated with achievement. These studies may focus either on mastery of school curricula (Mullis & Martin, 2019) or on the skills needed in various situations in current and future everyday life (OECD, 2019a). PISA focuses on the latter.

For the 2018 assessment, the PISA reading literacy assessment framework was revised and expanded to meet the needs of the digital age and digital reading. In PISA 2018, reading literacy was defined as “understanding, using, evaluating, reflecting on and engaging with texts in order to achieve one’s goals, to develop one’s knowledge and potential and to participate in society” (OECD, 2019a, p. 28). This definition of reading literacy includes the idea that readers may have to take a critical stance toward the materials they read, as “evaluating” requires readers to assess the veracity of arguments, the author’s point of view, and the relevance of a text (OECD, 2019a, p. 29), which also corresponds to the broader critical literacy framework discussed earlier.

In addition, “reflecting” requires readers to relate the text’s content and form to their own experience and knowledge and then to evaluate the text from that perspective (OECD, 2019a, p. 29). This also relates to the wider critical literacy framework, as one main characteristic of critical literacy is the ability to assess and compare what one has read to previously acquired information (Criscuolo, 1965), as well as the consideration of new ideas or information in light of one’s previous beliefs (Harris & Sipay, 1990). The connection between genre approaches and critical literacy (see Rogers, 2013) is also apparent, as “reflecting on texts can include weighing on author’s claim(s), their use of rhetorical and other use of discourse, as well as inferring the author’s perspective” (OECD, 2019a, p. 29).

Finally, “to participate in society” refers to the ability of literate people to fully take part in economic, political, communal, and cultural life (OECD, 2019a, p. 30). This aspect of reading literacy and its goals in the assessment framework may suggest that participation also includes taking a critical stance and offering an opportunity for the emancipation and empowerment of individuals, which is the original purpose of critical literacy (see Freire, 1970).

While the PISA assessment framework outlines and specifies what reading literacy is and how it should be assessed, other factors also influence the actual reading literacy test (i.e., the texts and items included in the assessment). In 2018, seventy-nine countries participated in PISA. In this assessment, special attention was paid to the reading test’s comparability and acceptability across countries and languages (OECD, 2019b). In practice, this meant ensuring the standardized implementation of the test and quality assurance procedures for the translations of the texts (OECD, 2019b). More importantly, from the perspective of critical literacy, national experts were invited to submit texts of various genres and text types, develop items for the reading test, and review the reading test units and items at different stages of test development (OECD, n.d.). In the reviews, test materials were rated based on their relevance to 15-year-old students and on sensitivity issues related to cultural, gender, or other potential bias. As a result of the national reviews, some units and items were excluded from the reading test before the field trial based on objections from individual countries (OECD, 2019b, n.d.). Consequently, the PISA reading test is unlikely to include texts with sensitive issues that might cause controversy in participating countries.

Method

As stated above, this qualitative study focuses on the following research questions:

  1. Which aspects of critical literacy do the reading literacy items in the PISA 2018 reading test focus on?

  2. What kinds of reading processes, text types, and genres are related to the critical literacy items in the reading test?

The data consist of the PISA 2018 reading literacy items in English. The number of reading literacy items in the PISA 2018 main survey was 247, and they were all included in the analysis (Note 3). All the PISA reading literacy items were analyzed because (a) in the authors’ experience, critical literacy items in PISA might be found in all cognitive reading processes, not only in the process of evaluating and reflecting on text, and PISA does not explicitly address all the different critical literacy aspects used in the assessment, and (b) based on the literature review, critical literacy is a substantial part of all reading literacy.

The analysis utilized theory-oriented qualitative content analysis through the collaboration of two researchers (Cornish et al., 2014; Mayring, 2015). In the first phase of the analysis, the two researchers individually analyzed all the reading literacy items. The analysis was guided by the three approaches to critical literacy presented in the theory section of this study and their relevant features; that is, the analytical framework was constructed from the research literature and the PISA reading literacy framework. Moreover, whenever it was necessary to determine what an individual item measured, the PISA coding guide (Note 4) was used to support the analysis. Second, the two interpretations of the data were discussed jointly to produce an agreed-upon interpretation of whether the items correspond to a wider critical literacy framework. At this stage, some of the items were excluded from the analysis. The final categorization of the critical literacy items was based on the predominant characteristics of the items.
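
To make the reconciliation step concrete, the sketch below shows one way the two researchers’ independent codings could be compared so that disagreements are flagged for the joint discussion described above. It is only an illustration: the item identifiers and category labels are hypothetical, and the authors do not report using software for this step.

```python
# Minimal sketch (not the authors' reported procedure): compare two coders'
# independent classifications and flag disagreements for joint discussion.
# Item IDs and labels below are hypothetical, not actual PISA item codes.

coder_a = {
    "item_001": "determining the purpose of a text",
    "item_002": "not critical literacy",
    "item_003": "assessing argumentation in the text",
}
coder_b = {
    "item_001": "determining the purpose of a text",
    "item_002": "assessing relevance",
    "item_003": "assessing argumentation in the text",
}

# Items coded identically are provisionally accepted; the rest are listed
# for the joint discussion that produces the agreed-upon interpretation.
agreed = {item: code for item, code in coder_a.items() if coder_b.get(item) == code}
to_discuss = sorted(item for item in coder_a if coder_b.get(item) != coder_a[item])

print(f"Agreed on {len(agreed)} of {len(coder_a)} items")
print("Items to discuss jointly:", to_discuss)
```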

In the second phase of the analysis, the critical literacy items identified in the first phase were categorized by cognitive reading process, text type, and genre. The categorization of cognitive processes and text types was based on the PISA classification of items, and the genre categories were determined by the two researchers based on the predominant characteristics and communicative purposes of the texts.
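
As an illustration of this second phase, the following sketch cross-tabulates already-identified critical literacy items by category and by cognitive reading process (and, analogously, by text type), in the spirit of Table 2. The rows are invented placeholders, since the actual item-level classifications are confidential; only the category, process, and text type labels come from the study.

```python
# Minimal sketch of the phase-two tabulation: counting critical literacy items
# by category, cognitive reading process, and text type. The rows below are
# invented placeholders, not actual PISA item classifications.
import pandas as pd

items = pd.DataFrame([
    {"category": "Determining the purpose of a text",
     "process": "Reflecting on content and form", "text_type": "Exposition"},
    {"category": "Argumentation",
     "process": "Integrating and generating inferences", "text_type": "Argumentation"},
    {"category": "Assessing relevance",
     "process": "Searching for and selecting relevant text", "text_type": "Description"},
])

# Cross-tabulations corresponding to Table 2: categories by reading process
# and categories by text type.
print(pd.crosstab(items["category"], items["process"]))
print(pd.crosstab(items["category"], items["text_type"]))
```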

The PISA reading literacy items are confidential, so we illustrate the critical literacy items with items that have been released from the PISA 2018 field trial and main survey. For this reason, we cannot present examples from all critical literacy categories in our findings. The released items are from three different units: Chicken Forum, Cow’s Milk, and Rapa Nui. The first two were omitted from the main study based on participating countries’ objections to the content (OECD, 2018), and thus they were not part of the data analyzed in this study. The Rapa Nui unit, however, was administered in the PISA 2018 main survey and was thus included in the data of the current study. Although the example units and items, except for Rapa Nui, were not part of the data of the study, they represent typical PISA items across the reading processes measured and are thus used to illustrate the items analyzed.

Findings

Critical literacy categories

Based on the content analysis of all 247 PISA 2018 reading items, 76 items (approximately 31%) represented some aspect of critical literacy described in the theoretical framework of this study. The critical literacy items were classified into seven categories, listed below in order of the number of items per category:

  1. Determining the purpose of a text or piece of text (28)

  2. Argumentation (24)

    a. Assessing argumentation in the text (11)

    b. Arguing one’s own opinion (13)

  3. Assessing relevance (9)

  4. Assessing the reliability and fairness of a source (8)

  5. Identifying and evaluating the author’s attitude (3)

  6. Defining the audience or target audience for the text (2)

  7. Separating fact and opinion (2)

The first category contained the most critical literacy items (28 items). In this category, items asked students to determine the purpose of a text or a piece of text. This analysis required knowledge of the uses and purposes of given genres and the ability to analyze different discourses and means of influence (e.g., Perry, 2012; Rogers, 2013). A piece of text might be a single detail, sentence, paragraph, or image that students were asked to analyze in relation to the rest of the text. Items in this first category asked students to consider the (main) purpose of the text, why something is described in detail, or why the author has said, referred to, or described something. For example, in Chicken Forum Question 5 (OECD, 2018, p. 28; Note 5), students had to read a transactional online chat and answer a simple multiple-choice question: “Why does Avian_Deals respond to Ivana_88’s post? (a) To promote a business, (b) To answer Ivana_88’s question, (c) To add to Monie’s advice, (d) To demonstrate expertise with birds.” Students relied on the information in the post to deduce that Avian_Deals was promoting their business (a).

Other critical literacy items in this first category required students to analyze the difference between a specified section of the text and the rest of the text in terms of style or content. They also had to identify the purpose of a formatting feature in relation to the content. For example, students had to identify the attempt to appeal to the audience through the title and the use of illustration in a text. They also had to recognize how the use of sources supports the author’s point of view and strengthens the author’s credibility in creating a positive image.

The second critical literacy category, argumentation, focused on (a) assessing argumentation in the text (11 items) or (b) the students’ own argumentation (13 items). These items were primarily related to multiple literacy approaches, as students had not only to evaluate arguments but also to produce their own arguments and synthesize information across different sources (e.g., Leu et al., 2004). In the first subcategory, students had to evaluate whether the author was successful in their linguistic choices (by contrast, in the category of determining the purpose of a text, students analyzed the author’s linguistic choices but not their appropriateness). In addition, students were asked to analyze how claims were supported in the text. For example, in Rapa Nui Question 5 (OECD, 2018, p. 17), students read an article and were expected to answer a simple multiple-choice question to identify which element of the text could be used as evidence to support a certain claim: “What evidence do Carl Lipo and Terry Hunt present to support their theory of why the large trees of Rapa Nui disappeared? (a) The rats arrived on the island on settler's canoes, (b) The rats may have been brought by the settlers purposefully, (c) Rat populations can double every 47 days, (d) The remains of palm nuts show gnaw marks made by rats” (the correct answer).

In the second subcategory of argumentation, students had to demonstrate their own argumentation. For example, they had to hypothesize about the probable change to information presented in a text given a different context, relate information in a text to familiar personal experience, or support an opinion by combining prior knowledge with information from the text. In Rapa Nui Question 7 (OECD, 2018, p. 19), students were asked to produce an open response to the question: “After reading the three sources, what do you think caused the disappearance of the large trees on Rapa Nui? Provide specific information from the sources to support your answer.” They could choose to support either theory or argue that there is a need for further research. In this example, the response required specific information directly from the texts to support an opinion.

In Cow’s Milk Question 7 (OECD, 2018, p. 43; Note 6), students read three stances from Anna, Christopher, and Sam, who are talking about the two texts (“Farm to Market Dairy” and “Just Say ‘No’ to Cow’s Milk”):

Christopher: No matter what the coffee shop owner does, I’m going to keep drinking milk every day. It’s really good for you.

Anna: Not me! I’m going to drink a lot less milk from now on if it’s not good for you.

Sam: I don’t know, I think we need to know more before we make a conclusion.

Students were then asked to consider whose argument they agreed with. To receive credit, students could choose any stance, but they had to provide a reason from at least one of the texts to support their selection. In doing so, they demonstrated how to handle the conflict between the information presented on the two webpages (OECD, 2018, p. 43). In this example, the critical aspect was to justify one’s opinion with a position taken directly from the text one found convincing. However, students were not necessarily required to present their own cogent arguments, as it was sufficient to refer to information from the texts.

In the third category, which focused on assessing relevance (nine items), students assessed either the relevance of a search term or the relevance of a source. Consequently, this category was related to multiple literacy approaches and online reading comprehension, as students were required to analyze what to read to learn about a particular topic or issue and, conversely, which search term, search result, or post was not relevant to the topic (e.g., Rieh, 2002). For example, in Chicken Forum Question 3 (OECD, 2018, p. 26), students assessed whether each post was relevant to the topic of giving aspirin to chickens by clicking on either “Yes” or “No” for each post. In this example, students needed to evaluate the content more closely to determine whether each post was relevant to Ivana_88’s question.

Continuing with multiple literacy approaches, in the fourth category, assessing the reliability and fairness of a source (eight items), students had to consider whether something or someone was an unbiased, biased, or reliable source of information. In this category, students had to analyze whether any sources were used, whether the source was reliable in handling a particular topic, and what interests the source had. For example, in Chicken Forum Question 6 (OECD, 2018, p. 29), students were asked: “Who posted the most reliable answer to Ivana_88’s question?” Any selection other than Avian_Deals, accompanied by a supporting explanation, was credited, because students had to recognize that a source’s reliability is compromised if it has commercial interests (OECD, 2018). Students had to consider the marketing discourse in the answer by Avian_Deals and then consider the different aspects that made the other answers more reliable. In this analysis, students checked the authors’ credentials (see Harris, 2020); one had personal experience, one suggested checking with a veterinarian, and one was a veterinarian. According to Walton (2006, p. 87), in a more in-depth critical analysis it is not sufficient to rely merely on expert opinions, as they are also open to critical questioning. However, in the PISA scoring, relying on expert opinion and personal experience was sufficient.

In other items in this category, students had to identify provocative, persuasive, and instructive cases and determine whether a text presented only one perspective. They also had to consider whether a source was reliable if the release date was not recent or if the source relied only on memory (see Harris, 2020). The information was deemed reliable if it was based on research or an official body.

In the fifth category, students were asked to identify and evaluate the author’s attitude (three items). This category differs from the first category (determining the purpose of a text or piece of text), as students had to analyze some linguistic aspects more closely to determine the author’s view, attitude, appreciation, and position. This reflects genre approaches and Halliday’s (1994) systemic-functional theory, with its interpersonal metafunction and the linguistic choices that can express attitudes, values, and feelings. For example, students needed to evaluate adjectives and make inferences about the author’s attitude (positive, negative, or neutral) toward something presented in the text.

The sixth category, defining the audience or target audience for the text, was a small category with only two items. However, defining the audience or target audience is important when reading critically, as one has to consider why the author has written for a specific audience and whom the text seeks to influence. This kind of analysis once again required genre knowledge of the uses and purposes of given genres (e.g., Perry, 2012).

Lastly, the seventh category, separating fact and opinion, also had only two items. Students had to consider, for example, whether a claim was true or needed further research. In Rapa Nui Question 3 (OECD, 2018, p. 14), students were given five statements from a book review:

  1. In the book, the author describes several civilizations that collapsed because of the choices they made and their impact on the environment.

  2. One of the most disturbing examples in the book is Rapa Nui.

  3. They carved the moai, the famous statues, and used the natural resources available to them to move these huge moai to different locations around the island.

  4. When the first Europeans landed on Easter Island in 1722, the moai were still there, but the trees were gone.

  5. The book is written well and deserves to be read by anyone who is concerned about the environment.

Students were then asked to analyze whether the statements were factual or represented the perspective of the author by clicking on either “Fact” or “Opinion” for each statement. Students did this by evaluating the lexical choices in the presented statements, so this category also related to genre approaches (Luke, 2012). For example, “disturbing” and “written well” indicated that statements two and five were opinions, while the other statements were facts.

Reading processes and texts related to critical literacy items

Overall, there was no clear association between cognitive reading processes, text types, or genres and the critical literacy categories identified in the content analysis (Table 2). In other words, PISA has utilized different cognitive reading processes, text types, and genres in critical literacy items in a versatile manner. However, there were some relevant details, which will be discussed next.

Table 2. Reading processes, text types, and genres by critical literacy category.

The majority of the critical literacy items were linked to the PISA cognitive process of evaluating and reflecting on texts. Altogether, 51 items were included in this category, of which 34 items involved reflecting on content and form, nine involved corroborating and handling conflict, and eight involved assessing quality and credibility. This was expected, because the PISA assessment framework addresses critical literacy especially in these processes (see OECD, 2019a). However, critical literacy items could also be found in the other cognitive reading processes (i.e., locating information and understanding texts). The reading process of understanding texts included 14 items, of which 13 involved integrating and generating inferences and one involved representing literal meaning. In addition, the process of locating information included 11 critical literacy items in the subcategory of searching for and selecting relevant text.

When considering what kinds of genres were related to the critical literacy items, it is apparent that PISA used a variety of informational genres, such as articles, brochures, webpage information sheets, online chats, and search result lists. Fictional genres were rarely used; only four critical literacy items utilized fables or stories.

An expository text type dominated in all critical literacy categories (20 items altogether). For example, almost half of the critical literacy items in the argumentation category (11 items) were related to an expository text type, and only one used an argumentative text type. Altogether, 16 critical literacy items used an argumentative text type in the form of, for example, persuasive texts or opinion pieces.

Furthermore, determining the purpose of a text or piece of text was clearly related to the cognitive reading process of reflecting on content and form (24 out of 28 items). Assessing relevance represented searching for and selecting relevant text in eight out of nine items and was also clearly related to the online environment, as the critical literacy items in this category utilized genres such as search result lists and online encyclopedias.

Conclusion and discussion

In this study, critical literacy in the PISA 2018 reading literacy test was examined. We have presented current theory on three major approaches to critical literacy: (1) social justice approaches, (2) genre approaches, and (3) multiple literacy approaches. The conceptualization of critical literacy in the PISA assessment framework implicitly recognizes all these approaches.

Among the reading literacy items, seven categories representing critical literacy were found. Four of these categories were linked to genre approaches: (1) determining the purpose of a text or piece of text, (2) identifying and evaluating the author’s attitude, (3) defining the audience or target audience for the text, and (4) separating fact and opinion. Altogether, these categories included 35 items, which is almost half of all the critical literacy items. In these categories, students required genre knowledge to analyze the textual features, uses, purposes for use, and organization of given genres (Perry, 2012; Rogers, 2013, 2014). The other three categories (argumentation, assessing relevance, and assessing the reliability and fairness of a source) represented primarily multiple literacy approaches. Assessing relevance was especially related to online reading comprehension, as items in this category were situated mostly in digital environments where students assessed, for example, the relevance of a search term (see, e.g., Rieh, 2002).

The social justice aspect of critical literacy was absent from the PISA 2018 reading literacy items, although personal liberation, emancipation, and empowerment are mentioned in the assessment framework (see OECD, 2019a). This is a notable result in the context of the goals of functional literacy assessment, as PISA specifically seeks to measure the functional (i.e., practical) literacy skills required in society (e.g., filling in a job application), partly for (indirect) economic purposes. While functional literacy aims at providing individuals with the skills to function in society, social justice perspectives emphasize emancipating and empowering individuals (Luke, 2012). Thus, PISA may not find it necessary to include social justice aspects in the reading test, as they are not part of the main goals of the survey.

Regarding the analysis of the critical literacy items, it must be noted that classifying items into single categories creates a somewhat simplified interpretation, as the categories partially overlap. To avoid a one-sided interpretation, the two researchers first analyzed the data independently and then discussed their analyses jointly to produce an agreed-upon interpretation of the categories and genres.

The critical literacy items were not based only on argumentative texts; instead, expository texts were frequently used. Critical literacy items can be formed from a variety of texts, and these texts do not necessarily need to be highly ideological or provocative, as no text is neutral. However, in PISA, ideological and provocative texts involving highly sensitive topics were absent, although such topics are important from a critical literacy perspective (see Fowler, 1996). Although PISA has presented some texts with sensitive topics, they have been omitted due to opposition from participating countries. The released unit Cow’s Milk allowed for the formation of critical literacy tasks, such as weighing the ethical choices related to, in this case, drinking milk. Some other texts also provided an opportunity to test students’ critical stances, but this opportunity was not put into practice.

In addition, critical literacy for fictional texts was rarely assessed; the majority of the critical literacy tasks were formed on the basis of informative texts, and only a few fictional assignments were found in the PISA 2018 assessment. However, utilizing fictional texts to measure critical literacy is important because the need for critical evaluation is not limited to informational texts. For example, the Finnish national matriculation examination for reading literacy assesses cultural and critical literacy, and since the examination also includes fictional texts, there is potential to approach them critically (The Matriculation Examination Board, 2022). Moreover, some definitions of critical reading skills include the analysis of literary elements such as setting, plot, and theme (e.g., King, 1968), not to mention the importance of using fictional books to encourage critical literacy from multiple perspectives (Clarke & Whitney, 2009).

When considering the validity of the PISA reading test, the absence of some crucial aspects of critical literacy demonstrates construct underrepresentation. Construct underrepresentation refers to situations in which a test fails to include or capture all the important dimensions of the construct (Messick, 1988, pp. 44–45; Bandalos, 2018, p. 151). Underrepresentation can result, for example, from narrowing the content of a test (Bandalos, 2018, p. 151). The narrowing of the content of the PISA 2018 reading literacy test, and more precisely of its critical literacy content, is most likely due to the diversity of the participating countries. As presented earlier in this study, the reading literacy test is a compromise between the participating countries to avoid cultural bias in testing. Due to the nature of the PISA test and the large population assessed, it is quite difficult to assess critical literacy from all possible aspects.

As a whole, PISA has incorporated many critical literacy items, which were found in all cognitive reading processes. For this reason, the process of evaluating and reflecting on texts does not measure students’ critical literacy alone or unambiguously. Thus, critical literacy is difficult to distinguish from other literacies, as discussed in the theory section of this study (see Frau-Meigs, 2013; Kupiainen et al., 2015; Luke & Freebody, 1999). Furthermore, when generalizations are made about, for example, Finnish youths’ strong skills in searching for information compared to other areas assessed (see OECD, 2019b), it is important to be aware of the critical evaluation needed in the process of locating information, although the critical interpretation required by such items can remain somewhat superficial.

This superficiality of the critical literacy items was also noticeable in tasks in which students had to form their own arguments. Very often, it was sufficient to present information given directly in the text as an argument for the student’s own opinion. Moreover, students were not necessarily expected to analyze certain aspects (e.g., the author’s credentials) very deeply, especially in multiple-choice items. Even if students are required to do some deeper analysis, research has shown that students’ critical literacy skills remain quite superficial (e.g., Hämäläinen et al., 2020).

In conclusion, this study suggests that there is a need to develop a more comprehensive critical literacy assessment tool that recognizes all the crucial elements of critical literacy, at least to some extent, by including topics and texts related to social justice. In addition, it would be advisable to use more fictional texts alongside informational texts when assessing critical literacy. Furthermore, a more in-depth analysis could be required from students, at least in some tasks. In terms of future research, the findings of this study could be used quantitatively to operationalize and validate critical literacy in PISA.

Disclosure statement

No potential conflict of interest was reported by the authors.

Notes

1 PISA assesses the skills of 15-year-olds in reading literacy, mathematics, and science needed to meet real-life challenges. In addition to the test tasks, PISA has background surveys that are used to gauge, for example, students’ attitudes to learning. See more about PISA at https://www.oecd.org/pisa/aboutpisa/.

2 A test item is a specific question or task (multiple choice or open) that test takers are asked to answer or perform.

3 Please note that the number of reading literacy items presented here refers to the Finnish data, and it may vary slightly from country to country.

4 The PISA coding guide includes all units with at least one short-response or extended-response item and a list of the available codes for each item (e.g., 0, 1, 9), followed by the cognitive process, which gives a general description of what the question is intended to assess. A description of how to code the item follows, with examples of possible answers for each coding category (see OECD, 2019 for released items and their coding guides).

5 While Chicken Forum items were not analyzed in the current study as part of the data, they are used to illustrate typical items in some categories.

6 While this Cow’s Milk item was not one of the items analyzed in the current study as part of the data, it is used to illustrate a typical item in this category.

References

  • Anstey, M., & Bull, G. (2006). Teaching and learning multiliteracies: Changing times, changing literacies. International Reading Association.
  • Arffman, I. (2007). The problem of equivalence in translating texts in international reading literacy studies: A text analytic study of three English and Finnish texts used in the PISA 2000 reading test (Research Reports 21). [Doctoral dissertation], University of Jyväskylä, Institute for Educational Research. JYX Digital Repository. https://jyx.jyu.fi/bitstream/handle/123456789/37744/1/T021_verkkoversio.pdf
  • Arffman, I. (2016). Threats to validity when using open-ended items in international achievement studies: Coding responses to the PISA 2012 problem-solving test in Finland. Scandinavian Journal of Educational Research, 60(6), 609–625. https://doi.org/10.1080/00313831.2015.1066429
  • Bakhtin, M. M. (1986). The problem of speech genres. In V. W. McGee, C. Emerson, & M. Holquist (Eds.), Speech genres and other late essays (pp. 60–102). University of Texas Press.
  • Bandalos, D. L. (2018). Measurement theory and applications for the social sciences. Guilford Press.
  • Barton, D. (2007). Literacy: An introduction to the ecology of written language (2nd ed.). Blackwell Publishing.
  • Bellanca, J., & Brandt, R. (Eds.). (2010). 21st-century skills: Rethinking how students learn. Solution Tree Press.
  • Bhatia, V. K. (1993). Analysing genre: Language use in professional settings. Longman.
  • Clarke, L. W., & Whitney, E. (2009). Walking in their shoes: Using multiple-perspectives texts as a bridge to critical literacy. The Reading Teacher, 62(6), 530–534. https://doi.org/10.1598/RT.62.6.7
  • Cope, B., & Kalantzis, M. (2000). Multiliteracies: Literacy learning and the design of social futures. Routledge.
  • Cornish, F., Gillespie, A., & Zittoun, T. (2014). Collaborative analysis of qualitative data. In U. Flick (Ed.), The SAGE handbook of qualitative data analysis (pp. 79–93). SAGE.
  • Criscuolo, N. P. (1965). A plea for critical reading in the primary grades. Peabody Journal of Education, 43(2), 107–112. https://doi.org/10.1080/01619566509537322
  • Edelsky, C. (2006). With literacy and justice for all (3rd ed.). Lawrence Erlbaum.
  • European Commission. (2019). Key competences for lifelong learning. Publications Office, Directorate-General for Education, Youth, Sport and Culture. https://doi.org/10.2766/569540
  • Facione, P. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction. Research Findings and Recommendations. https://files.eric.ed.gov/fulltext/ED315423.pdf
  • Finnish National Agency for Education. (2014). Perusopetuksen opetussuunnitelman perusteet 2014 [National core curriculum for basic education 2014]. Finnish National Board of Education. https://www.oph.fi/sites/default/files/documents/perusopetuksen_opetussuunnitelman_perusteet_2014.pdf
  • Finnish National Agency for Education. (2021). National literacy strategy 2030. Finnish National Agency for Education. https://www.oph.fi/sites/default/files/documents/National_literacy_strategy_2030.pdf
  • Fowler, R. (1996). On critical linguistics. In C. R. Caldas-Coulthard, & M. Coulthard (Eds.), Text and practices: Readings in critical discourse analysis (pp. 3–14). Routledge.
  • Frau-Meigs, D. (2013). Transliteracy: Sense-making mechanisms for establishing e-presence. In U. Carlsson, & S. H. Culver (Eds.), Media and information literacy and intercultural dialogue (pp. 175–189). MILID Yearbook 2013. NORDICOM. http://milunesco.unaoc.org/wp-content/uploads/2013/04/Media_and_Information_Literacy_and_Intercultural_Dialogue.pdf
  • Freire, P. (1970). Pedagogy of the oppressed. Continuum.
  • Halliday, M. A. K. (1994). An introduction to functional grammar (2nd ed.). Edward Arnold.
  • Hämäläinen, E. K., Kiili, C., Marttunen, M., Räikkönen, E., González-Ibáñez, R., & Leppänen, P. H. (2020). Promoting sixth graders’ credibility evaluation of web pages: An intervention study. Computers in Human Behaviour, 110, Article 106372. https://doi.org/10.1016/j.chb.2020.106372
  • Harris, A. J., & Sipay, E. R. (1990). How to increase reading ability. A guide to developmental & remedial methods (9th ed.). Longman.
  • Harris, R. (2020, Oct 19). Evaluating internet research sources. VirtualSalt. www.virtualsalt.com/evaluating-internet-research-sources
  • Hopfenbeck, T. N., & Maul, A. (2011). Examining evidence for the validity of PISA learning strategy scales based on student response processes. International Journal of Testing, 11(2), 95–121. https://doi.org/10.1080/15305058.2010.529977
  • Janks, H. (2013). Critical literacy in teaching and research. Education Inquiry, 4(2), 225–242. https://doi.org/10.3402/edui.v4i2.22071
  • Kauppinen, M. (2010). Lukemisen linjaukset – Lukutaito ja sen opetus perusopetuksen äidinkielen ja kirjallisuuden opetussuunnitelmissa [Literacy delineated – Reading literacy and its instruction in the curricula for the mother tongue in basic education] (Jyväskylä Studies in Humanities, 141). [Doctoral dissertation, University of Jyväskylä]. JYX Digital Repository. https://jyx.jyu.fi/bitstream/handle/123456789/24964/9789513940119.pdf?sequence=1&isAllowed=y
  • Kiili, C., Coiro, J., & Räikkönen, E. (2019). Students’ evaluation of information during online inquiry: Working individually or in pairs. Australian Journal of Language and Literacy, 42(3), 167–183. https://doi.org/10.1007/BF03652036
  • Kiili, C., Leu, D. J., Marttunen, M., Hautala, J., & Leppänen, P. H. (2018). Exploring early adolescents’ evaluation of academic and commercial online resources related to health. Reading and Writing, 31(3), 533–557. https://doi.org/10.1007/s11145-017-9797-2
  • Kim, S., Ramos, K. A., Chung, H., & Choi, S. (2020). Integrating critical multiliteracies pedagogy in ESL/EFL teaching. Journal of English Learner Education, 11, Article 4.
  • King, M. L. (1968). Evaluating critical reading. In M. A. Dawson (Ed.), Developing comprehension: Including critical Reading (pp. 206–213). International Reading Association.
  • Kress, G. (2000). Multimodality. In B. Cope, & M. Kalantzis (Eds.), Multiliteracies: Literacy learning and the design of social futures (pp. 182–202). Routledge.
  • Kupiainen, R., Kulju, P., & Mäkinen, M. (2015). Mikä monilukutaito? [What is multiliteracy?]. In T. Kaartinen (Ed.), Monilukutaito kaikki kaikessa [Multiliteracy is the key] (pp. 13–24). The Teacher Training School.
  • Ladson-Billings, G. (1992). Reading between the lines and beyond the pages: A culturally relevant approach to literacy teaching. Theory Into Practice, 31(4), 312–320. https://doi.org/10.1080/00405849209543558
  • Leu, D. J., Kinzer, C. K., Coiro, J., Castek, J., & Henry, L. A. (2017). New literacies: A dual-level theory of the changing nature of literacy, instruction, and assessment. Journal of Education, 197(2), 1–18. https://doi.org/10.1177/002205741719700202
  • Leu, D. J., Kinzer, C. K., Coiro, J. L., & Cammack, D. W. (2004). Toward a theory of new literacies emerging from internet and other information and communication technologies. In R. B. Ruddell, & N. Unrau (Eds.), Theoretical models and process of reading (5th ed, pp. 1570–1613). International Reading Association.
  • Lewis, C., Enciso, P., & Moje, E. B. (2007). Reframing sociocultural research on literacy: Identity, agency, and power. Lawrence Erlbaum Associates.
  • Lewis, J. (1991). Redefining critical reading for college critical thinking courses. Journal of Reading, 34, 420–423.
  • Lombardi, L., Mednick, F. J., De Backer, F., & Lombaerts, K. (2021). Fostering critical thinking across the primary school’s curriculum in the European schools system. Education Sciences, 11(9), 505. https://doi.org/10.3390/educsci11090505
  • Louloudi, E. (2022). Investigating understandings of critical literacies among Finnish and Canadian teachers. Apples: Journal of Applied Language Studies, 16, 57–76. https://doi.org/10.47862/apples.112308
  • Luke, A. (2012). Critical literacy: Foundational notes. Theory Into Practice, 51(1), 4–11. https://doi.org/10.1080/00405841.2012.636324
  • Luke, A. (2014). Defining critical literacy. In J. Z. Pandya, & J. Ávila (Eds.), Moving critical literacies forward: A new look at praxis across contexts (pp. 20–31). Routledge.
  • Luke, A., & Freebody, P. (1999). Further notes on the four resources model. Reading Online, 1–6.
  • Martin-Jones, M. (2007). Bilingualism, education and the regulation of access to language resources. In M. Heller (Ed.), Bilingualism: A social approach (pp. 161–182). Palgrave.
  • Mayring, P. (2015). Qualitative content analysis: Theoretical background and procedures. In A. Bikner-Ahsbahs, C. Knipping, & N. Presmeg (Eds.), Approaches to qualitative research in mathematics education: Examples of methodology and methods (pp. 365–380). Springer.
  • Messick, S. (1988). Validity. In R. L. Linn (Ed.), Educational measurement (3rd ed, pp. 13–103). American Council on Education/Macmillan.
  • Ministry of Education and Culture. (2019). Media literacy in Finland: National media education policy. Publications of the Ministry of Education and Culture 2019:39. https://medialukutaitosuomessa.fi/mediaeducationpolicy.pdf
  • Mullis, I. V. S., & Martin, M. O. (Eds.). (2019). PIRLS 2021 assessment frameworks. TIMSS & PIRLS International Study Center. https://timssandpirls.bc.edu/pirls2021/framework/
  • OECD. (2018). PISA 2018 released field trial and main survey new reading items. OECD Publishing. https://www.oecd.org/pisa/test/PISA2018_Released_REA_Items_12112019.pdf
  • OECD. (2019a). PISA 2018 assessment and analytical framework. OECD Publishing.
  • OECD. (2019b). PISA 2018 results (Volume I): What students know and can do. OECD Publishing.
  • OECD. (2021). Are 15-year-olds prepared to deal with fake news and misinformation? PISA in focus 2021/113. OECD Publishing.
  • OECD. (n.d.). PISA 2018 technical report. https://www.oecd.org/pisa/data/pisa2018technicalreport
  • Perry, K. H. (2009). Genres, contexts, and literacy practices: Literacy brokering among Sudanese refugee families. Reading Research Quarterly, 44(3), 256–276. https://doi.org/10.1598/RRQ.44.3.2
  • Perry, K. H. (2012). What is literacy? A critical overview of sociocultural perspectives. Journal of Language & Literacy Education, 8, 50–71.
  • Pitkänen-Huhta, A., & Holm, L. (2012). Literacy practices in transition: Setting the scene. In A. Pitkänen-Huhta, & L. Holm (Eds.), Literacy practices in transition: Perspectives from the Nordic countries (pp. 1–23). Multilingual Matters.
  • Rieh, S. Y. (2002). Judgement of information quality and cognitive authority in the web. Journal of the American Society for Information Science and Technology, 53(2), 145–161. https://doi.org/10.1002/asi.10017
  • Rogers, R. (2013). Cultivating diversity through critical literacy in teacher education. In C. Kosnik, J. Rowsell, P. Williamson, R. Simon, & C. Beck (Eds.), Literacy teacher educators: Preparing teachers for a changing world (pp. 7–20). Sense Publishers.
  • Rogers, R. (2014). Coaching teachers as they design critical literacy practices. Reading and Writing Quarterly, 30(3), 241–261. https://doi.org/10.1080/10573569.2014.909260
  • Stankov, L., Lee, J., & von Davier, M. (2018). A note on construct validity of the anchoring method in PISA 2012. Journal of Psychoeducational Assessment, 36(7), 709–724. https://doi.org/10.1177/0734282917702270
  • Sulkunen, S. (2007). Text authenticity in international reading literacy assessment. Focusing on PISA 2000 (Jyväskylä Studies in Humanities, 76). [Doctoral dissertation], University of Jyväskylä. JYX Digital Repository. https://jyx.jyu.fi/bitstream/handle/123456789/13434/9789513929763.pdf?sequence=1&isAllowed=y
  • Sulkunen, S., & Malin, A. (2014). Aikuisten lukutaito tiedon käsittelyn ja hallinnan avaintaitona [Adults’ literacy as a key skill for information processing and management]. Kieli, koulutus ja yhteiskunta [Language, Education and Society], 5. https://www.kieliverkosto.fi/fi/journals/kieli-koulutus-ja-yhteiskunta-huhtikuu-2014/aikuisten-lukutaito-tiedon-kasittelyn-ja-hallinnan-avaintaitona
  • The Matriculation Examination Board. (2022). Äidinkielen ja kirjallisuuden kokeen määräykset [Guidelines for the examination of mother tongue and literature]. https://www.ylioppilastutkinto.fi/sites/default/files/sites/default/files/documents/aidinkieli_ja_kirjallisuus_maaraykset.pdf
  • The New London Group. (1996). A pedagogy of multiliteracies: Designing social futures. Harvard Educational Review, 66(1), 60–92. https://doi.org/10.17763/haer.66.1.17370n67v22j160u
  • Walton, D. (2006). Fundamentals of critical argumentation. Cambridge University Press.
  • Werlich, E. (1983). A text grammar of English. Quelle & Meyer.