
Designing media and information literacy curricula in English primary schools: children’s perceptions of the internet and ability to navigate online information

Pages 151-160 | Received 29 Nov 2021, Accepted 20 Dec 2021, Published online: 22 Feb 2022

ABSTRACT

This paper presents findings from a study into children’s media and information literacy (MIL). The purpose of the study was to understand English primary school children’s (ages 8–11) attitudes toward the internet, as well as their ability to find, use and evaluate information. This then informed the development of a MIL programme of learning for primary schools. Data analysis showed that the children demonstrated low levels of MIL, for example, when identifying bias and distinguishing between fact and opinion. In addition, working in a group did not lead to better MIL. There were limited strategies for successful co-operation and often the dominant group members’ views were prioritised over evidence from the text. Therefore, working with the Digital Education Futures Initiative (DEFI) at Hughes Hall, University of Cambridge, we propose a MIL curriculum that focuses not only on content but also on collaborative dialogic skills such as listening to others and changing one’s own point of view.

Introduction

The children of today and tomorrow will have ever-increasing access to digital technologies, to view and to create information, in their educational, social and work lives. Those born after 1985 have been referred to as ‘digital natives’: people who have grown up with digital technologies. Those born before 1985, however, are ‘digital immigrants’ (Prensky Citation2001) – they have had to learn how to use these technologies later in life. However, while younger people may be more confident when using digital technology, questions and concerns remain over their competence (Ofcom Citation2020; National Literacy Trust Citation2018; Lazaridou, Krestel, and Naumann Citation2017). When people can use digital technologies competently across a broad range of criteria, they are digitally literate; digital literacy education is therefore of great importance for young people and the societies in which they live.

This is indicated by efforts from government, academia and media organisations to understand and inform people about digital literacy, particularly in the sphere of misinformation (‘fake news’, Silverman Citation2015; Bulger and Davison Citation2018). However, developing digital competence also means being adept at finding, working with, evaluating and reporting on information. Although these skills are relevant across different information sources, not just the digital (Hobbs and Jensen Citation2013), in the digital media ecosystem the speed and volume of information highlights the need for a particular focus on a digital literacy that addresses this issue.

This study, therefore, uses the UNESCO (Citation2013) framework, which takes an interdisciplinary approach to media literacies, referred to as media and information literacy (MIL) that together ‘help to equip people with the competencies required for 21st century life and the need to deal with the huge volume of data, information and media messages coming from different communication and information platforms and providers’ (30). These criteria are also referenced by many other studies into MIL (Notley and Dezuanni Citation2019; Mitchell et al. Citation2018; Mason, Krutka, and Stoddard Citation2018; Luke Citation2012).

Although information and education to improve media literacy has been aimed at adult citizens and children, research into understanding primary school (aged 5–11) children’s media literacy has been limited. In addition, the National Curriculum Key Stage 1 and 2 computing programme of study (Department for Education Citation2013) is highly focused on children’s ability to program, code and understand algorithms. While these are valuable skills, it means that teachers have limited information from the curriculum document about the specific criteria that constitute media and information literacies.

Therefore, the aim of this study is to carry out a MIL assessment of children aged 8–11 in England and to use this information to design a MIL curriculum tailored for this age group.

Developing a test of MIL

Schilder, Lockee, and Saxon (Citation2016) highlight that while media literacy education is ‘thriving’, assessment of media literacy ‘remains an area of concern’ (32), specifically that the concepts and skills which comprise media and information literacy should be better delineated. In English primary schools, there is no assessment of MIL, either in the form of external testing or in-school teacher assessment. It is therefore difficult for teachers to be able to ascertain how children are engaging with sources of online material.

Creating an assessment tool for media literacy is a difficult task, particularly because of the young age of the participants. Pereira and Moura (Citation2019) also identified difficulties associated with creating a tool given ‘the complexity of the concept’ (23) and what one can realistically collect. This is particularly the case when assessing primary school children, whose limited reading ability necessitates shorter test lengths. There is also the issue that children have had more limited exposure to the world: knowing what is established truth may be more difficult. UNESCO (Pereira and Moura Citation2019) offers a comprehensive MIL assessment framework consisting of three components; this test focuses on Component 2: understanding, assessment and evaluation of information and media. Criteria of this component include:

  • Comparing facts

  • Distinguishing facts from opinion

  • Identifying underlying values

  • Understanding the role of media and public institutions

  • Evaluating the quality of information (57)

For the last of those points, the quality of information is specifically defined as its accuracy, relevance, currency, reliability and completeness. We operationalised the criteria in the UNESCO framework to create a set of 10 questions per test. We also included a question asking children to rate different sources of information, including the internet.

We selected themes for the questions that would be appropriate for primary-aged children, such as making reference in the questions to Harry Potter, football and school-based scenarios. We utilised the authors’ experience as former primary school teachers to use appropriate language for the age group. We then piloted the questions with a group of five Year 3 children and five Year 5 children to ascertain that the questions were pitched at a suitable reading age.

Pereira and Moura (Citation2019) also write that there is a danger of ‘sacrificing the study of a complex reality for the sake of obtaining a ranking’ (23). For that reason, we chose to create and administer two equivalent tests: one to be taken individually and the other to be taken in groups of three. Two test questions were created for each of the areas identified, and one of each pair was randomly assigned either to the individual test or to the group test. This follows the work of Wegerif et al. (Citation2017), who developed a group measures test to establish whether there was any added value from collaborative work. The tests, therefore, provided us with an individual test data stream that can be compared with the group test data to examine how peer influences affect MIL.

Research questions

The two questions which we were investigating in this study were:

  1. To what extent do children demonstrate media and information literacy according to the UNESCO criteria?

  2. To what extent does working in groups change the way in which children engage with information?

Participants

The tests were administered to a total of 360 individual children and 120 groups of three children in five schools in three counties in the south of England. Children were either in Year 3 (aged 7–8) or Year 5 (aged 9–10), and testing was carried out in the autumn term.

Analysis

We carried out a descriptive statistical analysis on the test scores for the individual and group tests to compare between the two tests. This allowed us to determine whether children performed better on their own or with their peers.
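The descriptive comparison above can be sketched in a few lines of code. This is a minimal illustration only: the score lists below are invented for the example and are not the study’s actual data or analysis code.

```python
# Minimal sketch of a descriptive comparison between individual and
# group test scores. The scores here are hypothetical marks out of 10,
# invented for illustration; the study's real data are not reproduced.
from statistics import mean, stdev

individual_scores = [4, 5, 3, 6, 5, 4, 7, 5]  # hypothetical
group_scores = [5, 4, 4, 6, 5, 3, 5, 6]       # hypothetical

def describe(label, scores):
    # Print the sample size, mean and standard deviation for one test.
    print(f"{label}: n={len(scores)}, "
          f"mean={mean(scores):.2f}, sd={stdev(scores):.2f}")

describe("Individual", individual_scores)
describe("Group", group_scores)
```

Comparing the two means in this way shows whether, on average, children scored higher alone or in groups; the study’s published comparison would of course also account for the different numbers of individual and group tests.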

We also recorded the group tests for further thematic analysis to find out how the children were interacting with each other and the impact that this had on the way in which they answered questions. We carried out the recordings ourselves and then watched the videos as a research team to identify the themes that emerged from the children’s interaction. We did not make transcriptions of the recordings, instead drawing on Wegerif et al.’s (Citation2017) group test work, where notes were made during the test (or in this case, while watching the video).

Findings

Books were rated more highly than the internet as a source of information

Children were given a headline to read and asked to rate on a Likert scale which sources they would use to find out more information and to ascertain whether what they had read was true. The five sources of information given were: friends, parents, the internet, books and teachers. They were asked to rate each source between 1 (not a good source of information) and 10 (a very good source of information). Tables 1 and 2 show the highest-rated sources of information in the individual and group tests respectively; a source is counted as highest-rated if it was chosen as the single or joint highest.

Table 1. The highest-rated sources of information in the individual tests.

Table 2. The highest-rated sources of information in the group tests.

In all tests for both age groups, children rated books as the best source of information. The children’s comments indicated that books were trustworthy sources of information, whereas the internet was not. Children commented that ‘you can’t trust the internet’ and ‘the internet tells lies’.

Children relied on their parents as a source of information

Children in Year 3 perceived parents as a better source of information than the internet. Even in Year 5, parents were still seen as an important source of information. This selection of comments indicates some of the reasons why: ‘your mum and dad know most things’, ‘they know more than us’, ‘ask your mum and dad would probably be a nine because they’d probably know’.

Teachers did not score highly as a source of information

In the individual tests for both year groups, children rated teachers the lowest as a source of information. This correlates with the NLT (Citation2018) report which found that of the children surveyed, 29.9% reported that they would speak to their parents first about fake news, while only 6.4% would speak to their teachers. It is notable that although teachers were rated lowest as a source of information in both individual and group tests, it was much lower in the group test.

Children did not view their peers as a reliable source of information, but were susceptible to peer influence

Several children made comments that indicated they didn’t think highly of their peers as a source of information, such as: ‘they don’t know any more than you’ and ‘if you ask your friend they’ll probably know as much as you do’. Despite this, the group tests revealed that there were several times when the answer that was decided upon was the result of one child’s vehemence about it. We also saw many occasions where children could not decide on an answer and put it to a vote. As the children were in groups of three, this meant a ‘2 against 1’ outcome, and the ‘winning’ answer was circled. Children seemed to view these approaches as a valid way of choosing an answer and did not try to convince others by justifying their thinking.

Children found it difficult to identify fact and opinion in a text

Table 3 indicates the percentages of pupils who were able to correctly identify fact and opinion in a short piece of text. Pupils were asked to underline the facts in this text. In this question, it was possible to score negative marks: one mark was added for each fact identified, and one mark was taken away for each opinion that was incorrectly identified as a fact.

Table 3. Pupil scores for identifying facts in a text.

Year 3 pupils scored less highly than those in Year 5: the highest percentage for Year 3 pupils was those who scored 0 marks. This does not mean that they failed to identify any facts, but rather that they were indiscriminate in their underlining of facts and opinions. For example, if they had underlined three facts and three opinions, their overall score would be 0. Year 5 children performed better: the most frequent score for this age group was two marks, with a further 23% scoring three.
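The scoring rule described above can be expressed as a short sketch. The function and the example sentences here are hypothetical, invented to illustrate the arithmetic; they are not items from the actual test.

```python
# Sketch of the fact/opinion scoring rule described in the text:
# +1 for each fact underlined, -1 for each opinion wrongly underlined
# as a fact. Negative totals are possible, and indiscriminate
# underlining tends toward a score of 0. All statements are invented.
def score_underlining(underlined, facts):
    """underlined: statements the pupil marked; facts: the true facts."""
    score = 0
    for statement in underlined:
        score += 1 if statement in facts else -1
    return score

facts = {"The book has 223 pages", "It was published in 1997"}
# A pupil who underlines three statements, only one of which is a fact:
pupil = ["The book has 223 pages", "It is boring", "It is the best book"]
print(score_underlining(pupil, facts))  # 1 fact - 2 opinions = -1
```

This also makes the Year 3 pattern concrete: a pupil who underlines, say, two facts and two opinions scores 2 − 2 = 0 despite having identified some facts correctly.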

The group tests provide further information about how children answered this question. Children encouraged each other to underline whole sentences without being aware that they contained both facts and opinions. Only one child underlined more discriminately, commenting that if a word was an adjective then it was an opinion. This child was able to highlight the relevant parts of sentences, which we rarely saw.

Children’s bias was evident

One question gave a short passage in which the Harry Potter books were described as boring. The question asked children to identify what the passage was trying to make them think – the correct option was that the Harry Potter books are boring. Success in this question depended on what children thought about the Harry Potter books. Those who didn’t like them made comments such as ‘I’ve never read them’; ‘they are boring’; ‘I don’t like boy stuff’; and were more likely to correctly identify that the passage was trying to make them think that the Harry Potter books are boring. Those who were enthusiastic about them, however, were not inclined to circle the answer that the books were boring. One child vehemently said ‘No!’ when asked to circle that answer by another in her group, before grudgingly circling the correct answer. Another child absolutely refused to circle the answer that they were boring because she liked the books so much, commenting ‘I know it’s the right answer, but I’m not circling it’.

Children relied on their prior knowledge rather than the text throughout

One question gave children the information that scientists at a chocolate company have discovered that chocolate is good for you. They were asked to rate on a 10-point rating scale whether or not they believed this was true, and then asked to write a reason for their choice of answer. The information of importance in this question was that the scientists worked for (i.e. were paid by) the company. This should have been an indication to the children that this information could have been released by the company, or took the form of advertising. Although a high proportion of children scored highly on this question, nearly every reason given was due to prior knowledge, with such responses as ‘chocolate is bad for you’, ‘chocolate has sugar in’ and ‘sugar is bad for you’ given as the main reasons.

Only two children offered an explanation that was based on a critical appraisal of the text: one child wrote that ‘they might be advertising their chocolate’, and another that ‘maybe scientists did the experiment wrong’. This accords with the Ofcom (Citation2020) findings that younger children were generally less aware of advertising, particularly when it was presented in more subtle forms.

However, it is also notable that nearly all children did score highly on this question. Most children immediately discounted the story as true, making reference to sugar content. There have been many public health campaigns in recent years that have focussed on sugar: a comment by one of the children, ‘that thing with the sugar cubes’, refers to a campaign to show children how many cubes of sugar are in common foods and drinks. This indicates that high-profile campaigns which target children through a number of social channels (pre-school, school, home) have a strong impact on children’s beliefs and could be an antidote to inaccurate information that they encounter.

Children’s reliance on prior knowledge also impacted upon their group work. One question asked about driverless cars – children were given three short pieces of text and asked to find information to rate given ‘facts’ as true or not. One instance was particularly notable: all three texts gave the information that driverless cars were not on public roads yet, and so when given the statement ‘driverless cars are not on the road yet’, children should have rated this as strongly true. However, in one group, a child was adamant that he had seen driverless cars on the road, and would not listen to his peers when they said that the text said otherwise. The child’s prior ‘knowledge’ took precedence, and the group marked down that the statement was strongly false. We saw instances of this happening throughout the group tests.

Discussion

This study’s findings correlated with those of a number of other studies about the causes of low levels of MIL and the spread of misinformation. The first is the inability to recognise misinformation, of which low levels of media literacy are a part. The other issue, identified by Scheufele and Krause (Citation2019), is whether or not people have the motivation to recognise misinformation. A number of studies have shown that ‘selective exposure’ occurs because people are inclined to access ‘belief-consistent’ information (7664). This study found that this was also the case for primary school children, whose beliefs (for example, about Harry Potter) influenced how they regarded information they were given.

DiFonzo and Bordia (Citation2007) similarly found that unverified information is more likely to be accepted as true and passed on if it conforms to the recipient’s beliefs or, conversely, to be rejected and not passed on if it does not correspond to those beliefs. The evidence from the group test in this study indicated that not only were children unwilling to accept information that did not accord with strongly held beliefs, they also wanted to impress these beliefs upon their peers. This was even the case when children were presented with evidence to the contrary: the knowledge that they believed they already had was a more important factor when answering the questions.

In this study we found that children reacted in contradictory ways to the information that could be provided by their peers rather than adults. Children consistently regarded their friends as inaccurate sources of information, yet conversely, the group tests showed that children were easily persuaded by their friends to give answers that contradicted the evidence given in the text. Those children who believed strongly that they had the right answer, and articulated this in vehement terms, often did not listen to those children who proposed other answers, even where reasons were being given: confidence was the key here. Einav et al. (Citation2020) refer to research which has been carried out on young children that indicates that they are not ‘indiscriminate in their trust’ (2). Children as young as pre-school age take into account a speaker’s age, experience or confidence, for example, to assess their trustworthiness. This was also the case in our research, with children valuing their parents as a source of information over their friends, except where their friends demonstrated an unshakeable confidence in their own answers.

These findings indicate that unstructured group work, when some participants have low levels of MIL, is not an effective way for children to engage with information. Most striking was the way in which children integrated (or failed to integrate) new information with knowledge or beliefs that they already had, and the impact of this on other children. Taking this into account is an important factor when designing MIL curricula; Mihailidis and Viotty (Citation2017) argue that calling on citizens to act as fact checkers is not sufficient for good MIL because people do not read information in this way. Instead, they suggest a framework for media literacies that is ‘relational and not individualistic’ (451). Their argument is that digital technologies have such capacity for the spread of information that trying to limit the spread is not the best approach. Therefore effective MIL must consider not only the relationship between a person and the material they are viewing but also relationships between people.

It was clear from this study that children also need more information about how to use the internet to find good quality, reliable information. The most frequent comment was how they could not trust ‘the’ internet: participants tended to view it as a singular entity, and very few participants made more nuanced references to the internet as a collection of information, some of which was more reliable and useful to them and some less so. This is likely to be because information for young children about the internet tends to focus on how they can keep themselves safe online. Ofcom (Citation2020) found that children are perceived as a vulnerable group by adults, with parents having more awareness of specific harms to children such as viewing age-inappropriate content.

Conclusion: designing a MIL curriculum

We have used the findings from this study and wider literature to design a MIL curriculum that focuses on two elements of learning: curriculum content and pedagogy. This addresses our two key findings that children were not adept at effectively navigating different sources of information and that children negatively influenced each other when they did not take into account the views of all group members, justify their thinking, or consider changing their own point of view. Therefore, working with Rupert Wegerif at the Digital Education Futures Initiative (DEFI) at Hughes Hall, University of Cambridge, we have named the curriculum that we have developed ‘Inquiring Online’, highlighting both the content and pedagogy aspects of MIL.

In terms of content, we have focused on eight areas that the UNESCO definition of MIL proposes and our study indicates that children find difficult. These are bias; differentiating between fact and opinion; misinformation; fact checking; fake images; researching online; sharing online information; and targeted advertising. Given the findings regarding children’s mistrust of the internet, there is also a focus on the positive aspects of the internet such as finding communities of people with shared interests and the wealth of information available.

When considering the pedagogical approach, we followed Mihailidis and Viotty’s (Citation2017) conception of MIL as active engagement in a process which connects people. The lessons are therefore inquiry-focussed and discussion-based. Children are supported to provide reasons to justify their thinking and also to listen to others. This conceptualises MIL as dialogic education (Wegerif Citation2012), because attention is paid to the way in which children engage with each other as well as the content of the material. Previous research from inquiry-based interventions, such as Philosophy with Children (PwC) and oracy work, has shown that this makes a positive difference in the way that children interact with each other (Kerslake Citation2019).

These findings have significant implications for the design of digital education futures. Children not only need knowledge to become digitally literate but they also need a pedagogy that focuses on their behaviours and relationships with others. Elements such as taking more equitable roles in group work and demonstrating a willingness to change one’s mind are part of a pedagogy of inquiry that can begin off-line (which is particularly important when access to technology is limited). The role of primary teachers, then, is a vital one because they can foster links between these respectful and inclusive classroom practices, and online behaviours. This will enable children to take their first steps toward global digital citizenship: respecting difference, being responsible members of online communities and understanding the complexities of the large quantities of online information that will be part of their lives. Working with primary teachers is, therefore, a key aspect of digital education curriculum design and research.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This work was supported by Dr Ron Zimmern.

Notes on contributors

Laura Kerslake

Laura Kerslake is a research associate at Hughes Hall, University of Cambridge. She is a co-investigator working on the Inquiring Learners project which is part of the Digital Education Futures Initiative (DEFI). Laura is also part of the DEFI Innovation Lab and Oracy Cambridge. Her research interests include digital literacy, dialogic education and oracy.

Judith Hannam

Judith Hannam is a research associate working on the Inquiring Learners project at Hughes Hall, University of Cambridge. She is also a project administrator with the Digital Education Futures Initiative (DEFI). Judith is a former primary teacher who is interested in collaborative education research with teachers.
