Research Article

Young people's perceptions of digital, media and information literacies across Europe: gender differences, and the gaps between attitudes and abilities

Sarah-Louise Jones & Richard Procter
Pages 435-456 | Received 23 Apr 2021, Accepted 06 Jan 2023, Published online: 05 Jun 2023

ABSTRACT

The need to develop digital, media and information literacies in young people is not a new idea. Increasing numbers of regional, national and international policies make the case and offer frameworks for such literacy development in schools. However, there is still no common agreement about what a basic level of literacy might look like across or within countries, resulting in what UNESCO described as a serious knowledge gap about the global state of digital literacy skills of youth. This article reports on the empirical findings derived from a self-assessment tool, providing a comprehensive view of these literacies in 13–18-year-old students (n = 1051) across eight countries in Europe. This article offers a significant contribution to the field, identifying gender differences and specific differences in ‘attitude minus ability’ scores relating to plagiarism and critical awareness of sources. This has implications for teaching and curriculum development across Europe.

1. Introduction

The need to develop digital, media and information literacies in our young people is not a new idea. It is widely acknowledged (Ilomäki et al., 2016; Porat et al., 2018; Redecker, 2017; UNESCO, 2018) that such literacies are vital for success in the variously described new worlds (4th Industrial Revolution: Schwab, 2016; Industry 4.0: European Parliament, 2016; The Second Machine Age: Brynjolfsson & McAfee, 2014) in which we live. This is not just a neo-liberalist perspective based on the need to develop a literate workforce but stems also from a perspective that we need such literacies to live harmoniously in multicultural societies (Oxfam, 2015) and to address the complex and even chaotic challenges of our current global world (Kurtz & Snowden, 2003), such as climate change or global pandemics.

There are an increasing number of policies at regional, national and international levels that make the case and offer frameworks for digital, media and information literacy development in schools (Ilomäki et al., 2016), where such literacies should be seen as fundamental (OECD, 2009) and a core competence for life in the twenty-first century (UNESCO, 2018). Within these policies, there is a growing convergence of the terms ‘digital’, ‘media’ and ‘information’ in relation to the literacies required for young people to successfully harness the opportunities technology provides (Buckingham, 2010). Rather than seeing them as three separate entities, they are increasingly perceived as intertwined to represent a broader repertoire of competencies that are required for effectively navigating our day-to-day lives. Some countries and organisations use the term ‘digital literacy’ to incorporate not only information and media but other literacies such as traditional and computer software/hardware literacies (UNESCO, 2018). The UNESCO Digital Literacy Global Framework (DLGF) project defines digital literacy as:

the ability to access, manage, understand, integrate, communicate, evaluate and create information safely and appropriately through digital technologies for employment, decent jobs and entrepreneurship. It includes competences that are variously referred to as computer literacy, ICT literacy, information literacy and media literacy.

(UNESCO, 2018, p. 6)

This definition incorporates the sum of associated literacies that are increasingly described as Digital Literacy or Digital Competence (Redecker, 2017), although there is some debate on the difference between ‘literacy’ and ‘competence’, with competence denoting a more profound development (Ilomäki et al., 2016). For example, the European Union (EU) defines Digital Competence as ‘the confident, critical and responsible use of, and engagement with, digital technologies for learning, work, and for participation in society’ (European Commission/EACEA/Eurydice, 2019, p. 25). For the purposes of this article, we subscribe to the UNESCO (2018) definition and use the term ‘digital literacy’ to incorporate digital, media and information literacies together and, from now on, refer to the combination of these as digital literacy.

The UN Sustainable Development Goal (SDG) target 4.4 states that by 2030, we must: ‘substantially increase the number of youth and adults who have relevant skills, including technical and vocational skills, for employment, decent jobs and entrepreneurship’ (United Nations, 2015). Although one of the three indicators for this, indicator 4.4.2, looks for ‘youth/adults who have achieved at least a minimum level of proficiency in digital literacy skills’ (United Nations, 2015), there is still no common agreement about what a basic level of digital literacy might look like across or even within countries (Wallis & Buckingham, 2019). This may explain why UNESCO (2019) found ‘a serious knowledge gap about the global state of digital literacy skills of youth and adults even though these skills play an increasingly important role in achieving the SDG target’ (p. 5). In other words, it is difficult to address the knowledge gap in the state of digital literacy if there is no agreement on what constitutes different levels of digital literacy.

This article explores what levels of digital literacy might look like for 13–18-year-old students across eight countries in Europe, drawing on a dataset of 1051 students. In doing so, it additionally contributes to narrowing the knowledge gap noted by UNESCO (2019). It does this by providing a comprehensive view of digital literacy through the dataset. It not only provides data on student digital literacy but also provides a comparison of student attitudes towards such literacies and similarities and differences according to gender. That this article reports on attitude is particularly significant, as there is little convergence across countries regarding the attitudes and values placed on such literacies (UNESCO, 2018), and to date, this has not been documented from a student perspective.

2. A framework for considering digital literacy

Conceptions of literacy may vary depending on viewpoints. For example, Marsh (2016) noted differences between literacy as a situated social practice that is tied to context (such as political, economic etc.) as opposed to literacy as an autonomous model that proposes a set of neutral skills. In each case, the analysis of such different conceptions of literacy will consequently require different critical lenses. For example, viewing sequential levels of literacy complexity, such as collect, create, transform and safely use (Lazonder et al., 2020), could suggest an autonomous view of literacy. In contrast, the basic, required and improvement model suggested by Griffin et al. (1990) – where the basic level is about having the minimum skills to access society; the required level is about the skills needed to operate effectively in that society; and the improvement level is about empowerment, where skills enable a person to take direction of their life – is more suggestive of a situated model.

Markauskaite (2006) developed an analytical framework for ICT literacy based on a general set of dimensions of analysis that can be perceived as geo-political in nature, these being intended, implemented and achieved. The intended dimension refers to how policy documents articulate educational aims through targets, goals or strategic objectives. The implemented dimension refers to how the intended dimension is operationalised at a local level, for example in specific school contexts, and the achieved dimension refers to the results of that implementation through learner attainment. More recently, Bulfin and McGraw (2015) have repurposed the 3D model developed by Green (1988) and then Green and Beavis (2012). This original model was used for the analysis of English language literacy and was derived from bringing together three different positions: ‘operational literacy’, ‘cultural literacy’ and ‘critical literacy’ (Green, 2002, p. 26). Operational literacy is presented at a functional level and encompasses the development of skills with tools or processes which enable ‘function’. Cultural literacy is that which gives meaning, both in the way something is understood within a situated context and in the way that understanding enables creation. Critical literacy stems from the work of Freire, who saw literacy as a form of empowerment (Freire, 2010). In this element, power itself cannot be separated from literacy, and degrees of critical engagement are needed to enable full comprehension. These three are not envisioned as sequential; rather, they overlap, thus enabling cohesive literate practices.

Thus, the 3D model sits in the frame of a situated conception of literacy and has resonance with our conceptualisation of digital literacy, which is part of a wider repertoire of skills, knowledge, values and attitudes within the field of Global Competence – a term inseparable from context, power and function (Vaccari & Gardinier, 2019). Indeed, the results that we report on in later sections come from one section in a much larger survey on Global Competence in its entirety. The 3D model has a degree of flexibility in its application, having now been used in a wide variety of curricular areas, within teacher education methods and to discuss the intersection of maths, English and technology (Green & Beavis, 2012). In particular, Beavis (in Green & Beavis, 2012) demonstrated the usability of this model in relation to digital literacy, and there have been several studies to explore the 3D model in this context (see Marsh, 2016, or Colvert, 2015).

3. Digital literacy within the European context

Green (2002, p. 31) stated that ‘literacy is always-already political’. As our research is located within Europe, we briefly foreground our findings by providing an overview of the complexities within digital literacy in that context. The policy context of digital literacy across Europe is varied, in spite of recommendations such as the Council Recommendation of 22 May 2018 on key competences for lifelong learning (European Council, 2018). Not all governments within Europe legally require schools to teach digital literacy. For those countries that do have strategies or policies in place, their implementation is often not monitored or assessed (European Commission/EACEA/Eurydice, 2019), and the responsibility for policy implementation resides, for the majority, with external agencies. For example, in the Netherlands, although digital literacy education is not compulsory, SLO (the institution that advises the government on the curriculum to be taught) has developed, together with teachers, example digital literacy learning materials containing curriculum content around computational thinking, media literacy, information skills and basic ICT skills (SLO, 2019).

However varied the European landscape might be, there often appears to be a mismatch between policy and practice (Madsen et al., 2019). For example, in Norway, although digital literacy is part of the general curriculum (UDIR: Norwegian Directorate for Education and Training, 2020), government surveys suggest there is a discrepancy between policy and practice, which Madsen et al. (2019) attributed to the nature of policy development and political influence. This has synergies with the United Kingdom (UK), where a report on digital skills stated that ‘digital literacy should be seen as a core skill alongside English and maths’ (ECORYS, 2016, p. 5). However, this recommendation has not been transferred between departments to be included in the UK national curriculum. Indeed, digital literacy is only mentioned once in the primary national curriculum (DfE, 2013, p. 178) and not at all in the secondary national curriculum.

In other countries, policies and implementation decisions on how digital literacy is envisaged and taught are localised rather than centralised. For example, in Germany there is no central responsibility: the 16 federal states each have their own regulations, although a federal coordinating body (Kultusministerkonferenz – KMK) has recently developed a strategy for implementation. Owing to the sometimes major differences in the financial positions of the individual federal states (OECD Stats, 2020), it is as yet unclear how successful this implementation will be across the country as a whole.

In Belgium, there are three culturally sensitive curricula (German, French, Flemish), and each views the place of digital literacy differently. For example, in French Belgium the new curriculum is still being developed, whilst in Flemish Belgium a new curriculum based around 16 key competencies has been implemented, with the ‘digital literacy skills approach’ resting on four pillars – digital skills, ICT, media literacy and computational thinking (VOV: Flanders Education and Training, 2017) – and underpinned by the EU Digital Competence Framework for Citizens (European Commission, 2016).

The EU Digital Competence Framework for Citizens (European Commission, 2016) is a tool that helps citizens improve their digital competence through self-evaluation, setting learning goals, identifying training opportunities and facilitating job searches. However, it is aimed at all citizens, not just school students. There are a few other online tools that currently enable people to measure their digital literacy, such as the Digital Skills Accelerator (a self-assessment tool developed out of an Erasmus+ project which began in 2017), SELFIE (which enables schools to discover their digital potential using the whole school community) and the Digital Competence Wheel (a self-assessment tool developed by a company in Denmark for those already in employment). However, there is still a paucity of evaluative tools on digital literacy specifically aimed at secondary school students that span Europe and address the knowledge gap identified by UNESCO (2019) and discussed in the introduction.

4. Background to this study

This article reports on the empirical findings derived from the first domain (Domain A: digital, media and information literacies; see Note 1) of the Global Competence Survey (GCS) (https://www.globalcompetencesurvey.org/), a self-assessment tool created as part of an EU Erasmus+ KA2 project. Although the notion of Global Competence is contested by some (Jones & Buchanan, 2023), it has been defined by the OECD as ‘the capacity to examine local, global and intercultural issues, to understand and appreciate the perspectives and worldviews of others, to engage in open, appropriate and effective interactions with people from different cultures, and to act for collective well-being and sustainable development’ (OECD, 2018, p. 7).

The GCS was developed from the Global Competence Framework, which was built over iterative phases during the first year (2015–16) of the project. The initial phase consisted of a critical systematic literature review (Jones, 2018) of frameworks and models that scaffold 13–18-year-old learning in Global Competence. Although nine models (Jones, 2018) were located that were relevant to Global Competence, the review identified that none had practical application for 13–18-year-old school students. However, the analysis of the frameworks was used as a basis for a focus group discussion between the project team members (10 teachers from Italy, Germany, Belgium, the Netherlands and Norway, and one university researcher from the UK) aimed at identifying an initial set of Global Competencies that might be further explored with a wider audience. Using the set of Global Competencies identified in the initial phase, two questionnaires were developed and deployed across the partner countries, the first aimed at people studying or working abroad (n = 97 responses), the second at those who employ or offer courses to foreign nationals (n = 24 responses). The analysis of these, in conjunction with the focus group discussion and the nine frameworks already identified in the critical systematic literature review, enabled the project team to design a Global Competence Framework specifically for use in 13–18-year-old education across Europe. Using a design-based approach (Barab & Squire, 2004), it was created collaboratively across six different European countries, taking in the views of multiple stakeholders including teachers, multinational organisations and companies, students, universities, employees and academic literature. This framework was then presented at six different national conferences during the 2015–16 academic year to seek participant feedback. From this participative process, the framework was finalised and developed into an online survey tool.

The first of the four domains covered in the GCS is ‘Digital, Media and Information Literacies’, and data analysis from this domain is the focus of this article. The construction of both the framework and the survey tool has relevance to the field, given the lack of alignment across countries regarding attitudes towards digital literacy (UNESCO, 2018). In addition, the contextualisation of digital literacy within the wider field of Global Competence, we argue, enables students to situate the purpose and potential value of such literacies.

5. Method

5.1 Survey design

The six elements of the Global Competence Survey that make up Domain A are:

  • information access;

  • communicate using ICTs;

  • critical awareness of sources;

  • using digital tools;

  • ethical awareness;

  • online safety.

Together, these six elements contain 25 separate items. Details of how the elements are constructed can be seen in Appendix 1, which also maps them against the EU Digital Competence Framework (European Commission, 2016).

5.2 The Global Competence Survey

The GCS was developed as an online questionnaire written by the authors, using a combination of HTML forms and the Perl scripting language to collate the data and to produce a series of graphs and a detailed results report for participants and teachers on completion of the survey. The survey was created with one section per page to avoid the negative impact of page scrolling (Toepoel et al., 2009), a requirement also fed back to the authors by students during user testing. The survey uses a dual Likert (1932) scale format (see Pedder et al., 2010; Procter, 2015) which was built on the work of the Improving School Effectiveness Project (MacBeath & Mortimer, 2001; Robertson et al., 2001). This format allows respondents to enter two responses for each item (see Figure 1).

Figure 1. Example of dual Likert-scale statements taken from the Global Competence Survey.
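To make the dual-scale format concrete, the sketch below shows one way a single item’s paired responses could be represented for analysis. It is written in R (the package used for the analysis in section 5.3); the positive categories (‘good’/‘excellent’ and ‘important’/‘crucial’) come from the article, while the lower scale labels, the item data and the variable names are our own illustrative assumptions.

```r
# Minimal sketch of how one dual-Likert item might be represented in R.
# "Good"/"Excellent" and "Important"/"Crucial" are taken from the article;
# the lower scale labels are assumptions.
ability_levels  <- c("Poor", "Satisfactory", "Good", "Excellent")
attitude_levels <- c("Not important", "Quite important", "Important", "Crucial")

# Hypothetical responses to item A1(1) "access information using the internet"
a1_1 <- data.frame(
  ability  = factor(c("Excellent", "Good", "Satisfactory"),
                    levels = ability_levels, ordered = TRUE),
  attitude = factor(c("Crucial", "Crucial", "Important"),
                    levels = attitude_levels, ordered = TRUE)
)
a1_1
```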

Thus, respondents were able to rate their ability with regard to a particular competence item and also their attitude, construed as perceived importance, towards that competence. As each statement is ranked against two different constructs, the student begins to build a profile of their ability and attitude about an item, an element, a domain and then Global Competence in its totality. This is then presented back to them as soon as they submit their final answer, in the form of an online report containing an overall polar map of their Global Competence both as ability and as attitude (see Figure 2), a breakdown of their Global Competence by domain (see Figure 3, where the example given is Domain A), some qualitative feedback and activities in which they should engage to address any areas that have been identified as requiring work.

Figure 2. Example polar maps of Global Competence for Ability and for Attitude.

Figure 3. Example of Global Competence detailed feedback: Domain A.

The online survey was piloted with secondary school students from the project team countries on two occasions, in April 2016 (n = 85) and March 2017 (n = 57). Two forms of feedback were collected from students and their teachers during each pilot phase, enabling refinement of the GCS to its final form: feedback on conceptual and language understanding of the item bank, and feedback on the usability of the online survey tool and the results report generated on its completion. The final GCS was launched on 1 September 2017. At this time, the survey was published on the project website and all project partners were asked to cascade the GCS to their networks, who were in turn asked to cascade it to theirs, and so on, using a chain-referral method. The GCS was also disseminated at six international education conferences during this time, with delegates likewise asked to cascade it to their networks.

The data generated were anonymous at the point of collection, with only the following demographic data collected: age, gender, country, name of school, club or other organisation, name of teacher and teacher-assigned student number. The latter three items were collected so that, if a student lost their results, their teacher could supply us with the details contained in these three answers and we could retrieve the relevant data report and forward it on to the teacher. In this way, we never needed to record any personal student data. It appears from the data timestamps that students, without exception, took the survey within their classroom lesson time.

For the main data-gathering period, 1051 responses were collected across 10 countries between 1 September 2017 and 31 August 2020. Cronbach’s alpha (Cronbach, 1951) was calculated for the GCS, giving a result of 0.923. Cronbach’s alpha provides an estimate of internal consistency reliability for the questionnaire, and a value above 0.90 is generally regarded as very highly reliable (Cohen et al., 2018).
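For readers who want to reproduce this kind of reliability check, a minimal sketch in R follows. The authors do not describe their exact code, so the use of the psych package and the simulated item matrix are assumptions; with the real 25-item response data, the same call would yield the reported alpha.

```r
# Minimal sketch of an internal-consistency check, assuming the 25 Domain A
# items are coded numerically (e.g., 1-4), one row per respondent.
# The psych package and the simulated data are illustrative assumptions.
library(psych)

set.seed(42)
items <- as.data.frame(matrix(sample(1:4, 1051 * 25, replace = TRUE), ncol = 25))

alpha_result <- psych::alpha(items)  # Cronbach's alpha for the item set
alpha_result$total$raw_alpha         # the article reports 0.923 for the real GCS data
```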

5.3 Analysis

Between September 2017 and August 2020, the GCS was completed by 1051 respondents, and these data were loaded into the R statistical package (see Note 2) for analysis.

The main analysis consisted of calculating frequency tables for each of the 25 statements within the six elements of Domain A. This was carried out for both the ability and the attitude Likert-scale results. The positive values were then added together: on the ability side of a statement, the ‘excellent’ and ‘good’ scores were summed, and on the attitude side, the ‘crucial’ and ‘important’ scores. This provided one number for ability in relation to the statement and one for attitude, allowing comparisons between participants’ ability with regard to an item and their attitude towards that item. Equally, the gap between participants’ attitude and their ability could be calculated.
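A small sketch of this calculation in R is given below; the data frame, column names and responses are hypothetical, but the positive categories and the attitude-minus-ability gap follow the procedure described above.

```r
# Hypothetical responses for one item; in the real survey each of the 25
# statements has an ability column and an attitude column like these.
survey <- data.frame(
  a1_2_ability  = c("Good", "Poor", "Excellent", "Satisfactory"),
  a1_2_attitude = c("Crucial", "Important", "Important", "Not important")
)

positive_ability  <- c("Good", "Excellent")
positive_attitude <- c("Important", "Crucial")

# Percentage of respondents giving a positive rating on each side of the item
ability_pct  <- 100 * mean(survey$a1_2_ability %in% positive_ability)    # 50
attitude_pct <- 100 * mean(survey$a1_2_attitude %in% positive_attitude)  # 75

# Attitude-minus-ability gap; for the real item A1(2) this is 65.8 - 50.7 = 15.1
gap <- attitude_pct - ability_pct
```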

A cross-tabulation of the results by gender was also calculated. This provided attitude and ability scores and the gap between these scores for each gender. This analysis provided insights into gender differences across the range of competences.
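The sketch below illustrates this cross-tabulation in R with row percentages, the normalisation the authors use in section 6.3 to offset the uneven group sizes; the data and labels are again illustrative.

```r
# Illustrative gender-by-response cross-tabulation.
responses <- data.frame(
  gender  = c("Female", "Female", "Female", "Male", "Male"),
  ability = c("Good", "Excellent", "Poor", "Good", "Satisfactory")
)

counts   <- table(responses$gender, responses$ability)
row_pcts <- 100 * prop.table(counts, margin = 1)  # each row sums to 100%
round(row_pcts, 1)
```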

5.4 Limitations

There are several limitations to this study. The GCS is in English and, for the majority of the respondents, this is not their first language. To mitigate possible limitations in language proficiency, the GCS was piloted over two iterations with young people (aged 11–19) from seven European countries. During the pilots, we specifically sought feedback about the comprehension of both the statements and the instructions on how to undertake the survey. In both iterations, this feedback helped to refine the GCS so that it could be understood linguistically and conceptually by young people from various European linguistic backgrounds.

Additionally, the GCS is a self-reflective assessment tool, and as such it does not measure actual performance, either through observation of authentic activities or through bespoke testing. The results are therefore reliant on the accuracy of the students’ self-reports. However, the tool does use both constructs of ability and attitude, which enables a deeper interrogation of the place and perceptions of digital literacy, not only for its own sake but within the context of Global Competence. Moreover, this is a self-improving survey, as it enables students and teachers to work with their results in a systematic manner through reflective exercises and an action-plan tool which are integral to the automatically generated results PDF.

6. Findings

In this section, the demographics of the sample are initially presented by age, gender and country, followed by analysis of the GCS data.

6.1 Age, gender and country

The majority of respondents, 82.2% (n = 864), were 13–16-year-old students. One person did not respond to this question. In terms of gender, 553 (52.6%) identified as female and 473 (45.0%) as male, with 25 (2.4%) not responding to this question.

Although overall eight countries are represented in the results, 85.1% (n = 895) of the participants are from either Italy or Germany, as shown in Table 1.

Table 1. Respondents by country.

6.2 Digital, media and information literacies

The results for perceived ability and attitude towards the six elements contained in Domain A of the survey are presented in Tables 2–7. Students scored the 25 statements by ability and attitude. The following tables present highlighted results for ability, attitude, and the gap between attitude and ability scores.

Table 2. Results for A1 – Information Access.

6.2.1 Element A1 Information Access

Table 2 shows that A1(2) – access information using libraries and archives – has an attitude–ability gap of 15.1%: 65.8% of respondents consider this skill to be important or crucial, but only about half (50.7%) consider they have the ability to do it. A1(1) – access information using the internet – has the highest ability (92%) and highest attitude (94.2%) scores of the whole survey. It is also the only survey statement with an ability score above 90%.

6.2.2 Element A2 Communicate using ICTs

Table 3 shows that the lowest perceived ability score in this element was for A2(3) – the use of spreadsheets (39.8%); however, over half the respondents (52.8%) considered this an important or crucial skill to have. Furthermore, 72% of respondents rated their ability for A2(4) – the use of social media for learning purposes – as good or excellent, but only 56.9% considered this skill to be important or crucial. This gives an attitude–ability gap score of −15.1%, the largest negative score in the survey. The attitude–ability gap in this element ranges between −15.1% and 13%, a difference of 28.1%, the biggest range across the six elements.

Table 3. Results for A2 – Communicate using ICTs.

6.2.3 Element A3 Critical Awareness of Sources

Table 4 shows that 71.1% of respondents considered themselves able to differentiate between reliable and unreliable sources (A3(1)). Their perceived ability was slightly lower for A3(2) – analyse and evaluate media content (67.3%). For A3(3), just over half (56.6%) considered they were able to see biased opinions in information. Respondents rated the importance of all these skills higher than their abilities. The attitude–ability gap in this element ranges between 12.3% and 15.7%, the narrowest range in this study, a difference of 3.4%.

Table 4. Results for A3 – Critical Awareness of Sources.

6.2.4 Element A4 Using Digital Tools

Table 5 shows that two items in this element scored below 50%: A4(1) – the ability to plan, shoot and broadcast videos – at 42.7%, and A4(2) – plan, shoot and broadcast podcasts – at 27.5%, the lowest ability score across the whole survey. Further, for A4(3) – use social media appropriately in a variety of different contexts – respondents rated their ability slightly higher than the importance of the skill, giving an attitude–ability gap of −1.9%.

Table 5. Results for A4 – Using Digital Tools.

6.2.5 Element A5 Ethical Awareness

Table 6 shows that, for every statement in this element, respondents rated the importance of the skill at least 9% higher than their perceived ability, with A5(4) – take other people’s views and wishes into consideration when working collaboratively – receiving the highest attitude score (80.3%). However, A5(2) – reference work that is not your own (avoiding plagiarism) – has the highest attitude–ability gap, 17.5%, across all 25 statements in the survey. The attitude–ability gap in this element ranges between 9.3% and 17.5%, a difference of 8.2%.

Table 6. Results for A5 – Ethical Awareness.

6.2.6 Element A6 Online Safety

Table 7 shows that the attitude scores for this element were mostly high: five of the six statements scored 85.2% or higher, and two scored over 90% – A6(6) – recognise online bullying (90.3%) and A6(4) – keep your online personal information secure (92.4%) – showing a high concern for online safety among students. The highest attitude–ability gap, however, was for A6(2) – create acceptable digital identities and digital footprints, at 16.9%. Generally speaking, this element contains the highest overall perceived ability and attitude scores. The attitude–ability gap in this element ranges between 11.9% and 16.9%, a difference of 5%.

Table 7. Results for A6 – Online Safety.

6.3 Differences of perceived abilities and attitudes towards the six elements by gender

A cross-tabulation was calculated for scores by gender. Owing to uneven sample sizes across the two groups, row percentages were calculated in the analysis to allow for comparison. These data are presented in Table 8.

Table 8. All items ranked by female attitude–ability gap.

Table 8 shows that the total attitude–ability gap scores for females ranged from 20% to −14.4% (a difference of 34.4%) and for males from 16.9% to −16.2% (a difference of 33.1%).
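Table 8’s ordering can be reproduced with a simple sort; the sketch below assumes a hypothetical data frame with one row per statement and per-gender gap columns (the three example rows use gap values quoted later in this section).

```r
# Hypothetical per-item summary with per-gender attitude-minus-ability gaps
items <- data.frame(
  item       = c("A6(3)", "A5(2)", "A2(4)"),
  female_gap = c(20.0, 19.5, -14.4),
  male_gap   = c(8.2, 14.5, -16.2)
)

# Rank all items by the female gap, largest first (the ordering of Table 8)
items[order(-items$female_gap), ]
```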

The biggest differences in attitude–ability gap scores between females and males were as follows:

  • A1(3) – find relevant organisations for information, with attitude–ability scores for females 13.5% and males 1.1%, giving a difference of 12.4%;

  • A6(3) – recognise spam and potentially dangerous emails, females 20% and males 8.2% with a difference of 11.8%;

  • A3(2) – analyse and evaluate information in media content, females 17.5% and males 6.8% with a difference of 10.7%.

All other attitude–ability gap difference scores were below 10%, i.e. more closely aligned. The closest alignments with regard to the attitude–ability gap between females and males were:

  • A2(1) – use programs for word processing, with attitude–ability scores for females at −0.5%, and males at 0%, giving a difference of 0.5%;

  • A3(3) – see biased opinions in information, with attitude–ability scores for females at 13%, and males at 12.2%, giving a difference of 0.8%.

In both A2(1) and A3(3), the attitude–ability gap was wider for females than it was for males. In all other statements, the attitude–ability gap showed more than a 1% difference.

In addition, Table 8 presents the top five attitude–ability gap scores for both genders. For females these were:

  • A6(3) – recognise spam and potentially dangerous emails (20%);

  • A5(2) – reference work that is not your own (avoiding plagiarism) (19.5%);

  • A3(1) – differentiate between reliable and unreliable information and information sources (19.4%);

  • A4(4) – choose the best way to interact digitally with people in a range of different circumstances (19.2%);

  • A6(5) – know what to do if you see something that makes you feel uncomfortable online (18.5%).

Two of these attitude–ability gap scores were concerned with A6 – Online Safety. The other three were from A3 – Critical Awareness of Sources, A4 – Using Digital Tools and A5 – Ethical Awareness. These gap scores are all greater than those of males for the same items.

The top five attitude–ability gaps for males were:

  • A1(2) – access information using libraries and archives (16.9%);

  • A2(4) – use social media for learning purposes (16.2%);

  • A6(2) – create acceptable digital identities and digital footprints (15.3%);

  • A5(2) – reference work that is not your own (avoiding plagiarism) (14.5%);

  • A6(6) – recognise online bullying (13.9%).

Again, two of these attitude–ability gaps were concerned with A6 – Online Safety, although they concern different statements to the female attitude–ability gaps. The other three were from A1 – Information Access, A2 – Communicate using ICTs and A5 – Ethical Awareness.

The lowest attitude–ability gap score for both genders was for A2(4) – use social media for learning purposes, where both genders considered themselves to have good ability (females 73.1%, males 71.2%) but did not consider the skill important (females 58.7%, males 55%), producing gap scores of −14.4% and −16.2% respectively.

Overall, out of 25 items, the attitude–ability gap score was higher for males in only five items, and there were only three items where males rated the importance of competences higher than females. These are discussed in section 7.2.

7. Discussion of key findings

The following section discusses the most significant results from this survey, relates them to the three dimensions – operational, cultural and critical – of the 3D model, and ends with the implications these have for pedagogy and curriculum development.

7.1 Relating the results to the 3D model

The notion of information and its importance in today’s society has been variously described (Castells, 2000). Buckland (2017) argued that whilst there can be no such thing as a non-information society, there is an increasing intensity in the way information dominates our lives as we move away from oral traditions to technological platforms hosting a multiplicity of facts, figures, knowledge, data, intelligence, evidence and other such materials that contribute to the notion of information. However, information does not exist in isolation; rather, it is shaped, presented, promoted and reposted in various ways depending on, for example, value or power (Park, 2017). Moreover, information is not equally accessible to all (sometimes intentionally), and some information is promoted and redistributed at higher rates, creating gaps in who has access to what and an imbalance of information acquisition. Using the 3D model (Green, 1988; Green & Beavis, 2012), it appears that students have developed the operational skills needed for information access. Indeed, when looking at access to information using the internet, the scores for ability and attitude are the highest across the whole survey; it is the only survey statement with an ability score above 90%. However, there is a large gap (approximately 40% for both ability and attitude) between accessing information via the internet and via libraries and archives. Although this difference may be indicative of the familiarity and immediacy of information via digital tools, through the critical lens it may demonstrate a lack of understanding of the important role libraries still play in accessing information and education. This mirrors a more general and current misunderstanding of the purpose of libraries in supporting education (Julien & Barker, 2009). Indeed, the place and perception of libraries as central organisations in society is currently under debate (Juchnevič, 2014), not least because of digital transformations and the diminishing role of libraries as gatekeepers of knowledge (Zeegers & Barron, 2010).

However, in a study commissioned by the Pew Research Center, libraries are seen not only as key in developing and serving communities but ‘as part of the educational ecosystem and as resources for promoting digital and information literacy’ (Horrigan, 2015). Moreover, de Jager et al. (2018) found a link between library use and increased student learning outcomes. Whilst search engines such as Google may be more accessible and cover more sources, the quality of sources is much higher in library databases (Brophy & Bawden, 2005). The lower library-related results are of particular concern for those students intending to move from school to university, where independent study, management of their own learning, and a critical understanding of the place libraries have in this process are crucial (Smith et al., 2013).

When we view the results for information access in parallel with those for critical awareness of sources, further points are illuminated. For example, whilst students report being extremely able at accessing information, 20% fewer students had the critical skills to differentiate between reliable and unreliable sources. There is also a somewhat surprising disconnect between this and their ability and attitude towards seeing biased opinions in information. Their understanding of the place bias has in information, and their ability to recognise it, are both low. This is of particular concern if viewed through a critical and cultural lens, where these competences are vital in a post-truth society filled with fake news (Pangrazio, 2018), with the resultant implications this can have for democracy (Farkas & Schou, 2020), as seen in the storming of the US Capitol building or in the UK Brexit campaign. Our results, however, are in line with those of other researchers (see Buckingham, 2010; Matusiak et al., 2019; Shen et al., 2018) who have also noted that students lack such skills across the critical, cultural and operational dimensions, especially concerning online images that appear in news reports and fake images more generally. Talwar et al. (2020) suggested that fake news is more prevalent on social media as users share ‘news’ instantly. Given that, for example, 55% of young people in the UK (aged 12–15) find out about news on social media (OFCOM, 2020b), it is concerning that they do not appear to have the skills to see biased opinions in information.

This critical inability to see biased opinions appears at odds with students’ cultural perception that they know how to use social media effectively, which scores 20% higher in both A2(4) – for learning purposes – and A4(3) – in a variety of different contexts. However, when we look at the results for how important they consider social media for learning, we see a low score, suggesting that students have a specific view of what constitutes learning (for example, perhaps school-based activities). This has similarities to the findings of others (see, for example, Martin et al., 2018); however, with the growing emergence of third-space learning and its place in education (Schuck et al., 2017), the conception of how social media can support learning is perhaps something that needs addressing by teachers (Mao, 2014), who may themselves need scaffolding (Bruner, 1986) in this area.

Interesting points were also uncovered when looking in more depth at statements within a particular element, for example A6 – Online Safety. With an increase in online activity by young people, parents are becoming more concerned about online safety (OFCOM, 2020a). Online safety and protecting children is an important issue (De Wolf, 2019; Ronchi & Robinson, 2019) and was especially highlighted during the global COVID-19 pandemic, when children were spending more unsupervised time online owing to school closures (OECD, 2020). Generally, participants in our survey see online safety as very important, with slightly lower scores concerning their abilities, which aligns with the work of other researchers in this area (Agosto & Abbas, 2017; Badillo-Urquiola et al., 2019). Within the online safety element, one statement stood out, suggesting that students lacked an awareness of their own role in maintaining online safety. When viewing online safety through the cultural lens, less than half the students had good or excellent skills in creating acceptable digital identities and digital footprints. In other words, they do not perceive a link between this and keeping their online personal information secure. Their view on the importance of developing acceptable digital footprints and digital identities was also at the lower end (64.8%), giving an attitude–ability gap of 16.9%, the second highest across all statements. However, there is a direct link between what people post online and how safe they are (Shillair et al., 2015).

As information increasingly dominates our lives, we each have a moral responsibility to use it ethically. Yet some authors (Risquez et al., 2011; Zimerman, 2012) have noted that plagiarism is on the increase, although, according to Sureda-Negre et al. (2015), there is still a paucity of research on plagiarism among pre-university-aged students. In our research, the statement on plagiarism – A5(2) – received the greatest gap between attitude and ability, at 17.5%. So, whilst students realised how important this is (the critical dimension), they did not feel they had the skills to prevent themselves from plagiarising work (the cultural dimension). This is similar to the findings of Ercegovac (2005), who found that whilst a student might demonstrate understanding of plagiarism, they may be unable to apply that understanding in the context of their own work. Nor did students appear to have an understanding of the Creative Commons licensing process. This has clear implications for teachers, who need to determine why, how and when they teach students about plagiarism so that students can both contextualise and effectively avoid plagiarising the work of others.

The ability to make appropriate use of ICT tools is of vital importance, not least because of our reduced co-location owing to the COVID-19 global pandemic, but also more generally as we spend more and more time working and collaborating online. Within the school setting, the ability to use certain tools can also impact on learner outcomes in specific subjects. For example, the importance of being able to use spreadsheets has been highlighted by researchers (e.g. Abramovich et al., 2010; Benacka, 2016) specifically in the cultural dimension, where the application of these operational skills is important for the development of understanding and knowledge in STEM subjects. In our study, the use of spreadsheets received the second lowest perceived ability score, with only 39.8% believing themselves to be good or excellent at this skill, even though it is seen as a core generic tool in secondary education (Leitão & Roast, 2014). Females scored lower on ability and higher on importance than males, which is of particular concern in relation to spreadsheet use in STEM subjects, where females are underrepresented (World Economic Forum, 2020). The low ability score across both genders is in line with other research, for example Lim (2005), who found that a significant proportion of students entering university were unable to use spreadsheets. Although the majority of students scored their ability low, over half the respondents ranked this as an important or crucial skill (52.8%), suggesting that, within the critical dimension, they have some understanding of how important the skill is.

The US-based Association of College and Research Libraries (ACRL) noted how important images and visual literacy are in today’s society and how this ‘contemporary culture is changing what it means to be literate in the 21st century’ (ACRL, 2011). With the rise of visual culture and the emergence of new apps (e.g. TikTok or Instagram), it is important to know not just how to interpret visual media but also how to create videos. However, in our survey, video creation scored low in both ability (42%) and attitude (47%), which appears at odds with an age group that are heavy users of YouTube (Anderson & Jiang, 2018), suggesting perhaps that they are mainly consumers rather than creators.

The ICT skill that produced both the lowest attitude (37%) and the lowest ability (27%) scores was creating podcasts. Although podcast listening in the UK, for example, increased by 24% between 2018 and 2019, with 7.1 million people now listening to podcasts each week (OFCOM, 2019), it would seem that podcast creation is of little interest to this age group. Although the reasons for this are unclear, it may be related to how such skills are perceived in schools in comparison to the use of word processing, presentation and spreadsheet packages. It could be associated with lower operational, cultural and critical skill levels in teachers, or related to assessment methods, which often drive teaching and learning strategies (Torrance, 2012).

7.2 Gender differences and similarities

A variety of studies have looked at gender differences in computer and information literacies, but far fewer have looked at digital and media literacies. Punter et al. (2017) suggested that gender differences can be accounted for in different ways depending on theoretical perspective. For example, Socialisation Theory (Charlton, 1999) would suggest that different genders are taught and influenced by environmental forces, whereas Attribution Theory (Volman, 1997) looks at the perceived femininity or masculinity of objects.

Within studies on gender differences, the focus has tended to be on perceived or actual ability differences. Uniquely, our article investigates not only perceived ability but also attitudes, and broadens the view to incorporate digital, media and information literacies together.

What can be seen from our study data is that there is a gender difference in perceived ability scores. Males had higher perceived ability scores in 13 of the 25 competencies, by an average margin of 2.9%, whereas females had higher scores in the other 12 competencies, by an average margin of 5.7%. Weighted across all 25 competencies, female ability scores were therefore on average 1.2% higher than those of males. This finding concurs with the findings of previous researchers (Fraillon et al., 2020; Gebhardt et al., 2019; Punter et al., 2017).

Other studies have highlighted that males were more confident about using ICT than females (Gebhardt et al., 2019). Uniquely, we also asked about the importance of this use, and again a gender difference was found. Overall, females considered the competencies to be more important or crucial in 22 of the 25 listed items, by an average margin of 5.9%. The only three instances where males rated the importance of a competence higher than females were A4(1) – plan, shoot, edit video; A6(2) – create digital identities; and A4(2) – plan, shoot, edit podcasts, with an average margin of 4.1%. Across all 25 competencies, the average attitude score for females was 4.7% higher than that for males.
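As a reading aid, the overall averages quoted in this and the preceding paragraph follow from weighting each group of items by its size; the arithmetic below is our reconstruction of that step, not a calculation reported in the survey output.

\[
\text{ability: } \frac{12 \times 5.7\% - 13 \times 2.9\%}{25} \approx 1.2\%,
\qquad
\text{attitude: } \frac{22 \times 5.9\% - 3 \times 4.1\%}{25} = 4.7\%
\]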

Other researchers have suggested that female students perform better on tasks involving communication, design and creativity, and male students better on more technical tasks (Gebhardt et al., 2019, p. 69). In our dataset, it would be difficult to disaggregate the items into creative and technical (for example, plan, shoot, edit video).

More helpfully, Punter et al. (2017, p. 777) found that females outperform males in evaluating and reflecting on information and that there were no significant gender differences in applying technical functionality. Our findings back up this claim, with females having higher overall ability scores. Further, our data show that the majority of females found these competencies to be important or crucial.

7.2.1 Overall gaps

Thus, overall, females had higher ability scores and higher attitude scores than males, on average by 1.2% and 4.7% respectively. Females not only reported greater ability across this range of competencies but also considered the competencies to be of greater importance. This second finding is at odds with Cai et al. (2017), who found in their meta-analysis of gender and attitudes to technology use that males believed technology to be significantly more important than females did.

8. Conclusion

This article presents a comprehensive view of young people’s abilities and attitudes towards digital literacy across eight countries, drawing upon a dataset of 1051 13–18-year-old school students. Our data allow us to move beyond ability alone and provide insights into which digital literacies young people consider to be important and crucial for them. Furthermore, our analysis suggests that curriculum development and teaching need to move beyond the learning of purely operational ICT skills, such as using word processing and presentation software, and consider digital literacy from a broader perspective, with teachers helping students to consider how and why digital tools are used by others and how students themselves might apply such skills effectively and become literate in that broader sense. For example, the findings in this study have significant implications for teaching: how plagiarism is contextualised within the learning process; how the use of spreadsheets is integrated in more depth and more widely across curricula; and how critical awareness is explored across disciplines.

The data have also identified areas for further research: for example, the place of social media as a learning tool within the classroom; students’ perceptions of the relationship between online identity and online safety; how libraries actively engage with schools and young people to make libraries more ‘relevant’ to students; or the use of video and podcast creation in the development of visual/aural literacies, which in turn may assist students to become more critically aware of online media. The data have also caused us to reflect on other related issues, such as the fluid process of development and redevelopment of identity during adolescence (Crocetti, 2017) and what that means for young people as they create and re-create their online identities.

The focus of this article has been on data gathered from an online survey on digital, media and information literacies, which forms the first part of a larger survey on Global Competence. Other researchers have highlighted the cultural importance of students having such literacies in connection with, for example, citizenship (Polizzi, 2020), which is covered in the second domain (Global citizenship and intercultural understanding) of our Global Competence survey. Data from this second domain, and how it relates to digital literacy, will be the focus of our next article.

Acknowledgments

In writing this paper, we acknowledge the Erasmus+ Programme and the project partners who made up the KA2 International Skills Inventory and Training Programme for Global Citizens (ISITPGC) team. The ISITPGC project determined, developed and created a freely available Global Competence Inventory consisting of a framework and survey (self-assessment) tool, associated training resources and case studies, both for teachers, so they can learn how to integrate the teaching of Global Competence into their curricula, and for secondary students, so that they may develop their Global Competence.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

The work was supported by the Erasmus+ Programme [2015-1-UK01-KA201-013548].

Notes on contributors

Sarah-Louise Jones

Sarah-Louise Jones, a Reader in Global Education at the University of Hull, has been involved in the field of Global Education since the mid-1990s. She has led several large-scale internationally funded projects that investigate how the broader field of Global Education has been integrated into formal and informal education settings. Internationally, Sarah is the RDC Chair on Global Education at the Association for Teacher Education in Europe (ATEE) and runs the International Global Education Network (https://www.linkedin.com/groups/12539729/).

Richard Procter

Richard Procter is a lecturer in Education at De Montfort University. His research interests include global competencies, digital literacies, assessment and the use of technology to improve teaching and learning.

Notes

1. There are four domains in the Global Competence Survey Tool. The other three domains are: Domain B: Global Citizenship and Intercultural Understanding; Domain C: Social Awareness, Personal Attributes and Communication Skills; and Domain D: Professional and Learning Competencies.

References

  • Abramovich, S., Nikitina, G. V., & Romanenko, V. N. (2010). Spreadsheets and the development of skills in the STEM disciplines. Spreadsheets in Education, 3(3). https://search.informit.org/documentSummary;res=AEIPT;dn=204929
  • ACRL. (2011). ACRL visual literacy competency standards for higher education. http://www.ala.org/acrl/standards/visualliteracy
  • Agosto, D. E., & Abbas, J. (2017). “Don’t be dumb—that’s the rule I try to live by”: A closer look at older teens’ online privacy and safety attitudes. New Media & Society, 19(3), 347–365. https://doi.org/10.1177/1461444815606121
  • Anderson, M., & Jiang, J. (2018). Teens, social media & technology 2018. Pew Research Center. https://www.pewresearch.org/internet/2018/05/31/teens-social-media-technology-2018/
  • Badillo-Urquiola, K., Chouhan, C., Chancellor, S., De Choudhary, M., & Wisniewski, P. (2019). Beyond parental control: Designing adolescent online safety apps using value sensitive design. Journal of Adolescent Research, 35(1), 147–175. https://doi.org/10.1177/0743558419884692
  • Barab, S., & Squire, K. (2004). Design-based research: Putting a stake in the ground. Journal of the Learning Sciences, 13(1), 1–14. https://doi.org/10.1207/s15327809jls1301_1
  • Benacka, J. (2016). Numerical modelling with spreadsheets as a means to promote STEM to high school students. Eurasia Journal of Mathematics, Science and Technology Education, 12(4), 947–964. https://doi.org/10.12973/eurasia.2016.1236a
  • Brophy, J., & Bawden, D. (2005). Is Google enough? Comparison of an internet search engine with academic library resources. ASLIB Proceedings, 57(6), 498–512. https://doi.org/10.1108/00012530510634235
  • Bruner, J. S. (1986). Actual minds, possible worlds (2nd ed.). Harvard University Press.
  • Brynjolfsson, E., & McAfee, A. (2014). The second machine age: Work, progress and prosperity in a time of brilliant technologies. W. W. Norton & Co.
  • Buckingham, D. (2010). Defining digital literacy. In B. Bachmair (Ed.), Medienbildung in neuen Kulturräumen (pp. 59–71). VS Verlag für Sozialwissenschaften. https://doi.org/10.1007/978-3-531-92133-4_4
  • Buckland, M. (2017). Information and society. MIT Press.
  • Bulfin, S., & McGraw, K. (2015). Digital literacy in theory, policy and practice: Old concerns, new opportunities. In G. Romeo & M. Henderson (Eds.), Teaching and digital technologies: Big issues and critical questions (pp. 266–281). Cambridge University Press.
  • Cai, Z., Fan, X., & Du, J. (2017). Gender and attitudes toward technology use: A meta-analysis. Computers and Education, 105, 1–13. https://doi.org/10.1016/j.compedu.2016.11.003
  • Castells, M. (2000). The rise of the network society (2nd ed.). Wiley-Blackwell.
  • Charlton, J. P. (1999). Biological sex, sex-role identity, and the spectrum of computing orientations: A re-appraisal at the end of the 90s. Journal of Educational Computing Research, 21(4), 393–412. https://doi.org/10.2190/6MRU-DY8D-TMDQ-NV6P
  • Cohen, L., Manion, L., & Morrison, K. (2018). Research methods in education (8th ed.). Routledge. https://doi.org/10.4324/9781315456539
  • Colvert, A. (2015). Reframing literacies through peer-to-peer alternate reality game design in the primary classroom [Unpublished PhD thesis]. Institute of Education, University College London.
  • Crocetti, E. (2017). Identity formation in adolescence: The dynamic of forming and consolidating identity commitments. Child Development Perspectives, 11(2), 145–150. https://doi.org/10.1111/cdep.12226
  • Cronbach, L. J. (1951). Coefficient alpha and the internal structure of tests. Psychometrika, 16(3), 297–334. https://doi.org/10.1007/BF02310555
  • de Jager, K., Nassimbeni, M., Daniels, W., & D’Angelo, A. (2018). The use of academic libraries in turbulent times. Performance Measurement and Metrics, 19(1), 40–52. https://doi.org/10.1108/PMM-09-2017-0037
  • De Wolf, R. (2019). Contextualizing how teens manage personal and interpersonal privacy on social media. New Media & Society, 22(6). https://doi.org/10.1177/1461444819876570
  • DfE. (2013). The national curriculum in England: Key stages 1 and 2 framework document. HMSO. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/425601/PRIMARY_national_curriculum.pdf
  • ECORYS. (2016). Digital skills for the UK economy. HMSO. https://www.gov.uk/government/publications/digital-skills-for-the-uk-economy
  • Ercegovac, Z. (2005). What students say they know, feel, and do about cyber‐plagiarism and academic dishonesty? A case study. Proceedings of the American Society for Information Science and Technology, 42(1), n/a. https://doi.org/10.1002/meet.1450420142
  • European Commission. (2016). EU digital competence framework for citizens. Publications Office. http://bookshop.europa.eu/uri?target=EUB:NOTICE:KE0215657:EN:HTML
  • European Commission/EACEA/Eurydice. (2019). Digital education at school in Europe. Eurydice report. Publications Office of the European Union.
  • European Council. (2018). Council recommendations: On key competencies for lifelong learning. European Union. https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32018H060401&rid=7
  • European Parliament. (2016). Industry 4.0. https://www.europarl.europa.eu/RegData/etudes/STUD/2016/570007/IPOL_STU2016570007_EN.pdf
  • Farkas, J., & Schou, J. (2020). Post-truth, fake news and democracy. Routledge, Taylor & Francis Group.
  • Fraillon, J., Ainley, J., Schulz, W., Friedman, T., & Duckworth, D. (2020). Preparing for life in a digital world: IEA international computer and information literacy study 2018 international report. Springer International Publishing.
  • Freire, P. (2010). Pedagogy of the oppressed (30th anniversary ed.). Continuum.
  • Gebhardt, E., Thomson, S., Ainley, J., & Hillman, K. (2019). Gender differences in computer and information literacy. Springer International Publishing AG.
  • Green, B. (1988). Subject-specific literacy and school learning: A focus on writing. Australian Journal of Education, 32(2), 156–179. https://doi.org/10.1177/000494418803200203
  • Green, B. (2002). A literacy project of our own? English in Australia, 134, 25–32. https://search.informit.org/documentSummary;res=AEIPT;dn=122774
  • Green, B., & Beavis, C. (2012). Literacy in 3D: An integrated perspective in theory and practice. ACER Press.
  • Griffin, P., Jewell, R., Forwood, A., & Francis, R. (1990). Developing competency rating scales in adult literacy: An analytical approach. Open Letter, 1(1), 54–67.
  • Horrigan, J. (2015). Libraries at the crossroads: The public is interested in new services and thinks libraries are important to communities. Pew Research Center.
  • Ilomäki, L., Paavola, S., Lakkala, M., & Kantosalo, A. (2016). Digital competence – An emergent boundary concept for policy and educational research. Education and Information Technologies, 21(3), 655–679. https://doi.org/10.1007/s10639-014-9346-4
  • Jones, S. L. (2018). International skills and competencies: Tools for teaching in secondary education. In INTED2018 proceedings (pp. 4862–4866). https://doi.org/10.21125/inted.2018.0095
  • Jones, S., & Buchanan, J. (2023). Education in the Anthropocene: The need for global competence. Globalisation, Societies and Education, in press.
  • Juchnevič, L. (2014). Library roles in changing society. Social Transformations in Contemporary Society, 2014(2), 120–130. https://doaj.org/article/d3fd41c7fa32471d8e7ae87de3fd8148
  • Julien, H., & Barker, S. (2009). How high-school students find and evaluate scientific information: A basis for information literacy skills development. Library & Information Science Research, 31(1), 12–17. https://doi.org/10.1016/j.lisr.2008.10.008
  • Kurtz, C. F., & Snowden, D. J. (2003). The new dynamics of strategy: Sense-making in a complex and complicated world. IBM Systems Journal, 42(3), 462–483. https://doi.org/10.1147/sj.423.0462
  • Lazonder, A. W., Walraven, A., Gijlers, H., & Janssen, N. (2020). Longitudinal assessment of digital literacy in children: Findings from a large Dutch single-school study. Computers & Education, 143, 103681. https://doi.org/10.1016/j.compedu.2019.103681
  • Leitão, R., & Roast, C. (2014, July). Developing visualisations for spreadsheet formulae: Towards increasing the accessibility of science, technology, engineering and maths subjects. Paper presented at the 9th Workshop on Mathematical User Interfaces, Coimbra, Portugal. https://explore.openaire.eu/search/other?orpId=core_ac_uk__:73c826d80b4215eaf63cbda4480be872
  • Likert, R. (1932). A technique for the measurement of attitudes. Archives of Psychology, 22(140), 5–55.
  • Lim, K. (2005). A survey of first year university students’ ability to use spreadsheets. Spreadsheets in Education, 1(2), 71–85.
  • MacBeath, J., & Mortimer, P. (Eds.). (2001). Improving school effectiveness. Open University Press.
  • Madsen, S. S., Archard, S., & Thorvaldsen, S. (2019). How different national strategies of implementing digital technology can affect teacher educators. Nordic Journal of Digital Literacy, 13(4), 7–23. https://doi.org/10.18261/issn.1891-943x-2018-04-02
  • Mao, J. (2014). Social media for learning: A mixed methods study on high school students’ technology affordances and perspectives. Computers in Human Behavior, 33, 213–223. https://doi.org/10.1016/j.chb.2014.01.002
  • Markauskaite, L. (2006). Towards an integrated analytical framework of information and communications technology literacy: From intended to implemented and achieved dimensions. Information Research, 11(3), 252. https://doaj.org/article/d8e739ad14e0428a9587d3477ddbcbf2
  • Marsh, J. A. (2016). The digital literacy skills and competences of children of pre-school age. Media Education, 7(2), 197–214. https://oaj.fupress.net/index.php/med/article/view/8759/8534
  • Martin, F., Wang, C., Petty, T., Wang, W., & Wilkins, P. (2018). Middle school students’ social media use. Educational Technology & Society, 21(1), 213–224.
  • Matusiak, K., Heinbach, C., Harper, A., & Bovee, M. (2019). Visual literacy in practice: Use of images in students’ academic work. College & Research Libraries, 80(1), 123–139. https://doi.org/10.5860/crl.80.1.123
  • OECD. (2009). 21st century skills and competences for new millennium learners in OECD countries. http://econpapers.repec.org/paper/oeceduaab/41-en.htm
  • OECD. (2018). Preparing our youth for an inclusive and sustainable world. The OECD PISA Global Competence framework. https://www.oecd.org/education/Global-competency-for-an-inclusive-world.pdf
  • OECD. (2020). Combatting COVID-19’s effect on children. OECD Publishing. https://doi.org/10.1787/2e1f3b2f-en
  • OECD Stats. (2020). Regional economy. Organisation for Economic Co-operation and Development. https://stats.oecd.org/Index.aspx?DataSetCode=REGION_ECONOM
  • OFCOM. (2019). Audio on demand: The rise of podcasts. https://www.ofcom.org.uk/about-ofcom/latest/features-and-news/rise-of-podcasts
  • OFCOM. (2020a). Children and parents: Media use and attitudes report 2020. https://www.ofcom.org.uk/__data/assets/pdf_file/0023/190616/children-media-use-attitudes-2019-report.pdf
  • OFCOM. (2020b). News consumption in the UK: 2020. https://www.ofcom.org.uk/__data/assets/pdf_file/0013/201316/news-consumption-2020-report.pdf
  • Oxfam. (2015). Education for global citizens. www.oxfam.org.uk/education
  • Pangrazio, L. (2018). What’s new about ‘fake news’?: Critical digital literacies in an era of fake news, post-truth and clickbait. Páginas de Educación, 11(1), 6–22. https://doi.org/10.22235/pe.v11i1.1551
  • Park, S. (2017). Digital capital. Palgrave Macmillan.
  • Pedder, D., Opfer, V., McCormick, R., & Storey, A. (2010). Schools and continuing professional development in England – State of the nation research study: Policy context, aims and design. The Curriculum Journal, 21(4), 365–394. https://doi.org/10.1080/09585176.2010.529637
  • Polizzi, G. (2020). Digital literacy and the national curriculum for England: Learning from how the experts engage with and evaluate online content. Computers & Education, 152(103859), 103859. https://doi.org/10.1016/j.compedu.2020.103859
  • Porat, E., Blau, I., & Barak, A. (2018). Measuring digital literacies: Junior high-school students’ perceived competencies versus actual performance. Computers & Education, 128, 23–36. https://doi.org/10.1016/j.compedu.2018.06.030
  • Procter, R. (2015). Teachers and school research practices: The gaps between the values and practices of teachers. Journal of Education for Teaching, 41(5), 464–477. https://doi.org/10.1080/02607476.2015.1105535
  • Punter, R. A., Meelissen, M. R., & Glas, C. A. (2017). Gender differences in computer and information literacy: An exploration of the performances of girls and boys in ICILS 2013. European Educational Research Journal, 16(6), 762–780. https://doi.org/10.1177/1474904116672468
  • Redecker, C. (2017). European framework for the digital competence of educators (Y. Punie, Ed.). Publications Office of the European Union.
  • Risquez, A., O’Dwyer, M., Ledwith, A., & Matlay, H. (2011). Technology enhanced learning and plagiarism in entrepreneurship education. Education + Training, 53(8/9), 750–761. https://doi.org/10.1108/00400911111185062
  • Robertson, P., Sammons, P., Thomas, S., & Mortimore, P. (2001). The research design and methods. In J. MacBeath & P. Mortimore (Eds.), Improving school effectiveness (pp. 37–50). Open University Press.
  • Ronchi, E., & Robinson, L. (2019). Child protection online. In T. Burns & F. Gottschalk (Eds.), Educating 21st century children: Emotional well-being in the digital age (pp. 185–200). OECD Publishing.
  • Schuck, S., Kearney, M., & Burden, K. (2017). Exploring mobile learning in the third space. Technology, Pedagogy and Education, 26(2), 121–137. https://doi.org/10.1080/1475939X.2016.1230555
  • Schwab, K. (2016). The fourth industrial revolution. Penguin Random House.
  • Shen, C., Kasra, M., Pan, W., Bassett, G. A., Malloch, Y., & O’Brien, J. F. (2018). Fake images: The effects of source, intermediary, and digital media literacy on contextual assessment of image credibility online. New Media & Society, 21(2), 438–463. https://doi.org/10.1177/1461444818799526
  • Shillair, R., Cotten, S. R., Tsai, H. S., Alhabash, S., LaRose, R., & Rifon, N. J. (2015). Online safety begins with you and me: Convincing internet users to protect themselves. Computers in Human Behavior, 48, 199–207. https://doi.org/10.1016/j.chb.2015.01.046
  • SLO. (2019). Learning pathways for digital literacy. https://slo.nl/vakportalen/vakportaal-digitale-geletterdheid/leerlijnen-digitale-geletterdheid/
  • Smith, J. K., Given, L. M., Julien, H., Ouellette, D., & DeLong, K. (2013). Information literacy proficiency: Assessing the gap in high school students’ readiness for undergraduate academic work. Library & Information Science Research, 35(2), 88–96. https://doi.org/10.1016/j.lisr.2012.12.001
  • Sureda-Negre, J., Comas-Forgas, R., & Oliver-Trobat, M. F. (2015). Academic plagiarism among secondary and high school students: Differences in gender and procrastination. Comunicar, 22(44), 103–111. https://doi.org/10.3916/C44-2015-11
  • Talwar, S., Dhir, A., Singh, D., Virk, G. S., & Salo, J. (2020). Sharing of fake news on social media: Application of the honeycomb framework and the third-person effect hypothesis. Journal of Retailing and Consumer Services, 57, 102197. https://doi.org/10.1016/j.jretconser.2020.102197
  • Toepoel, V., Das, M., & Van Soest, A. (2009). Design of web questionnaires: The effects of the number of items per screen. Field Methods, 21(2), 200–213. https://doi.org/10.1177/1525822x08330261
  • Torrance, H. (2012). Formative assessment at the crossroads: Conformative, deformative and transformative assessment. Oxford Review of Education, 38(3), 323–342. https://doi.org/10.1080/03054985.2012.689693
  • UDIR: Norwegian Directorate for Education and Training. (2020). Core curriculum – Values and principles for primary and secondary education. https://www.udir.no/in-english/
  • UNESCO. (2018). A draft report on a global framework of reference on digital literacy skills for indicator 4.4.2: Percentage of youth/adults who have achieved at least a minimum level of proficiency in digital literacy skills. https://unesdoc.unesco.org/ark:/48223/pf0000265403
  • UNESCO. (2019). Recommendations on assessment tools for monitoring digital literacy within UNESCO’s Digital Literacy Global Framework. https://www.edu-links.org/sites/default/files/media/file/ip56-recommendations-assessment-tools-digital-literacy-2019-en.pdf
  • United Nations. (2015). Transforming our world: The 2030 Agenda for sustainable development. https://sustainabledevelopment.un.org/post2015/transformingourworld
  • Vaccari, V., & Gardinier, M. P. (2019). Toward one world or many? A comparative analysis of OECD and UNESCO global education policy documents. International Journal of Development Education & Global Learning, 11(1), 68–86. https://doi.org/10.18546/IJDEGL.11.1.05
  • Volman, M. (1997). Gender-related effects of computer and information literacy education. Journal of Curriculum Studies, 29(3), 315–328. https://doi.org/10.1080/002202797184062
  • VOV: Flanders Education and Training. (2017). Flemish strategic plan for literacy 2017–2024. https://www.expoo.be/sites/default/files/atoms/files/StrategischPlanGeletterdheid2017-2024.pdf
  • Wallis, R., & Buckingham, D. (2019). Media literacy: The UK’s undead cultural policy. International Journal of Cultural Policy, 25(2), 188–203. https://doi.org/10.1080/10286632.2016.1229314
  • World Economic Forum. (2020). Global gender gap report 2020. World Economic Forum. https://search.informit.org/documentSummary;res=APO;dn=272071
  • Zeegers, M., & Barron, D. (2010). Gatekeepers of knowledge. Chandos.
  • Zimerman, M. (2012). Plagiarism and international students in academic libraries. New Library World, 113(5/6), 290–299. https://doi.org/10.1108/03074801211226373

Appendix 1

Mapping the EU Digital Competence Framework (2017) against Domain A of the Global Competence Framework (2016)