
Embedded approaches to academic literacy development: a systematic review of empirical research about impact

Received 25 Nov 2023, Accepted 07 May 2024, Published online: 20 May 2024

ABSTRACT

This systematic literature review identifies evidence used to justify embedded approaches to academic literacy development. This inquiry is of direct relevance to widening participation in higher education and persistent inequity in completion rates for linguistically and culturally diverse student groups. Using the PRISMA-P checklist, 20 studies were included. Analysis focused on their research designs, types of evidence presented, pedagogic practices implemented, and journal choice. Findings show that research designs often involved questionnaire, interview, and focus group data to generate insights about student and staff perceptions. Descriptions of pedagogic practices were brief and not always related to claims about impact, and publishing targeted a variety of disciplines. These findings highlight the need for research teams with both discipline knowledge and literacy knowledge, so that evidence about the impacts of embedded approaches on academic performance is included, as well as complementary publications that detail the pedagogic practices contributing to changes in academic performance.

Introduction

This systematic review contributes to the field of research into the development of academic literacy through learning discipline-specific knowledge, often referred to as an embedded approach (e.g. Wingate 2018). Proponents of embedded approaches argue for teaching academic literacy within disciplinary contexts because it is tailored to specific assessments (i.e. it is not an extra or optional task), aims to benefit all students (i.e. it is not remedial), and does not assume that students will (i) self-identify that they need help or (ii) take the initiative to access help from outside their programme (e.g. Macnaught et al. 2024; Wingate 2015).

Such an approach is relevant to higher education’s response to issues inherent in widening participation (Younger et al. 2019). In Aotearoa New Zealand, the 2022 course completion rates for Māori (72%) and Pacific students (69%) were again considerably lower than for Asian (85%) and European (84%) students (Ministry of Education 2023). Similar trends of inequity also persist in the United Kingdom (see Office for Students 2023) and in the United States (see National Student Clearinghouse Research Center 2022). These gaps are not new, and the literacy and language challenges that university study poses to diverse student cohorts have long been documented. For example, Bourdieu and Passeron (1965) observed that ‘academic discourse is no one's mother tongue, but the children of middle-class families with mastery of the standard language find it considerably easier’ (cited in Hyland and Shaw 2016, 5).

One traditional solution for persistent inequity has been generic academic support practices that rely on students self-accessing – or being referred to – centralised remedial provisions outside of their disciplinary contexts (Arkoudis and Harris 2019). This provision has consistently been shown to yield low levels of student engagement (Harris 2016), as students perceive such offerings to be irrelevant (Chanock et al. 2012) or stigmatising (Turner 2004). It is therefore unsurprising that degree programme completion rates remain unjust. Where embedded approaches to academic literacy development fundamentally differ is that they bring teaching about language and literacy to where students already are. Put another way, academic literacy development is part of the core business of teaching and learning in a discipline. This pedagogic positioning as ‘core’ is vital because the literacy knowledge that students need is not ‘common sense’; rather, it is highly specialised, discipline-specific knowledge that emerges ‘from the values and structure of the target field’ (Boughey and McKenna 2021, 67).

In support of an embedded approach, there are robust theoretical arguments about the nature of language and language learning. These are often discussed in relation to issues of social justice and more equitable learning outcomes. Perspectives on academic literacy development vary depending on the tradition and theoretical lens through which it is viewed. For example, in the tradition of Systemic Functional Linguistics, learning is viewed as ‘a semiotic process: learning is learning to mean, and to expand one’s meaning potential’ (Halliday 1993, 113). From this perspective, all students in higher education – including those from marginalised groups – are learning how to make meaning within their chosen disciplines; being successful involves understanding which meanings are valued and why. A similar concern with access and equity is evident in the notion of Discourses, with a capital ‘D’ (Gee 1996). A Discourse is the wider social context wherein members of a discourse community communicate in ways that are valued by that community. In the higher education context, lecturers are the senior members of discourse communities in their disciplines of expertise and are therefore responsible for apprenticing new members (i.e. their students) into those communities. From the ideological standpoint of Academic Literacies (Lea and Street 1998), the goal is for students not just to access and join those discourse communities but to become part of their evolution through active participation and critique (Lea and Street 2006). Embedded approaches are a way to shift such ideological goals from mere aspiration to practical implementation. By design, they reach all students within a cohort and target the literacy and language demands of specific discourse communities. This is something that generic and ‘self-help’ approaches cannot achieve.

While these theoretical and ideological arguments have been used to justify the adoption of embedded approaches to academic literacy development for three decades, generic provision persists (Macnaught et al. 2024). Impediments to more rapid change may be insufficient articulation of the pedagogic practices employed (Wingate 2015) and a lack of established research methodologies for investigating the impacts of embedding (Bassett 2022). This makes it difficult for teachers and researchers working in higher education contexts to replicate, adapt or critique teaching interventions aimed at reducing inequity. If embedding is to be embraced as a core part of responding to issues of widening participation, such as unjust programme completion rates, then higher education leaders and those working in academic literacy development need to understand the aforementioned arguments that academic literacy is not common sense and should not be taught in a decontextualised way. With this shared understanding and shared goal of addressing inequity, we can focus on how best to embed academic literacy, critiquing approaches and articulating the teaching practices that we attribute to positive change.

To establish what current evidence exists, we conducted a systematic literature review of research that reports on the impacts of embedded approaches. Specifically, we sought to answer four questions:

  1. What are the research designs for studies that show the impact of embedded approaches to academic literacy development?

  2. What types of impacts are reported on?

  3. What pedagogic practices are described in studies that report on the impact of embedding?

  4. Where is research about the impact of embedding published?

Method

Systematic review protocols

A systematic review approach was appropriate for investigating embedded practices due to the current diversity of the research traditions in education and the terminology used therein (Bearman et al. 2012). In formulating our review protocol, we used the PRISMA-P checklist (Shamseer et al. 2015) to ensure rigour in the design of the inclusion criteria, search strategy, study selection, data extraction, and data analysis.

Inclusion and exclusion criteria

The inclusion criteria were: (i) articles about embedded academic literacy teaching; (ii) empirical studies that reported evidence of impact; (iii) studies of tertiary education; (iv) peer-reviewed journal articles; (v) articles written in English. As outlined in the introduction, we used a broad working definition of an embedded approach to academic literacy development, namely teaching related to the literacy demands of assessment tasks that students need to complete as part of their courses and programmes. To include as many relevant studies as possible, we did not set a time range. Articles that included only theoretical argumentation and/or sharing of pedagogical practices without accompanying empirical findings about impact were excluded.

Search strategy

During September 2022, we conducted a comprehensive literature search of five databases to which we had institutional access: Education Resources Information Center, Sage Journals, Scopus, Taylor & Francis Online, and Wiley Online Library. As embedded academic literacy research crosses multiple disciplines and varies in its use of terminology, we selected 15 search terms grouped into four search concepts, as shown in Table 1.

Table 1. Search concepts.

Search term selection was based on our knowledge of published research about embedding. After a pilot search, we discounted ‘embedded’ and ‘embedding’ as search terms because of their widespread use in contexts unrelated to academic literacy teaching. The rationale for including ‘English for Academic Purposes’ and ‘Academic literacies’ as separate searches was that excluding either may have resulted in missing relevant studies because they are distinct areas of research. For the same reason, we included the United States terms ‘Writing across the curriculum’ and ‘Writing in the disciplines’.

We searched abstracts using four combinations of search concepts: 1 & 3, 1 & 4, 2 & 3, 3 & 4. This was in response to the prolific number of results generated during the pilot search when using one concept. For example, using only Concept 1 in Taylor & Francis Online yielded 1,740 results, whereas combining it with Concept 3 yielded 272 because irrelevant results from primary and secondary education levels were excluded.
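To make the pairing of concepts concrete, the sketch below composes abstract-field queries in the way described above. It is a minimal illustration only: the term lists are hypothetical placeholders (the actual 15 terms appear in Table 1), and the ABS() field syntax is a Scopus-style assumption that differs across the five databases.

```python
# Minimal sketch of the concept-combination search strategy described above.
# The terms below are hypothetical placeholders; see Table 1 for the real ones.
concepts = {
    1: ['"academic literacy"', '"academic literacies"'],
    2: ['"English for Academic Purposes"'],
    3: ['"higher education"', '"tertiary education"', '"university"'],
    4: ['"writing across the curriculum"', '"writing in the disciplines"'],
}

def abstract_query(a: int, b: int) -> str:
    """OR the terms within each concept, then AND the two concepts."""
    left = " OR ".join(concepts[a])
    right = " OR ".join(concepts[b])
    return f"ABS(({left}) AND ({right}))"  # Scopus-style abstract search

# The four combinations used in the review: 1 & 3, 1 & 4, 2 & 3, 3 & 4.
for pair in [(1, 3), (1, 4), (2, 3), (3, 4)]:
    print(abstract_query(*pair))
```

Combining concepts in this way narrows each result set at the search stage, which is what reduced the Taylor & Francis Online results from 1,740 (Concept 1 alone) to 272 (Concepts 1 and 3 together).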

Study selection

After removal of duplicates, each researcher independently screened the titles of approximately half of the remaining publications for relevance. Each researcher then independently screened half the abstracts followed by the main texts of the remaining publications. At regular intervals, the researchers discussed recurrent reasons for exclusion and any borderline decisions, which were often due to variations in how articles classified embedded teaching.

Quality appraisal

For included studies, we used one of five quality appraisal tools. These included the Critical Appraisal Skills Programme (CASP) checklists (CASP 2022) for randomised controlled trials, case control studies, cohort studies, and qualitative studies. For mixed methods studies, we used the Mixed Methods Appraisal Tool (Hong et al. 2018). Each researcher independently appraised the included studies, with consensus reached after discussing disagreements. There was no need to consult a third party.

Data extraction

The focus of the research questions led us to identify four broad categories to investigate within the selected studies: (i) research designs; (ii) educational impacts; (iii) pedagogic practices; and (iv) publication choices. Through close reading of the texts, we identified sub-categories to represent the more detailed data that were recurrent across the studies. We populated an Excel spreadsheet with detailed data in each sub-category.

Data analysis

Further close reading of the texts generated codes within each sub-category. For instance, in the sub-category of teaching focus, the following codes emerged: writing for assessments; reading for assessments; study skills; communication skills; critical thinking skills; and not specified.

For the sub-category of research approach, the coding emerged differently. Here, we used the terms provided by the authors. For example, authors stated that their study used a mixed methods approach or constituted a case study, etc. This research terminology formed specific codes. We chose to draw on authors’ identification of research approaches for three reasons: (i) some studies presented data that were part of wider research projects, but not all parts were described; (ii) the studies were written by authors from a range of disciplines, and so there could be variation in meanings assigned to the research terminology; and (iii) sometimes the overall research approach was not stated, but could be inferred from details about data collection and treatment. In other words, there was not consistent information available for us to re-classify the stated research approaches.

The sub-categories for publication also emerged differently. These were derived from analysing the ‘about’ pages of each journal, such as the aims and scope page of Teaching in Higher Education (Aims and scope 2022). In this example, wording indicates that the journal has a broad audience of higher education teachers and researchers. Our full coding scheme is represented in Table 2.

Table 2. Coding scheme.

Both researchers conducted the coding and analysis independently. Coding was compared, and any variation was documented using the comment function in Excel. Meetings between both researchers finalised the codes. Where appropriate, categories were double coded. For example, if one study reported on multiple types of impact, then each type was coded.

We then conducted a frequency count of each code to generate findings about its prevalence in the studies. The small data set combined with vast variation in research designs (particularly in data types and treatment) made the data less suitable for meta-analytic procedures. Overall, this process of forming categories, sub-categories and further classifying instances with coding enabled us to clearly organise data and generate specific findings in relation to our research questions.
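As a concrete illustration of this tallying step, the sketch below counts double-coded categories in the way described. The record structure and code labels are our own invented examples, not data taken from the included studies.

```python
from collections import Counter

# Hypothetical extraction records: one row per included study, with
# double coding allowed (e.g. a study reporting several types of impact).
records = [
    {"study": "S01", "impact_codes": ["student perceptions", "assessment grades"]},
    {"study": "S02", "impact_codes": ["student perceptions"]},
    {"study": "S03", "impact_codes": ["writing features", "assessment grades"]},
]

# Frequency count of each code across all studies; double-coded entries
# contribute once per code, mirroring the double-coding rule above.
counts = Counter(code for row in records for code in row["impact_codes"])

for code, n in counts.most_common():
    print(f"{code}: {n}")
```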

Results

Literature search

The initial search identified 3533 publications, which was reduced to 2131 by removing duplicates. Once we applied the inclusion criteria, this number further decreased as we reviewed titles (n = 147), abstracts (n = 42), and main texts (n = 26). Common reasons for exclusion were that publications were not about embedding (n = 62), reported on student perceptions of challenges they face rather than perceptions of interventions (n = 14), presented only theoretical arguments about academic literacy development (n = 9), or focused on sharing pedagogic practices without reporting their impact (n = 4). Citation searching identified ten more publications, of which six met the inclusion criteria (see Figure 1).
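As a sanity check on these figures, the short sketch below retraces the screening tally; the stage labels are ours, and the numbers are those reported above and in Figure 1.

```python
# Running tally of the screening stages reported above and in Figure 1.
flow = [
    ("records identified", 3533),
    ("after duplicate removal", 2131),
    ("after title screening", 147),
    ("after abstract screening", 42),
    ("after full-text screening", 26),
]
for stage, n in flow:
    print(f"{stage}: {n}")

# Citation searching added 6 eligible studies (10 found, 6 met the criteria),
# giving 26 + 6 = 32 studies sent to quality appraisal.
sent_to_appraisal = flow[-1][1] + 6
assert sent_to_appraisal == 32
```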

Figure 1. PRISMA flowchart of the literature search process.

Quality appraisal

Thirty-two studies were quality appraised (inter-rater agreement 84%), with consensus reached after discussion of five. Twelve studies were excluded at this point: nine because their stated findings were not sufficiently derived from the data and/or their interpretations of their findings were not sufficiently substantiated; and three studies did not provide a clear explanation of their methodology. This resulted in 20 studies being selected for analysis.
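To make the reported agreement figure concrete, here is a minimal sketch of simple percent agreement, assuming 32 paired include/exclude decisions with exactly five disagreements. The decision vectors are invented for illustration and are not the actual appraisal data.

```python
# Hypothetical paired appraisal decisions for the 32 studies: the two
# raters disagree on exactly 5, consistent with the figures reported above.
rater_a = ["include"] * 20 + ["exclude"] * 7 + ["include"] * 3 + ["exclude"] * 2
rater_b = ["include"] * 20 + ["exclude"] * 7 + ["exclude"] * 3 + ["include"] * 2

agreements = sum(a == b for a, b in zip(rater_a, rater_b))
print(f"{agreements / len(rater_a):.0%}")  # 27/32 -> 84%
```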

General characteristics of included studies

The 20 included studies (see supplementary material) spanned the publication years of 2005–2021, indicating sustained interest in embedded practices. Most studies were undertaken in Australia (n = 10) and the United Kingdom (n = 5). Other countries/regions were South Africa (n = 2), the United States (n = 1), New Zealand (n = 1), and the Middle East (n = 1). The most common level of study was first year undergraduate (n = 11). Other studies involved multiple levels within an undergraduate programme (n = 3), postgraduate coursework (n = 2), undergraduate second year (n = 1), a preparatory programme and undergraduate levels (n = 1), and two studies did not specify a level.

Studies were conducted in a wide range of disciplines with Nursing having the highest number (n = 3). The next most frequent disciplines were Applied Linguistics (n = 2) and Sports Science (n = 2). There were also multidisciplinary papers in Health Sciences (n = 2) and Applied Science and Engineering (n = 1). Two studies reported on teaching across multiple disciplines, but findings were not specified by discipline. The following disciplines were all represented once: Biological Sciences, Business, Engineering, History, Law, Management, Medicine, and Pharmacy. This broad distribution suggests that academic literacy development is seen as relevant across many courses and programmes in tertiary institutions, particularly in Australia and the UK, with an emphasis on students who are beginning their studies.

Findings

Research question 1: what are the research designs for studies that show the impact of embedded approaches to academic literacy development?

Across the 20 included studies, seven research approaches were used. Given that research approaches have fundamental epistemological differences that determine how evidence is selected, treated, and valued, we have made no overarching judgments about research design and evidence types (see research question 2). Instead, we identify how the researchers described their research designs and consider the extent to which multiple data types are used to support and strengthen claims about embedded practices.

As shown in Table 3, mixed methods studies were most common (n = 8). Most other designs featured only once. The action research study (Rose et al. 2008), design-based research study (Pessoa, Mitchell, and Miller 2018) and quasi-experimental study (O’Flaherty and Costabile 2020) combined qualitative and quantitative data types. There was only one purely quantitative study, which was a randomised controlled trial (Salamonson et al. 2010). Three studies were purely qualitative: one (Devereux et al. 2018) used qualitative discourse analysis, and two qualitative narrative methodology studies were from the same author (Jacobs 2005, 2010). Five publications did not specify an overall research design but did specify data types and treatment.

Table 3. Research approaches and data types.

Of the eight mixed methods studies, only four (Hillege et al. 2014; Hunter and Tse 2013; Noakes 2021; Wingate, Andon, and Cogo 2011) triangulated data from questionnaires, interviews or focus groups with data that showed evidence of student performance, through the use of assessment grades and writing samples. Andrews, Mehrubeoglu, and Etheridge (2021) triangulated questionnaire data with attendance rates, Divan, Bowman, and Seabourne (2015) did so using plagiarism rates, and Baker et al. (2021) used video views. Paul and Gilbert (2013) presented qualitative and quantitative questionnaire data only. Due to the absence of data about student performance, half of the mixed methods studies risk being viewed as less persuasive: claims about student experiences are not clearly supported by, or related to, other types of evidence that could strengthen them. These findings highlight the opportunity for future research to complement data about perceptions with data about impact (such as assessment grades) and more emergent changes, such as identifying new features that may gradually appear in student writing.

Three other studies that specified a research design also combined qualitative and quantitative data – all reporting impacts on student writing and/or performance. The action research (Rose et al. 2008) and design-based research (Pessoa, Mitchell, and Miller 2018) studies used writing samples qualitatively and quantitatively. Although their research designs differed, these studies used Systemic Functional Linguistics (Halliday 1993) as a framework for identifying specific changes in student writing and performance. The quasi-experimental study (O’Flaherty and Costabile 2020) combined student online interactions (about the impact of the intervention on their learning) with assessment grades.

The one purely quantitative study (Salamonson et al. 2010, 418) was a randomised controlled trial that reported on assessment grades. Although the authors acknowledge their results are ‘underpowered’ (only 23 of the 62 students in the intervention group attended the intervention), there was a statistically significant difference in mean assessment scores: in the intervention group, students who attended scored higher than both the control group and non-attendees in the intervention group.

Regarding the three purely qualitative publications, only one used more than one data type. Devereux et al. (2018) used writing samples to report on student use of writing features. The authors relate student use of writing features to those taught in the intervention. However, as the authors note, the absence of pre-intervention data or other methods to triangulate the writing sample data, as well as the small sample (n = 9), limits the reliability of the findings.

Of the five studies that did not specify an overall research design, only one (Tribble and Wingate 2013) triangulated questionnaire data about student perceptions with writing sample data that showed evidence of student performance. Chanock et al. (2012) reported improvements in assessment grades over a three-year period, but that data was not triangulated. It is also unclear what the assessments were in each year, and cohort sizes were not provided. Smith et al. (2012) reported on student focus group data. They also included assessment grade data, but these grades were awarded for student participation in an online module, with no evidence of impacts on student performance other than one excerpt from one student writing sample. Wette (2019) triangulated student perception data (questionnaire and online discussion board) with interview data from subject specialists, subject tutors and literacy specialists. Groves et al. (2013) reported on student perception data from one focus group.

As shown in Figure 2, questionnaires, interviews and focus groups yielded only student/staff perceptions of impact or staff perceptions of their own academic identities. While writing sample data also yielded staff perceptions, it was used to interpret academic performance and specific changes in student writing after interventions.

Figure 2. Data types used for claims about impact.

This focus on perceptions is noteworthy because most of the evidence in support of embedded practices (56% of all impact data) is based only on what participants thought or felt. By contrast, writing samples and assessment grades accounted for only 27% of all impact data. This is currently not a persuasive collection of evidence in support of embedded practices if compared to, for example, Hattie, Biggs, and Purdie’s (1996) meta-analysis of 51 studies that investigated the effects of learning skills interventions on student learning. Of the 270 impacts reported there, 84% (n = 226) related to student performance or student use of specific skills, while just 16% related to participants’ perceptions.

Research question 2: what types of educational impacts are reported on?

Eleven types of educational impact were reported. As shown in Table 4, these fell into four groupings: perceptions of students/staff, academic performance, student metrics such as engagement and attendance, and the academic identity of staff.

Table 4. Types and frequencies of educational impacts.

Of the 52 impacts reported, student/staff perceptions about the positive effects of embedded practices were by far the most common, as shown in Figure 3. Perceptions data accounted for almost double that of academic performance and student writing combined, while the use of other student metrics was negligible.

Figure 3. Types of reported impacts.

Reporting of student and/or staff perceptions of impact was often not triangulated with evidence of change, such as academic performance (assessment grades) and/or student use of writing features. Of the 14 studies (70% of all those included) that reported on student and/or staff perceptions:

By contrast, seven of the eight studies that reported impacts on academic performance triangulated their findings:

  • four also reported on perceptions (as above), with one of those also reporting on student attendance and rates of completion and discontinuation (Noakes 2021);

  • two also reported on student use of writing features (Pessoa, Mitchell, and Miller 2018; Rose et al. 2008); and

  • Salamonson et al. (2010) compared academic performance data between intervention and control groups.

Our stock-take of empirical findings about the educational impact of embedding suggests that the evidence to support it is relatively weak. A small number of studies report on what people say is happening, and even fewer report on academic performance. While data about perceptions may provide compelling evidence and generate valued insights about possible impact, our findings show the potential to further strengthen claims with additional sources of data. Beyond overall performance indicators like scores, one meaningful source of impact that future researchers could give greater attention to is new and emergent features that gradually appear in student writing. Students could then be asked about such changes, which they may or may not attribute to teaching interventions or other avenues of support. Such triangulation of data (Denzin 1989), which could also include learning analytics to track engagement with teaching and learning materials, would respond to Wingate’s (2015) call to make a stronger empirical case for embedding. In this regard, triangulated data to support claims about embedding, rather than the privileging of one data type such as student perceptions, seems likely to serve not only researchers and practitioners making pedagogic choices, but also other audiences, such as university leadership facing decisions about strategy and resource allocation.

Research question 3: what pedagogic practices are described in studies that report on the impact of embedding?

The 20 studies included details about the class type and timing of teaching. They also usually specified who did the materials design and teaching, and what the teaching focused on. Teaching within tutorials was the most prevalent class type (n = 13), followed by adjunct workshops (n = 5), lectures (n = 4) and the Learning Management System (LMS) (n = 4), and block delivery (n = 3). Class type and timing were not specified in two studies. This distribution highlights that far more teaching (77%) occurred within the core curriculum (lectures, tutorials, block delivery and the LMS) than as an adjunct option, as depicted in Figure 4. This finding reflects that most included studies related the term ‘embedded’ to teaching that occurred within the regular scheduling of a course or programme.

Figure 4. The class type and timing of teaching.

The most common configuration of who did the teaching was subject and literacy specialists together (n = 6). Literacy specialists also collaborated to teach with subject tutors (n = 2). In some cases, the subject specialist was also a literacy specialist (e.g. Tribble and Wingate 2013; Wingate, Andon, and Cogo 2011). In contrast, other studies did not involve teaching collaborations: some utilised only a literacy specialist (n = 2), a subject specialist (n = 5), or subject tutors (n = 2). Three studies did not specify who did the teaching.

Findings about who designed teaching materials also reflect that collaboration between subject and literacy specialists was the most common configuration (n = 8). In one study, literacy specialists co-designed materials with subject tutors. Individual design occurred to a lesser extent, undertaken by subject specialists (n = 4) or literacy specialists (n = 1). Six studies did not specify who designed the teaching materials. This distribution shows that, in cases where the creators of teaching materials were specified, more than half involved co-design between interdisciplinary collaborators, as represented in Figure 5.

Figure 5. The design of teaching materials (where designers were identified).

These findings about co-design reflect a broader trend towards interdisciplinary collaborations related to the teaching and researching of English for Academic Purposes. As Hyland (2022) reflects, efforts to tailor the teaching of academic literacy to particular groups of students have increasingly involved literacy and subject specialists working together. These collaborations are a response to a growing awareness that ‘students have to take on new roles and engage with knowledge in new ways when they enter university and, eventually the workplace’ (Hyland 2022, 203). This literacy knowledge is discipline specific, and the prevalence of interdisciplinary co-design, where two knowledge bases are brought together, is therefore arguably vital to tailoring academic literacy development to specific courses/programmes. In relation to our second research question, an interdisciplinary research team appears to correspond with studies that include student writing performance as evidence. Specifically, all four of the studies that reported impact on student writing (Devereux et al. 2018; Pessoa, Mitchell, and Miller 2018; Rose et al. 2008; Tribble and Wingate 2013) involved a teaching and research collaboration in which at least one member had literacy knowledge.

Regarding what was taught, most of the included studies identified writing for assessments as the main pedagogic focus (n = 14). The next most frequent was reading for assessments (n = 5), followed by study skills (n = 2), critical thinking skills (n = 2), and communication skills (n = 1). Two studies (Jacobs 2005, 2010) did not specify what was taught because they investigated relationships between academic identity and involvement with embedded practices. This dominant focus on writing and reading for specific assessment tasks is represented in Figure 6.

Figure 6. The focus of teaching.

A sample of more specific teaching topics within the categories of writing and reading for assessments includes:

  • analysing and discussing examples of specific genres, such as lab reports (Tribble and Wingate 2013), historical arguments (Pessoa, Mitchell, and Miller 2018), and essays (Devereux et al. 2018);

  • co-constructing parts of a genre (Rose et al. 2008);

  • processes for generating essays (Noakes Citation2021);

  • guided critical reading of literature (Hillege et al. 2014);

  • using evidence from readings to support arguments (Chanock et al. 2012);

  • evaluating the warrants for research and claims of authors (Baker et al. 2021).

This specificity contrasts with teaching that is not as closely tailored to one assessment task. Examples include teaching study skills, such as time management (e.g. Smith et al. 2012), or more foundational literacy skills, such as general reading strategies (e.g. Wette 2019) and referencing and plagiarism (e.g. Divan, Bowman, and Seabourne 2015). The use of general terms like ‘skills’ reflects the ongoing influence of pedagogies that value practical and communicative approaches to language and literacy development and that may view literacy development as a set of proficiencies to acquire (see further critique in Boughey and McKenna 2021; see a critique and topological positioning of pedagogic approaches to teaching writing in Macnaught 2024). In the context of tertiary level literacy development, the broad categorisation of ‘skills’ also points to the integration of more generic literacy skills within courses, particularly for first year undergraduate students or students returning to postgraduate studies.

Other broad categories included critical thinking skills (O’Flaherty and Costabile 2020) and communication skills (Paul and Gilbert 2013). Although not specified in detail, in both these cases the teaching of broad skills may still focus on assessment tasks, namely critical thinking skills for case studies, and patient interaction for simulations.

The limited description of pedagogic practices is unsurprising given that the demands of empirical research make extended detail about teaching methodologies less likely. For example, in a relevant article that was not identified through the search strategy, Macnaught and colleagues (2024) summarise their intensive four-year teaching intervention in just one diagram and three paragraphs. While this succinctness is necessary to focus on research findings (reductions in assessment task resubmission rates and changes in the teaching practices of subject specialists), attempts to replicate the pedagogic intervention would be difficult. This limitation reflects a trade-off between providing sufficient empirical evidence about the impact of embedded practices and providing sufficient detail about how and why something was taught. Based on the included studies, when impact is reported, space for pedagogic description is minimal.

Research question 4: where is research about the impact of embedding published?

Across the 20 studies, articles were published in journals with a target audience of higher education teachers and researchers (n = 8), discipline-specific educators (n = 6), or language teachers and researchers (n = 6). Sample journal titles include: The International Journal of Higher Education, Nurse Education in Practice, and The Journal of English for Academic Purposes. These results indicate that academic literacy development is not only of interest to literacy specialists; it is also of interest to subject specialists and the broader audience of higher education practitioners, researchers, and policy makers.

This broad interest is unsurprising given that academic literacy development is intertwined with learning the specialist knowledge of a discipline. As Halliday proposes, we are ‘learning language, learning through language, and learning about language’ (1993, 112). From this perspective, the process of learning the specialised ways of making meaning in a discipline is not external or adjunct to engaging with disciplinary knowledge – a fact that subject specialists may be keenly aware of (e.g. see O’Flaherty and Costabile [2020] and Paul and Gilbert [2013] in this review). The extent to which students are managing the literacy demands of their programmes is partially visible in measures of academic performance, such as the grades awarded for specific assessment tasks. It is therefore also unsurprising that broader audiences in higher education are interested in pedagogic initiatives and empirical findings that could influence institutional policies and decision making around supporting students’ academic performance.

However, this relatively even spread of research across an array of journals and three different target audiences poses challenges for further developing an emerging field of inquiry. Findings suggest that embedded practices are relatively recent, with studies that report on impact emerging from approximately 2005. The additional variation in class type, timing of teaching, focus of teaching, and who does the materials design and teaching indicates that we do not currently all mean the same thing when we refer to an ‘embedded’ approach. If such findings continue to be widely dispersed across varied audiences, then being aware of new empirical findings and drawing on them to cumulatively build knowledge about best practice could remain difficult. As there is increasing breadth in the kinds of discourse communities and literacy issues that are investigated (see Hyland and Jiang 2021), more consistent use of the term ‘embedded’ could greatly assist with searching for relevant literature in wide-ranging locations and conducting future reviews of it.

Conclusion

The purpose of this review has been to investigate the reported impacts of embedded practices in tertiary contexts. We have found that there is currently limited literature providing empirical evidence about impact. Even when such evidence is provided, claims about impact are predominantly based on perceptions of students and staff, generated from questionnaire, interview and focus group data. Within the same study, these perceptions tend not to be consistently supported by the triangulation of additional data about change, such as correspondences between what students say about benefit or improvement and shifts in assessment grades or emergent changes in the writing features that students deploy. In short, most evidence is currently limited to what those involved in an embedded approach say about their experiences. Our concern is that without additional and complementary forms of evidence to strengthen claims, findings in support of academic literacy development may not be viewed as persuasive. These choices about evidence types invite further inquiry into current knowledge making practices within this relatively new research area of tertiary level academic literacy development. The findings draw attention to (i) what we use to support claims and why, (ii) what we want to use the evidence for, and (iii) the extent to which our sources of evidence are valued by decision makers in our institutions who are responding to higher education issues such as widening participation.

In addition to limited types of evidence, reports of impact appear to be at the expense of detailed accounts of pedagogic practices. We are therefore not in a position to identify specific teaching practices for academic literacy development that may benefit students the most. However, our investigation of the class type, the timing of teaching, and the configuration of who designed and taught materials has further identified common characteristics of an embedded approach. Building on the definition provided in the introduction section to include findings from this review, these characteristics are:

  1. teaching related to the literacy demands of assessment tasks that students need to complete as part of their courses and programmes;

  2. teaching about academic literacy development that spans first year undergraduate programmes right through to postgraduate coursework programmes;

  3. teaching that occurs within the regular scheduling of teaching sessions within a course or programme; and

  4. teaching which draws on knowledge of the discipline and knowledge of discipline-specific literacy.

We propose that some consensus about this definition of embedded practices is vital for building a body of knowledge, as is continuing to articulate that academic literacy development involves uncommon-sense ways of meaning and needs to be explicitly taught (Martin and Rose 2007). This is particularly important given our findings about the wide-ranging journals in which relevant research is published. Our findings show that these journals have target audiences ranging from a broader audience of higher education teachers and researchers to the more specific audiences of discipline-specific educators or language teachers and researchers. They need to hear a consistent message about what embedded literacy practices are and why they are so important.

This diversity in target audiences points to opportunities for the design of research teams. Teams with knowledge bases that include specialist literacy knowledge and knowledge of the discipline may be best positioned to generate complementary publications that can be tailored to varying audiences. By this we mean publications that include empirical findings about impact for broader and more targeted audiences (such as tertiary leadership or educational linguists analysing texts), and corresponding publications that provide more detailed descriptions of what the specific impacts are and the pedagogic practices that have contributed to them. Such ‘paired publications’ may better support others to adapt embedded practices to their teaching and learning contexts and critique possible impact. Additionally, our findings about study designs indicate that research teams with both disciplinary knowledge and discipline-specific literacy knowledge are well placed to generate claims that include changes to academic performance. This seems to be because literacy knowledge provides a base from which to identify and evaluate specific changes in student writing or other modes of communication required by assessment tasks.

To address the central issue of generating evidence that could more persuasively show impact, our findings highlight five specific areas of future research need:

  • Continuing to articulate that academic literacy is not common sense and needs to be woven into the core curriculum and explicitly taught;

  • Building consensus about what constitutes an embedded approach to academic literacy development;

  • Generating claims about impact that may include, but are not limited to, evidence related to staff and student perceptions; far more evidence related to the contribution of teaching to changes in academic performance is needed;

  • Research teams involving interdisciplinary collaborations where specialist knowledge of the discipline and knowledge of discipline-specific literacy are both present; and

  • Publishing pairs of research contributions: empirical findings about impact and corresponding detailed descriptions of pedagogic practices.

We hope that this call to action will motivate research that contributes to building a more rigorous body of knowledge about the benefits and limitations of embedded approaches to academic literacy development. In turn, such rigour can better inform decision making about specific strategies to shift the long-standing and unjust differences in completion rates for more marginalised students.

Disclosure statement

No potential conflict of interest was reported by the author(s).

References

  • Andrews, C.D.M., M. Mehrubeoglu, and C. Etheridge. 2021. Hybrid model for multidisciplinary collaborations for technical communication education in engineering. IEEE Transactions on Professional Communication 64, no. 1: 52–65.
  • Arkoudis, S., and A. Harris. 2019. EALD students at university level: Strengthening the evidence base for programmatic initiatives. In Second handbook of English language teaching, edited by X. Gao. Springer.
  • Baker, S., C. Field, J.-S. Lee, and N. Saintilan. 2021. Supporting students’ academic literacies in post-COVID-19 times: Developing digital videos to develop students’ critical academic reading practices. Journal of University Teaching and Learning Practice 18, no. 4: 35–49.
  • Bassett, M. 2022. Learning advisor & lecturer collaborations to embed discipline-specific literacies development in degree programmes. Doctoral thesis, University of Auckland. https://hdl.handle.net/2292/58275.
  • Bearman, M., C.D. Smith, A. Carbone, S. Slade, C. Baik, M. Hughes-Warrington, and D.L. Neumann. 2012. Systematic review methodology in higher education. Higher Education Research & Development 31, no. 5: 625–40.
  • Boughey, C., and S. McKenna. 2021. Understanding higher education: Alternate perspectives. Cape Town: African Minds.
  • Bourdieu, P., and J.-C. Passeron. 1965. Introduction: Langage et rapport au langage dans la situation pédagogique. In Rapport pédagogique et communication, edited by P. Bourdieu, J.-C. Passeron, and M. de Saint Martin. Paris: Mouton.
  • Chanock, K., C. Horton, M. Reedman, and B. Stephenson. 2012. Collaborating to embed academic literacies and personal support in first year discipline subjects. Journal of University Teaching and Learning Practice 9, no. 3: 18–31.
  • Critical Appraisal Skills Programme. 2022. CASP checklists. https://casp-uk.net/casp-tools-checklists/.
  • Denzin, N.K. 1989. The research act. Englewood Cliffs: Prentice-Hall.
  • Devereux, L., K. Wilson, A. Kiley, and M. Gunawardena. 2018. The proof of the pudding: Analysing student written texts for evidence of a successful literacy intervention. Journal of Academic Language and Learning 12, no. 1: A239–A253. https://journal.aall.org.au/index.php/jall/article/view/525.
  • Divan, A., M. Bowman, and A. Seabourne. 2015. Reducing unintentional plagiarism amongst international students in the biological sciences: An embedded academic writing development programme. Journal of Further and Higher Education 39, no. 3: 358–78.
  • Gee, J.P. 1996. Social linguistics and literacies: Ideologies in discourses. London: Routledge.
  • Groves, M., K. Leflay, J. Smith, B. Bowd, and A. Barber. 2013. Encouraging the development of higher-level study skills using an experiential learning framework. Teaching in Higher Education 18, no. 5: 545–56.
  • Halliday, M.A.K. 1993. Towards a language-based theory of learning. Linguistics and Education 5: 93–116.
  • Harris, A. 2016. Integrating written communication skills: Working towards a whole of course approach. Teaching in Higher Education 21, no. 3: 287–300. http://dx.doi.org/10.1080/13562517.2016.1138456.
  • Hattie, J., J. Biggs, and N. Purdie. 1996. Effects of learning skills interventions on student learning: A meta-analysis. Review of Educational Research 66, no. 2: 99–136.
  • Hillege, S.P., J. Catterall, B.L. Beale, and L. Stewart. 2014. Discipline matters: Embedding academic literacies into an undergraduate nursing program. Nurse Education in Practice 14, no. 6: 686–91.
  • Hong, Q.N., P. Pluye, S. Fàbregues, G. Bartlett, F. Boardman, M. Cargo, P. Dagenais, et al. 2018. Mixed methods appraisal tool (MMAT), version 2018. Canadian Intellectual Property Office. http://mixedmethodsappraisaltoolpublic.pbworks.com/w/file/fetch/139355532/MMAT_2018_criteria_WORD_2018-08-08.docx.
  • Hunter, K., and H. Tse. 2013. Making disciplinary writing and thinking practices an integral part of academic content teaching. Active Learning in Higher Education 14, no. 3: 227–39.
  • Hyland, K. 2022. English for specific purposes: What is it and where is it taking us? ESP Today 10, no. 2: 202–20.
  • Hyland, K., and F. Jiang. 2021. Delivering relevance: The emergence of ESP as a discipline. English for Specific Purposes 64: 13–25.
  • Hyland, K., and P. Shaw. 2016. The Routledge handbook of English for academic purposes. London: Routledge.
  • Jacobs, C. 2005. On being an insider on the outside: New spaces for integrating academic literacies. Teaching in Higher Education 10, no. 4: 475–87.
  • Jacobs, C. 2010. Transgressing disciplinary boundaries: Constructing alternate academic identities through collaboration with ‘the other’. African Journal of Research in Mathematics, Science and Technology Education 14, no. 2: 110–20.
  • Lea, M.R., and B. Street. 1998. Student writing in higher education: An academic literacies approach. Studies in Higher Education 23, no. 2: 157–72.
  • Lea, M.R., and B. Street. 2006. The academic literacies model: Theory and applications. Theory into Practice 45, no. 4: 368–77.
  • Macnaught, L. 2024. Writing with students: New perspectives on collaborative writing in EAP contexts. London: Bloomsbury Academic.
  • Macnaught, L., M. Bassett, V. van der Ham, J. Milne, and C. Jenkin. 2024. Sustainable embedded academic literacy development: The gradual handover of literacy teaching. Teaching in Higher Education 29, no. 4: 1004–22.
  • Martin, J.R., and D. Rose. 2007. Working with discourse: Meaning beyond the clause. London: Continuum.
  • Ministry of Education. 2023. Course completion rates – 2022. New Zealand Government. https://www.educationcounts.govt.nz/statistics/achievement-and-attainment.
  • National Student Clearinghouse Research Center. 2022. Completing college: National and state reports. https://nscresearchcenter.org/wp-content/uploads/Completions_Report_2022.pdf.
  • Noakes, S. 2021. Reality check: Supporting law student diversity and achievement through a novel model of support and assessment of academic literacy: Student perceptions, retention and performance. The Law Teacher 55, no. 3: 337–63.
  • Office for Students. 2023. The office for students annual review 2022. https://www.officeforstudents.org.uk/publications/annual-review-2022/a-statistical-overview-of-higher-education-in-england/.
  • O’Flaherty, J., and M. Costabile. 2020. Using a science simulation-based learning tool to develop students’ active learning, self-confidence and critical thinking in academic writing. Nurse Education in Practice 47: 102839.
  • Paul, A., and K.M. Gilbert. 2013. Interdisciplinary collaboration for clinical skills acquisition in the transition to ward-based learning. Journal of Applied Linguistics and Professional Practice 8, no. 1: 89–119.
  • Pessoa, S., T.D. Mitchell, and R.T. Miller. 2018. Scaffolding the argument genre in a multilingual university history classroom: Tracking the writing development of novice and experienced writers. English for Specific Purposes 50: 81–96.
  • Rose, D., M. Rose, S. Farrington, and S. Page. 2008. Scaffolding academic literacy with indigenous health sciences students: An evaluative study. Journal of English for Academic Purposes 7, no. 3: 165–79.
  • Salamonson, Y., J. Koch, R. Weaver, B. Everett, and D. Jackson. 2010. Embedded academic writing support for nursing students with English as a second language. Journal of Advanced Nursing 66, no. 2: 413–21.
  • Shamseer, L., D. Moher, M. Clarke, D. Ghersi, A. Liberati, M. Petticrew, P. Shekelle, and L.A. Stewart. 2015. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015: Elaboration and explanation. British Medical Journal 349: g7647.
  • Smith, J., M. Groves, B. Bowd, and A. Barber. 2012. Facilitating the development of study skills through a blended learning approach. International Journal of Higher Education 1, no. 2: 108–17.
  • Tribble, C., and U. Wingate. 2013. From text to corpus – A genre-based approach to academic literacy instruction. System 41, no. 2: 307–21.
  • Turner, J. 2004. Language as academic purpose. Journal of English for Academic Purposes 3, no. 2: 95–109. http://dx.doi.org/10.1016/S1475-1585(03)00054-7.
  • Wette, R. 2019. Embedded provision to develop source-based writing skills in a year 1 health sciences course: How can the academic literacy developer contribute? English for Specific Purposes 56: 35–49.
  • Wingate, U. 2015. Academic literacy and student diversity: The case for inclusive practice. Bristol: Multilingual Matters.
  • Wingate, U. 2018. Academic literacy across the curriculum: Towards a collaborative instructional approach. Language Teaching 51, no. 3: 349–64.
  • Wingate, U., N. Andon, and A. Cogo. 2011. Embedding academic writing instruction into subject teaching: A case study. Active Learning in Higher Education 12, no. 1: 69–81.
  • Younger, K., L. Gascoine, V. Menzies, and C. Torgerson. 2019. A systematic review of evidence on the effectiveness of interventions and strategies for widening participation in higher education. Journal of Further and Higher Education 43, no. 6: 742–73.