Research Article

Reflections on conducting rapid reviews of educational research

Pages 371-390 | Received 11 Oct 2021, Accepted 30 Aug 2022, Published online: 28 Sep 2022

ABSTRACT

Background

Rapid reviews involve a streamlined approach to knowledge synthesis. They are used to identify high-quality evidence to inform decisions and initiatives, are completed over relatively short timeframes, and have been found to reach conclusions that do not differ extensively from those of full systematic reviews. Although common in the health sector, rapid reviews are not as widespread in education.

Purpose

This paper reflects on the experiences of conducting a rapid review that applied review guidance from the health sector to a topic situated within education: effective Professional Learning (PL) for school-based educators. Our purpose is not to share the rapid review’s findings: rather, our interest lies in exploring the process of undertaking the review. We sought to investigate the methodological decisions we made for the education context as we carried out the review.

Methods

As part of a large-scale investigation focusing on practitioner use of research evidence in education, we undertook a rapid review to understand what is known about effective PL. Drawing on methodological literature from the health and education sectors, we documented the procedure involved in conducting our rapid review in education. At each step, we reflected on methodological issues encountered, decisions taken and the procedural adjustments we made to align the process to the education context.

Findings

Our reflections identify the key adaptations we made to ensure that review guidance was carefully attuned to the context of the education field and the wider purpose of the review: in our case, to inform an initiative in education. Considerations highlighted by our procedure also included the role of reviewer judgement in quality appraisal and attending to collaborative review team processes. These reflections support the notion that the use of research to inform decisions in education needs to be a dynamic, contextualised, and collaborative process.

Conclusion

Rapid reviews have a crucial part to play in efforts to strengthen evidence-informed practice in the education sector. Our methodological exploration offers insights for those conducting, using, and commissioning rapid reviews to provide systematic and transparent evidence-based guidance for initiatives in education.

Introduction

In the Organisation for Economic Co-operation and Development (OECD) report Evidence in Education (OECD 2007), the education sector was described as one with low levels of investment in research, and weak links between policy, research, and innovation. More recently, in a comparative review with the health sector, White (2020, 30) reports that the ‘evidence architecture’ within education is still far behind health. However, there are growing expectations internationally that schools and school systems will use research evidence to underpin and inform their improvement efforts (e.g. Australian Government Productivity Commission 2016; British Educational Research Association 2014; White et al. 2021). Indeed, over the last decade there has been ‘a global push to bolster the connections between research and practice’ (Malin et al. 2020, 1), with evidence-related developments across varied countries (e.g. Brown and Malin 2022).

Against this backdrop, there is increasing interest in the role that research synthesis could play in efforts to enhance the impact of educational research and its influence on education policy and practice (e.g. Hattie, Rogers, and Swaminathan 2014). Research synthesis encompasses a range of approaches used to bring together the findings of different research studies. Approaches that have been used in the education sector include systematic reviews, meta-analyses, and best-evidence syntheses, all of which involve an explicit way of identifying, appraising, and synthesising existing research. Andrews and Harlen (2006, 287) argue that research syntheses are essential to ‘any knowledge system that purports to be ‘evidence-based’ or ‘evidence-informed’, including policy, practice, and research’.

There are several organisations worldwide specialising in research syntheses and reviews to inform policy and practice in sectors including education. For example, in the UK, the Evidence for Policy and Practice Information and Co-ordinating Centre (EPPI-Centre) specialises in developing methods for systematic reviews and synthesis, working in research synthesis and use across many fields. Further, the work of the Education Endowment Foundation (EEF) includes the provision of evidence summaries to inform the practice of teachers and senior leaders. Other examples include the What Works Clearinghouse (WWC) in the US, the Best Evidence Synthesis Programme in New Zealand, and the Danish Clearinghouse for Educational Research in Denmark.

One form of research synthesis that is emerging within and beyond education is the rapid review. Rapid reviews are most common in the health sciences, and are becoming more common in the social sciences, including education (Cooper and Koenka 2012). They have been described as reviews of systematic reviews (Newman and Gough 2020), with features similar to overviews and umbrella reviews (e.g. Polanin, Maynard, and Dell 2017). In the field of health, the Cochrane organisation – ‘a global leader in the production of high quality SRs [systematic reviews] and methodological guidance’ (Garritty et al. 2021, 14) – has been active in encouraging engagement with rapid reviews, and in the development of standards for rapid review reporting, through the research activities of the Cochrane rapid reviews methods group (see Garritty et al. 2020, 2021). Garritty et al. (2020, 2021) recommend the following definition of a rapid review, which they cite from Hamel et al. (2021):

… a form of knowledge synthesis that accelerates the process of conducting a traditional systematic review through streamlining or omitting specific methods to produce evidence for stakeholders in a resource-efficient manner. (Hamel et al. 2021, cited in Garritty et al. 2020, 1; Garritty et al. 2021, 15)

As evident from the above, the growth of rapid reviews has led to an increase in methodological discussion and the development of guidance around their conduct in health (e.g. Garritty et al. 2020, 2021) and education (e.g. Polanin, Maynard, and Dell 2017). Such discussion and guidance highlight the importance of reviewing as a practice and raise important questions about the methodological decisions that are made by reviewers as part of the review process. This paper reflects on our experiences of conducting a rapid review in the field of education. We applied review guidance from the health sector to a topic in education: specifically, we used the Cochrane rapid reviews interim guidance from the Cochrane rapid reviews methods group (Garritty et al. 2020) to guide our review procedure (see note 1). We also drew on other methodological literature from the health and education sectors. Our rapid review was focused on the topic of effective Professional Learning (PL) for school-based educators and sought to understand what is known about effective PL to inform the design of a large-scale PL initiative (Cirkony et al. 2021). However, our aim here is not to detail the findings of our review: rather, we seek to reflect on the ways in which this review was undertaken, the methodological issues that we encountered, the decisions that were taken, and the ways we applied and adjusted the guidance for the education context and purpose. We see this as building on and contributing to discussions about the practice and experience of conducting different kinds of research synthesis within and beyond education (e.g. Andrews and Harlen 2006; Rickinson and May 2009).

We begin by outlining the role of rapid reviews in health and their emergence in education. Next, we briefly explain the aims of our rapid review. We then reflect in detail on the conduct of our review (i.e. selection criteria; search strategy; screening, selection and quality appraisal; analysis, synthesis, and reporting; stakeholder engagement). We conclude by discussing the strengths and limitations of our rapid review process, and consider the implications for the conduct, use, and commissioning of rapid reviews. Overall, we argue that rapid reviews have an important role in synthesising high-quality research to inform decisions and initiatives in education, and that their conduct requires careful consideration of the complexities of this sector.

Background

The growth of rapid reviews in health and education

Though systematic reviews have a key role in health and medical research, they can often be time-consuming and costly in terms of human and financial resources (Harker and Kleijnen 2012; Tricco et al. 2015). Rapid reviews have emerged as a more efficient approach to providing timely access to information used for high-priority evidence-informed decision-making in these areas, with users ranging from policymakers to clinicians (Garritty et al. 2021; Tricco et al. 2015; Watt et al. 2008). They have been found to be helpful for supporting the direction of an initiative or the development of an intervention (Khangura et al. 2012). Rapid reviews also have potential for combining generalisable evidence and contextualising findings with the addition of data specific to a local setting (Watt et al. 2008). In terms of the quality of findings, the conclusions of rapid reviews have not been found to differ extensively from those of full systematic reviews (Tricco et al. 2015; Watt et al. 2008). Rapid reviews can take various forms, including evidence summaries, health technology assessments, and overviews. Without access to such timely evidence, there is the risk that decision-makers may be forced to rely on less robust or comprehensive evidence, such as expert opinion or the findings of a single small study (Tricco et al. 2015).

According to Tricco et al. (2015), systematic reviews in the health sector take from six months to two years to conduct, whereas rapid reviews take anywhere between one and twelve months. This is consistent with other studies that have found rapid review timeframes ranging from one week to several months (Watt et al. 2008), three weeks to six months (Ganann, Ciliska, and Thomas 2010), and between one and six months (Harker and Kleijnen 2012). To achieve this efficiency, rapid review methods rely on streamlining processes. This typically involves restricting the inclusion criteria (e.g. by publication date and language), limiting the literature search to databases and published literature, and decreasing the number of reviewers (while still having at least two). Additionally, streamlined methods are used for quality assessment, data extraction, and the synthesis of findings (e.g. narrative summaries) (Ganann, Ciliska, and Thomas 2010; Tricco et al. 2015). However, rapid reviews are a relatively new method, with questions remaining around appropriate methodology (Newman and Gough 2020). Other reported limitations include a lack of explicit description of methodology and the limited scope of findings (Harker and Kleijnen 2012; Watt et al. 2008).

Though less common, rapid reviews in education show similar potential (and limitations) to those in the health field. Polanin, Maynard, and Dell (2017) explored the use of overviews in education. They found that such reviews have the potential to inform education policy and provide guidance to researchers and practitioners alike. Echoing points made in relation to rapid reviews in the health field, they identified the benefits of such reviews as providing timely and/or up-to-date information and a broad summary of evidence. Further, these reviews enable the comparing and contrasting of findings across multiple reviews, which can help guide practice. As with the issues raised in the health sector, they found deficiencies in the methodological reporting, conduct, and synthesis of overviews in education research. Based on these findings, they proposed guidelines for the conduct of overviews in the education sector.

The generation of rapid reviews as part of the educational response to COVID-19 is another indication of their potential significance within education. In the UK, for example, the EEF generated rapid evidence assessments to provide timely information on issues such as remote learning and remote professional development (Education Endowment Foundation 2020a, 2020b). In their methodology, they assembled specialist teams to review evidence from systematic reviews and meta-analyses, following guidance including the Cochrane rapid reviews interim guidance from the Cochrane rapid reviews methods group (Garritty et al. 2020), and guidance on overviews of reviews (Pollock et al. 2020) from the Cochrane handbook for systematic reviews of interventions.

Notwithstanding the development of, and growth in interest in, rapid reviews in education, the rapid review process remains an emerging methodology within the education field, and there is limited literature on the procedure of using and applying guidance and recommendations from other sectors for rapid reviews in education.

Purpose

This paper seeks to contribute to the methodological literature by reflecting on the conduct of a rapid review that we undertook in the field of education. This process involved using and, where necessary, adapting the interim guidance from the Cochrane rapid reviews methods group (Garritty et al. 2020). The guidance presents provisional recommendations, draws from the best practices of high-quality systematic reviews, and seeks to address the limitations that have been associated with rapid reviews (see note 2). Our process also involved drawing more widely on methodological literature from the health and education sectors. In the next section, we briefly explain the context of the rapid review itself and outline our methodological approach.

Method

Context and methodology for rapid review

As explained above, the purpose of this paper is not to focus on the content of the rapid review we carried out. Rather, our interest lies in exploring the process: the methodological issues that were encountered as we carried out the review, the decisions that were taken, and the application and adjustment of the guidance that was used. Nonetheless, for intelligibility, it is necessary first to summarise the context in which the rapid review (Cirkony et al. 2021) was undertaken. The research was part of a large-scale investigation focusing on practitioner use of research evidence in education (see, for example, Rickinson et al. 2020, 2021). We set out to clarify what is known about effective PL to inform the development of a PL programme focused on practitioner research use. To establish a broad understanding of the research landscape within a limited timeframe, we undertook a rapid review to understand what is known about effective PL for K-12 (pupils aged 5–18 years) school-based educators and to gain an awareness of the limitations associated with PL design.

Our rapid review addressed the following research question: What are the features of effective professional learning for school-based educators? By ‘professional learning for school-based educators’, we mean any form of formal organised training, professional development or professional learning provided for in-service K-12 schoolteachers and leaders. As noted earlier, this process was informed by the Cochrane rapid reviews interim guidance from the Cochrane rapid reviews methods group (Garritty et al. 2020). Our procedure included: drafting a clear protocol (i.e. plan) with eligibility criteria; prioritising high-quality study designs (e.g. systematic reviews) and databases that focus on high-quality systematic reviews; involving an information specialist (e.g. a librarian) and two reviewers; and applying a valid risk-of-bias tool. Importantly, as discussed in more detail later, we enlisted the input of experts in PL. The suggested timelines for rapid reviews ranged from one week to six months, after formal approval of the protocols. Our rapid review was conducted over a three-month period. In applying the interim guidance from the Cochrane rapid reviews methods group (Garritty et al. 2020), and drawing on relevant methodological literature more broadly, we made and documented specific methodological decisions based on considerations for the education sector. It is our reflections on this process that are presented in detail in the following section.
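
For illustration, the kind of protocol elements just described could be captured in a structured form, as in the minimal, hypothetical Python sketch below; the field names and values are illustrative only, not our actual protocol document.

```python
# Hypothetical sketch of the protocol elements described above.
# Field names and values are illustrative, not our published protocol.
protocol = {
    "review_question": ("What are the features of effective professional "
                        "learning for school-based educators?"),
    "timeframe_months": 3,
    "reviewers": 2,                    # guidance recommends at least two
    "information_specialist": True,    # e.g. a librarian
    "eligibility": {
        "population": "in-service K-12 schoolteachers and leaders",
        "study_designs": ["systematic review", "meta-analysis",
                          "mixed-methods review"],
    },
    "appraisal_tool": "JBI Critical Appraisal Tool",
}
```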

Findings

In the subsections below, we present our reflections on the process of conducting the rapid review. We discuss, in sequence, the decisions we made associated with the main stages of the process: the selection criteria; search strategy; screening, selection, and quality appraisal; analysis, synthesis, and reporting; and stakeholder engagement. We also consider the potential limitations of our rapid review. In our presentation of findings, we elaborate on how, in our view, the guidance we applied relates to the education context, reflecting on the education-related adjustments we made and highlighting considerations for the conduct of rapid reviews in this sector.

Reflecting on the selection criteria

Initial activities in the conduct of a rapid review require establishing the review question(s) and determining the eligibility criteria for the included studies (Gough, Oliver, and Thomas 2017). To support this process, the Population, Intervention, Comparison, Outcomes, and Study (PICOS) framework is commonly used for systematic reviews (Cooke, Smith, and Booth 2012; see note 2). The Cochrane rapid reviews interim guidance (Garritty et al. 2020) suggested the use of PICOS as part of setting the research questions. The PICOS framework has its origins in evidence-based medicine and quantitative research and has been found to have more limited application for the qualitative and mixed methods research studies common to education reviews (Cooke, Smith, and Booth 2012; Zawacki-Richter et al. 2020). In terms of our rapid review’s specific focus on PL, it is noteworthy that establishing the impact of PL interventions on teachers’ practice and students’ outcomes, particularly in relation to causality, has been characterised as the ‘twin black boxes of teacher and student learning’ (Timperley et al. 2007, viii). Though we did identify relevant papers that sought to establish causality (e.g. Kennedy 2016; Timperley et al. 2007; Yoon et al. 2007), the authors of these papers themselves critique these approaches. For example, Yoon et al. (2007) explained that establishing a causal link requires adequate measures of teaching practices and student achievement, enough data to warrant the use of appropriate statistical analysis, and high-quality implementation of the PL. Further, the impact of such studies has been critiqued as being weak (Kennedy 2016; Opfer and Pedder 2011). Such designs require changing one variable (e.g. PL), while all others remain constant (e.g. population, context), to establish impact (e.g. change in teacher practice and/or student achievement). An exclusive focus on causal designs would also exclude other approaches that are relevant for the education sector.

The overall purpose of our rapid review was to provide guidance for effective PL programme design. Our research question focused on the ‘effective features of PL’, consistent with ‘what works’ approaches based on experimental studies with control groups (i.e. causal designs) to establish change and impact. For example, reviews generated by the What Works Clearinghouse (2014) include only well-designed quasi-experimental studies, single-case designs, and randomised experiments. However, educational studies typically draw from an extremely broad range of methodologies, from ethnographic studies, through correlational studies, to randomised field trials (Biesta 2007; Desimone 2009; Wolgemuth, Hicks, and Agosto 2017), reflecting the complexities involved in addressing research questions related to teaching and learning. In our view, excluding some areas within the broad range of research designs that exist in educational studies would risk leaving gaps in the full knowledge base needed for the practice-based profession of education (Wolgemuth, Hicks, and Agosto 2017). Further, an exclusive emphasis on causality would not reflect the values-based practice of education and may ‘promote a misconceived form of education’ (Hammersley 2020, 26).

In relation to the PL field that we were reviewing, it is particularly interesting to note that a narrow focus on causal designs has been criticised for not adequately capturing the complexities of PL, including understanding how its mechanisms work at different intensities, at different scales, and in different practical contexts (McChesney and Aldridge 2019; Opfer and Pedder 2011). While there is a growing number of large-scale studies in education (e.g. systematic reviews, meta-analyses), these emphasise aggregative and consistent findings over theoretical explanations of teacher learning (Opfer and Pedder 2011). Moreover, they may provide only limited information about the processes occurring within an education intervention (Gough 2007), or the level of intellectual demand on teachers associated with a given activity (Kennedy 2016).

In light of these considerations, we found it necessary and helpful, in the context of our review activity in education, to adapt the PICOS framework to include a broader view of interventions, comparison groups, outcomes, and study designs. This meant that we took a broad view of ‘effective’, including qualitative studies. Further, in terms of comparison groups, whilst we included ‘reviews that include studies with no PL/PL interventions for control groups’ (Cirkony et al. 2021, 17), we also included studies that did not have control groups. In terms of study design, we opted to include a range of methodologies in our synthesis (e.g. mixed method reviews), analyse the essential conclusions of each study, and include these in the reporting of our findings through narrative synthesis (see further Cirkony et al. 2021). The associated issues of validity are addressed in the following sections. In summary, we adapted the PICOS framework to include the more diverse research designs and evidence sources apparent in the education sector. This resulted in the inclusion of studies that provided contextually relevant insights to inform the PL initiative that was the subject of our wider study (Cirkony et al. 2021).
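
To make the effect of this adaptation concrete, the sketch below expresses a screening rule of the kind implied by our broadened criteria. It is hypothetical: the function, field names, and design labels are our illustrative assumptions, not an implementation we used.

```python
# Hypothetical sketch of a screening rule under the adapted PICOS criteria.
# Field names and design labels are illustrative only.
ELIGIBLE_DESIGNS = {
    "systematic review", "meta-analysis", "mixed-methods review",
    "qualitative synthesis",           # broadened beyond causal designs
}

def meets_adapted_criteria(review: dict) -> bool:
    """Return True if a candidate review fits the broadened criteria."""
    right_population = review["population"] == "in-service K-12 educators"
    eligible_design = review["design"] in ELIGIBLE_DESIGNS
    # Broadened comparison criterion: a control group is not required.
    return right_population and eligible_design

print(meets_adapted_criteria(
    {"population": "in-service K-12 educators",
     "design": "mixed-methods review"}))  # -> True, despite no control group
```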

Reflecting on the search strategy

After defining the selection criteria, we developed a search strategy. This requires identifying papers to be included from databases and other sources, which involves knowledge of relevant databases and keywords related to the target studies (Gough, Oliver, and Thomas 2017). Broadly in line with, and with some adjustments to, the approach suggested in the Cochrane rapid reviews interim guidance (Garritty et al. 2020), we consulted with an information specialist and selected specialised databases (Garritty et al. 2020 advise that specialised databases are recommended for certain topics, but that their number should be restricted, or such searches omitted, for reasons of time and resource). We also included grey literature and supplemental searches (Garritty et al. 2020 advise limiting grey literature and supplemental searching). Given that the Garritty et al. (2020) guidance is for rapid reviews in the field of health (and therefore gives search guidance specific to the health sector, for example, in terms of databases), we found it necessary to adapt the advice to include education sector-specific databases, and to develop additional search strategies beyond databases to better reflect the education sector.

The implementation of our search strategy involved the identification of key words. The compilation of relevant keywords began with an initial sample set of core PL literature (mostly primary research) that was identified by our PL experts. Using this sample set, we were able to identify a set of keywords, and then generate additional ones through database thesaurus synonyms, with assistance from an information specialist using Ovid databases. The information specialist’s role was integral in providing a peer review of our search strategy, which included support with translating our review question into salient search terms, confirming relevant databases for the education sector, and optimising searches in complex databases (e.g. PsycInfo). This additional guidance is considered important for systematic reviews. In the context of health research, Sampson et al. (2009) highlighted the need for peer reviews of electronic searches, to ensure the identification of relevant research and reduce the risk of errors. For our rapid review, the information specialist assisted us with generating search terms that were relevant to databases, along with search strings, resulting in a shorter list of salient papers from which to screen.
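
For illustration, the sketch below shows how a Boolean search string of this kind might be assembled from keyword groups. The terms are simplified and hypothetical, not our actual Ovid or ERIC search syntax.

```python
# Hypothetical sketch: assembling a Boolean search string from keyword
# groups. The terms are illustrative, not our actual search strategy.
pl_terms = ['"professional learning"', '"professional development"',
            '"teacher training"']
population_terms = ['teacher*', 'educator*', '"school leader*"']
design_terms = ['"systematic review"', '"meta-analy*"', 'synthesis']

def or_group(terms):
    # Join synonyms with OR and wrap the group in parentheses.
    return "(" + " OR ".join(terms) + ")"

search_string = " AND ".join(
    or_group(group) for group in (pl_terms, population_terms, design_terms)
)
print(search_string)
# ("professional learning" OR ...) AND (teacher* OR ...) AND (...)
```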

Our selection of databases included those specialising in systematic reviews suggested by Garritty et al. (2020). However, as mentioned above, for the purposes of our rapid review in education, we needed to identify further systematic review databases. Although, as stated earlier, Garritty et al. (2020) suggest limits on the use of additional specialised databases, in the context of our rapid review in education, given the limited number of systematic reviews on the topic of PL, it was evident that these other databases were important for our searches. Thus, our final selection included Cochrane, as suggested in Garritty et al. (2020), with additions (i.e. from the Campbell Collaboration and the EPPI-Centre), as well as subject-specific databases with appropriate relevance for education (i.e. PsycInfo, ProQuest Education, ERIC) (see further Cirkony et al. 2021).

We also needed to address the limited number of published reviews identified in our initial searches. In response, we expanded our search to include grey literature and supplemental searches (e.g. via Google Scholar and informal searches), and widened our date range. The inclusion of grey literature involved undertaking searches outside the formal databases, which raised important issues about bias. Using non-database searches can help to minimise publication and search bias. However, the practice of including non-peer-reviewed publications can be criticised for reasons of rigour (Suri 2020). Their use, though, was important in our search strategy.

Though our database searches identified five publications for inclusion, the grey literature and supplemental searches identified seven. This outcome is particularly interesting given Greenhalgh and Peacock’s (2005) study auditing the origin of primary sources in health-related systematic reviews. They reported that only 30% of the primary papers were identified through the database and hand-searches defined in the protocol. The rest were identified through other means, such as reference lists, personal knowledge, and personal contacts. They concluded that systematic reviews of complex evidence cannot rely solely on protocol-driven search methods. Given the complexity of evidence in the education sector, this finding underscores the possibility that supplemental searching may need to play a heightened role in reviews of education evidence. Inevitably, such a shift requires careful thought and decision-making in terms of what to include or exclude. In a methodological guidance paper for systematic reviews in education, Alexander (2020) emphasised that decisions around informal searches involve the need for expertise and judgement – a theme that is discussed further in the next section.

We drafted a plan (i.e. protocol) for our review, outlining these initial steps in the selection criteria and search strategy. Our plan delineated the purpose of the review, timelines, review question, and methodology (e.g. inclusion and exclusion criteria, search strategy, selection of studies, and stakeholder engagement). The development and publication of a protocol prior to undertaking the review is recommended in Garritty et al. (2020) and is consistent with high-quality reviews (e.g. Cochrane, EPPI-Centre, EEF). Although we did not publish our protocol, we found that creating a documented plan was helpful for stakeholder consultations (e.g. with PL experts), enabling us to refine our approach.

As explained above, we adapted our search strategy to include systematic review databases relevant to education, as well as using more sector-specific databases. We needed to place emphasis on supplemental sources to identify papers, as it was evident that these yielded more relevant results than those identified through the database searches. The development of our protocol was a useful initial step for sharing and refining our approach through discussions with key stakeholders.

Reflecting on the screening, selection, and quality appraisal

Identifying relevant studies from the searches required an initial screening of their titles and/or abstracts, followed by an in-depth review of selected full text documents (Gough, Oliver, and Thomas 2017). For systematic reviews, this process is typically undertaken by at least two reviewers, and involves a method to assess the quality and risk of bias in each included study (Polanin, Maynard, and Dell 2017). In line with suggestions in Garritty et al. (2020), two reviewers were involved in our process. As described in the paragraphs below, where possible, tasks were alternated and decisions cross-checked. We encountered issues associated with appraising studies and the need for researcher expertise.

Given the relatively small number of identified studies (627), abstracts were screened against the inclusion and exclusion criteria by one reviewer (rather than using a second reviewer to screen all excluded abstracts, as suggested in Garritty et al. (2020)). After removing the duplicates, 39 studies proceeded to full text review by both reviewers. We initially excluded 26 of these: 10 were not systematic literature reviews and two had limited findings, reflecting the limited number of systematic reviews on the topic and the prevalence of reviews with limited findings (e.g. Harker and Kleijnen 2012). The other 14 were deemed not relevant for several reasons: they were exploratory studies; they involved pre-service teachers; they were programme- or subject-specific (e.g. mathematics, classroom management); or they took place in higher education settings.
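
As a simple consistency check, the screening counts reported above can be tallied as follows; the category labels are our shorthand, but the numbers are those given in the text.

```python
# Tallying the screening counts reported above; category labels are
# shorthand, but the numbers are those given in the text.
full_text_reviewed = 39            # after duplicate removal
excluded = {
    "not a systematic review": 10,
    "limited findings": 2,
    "not relevant": 14,            # e.g. pre-service, subject-specific
}
remaining = full_text_reviewed - sum(excluded.values())
assert remaining == 13             # studies taken forward to data extraction
```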

One reviewer then extracted descriptive data from the 13 remaining studies into an electronic spreadsheet, following a similar format to systematic reviews (e.g. review question, setting, conclusions) (Gough, Oliver, and Thomas 2017). Following the data extraction guidance in Garritty et al. (2020), the second reviewer checked the data extraction results. No additional changes were made. The papers that met our inclusion criteria were syntheses of primary studies (between 5 and 97 primary studies per review), using varied research methods (e.g. randomised control trials, quasi-experimental studies, qualitative studies). We noted that it was not possible to identify clearly the number of participants involved in each of the studies in the included reviews without referring back to the original studies, which went beyond the remit of our rapid review.

For quality assessment, we used the Joanna Briggs Institute (JBI) Critical Appraisal Tool for systematic reviews (Aromataris et al. 2015). We chose this tool because it has well-established validity and reliability for evaluating systematic reviews that draw from diverse research methods. The tool also accommodates the appraisal of quantitative and qualitative systematic reviews, as well as meta-analyses. The tool consists of eleven questions to guide the appraisal of systematic reviews or meta-analyses. According to Aromataris et al. (2015), these questions are intended to prompt discussions between the two reviewers. The questions focus on issues pertaining to the studies under review, such as the inclusion criteria (Were the inclusion criteria appropriate for the review question?); the search strategy (Were the sources and resources used to search for studies adequate?); the appraisal process (Was critical appraisal conducted by two or more reviewers independently?); and the synthesis methods (Were the methods used to combine studies appropriate?). At this stage in our process, the two reviewers intentionally worked separately, rather than entering into discussion, as we were following the procedure of one reviewer carrying out a task and the other checking (see the paragraph above). Accordingly, the first reviewer applied the appraisal tool, focusing on the quality of the review designs and the essential outcomes of each review. This process resulted in one additional article being excluded due to reporting limited findings. The second reviewer then evaluated the results of the appraisal and determined that no additional changes were needed.

Reflecting on our decision to use the JBI tool, though it proved useful in evaluating diverse research designs, we found that, in our context, it did not provide a way for us to address whether a research study would be ‘fit for purpose’ in terms of its intended use (i.e. the intended use of the rapid review findings: in the context of our review, to identify effective PL approaches to inform an education initiative). This requires that research studies, in turn, be synthesised using relevant questions and appropriate methods. Similar issues have been raised in the health sector, particularly around the limitations of evidence hierarchies for determining the effectiveness of social interventions. For example, Petticrew and Roberts (2003) proposed a matrix-style typology of evidence sources, which can help to determine the strengths and weaknesses of studies by focusing on the link between the research design and the intended use. Their insights provide a way forward in assessing diverse evidence sources, but still require consideration of the weighting of these sources in making decisions around services and policies.

One strategy for assessing the quality and relevance of diverse studies, and one considered suitable for rapid reviews, is the Weight of Evidence (WoE) approach (see Gough 2007). The WoE is a matrix-type heuristic for assessing studies against a specific review question according to three dimensions: (a) internal validity; (b) appropriateness of study method; and (c) appropriateness of samples, context, and measures. Each dimension is assessed separately, then combined into an overall WoE score, from high to low. For example, the conduct of a systematic review may be of good quality, but the review may not be fit for purpose, and thus be excluded from the review. According to Gough (2007), the purpose of this approach is to make judgements more transparent so that they can be considered and debated. This is particularly relevant for issues around researcher bias.
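
To illustrate the mechanics of such a heuristic, the sketch below combines the three dimensions into an overall judgement. The numeric scale and thresholds are our illustrative assumptions; Gough (2007) leaves such judgements to the review team rather than prescribing a formula.

```python
# Hypothetical sketch of a Weight of Evidence (WoE) combination rule.
# The numeric scale and thresholds are illustrative assumptions, not a
# formula prescribed by Gough (2007).
RATING = {"low": 1, "medium": 2, "high": 3}

def weight_of_evidence(internal_validity, method_fit, context_fit):
    """Combine the three WoE dimensions into an overall judgement."""
    average = sum(RATING[r] for r in
                  (internal_validity, method_fit, context_fit)) / 3
    if average >= 2.5:
        return "high"
    return "medium" if average >= 1.5 else "low"

# A well-conducted review whose samples poorly fit the review context:
print(weight_of_evidence("high", "high", "low"))  # -> "medium"
```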

The risk of bias is a long-standing issue in systematic reviews, hence the need for formal domain-based appraisal checklists (Harker and Kleijnen 2012). However, as Hammersley (2020) reflects, the use of formal checklists, which may be intended to minimise bias, can imply that the background expertise and judgement of researchers should be minimised. Moreover, the reliance on these checklists may be perceived to be at odds with researcher expertise. Indeed, even in the context of evidence-based medicine, an overreliance on systematic reviews may be ‘seen as part of a larger discourse of distrust of professionals and of expertise, and the increasing proceduralisation of decision-making processes in risk-averse organisations’, with the over-adherence to transparency taken to ‘absurd and counterproductive lengths’ (Torrance 2004, 3). As Gough (2007) pointed out, there is a risk of excluding ‘non-ideal’ designs that address the review question, thereby overlooking the fact that these excluded studies might still contain useful information. These discussions highlight the complex interplay between formal appraisal checklists and researcher expertise. Hammersley (2020, 35) cautioned against strict adherence to formal appraisal methods, explaining that such an approach could downgrade ‘the important role of imagination and creativity, as well as of background knowledge and scientific sensibility’. Hence, expertise and judgement are a necessary part of the synthesis process. To this point, our review team required expertise and judgement to determine the quality of the included research studies. We were transparent in presenting our methods and the limitations of our findings in the published article, consistent with the methodological guidance from Alexander (2020). Importantly, our processes for analysis, synthesis, and reporting supported these inclusive and subjective decisions, and are outlined in the next section.

To sum up, we employed two reviewers and applied the JBI appraisal tool to assess the studies, resulting in a streamlined, systematic, and transparent approach. We reflected, too, that other tools that have a more explicit focus on the evaluation of ‘fitness for purpose’ would be particularly helpful for assessing research to inform decisions in the area of education policy. This was particularly important in our context, as the aim of our rapid review was to seek out effective PL approaches to inform an education initiative.

Reflecting on analysis, synthesis, and reporting

The analysis and synthesis bring together the findings from the included studies to respond to the review question (Gough, Oliver, and Thomas 2019). These activities are driven by the purpose of, and intended audience for, the review – in our case, to inform the development of a PL initiative for educators. As suggested in Garritty et al. (2020), we synthesised our findings narratively. To operationalise this, we generated a narrative synthesis of our findings through thematic analysis. A narrative synthesis is a text-based method, which involves summarising and explaining the findings of multiple studies primarily with words (Gough, Oliver, and Thomas 2019; Popay et al. 2006). We conducted our thematic analysis around the main conclusions of our included studies, resulting in eight main thematic categories. From these, we were able to identify areas of consensus and disagreement.

We used narrative synthesis because it is considered an appropriate choice to accommodate the methodological diversity common in systematic reviews of social interventions, including our review (Andrews and Harlen 2006; Popay et al. 2006). According to Andrews and Harlen (2006, 294), this method offers ‘the best possible account of the research being examined’ by balancing the need to transform the research results systematically while seeking to minimise bias. Popay et al. (2006) suggest that this method is particularly helpful for navigating the complexities of social or behavioural interventions (e.g. populations, outcomes, conduct, context), along with their possible modification during implementation. Narrative synthesis methods also address ‘social heterogeneity’, where historical, cultural, or other attributes in the study population may affect both the delivery and impact of interventions (Popay et al. 2006, 14). Finally, this method ‘integrate[s] qualitative and quantitative evidence through narrative juxtaposition – discussing diverse forms of evidence side by side’ (Dixon-Woods et al. 2005, 47).

At the synthesis stage, Garritty et al. (2020, 2) advise that one reviewer should ‘grade the certainty’ of evidence, with the second reviewer verifying their decisions. Adapting these activities to our process, the first reviewer conducted the synthesis and analysis, which underwent critical reflection by the second reviewer. To support our rapid review procedure, we also enlisted two PL experts to review our findings. These decisions were consistent with protocols for high-quality systematic reviews, where at least two reviewers need to be involved in the synthesis stage, drawing from both internal and external reviewers (Andrews and Harlen 2006). Our engagement with PL experts played an important role in providing guidance around the results of the thematic analysis. They commented that although they were familiar with many of the ‘effective features’ of PL that were identified through our review, our findings offered a more comprehensive view of these features, along with the juxtaposition of diverse findings within themes. Their comments were consistent with the idea proposed by Gough, Oliver, and Thomas (2019, 6) that the ‘mark of a good search strategy can be that it finds studies that surprise the review team’.

It is notable that the PL experts’ feedback highlighted certain critical issues around PL that were not captured in the analysis of the ‘essential conclusions’ across the included papers, but were captured in the discussions within the included papers. Thus, we were prompted to elaborate on these issues in the discussion section of our rapid review (Cirkony et al. 2021). This allowed us to address the underlying assumptions connected to ‘effective’ approaches, and how these in turn link to ideas around PL and change. Crucially, in applying the findings from our rapid review to inform our education initiative, we were able to integrate the effective features (e.g. collaboration) into the PL design, along with an explicit conceptualisation of PL and change, and how to evaluate complex change processes in practice. In doing so, we believe the results of our rapid review provided high-quality and nuanced guidance for PL design.

Overall, we reflected that undertaking narrative synthesis through thematic analysis proved a viable method to bring together the essential findings of included studies with diverse research methodologies, and to enable comparison between them to better inform decisions. Given the nature of these interpretive methods, the need for transparency is critical. The involvement of at least two reviewers is important for quality synthesis: in our case, this also included the ongoing involvement of PL experts, who provided feedback on the generated themes and a nuanced interpretation of the results. The contribution of our experts is elaborated on in the following section.

Reflecting on stakeholder engagement

Stakeholder engagement involves seeking input on the review question(s), eligibility criteria, and outcomes – and, as highlighted earlier, begins with the development of the protocol. This collaboration speaks to the notion that ‘research questions are not owned by one perspective or world view such as government, ‘lay’ people – or academics’ (Gough and Thomas 2016, 99). For example, the synthesis of health evidence summaries by Khangura et al. (2012) was developed at the request of stakeholders, who also participated in defining the scope of the problem/question and the purpose of the summary, along with the practical application of the findings. Similarly, in the education field, the What Works Clearinghouse has assembled teams led by methodology and content experts who are involved from ‘protocol to product’, along with staff who have formal reviewer certification (see What Works Clearinghouse 2014). The role of key stakeholders (e.g. end-users such as health professionals and policymakers) is also emphasised in Garritty et al. (2020), for example, in ensuring that the review is fit for purpose from the start of the review process.

For our review, we employed the expertise of an external information specialist, and that of the internal team, which consisted of four researchers in the education sector, including two who had experience in developing, delivering, and researching school-based PL. As described in the preceding sections, our PL experts assisted in the following aspects of our review: (i) identifying core papers to develop keywords for our search strategy; (ii) refining the protocol (i.e. purpose, review question); and (iii) providing feedback on the review synthesis, along with insights into the limitations associated with PL design. Our initial and ongoing consultations with our PL experts provided substantive guidance and highlighted critical issues that were not otherwise captured through the technical aspects of the method. Their contributions were invaluable in ensuring a balance between academic rigour and relevance, supporting the streamlined review process (Gough, Oliver, and Thomas 2019). In all, the ongoing engagement of content-specific experts played a vital role in our rapid review process, providing expertise to shape the purpose, the conduct, and the outcomes of the rapid review.

Reflecting on the potential limitations

Consistent with the practices of all research, including research synthesis, there is a need to address the limitations of our rapid review and how these might affect the findings (Gough, Oliver, and Thomas 2017). For systematic reviews, limitations are commonly associated with the context and the application of findings (Gough, Oliver, and Thomas 2017). For rapid reviews, possible limitations also include the quality of evidence and potential biases at the primary study, review, and overview levels (Polanin, Maynard, and Dell 2017). In our rapid review (Cirkony et al. 2021), we acknowledged potential limitations associated, for example, with geographical restrictions (we included work published in English with emphasis on Australia, Canada, New Zealand, the UK, and the USA, thereby excluding work published in other languages/countries) and emphasised the need for transparency. We noted that such restrictions can present possible geographical and cultural bias, leading to contextual limitations of findings. To address this issue, Wolgemuth, Hicks, and Agosto (2017, 138) suggested that reviewers indicate the limitations of exclusionary decisions ‘to inform a more tentative, critical, and contextualised understanding of the terms and conclusions produced in research syntheses’. They thus highlighted the importance of reflecting on inclusion and exclusion decisions. While we acknowledged these issues within the limitations of our findings, there remained the risk that the findings of our review could be applied without due consideration of context.

Such decisions can have significant implications for policy. According to Suri (2020), there are ethical implications when systematic reviews of research are used to make policy decisions, particularly if findings are assumed to be representative of a larger or different population. Similar issues arise from the lack of contextual detail around the original individual primary studies included in each review, especially in terms of the supports needed for effective implementation. For example, there may be an expectation that a programme successful in one context will be equally successful in another, though its implementation in different contexts and at scale would likely require commensurate resourcing. Thus, in designing an appropriate sampling and search strategy, Suri (2020) suggested that systematic reviewers should carefully consider the impact of potential publication biases and search biases, and be transparent in their methodological designs and in communicating the limitations of the findings.

It is important to note that limitations remain associated with rapid reviews, even where the review process involved careful adoption and thoughtful adaptation of valuable methodological recommendations, such as those offered in Garritty et al. (2020) and Garritty et al. (2021). This emphasises, all the more, the need to embed explicit and transparent processes in the rapid review procedure. In conducting our rapid review on effective features of PL (Cirkony et al. 2021), we thus outlined our screening and selection process, and provided a detailed account of our synthesis methods. We also included a table of included studies as an appendix, with details about the review questions (e.g. study aims), the number of studies within each review, and the key findings for each study design. In the limitations section of the review, we outlined the limits on the application of our review findings, with respect to their applicability and their intended use for informing intervention design.

In short, the need to streamline processes in rapid reviews, due to limitations on time and resources, draws attention to the importance of providing a transparent account of methods and decisions, and of identifying the limitations of the findings. Our decision to limit geographical scope may have introduced geographical and cultural bias, with implications for contextual relevance, scaling, and the implementation of review findings.

Discussion

This paper reflects upon our experience of undertaking a rapid review of research on effective PL approaches to inform an education initiative (Cirkony et al. 2021). Rapid reviews are potentially valuable where time and resources are too limited for a full systematic review (Garritty et al. 2021; Gough, Oliver, and Thomas 2019). For rapid reviews to be helpful, though, these practical considerations need to be matched by methodological and ethical considerations. Our application of the rapid reviews interim guidance from the Cochrane rapid reviews methods group (Garritty et al. 2020) supplied rigour to the conduct of our review, supporting our approach to the key issues of conduct, synthesis, and reporting in education reviews (Polanin, Maynard, and Dell 2017). Drawing on the wider methodological literature in the education and health sectors was also helpful in informing our process. More specifically, we believe that our experience and reflections highlight three key considerations.

Firstly, there is a need to adapt guidance and recommendations developed for conducting reviews in other fields (e.g. health) to the context of the education sector. Based on our experience of carrying out the rapid review in an area of education, this involved modifying the application of the PICOS framework to allow for the breadth of educational studies, and using different databases and supplemental search strategies in order to locate relevant studies. Secondly, we found that recognising the role of reviewer expertise and judgement was key. During our review process, this became particularly important in assessing the quality of the papers and considering the relevance of high-quality studies to inform the specific educational initiative that constituted the wider context of our work. Alongside this, there was a need to identify the limitations of the included studies (e.g. geographical scope, cultural bias), which inevitably have implications for the implementation of review findings. Thirdly, there is a need to attend to collaborative review team processes. This involved the careful selection of the internal review team to ensure rigour in the selection, assessment, and synthesis of studies. It also involved engaging external expertise (such as an information specialist) to support the technical aspects of the search strategy.

In general, interest in rapid reviews in many areas of research is expected to grow (Garritty et al. 2021). Taken together, our reflections on the conduct of the rapid review we carried out offer implications and insights relevant for the practice, use, and commissioning of future rapid reviews in the sphere of education research. For researchers and other professionals conducting rapid reviews, the development of a protocol in collaboration with stakeholders (even if unpublished) is useful for refining the methodology prior to undertaking the review. Alongside this, there is a need to identify and engage relevant stakeholders and/or relevant experts to inform activities such as developing protocols and reviewing findings. Those conducting reviews need to be open to, and able to work with, the diverse research designs specific to the education sector (e.g. Wolgemuth, Hicks, and Agosto 2017). This involves drawing from databases that specialise in or prioritise systematic reviews in education, along with education-specific databases. It also necessitates the use of supplemental sources outside databases (e.g. online scholarly search engines, personal contacts, reference searches). As a consequence of expanding the search, there are additional considerations around the screening and appraisal processes.

The choice of appraisal tool must take into consideration both the quality of the studies and their fitness for purpose, requiring researcher expertise and judgement (e.g. Gough 2007; Hammersley 2020). The analysis and reporting need to involve at least two reviewers, along with maintaining and documenting transparency in the process. For implementation, the findings need to be viewed in relation to issues around potential geographical and cultural bias, contextual relevance, and scaling (e.g. Suri 2020). Finally, those conducting rapid reviews need to provide a transparent account of their overall methods and included studies, their decisions around methodology, and the limitations of their findings (e.g. Polanin, Maynard, and Dell 2017).

For those using the findings from rapid reviews in education (e.g. practitioners, school leaders, decision-makers), there is a need to consider the purpose, benefits, and limitations of rapid reviews. Compared with rapid reviews that draw on studies focused on impact (e.g. meta-analyses), education-specific configurative rapid reviews require that practitioners and other users in the education sector be aware of a broader range of evidence sources and methodologies and their potential to support a given research question, and consider whether they are appropriate for their context (e.g. Wolgemuth, Hicks, and Agosto 2017; Zawacki-Richter et al. 2020). Given that the contexts of the primary and secondary studies may be lost in a rapid review (Polanin, Maynard, and Dell 2017), users need to review carefully the findings of these syntheses in relation to their own contexts, to better understand how the findings might inform their own decisions and initiatives. This includes awareness of potential geographical and cultural bias, along with contextual relevance, particularly if the findings require specific supports for implementation or scaling (e.g. Suri 2020).

Many of these considerations also apply to those commissioning rapid reviews to inform initiatives in education (e.g. research funders and policymakers). As stakeholders, commissioners of such reviews need to consider the complexity of the education sector and undertake an active role in the conduct of the review. Following the example set out in the health sector (e.g. Khangura et al. 2012), those commissioning rapid reviews need to participate in the development of the protocol, including inputs into selection criteria around language, geographic scope, and context. They also need to review the analysis and findings to ensure fitness for purpose. It is critical that commissioners of reviews understand the need to balance resources, timelines, and rigour (Gough, Oliver, and Thomas 2019). In this way, using research to inform decisions can become ‘less of a mechanistic adoption of specific results but more of a dynamic engagement between decision-making and varying sources of research-based information’ (Gough and Thomas 2016, 95). This highlights the need for commissioners to seek out teams who have a deep understanding of the diverse approaches to research within education and the complexities involved in the application of research evidence to practice in education.

Conclusion

Our application of health sector rapid review interim guidance (Garritty et al. Citation2020) proved valuable in supporting the conduct of a methodologically rigorous rapid review for the education sector. Drawing, too, on methodological literature from the education and health sectors allowed us to reflect on our process and consider potential adjustments pertinent to the context of education. Overall, our reflections highlight three key considerations relevant to the conduct of rapid reviews in the education sector: the need to adapt recommendations and guidance so that they are maximally relevant to the context of education; the need to recognise the role of reviewer expertise and judgement in quality appraisal; and the need to attend to collaborative review team processes.

These reflections support the notion that the use of educational research to inform decisions in the education sector needs to be a dynamic, contextualised, and collaborative process. Against the backdrop of calls for evidence-based and evidence-informed practice in education, rapid reviews offer a practical and important addition to systematic reviews as part of the growing ‘evidence ecosystem’ on which researchers, practitioners and policymakers can draw to inform timely decisions (Gough, Oliver, and Thomas Citation2019, 3). Our rapid review experience suggests that adapting and adjusting existing rapid review methodological recommendations from areas outside education shows promise for enabling the timely identification of high-quality research, leading to the provision of evidence-based guidance to inform decisions and initiatives in education.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

The authors wish to acknowledge funding from the Paul Ramsay Foundation to undertake this research.

Notes

1. Garritty et al. (Citation2020) was the interim guidance available to us at the time we conducted our rapid review. Published in March 2020, it presents ‘provisional recommendations from the Cochrane Rapid Reviews Methods Group’ (Garritty et al. Citation2020, 2). See also the subsequent publication Garritty et al. (Citation2021) (published online in October 2020), in particular ‘Table 1 - Cochrane rapid review methods recommendations’ (17).

2. We use the term systematic review to refer to any effort to synthesise a body of literature using transparent and comprehensive methods, whether that literature includes studies that use quantitative or qualitative methods (Gough, Oliver, and Thomas Citation2017).

References

  • Alexander, P. A. 2020. “Methodological Guidance Paper: The Art and Science of Quality Systematic Reviews.” Review of Educational Research 90 (1): 6–23. doi:10.3102/0034654319854352.
  • Andrews, R., and W. Harlen. 2006. “Issues in Synthesizing Research in Education.” Educational Research 48 (3): 287–299. doi:10.1080/00131880600992330.
  • Aromataris, E., R. Fernandez, C. M. Godfrey, C. Holly, H. Khalil, and P. Tungpunkom. 2015. “Summarizing Systematic Reviews: Methodological Development, Conduct and Reporting of an Umbrella Review Approach.” JBI Evidence Implementation 13 (3): 132–140. doi:10.1097/XEB.0000000000000055.
  • Australian Government Productivity Commission. 2016. National Education Evidence Base: Productivity Commission Inquiry Report. Canberra: Commonwealth of Australia. https://www.pc.gov.au/inquiries/completed/education-evidence/report/education-evidence.pdf.
  • Biesta, G. 2007. “Why ‘What Works’ Won’t Work: Evidence‐based Practice and the Democratic Deficit in Educational Research.” Educational Theory 57 (1): 1–22. doi:10.1111/j.1741-5446.2006.00241.x.
  • British Educational Research Association. 2014. Research and the Teaching Profession: Building the Capacity for a Self-Improving Education System. Final Report of the BERA-RSA Inquiry into the Role of Research in Teacher Education. London: BERA. https://www.bera.ac.uk/wp-content/uploads/2013/12/BERA-RSA-Research-Teaching-Profession-FULL-REPORT-for-web.pdf.
  • Brown, C., and J. Malin, eds. 2022. The Emerald Handbook of Evidence-Informed Practice in Education. Bingley: Emerald.
  • Cirkony, C., M. Rickinson, L. Walsh, J. Gleeson, M. Salisbury, B. Cutler, M. Berry, and K. Smith. 2021. “Beyond Effective Approaches: A Rapid Review Response to Designing Professional Learning.” Professional Development in Education. Advance online publication. doi:10.1080/19415257.2021.1973075.
  • Cooke, A., D. Smith, and A. Booth. 2012. “Beyond PICO: The SPIDER Tool for Qualitative Evidence Synthesis.” Qualitative Health Research 22 (10): 1435–1443. doi:10.1177/1049732312452938.
  • Cooper, H., and A. C. Koenka. 2012. “The Overview of Reviews: Unique Challenges and Opportunities When Research Syntheses are the Principal Elements of New Integrative Scholarship.” The American Psychologist 67 (6): 446–462. doi:10.1037/a0027119.
  • Desimone, L. M. 2009. “Improving Impact Studies of Teachers’ Professional Development: Toward Better Conceptualizations and Measures.” Educational Researcher 38 (3): 181–199. doi:10.3102/0013189X08331140.
  • Dixon-Woods, M., S. Agarwal, D. Jones, B. Young, and A. Sutton. 2005. “Synthesising Qualitative and Quantitative Evidence: A Review of Possible Methods.” Journal of Health Services Research & Policy 10 (1): 45–53. doi:10.1177/135581960501000110.
  • Education Endowment Foundation. 2020a. Remote Learning, Rapid Evidence Assessment. London: Education Endowment Foundation. https://educationendowmentfoundation.org.uk/public/files/Remote_Learning_Rapid_Evidence_Assessment.pdf.
  • Education Endowment Foundation. 2020b. Remote Professional Development, Rapid Evidence Assessment. London: Education Endowment Foundation. https://d2tic4wvo1iusb.cloudfront.net/documents/guidance/Remote_PD_Evidence_Assessment.pdf?v=1629099920.
  • Ganann, R., D. Ciliska, and H. Thomas. 2010. “Expediting Systematic Reviews: Methods and Implications of Rapid Reviews.” Implementation Science 5 (1): 56. doi:10.1186/1748-5908-5-56.
  • Garritty, C., G. Gartlehner, C. Kamel, V. J. King, B. Nussbaumer-Streit, A. Stevens, C. Hamel, and L. Affengruber. 2020. Cochrane Rapid Reviews: Interim Guidance from the Cochrane Rapid Reviews Methods Group. London: Cochrane. https://methods.cochrane.org/rapidreviews/sites/methods.cochrane.org.rapidreviews/files/public/uploads/cochrane_rr_-_guidance-23mar2020-final.pdf.
  • Garritty, C., G. Gartlehner, B. Nussbaumer-Streit, V. J. King, C. Hamel, C. Kamel, L. Affengruber, and A. Stevens. 2021. “Cochrane Rapid Reviews Methods Group Offers Evidence-Informed Guidance to Conduct Rapid Reviews.” Journal of Clinical Epidemiology 130: 13–22. doi:10.1016/j.jclinepi.2020.10.007.
  • Gough, D. 2007. “Weight of Evidence: A Framework for the Appraisal of the Quality and Relevance of Evidence.” Research Papers in Education 22 (2): 213–228. doi:10.1080/02671520701296189.
  • Gough, D., S. Oliver, and J. Thomas, eds. 2017. An Introduction to Systematic Reviews. London: SAGE.
  • Gough, D., S. Oliver, and J. Thomas. 2019. “Systematic Reviews.” In SAGE Research Methods Foundations, edited by P. Atkinson, S. Delamont, A. Cernat, J. W. Sakshaug, and R. A. Williams, 1–20. London: SAGE. doi:10.4135/9781526421036857078.
  • Gough, D., and J. Thomas. 2016. “Systematic Reviews of Research in Education: Aims, Myths and Multiple Methods.” Review of Education 4 (1): 84–102. doi:10.1002/rev3.3068.
  • Greenhalgh, T., and R. Peacock. 2005. “Effectiveness and Efficiency of Search Methods in Systematic Reviews of Complex Evidence: Audit of Primary Sources.” The BMJ 331: 1064. doi:10.1136/bmj.38636.593461.68.
  • Hamel, C., A. Michaud, M. Thuku, B. Skidmore, A. Stevens, B. Nussbaumer-Streit, and C. Garritty. 2021. “Defining Rapid Reviews: A Systematic Scoping Review and Thematic Analysis of Definitions and Defining Characteristics of Rapid Reviews.” Journal of Clinical Epidemiology 129: 74–85. doi:10.1016/j.jclinepi.2020.09.041.
  • Hammersley, M. 2020. “Reflections on the Methodological Approach of Systematic Reviews.” In Systematic Reviews in Educational Research: Methodology, Perspectives and Application, edited by O. Zawacki-Richter, M. Kerres, S. Bedenlier, M. Bond, and K. Buntins, 23–39. Wiesbaden: Springer. doi:10.1007/978-3-658-27602-7_1.
  • Harker, J., and J. Kleijnen. 2012. “What is a Rapid Review? A Methodological Exploration of Rapid Reviews in Health Technology Assessments.” International Journal of Evidence-Based Healthcare 10 (4): 397–410. doi:10.1111/j.1744-1609.2012.00290.x.
  • Hattie, J., H. J. Rogers, and H. Swaminathan. 2014. “The Role of Meta-Analysis in Educational Research.” In A Companion to Research in Education, edited by A. D. Reid, E. P. Hart, and M. A. Peters, 197–207. Dordrecht: Springer.
  • Kennedy, M. M. 2016. “How Does Professional Development Improve Teaching?” Review of Educational Research 86 (4): 945–980. doi:10.3102/0034654315626800.
  • Khangura, S., K. Konnyu, R. Cushman, J. Grimshaw, and D. Moher. 2012. “Evidence Summaries: The Evolution of a Rapid Review Approach.” Systematic Reviews 1: 10. doi:10.1186/2046-4053-1-10.
  • Malin, J. R., C. Brown, G. Ion, I. van Ackeren, N. Bremm, R. Luzmore, J. Flood, and G. M. Rind. 2020. “World-Wide Barriers and Enablers to Achieving Evidence-Informed Practice in Education: What Can Be Learnt from Spain, England, the United States, and Germany?” Humanities and Social Sciences Communications 7 (1): 99. doi:10.1057/s41599-020-00587-8.
  • McChesney, K., and J. M. Aldridge. 2019. “What Gets in the Way? A New Conceptual Model for the Trajectory from Teacher Professional Development to Impact.” Professional Development in Education 47 (5): 834–852. doi:10.1080/19415257.2019.1667412.
  • Newman, M., and D. Gough. 2020. “Systematic Reviews in Educational Research: Methodology, Perspectives and Application.” In Systematic Reviews in Educational Research: Methodology, Perspectives and Application, edited by O. Zawacki-Richter, M. Kerres, S. Bedenlier, M. Bond, and K. Buntins, 3–22. Wiesbaden: Springer. doi:10.1007/978-3-658-27602-7_1.
  • Opfer, V. D., and D. Pedder. 2011. “Conceptualizing Teacher Professional Learning.” Review of Educational Research 81 (3): 376–407. doi:10.3102/0034654311413609.
  • Organisation for Economic Co-operation and Development (OECD). 2007. Evidence in Education: Linking Research and Policy. Paris: OECD. doi:10.1787/9789264033672-en.
  • Petticrew, M., and H. Roberts. 2003. “Evidence, Hierarchies, and Typologies: Horses for Courses.” Journal of Epidemiology and Community Health 57 (7): 527–529. doi:10.1136/jech.57.7.527.
  • Polanin, J. R., B. R. Maynard, and N. A. Dell. 2017. “Overviews in Education Research: A Systematic Review and Analysis.” Review of Educational Research 87 (1): 172–203. doi:10.3102/0034654316631117.
  • Pollock, M., R. M. Fernandes, L. A. Becker, D. Pieper, and L. Hartling. 2020. “Chapter V: Overviews of Reviews.” In Cochrane Handbook for Systematic Reviews of Interventions Version 6.0, edited by J. P. T. Higgins, J. Thomas, J. Chandler, M. Cumpston, T. Li, M. J. Page, and V. A. Welch. https://training.cochrane.org/handbook/current/chapter-v.
  • Popay, J., H. Roberts, A. Sowden, M. Petticrew, L. Arai, M. Rodgers, and N. Britten. 2006. Guidance on the Conduct of Narrative Synthesis in Systematic Reviews. A Product from the ESRC Methods Programme. Bailrigg: Lancaster University. https://www.lancaster.ac.uk/media/lancasteruniversity/contentassets/documents/fhm/dhr/chir/NSsynthesisguidanceVersion1-April2006.pdf.
  • Rickinson, M., C. Cirkony, L. Walsh, J. Gleeson, and M. Salisbury. 2020. Towards Quality Use of Research Evidence in Education: Discussion Paper. Melbourne: Monash University. doi:10.26180/14071571.
  • Rickinson, M., C. Cirkony, L. Walsh, J. Gleeson, M. Salisbury, and A. Boaz. 2021. “Insights from a Cross-Sector Review on How to Conceptualise the Quality of Use of Research Evidence.” Humanities and Social Sciences Communications 8 (1): 141. doi:10.1057/s41599-021-00821-x.
  • Rickinson, M., and H. May. 2009. A Comparative Study of Methodological Approaches to Reviewing Literature. York: The Higher Education Academy. https://www.heacademy.ac.uk/sites/default/files/resources/Comparativestudy.pdf.
  • Sampson, M., J. McGowan, E. Cogo, J. Grimshaw, D. Moher, and C. Lefebvre. 2009. “An Evidence-Based Practice Guideline for the Peer Review of Electronic Search Strategies.” Journal of Clinical Epidemiology 62 (9): 944–952. doi:10.1016/j.jclinepi.2008.10.012.
  • Suri, H. 2020. “Ethical Considerations of Conducting Systematic Reviews in Educational Research.” In Systematic Reviews in Educational Research: Methodology, Perspectives and Application, edited by O. Zawacki-Richter, M. Kerres, S. Bedenlier, M. Bond, and K. Buntins, 41–54. Wiesbaden: Springer. doi:10.1007/978-3-658-27602-7_1.
  • Timperley, H., A. Wilson, H. Barrar, and I. Fung. 2007. Teacher Professional Learning and Development: Best Evidence Synthesis Iteration (BES). Wellington: New Zealand Ministry of Education. https://thehub.swa.govt.nz/assets/documents/42432_TPLandDBESentireWeb_0.pdf.
  • Torrance, H. 2004. “Systematic Reviewing–the ‘Call Centre’ Version of Research Synthesis. Time for a More Flexible Approach.” Paper presented at the Economic and Social Research Council/Research Capacity Building Network Seminar on Systematic Reviewing, University of Sheffield, UK. June 24.
  • Tricco, A. C., J. Antony, W. Zarin, L. Strifler, M. Ghassemi, J. Ivory, L. Perrier, B. Hutton, D. Moher, and S. E. Straus. 2015. “A Scoping Review of Rapid Review Methods.” BMC Medicine 13 (1): 224. doi:10.1186/s12916-015-0465-6.
  • Watt, A., A. Cameron, L. Sturm, T. Lathlean, W. Babidge, S. Blamey, K. Facey, D. Hailey, I. Norderhaug, and G. Maddern. 2008. “Rapid versus Full Systematic Reviews: Validity in Clinical Practice?” ANZ Journal of Surgery 78 (11): 1037–1040. doi:10.1111/j.1445-2197.2008.04730.x.
  • What Works Clearinghouse. 2014. Procedures Handbook (Version 4.1). Washington: Institute of Education Sciences. https://ies.ed.gov/ncee/wwc/Docs/referenceresources/WWC-Standards-Handbook-v4-1-508.pdf.
  • White, H. 2020. “The Global Evidence Architecture in Health and Education: A Comparative Scorecard.” In Getting Evidence into Education: Evaluating the Routes to Policy and Practice, edited by S. Gorard, 20–33. Abingdon: Routledge.
  • White, S., B. Down, M. Mills, S. Shore, and A. Woods. 2021. “Strengthening a Research-Rich Teaching Profession: An Australian Study.” Teaching Education 32 (3): 338–352. doi:10.1080/10476210.2020.1737666.
  • Wolgemuth, J. R., T. Hicks, and V. Agosto. 2017. “Unpacking Assumptions in Research Synthesis: A Critical Construct Synthesis Approach.” Educational Researcher 46 (3): 131–139. doi:10.3102/0013189X17703946.
  • Yoon, K. S., T. Duncan, S. W. Lee, B. Scarloss, and K. L. Shapley. 2007. Reviewing the Evidence on How Teacher Professional Development Affects Student Achievement. Washington: U.S. Department of Education. https://ies.ed.gov/ncee/edlabs/regions/southwest/pdf/rel_2007033.pdf.
  • Zawacki-Richter, O., M. Kerres, S. Bedenlier, M. Bond, and K. Buntins. 2020. “Introduction: Systematic Reviews in Educational Research.” In Systematic Reviews in Educational Research: Methodology, Perspectives and Application, edited by O. Zawacki-Richter, M. Kerres, S. Bedenlier, M. Bond, and K. Buntins, v–xiv. Wiesbaden: Springer. doi:10.1007/978-3-658-27602-7_1.