
Systematic Reviews: A Social Work Perspective

Pages 284-295 | Accepted 16 Jan 2015, Published online: 21 Apr 2015

Abstract

Systematic reviews are gaining prominence and recognition as an important methodological approach to dealing with ever-growing amounts of research data, and recent years have seen the development of guidelines for both the conduct and reporting of systematic reviews. Initially, systematic reviews came to prominence as a method for synthesising data emerging from Randomised Controlled Trials (RCTs), but increasingly the term “systematic review” is being used in regard to reviews of studies of a wide range of research designs. However, among Australian social workers, utilisation and conduct of systematic reviews has been limited. This paper will explore the question of what a systematic review is, introduce some of the key issues in undertaking such a review, and explore the implications of the emergence of systematic reviews from a social work perspective.


There is a long-established tradition in the social sciences of authors reviewing the relevant literature and developing an argument around what they regard as the salient points (Littell, Citation2008). Indeed, it would be rare for a paper published in a journal such as Australian Social Work not to include some form of literature review. In its more traditional form, sometimes known as a “narrative review”, authors use literature to discuss or support a proposition, but with limited or no requirements for rigour or transparency in respect of their search techniques, identification of relevant studies, quality appraisal of published articles, and methods of synthesis.

However, since the mid-1990s and particularly over the last decade, what are known as systematic reviews have gained prominence and recognition as being an important methodological approach to dealing with ever growing amounts of data generated by research pertinent to health and human services practice (Dahabreh et al., Citation2013). These reviews can provide stakeholders—including policy makers, service providers, and service users—with an overview of often very complex literature around what is known about approaches to practice such as specific interventions (Hansen, Citation2014). Alternatively, findings of a review may be that there has been limited or no research on a subject (Shlonsky, Noonan, Littell, & Montgomery, Citation2011).

Systematic reviews have also found favour within academia and in some institutions it is now an expectation that higher degree research students will produce a systematic review of the existing literature to establish a rationale for their program of research (Pickering & Byrne, Citation2014). Furthermore, the establishment of the journal Research Synthesis Methods in 2010 and the now regular appearance of systematic reviews in social work journals such as Research on Social Work Practice, Journal of the Society for Social Work and Research, and British Journal of Social Work reflects the evolution of systematic reviews as a mainstream research methodology both generally and within social work in recent years. Nevertheless, this evolution has resulted in debates as to what is understood by the term “systematic review” and the relevance of these for a profession such as social work, and ongoing learning has led to changes in how reviews are commissioned, managed, and evaluated (Thomson, Russell, Becker, Klassen, & Hartling, Citation2010).

In 2003, the author was the lead author of the first of the 23 Knowledge Reviews published by the Social Care Institute for Excellence (SCIE) in England between 2003 and 2008. These sought to provide state-of-the-art reviews of current knowledge to guide the development of social work and social care practice in the United Kingdom (UK) (SCIE, Citation2014). Not only did SCIE commission reviews, but over time it also developed, and subsequently refined, guidelines that have sought to dispel the mystique in the social work community about systematic reviews by providing practical guidance on how to undertake them. Importantly, these guidelines recognised the research culture within UK social work at the time and how knowledge was understood within the profession (Rutter, Francis, Coren, & Fisher, Citation2010). Unfortunately, there is no organisation similar to SCIE in Australia that has been able to fund and support social work research on such a large scale, although several Australian organisations, for example VicHealth, have funded or conducted systematic reviews on issues of interest to social work practitioners.

In the decade since returning to Australia, my experience has been that while there are pockets of interest within the Australian social work community, particularly within the health sector, there is also much misunderstanding as to what a systematic review is, and doubt as to whether such reviews have a place in social work research and practice. This may well be changing, as Australian Social Work, which arguably acts as a barometer of social work research in Australia, has published a handful of reviews since 2011 described by their authors as systematic (Abdelkerim & Grace, Citation2012; Kiraly & Humphreys, Citation2013; O’Neal, Jackson, & McDermott, Citation2014; Skouteris et al., Citation2011; Trevor & Boddy, Citation2013). However, while the authors of each of these reviews have mentioned search criteria, what is reported and how it has been reported has varied considerably, and it is likely that some of these articles would not have met the guidelines for reporting reviews now required by some social work journals (Montgomery et al., Citation2013).

This paper thus has two key aims. The first is to explore the question of what a systematic review is, introduce some of the key issues in undertaking such a review, and explore the implications of the emergence of systematic reviews from an Australian social work perspective. The second is to provide a resource to Australian journals, including Australian Social Work, that may be considering the development of guidelines for the publication of systematic reviews in social work and related areas.

What is a Systematic Review?

For many people, a systematic review is synonymous with the work of the Cochrane Collaboration, an international collaboration founded in 1993 to produce systematic reviews in the field of healthcare interventions. However, in my experience, Australian social workers are often unaware of this resource that may support their practice. Recent Cochrane reviews of interest to social workers include “Efficacy and experiences of telephone counselling for informal carers of people with dementia” (Lins et al., Citation2014) and “Kinship care for the safety, permanency, and well-being of children removed from the home for maltreatment” (Winokur, Holtan, & Batchelder, Citation2014). Similarly, there is limited awareness of the Campbell Collaboration, founded in 2000 along similar lines to the Cochrane Collaboration but with a focus on systematic reviews of social interventions in fields such as education, criminal justice, and social welfare (Petrosino, Citation2013). Recent Campbell Collaboration reviews of interest to social workers include “Educational and skills-based interventions for preventing relationship and dating violence in adolescents and young adults” (Fellmeth, Heffernan, Nurse, Habibula, & Sethi, Citation2013) and “Interventions for promoting reintegration and reducing harmful behaviour and lifestyles in street-connected children and young people: A systematic review” (Coren et al., Citation2013).

Both Cochrane and Campbell collaborations have detailed and rigorous requirements that review authors must follow to have their review registered with one of these collaborations. These include the development of a detailed protocol, which is submitted for peer review prior to the commencement of the review. Researchers must then undertake rigorous searches for relevant studies, which align with the inclusion and exclusion criteria specified in the protocol. There are also strict guidelines for the preparation of manuscripts for publication and requirements for the peer review of manuscripts prior to publication (Shlonsky et al., Citation2011).

While the Cochrane and Campbell collaborations have produced what are arguably the best-known guidelines for conducting systematic reviews, others have been produced. For example, when SCIE funds systematic reviews it is a requirement that equity and diversity issues be considered. SCIE also strongly encourages that service users and carers be involved in the development of reviews rather than their involvement being merely an option for consideration (Rutter et al., Citation2010). As the degree of rigour pursued and the ethical requirements imposed vary between guidelines, researchers need to be aware of the strengths and limitations of the particular guidelines they adopt when conducting reviews.

Although reviews often include a wide range of research designs and data types (e.g., Lins et al., Citation2014), a common stereotype of systematic reviews is that they are only concerned with randomised controlled trials (RCTs) of an intervention in which the outcomes are presented as quantified measures (Petticrew, Citation2001) that can be subjected to meta-analysis. Meta-analysis involves combining findings from individual studies using complex statistical techniques, with the expectation that the pooled results of several studies will correct for statistical deviations from the norm that may occur in individual studies (Moher, Liberati, Tetzlaff, Altman, & the PRISMA Group, Citation2009).

It has sometimes been proposed that the undertaking of a systematic review requires a team of specialists, not only from the field of practice but also information specialists, statisticians, and researchers with expertise in conducting systematic reviews (Thomson et al., Citation2010). Such an approach is costly, with one recent estimate suggesting the cost of a systematic review is between US$50,000 and US$150,000 depending on the scope, complexity, and number of studies to be evaluated and included in analyses (Littell & Maynard, Citation2014). Furthermore, such rigour may not be compatible with policy and practice timetables (Thomas, McNaught, & Ananiadou, Citation2011, p. 12).

While some reviews may require such extensive expertise and resources, many universities now expect doctoral, and sometimes even honours, students to undertake a systematic review on some aspect of their research topic with limited financial resources and to adapt the process to the resources available. At least two of the reviews published by Australian Social Work (Abdelkerim & Grace, Citation2012; Trevor & Boddy, Citation2013) are the product of student research, and their publication has demonstrated what can be achieved even with limited resources. However, those whose understanding of a systematic review is closely aligned with those of the Cochrane and Campbell collaborations may not recognise some of the reviews published in this journal as being “systematic”. Abdelkerim and Grace recognised the complexities of systematic reviews and acknowledged that they used a less robust process. They claimed to have taken a rigorous approach to the identification and appraisal of potential studies for inclusion in their review but provided limited details about their methods. Furthermore, rather than reporting on all 50 articles that met the inclusion and exclusion criteria they had set, they highlighted findings from a subgroup of these articles with no clarification as to why those articles were selected, a practice more consistent with a narrative review than a systematic review. Similarly, Trevor and Boddy described a thorough search strategy and presented their key findings in narrative form but did not provide a full listing of the data which met their inclusion criteria. Although they indicated that much of the data they considered was not included in their review, whether the included data were appraised for quality is unclear.

Cost and timeliness concerns may lead to a decision to conduct a scoping review rather than a full systematic review, as these can be conducted much more rapidly and at much less expense. For example, many scoping reviews carried out on behalf of the Department of Health in the UK are conducted in 2–3 weeks. The search strategies tend to be less rigorous in that there is less emphasis on data quality (Levac, Colquhoun, & O’Brien, Citation2010): searching may be confined to fewer databases, grey literature may not be sought, or a single coder rather than multiple coders may assess each piece of evidence (Stansfield, Thomas, & Kavanagh, Citation2013). Scoping studies may be conducted in their own right to gain an overview of the breadth of the existing literature, including current gaps in knowledge, or to determine whether undertaking a much more extensive and rigorous review would be warranted (Arksey & O’Malley, Citation2005). A systematic method for conducting scoping studies has been developed by SCIE, which the authors note can be adapted according to circumstances (Clapton, Rutter, & Sharif, Citation2009).

Elements in a Systematic Review

Conducting systematic reviews involves a number of elements. These include: defining the research question; determining search methods and retrieving data using these methods; setting inclusion and exclusion criteria; and reporting the results. These will be discussed in turn.

Defining the Research Question

In any research one of the first tasks is to define the research question. Over the past decade there has been a growing expectation that systematic review methods need not be confined to questions about outcomes but can be applied to a wide range of research questions (Petticrew, Citation2001; Rutter et al., Citation2010; Shlonsky & Mildon, Citation2014). For example, Skouteris et al. (Citation2011) reviewed the presence of data about weight or obesity in research about children in out-of-home care and Trevor and Boddy (Citation2013) sought to identify literature about transgenderism in social work. Both of these reviews found very limited data to be available and concluded that further research is needed on these issues.

Search Methods and Data Retrieval

In a satire about the process of undertaking a systematic review the search methods reported were as follows:

We browsed the web a bit, sat around and chatted for an enjoyable weekend, asked a few people who are actually interested in the topic what they think, circulated drafts of this article to a few buddies, and made up the rest. (Oxman, Sackett, Chalmers, & Prescott, Citation2005, p. 564)

While the reader may laugh at such absurdity, this extract is perhaps closer to the truth than some researchers conducting narrative reviews may be willing to admit. To guard against such a process, one of the key features of a systematic review is the development of a detailed search protocol that makes explicit the process of data searching. Organisations such as the Cochrane Collaboration (e.g., Higgins & Green, Citation2011) and others that commission systematic reviews typically require the search methods to be specified in a protocol before the review commences (Rutter et al., Citation2010). If researchers are unsure as to the most appropriate search terms, pilot testing may be required to determine which terms to include in a search strategy.

Typically a search strategy begins with the identification of relevant bibliographic databases and search terms. While researchers in some disciplines might expect to find any relevant articles indexed in a single database, material of interest to social work researchers can be published in journals associated with a range of disciplines. Although many studies will be indexed in multiple databases, it is not uncommon for relevant studies to be found in only one of a set of health and social sciences databases (McFadden, Taylor, Campbell, & McQuilkin, Citation2012). Using a single search engine such as OVID or EBSCOhost may enable researchers to query multiple databases in a single search, but before deciding to utilise only a single search engine, they need to ascertain whether all the databases relevant to their search are included in it.

Furthermore, search strategies need to take into account differing terms used to describe the same phenomena (Pitt et al., Citation2013). For example, where the topic for review cuts across disciplines, different terminologies may be used (O’Mara-Eves et al., Citation2014). Similarly, Littell (Citation2006) reported that search strategies that worked for her UK colleagues needed adaptation in North America. Even between countries in which English is predominant, the language of social work can vary: for example, within social work education, what is referred to as “field education” in Australia is known as “practice learning” in the United Kingdom. Some commonly used words in social work also differ between British and North American spellings, including programme/program, behaviour/behavior, and counselling/counseling. Thus, while it might seem obvious, search strategies need to be able to take account of linguistic and spelling differences, as well as including both abbreviations and fully spelt-out versions of terms (Valentine, Cooper, Patall, Tyson, & Robinson, Citation2010).

An expectation of transparency may lead researchers to assume that online bibliographic searching using identified search terms will find any relevant study that could be considered for inclusion in a systematic review. However, unless the actual search terms appear in the title, abstract, or keywords, it is quite possible that relevant materials will not be found in electronic searches even if they have been indexed in a database (Fehrmann & Thomas, Citation2011). O’Mara-Eves et al. (Citation2014) claimed that, had their searching relied solely on screening titles and abstracts, 31% of the 319 studies on community engagement that met their criteria for systematic review would have been excluded. Techniques for finding additional studies include handsearching key journals to which researchers have full-text access (Glanville, Cikalo, Crawford, Dozier, & McIntosh, Citation2012); following up leads in the bibliographies of published work which might be relevant (Fehrmann & Thomas, Citation2011), including conducting author searches to see if those authors have published further articles within the scope of the review; and contacting authors thought to have relevant published work which was not identified by searches of the bibliographic databases (O’Mara-Eves et al., Citation2014). Crisp, Anderson, Orme, and Green Lister (Citation2003) wrote to the editors of the key journals identified as most likely to be publishing relevant papers related to their review for details of any papers accepted for publication, but with many journals now publishing accepted manuscripts online prior to final publication, contacting editors about forthcoming papers may no longer be necessary. Nevertheless, handsearching of forthcoming papers published online, and of recently published papers that have yet to be indexed, may be worthwhile.

Confining searches to bibliographic databases, which predominantly list research published in journals, can result in relevant research published in books being overlooked (Wilson, Citation2009). Bibliographic databases also tend to overlook “grey literature”, that is, information not published by commercial presses. This material might not have been subjected to peer review and is often produced for a specific audience associated with an organisation or association. For example, although some doctoral theses are indexed, others are not, and honours and masters theses are never indexed. Furthermore, in contexts where rapid or widespread dissemination of research is a higher priority than publication in peer-reviewed journals, grey literature may be an important source of data. This includes government reports and organisational reports of practitioner research, particularly when the research has been undertaken in clinical settings not affiliated with academic institutions (Gingerich & Peterson, Citation2013), where practitioners have little capacity, or indeed incentive, to publish their research in peer-reviewed journals (Mahood, Van Eerd, & Irvin, Citation2014). Grey literature may also include unpublished research reports, given that research which produces a significant or positive result is much more likely to be accepted for publication than studies where no significant difference, or indeed a negative result, has been found; this is known as “publication bias” (Norris et al., Citation2013).

Systematic searching takes considerable time (Glanville et al., Citation2012) and the extent of searching may be limited by financial and time constraints (Crisp et al., Citation2003). Moreover, it is recognised that complex search strategies, particularly those that involve iterative searches, can complicate the reporting of both the search method and the results of searches (Fehrmann & Thomas, Citation2011). Nevertheless, it is important that social workers undertaking systematic reviews develop search strategies which are appropriate for finding the data relevant to their reviews. When available, access to resources such as research librarians who can assist with searches, and bibliographic software to track and store search outputs, can make the systematic review process more manageable. If constraints prevented the researchers from undertaking searches that they believe might otherwise have yielded further information, these constraints should be reported.

Inclusion and Exclusion Criteria

One of the key ways for constraining search strategies is to have explicit inclusion and exclusion criteria. At a minimum, these should usually specify the Participants, Interventions, Comparisons to be undertaken, Outcomes, and Study design, which are often collectively known by the acronym “PICOS”. However, it is acknowledged that some of the PICOS elements may not be relevant in some reviews, for example, not all studies involve comparison groups. Other criteria could include date of publication, publication type and status, databases used, or language (Moher et al., Citation2009).

We know from medical research that, by using different criteria for selection of study populations and intervention outcomes, meta-analysts have come to contradictory conclusions even when much of the data they have used have overlapped (Ioannidis, Citation2010). A not uncommon inclusion criterion is that the material is written in English (O’Mara-Eves et al., Citation2014). Given the predominance of the English language in international publishing, and even more so in bibliographic databases (Hamel, Citation2007), in practice this may not have a critical impact on the findings reported in a systematic review. However, this assumes that any findings authors publish in other languages are no more or less favourable than those they publish in English, which may not be so (Egger et al., Citation1997).

Another common criterion for determining inclusion or exclusion is when a study was first published (Valentine et al., Citation2010). For example, changes over time in the social policy context, including legislation, policies, and institutional processes can all impact on the effectiveness of interventions (Higgins et al., Citation2013; Shlonsky & Mildon, Citation2014). Where practices or context differ substantially between countries, there may be an argument for restricting reviews to studies from a single country or countries which are ostensibly alike in respect of study characteristics. Nevertheless, this may result in findings that are less generalisable to other settings. On the other hand, reviews that draw on data from many contexts may produce findings that are too general to be useful in any specific context (Hannes & Harden, Citation2011).

Exclusion criteria are just as important as inclusion criteria and likewise may represent either practical or ideological concerns. For example, masters’ theses were excluded by O’Mara-Eves et al. (Citation2014) although it was unclear why they identified this criterion. However, even if masters’ theses were identified as relevant they may not have been readily obtainable (Littell, Citation2006).

Rigorous exclusion criteria based on assessments of data quality can potentially result in few, if any, studies being included in a review (Abdelkerim & Grace, Citation2012; Gustafsson et al., Citation2009). Others have argued that when data are scarce, it may be necessary not to exclude some studies on methodological grounds (Gray, Joy, Plath, & Webb, Citation2013). In respect of reviews concerning effectiveness of interventions, reviewers have often privileged randomised study designs on the basis of presumed data quality (Norris et al., Citation2013). However, there is an emerging point of view that nonrandomised trials may in fact better reflect how interventions are provided outside research settings (Valentine & Thompson, Citation2013), and that in some situations the only available data will be neither randomised nor even include a control group (Paulus et al., Citation2014).

The privileging of RCT data has resulted in the prioritisation of quantitative data by some reviewers (O’Neal et al., Citation2014). In recent years, however, there has been growing acceptance and utilisation of qualitative research in systematic reviews, although the legitimacy of doing so remains contested (Hannes & Harden, Citation2011; Lorenc, Pearson, Jamal, Cooper, & Garside, Citation2012). Furthermore, it is now not uncommon to find systematic reviews which report on both quantitative and qualitative data in separate analyses (e.g., Lins et al., Citation2014; van Velthoven, Brusamento, Majeed, & Car, Citation2013).

Finally, it is not sufficient simply to have inclusion and exclusion criteria; there must also be guidelines as to how these will be applied. To enhance reliability, it is generally recommended that decisions as to whether a study fits the inclusion and exclusion criteria be made by at least two independent raters who examine the studies identified by the search strategy. Each should separately determine eligibility and extract data relating to the study design, sample, interventions, attrition, outcomes, and findings (Littell, Citation2006). However, sometimes this is not possible, for example, due to resource constraints (Abdelkerim & Grace, Citation2012). When there are multiple raters, there also needs to be a strategy for resolving any discrepancies in ratings (Gray et al., Citation2013; Pitt et al., Citation2013).

Reporting

A proliferation of reporting guidelines for systematic reviews has emerged over the last decade, in response to concerns that the reporting of reviews was often inadequate (Mullins, DeLuca, Crepaz, & Lyles, Citation2014; Rutter et al., Citation2010). Nevertheless, there remain concerns that many social work publications have not adopted any of these standards on the conduct and reporting of reviews in respect of methodological transparency and rigour (Littell & Shlonsky, Citation2011). This may reflect a lack of awareness of these guidelines, a perception that guidelines which first emerged from medicine are not applicable in the social sciences, or both (Montgomery et al., Citation2013).

The Preferred Reporting Items for Systematic Reviews and Meta-analyses (PRISMA) statement is increasingly becoming the standard by which authors of reviews document their search methods. The PRISMA guidelines provide a checklist of 27 items that should be addressed when reporting a systematic review, clustered under the headings of Title, Abstract, Introduction, Methods, Results, Discussion, and Funding, and could equally serve as an invaluable template when designing a review. Within these clusters there may be items which do not apply to particular reviews; for example, one does not report items specifically relating to meta-analysis if one is undertaking a synthesis of qualitative data. Although the PRISMA guidelines were initially agreed on by both researchers and the editors of several medical journals (Moher et al., Citation2009), there is considerable variability among researchers as to how they should be interpreted (Rader, Mann, Stansfield, Cooper, & Sampson, Citation2014). Editorial requirements of journals can also place significant constraints on the reporting of systematic reviews (Littell, Citation2006), although higher impact journals such as Research on Social Work Practice and Journal of the Society for Social Work and Research are increasingly requiring a robust methodology for any literature review they publish, irrespective of the research question. The PRISMA guidelines recognise space limitations in journals by requiring a full search strategy for only one database (Moher et al., Citation2009), and with many journals now published online, the potential for publishing supplementary details electronically is emerging (e.g., McFadden, Campbell, & Taylor, Citation2014).

Issues for Social Work

While it has been argued that social work needs systematic reviews (Mullen & Shuluk, Citation2011; Shlonsky et al., Citation2011), concerns have also been expressed that social science disciplines, such as social work, may have jumped on the systematic review bandwagon without due regard to some of the methodological issues and limitations (Attanasio, Citation2014). Importantly, social work is underpinned by a philosophy that is concerned with the person-in-environment (Clark et al., Citation2014). This includes the social policy context, which can differ both historically and between locations (Hansen, Citation2014).

In respect of reviews concerning social work interventions, there is growing recognition that systematic reviews need to be able to deal with complexity but it is often difficult to disentangle actual interventions from the services that provide them (Pitt et al., Citation2013) or establish the impact of a particular intervention when, as is common in social work, there may be multiple simultaneous interventions (Thomson et al., Citation2010). Furthermore, in social work, there are interventions where RCTs would not be ethical (Gustafsson et al., Citation2009).

While the potential and difficulties with systematic reviews concerning evaluations of interventions have been widely articulated, there has been much less discussion about the potential for systematic reviews to make a contribution to other areas of social work research. High quality social work knowledge that enhances understanding of social problems and the needs of particular groups or communities is needed for effective program planning, and systematic reviews would seem to have great potential in this area.

If social workers are going to conduct systematic reviews, the reviews need to be framed in such a way that they recognise the nature of social work practice and contribute to the professional knowledge base (Gambrill, Citation2015). This includes feeding into the development and refinement of policy and practice guidelines (Mullen, Citation2014) which “interpret results to make knowledge of intervention effects available to a wide audience, including readers who are unfamiliar with the methodological and technical aspects of both primary research and research synthesis” (Littell, Citation2006, p. 446). However, even though reviewers may state an aim to influence policy or practice, often it is unclear who or what they are attempting to influence (Hannes & Harden, Citation2011). As with all research, dialogue at the nexus between research, policy, and practice will be crucial if systematic reviews are to make a significant impact on social work practice (de Leeuw, McNess, Crisp, & Stagnitti, Citation2008).

References

  • Abdelkerim, A., & Grace, M. (2012). Challenges to employment in newly emerging African communities in Australia: A review of the literature. Australian Social Work, 65(1), 104–119. doi:10.1080/0312407X.2011.616958
  • Arksey, H., & O’Malley, L. (2005). Scoping studies: Towards a methodological framework. The International Journal of Social Research Methodology, 8(1), 19–32. doi:10.1080/1364557032000119616
  • Attanasio, O. P. (2014). Evidence on public policy: Methodological issues, political issues and examples. Scandinavian Journal of Public Health, 42(Suppl. 13), 28–40. doi:10.1177/1403494813516717
  • Clapton, J., Rutter, D., & Sharif, N. (2009). SCIE systematic mapping guidance. London: Social Care Institute for Excellence. Retrieved September 17, 2014, from http://www.scie.org.uk/publications/researchresources/rr03.pdf
  • Clark, T. T., McGovern, P., Mgbeokwere, D., Wooten, N., Owusu, H., & McGraw, K. A. (2014). Systematic review: The nature and extent of social work research on substance use disorders treatment interventions among African Americans. Journal of Social Work, 14, 451–472. doi:10.1177/1468017313479858
  • Coren, E., Hossain, R., Pardo Pardo, J., Veras, M. M. S., Chakraborty, K., Harris, H., & Martin, A. J. (2013). Interventions for promoting reintegration and reducing harmful behaviour and lifestyles in street-connected children and young people: A systematic review. Campbell Systematic Reviews, 2013, 6. doi:10.4073/csr.2013.6
  • Crisp, B. R., Anderson, M. R., Orme, J., & Green Lister, P. (2003). Knowledge review 1: Learning and teaching in social work education: Assessment. Bristol: Policy Press.
  • Dahabreh, I. J., Chung, M., Kitsios, G. D., Terasawa, T., Raman, G., Tatsioni, A., … Schmid, C. H. (2013). Survey of the methods and reporting practices in published meta-analyses of test performance: 1987 to 2009. Research Synthesis Methods, 4, 242–255.
  • de Leeuw, E., McNess, A., Crisp, B., & Stagnitti, K. (2008). Theoretical reflections on the nexus between research, policy and practice. Critical Public Health, 18(1), 5–20. doi:10.1080/09581590801949924
  • Egger, M., Zellweger-Zähner, T., Schneider, M., Junker, C., Lengeler, C., & Antes, G. (1997). Language bias in randomised controlled trials published in English and German. The Lancet, 350, 326–329. doi:10.1016/S0140-6736(97)02419-7
  • Fehrmann, P., & Thomas, J. (2011). Comprehensive computer searches and reporting in systematic reviews. Research Synthesis Methods, 2(1), 15–32. doi:10.1002/jrsm.31
  • Fellmeth, G. L. T., Heffernan, C., Nurse, J., Habibula, S., & Sethi, D. (2013). Educational and skills-based interventions for preventing relationship and dating violence in adolescents and young adults: A systematic review. Campbell Systematic Reviews, 2013, 14. doi:10.4073/csr.2013.14
  • Gambrill, E. (2015). Avoidable ignorance and the role of Cochrane and Campbell reviews. Research on Social Work Practice, 25(1), 147–163. doi:10.1177/1049731514533731
  • Gingerich, W. J., & Peterson, L. T. (2013). Effectiveness of solution-focused brief therapy: A systematic review of controlled outcome studies. Research on Social Work Practice, 23, 266–283.
  • Glanville, J., Cikalo, M., Crawford, F., Dozier, M., & McIntosh, H. (2012). Handsearching did not yield additional unique FDG-PET diagnostic test accuracy studies compared with electronic searches: A preliminary investigation. Research Synthesis Methods, 3, 202–213. doi:10.1002/jrsm.1046
  • Gray, M., Joy, E., Plath, D., & Webb, S. A. (2013). Implementing evidence-based practice: A review of the empirical literature. Research on Social Work Practice, 23, 157–166. doi:10.1177/1049731512467072
  • Gustafsson, C., Öjehagen, A., Hansson, L., Sandlund, M., Nyström, M., Glad, J., … Fredrikkson, M. (2009). Effects of psychosocial interventions for people with intellectual disabilities and mental health problems: A survey of systematic reviews. Research on Social Work Practice, 19, 281–290. doi:10.1177/1049731508329403
  • Hamel, R. E. (2007). The dominance of English in the international scientific periodical literature and the future of language use in science. AILA Review, 20, 53–71. doi:10.1075/aila.20.06ham
  • Hannes, K., & Harden, A. (2011). Multi-context versus context-specific qualitative evidence synthesis: Combining the best of both. Research Synthesis Methods, 2, 271–278. doi:10.1002/jrsm.55
  • Hansen, H. F. (2014). Organisation of evidence-based knowledge production: Evidence hierarchies and evidence typologies. Scandinavian Journal of Public Health, 42(Suppl. 13), 11–17. doi:10.1177/1403494813516715
  • Higgins, J. P. T., & Green, S. (Eds.). (2011). Cochrane handbook for systematic reviews of interventions, Version 5.1. Retrieved June 20, 2014, from http://handbook.cochrane.org
  • Higgins, J. P. T., Ramsay, C., Reeves, B. C., Deeks, J. J., Shea, B., Valentine, J. C., … Wells, G. (2013). Issues relating to study design and risk of bias when including non-randomized studies in systematic reviews on the effects of interventions. Research Synthesis Methods, 4(1), 12–25. doi:10.1002/jrsm.1056
  • Ioannidis, J. P. A. (2010). Meta-research: The art of getting it wrong. Research Synthesis Methods, 1, 169–184. doi:10.1002/jrsm.19
  • Kiraly, M., & Humphreys, C. (2013). Family contact for children in kinship care: A literature review. Australian Social Work, 66, 358–374. doi:10.1080/0312407X.2013.812129
  • Levac, D., Colquhoun, H., & O’Brien, K. K. (2010). Scoping studies: Advancing the methodology. Implementation Science, 5, 69. doi:10.1186/1748-5908-5-69
  • Lins, S., Hayder-Beichel, D., Rücker, G., Motschall, E., Antes, G., Meyer, G., & Langer, G. (2014). Efficacy and experiences of telephone counselling for informal carers of people with dementia. Cochrane Database of Systematic Reviews, 9, CD009126. doi:10.1002/14651858.CD009126.pub2
  • Littell, J. H. (2006). Lessons from a systematic review of effects of multisystemic therapy. Children and Youth Services Review, 27, 445–463. doi:10.1016/j.childyouth.2004.11.009
  • Littell, J. H. (2008). Evidence-based or biased? The quality of published reviews of evidence-based practices. Children and Youth Services Review, 30, 1299–1317. doi:10.1016/j.childyouth.2008.04.001
  • Littell, J. H., & Maynard, B. R. (2014, January 16). Systematic review methods: The science of research synthesis. Paper presented at the Society of Social Work Research Conference, San Antonio, TX. Retrieved June 8, 2014, from http://www.sswr.org/RMW-2.pdf
  • Littell, J. H., & Shlonsky, A. (2011). Making sense of meta-analysis: A critique of “effectiveness of long-term psychodynamic psychotherapy”. Clinical Social Work Journal, 39, 340–346. doi:10.1007/s10615-010-0308-z
  • Lorenc, T., Pearson, M., Jamal, F., Cooper, C., & Garside, R. (2012). The role of systematic review of qualitative evidence in evaluating interventions: A case study. Research Synthesis Methods, 3, 1–10.
  • Mahood, Q., Van Eerd, D., & Irvin, E. (2014). Searching for grey literature for systematic reviews: Challenges and benefits. Research Synthesis Methods, 5, 221–234. doi:10.1002/jrsm.1106
  • McFadden, P., Campbell, A., & Taylor, B. J. (2014). Resilience and burnout in child protection social work: Individual and organisational themes from a systematic literature review. British Journal of Social Work. Advance online publication. doi:10.1093/bjsw/bct210
  • McFadden, P., Taylor, B. J., Campbell, A., & McQuilkin, J. (2012). Systematically identifying relevant research: Case study on child protection social workers’ resilience. Research on Social Work Practice, 22, 626–636. doi:10.1177/1049731512453209
  • Moher, D., Liberati, A., Tetzlaff, J., Altman, D. G., & the PRISMA Group. (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. Annals of Internal Medicine, 151, 264–270. doi:10.7326/0003-4819-151-4-200908180-00135
  • Montgomery, P., Grant, S., Hopewell, S., Macdonald, G., Moher, D., & Mayo-Wilson, E. (2013). Developing a reporting guideline for social and psychological intervention trials. British Journal of Social Work, 43, 1024–1038. doi:10.1093/bjsw/bct129
  • Mullen, E. J. (2014). Evidence-based knowledge in the context of social practice. Scandinavian Journal of Public Health, 42(Suppl. 13), 59–73. doi:10.1177/1403494813516714
  • Mullen, E. J., & Shuluk, J. (2011). Outcomes of social work intervention in the context of evidence-based practice. Journal of Social Work, 11, 49–63. doi:10.1177/1468017310381309
  • Mullins, M. M., DeLuca, J. B., Crepaz, N., & Lyles, C. M. (2014). Reporting quality of search methods in systematic reviews of HIV behavioural interventions (2000–2010): Are the searches clearly explained, systematic and reproducible? Research Synthesis Methods, 5, 116–130. doi:10.1002/jrsm.1098
  • Norris, S. L., Moher, D., Reeves, B. C., Shea, B., Loke, Y., Garner, S., … Wells, G. (2013). Issues relating to selective reporting when including non-randomized studies in systematic reviews on the effects of healthcare intervention. Research Synthesis Methods, 4(1), 36–47. doi:10.1002/jrsm.1062
  • O’Mara-Eves, A., Brunton, G., McDaid, D., Kavanagh, J., Oliver, S., & Thomas, J. (2014). Techniques for identifying cross-disciplinary and “hard to detect” evidence for systematic review. Research Synthesis Methods, 5, 50–59.
  • O’Neal, P., Jackson, A., & McDermott, F. (2014). A review of the efficacy and effectiveness of cognitive-behaviour therapy and short-term psychodynamic therapy in the treatment of major depression: Implications for mental health social work practice. Australian Social Work, 67, 197–213.
  • Oxman, A. D., Sackett, D. L., Chalmers, I., & Prescott, T. E. (2005). A surrealistic mega-analysis of redisorganization theories. Journal of the Royal Society of Medicine, 98, 563–568. doi:10.1258/jrsm.98.12.563
  • Paulus, J. K., Dahabreh, I. J., Balk, E. M., Avendano, E. E., Lau, J., & Ip, S. (2014). Opportunities and challenges in using studies without a control group comparative effectiveness reviews. Research Synthesis Methods, 5(2), 152–161. doi:10.1002/jrsm.1101
  • Petrosino, A. (2013). Reflections on the genesis of the Campbell Collaboration. The Experimental Criminologist, 8(2), 9–12.
  • Petticrew, M. (2001). Systematic reviews from astronomy to zoology: Myths and misconceptions. British Medical Journal, 322(7278), 98–101. doi:10.1136/bmj.322.7278.98
  • Pickering, C., & Byrne, J. (2014). The benefits of publishing systematic quantitative literature reviews for PhD candidates and other early-career researchers. Higher Education Research and Development, 33, 534–548. doi:10.1080/07294360.2013.841651
  • Pitt, V. J., Lowe, D., Prictor, M., Hetrick, S., Ryan, R., Berends, L., & Hill, S. (2013). A systematic review of consumer-providers’ effects on client outcomes in statutory mental health services: The evidence and the path beyond. Journal for the Society of Social Work Research, 4, 333–356. doi:10.5243/jsswr.2013.21
  • Rader, T., Mann, M., Stansfield, C., Cooper, C., & Sampson, M. (2014). Methods for documenting systematic review searches: A discussion of common issues. Research Synthesis Methods, 5(2), 98–115. doi:10.1002/jrsm.1097
  • Rutter, D., Francis, J., Coren, E., & Fisher, M. (2010). SCIE systematic reviews: Guidelines (2nd ed.). London: Social Care Institute for Excellence. Retrieved June 20, 2014, from http://www.scie.org.uk/publications/researchresources/rr01.pdf
  • SCIE. (2014). SCIE knowledge reviews. Retrieved September 14, 2014, from http://www.scie.org.uk/publications/knowledgereviews/index.asp
  • Shlonsky, A., & Mildon, R. (2014). Methodological pluralism in the age of evidence-informed practice and policy. Scandinavian Journal of Public Health, 42(Suppl. 13), 18–27. doi:10.1177/1403494813516716
  • Shlonsky, A., Noonan, E., Littell, J. H., & Montgomery, P. (2011). The role of systematic reviews and the Campbell Collaboration in the realization of evidence-informed practice. Clinical Social Work Journal, 39, 362–368. doi:10.1007/s10615-010-0307-0
  • Skouteris, H., McCabe, M., Fuller-Tyszkiewicz, M., Henwood, A., Limbrick, S., & Miller, R. (2011). Obesity in children in out-of-home care: A review of the literature. Australian Social Work, 64, 475–486. doi:10.1080/0312407X.2011.574145
  • Stansfield, C., Thomas, J., & Kavanagh, J. (2013). Clustering documents automatically to support scoping reviews of research: A case study. Research Synthesis Methods, 4, 230–241.
  • Thomas, J., McNaught, J., & Ananiadou, S. (2011). Applications of text mining within systematic reviews. Research Synthesis Methods, 2(1), 1–14. doi:10.1002/jrsm.27
  • Thomson, D., Russell, K., Becker, L., Klassen, T., & Hartling, L. (2010). The evolution of a new publication type: Steps and challenges of producing overviews of reviews. Research Synthesis Methods, 1, 198–211. doi:10.1002/jrsm.30
  • Trevor, M., & Boddy, J. (2013). Transgenderism and Australian social work: A literature review. Australian Social Work, 66, 555–570. doi:10.1080/0312407X.2013.829112
  • Valentine, J. C., Cooper, H., Patall, E. A., Tyson, D., & Robinson, J. C. (2010). A method for evaluating research syntheses: The quality, conclusions, and consensus of 12 syntheses of the effects of after-school programs. Research Synthesis Methods, 1(1), 20–38. doi:10.1002/jrsm.3
  • Valentine, J. C., & Thompson, S. G. (2013). Issues relating to confounding and meta-analysis when including non-randomized studies in systematic reviews on the effects of interventions. Research Synthesis Methods, 4(1), 26–35. doi:10.1002/jrsm.1064
  • van Velthoven, M. H. M. M. T., Brusamento, S., Majeed, A., & Car, J. (2013). Scope and effectiveness of mobile phone messaging for HIV/AIDS care: A systematic review. Psychology, Health & Medicine, 18, 182–202. doi:10.1080/13548506.2012.701310
  • Wilson, D. B. (2009). Missing a critical piece of the pie: Simple document search strategies inadequate for systematic reviews. Journal of Experimental Criminology, 5, 429–440. doi:10.1007/s11292-009-9085-5
  • Winokur, M., Holtan, A., & Batchelder, K. E. (2014). Kinship care for the safety, permanency, and well-being of children removed from the home for maltreatment. Cochrane Database of Systematic Reviews, 1, CD006546. doi:10.1002/14651858.CD006546.pub3
