Review

How to prepare a systematic review of economic evaluations for clinical practice guidelines: database selection and search strategy development (part 2/3)

Pages 705-721 | Received 26 Apr 2016, Accepted 07 Oct 2016, Published online: 02 Nov 2016

ABSTRACT

Introduction: This article is part of the series “How to prepare a systematic review of economic evaluations (EEs) for informing evidence-based healthcare decisions”, in which a five-step approach is proposed.

Areas covered: This paper focuses on the selection of relevant databases and developing a search strategy for detecting EEs, as well as on how to perform the search and how to extract relevant data from retrieved records.

Expert commentary: Thus far, little has been published on how to conduct systematic reviews of economic evaluations (SR-EEs). Moreover, reliable sources of information, such as the Health Economic Evaluation Database, have ceased to publish updates. Researchers are thus left without authoritative guidance on how to conduct SR-EEs. Together with van Mastrigt et al., we seek to fill this gap.

1. Introduction

To support their decisions in health care, policy and decision makers need reliable information on the cost-effectiveness of health care interventions [Citation1]. Systematic reviews of economic evaluations (SR-EEs) are a source of this information [Citation2]. However, although these reviews have become increasingly important, little has been published on how to perform SR-EEs [Citation3]. Without such guidance, those who wish to perform SR-EEs are left with practice guidance and recommendations that focus solely on medical efficacy research, which is usually concerned only superficially – if at all – with economic outcomes.

The vast amount of publications and their widely differing quality, together with subjective components that may guide a searcher’s decision, call for standardized methods [Citation4]. Therefore, a carefully planned strategy is essential when a thoroughly conducted SR is the goal [Citation5]. Moreover, SRs should be reproducible, verifiable, efficient, and accountable [Citation2,Citation6,Citation7].

With a five-step approach on how to perform SR-EEs of health-care interventions, van Mastrigt and colleagues [Citation8] make a first attempt to fill the gap that has occurred in the absence of both guidance and reliable and comprehensive economic databases. Their goal is to pave the way in establishing future guidance for SR-EEs. In the meantime, their approach can be used as a preliminary manual for performing SR-EEs in a sound scientific way. Their guidance aids users in employing efficient and transparent methods, which are central to any SR [Citation2]. Just as for part 1/3 of this paper series, this article’s main target audience is developers of clinical practice guidelines (CPGs) who need a point of reference on how to perform SR-EEs. Similarly, it can be a helpful tool for researchers in health technology assessment, systematic reviewers, and for students who seek to prepare an SR-EE. To illustrate the case, we will discuss our theoretical considerations alongside a recent example of an SR-EE that was part of developing a CPG for the treatment of epilepsy in The Netherlands [Citation9].

2. Background

Typically, evidence for a CPG is gathered by systematically reviewing publications that are concerned with the effectiveness of different treatment options [Citation10]. In addition, it has become increasingly acknowledged that CPGs should also entail economic evidence [Citation11–Citation13]. This can be done in two, not necessarily independent, ways: (1) an SR and critical summary of the economic evidence already published is undertaken or (2) a decision analytic model is built to model economic effects [Citation2]. This article will focus solely on the former approach.

In general, most steps of an SR-EE involve the same stages that are needed to conduct an SR of evidence for clinical effectiveness [Citation2]. More specifically, any SR-EE will be based on the same two-stage process that has become the established standard for SRs of effects [Citation2], namely: (1) developing a search strategy and (2) applying the search strategy to a set of specified databases [Citation14]. However, some methods of SR-EEs diverge significantly as economic outcomes replace effectiveness or safety outcomes that would be detected in SRs [Citation2]. As a result, database selection as well as the identification of search terms and filters differs. However, guidance on how to extend a search strategy and what databases to use when seeking to incorporate EEs is scarce, fragmented, or not applicable to all cases. In this article, we will present solutions for overcoming these issues, based on published guidance in the field and our experience.

3. The five-step approach for preparing an SR-EE

Following van Mastrigt’s approach for conducting SR-EEs, the first step is to compose a multidisciplinary project team, frame the study, prioritize the topics, and write and publish the protocol. With regard to the subsequent steps, it should be noted that adding a medical information specialist or librarian to the search team adds great value to the quality of the searches [Citation15].

In the second step, EEs need to be identified; this includes (1) selecting relevant databases, (2) developing an adequate search strategy, (3) performing the searches, and (4) selecting the relevant studies.

This article will provide a more detailed description of these four parts of the second step, while step 3 is described in more detail by Wijnen et al. [Citation16]. An overview of all other steps and a detailed description of steps 1, 4, and 5 can be found in van Mastrigt et al. [Citation8]. For an overview of the five-step approach, see Figure 1.

Figure 1. An overview of the 5-step approach for preparing a systematic review of economic evaluations to inform evidence-based decisions. *Described in detail by van Mastrigt et al. [Citation8], **Described in detail by Wijnen et al. [Citation16]


4. Step 2.1 of the overall framework: selection of relevant data sources

Until recently, a large part of EEs in health care could be detected by searching databases that specifically focus on these evaluations, such as the U.K. National Health Service Economic Evaluation Database (NHS EED) and the Health Economic Evaluation Database (HEED). However, HEED ceased publication at the end of 2014 and is no longer accessible for searches [Citation17]. And, although still accessible through the Cochrane Library and the Centre for Reviews and Dissemination (CRD) website, the NHS EED has not been updated since March 2015 [Citation18].

Many databases can be accessed via different search providers and platforms, and these pose varying requirements for a search strategy. Most end users will access well-known standard biomedical databases such as MEDLINE or Embase [Citation1]. Apart from the question of whether all EEs are indexed in these databases, records can be indexed inconsistently, and there is no uniform interpretation of the definition of EEs [Citation3]. In addition to electronic bibliographic databases, other resources such as gray literature, research registries, or web pages may contain useful information. Also, registries of unpublished studies can be searched, and researchers can be contacted for additional data.

No database is comprehensive enough to cover all relevant published research [Citation19]. Therefore, the general consensus for effectiveness research is that several databases need to be searched for a comprehensive result [Citation20–Citation24]. Guidelines for SRs recommend searching at least two bibliographic databases [Citation25,Citation26], although there is no agreed-on standard for how many should be searched [Citation27]. As the number of searched databases increases, database bias (the probability that whether a record is indexed in a specific database depends on its results) and potential language bias can be reduced [Citation28]. Which databases should be selected for a review depends heavily on the study objectives [Citation27], and there is no consensus about this either [Citation29]. Being aware of how each database interface works is essential, since search results may well vary if the same database is searched through different interfaces (e.g. searching MEDLINE via PubMed or via Ovid) [Citation29].

4.1. Electronic databases for searching EEs

Backed by an extensive amount of evidence [Citation30–Citation39], Mathes et al. [Citation40] recommend searching at least MEDLINE and Embase for SR-EEs. In addition, they suggest searching one health economic database, such as HEED or NHS EED. Likewise, the Cochrane Handbook [Citation39], the manual for developing National Institute for Health and Care Excellence (NICE) guidelines [Citation10], and the website of the Campbell and Cochrane Economics Methods Group (CCEMG) [Citation41] emphasize the use of the NHS EED when searching for economic evidence for SRs. However, as HEED is no longer available and the NHS EED is no longer updated, this advice is obsolete.

4.2. Gray literature

Gray literature (i.e. technical reports, studies, or essays that are unpublished, have restricted distribution, and are therefore rarely included in bibliographic retrieval systems) [Citation42] has the potential to add valuable information to an SR-EE, especially when little is known about the topic under study. Although finding and including gray literature is particularly time-consuming and difficult, it is regarded as necessary for minimizing bias in reviews [Citation43]. When considering gray-literature trials for inclusion, Hopewell et al. [Citation43] recommend contacting their authors for more information. Examples of missing information could, for instance, be values for the standard deviation or variance when only the mean or median is reported.

The CRD health technology assessment database [Citation44] identifies gray literature.

4.3. Citation searching

In citation searching, the reviewers search for articles that have cited a set of relevant articles which have already been detected [Citation27]. For example, this can be done on the Science Citation Index Expanded™ (Thomson Reuters, United States) [Citation45], via the Web of Science™. Citation searching can also include reference checking. Here, the reviewers can scan the reference lists of useful records previously identified to see if they refer to as yet unknown articles.

4.4. Classification of databases

We classified several databases and websites into three categories, based on their ability to detect EEs in health care; these three categories are (1) basic, (2) specific, and (3) optional. For a complete but non-exhaustive list, see . The choice of databases is independent of whether the purpose is to conduct a multipurpose review or to develop a new CPG.

  1. Basic databases: We refer to ‘basic databases’ as those that are recommended in any case when performing SR-EEs. With a well-constructed search strategy, they will detect most relevant EEs.

  2. Specific databases: For an SR on a topic for which a specific database is available, we recommend using it. Specific databases are those that provide information primarily in a particular research field. An SR on a mental health topic for instance would benefit from searches performed on PsycINFO (American Psychological Association, United States) [Citation46,Citation47].

  3. Optional databases and websites: Under the category of ‘optional databases,’ we grouped databases and web pages that may hold additional information relevant for a more comprehensive SR. For example, optional databases will identify Health Technology Assessment (HTA) reports (Canadian Agency for Drugs and Technologies in Health [CADTH] HTA database) and conference proceedings (International Society For Pharmacoeconomics and Outcomes Research (ISPOR) website or the Cochrane Colloquium). Furthermore, trial registries may provide an outlook on what studies are currently being performed and may provide further evidence in the near future.

Until a new EE database becomes available, we recommend searching at least the basic databases MEDLINE [Citation48], Embase [Citation49], NHS EED [Citation44], EconLit (EBSCO) [Citation50], and Web of Science [Citation51], bearing in mind that the NHS EED stopped updating in March 2015. If applicable, a search of a more disease-specific database may be necessary. As many optional databases as feasible should be added.

5. Step 2.2 of the overall framework: development of a search strategy

Developing an entirely new comprehensive search strategy (i.e. a string of search terms) is a time-consuming effort that depends highly on the reviewer’s experience. The time needed for developing and testing such a strategy is reported to be around 20 h [Citation52] for experienced reviewers. It needs to be noted that this estimate also entails testing the strategy against a so-called ‘gold standard’ (i.e. a known set that entails all relevant publications) [Citation4]. However, it is not necessary to develop and test a search strategy from scratch for every new SR-EE. When designing a comprehensive search strategy, it is advisable to enlist the help of a biomedical information specialist, available at many universities [Citation6,Citation15,Citation53]. Considerable work has been done to support researchers in detecting relevant articles for SRs concerning the effectiveness of treatment and diagnostics. However, little has been published on empirically validated search strategies for EEs [Citation1]. In general, a successful search strategy is regarded as one that delivers a manageable number of references with a searcher-specified balance of sensitivity and precision [Citation22]. What counts as manageable obviously depends on the size and expertise of the review team. When making use of predefined methods for screening, researchers other than information specialists screened a median of 296 articles per hour [Citation54].

5.1. Important elements in a comprehensive search strategy

In searching literature databases, a search strategy typically makes use of different search terms that are related to elements in the research question. With a so-called ‘conceptual approach’ (also known as a ‘conventional approach’ [Citation55]), different information sources are used to identify relevant terms and their synonyms [Citation56]. Several databases offer the possibility to employ medical subject headings (referred to as MeSH® terms in e.g. PubMed®) or Emtree® terms (Embase®). Both MeSH and Emtree are controlled vocabularies and hence serve as thesauri used to index biomedical literature in the respective databases. For a comparison of MeSH® and Emtree®, see [Citation57].

Search filters are defined as a collection of search terms based on research and validated against a so-called ‘gold standard’ (i.e. a known set of relevant records) [Citation4], used to identify certain types of records, often for very broad topics [Citation4,Citation58]. They are regarded as a time-saving ‘ready-made solution’, leaving searchers ‘free to concentrate on the other aspects of the search’ [Citation19]. Hence, they improve both the efficiency and effectiveness of searches [Citation4].

Although there seems to be no consensus on how to set up a good search filter, filters can be tested for their quality in terms of (1) sensitivity, (2) specificity, (3) precision, and (4) accuracy [Citation4] (see Table 1). Sensitivity is defined as the proportion of relevant records that are retrieved; specificity is the proportion of irrelevant (low-quality or off-topic) records that are not retrieved; precision is the proportion of retrieved records that are relevant; accuracy is the proportion of all records that are correctly classified [Citation59]. While the general aim should be to maximize sensitivity [Citation14], a high level of precision is needed to meet the requirements of guideline developers and HTA researchers and to prepare scoping or rapid reviews [Citation60]. It should be noted that achieving a high degree of sensitivity is often associated with a lowering of precision and vice versa [Citation3,Citation14,Citation60–Citation62].

Table 1. Calculation of sensitivity, precision, and specificity for the evaluation of search filters.
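The four measures follow directly from a 2 × 2 classification of records against a gold-standard set. A minimal sketch of the calculation (the counts used here are hypothetical):

```python
def filter_metrics(tp, fp, fn, tn):
    """Evaluate a search filter from a 2x2 classification of records.

    tp: relevant records retrieved      fp: irrelevant records retrieved
    fn: relevant records missed         tn: irrelevant records not retrieved
    """
    return {
        "sensitivity": tp / (tp + fn),            # share of relevant records retrieved
        "specificity": tn / (tn + fp),            # share of irrelevant records excluded
        "precision":   tp / (tp + fp),            # share of retrieved records that are relevant
        "accuracy":    (tp + tn) / (tp + fp + fn + tn),
    }

# Hypothetical validation set: 100 relevant records among 10,000.
m = filter_metrics(tp=90, fp=910, fn=10, tn=8990)
```

In this hypothetical case the filter is highly sensitive (0.90) but imprecise (0.09), illustrating the trade-off described above.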

For identification of full EEs, we recommend choosing a sensitive rather than a precise filter.

Once all synonyms, MeSH/Emtree terms, and search filters are detected, they can be connected through the Boolean or proximity operators per Patient, Intervention, Comparator, Outcome (PICO) aspect. All PICO aspects are then combined with AND. Finally, the complete search strategy can be pasted into the database search interface. It needs to be noted that each interface follows specific syntax rules [Citation63].
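The assembly just described can be sketched in a few lines of Python. The synonym lists and PubMed-style field tags below are purely illustrative, not a validated strategy:

```python
def or_block(terms):
    """Combine the synonyms of one PICO concept with OR."""
    return "(" + " OR ".join(terms) + ")"

def build_strategy(concepts):
    """Combine the concept blocks with AND into a single search string."""
    return " AND ".join(or_block(terms) for terms in concepts)

# Hypothetical concepts: disease, intervention, and an economic filter.
strategy = build_strategy([
    ["epilepsy[MeSH Terms]", "epilep*[tiab]", "seizure*[tiab]"],
    ["ketogenic diet[MeSH Terms]", "ketogenic[tiab]"],
    ["costs and cost analysis[MeSH Terms]", "cost effectiv*[tiab]", "economic*[tiab]"],
])
print(strategy)
```

The resulting string can then be pasted into the database interface, keeping in mind that each interface follows its own syntax rules.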

5.2. Boolean operators

Search terms within a concept (synonyms) should be combined with the Boolean operator ‘OR.’ Aspects and filters can be combined into a search strategy with the Boolean operator ‘AND.’ In addition, some search interfaces allow the use of proximity operators such as ‘NEAR’ or ‘ADJ.’ By searching for two (groups of) words within a specified distance of each other, the search achieves more specificity than combining terms with ‘AND’ and more sensitivity than searching for exact phrases. The maximum distance between the words can often be set by the user. This can be of particular value if one search concept can be phrased in several ways. The Cochrane Handbook for Systematic Reviews of Interventions (hereafter: Cochrane Handbook) [Citation7] recommends using the ‘NEAR’ operator due to its higher degree of sensitivity and precision as opposed to ‘NEXT’ and ‘AND,’ respectively. It should be noted, however, that proximity operators should be used only to combine words within one aspect (such as the disease or intervention aspect). Accordingly, they cannot replace the ‘AND’ between aspects. Theoretically, the Boolean operator ‘NOT’ can be used to exclude specific aspects. It should, however, be avoided in searches for SRs, or used with great caution, due to the possibility that it could unintentionally remove relevant records [Citation14].

5.3. Truncation

Most databases offer the use of truncation, which is a way to search for multiple words with the same word stem. Usually truncation is indicated with an asterisk (*) at the end of a word stem. Truncating effectiv* would, for instance, search for effective, effectiveness, effectivity, etc. Likewise, some databases offer a wildcard operator (such as ‘?’ in the Cochrane Library or ‘#’ in Ovid), which replaces one single character. Searching for wom?n will in this case retrieve women and woman [Citation14]. Truncation should be applied carefully: truncating cost* to capture anything related to costs will, for instance, also retrieve costimulants, which is not related to costs at all. In this example, truncation took place at a word stem that was too short.
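The behavior of truncation and single-character wildcards can be emulated with regular expressions, which makes the costimulants pitfall easy to demonstrate. A sketch over a small hypothetical word list:

```python
import re

def truncate(stem):
    """Emulate trailing truncation (stem*): the stem plus any further word characters."""
    return re.compile(re.escape(stem) + r"\w*")

def wildcard(pattern):
    """Emulate a single-character wildcard: '?' stands for exactly one character."""
    return re.compile(re.escape(pattern).replace(r"\?", r"\w"))

words = ["effective", "effectiveness", "costs", "costimulants", "women", "woman"]

hits_effectiv = [w for w in words if truncate("effectiv").fullmatch(w)]
hits_cost = [w for w in words if truncate("cost").fullmatch(w)]   # stem too short!
hits_womn = [w for w in words if wildcard("wom?n").fullmatch(w)]
```

Here `hits_cost` picks up the off-topic costimulants, exactly the over-retrieval the text warns about, while `hits_womn` retrieves both women and woman.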

5.4. Restrictions

Most databases allow different methods for restricting search results. It is recommended that language restrictions not be included in the search strategy [Citation14], although this is not always feasible. Likewise, restrictions on dates should not be applied except for specific reasons, such as when updating earlier reviews or when the technique being evaluated did not exist before a certain date. Formats such as letters can add relevant information relating to trial reports: they may update them or correct mistakes. Therefore, they should not be excluded per se [Citation14].

5.5. Selection of search terms and filters

Following the first steps of van Mastrigt et al. [Citation8], the eligibility criteria for studies to be included in the SR are already defined. These criteria will inform the four basic components of the PICO scheme: patient (or participant, or population), intervention, control or comparator, and outcomes [Citation64]; this is a helpful step in the conceptualization of the research question [Citation40]. Other search tools such as PICOS (where the S refers to study design) seem to be less sensitive in comparison with PICO [Citation65]. Usually, not all PICO aspects are well covered by the title, abstract, or indexed key words of an article, and not all aspects are equally important [Citation14]. Therefore, the final search strategy for SR-EEs will often consist of the following three main key concepts of interest: (1) health/disease, (2) intervention, and (3) economics. Search terms for each concept can be derived from the conceptual approach or by using already existing search filters. For each concept, it is advised to include a wide range of free-text terms separated by the Boolean operator OR, to make as much use of truncation and wildcards as possible (see Section 5.3) [Citation14], and to use proximity operators if they are available in the interfaces used. Specifics of the three concepts will be discussed in the following subsections. Since February 2016, Embase provides a PICO search interface that can be useful for conceptualizing a first search strategy [Citation66].

Several databases offer the possibility of employing thesauri (also known as MeSH terms in MEDLINE or Emtree in Embase). These thesauri provide additional alternative terms that can be used as synonyms in the creation of the search strategy.

For English, it is recommended to use both British and American spellings in the free-text search [Citation67].

5.6. Health/disease and intervention concept

As both health/disease and intervention concepts share many features and are closely related to each other, they are discussed together. For both concepts, making use of an already existing search strategy or filters is recommended. These may be found in the appendices of Cochrane SRs, publications of the NICE [Citation68], or other high-quality SRs. If the planned SR-EE is part of a CPG development process, information on the health- or disease-specific string can be taken from the search used to detect studies that evaluate the clinical effectiveness of the intervention of interest.

As mentioned earlier, some search filters for specific topics already exist and are sometimes even partially integrated by database providers (e.g. clinical queries in PubMed). The InterTASC Information Specialists’ Sub-Group (ISSG) provides a monthly updated list of search filters grouped by study design and focus [Citation69].

5.7. Economic concept

Search terms for the economic concept depend on the research question and on the types of EE to be included. If, for instance, economic modeling studies are considered for the SR, it is not enough to incorporate only economics-related search terms.

Most often, search filters and full search strategies are reported together with their respective sensitivity, specificity, precision, and accuracy. In 2009, Glanville and colleagues [Citation70] found that EEs cannot be identified efficiently using indexing terms provided by most databases. Therefore, they tested the performance of available search filters for their ability to detect EEs in MEDLINE and Embase. They concluded that, while some filters are able to achieve high levels of sensitivity, precision is usually low [Citation70].

Since a newly created search filter needs to be validated, its development is a challenging, time-intensive, and resource-consuming task. Some search filters for detecting EEs have been published in the literature. Although these filters have been translated to fit more than one database, the translation is not always optimal, so they are not easily transferable between databases. The selection of an appropriate search filter depends on the scope set out for the SR, as well as on which databases are to be searched. Therefore, we refer to the regularly updated ISSG website [Citation71], which holds a list of published filters for finding EEs in the databases CINAHL, Embase, MEDLINE, and PsycINFO. If feasible, we advise choosing a sensitive rather than a precise search filter for SRs, since the former will most likely detect more relevant records.

In 2016, the CADTH issued an update to the Peer Review of Electronic Search Strategies (PRESS) guideline [Citation72] that aims to evaluate electronic search strategies. Originally, the PRESS guideline focused on librarians and other information specialists as primary users, but it can also be of great use for researchers undertaking SRs.

Recommendations for a complete search strategy – in a nutshell

When developing the search strategy, it is important to break down the research question into its main conceptual elements. The PICO scheme can help with this, although not all PICO elements might be useful.

A search strategy should encompass a wide range of free-text terms, make use of proximity operators when possible, and employ thesauri. Truncation should be used with caution, and for English, both British and American spellings should be used.

Restrictions of search results (e.g. language and time frame) should be used as little as possible when setting up a search strategy.

Existing, validated search filters should be selected for high sensitivity, high precision, or a balance of both. A soundly conducted SR will profit from a sensitive rather than a precise search filter. Filters to find EEs can be found on the ISSG website.

Figure 2. Schematic overview on search strategy of Wijnen et al. [Citation16] Per PICO item, all synonyms and MeSH terms were combined with the Boolean operator OR. Truncation (in the form of an *) was used whenever possible. All search terms were restricted to be detected in title and abstracts only (see [TIAB] or [Title/Abstract]). Within one PICO item, different words can be combined with AND. For the intervention aspect, “ketogenic” was combined with “diet”. At this place a proximity operator could have been used. The same approach could also have been used for the search term “diet therapy”. To detect economic evaluations, a published search filter [Citation73] was copied. Finally, all elements of the PICO scheme were combined with the Boolean operator AND to produce a single search strategy that could then be pasted into a MEDLINE search interface (in this case PubMed).


6. Step 2.3 of the overall framework: perform searches

Once the search strategies for the selected databases have been created, the searches can be performed. Relevant studies that are already known should appear in the newly retrieved set of articles; if they do not, it needs to be determined why the search strategy failed to detect them, and the strategy may have to be adapted accordingly. This triangulation method can serve as a quality check.

Clear documentation of all searches (i.e. electronic database searches as well as hand and reference searches) is essential for the reproducibility and future updates of the study findings [Citation14,Citation25–Citation27,Citation74]. This means that the details of all searches performed (e.g. database selected, time frame covered, key words and restrictions used [i.e. the entire search strategy], number of records retrieved, etc.) should be collected systematically and added to the appendices of the report (see for an example). Reference management software (e.g. EndNote, RefWorks, etc.) can be used to manage bibliographic details, deduplicate results, and prepare references for publication. This will ensure efficient handling of all references retrieved from different databases [Citation14]. The user should, however, be aware of how the reference manager handles deduplication [Citation75] and the preparation of references for publication [Citation76]. Reference information for gray literature and reports can be found on WorldCat® [Citation77].

After references from all databases have been downloaded into a reference management program, they can be deduplicated. Most reference management programs have built-in deduplication options, but several methods have been published as well [Citation78–Citation80]. Deduplication is often considered time-consuming, even when using bibliographic software, because users feel the need to check the correctness of the selected duplicates. A safe and fast method has been developed in EndNote, in which the fields on which duplicates are compared can be specified [Citation78,Citation81].
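Field-based deduplication of this kind can be sketched in a few lines, assuming records are represented as simple dictionaries and compared on a normalized title, first author, and year (the field names and records are illustrative):

```python
import re

def dedup_key(record):
    """Normalize the fields on which duplicates are compared."""
    title = re.sub(r"\W+", " ", record["title"]).strip().lower()
    return (title, record["first_author"].lower(), record["year"])

def deduplicate(records):
    """Keep only the first record seen for every normalized key."""
    seen, unique = set(), []
    for rec in records:
        key = dedup_key(rec)
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

# The same (hypothetical) study retrieved from two databases with different formatting.
refs = [
    {"title": "Ketogenic diet for epilepsy: a cost-utility analysis.",
     "first_author": "Smith", "year": 2015},
    {"title": "Ketogenic Diet for Epilepsy - A Cost Utility Analysis",
     "first_author": "SMITH", "year": 2015},
]
```

Normalizing punctuation and case before comparison is what lets the two differently formatted records above collapse into one.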

7. Step 2.4 of the overall framework: selection of studies

Screening of potentially relevant studies should be conducted in two stages [Citation25,Citation27]. First, after removing duplicates, all remaining records are screened on title and abstract, preferably by two independent reviewers [Citation82]. Studies should be selected based on the eligibility criteria stated in the published protocol (Steps 1.3 and 1.4). Second, the full-text records are screened for compliance with the eligibility criteria [Citation82]. Ideally, all steps critical for study selection (2.3 and 2.4) [Citation82] and for data extraction (3.1 and 3.2) [Citation26,Citation27,Citation82] should be done by two reviewers independently. However, as this is not always achievable, one reviewer can select and extract the data, with a second checking for completeness and accuracy [Citation27]. Pilot testing of these processes should be performed using a representative sample of studies [Citation25,Citation27,Citation82]. Accordingly, the inclusion criteria should be applied to a sample of records [Citation25]. Any discrepancies between the two reviewers should be resolved by consensus [Citation25,Citation27,Citation82], and a third reviewer may be consulted if any issues need further discussion [Citation27,Citation82]. The review process itself can be organized in different ways. As a formal measure of agreement, Cohen’s Kappa can be calculated [Citation27,Citation82], although not all guidelines regard this as necessary [Citation25]. The review process can be managed through EndNote [Citation54], but many other programs are available as well. A compendium of different tools that also calculate Cohen’s Kappa automatically can be found elsewhere [Citation83].
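Cohen's Kappa requires no special software; it can be computed directly from two reviewers' decisions. A minimal sketch using hypothetical include/exclude decisions for ten records:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters, corrected for agreement expected by chance."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: product of each rater's marginal frequencies per category.
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical title/abstract decisions (1 = include, 0 = exclude).
reviewer_1 = [1, 1, 0, 0, 1, 0, 0, 1, 0, 0]
reviewer_2 = [1, 0, 0, 0, 1, 0, 0, 1, 0, 1]
kappa = cohens_kappa(reviewer_1, reviewer_2)
```

Here the raw agreement is 0.80, but Kappa corrects this to about 0.58 because both reviewers exclude most records, so considerable agreement is expected by chance alone.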

All information on the abovementioned processes can be reported in the study protocol and in the methods section of the publication [Citation25,Citation27,Citation82]. If there are multiple records of the same study, these records should be linked together [Citation14,Citation25,Citation32]. This can be done by numbering the studies systematically and reporting this in the results section: for example, the oldest report of a study receives the number ‘1A’ (used throughout the SR-EE when reporting or discussing this study), the second report of that study ‘1B’ (mentioned only once in the results section when discussing the number of included studies), the third ‘1C,’ etc. A list of studies that were excluded from the SR at the full-paper stage should be provided in the appendices [Citation27,Citation82] to keep the study transparent and reproducible. This list needs to contain the bibliographic details of the excluded studies and the reason for exclusion [Citation25,Citation27,Citation82].
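The ‘1A/1B’ numbering scheme described above can be generated mechanically, assuming each record carries a study identifier and a publication year (field names and records below are hypothetical; study numbers here simply follow the sorted order of study identifiers):

```python
from itertools import groupby
from string import ascii_uppercase

def label_reports(records):
    """Assign '1A', '1B', ...: one number per study, one letter per report, oldest first."""
    ordered = sorted(records, key=lambda r: (r["study"], r["year"]))
    labels = {}
    by_study = groupby(ordered, key=lambda r: r["study"])
    for number, (study, reports) in enumerate(by_study, start=1):
        for letter, rec in zip(ascii_uppercase, reports):
            labels[rec["id"]] = f"{number}{letter}"
    return labels

reports = [
    {"id": "r2", "study": "KetoTrial", "year": 2014},
    {"id": "r1", "study": "KetoTrial", "year": 2012},
    {"id": "r3", "study": "VNS-Study", "year": 2013},
]
```

Sorting by year within each study ensures the oldest report receives the ‘A’ label, as the scheme requires.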

A flowchart based on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement should be used to show all details of the selection process in a systematic way [Citation25,Citation27,Citation84].

8. Expert commentary & five-year view

Just as the development of the NHS EED and HEED databases was heralded as an improvement in providing access to EEs [Citation60], the discontinuation of these databases has had a tremendous impact on how SR-EEs are conducted. Their cessation created a gap that no current database is capable of filling. The scientific community seems to be reacting with procrastination. Renowned practice guidance such as the Cochrane Handbook [Citation7], the NICE manual for developing NICE guidelines [Citation10], and other reliable sources of information (e.g. the CCEMG website [Citation41]) need to be revised and updated so that using these databases is no longer recommended. Without comprehensive economic databases, researchers must rely on other information sources that do not specialize in EEs and must use more complex search strategies with specialized search filters to detect the EE literature in the available databases. Setting up a new health economic database might seem like a good solution; however, given the tremendous amount of resources needed to build and maintain such information repositories, it is questionable whether this would add value.

Based on several key guidelines for preparing SRs in effectiveness research and on major publications exploring methods for detecting economic publications, we offer advice on how to identify EEs for SRs in data sources that do not specialize solely in health economic literature. All recommendations are compiled into a step-by-step plan that can be used as a checklist (see Table 2).

Table 2. Step-by-step plan on how to identify economic literature for a systematic review.

As yet there is no consensus on how many and which specific databases need to be searched to identify all relevant EEs, nor is there unanimous agreement on the methodology by which a solid search strategy should be developed (see, for instance, [Citation55,Citation85]). Our contribution can thus be seen as temporary guidance until more methodological research on this topic has been published or new databases for EEs have been set up. With an increasing number of validated, reliable, and user-friendly search filters to detect health economic literature, the creation of a new database specializing in health EEs might become redundant.

Updating new and existing SRs is a key objective for future research in this area [Citation86], particularly because many reviews are currently outdated or no longer accessible [Citation87]. On the one hand, surveillance systems could assess the need for updating SRs [Citation88]. On the other hand, Elliott et al. [Citation89] suggest initiating living SRs: high-quality, up-to-date online summaries of health research that are continuously updated as new research becomes available.

In the years to come, researchers will be able to (1) implement process parallelization, (2) use novel techniques and applications to automate the process, and (3) methodologically modify certain SR processes, in order to address the issue of timeliness in the compilation of SRs [Citation90]. Automation seems to be the most promising innovation in this regard [Citation91], as it would make handcrafted SRs (at least in part) obsolete [Citation92]. The SR toolbox website [Citation83] holds a regularly updated compendium of software tools that support the process of compiling SRs. With upcoming automation processes and the increasing availability of validated search filters, it is conceivable that the cessation of health economic-specific databases will no longer be a misfortune for the scientific community. For the last decade, most research concerned with developing search strategies for detecting EEs has in any case focused on the two major players, MEDLINE and Embase [Citation1,Citation3,Citation60,Citation70,Citation93,Citation94]. In the near future, a search of those two databases could well be sufficient to detect most EEs. An important precondition, however, is that EEs be correctly indexed. Concepts related to health economics are often broadly defined, and the very definition of what constitutes the important components of EEs differs among scholars and changes over time (see the definitions of cost components in [Citation95] and [Citation96]). Establishing new guidelines to stimulate a uniform use of terms could help overcome this issue.

Key issues

● Currently there are no up-to-date economic evaluation databases available.

● For detecting economic evaluations, we recommend searching at least Embase, MEDLINE, Web of Science, and NHS EED (basic databases), bearing in mind that the latter is an archive that has not been updated since March 2015.

● A biomedical information specialist should be part of any team that seeks to conduct SRs.

● For a search strategy, select a wide range of search terms, including thesauri, and use proximity operators as well as truncation options.

● Validated search filters for the economic aspect can be retrieved from the ISSG website.

● Restrictions should be used as little as possible in the search strategy.

● References are best handled with reference management software.

● Preferably, retrieved references should be screened independently by two reviewers.

Declaration of interest

The authors have no relevant affiliations or financial involvement with any organization or entity with a financial interest in or financial conflict with the subject matter or materials discussed in the manuscript. This includes employment, consultancies, honoraria, stock ownership or options, expert testimony, grants or patents received or pending, or royalties.

Acknowledgments

The authors would like to thank the other members of the project: Ken Redekop (Erasmus University Rotterdam, Rotterdam), Ben Wijnen (Maastricht University, Maastricht), Mickael Hiligsmann (Maastricht University, Maastricht), Chris Arts (Maastricht University), Reina de Kinderen (Maastricht University, Maastricht), Pieter Broos (Knowledge Institute of Medical Specialists, Utrecht), Toon Lamberts (Knowledge Institute of Medical Specialists, Utrecht), and Bernard Elsman (Deventer Ziekenhuis, Deventer), for their valuable feedback on the draft of this paper. Furthermore, the authors would like to thank Barbara Greenberg for her English editing services.

Additional information

Funding

This project received financial support from the Dutch Society of Medical Specialists (Stichting Kwaliteitsgelden Medisch Specialisten; project number 329666498), Utrecht, and from ZonMw (the Netherlands Organisation for Health Research and Development; project number 836022001), The Hague.

References

  • McKinlay RJ, Wilczynski NL, Haynes RB. Optimal search strategies for detecting cost and economic studies in EMBASE. BMC Health Serv Res. 2006;6(1):67.
  • Shemilt I, Mugford M, Vale L, et al. Evidence synthesis, economics and public policy. Res Synth Methods. 2010;1(2):126–135.
  • Sassi F, Archard L, McDaid D. Searching literature databases for health care economic evaluations: how systematic can we afford to be? Med Care. 2002;40(5):387–394.
  • Jenkins M. Evaluation of methodological search filters – a review. Health Inf Libraries J. 2004;21(3):148–163.
  • Alton V, Eckerlund I, Norlund A. Health economic evaluations: how to find them. Int J Technol Assess Health Care. 2006;22(04):512–517.
  • McGowan J, Sampson M. Systematic reviews need systematic searchers. J Med Lib Assoc. 2005;93(1):74.
  • Higgins J, Green S. Cochrane handbook for systematic reviews of interventions version 5.1.0 [updated March 2011]. The Cochrane Collaboration; 2011. Available from: www.handbook.cochrane.org.
  • van Mastrigt GAPG, Hiligsmann M, Arts JJC, et al. How to prepare a systematic review of economic evaluations for clinical practice guidelines (part 1 of 3). Expert Review of Pharmacoeconomics & Outcomes Research. doi: 10.1080/14737167.2016.1246960. [Epub ahead of print]
  • Wijnen B, Van Mastrigt GAPG, Evers SMAA, et al. Review of economic evaluations of treatments for patients with epilepsy. Submitted.
  • National Institute for Health and Care Excellence. 5 Identifying the evidence: literature searching and evidence submission. In: Developing NICE guidelines: the manual. National Institute for Health and Care Excellence; 2014. Available from: https://www.nice.org.uk/process/pmg20/chapter/introduction-and-overview
  • Brouwers MC, Kho ME, Browman GP, et al. AGREE II: advancing guideline development, reporting and evaluation in health care. Can Med Assoc J. 2010;182(18):E839–E842.
  • Qaseem A, Forland F, Macbeth F, et al. Guidelines International Network: toward international standards for clinical practice guidelines. Ann Intern Med. 2012;156(7):525–531.
  • National Institute for Health and Care Excellence. 7 Incorporating economic evidence. In: Developing NICE guidelines: the manual. National Institute for Health and Care Excellence; 2014.
  • Lefebvre C, Manheimer E, Glanville J. Chapter 6: Searching for studies. In: Higgins J, Green S, editors. Cochrane handbook for systematic reviews of interventions version 5.1.0 [updated March 2011]. The Cochrane Collaboration; 2011.
  • Rethlefsen ML, Farrell AM, Osterhaus Trzasko LC, et al. Librarian co-authors correlated with higher quality reported search strategies in general internal medicine systematic reviews. J Clin Epidemiol. 2015;68(6):617–626.
  • Wijnen B, Van Mastrigt GAPG, Redekop WK, et al. How to prepare a systematic review of economic evaluations for informing evidence-based healthcare decisions: data extraction, risk of bias, and transferability (part 3 of 3). Expert Review of Pharmacoeconomics & Outcomes Research. doi: 10.1080/14737167.2016.1246961. [Epub ahead of print].
  • HEED: Health Economic Evaluations Database. Available from: http://onlinelibrary.wiley.com/book/10.1002/9780470510933
  • News. Available from: http://www.crd.york.ac.uk/CRDWeb/
  • White VJ, Glanville JM, Lefebvre C, et al. A statistical approach to designing search filters to find systematic reviews: objectivity enhances accuracy. J Inf Sci. 2001;27(6):357–370.
  • Lemeshow AR, Blum RE, Berlin JA, et al. Searching one or two databases was insufficient for meta-analysis of observational studies. J Clin Epidemiol. 2005;58(9):867–873.
  • Dickersin K, Scherer R, Lefebvre C. Systematic reviews: identifying relevant studies for systematic reviews. BMJ. 1994;309(6964):1286–1291.
  • Brettle AJ, Long AF, Grant MJ, et al. Searching for information on outcomes: do you need to be comprehensive? Quality in Health Care. 1998;7(3):163–167.
  • Morton S, Levit L, Berg A, et al. Finding what works in health care: standards for systematic reviews. Washington (DC): National Academies Press; 2011.
  • Bramer WM, Giustini D, Kramer BMR. Comparing the coverage, recall, and precision of searches for 120 systematic reviews in Embase, MEDLINE, and Google Scholar: a prospective study. Syst Rev. 2016;5(1):1–7.
  • Methods guide for effectiveness and comparative effectiveness reviews. AHRQ Publication No. 10(14)-EHC063-EF. Available from: www.effectivehealthcare.ahrq.gov
  • Shea B, Grimshaw J, Wells G, et al. Development of AMSTAR: a measurement tool to assess the methodological quality of systematic reviews. BMC Med Res Methodol. 2007;7(1):10.
  • Centre for Reviews Dissemination. Systematic reviews: CRD’s guidance for undertaking reviews in health care. York(UK): CRD, University of York; 2009.
  • Halladay CW, Trikalinos TA, Schmid IT, et al. Using data sources beyond PubMed has a modest impact on the results of systematic reviews of therapeutic interventions. J Clin Epidemiol. 2015;68(9):1076–1084.
  • Service providers and search interfaces. Available from: http://www.htai.org/vortal/?q=node/927
  • Fröschl B, Bornschein B, Brunner-Ziegler S, et al. Methodenhandbuch für Health Technology Assessment, Version 1.2012 [Methods handbook for health technology assessment, version 1.2012]. Wien: Gesundheit Österreich GmbH (GÖG); 2012. Available from: http://hta.lbg.ac.at/uploads/tableTool/UllCmsPage/gallery/Methodenhandbuch.pdf
  • Glanville J, Paisley S. Identifying economic evaluations for health technology assessment. Int J Technol Assess Health Care. 2010;26(04):436–440.
  • Akers J, Aguiar-Ibáñez R, Baba-Akbari Sari A. CRD’s guidance for undertaking reviews in health care. York (UK): Centre for Reviews and Dissemination (CRD); 2009.
  • Healthcare Improvement Scotland. Standard operating procedure for production of evidence notes (version 2.0). Edinburgh: Healthcare Improvement Scotland; 2012.
  • Healthcare Improvement Scotland. Standard operating procedure for production of technologies scoping reports (version 1.0). Edinburgh (Scotland): Healthcare Improvement Scotland; 2012.
  • National Health and Medical Research Council. How to compare the costs and benefits: evaluation of the economic evidence. Canberra (Australia): National Health and Medical Research Council (NHMRC); 2001.
  • NHS Quality Improvement Scotland. SIGN 50 – a guideline developer’s handbook. Edinburgh(Scotland): NHS Quality Improvement Scotland (NHS QIS); 2011.
  • Kristensen F, Sigmund H. Health technology assessment handbook. Copenhagen (Denmark): Danish Centre for Health Technology Assessment (DACEHTA), National Board of Health (NBoH); 2007.
  • Cleemput I, Van Den Bruel A, Kohn L, et al. Search for evidence & critical appraisal: health technology assessment (HTA). Brussels (Belgium): Belgian Federal Health Care Knowledge Centre (KCE); 2007.
  • Shemilt I, Mugford M, Byford S, et al. Incorporating economics evidence. In: Higgins JPT, Green S, editors. Cochrane handbook for systematic reviews of interventions. 2008. p. 449–480. Available from: www.handbook.cochrane.org.
  • Mathes T, Walgenbach M, Antoine S-L, et al. Methods for systematic reviews of health economic evaluations a systematic review, comparison, and synthesis of method literature. Med Decis Making. 2014;34(7):826–840.
  • Resources. Available from: http://www.c-cemg.org/home/resources
  • Porta M, editor. A dictionary of epidemiology. 5th ed. Oxford: Oxford University Press; 2008.
  • Hopewell S, McDonald S, Clarke M, et al. Grey literature in meta-analyses of randomized trials of health care interventions. Cochrane Database Syst Rev. 2007;2. Art. No.: MR000010. doi:10.1002/14651858.MR000010.pub3. Available from: http://onlinelibrary.wiley.com/doi/10.1002/14651858.MR000010.pub3/epdf
  • CRD Database. Available from: http://www.crd.york.ac.uk/CRDWeb/
  • Science citation index expanded. Available from: http://thomsonreuters.com/en/products-services/scholarly-scientific-research/scholarly-search-and-discovery/science-citation-index-expanded.html
  • Eady AM, Wilczynski NL, Haynes RB. PsycINFO search strategies identified methodologically sound therapy studies and review articles for use by clinicians and researchers. J Clin Epidemiol. 2008;61(1):34–40.
  • Löhönen J, Isohanni M, Nieminen P, et al. Coverage of the bibliographic databases in mental health research. Nord J Psychiatry. 2010;64(3):181–188.
  • Ovid MEDLINE. Available from: gateway.ovid.com/autologin.html
  • Embase. Available from: www.embase.com
  • EconLit. Available from: https://www.aea-web.org/econlit/index.php
  • Web of science. Available from: https://apps.webofknowledge.com/UA_GeneralSearch_input.do?product=UA&search_mode=GeneralSearch&SID=3AbDfUY7pV3cFFrEDlg&preferencesSaved=
  • Bramer WM, Rethlefsen ML, Mast F, Kleijnen J. Non-inferiority of the Erasmus MC exhaustive search method: an inter-organizational comparison of the speed and quality of librarian-mediated literature searches for systematic reviews. Submitted.
  • McGowan J. For expert literature searching, call a librarian. CMAJ: Can Med Assoc J. 2001;165(10):1301–1302.
  • Bramer WM, Milic J, Mast F. Reviewing retrieved references for inclusion in systematic reviews using EndNote. J Med Lib Assoc. 2017;105(1). Accepted for publication.
  • Boeker M, Motschall E, Vach W. Literature search methodology for systematic reviews: conventional and natural language processing enabled methods are complementary (letter commenting on: J Clin Epidemiol. 2015;68:191-9). J Clin Epidemiol. 2016;69:253–255.
  • Hausner E, Guddat C, Hermanns T, et al. Development of search strategies for systematic reviews: validation showed the noninferiority of the objective approach. J Clin Epidemiol. 2015;68(2):191–199.
  • Elsevier. A comparison of MeSH® and Emtree®. Elsevier; 2015. Available from: http://supportcontent.elsevier.com/Support%20Hub/Embase/Files%20&%20Attachements/4685-Embase_White%20Paper_Comparison%20of%20Emtree%20and%20MeSH_July%202015.pdf
  • Beale S, Duffy S, Glanville J, et al. Choosing and using methodological search filters: searchers’ views. Health Info Libr J. 2014;31(2):133–147.
  • Haynes RB, Wilczynski N, McKibbon KA, et al. Developing optimal search strategies for detecting clinically sound studies in MEDLINE. J Am Med Inform Assoc. 1994;1(6):447–458.
  • Glanville J. Development and testing of search filters to identify economic evaluations in MEDLINE and EMBASE: Canadian Agency for Drugs and Technologies in Health. Canada (Ottawa): Agence canadienne des médicaments et des technologies de la santé; 2009.
  • Boynton J, Glanville J, McDaid D, et al. Identifying systematic reviews in MEDLINE: developing an objective approach to search strategy design. J Inf Sci. 1998;24(3):137–154.
  • Camosso-Stefinovic J. Developing objectively derived search strategies to identify randomized controlled trials in MEDLINE. MSci thesis Loughborough: Loughborough University, Department of Information and Library Studies; 2002.
  • Improving efficiency and confidence in systematic literature searching. Available from: http://de.slideshare.net/wichor
  • Russell R, Chung M, Balk EM, et al. Issues and challenges in conducting systematic reviews to support development of nutrient reference values: workshop summary: nutrition research series. Vol. 2. Rockville (MD): Agency for Healthcare Research and Quality (US); 2009
  • Methley AM, Campbell S, Chew-Graham C, et al. PICO, PICOS and SPIDER: a comparison study of specificity and sensitivity in three search tools for qualitative systematic reviews. BMC Health Serv Res. 2014;14(1):1–10.
  • PICO search. Available from: https://www.embase.com/#picoSearch/default
  • Sampson M, McGowan J, Cogo E, et al. An evidence-based practice guideline for the peer review of electronic search strategies. J Clin Epidemiol. 2009;62(9):944–952.
  • Home. Available from: https://www.nice.org.uk/
  • The InterTASC Information Specialists’ Sub-Group search filter resource. Available from: https://sites.google.com/a/york.ac.uk/issg-search-filters-resource/home
  • Glanville J, Kaunelis D, Mensinkai S. How well do search filters perform in identifying economic evaluations in MEDLINE and EMBASE. Int J Technol Assess Health Care. 2009;25(04):522–529.
  • Filters to find economic evaluations. Available from: https://sites.google.com/a/york.ac.uk/issg-search-filters-resource/filters-to-find-i
  • McGowan J, Sampson M, Salzwedel DM, et al. PRESS peer review electronic search strategies: 2015 guideline explanation and elaboration (PRESS E&E). Ottawa: CADTH; 2016.
  • Wilczynski NL, Haynes RB, Lavis JN, et al. Optimal search strategies for detecting health services research studies in MEDLINE. Can Med Assoc J. 2004;171(10):1179–1185.
  • Institute of Medicine of the National Academies. Finding what works in health care standards for systematic reviews. Washington(D.C.): National Academies Press; 2011.
  • Sarawagi S, Bhamidipaty A. Interactive deduplication using active learning. In: Proceedings of the eighth ACM SIGKDD international conference on Knowledge discovery and data mining. Edmonton, Alberta, Canada: ACM. 2002: 269–278.
  • Miller MC. Reference management software: a review of endnote plus, reference manager, and pro-cite. MD Comput. 1994;11(3):161–168.
  • WorldCat. Available from: http://www.worldcat.org/
  • Bramer W, Holland L, Mollema J, et al. Removing duplicates in retrieval sets from electronic databases: comparing the efficiency and accuracy of the Bramer-method with other methods and software packages. 2014. Available from: http://www.bmi-online.nl/wp-content/uploads/2015/01/bramer-presentation-Removing-duplicates-in-retrieval-sets-from-electronic-databases-zonder-demo.pdf
  • Jiang Y, Lin C, Meng W, et al. Rule-based deduplication of article records from bibliographic databases. Database. 2014;2014:bat086.
  • Qi X, Yang M, Ren W, et al. Find duplicates among the PubMed, EMBASE, and Cochrane library databases in systematic review. PLoS One. 2013;8(8):e71838.
  • Bramer WM, Giustini D, de Jonge GB, Holland L, Bekhuis T. De-duplication of database search results for systematic reviews in EndNote. Journal of the Medical Library Association: JMLA, 2016;104(3):240.
  • Higgins J, Deeks JJ. Chapter 7: Selecting studies and collecting data. In: Higgins J, Green S, editors. Cochrane handbook for systematic reviews of interventions version 5.1.0 [updated March 2011]. The Cochrane Collaboration; 2011.
  • Marshall C, Brereton P. Systematic review toolbox: a catalogue of tools to support systematic reviews. In: Proceedings of the 19th International Conference on Evaluation and Assessment in Software Engineering: 2015: ACM; 2015: 23.
  • Moher D, Liberati A, Tetzlaff J, et al. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Ann Intern Med. 2009;151(4):264-9, W64.
  • Hausner E, Waffenschmidt S. Response to letter by Boeker et al. Development of search strategies for systematic reviews: further issues regarding the objective and conceptual approaches. J Clin Epidemiol. 2016;69:255–257.
  • Moher D, Tsertsvadze A. Systematic reviews: when is an update? Lancet. 2006;367(9514):881–883.
  • Shojania KG, Sampson M, Ansari MT, et al. How quickly do systematic reviews go out of date? A survival analysis. Ann Intern Med. 2007;147(4):224–233.
  • Ahmadzai N, Newberry SJ, Maglione MA, et al. A surveillance system to assess the need for updating systematic reviews. Syst Rev. 2013;2(1):1.
  • Elliott JH, Turner T, Clavisi O, et al. Living systematic reviews: an emerging opportunity to narrow the evidence-practice gap. PLoS Med. 2014;11(2):e1001603.
  • Tsertsvadze A, Chen Y-F, Moher D, et al. How to conduct systematic reviews more expeditiously? Syst Rev. 2015;4(1):1–6.
  • Elliott J, Sim I, Thomas J, et al. #CochraneTech: technology and the future of systematic reviews. Cochrane Database Syst Rev. 2014;9:Ed000091.
  • Tsafnat G, Dunn A, Glasziou P, et al. The automation of systematic reviews. BMJ. 2013;346:f139
  • Topfer L-A, Parada A, Menon D, et al. Comparison of literature searches on quality and costs for health technology assessment using the MEDLINE and EMBASE databases. Int J Technol Assess Health Care. 1999;15(02):297–303.
  • Sampson M, De Bruijn B, Urquhart C, et al. Complementary approaches to searching MEDLINE may be sufficient for updating existing systematic reviews. J Clin Epidemiol. 2016. Available from: http://www.sciencedirect.com/science/article/pii/S0895435616300166
  • Drummond M, Sculpher M, Torrance G, et al. Methods for the economic evaluation of health care programmes. Oxford (London): Oxford University Press; 2005.
  • Drummond MF, Sculpher MJ, Claxton K, et al. Methods for the economic evaluation of health care programmes. Oxford (London): Oxford university press; 2015.

Appendices

Appendix 1. Databases for detecting economic evaluations.

Appendix 2. Example of reporting on databases and search strategies.