

Literature searching explained
Develop a search strategy.
A search strategy is an organised structure of key terms used to search a database. The search strategy combines the key concepts of your search question in order to retrieve accurate results.
Your search strategy will account for all:
- possible search terms
- keywords and phrases
- truncated and wildcard variations of search terms
- subject headings (where applicable)
Each database works differently so you need to adapt your search strategy for each database. You may wish to develop a number of separate search strategies if your research covers several different areas.
It is a good idea to test your strategies and refine them after you have reviewed the search results.
How a search strategy looks in practice
Take a look at this example literature search in PsycINFO (PDF) about self-esteem.
The example shows the subject heading and keyword searches that have been carried out for each concept within our research question and how they have been combined using Boolean operators. It also shows where keyword techniques like truncation, wildcards and adjacency searching have been used.
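As a rough sketch of how such a strategy might be laid out in an Ovid-style database (the exact subject headings, line numbers and symbols will differ, and the second concept here is purely illustrative), a combined subject heading and keyword search could look like this:

```
1  exp Self-Esteem/
2  (self-esteem or self-worth or self-respect).ti,ab.
3  1 or 2
4  (adolescen* or teenage*).ti,ab.
5  3 and 4
```

Lines 1 and 2 gather the subject heading and keyword variants for the first concept, line 3 combines them with OR, line 4 covers a second concept using truncated keywords, and line 5 intersects the two concepts with AND.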
Search strategy techniques
The next sections show some techniques you can use to develop your search strategy.
Choose search terms.
Concepts can be expressed in different ways; for example, “self-esteem” might also be referred to as “self-worth”. Your aim is to consider each of your concepts and come up with a list of the different ways they could be expressed.
To find alternative keywords or phrases for your concepts try the following:
- Use a thesaurus to identify synonyms.
- Search for your concepts on a search engine like Google Scholar, scanning the results for alternative words and phrases.
- Examine relevant abstracts or articles for alternative words, phrases and subject headings (if the database uses subject headings).
When you've done this, you should have lists of words and phrases for each concept as in this completed PICO model (PDF) or this example concept map (PDF).
As you search and scan articles and abstracts, you may discover different key terms to enhance your search strategy.
Using truncation and wildcards can save you time and effort by finding alternative keywords.
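Pulled together, a simple concept grid for an illustrative question about self-esteem in adolescents (not a question taken from this guide) might look like the sketch below, with each column growing as you discover new terms:

```
Concept 1: self-esteem     Concept 2: adolescents
self-esteem                adolescent, adolescence
self-worth                 teenager, teen
self-respect               young people
self-concept               youth
```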
Search with keywords
Keywords are free text words and phrases. Database search strategies use a combination of free text and subject headings (where applicable).
A keyword search usually looks for your search terms in the title and abstract of a reference. You may wish to search in title fields only if you want a small number of specific results.
Some databases will find the exact word or phrase, so make sure your spelling is accurate or you will miss references.
Search for the exact phrase
If you want words to appear next to each other in an exact phrase, use quotation marks, eg “self-esteem”.
Phrase searching decreases the number of results you get and makes your results more relevant. Most databases allow you to search for phrases, but check the database guide if you are unsure.
Truncation and wildcard searches
You can use truncated and wildcard searches to find variations of your search term. Truncation is useful for finding singular and plural forms of words and variant endings.
Many databases use an asterisk (*) as their truncation symbol. Check the database help section if you are not sure which symbol to use. For example, “therap*” will find therapy, therapies, therapist or therapists. A wildcard finds variant spellings of words. Use it to search for a single character, or no character.
Check the database help section to see which symbol to use as a wildcard.
Wildcards are useful for finding British and American spellings, for example: “behavio?r” in Medline will find both behaviour and behavior.
There are sometimes different symbols to find a variable single character. For example, in the Medline database, “wom#n” will find woman and also women.
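Putting these together, a single keyword line in an Ovid-style database might combine a wildcard term and a truncated term (an illustrative sketch; symbols differ between databases, so check the help pages):

```
(wom#n or female*).ti,ab.
```

Here wom#n picks up woman and women, female* picks up female, females and so on, and .ti,ab. restricts the search to the title and abstract fields.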
Use adjacency searching for more accurate results
You can specify how close two words appear together in your search strategy. This can make your results more relevant; generally the closer two words appear to each other, the closer the relationship is between them.
Commands for adjacency searching differ among databases, so make sure you consult database guides.
In OvidSP databases (like Medline), searching for “physician ADJ3 relationship” will find both physician and relationship within two major words of each other, in any order. This finds more papers than "physician relationship".
Using this adjacency retrieves papers with phrases like "physician patient relationship", "patient physician relationship", "relationship of the physician to the patient" and so on.
Searching with subject headings
Database subject headings are controlled vocabulary terms that a database uses to describe what an article is about.
Watch our 3-minute introduction to subject headings video.
Using appropriate subject headings enhances your search and will help you to find more results on your topic. This is because subject headings find articles according to their subject, even if the article does not use your chosen key words.
You should combine both subject headings and keywords in your search strategy for each of the concepts you identify. This is particularly important if you are undertaking a systematic review or an in-depth piece of work.
Subject headings may vary between databases, so you need to investigate each database separately to find the subject headings they use. For example, for Medline you can use MeSH (Medical Subject Headings) and for Embase you can use the EMTREE thesaurus.
SEARCH TIP: In Ovid databases, search for a known key paper by title, select the "complete reference" button to see which subject headings the database indexers have given that article, and consider adding relevant ones to your own search strategy.
Use Boolean logic to combine search terms
Boolean operators (AND, OR and NOT) allow you to try different combinations of search terms or subject headings.
Databases often show Boolean operators as buttons or drop-down menus that you can click to combine your search terms or results.
The main Boolean operators are:
OR is used to find articles that mention either of the topics you search for.
AND is used to find articles that mention both of the searched topics.
NOT excludes a search term or concept. It should be used with caution as you may inadvertently exclude relevant references.
For example, searching for “self-esteem NOT eating disorders” finds articles that mention self-esteem but removes any articles that mention eating disorders.
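As an illustrative example (not taken from the guide), a single search statement for a question about self-esteem in adolescents could combine the operators like this, with OR inside each concept and AND between concepts:

```
("self-esteem" OR "self-worth") AND (adolescen* OR teenage*)
```

The parentheses keep each set of synonyms grouped together before the two concepts are intersected with AND.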
Citation searching
Citation searching is a method of finding articles that have been cited by other publications.
Use citation searching (or cited reference searching) to:
- find out whether articles have been cited by other authors
- find more recent papers on the same or similar subject
- discover how a known idea or innovation has been confirmed, applied, improved, extended, or corrected
- help make your literature review more comprehensive.
You can use cited reference searching in:
- OvidSP databases
- Google Scholar
- Web of Science
Cited reference searching can complement your literature search. However, be careful not to rely on citation searching in isolation; a robust literature search is also needed to limit publication bias.

Literature Review: Developing a search strategy
From research question to search strategy
Keeping a record of your search activity
Good search practice could involve keeping a search diary or document detailing your search activities (Phelps et al. 2007, pp. 128-149), so that you can keep track of effective search terms, or to help others to reproduce your steps and get the same results.
This record could be a document, table or spreadsheet (see the sketch after this list) with:
- The names of the sources you search and which provider you accessed them through - eg Medline (Ovid), Web of Science (Thomson Reuters). You should also include any other literature sources you used.
- how you searched (keyword and/or subject headings)
- which search terms you used (which words and phrases)
- any search techniques you employed (truncation, adjacency, etc)
- how you combined your search terms (AND/OR). Check out the Database Help guide for more tips on Boolean Searching.
- The number of search results from each source and each strategy used. This can be the evidence you need to prove a gap in the literature, and confirms the importance of your research question.
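An entry in such a record might look something like this sketch (the source, terms and numbers are purely illustrative):

```
Date searched:      10 Jan 2023
Source (provider):  MEDLINE (Ovid)
How searched:       subject headings + keywords (title/abstract)
Search terms:       exp Self-Esteem/; (self-esteem or self-worth).ti,ab.
Techniques:         truncation, phrase searching
Combination:        synonyms combined with OR; concepts combined with AND
Limits:             English language; 2010 to current
Results:            412
```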
A search planner may help you to organise your thoughts prior to conducting your search. If you have any problems with organising your thoughts before, during or after searching, please contact your Library Faculty Team for individual help.
- Literature search: a librarian's handout to introduce tools, terms and techniques. Created by Elsevier librarian Katy Kavanagh Web, this document outlines tools, terms and techniques to think about when conducting a literature search.
- Search planner
Literature search cycle

Have a search framework
Search frameworks are mnemonics which can help you focus your research question. They are also useful in helping you to identify the concepts and terms you will use in your literature search.
PICO is a search framework commonly used in the health sciences to focus clinical questions. As an example, suppose you work in an aged care facility and are interested in whether cranberry juice might help reduce the common occurrence of urinary tract infections. The PICO framework would look like this:
- P (Population): people living in aged care facilities
- I (Intervention): cranberry juice
- C (Comparison): no cranberry juice
- O (Outcome): occurrence of urinary tract infections
Now that the issue has been broken down into its elements, it is easier to turn it into an answerable research question: “Does cranberry juice help reduce urinary tract infections in people living in aged care facilities?”
Other frameworks may be helpful, depending on your question and your field of interest. PICO can be adapted to PICOT (which adds Time), PICOS (which adds Study design) or PICOC (which adds Context).
For qualitative questions you could use:
- SPIDER: Sample, Phenomenon of Interest, Design, Evaluation, Research type
For questions about causes or risk (see the worked example after this list):
- PEO: Population, Exposure, Outcomes
For evaluations of interventions or policies:
- SPICE: Setting, Population or Perspective, Intervention, Comparison, Evaluation, or
- ECLIPSE: Expectation, Client group, Location, Impact, Professionals, Service
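As a worked illustration (not an example from the guide), a question about causes or risk such as “Does exposure to workplace noise lead to hearing loss in factory workers?” could be broken down with PEO as:
- P (Population): factory workers
- E (Exposure): workplace noise
- O (Outcome): hearing loss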
See the University of Notre Dame Australia’s examples of some of these frameworks.
You can also try some PICO examples in the National Library of Medicine's PubMed training site: Using PICO to frame clinical questions.
Different search strategies


Literature Search Basics
Develop a search strategy.
- A search strategy includes a combination of keywords, subject headings, and limiters (language, date, publication type, etc.)
- A search strategy should be planned out and practiced before executing the final search in a database.
- A search strategy and search results should be documented throughout the searching process.
What is a search strategy?
A search strategy is an organized combination of keywords, phrases, subject headings, and limiters used to search a database.
Your search strategy will include:
- keywords
- Boolean operators
- variations of search terms (synonyms, suffixes)
- subject headings
Your search strategy may include:
- truncation (where applicable)
- phrases (where applicable)
- limiters (date, language, age, publication type, etc.)
A search strategy usually requires several iterations. You will need to test the strategy along the way to ensure that you are finding relevant articles. It's also a good idea to review your search strategy with your co-authors. They may have ideas about terms or concepts you may have missed.
Additionally, each database you search is developed differently, so you will need to adjust your strategy for each database you search. For instance, Embase is a European database, so many of the medical terms are slightly different from those used in MEDLINE and PubMed.
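As an illustrative sketch (always verify terms in each database's thesaurus), the subject-heading line of a strategy for the concept "cancer" changes between databases even though the keyword line stays the same: MeSH uses the plural heading Neoplasms, whereas Emtree uses the singular neoplasm.

```
MEDLINE (Ovid):   exp Neoplasms/
Embase (Ovid):    exp neoplasm/
Keywords (both):  (cancer* or neoplasm* or tumor* or tumour* or malignan*).ti,ab.
```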
Choose search terms
Start by writing down as many terms as you can think of that relate to your question. You might try cited reference searching to find a few good articles that you can review for relevant terms.
Remember that most terms or concepts can be expressed in different ways. A few things to consider:
- synonyms: "cancer" may be referred to as "neoplasms", "tumors", or "malignancy"
- abbreviations: spell out the word instead of abbreviating
- generic vs. trade names of drugs
Search for the exact phrase
If you want words to appear next to each other in an exact phrase, use quotation marks, eg “self-esteem”.
Phrase searching decreases the number of results you get. Most databases allow you to search for phrases, but check the database guide if you are unsure.
Truncation and wildcards
Many databases use an asterisk (*) as their truncation symbol to find various word endings like singulars and plurals. Check the database help section if you are not sure which symbol to use.
"Therap*"
retrieves: therapy, therapies, therapist or therapists.
Use a wildcard (?) to find different spellings like British and American spellings.
"Behavio?r" retrieves behaviour and behavior.
Searching with subject headings
Database subject headings are controlled vocabulary terms that a database uses to describe what an article is about.
Using appropriate subject headings enhances your search and will help you to find more results on your topic. This is because subject headings find articles according to their subject, even if the article does not use your chosen key words.
You should combine both subject headings and keywords in your search strategy for each of the concepts you identify. This is particularly important if you are undertaking a systematic review or an in-depth piece of work.
Subject headings may vary between databases, so you need to investigate each database separately to find the subject headings they use. For example, for MEDLINE you can use MeSH (Medical Subject Headings) and for Embase you can use the EMTREE thesaurus.
SEARCH TIP: In Ovid databases, search for a known key paper by title, select the "complete reference" button to see which subject headings the database indexers have given that article, and consider adding relevant ones to your own search strategy.
Use Boolean logic to combine search terms
Boolean operators (AND, OR and NOT) allow you to try different combinations of search terms or subject headings.
Databases often show Boolean operators as buttons or drop-down menus that you can click to combine your search terms or results.
The main Boolean operators are:
OR is used to find articles that mention either of the topics you search for.
AND is used to find articles that mention both of the searched topics.
NOT excludes a search term or concept. It should be used with caution as you may inadvertently exclude relevant references.
For example, searching for “self-esteem NOT eating disorders” finds articles that mention self-esteem but removes any articles that mention eating disorders.
Adjacency searching
Use adjacency operators to search by phrase or with two or more words in relation to one another. Adjacency searching commands differ among databases. Check the database help section if you are not sure which searching commands to use.
In Ovid Medline
"breast ADJ3 cancer" finds the word breast within two words of cancer, in any order.
This includes breast cancer or cancer of the breast.
Cited Reference Searching
Cited reference searching is a method to find articles that have been cited by other publications.
Use cited reference searching to:
- find keywords or terms you may need to include in your search strategy
- find pivotal papers in the same or similar subject area
- find pivotal authors in the same or similar subject area
- track how a topic has developed over time
Cited reference searching is available through these tools:
- Web of Science
- Google Scholar

Literature Reviews & Search Strategies
Overview of Search Strategies
There are many ways to find literature for your review, and we recommend that you use a combination of strategies - keeping in mind that you're going to be searching multiple times in a variety of ways, using different databases and resources. Searching the literature is not a straightforward, linear process - it's iterative (translation: you'll search multiple times, modifying your strategies as you go, and sometimes it'll be frustrating).
- Known Item Searching
- Citation Jumping
Some form of keyword search is the way most of us get at scholarly articles in a database - it's a great approach! Make sure you're familiar with these librarian strategies to get the most out of your searches.
Figuring out the best keywords for your research topic/question is a process - you'll start with one or a few words and then shift, adapt, and expand them as you start finding sources that describe the topic using other words. Your search terms are the bridge between known topics and the unknowns of your research question - so sometimes one specific word will be enough, sometimes you'll need several different words to describe a concept AND you'll need to connect that concept to a second (and/or third) concept.
The number and specificity of your search terms depend on your topic and the scope of your literature review.
Connect Keywords Using Boolean
Make the database work more.
Truncation
...uses the asterisk (*) to end a word at its root, allowing you to retrieve many more documents containing variations of the search term. Example: educat* will find educate, educates, education, educators, educating and more.
Phrase Searching
...is when you put quotations marks around two or more words, so that the database looks for those words in that exact order. Examples: "higher education," "public health" and "pharmaceutical industry."
Controlled Vocabulary
... is when you use the terms the database uses to describe what each article is about as search terms. Searching using controlled vocabularies is a great way to get at everything on a topic in a database.
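Putting these strategies together, an advanced search that combines phrase searching, truncation and Boolean operators (an illustrative sketch, not a search taken from this guide) might look like:

```
("higher education" OR universit* OR college*) AND "public health"
```

The OR'd terms in parentheses cover the variations of one concept, and AND narrows the results to records that also mention the second concept.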
Databases and search engines are probably going to bring back a lot of results - more than a human can realistically go through. Instead of trying to manually read and sort them all, use the filters in each database to remove the stuff you wouldn't use anyway (ie it's outside the scope of your project).
To make sure you're consistent between searches and databases, write down the filters you're using.
A Few Filters to Try
Once you know you have a good article, there are a lot of useful parts to it - far beyond the content.
Not sure where to start? Try course readings and other required materials.
Useful Parts of a Good Article
Ways to use citations.
- Interactive Tutorial: Searching Cited and Citing. Practice starting your search at an article and using the references to gather additional sources.

Your search results don't have to be frozen in the moment you search! There are a few things you can set up to keep your search going automatically.
Searching using subject headings is a comprehensive search strategy that requires some planning and topic knowledge. Work through this PubMed tutorial for an introduction to this important approach to searching.

Through these videos and the accompanying PDF, you'll see an example of starting with a potential research question and developing search terms through brainstorming and keyword searching.
- Slidedeck: Keywords and Advanced Search. PowerPoint slides to accompany the two demonstration videos on developing keywords from a question, and doing an advanced search.
- Open Access
- Published: 14 August 2018
Defining the process to literature searching in systematic reviews: a literature review of guidance and supporting studies
- Chris Cooper ORCID: orcid.org/0000-0003-0864-5607 1 ,
- Andrew Booth 2 ,
- Jo Varley-Campbell 1 ,
- Nicky Britten 3 &
- Ruth Garside 4
BMC Medical Research Methodology, volume 18, Article number: 85 (2018)
Abstract
Background
Systematic literature searching is recognised as a critical component of the systematic review process. It involves a systematic search for studies and aims for a transparent report of study identification, leaving readers clear about what was done to identify studies, and how the findings of the review are situated in the relevant evidence.
Information specialists and review teams appear to work from a shared and tacit model of the literature search process. How this tacit model has developed and evolved is unclear, and it has not been explicitly examined before.
The purpose of this review is to determine if a shared model of the literature searching process can be detected across systematic review guidance documents and, if so, how this process is reported in the guidance and supported by published studies.
Methods
A literature review.
Two types of literature were reviewed: guidance and published studies. Nine guidance documents were identified, including: The Cochrane and Campbell Handbooks. Published studies were identified through ‘pearl growing’, citation chasing, a search of PubMed using the systematic review methods filter, and the authors’ topic knowledge.
The relevant sections within each guidance document were then read and re-read, with the aim of determining key methodological stages. Methodological stages were identified and defined. This data was reviewed to identify agreements and areas of unique guidance between guidance documents. Consensus across multiple guidance documents was used to inform selection of ‘key stages’ in the process of literature searching.
Results
Eight key stages were determined relating specifically to literature searching in systematic reviews. They were: who should literature search, aims and purpose of literature searching, preparation, the search strategy, searching databases, supplementary searching, managing references and reporting the search process.
Conclusions
Eight key stages to the process of literature searching in systematic reviews were identified. These key stages are consistently reported in the nine guidance documents, suggesting consensus on the key stages of literature searching, and therefore the process of literature searching as a whole, in systematic reviews. Further research to determine the suitability of using the same process of literature searching for all types of systematic review is indicated.
Background
Systematic literature searching is recognised as a critical component of the systematic review process. It involves a systematic search for studies and aims for a transparent report of study identification, leaving review stakeholders clear about what was done to identify studies, and how the findings of the review are situated in the relevant evidence.
Information specialists and review teams appear to work from a shared and tacit model of the literature search process. How this tacit model has developed and evolved is unclear, and it has not been explicitly examined before. This is in contrast to the information science literature, which has developed information processing models as an explicit basis for dialogue and empirical testing. Without an explicit model, research in the process of systematic literature searching will remain immature and potentially uneven, and the development of shared information models will be assumed but never articulated.
One way of developing such a conceptual model is by formally examining the implicit “programme theory” as embodied in key methodological texts. The aim of this review is therefore to determine if a shared model of the literature searching process in systematic reviews can be detected across guidance documents and, if so, how this process is reported and supported.
Methods
Identifying guidance
Key texts (henceforth referred to as “guidance”) were identified based upon their accessibility to, and prominence within, United Kingdom systematic reviewing practice. The United Kingdom occupies a prominent position in the science of health information retrieval, as quantified by such objective measures as the authorship of papers, the number of Cochrane groups based in the UK, membership and leadership of groups such as the Cochrane Information Retrieval Methods Group, the HTA-I Information Specialists’ Group and historic association with such centres as the UK Cochrane Centre, the NHS Centre for Reviews and Dissemination, the Centre for Evidence Based Medicine and the National Institute for Clinical Excellence (NICE). Coupled with the linguistic dominance of English within medical and health science and the science of systematic reviews more generally, this offers a justification for a purposive sample that favours UK, European and Australian guidance documents.
Nine guidance documents were identified. These documents provide guidance for different types of reviews, namely: reviews of interventions, reviews of health technologies, reviews of qualitative research studies, reviews of social science topics, and reviews to inform guidance.
Whilst these guidance documents occasionally offer additional guidance on other types of systematic reviews, we have focused on the core and stated aims of these documents as they relate to literature searching. Table 1 sets out: the guidance document, the version audited, their core stated focus, and a bibliographical pointer to the main guidance relating to literature searching.
Once a list of key guidance documents was determined, it was checked by six senior information professionals based in the UK for relevance to current literature searching in systematic reviews.
Identifying supporting studies
In addition to identifying guidance, the authors sought to populate an evidence base of supporting studies (henceforth referred to as “studies”) that contribute to existing search practice. Studies were first identified by the authors from their knowledge on this topic area and, subsequently, through systematic citation chasing key studies (‘pearls’ [ 1 ]) located within each key stage of the search process. These studies are identified in Additional file 1 : Appendix Table 1. Citation chasing was conducted by analysing the bibliography of references for each study (backwards citation chasing) and through Google Scholar (forward citation chasing). A search of PubMed using the systematic review methods filter was undertaken in August 2017 (see Additional file 1 ). The search terms used were: (literature search*[Title/Abstract]) AND sysrev_methods[sb] and 586 results were returned. These results were sifted for relevance to the key stages in Fig. 1 by CC.

Fig. 1: The key stages of literature search guidance as identified from nine key texts
Extracting the data
To reveal the implicit process of literature searching within each guidance document, the relevant sections (chapters) on literature searching were read and re-read, with the aim of determining key methodological stages. We defined a key methodological stage as a distinct step in the overall process for which specific guidance is reported, and action is taken, that collectively would result in a completed literature search.
The chapter or section sub-heading for each methodological stage was extracted into a table using the exact language as reported in each guidance document. The lead author (CC) then read and re-read these data, and the paragraphs of the document to which the headings referred, summarising section details. This table was then reviewed, using comparison and contrast to identify agreements and areas of unique guidance. Consensus across multiple guidelines was used to inform selection of ‘key stages’ in the process of literature searching.
Having determined the key stages to literature searching, we then read and re-read the sections relating to literature searching again, extracting specific detail relating to the methodological process of literature searching within each key stage. Again, the guidance was then read and re-read, first on a document-by-document basis and, secondly, across all the documents above, to identify both commonalities and areas of unique guidance.
Results and discussion
Our findings.
We were able to identify consensus across the guidance on literature searching for systematic reviews suggesting a shared implicit model within the information retrieval community. Whilst the structure of the guidance varies between documents, the same key stages are reported, even where the core focus of each document is different. We were able to identify specific areas of unique guidance, where a document reported guidance not summarised in other documents, together with areas of consensus across guidance.
Unique guidance
Only one document provided guidance on the topic of when to stop searching [ 2 ]. This guidance from 2005 anticipates a topic of increasing importance with the current interest in time-limited (i.e. “rapid”) reviews. Quality assurance (or peer review) of literature searches was only covered in two guidance documents [ 3 , 4 ]. This topic has emerged as increasingly important as indicated by the development of the PRESS instrument [ 5 ]. Text mining was discussed in four guidance documents [ 4 , 6 , 7 , 8 ] where the automation of some manual review work may offer efficiencies in literature searching [ 8 ].
Agreement between guidance: Defining the key stages of literature searching
Where there was agreement on the process, we determined that this constituted a key stage in the process of literature searching to inform systematic reviews.
From the guidance, we determined eight key stages that relate specifically to literature searching in systematic reviews. These are summarised at Fig. 1 . The data extraction table to inform Fig. 1 is reported in Table 2 . Table 2 reports the areas of common agreement and it demonstrates that the language used to describe key stages and processes varies significantly between guidance documents.
For each key stage, we set out the specific guidance, followed by discussion on how this guidance is situated within the wider literature.
Key stage one: Deciding who should undertake the literature search
The guidance.
Eight documents provided guidance on who should undertake literature searching in systematic reviews [ 2 , 4 , 6 , 7 , 8 , 9 , 10 , 11 ]. The guidance affirms that people with relevant expertise of literature searching should ‘ideally’ be included within the review team [ 6 ]. Information specialists (or information scientists), librarians or trial search co-ordinators (TSCs) are indicated as appropriate researchers in six guidance documents [ 2 , 7 , 8 , 9 , 10 , 11 ].
How the guidance corresponds to the published studies
The guidance is consistent with studies that call for the involvement of information specialists and librarians in systematic reviews [ 12 , 13 , 14 , 15 , 16 , 17 , 18 , 19 , 20 , 21 , 22 , 23 , 24 , 25 , 26 ] and which demonstrate how their training as ‘expert searchers’ and ‘analysers and organisers of data’ can be put to good use [ 13 ] in a variety of roles [ 12 , 16 , 20 , 21 , 24 , 25 , 26 ]. These arguments make sense in the context of the aims and purposes of literature searching in systematic reviews, explored below. The need for ‘thorough’ and ‘replicable’ literature searches was fundamental to the guidance and recurs in key stage two. Studies have found poor reporting, and a lack of replicable literature searches, to be a weakness in systematic reviews [ 17 , 18 , 27 , 28 ] and they argue that involvement of information specialists/ librarians would be associated with better reporting and better quality literature searching. Indeed, Meert et al. [ 29 ] demonstrated that involving a librarian as a co-author to a systematic review correlated with a higher score in the literature searching component of a systematic review [ 29 ]. As ‘new styles’ of rapid and scoping reviews emerge, where decisions on how to search are more iterative and creative, a clear role is made here too [ 30 ].
Knowing where to search for studies was noted as important in the guidance, with no agreement as to the appropriate number of databases to be searched [ 2 , 6 ]. Database (and resource selection more broadly) is acknowledged as a relevant key skill of information specialists and librarians [ 12 , 15 , 16 , 31 ].
Whilst arguments for including information specialists and librarians in the process of systematic review might be considered self-evident, Koffel and Rethlefsen [ 31 ] have questioned if the necessary involvement is actually happening [ 31 ].
Key stage two: Determining the aim and purpose of a literature search
The aim: Five of the nine guidance documents use adjectives such as ‘thorough’, ‘comprehensive’, ‘transparent’ and ‘reproducible’ to define the aim of literature searching [ 6 , 7 , 8 , 9 , 10 ]. Analogous phrases were present in a further three guidance documents, namely: ‘to identify the best available evidence’ [ 4 ] or ‘the aim of the literature search is not to retrieve everything. It is to retrieve everything of relevance’ [ 2 ] or ‘A systematic literature search aims to identify all publications relevant to the particular research question’ [ 3 ]. The Joanna Briggs Institute reviewers’ manual was the only guidance document where a clear statement on the aim of literature searching could not be identified. The purpose of literature searching was defined in three guidance documents, namely to minimise bias in the resultant review [ 6 , 8 , 10 ]. Accordingly, eight of nine documents clearly asserted that thorough and comprehensive literature searches are required as a potential mechanism for minimising bias.
The need for thorough and comprehensive literature searches appears as uniform within the eight guidance documents that describe approaches to literature searching in systematic reviews of effectiveness. Reviews of effectiveness (of intervention or cost), accuracy and prognosis, require thorough and comprehensive literature searches to transparently produce a reliable estimate of intervention effect. The belief that all relevant studies have been ‘comprehensively’ identified, and that this process has been ‘transparently’ reported, increases confidence in the estimate of effect and the conclusions that can be drawn [ 32 ]. The supporting literature exploring the need for comprehensive literature searches focuses almost exclusively on reviews of intervention effectiveness and meta-analysis. Different ‘styles’ of review may have different standards however; the alternative, offered by purposive sampling, has been suggested in the specific context of qualitative evidence syntheses [ 33 ].
What is a comprehensive literature search?
Whilst the guidance calls for thorough and comprehensive literature searches, it lacks clarity on what constitutes a thorough and comprehensive literature search, beyond the implication that all of the literature search methods in Table 2 should be used to identify studies. Egger et al. [ 34 ], in an empirical study evaluating the importance of comprehensive literature searches for trials in systematic reviews, defined a comprehensive search for trials as:
a search not restricted to English language;
where Cochrane CENTRAL or at least two other electronic databases had been searched (such as MEDLINE or EMBASE); and
at least one of the following search methods has been used to identify unpublished trials: searches for (i) conference abstracts, (ii) theses, (iii) trials registers; and (iv) contacts with experts in the field [ 34 ].
Tricco et al. (2008) used a similar threshold of bibliographic database searching AND a supplementary search method in a review when examining the risk of bias in systematic reviews. Their criteria were: one database (limited using the Cochrane Highly Sensitive Search Strategy (HSSS)) and handsearching [ 35 ].
Together with the guidance, this would suggest that comprehensive literature searching requires the use of BOTH bibliographic database searching AND supplementary search methods.
Comprehensiveness in literature searching, in the sense of how much searching should be undertaken, remains unclear. Egger et al. recommend that ‘investigators should consider the type of literature search and degree of comprehension that is appropriate for the review in question, taking into account budget and time constraints’ [ 34 ]. This view tallies with the Cochrane Handbook, which stipulates clearly, that study identification should be undertaken ‘within resource limits’ [ 9 ]. This would suggest that the limitations to comprehension are recognised but it raises questions on how this is decided and reported [ 36 ].
What is the point of comprehensive literature searching?
The purpose of thorough and comprehensive literature searches is to avoid missing key studies and to minimize bias [ 6 , 8 , 10 , 34 , 37 , 38 , 39 ] since a systematic review based only on published (or easily accessible) studies may have an exaggerated effect size [ 35 ]. Felson (1992) sets out potential biases that could affect the estimate of effect in a meta-analysis [ 40 ] and Tricco et al. summarize the evidence concerning bias and confounding in systematic reviews [ 35 ]. Egger et al. point to non-publication of studies, publication bias, language bias and MEDLINE bias as key biases [ 34 , 35 , 40 , 41 , 42 , 43 , 44 , 45 , 46 ]. Comprehensive searches are not the sole factor to mitigate these biases but their contribution is thought to be significant [ 2 , 32 , 34 ]. Fehrmann (2011) suggests that ‘the search process being described in detail’, together with the application of standard comprehensive search techniques, increases confidence in the search results [ 32 ].
Does comprehensive literature searching work?
Egger et al., and other study authors, have demonstrated a change in the estimate of intervention effectiveness where relevant studies were excluded from meta-analysis [ 34 , 47 ]. This would suggest that missing studies in literature searching alters the reliability of effectiveness estimates. This is an argument for comprehensive literature searching. Conversely, Egger et al. found that ‘comprehensive’ searches still missed studies and that comprehensive searches could, in fact, introduce bias into a review rather than preventing it, through the identification of low quality studies then being included in the meta-analysis [ 34 ]. Studies query if identifying and including low quality or grey literature studies changes the estimate of effect [ 43 , 48 ] and question if time is better invested updating systematic reviews rather than searching for unpublished studies [ 49 ], or mapping studies for review as opposed to aiming for high sensitivity in literature searching [ 50 ].
Aim and purpose beyond reviews of effectiveness
The need for comprehensive literature searches is less certain in reviews of qualitative studies, and for reviews where a comprehensive identification of studies is difficult to achieve (for example, in public health) [ 33 , 51 , 52 , 53 , 54 , 55 ]. Literature searching for qualitative studies, and in public health topics, typically generates a greater number of studies to sift than in reviews of effectiveness [ 39 ] and demonstrating the ‘value’ of studies identified or missed is harder [ 56 ], since the study data do not typically support meta-analysis. Nussbaumer-Streit et al. (2016) have registered a review protocol to assess whether abbreviated literature searches (as opposed to comprehensive literature searches) have an impact on conclusions across multiple bodies of evidence, not only on effect estimates [ 57 ], which may develop this understanding. It may be that decision makers and users of systematic reviews are willing to trade the certainty from a comprehensive literature search and systematic review in exchange for different approaches to evidence synthesis [ 58 ], and that comprehensive literature searches are not necessarily a marker of literature search quality, as previously thought [ 36 ]. Different approaches to literature searching [ 37 , 38 , 59 , 60 , 61 , 62 ] and developing the concept of when to stop searching are important areas for further study [ 36 , 59 ].
The study by Nussbaumer-Streit et al. has been published since the submission of this literature review [ 63 ]. Nussbaumer-Streit et al. (2018) conclude that abbreviated literature searches are viable options for rapid evidence syntheses, if decision-makers are willing to trade the certainty from a comprehensive literature search and systematic review, but that decision-making which demands detailed scrutiny should still be based on comprehensive literature searches [ 63 ].
Key stage three: Preparing for the literature search
Six documents provided guidance on preparing for a literature search [ 2 , 3 , 6 , 7 , 9 , 10 ]. The Cochrane Handbook clearly stated that Cochrane authors (i.e. researchers) should seek advice from a trial search co-ordinator (i.e. a person with specific skills in literature searching) ‘before’ starting a literature search [ 9 ].
Two key tasks were perceptible in preparing for a literature search [ 2 , 6 , 7 , 10 , 11 ]. First, to determine if there are any existing or on-going reviews, or if a new review is justified [ 6 , 11 ]; and, secondly, to develop an initial literature search strategy to estimate the volume of relevant literature (and quality of a small sample of relevant studies [ 10 ]) and indicate the resources required for literature searching and the review of the studies that follows [ 7 , 10 ].
Three documents summarised guidance on where to search to determine if a new review was justified [ 2 , 6 , 11 ]. These focused on searching databases of systematic reviews (The Cochrane Database of Systematic Reviews (CDSR) and the Database of Abstracts of Reviews of Effects (DARE)), institutional registries (including PROSPERO), and MEDLINE [ 6 , 11 ]. It is worth noting, however, that as of 2015, DARE (and NHS EEDs) are no longer being updated and so the relevance of this (these) resource(s) will diminish over time [ 64 ]. One guidance document, ‘Systematic reviews in the Social Sciences’, noted, however, that databases are not the only source of information and unpublished reports, conference proceedings and grey literature may also be required, depending on the nature of the review question [ 2 ].
Two documents reported clearly that this preparation (or ‘scoping’) exercise should be undertaken before the actual search strategy is developed [ 7 , 10 ].
The guidance offers the best available source on preparing the literature search with the published studies not typically reporting how their scoping informed the development of their search strategies nor how their search approaches were developed. Text mining has been proposed as a technique to develop search strategies in the scoping stages of a review although this work is still exploratory [ 65 ]. ‘Clustering documents’ and word frequency analysis have also been tested to identify search terms and studies for review [ 66 , 67 ]. Preparing for literature searches and scoping constitutes an area for future research.
Key stage four: Designing the search strategy
The Population, Intervention, Comparator, Outcome (PICO) structure was the commonly reported structure promoted to design a literature search strategy. Five documents suggested that the eligibility criteria or review question will determine which concepts of PICO will be populated to develop the search strategy [ 1 , 4 , 7 , 8 , 9 ]. The NICE handbook promoted multiple structures, namely PICO, SPICE (Setting, Perspective, Intervention, Comparison, Evaluation) and multi-stranded approaches [ 4 ].
With the exclusion of The Joanna Briggs Institute reviewers’ manual, the guidance offered detail on selecting key search terms, synonyms, Boolean language, selecting database indexing terms and combining search terms. The CEE handbook suggested that ‘search terms may be compiled with the help of the commissioning organisation and stakeholders’ [ 10 ].
The use of limits, such as language or date limits, was discussed in all documents [ 2 , 3 , 4 , 6 , 7 , 8 , 9 , 10 , 11 ].
Search strategy structure
The guidance typically relates to reviews of intervention effectiveness so PICO – with its focus on intervention and comparator - is the dominant model used to structure literature search strategies [ 68 ]. PICOs – where the S denotes study design - is also commonly used in effectiveness reviews [ 6 , 68 ]. As the NICE handbook notes, alternative models to structure literature search strategies have been developed and tested. Booth provides an overview on formulating questions for evidence based practice [ 69 ] and has developed a number of alternatives to the PICO structure, namely: BeHEMoTh (Behaviour of interest; Health context; Exclusions; Models or Theories) for use when systematically identifying theory [ 55 ]; SPICE (Setting, Perspective, Intervention, Comparison, Evaluation) for identification of social science and evaluation studies [ 69 ] and, working with Cooke and colleagues, SPIDER (Sample, Phenomenon of Interest, Design, Evaluation, Research type) [ 70 ]. SPIDER has been compared to PICO and PICOs in a study by Methley et al. [ 68 ].
The NICE handbook also suggests the use of multi-stranded approaches to developing literature search strategies [ 4 ]. Glanville developed this idea in a study by Whiting et al. [ 71 ] and a worked example of this approach is included in the development of a search filter by Cooper et al. [ 72 ].
Writing search strategies: Conceptual and objective approaches
Hausner et al. [ 73 ] provide guidance on writing literature search strategies, delineating between conceptually and objectively derived approaches. The conceptual approach, advocated by and explained in the guidance documents, relies on the expertise of the literature searcher to identify key search terms and then develop key terms to include synonyms and controlled syntax. Hausner and colleagues set out the objective approach [ 73 ] and describe what may be done to validate it [ 74 ].
The use of limits
The guidance documents offer direction on the use of limits within a literature search. Limits can be used to focus literature searching to specific study designs or by other markers (such as by date) which limits the number of studies returned by a literature search. The use of limits should be described and the implications explored [ 34 ] since limiting literature searching can introduce bias (explored above). Craven et al. have suggested the use of a supporting narrative to explain decisions made in the process of developing literature searches and this advice would usefully capture decisions on the use of search limits [ 75 ].
Key stage five: Determining the process of literature searching and deciding where to search (bibliographic database searching)
Table 2 summarises the process of literature searching as reported in each guidance document. Searching bibliographic databases was consistently reported as the ‘first step’ to literature searching in all nine guidance documents.
Three documents reported specific guidance on where to search, in each case specific to the type of review their guidance informed, and as a minimum requirement [ 4 , 9 , 11 ]. Seven of the key guidance documents suggest that the selection of bibliographic databases depends on the topic of review [ 2 , 3 , 4 , 6 , 7 , 8 , 10 ], with two documents noting the absence of an agreed standard on what constitutes an acceptable number of databases searched [ 2 , 6 ].
The guidance documents summarise ‘how to’ search bibliographic databases in detail and this guidance is further contextualised above in terms of developing the search strategy. The documents provide guidance on selecting bibliographic databases, in some cases stating acceptable minima (i.e. The Cochrane Handbook states Cochrane CENTRAL, MEDLINE and EMBASE), and in other cases simply listing bibliographic databases available to search. Studies have explored the value in searching specific bibliographic databases, with Wright et al. (2015) noting the contribution of CINAHL in identifying qualitative studies [ 76 ], Beckles et al. (2013) questioning the contribution of CINAHL to identifying clinical studies for guideline development [ 77 ], and Cooper et al. (2015) exploring the role of UK-focused bibliographic databases to identify UK-relevant studies [ 78 ]. The host of the database (e.g. OVID or ProQuest) has been shown to alter the search returns offered. Younger and Boddy [ 79 ] report differing search returns from the same database (AMED) but where the ‘host’ was different [ 79 ].
The average number of bibliographic databases searched in systematic reviews has risen in the period 1994–2014 (from 1 to 4) [ 80 ] but there remains (as attested to by the guidance) no consensus on what constitutes an acceptable number of databases searched [ 48 ]. This is perhaps because thinking about the number of databases searched is the wrong question; researchers should focus on which databases were searched and why, and which databases were not searched and why. The discussion should re-orientate to the differential value of sources but researchers need to think about how to report this in studies to allow findings to be generalised. Bethel (2017) has proposed ‘search summaries’, completed by the literature searcher, to record where included studies were identified, whether from database (and which databases specifically) or supplementary search methods [ 81 ]. Search summaries document both yield and accuracy of searches, which could prospectively inform resource use and decisions to search or not to search specific databases in topic areas. The prospective use of such data presupposes, however, that past searches are a potential predictor of future search performance (i.e. that each topic is to be considered representative and not unique). In offering a body of practice, this data would be of greater practicable use than current studies which are considered as little more than individual case studies [ 82 , 83 , 84 , 85 , 86 , 87 , 88 , 89 , 90 ].
When to database search is another question posed in the literature. Beyer et al. [ 91 ] report that databases can be prioritised for literature searching which, whilst not addressing the question of which databases to search, may at least bring clarity as to which databases to search first [ 91 ]. Paradoxically, this links to studies that suggest PubMed should be searched in addition to MEDLINE (OVID interface) since this improves the currency of systematic reviews [ 92 , 93 ]. Cooper et al. (2017) have tested the idea of database searching not as a primary search method (as suggested in the guidance) but as a supplementary search method in order to manage the volume of studies identified for an environmental effectiveness systematic review. Their case study compared the effectiveness of database searching versus a protocol using supplementary search methods and found that the latter identified more relevant studies for review than searching bibliographic databases [ 94 ].
Key stage six: Determining the process of literature searching and deciding where to search (supplementary search methods)
Table 2 also summarises the process of literature searching which follows bibliographic database searching. As Table 2 sets out, guidance that supplementary literature search methods should be used in systematic reviews recurs across documents, but the order in which these methods are used, and the extent to which they are used, varies. We noted inconsistency in the labelling of supplementary search methods between guidance documents.
Rather than focus on the guidance on how to use the methods (which has been summarised in a recent review [ 95 ]), we focus on the aim or purpose of supplementary search methods.
The Cochrane Handbook reported that ‘efforts’ to identify unpublished studies should be made [ 9 ]. Four guidance documents [ 2 , 3 , 6 , 9 ] acknowledged that searching beyond bibliographic databases was necessary since ‘databases are not the only source of literature’ [ 2 ]. Only one document reported any guidance on determining when to use supplementary methods. The IQWiG handbook reported that the use of handsearching (in their example) could be determined on a ‘case-by-case basis’ which implies that the use of these methods is optional rather than mandatory. This is in contrast to the guidance (above) on bibliographic database searching.
The issue for supplementary search methods is similar in many ways to the issue of searching bibliographic databases: demonstrating value. The purpose and contribution of supplementary search methods in systematic reviews is increasingly acknowledged [ 37 , 61 , 62 , 96 , 97 , 98 , 99 , 100 , 101 ] but understanding the value of the search methods to identify studies and data is unclear. In a recently published review, Cooper et al. (2017) reviewed the literature on supplementary search methods looking to determine the advantages, disadvantages and resource implications of using supplementary search methods [ 95 ]. This review also summarises the key guidance and empirical studies and seeks to address the question on when to use these search methods and when not to [ 95 ]. The guidance is limited in this regard and, as Table 2 demonstrates, offers conflicting advice on the order of searching, and the extent to which these search methods should be used in systematic reviews.
Key stage seven: Managing the references
Five of the documents provided guidance on managing references, for example downloading, de-duplicating and managing the output of literature searches [ 2 , 4 , 6 , 8 , 10 ]. This guidance typically itemised available bibliographic management tools rather than offering guidance on how to use them specifically [ 2 , 4 , 6 , 8 ]. The CEE handbook provided guidance on importing data where no direct export option is available (e.g. web-searching) [ 10 ].
The literature on using bibliographic management tools is not large relative to the number of ‘how to’ videos on platforms such as YouTube (see for example [ 102 ]). These YouTube videos confirm the overall lack of ‘how to’ guidance identified in this study and offer useful instruction on managing references. Bramer et al. set out methods for de-duplicating data and reviewing references in Endnote [ 103 , 104 ] and Gall tests the direct search function within Endnote to access databases such as PubMed, finding a number of limitations [ 105 ]. Coar et al. and Ahmed et al. consider the role of the free-source tool, Zotero [ 106 , 107 ]. Managing references is a key administrative function in the process of review particularly for documenting searches in PRISMA guidance.
Key stage eight: Documenting the search
The Cochrane Handbook was the only guidance document to recommend a specific reporting guideline: Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) [9]. Six documents provided guidance on reporting the process of literature searching, with specific criteria to report [3, 4, 6, 8, 9, 10]. There was consensus on reporting the databases searched (and the host platform used), the search strategies used, and any limits applied (e.g. date, language or search filters; the CRD handbook called for these limits to be justified [6]). Three guidance documents reported that the number of studies identified should be recorded [3, 6, 10]. The number of duplicates identified [10], the screening decisions [3], a comprehensive list of grey literature sources searched (and full detail for other supplementary search methods) [8], and an annotation of search terms tested but not used [4] were identified as unique items across four documents.
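By way of illustration only (the values below are hypothetical and are not drawn from the guidance documents), a search record covering these consensus and unique items might look like this:

Database searched: MEDLINE (host: Ovid)
Date of search: 15 June 2018
Search strategy: reported in full in an appendix
Limits applied: English language; 2000 to 2018 (justification recorded in the protocol)
Search filters used: a published RCT filter
Records retrieved: 1,234 (of which 210 duplicates)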
The Cochrane Handbook was the only guidance document to note that the full search strategies for each database should be included in an appendix or supplementary file of the review [9].
All guidance documents should ultimately deliver completed systematic reviews that fulfil the requirements of the PRISMA reporting guidelines [108]. The guidance broadly requires the reporting of data that corresponds with the requirements of the PRISMA statement, although documents typically ask for diverse and additional items [108]. In 2008, Sampson et al. observed a lack of consensus on reporting search methods in systematic reviews [109], and this remained the case as of 2017, as evidenced in the guidance documents, in spite of the publication of the PRISMA guidelines in 2009 [110]. It is unclear why the collective guidance does not more explicitly endorse adherence to the PRISMA guidelines.
Reporting of literature searching is a key area in systematic reviews since it sets out clearly what was done and, therefore, how far the conclusions of the review can be believed [52, 109]. Despite strong endorsement in the guidance documents, and specific support in PRISMA and other related reporting standards (such as ENTREQ for qualitative evidence synthesis and STROBE for reviews of observational studies), authors continue to highlight the prevalence of poor standards of literature search reporting [31, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119]. To explore the issues authors experience in reporting literature searches, and the uptake of PRISMA, Rader et al. [120] surveyed over 260 review authors to determine common problems; their work summarises the practical aspects of reporting literature searching [120]. Atkinson et al. [121] have also analysed reporting standards for literature searching, summarising recommendations and gaps in the reporting of search strategies [121].
One area that is less well covered by the guidance, but which nevertheless appears in this literature, is the quality appraisal or peer review of literature search strategies. The PRESS checklist is the most prominent example; it provides an evidence-based guideline for the peer review of electronic search strategies [5, 122, 123]. A corresponding guideline for the documentation of supplementary search methods does not yet exist, although this idea is currently being explored.
How the reporting of the literature searching process corresponds to critical appraisal tools is an area for further research. In the survey undertaken by Rader et al. (2014), 86% of respondents (153/178) identified a need for further guidance on which aspects of the literature search process to report [120]. The PRISMA statement offers a brief summary of what to report but little practical guidance on how to report it [108]. Critical appraisal tools for systematic reviews, such as AMSTAR 2 (Shea et al. [124]) and ROBIS (Whiting et al. [125]), can usefully be read alongside the PRISMA guidance, since they offer greater detail on how the reporting of the literature search will be appraised and, therefore, a proxy for what to report [124, 125]. Further research comparing PRISMA with quality appraisal checklists for systematic reviews would begin to address the call, identified by Rader et al., for further guidance on what to report [120].

Limitations
Other handbooks exist.
A potential limitation of this literature review is its focus on guidance produced in Europe (specifically the UK) and Australia. We justify our selection of the nine guidance documents reviewed in this literature review in the section "Identifying guidance". In brief, these nine guidance documents were selected as the health care guidance most relevant to UK systematic reviewing practice, given that the UK occupies a prominent position in the science of health information retrieval. We acknowledge the existence of other guidance documents, such as those from North America (e.g. the Agency for Healthcare Research and Quality (AHRQ) [126], the Institute of Medicine [127] and the guidance and resources produced by the Canadian Agency for Drugs and Technologies in Health (CADTH) [128]). We comment further on this directly below.
The handbooks are potentially linked to one another
What is not clear is the extent to which the guidance documents inter-relate or provide guidance independently of one another. The Cochrane Handbook, first published in 1994, is notably a key source of reference in guidance and in systematic reviews beyond Cochrane reviews. It is also not clear to what extent broadening the sample to include North American handbooks, and guidance handbooks from other relevant countries, would alter the findings of this literature review or develop further support for the process model. Since we cannot be certain, we raise this as a potential limitation of this literature review. However, on our initial review of a sample of North American and other guidance documents (undertaken before selecting the guidance documents considered in this review), we do not consider that including these further handbooks would significantly alter the findings of this literature review.
This is a literature review
A further limitation of this review is that the review of published studies is not a systematic review of the evidence for each key stage. It is possible that other relevant studies could contribute to the exploration and development of the key stages identified in this review.
Conclusions
This literature review would appear to demonstrate the existence of a shared model of the literature searching process in systematic reviews. We call this model 'the conventional approach', since it appears to be the common convention across nine different guidance documents.
The findings reported above reveal eight key stages in the process of literature searching for systematic reviews. These key stages are consistently reported in the nine guidance documents which suggests consensus on the key stages of literature searching, and therefore the process of literature searching as a whole, in systematic reviews.
In Table 2, we demonstrate consensus regarding the application of literature search methods. All guidance documents distinguish between primary and supplementary search methods, and bibliographic database searching is consistently the first method of literature searching referenced in each guidance document. Whilst the guidance uniformly supports the use of supplementary search methods, there is little evidence of a consistent process, with guidance diverging across documents. This may reflect differences in the core focus of each document, linked, for instance, to differences in identifying effectiveness studies or qualitative studies.
Eight of the nine guidance documents reported on the aims of literature searching. The shared understanding was that literature searching should be thorough and comprehensive in its aim, and that the process should be reported transparently so that it could be reproduced. Whilst only three documents explicitly link this understanding to minimising bias, comprehensive literature searching is implicitly linked to 'not missing relevant studies', which amounts to the same point.
Defining the key stages in this review helps categorise the available scholarship and prioritises areas for development or further study. The supporting studies on preparing for literature searching (key stage three, 'preparation') were, for example, comparatively few, and yet this key stage represents a decisive moment in literature searching for systematic reviews: it is where the structure of the search strategy is determined, search terms are chosen or discarded, and the resources to be searched are selected. Information specialists, librarians and researchers are well placed to develop these and other areas within the key stages we identify.
This review calls for further research to determine the suitability of the conventional approach. The publication dates of the guidance documents that underpin the conventional approach may raise questions as to whether the process they report remains valid for current systematic literature searching. In addition, it may be useful to test whether it is desirable to use the same process model of literature searching for qualitative evidence synthesis as for reviews of intervention effectiveness, which this literature review demonstrates is presently recommended best practice.
Abbreviations
BeHEMoTh: Behaviour of interest; Health context; Exclusions; Models or Theories
CDSR: Cochrane Database of Systematic Reviews
CENTRAL: The Cochrane Central Register of Controlled Trials
DARE: Database of Abstracts of Reviews of Effects
ENTREQ: Enhancing transparency in reporting the synthesis of qualitative research
IQWiG: Institute for Quality and Efficiency in Healthcare
NICE: National Institute for Clinical Excellence
PICO: Population, Intervention, Comparator, Outcome
PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses
SPICE: Setting, Perspective, Intervention, Comparison, Evaluation
SPIDER: Sample, Phenomenon of Interest, Design, Evaluation, Research type
STROBE: STrengthening the Reporting of OBservational studies in Epidemiology
TSC: Trial Search Co-ordinators
References
Booth A. Unpacking your literature search toolbox: on search styles and tactics. Health Information & Libraries Journal. 2008;25(4):313–7.
Petticrew M, Roberts H. Systematic reviews in the social sciences: a practical guide. Oxford: Blackwell Publishing Ltd; 2006.
Institute for Quality and Efficiency in Health Care (IQWiG). IQWiG Methods Resources. 7 Information retrieval. 2014. Available from: https://www.ncbi.nlm.nih.gov/books/NBK385787/.
NICE: National Institute for Health and Care Excellence. Developing NICE guidelines: the manual 2014. Available from: https://www.nice.org.uk/media/default/about/what-we-do/our-programmes/developing-nice-guidelines-the-manual.pdf .
Sampson M, McGowan J, Lefebvre C, Moher D, Grimshaw J. PRESS: Peer Review of Electronic Search Strategies. 2008.
Centre for Reviews & Dissemination. Systematic reviews – CRD’s guidance for undertaking reviews in healthcare. York: Centre for Reviews and Dissemination, University of York; 2009.
EUnetHTA: European Network for Health Technology Assessment. Process of information retrieval for systematic reviews and health technology assessments on clinical effectiveness. 2016. Available from: http://www.eunethta.eu/sites/default/files/Guideline_Information_Retrieval_V1-1.pdf.
Kugley S, Wasiak A, Thomas J, Mahood Q, Jørgensen AMK, Hammerstrøm K, Sathe N. Searching for studies: a guide to information retrieval for Campbell systematic reviews. Oslo: Campbell Collaboration; 2017. Available from: https://www.campbellcollaboration.org/library/searching-for-studies-information-retrieval-guide-campbell-reviews.html
Lefebvre C, Manheimer E, Glanville J. Chapter 6: Searching for studies. In: Higgins JPT, Green S, editors. Cochrane Handbook for Systematic Reviews of Interventions; 2011.
Collaboration for Environmental Evidence. Guidelines for systematic review and evidence synthesis in environmental management. Environmental Evidence; 2013. Available from: http://www.environmentalevidence.org/wp-content/uploads/2017/01/Review-guidelines-version-4.2-final-update.pdf.
The Joanna Briggs Institute. Joanna Briggs Institute reviewers' manual. 2014 ed. The Joanna Briggs Institute; 2014. Available from: https://joannabriggs.org/assets/docs/sumari/ReviewersManual-2014.pdf
Beverley CA, Booth A, Bath PA. The role of the information specialist in the systematic review process: a health information case study. Health Inf Libr J. 2003;20(2):65–74.
Harris MR. The librarian's roles in the systematic review process: a case study. Journal of the Medical Library Association. 2005;93(1):81–7.
Egger JB. Use of recommended search strategies in systematic reviews and the impact of librarian involvement: a cross-sectional survey of recent authors. PLoS One. 2015;10(5):e0125931.
Li L, Tian J, Tian H, Moher D, Liang F, Jiang T, et al. Network meta-analyses could be improved by searching more sources and by involving a librarian. J Clin Epidemiol. 2014;67(9):1001–7.
McGowan J, Sampson M. Systematic reviews need systematic searchers. J Med Libr Assoc. 2005;93(1):74–80.
Rethlefsen ML, Farrell AM, Osterhaus Trzasko LC, Brigham TJ. Librarian co-authors correlated with higher quality reported search strategies in general internal medicine systematic reviews. J Clin Epidemiol. 2015;68(6):617–26.
Weller AC. Mounting evidence that librarians are essential for comprehensive literature searches for meta-analyses and Cochrane reports. J Med Libr Assoc. 2004;92(2):163–4.
Swinkels A, Briddon J, Hall J. Two physiotherapists, one librarian and a systematic literature review: collaboration in action. Health Info Libr J. 2006;23(4):248–56.
Foster M. An overview of the role of librarians in systematic reviews: from expert search to project manager. EAHIL. 2015;11(3):3–7.
Lawson L. Operating outside library walls. 2004.
Vassar M, Yerokhin V, Sinnett PM, Weiher M, Muckelrath H, Carr B, et al. Database selection in systematic reviews: an insight through clinical neurology. Health Inf Libr J. 2017;34(2):156–64.
Townsend WA, Anderson PF, Ginier EC, MacEachern MP, Saylor KM, Shipman BL, et al. A competency framework for librarians involved in systematic reviews. Journal of the Medical Library Association : JMLA. 2017;105(3):268–75.
Cooper ID, Crum JA. New activities and changing roles of health sciences librarians: a systematic review, 1990-2012. Journal of the Medical Library Association : JMLA. 2013;101(4):268–77.
Crum JA, Cooper ID. Emerging roles for biomedical librarians: a survey of current practice, challenges, and changes. Journal of the Medical Library Association : JMLA. 2013;101(4):278–86.
Dudden RF, Protzko SL. The systematic review team: contributions of the health sciences librarian. Med Ref Serv Q. 2011;30(3):301–15.
Golder S, Loke Y, McIntosh HM. Poor reporting and inadequate searches were apparent in systematic reviews of adverse effects. J Clin Epidemiol. 2008;61(5):440–8.
Maggio LA, Tannery NH, Kanter SL. Reproducibility of literature search reporting in medical education reviews. Academic medicine : journal of the Association of American Medical Colleges. 2011;86(8):1049–54.
Meert D, Torabi N, Costella J. Impact of librarians on reporting of the literature searching component of pediatric systematic reviews. Journal of the Medical Library Association : JMLA. 2016;104(4):267–77.
Morris M, Boruff JT, Gore GC. Scoping reviews: establishing the role of the librarian. Journal of the Medical Library Association : JMLA. 2016;104(4):346–54.
Koffel JB, Rethlefsen ML. Reproducibility of search strategies is poor in systematic reviews published in high-impact pediatrics, cardiology and surgery journals: a cross-sectional study. PLoS One. 2016;11(9):e0163309.
Fehrmann P, Thomas J. Comprehensive computer searches and reporting in systematic reviews. Research Synthesis Methods. 2011;2(1):15–32.
Booth A. Searching for qualitative research for inclusion in systematic reviews: a structured methodological review. Systematic Reviews. 2016;5(1):74.
Egger M, Juni P, Bartlett C, Holenstein F, Sterne J. How important are comprehensive literature searches and the assessment of trial quality in systematic reviews? Empirical study. Health technology assessment (Winchester, England). 2003;7(1):1–76.
Tricco AC, Tetzlaff J, Sampson M, Fergusson D, Cogo E, Horsley T, et al. Few systematic reviews exist documenting the extent of bias: a systematic review. J Clin Epidemiol. 2008;61(5):422–34.
Booth A. How much searching is enough? Comprehensive versus optimal retrieval for technology assessments. Int J Technol Assess Health Care. 2010;26(4):431–5.
Papaioannou D, Sutton A, Carroll C, Booth A, Wong R. Literature searching for social science systematic reviews: consideration of a range of search techniques. Health Inf Libr J. 2010;27(2):114–22.
Petticrew M. Time to rethink the systematic review catechism? Moving from ‘what works’ to ‘what happens’. Systematic Reviews. 2015;4(1):36.
Betrán AP, Say L, Gülmezoglu AM, Allen T, Hampson L. Effectiveness of different databases in identifying studies for systematic reviews: experience from the WHO systematic review of maternal morbidity and mortality. BMC Med Res Methodol. 2005;5
Felson DT. Bias in meta-analytic research. J Clin Epidemiol. 1992;45(8):885–92.
Franco A, Malhotra N, Simonovits G. Publication bias in the social sciences: unlocking the file drawer. Science. 2014;345(6203):1502–5.
Hartling L, Featherstone R, Nuspl M, Shave K, Dryden DM, Vandermeer B. Grey literature in systematic reviews: a cross-sectional study of the contribution of non-English reports, unpublished studies and dissertations to the results of meta-analyses in child-relevant reviews. BMC Med Res Methodol. 2017;17(1):64.
Schmucker CM, Blümle A, Schell LK, Schwarzer G, Oeller P, Cabrera L, et al. Systematic review finds that study data not published in full text articles have unclear impact on meta-analyses results in medical research. PLoS One. 2017;12(4):e0176210.
Egger M, Zellweger-Zahner T, Schneider M, Junker C, Lengeler C, Antes G. Language bias in randomised controlled trials published in English and German. Lancet (London, England). 1997;350(9074):326–9.
Moher D, Pham B, Lawson ML, Klassen TP. The inclusion of reports of randomised trials published in languages other than English in systematic reviews. Health technology assessment (Winchester, England). 2003;7(41):1–90.
Pham B, Klassen TP, Lawson ML, Moher D. Language of publication restrictions in systematic reviews gave different results depending on whether the intervention was conventional or complementary. J Clin Epidemiol. 2005;58(8):769–76.
Mills EJ, Kanters S, Thorlund K, Chaimani A, Veroniki A-A, Ioannidis JPA. The effects of excluding treatments from network meta-analyses: survey. BMJ : British Medical Journal. 2013;347
Hartling L, Featherstone R, Nuspl M, Shave K, Dryden DM, Vandermeer B. The contribution of databases to the results of systematic reviews: a cross-sectional study. BMC Med Res Methodol. 2016;16(1):127.
van Driel ML, De Sutter A, De Maeseneer J, Christiaens T. Searching for unpublished trials in Cochrane reviews may not be worth the effort. J Clin Epidemiol. 2009;62(8):838–44.e3.
Buchberger B, Krabbe L, Lux B, Mattivi JT. Evidence mapping for decision making: feasibility versus accuracy - when to abandon high sensitivity in electronic searches. German medical science : GMS e-journal. 2016;14:Doc09.
Lorenc T, Pearson M, Jamal F, Cooper C, Garside R. The role of systematic reviews of qualitative evidence in evaluating interventions: a case study. Research Synthesis Methods. 2012;3(1):1–10.
Gough D. Weight of evidence: a framework for the appraisal of the quality and relevance of evidence. Res Pap Educ. 2007;22(2):213–28.
Barroso J, Gollop CJ, Sandelowski M, Meynell J, Pearce PF, Collins LJ. The challenges of searching for and retrieving qualitative studies. West J Nurs Res. 2003;25(2):153–78.
Britten N, Garside R, Pope C, Frost J, Cooper C. Asking more of qualitative synthesis: a response to Sally Thorne. Qual Health Res. 2017;27(9):1370–6.
Booth A, Carroll C. Systematic searching for theory to inform systematic reviews: is it feasible? Is it desirable? Health Info Libr J. 2015;32(3):220–35.
Kwon Y, Powelson SE, Wong H, Ghali WA, Conly JM. An assessment of the efficacy of searching in biomedical databases beyond MEDLINE in identifying studies for a systematic review on ward closures as an infection control intervention to control outbreaks. Syst Rev. 2014;3:135.
Nussbaumer-Streit B, Klerings I, Wagner G, Titscher V, Gartlehner G. Assessing the validity of abbreviated literature searches for rapid reviews: protocol of a non-inferiority and meta-epidemiologic study. Systematic Reviews. 2016;5:197.
Wagner G, Nussbaumer-Streit B, Greimel J, Ciapponi A, Gartlehner G. Trading certainty for speed - how much uncertainty are decisionmakers and guideline developers willing to accept when using rapid reviews: an international survey. BMC Med Res Methodol. 2017;17(1):121.
Ogilvie D, Hamilton V, Egan M, Petticrew M. Systematic reviews of health effects of social interventions: 1. Finding the evidence: how far should you go? J Epidemiol Community Health. 2005;59(9):804–8.
Royle P, Milne R. Literature searching for randomized controlled trials used in Cochrane reviews: rapid versus exhaustive searches. Int J Technol Assess Health Care. 2003;19(4):591–603.
Pearson M, Moxham T, Ashton K. Effectiveness of search strategies for qualitative research about barriers and facilitators of program delivery. Eval Health Prof. 2011;34(3):297–308.
Levay P, Raynor M, Tuvey D. The contributions of MEDLINE, other bibliographic databases and various search techniques to NICE public health guidance. 2015;10(1):19.
Nussbaumer-Streit B, Klerings I, Wagner G, Heise TL, Dobrescu AI, Armijo-Olivo S, et al. Abbreviated literature searches were viable alternatives to comprehensive searches: a meta-epidemiological study. J Clin Epidemiol. 2018;102:1–11.
Briscoe S, Cooper C, Glanville J, Lefebvre C. The loss of the NHS EED and DARE databases and the effect on evidence synthesis and evaluation. Res Synth Methods. 2017;8(3):256–7.
Stansfield C, O'Mara-Eves A, Thomas J. Text mining for search term development in systematic reviewing: a discussion of some methods and challenges. Research Synthesis Methods. 2017.
Petrova M, Sutcliffe P, Fulford KW, Dale J. Search terms and a validated brief search filter to retrieve publications on health-related values in Medline: a word frequency analysis study. Journal of the American Medical Informatics Association : JAMIA. 2012;19(3):479–88.
Stansfield C, Thomas J, Kavanagh J. 'Clustering' documents automatically to support scoping reviews of research: a case study. Res Synth Methods. 2013;4(3):230–41.
Methley AM, Campbell S, Chew-Graham C, McNally R, Cheraghi-Sohi S. PICO, PICOS and SPIDER: a comparison study of specificity and sensitivity in three search tools for qualitative systematic reviews. BMC Health Serv Res. 2014;14:579.
Booth A. Clear and present questions: formulating questions for evidence based practice. Library Hi Tech. 2006;24(3):355–68.
Cooke A, Smith D, Booth A. Beyond PICO: the SPIDER tool for qualitative evidence synthesis. Qual Health Res. 2012;22(10):1435–43.
Whiting P, Westwood M, Bojke L, Palmer S, Richardson G, Cooper J, et al. Clinical effectiveness and cost-effectiveness of tests for the diagnosis and investigation of urinary tract infection in children: a systematic review and economic model. Health technology assessment (Winchester, England). 2006;10(36):iii-iv, xi-xiii, 1–154.
Cooper C, Levay P, Lorenc T, Craig GM. A population search filter for hard-to-reach populations increased search efficiency for a systematic review. J Clin Epidemiol. 2014;67(5):554–9.
Hausner E, Waffenschmidt S, Kaiser T, Simon M. Routine development of objectively derived search strategies. Systematic Reviews. 2012;1(1):19.
Hausner E, Guddat C, Hermanns T, Lampert U, Waffenschmidt S. Prospective comparison of search strategies for systematic reviews: an objective approach yielded higher sensitivity than a conceptual one. J Clin Epidemiol. 2016;77:118–24.
Craven J, Levay P. Recording database searches for systematic reviews - what is the value of adding a narrative to peer-review checklists? A case study of nice interventional procedures guidance. Evid Based Libr Inf Pract. 2011;6(4):72–87.
Wright K, Golder S, Lewis-Light K. What value is the CINAHL database when searching for systematic reviews of qualitative studies? Syst Rev. 2015;4:104.
Beckles Z, Glover S, Ashe J, Stockton S, Boynton J, Lai R, et al. Searching CINAHL did not add value to clinical questions posed in NICE guidelines. J Clin Epidemiol. 2013;66(9):1051–7.
Cooper C, Rogers M, Bethel A, Briscoe S, Lowe J. A mapping review of the literature on UK-focused health and social care databases. Health Inf Libr J. 2015;32(1):5–22.
Younger P, Boddy K. When is a search not a search? A comparison of searching the AMED complementary health database via EBSCOhost, OVID and DIALOG. Health Inf Libr J. 2009;26(2):126–35.
Lam MT, McDiarmid M. Increasing number of databases searched in systematic reviews and meta-analyses between 1994 and 2014. Journal of the Medical Library Association : JMLA. 2016;104(4):284–9.
Bethel A. Search summary tables for systematic reviews: results and findings. Paper presented at: HLC Conference; 2017.
Aagaard T, Lund H, Juhl C. Optimizing literature search in systematic reviews - are MEDLINE, EMBASE and CENTRAL enough for identifying effect studies within the area of musculoskeletal disorders? BMC Med Res Methodol. 2016;16(1):161.
Adams CE, Frederick K. An investigation of the adequacy of MEDLINE searches for randomized controlled trials (RCTs) of the effects of mental health care. Psychol Med. 1994;24(3):741–8.
Kelly L, St Pierre-Hansen N. So many databases, such little clarity: searching the literature for the topic aboriginal. Canadian family physician Medecin de famille canadien. 2008;54(11):1572–3.
Lawrence DW. What is lost when searching only one literature database for articles relevant to injury prevention and safety promotion? Injury Prevention. 2008;14(6):401–4.
Lemeshow AR, Blum RE, Berlin JA, Stoto MA, Colditz GA. Searching one or two databases was insufficient for meta-analysis of observational studies. J Clin Epidemiol. 2005;58(9):867–73.
Sampson M, Barrowman NJ, Moher D, Klassen TP, Pham B, Platt R, et al. Should meta-analysts search Embase in addition to Medline? J Clin Epidemiol. 2003;56(10):943–55.
Stevinson C, Lawlor DA. Searching multiple databases for systematic reviews: added value or diminishing returns? Complementary Therapies in Medicine. 2004;12(4):228–32.
Suarez-Almazor ME, Belseck E, Homik J, Dorgan M, Ramos-Remus C. Identifying clinical trials in the medical literature with electronic databases: MEDLINE alone is not enough. Control Clin Trials. 2000;21(5):476–87.
Taylor B, Wylie E, Dempster M, Donnelly M. Systematically retrieving research: a case study evaluating seven databases. Res Soc Work Pract. 2007;17(6):697–706.
Beyer FR, Wright K. Can we prioritise which databases to search? A case study using a systematic review of frozen shoulder management. Health Info Libr J. 2013;30(1):49–58.
Duffy S, de Kock S, Misso K, Noake C, Ross J, Stirk L. Supplementary searches of PubMed to improve currency of MEDLINE and MEDLINE in-process searches via Ovid. Journal of the Medical Library Association : JMLA. 2016;104(4):309–12.
Katchamart W, Faulkner A, Feldman B, Tomlinson G, Bombardier C. PubMed had a higher sensitivity than Ovid-MEDLINE in the search for systematic reviews. J Clin Epidemiol. 2011;64(7):805–7.
Cooper C, Lovell R, Husk K, Booth A, Garside R. Supplementary search methods were more effective and offered better value than bibliographic database searching: a case study from public health and environmental enhancement (in press). Research Synthesis Methods. 2017.
Cooper C, Booth A, Britten N, Garside R. A comparison of results of empirical studies of supplementary search techniques and recommendations in review methodology handbooks: a methodological review (in press). Systematic Reviews. 2017.
Greenhalgh T, Peacock R. Effectiveness and efficiency of search methods in systematic reviews of complex evidence: audit of primary sources. BMJ (Clinical research ed). 2005;331(7524):1064–5.
Hinde S, Spackman E. Bidirectional citation searching to completion: an exploration of literature searching methods. PharmacoEconomics. 2015;33(1):5–11.
Levay P, Ainsworth N, Kettle R, Morgan A. Identifying evidence for public health guidance: a comparison of citation searching with web of science and Google scholar. Res Synth Methods. 2016;7(1):34–45.
McManus RJ, Wilson S, Delaney BC, Fitzmaurice DA, Hyde CJ, Tobias RS, et al. Review of the usefulness of contacting other experts when conducting a literature search for systematic reviews. BMJ (Clinical research ed). 1998;317(7172):1562–3.
Westphal A, Kriston L, Holzel LP, Harter M, von Wolff A. Efficiency and contribution of strategies for finding randomized controlled trials: a case study from a systematic review on therapeutic interventions of chronic depression. Journal of public health research. 2014;3(2):177.
Matthews EJ, Edwards AG, Barker J, Bloor M, Covey J, Hood K, et al. Efficient literature searching in diffuse topics: lessons from a systematic review of research on communicating risk to patients in primary care. Health Libr Rev. 1999;16(2):112–20.
Bethel A. EndNote training (YouTube videos). 2017. Available from: http://medicine.exeter.ac.uk/esmi/workstreams/informationscience/is_resources,_guidance_&_advice/.
Bramer WM, Giustini D, de Jonge GB, Holland L, Bekhuis T. De-duplication of database search results for systematic reviews in EndNote. Journal of the Medical Library Association : JMLA. 2016;104(3):240–3.
Bramer WM, Milic J, Mast F. Reviewing retrieved references for inclusion in systematic reviews using EndNote. Journal of the Medical Library Association : JMLA. 2017;105(1):84–7.
Gall C, Brahmi FA. Retrieval comparison of EndNote to search MEDLINE (Ovid and PubMed) versus searching them directly. Medical reference services quarterly. 2004;23(3):25–32.
Ahmed KK, Al Dhubaib BE. Zotero: a bibliographic assistant to researcher. J Pharmacol Pharmacother. 2011;2(4):303–5.
Coar JT, Sewell JP. Zotero: harnessing the power of a personal bibliographic manager. Nurse Educ. 2010;35(5):205–7.
Moher D, Liberati A, Tetzlaff J, Altman DG, The PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med. 2009;6(7):e1000097.
Sampson M, McGowan J, Tetzlaff J, Cogo E, Moher D. No consensus exists on search reporting methods for systematic reviews. J Clin Epidemiol. 2008;61(8):748–54.
Toews LC. Compliance of systematic reviews in veterinary journals with preferred reporting items for systematic reviews and meta-analysis (PRISMA) literature search reporting guidelines. Journal of the Medical Library Association : JMLA. 2017;105(3):233–9.
Booth A. "brimful of STARLITE": toward standards for reporting literature searches. Journal of the Medical Library Association : JMLA. 2006;94(4):421–9. e205
Faggion CM Jr, Wu YC, Tu YK, Wasiak J. Quality of search strategies reported in systematic reviews published in stereotactic radiosurgery. Br J Radiol. 2016;89(1062):20150878.
Mullins MM, DeLuca JB, Crepaz N, Lyles CM. Reporting quality of search methods in systematic reviews of HIV behavioral interventions (2000–2010): are the searches clearly explained, systematic and reproducible? Research Synthesis Methods. 2014;5(2):116–30.
Yoshii A, Plaut DA, McGraw KA, Anderson MJ, Wellik KE. Analysis of the reporting of search strategies in Cochrane systematic reviews. Journal of the Medical Library Association : JMLA. 2009;97(1):21–9.
Bigna JJ, Um LN, Nansseu JR. A comparison of quality of abstracts of systematic reviews including meta-analysis of randomized controlled trials in high-impact general medicine journals before and after the publication of PRISMA extension for abstracts: a systematic review and meta-analysis. Syst Rev. 2016;5(1):174.
Akhigbe T, Zolnourian A, Bulters D. Compliance of systematic reviews articles in brain arteriovenous malformation with PRISMA statement guidelines: review of literature. Journal of clinical neuroscience : official journal of the Neurosurgical Society of Australasia. 2017;39:45–8.
Tao KM, Li XQ, Zhou QH, Moher D, Ling CQ, Yu WF. From QUOROM to PRISMA: a survey of high-impact medical journals' instructions to authors and a review of systematic reviews in anesthesia literature. PLoS One. 2011;6(11):e27611.
Wasiak J, Tyack Z, Ware R, Goodwin N, Faggion CM Jr. Poor methodological quality and reporting standards of systematic reviews in burn care management. International Wound Journal. 2016.
Tam WW, Lo KK, Khalechelvam P. Endorsement of PRISMA statement and quality of systematic reviews and meta-analyses published in nursing journals: a cross-sectional study. BMJ Open. 2017;7(2):e013905.
Rader T, Mann M, Stansfield C, Cooper C, Sampson M. Methods for documenting systematic review searches: a discussion of common issues. Res Synth Methods. 2014;5(2):98–115.
Atkinson KM, Koenka AC, Sanchez CE, Moshontz H, Cooper H. Reporting standards for literature searches and report inclusion criteria: making research syntheses more transparent and easy to replicate. Res Synth Methods. 2015;6(1):87–95.
McGowan J, Sampson M, Salzwedel DM, Cogo E, Foerster V, Lefebvre C. PRESS peer review of electronic search strategies: 2015 guideline statement. J Clin Epidemiol. 2016;75:40–6.
Sampson M, McGowan J, Cogo E, Grimshaw J, Moher D, Lefebvre C. An evidence-based practice guideline for the peer review of electronic search strategies. J Clin Epidemiol. 2009;62(9):944–52.
Shea BJ, Reeves BC, Wells G, Thuku M, Hamel C, Moran J, et al. AMSTAR 2: a critical appraisal tool for systematic reviews that include randomised or non-randomised studies of healthcare interventions, or both. BMJ (Clinical research ed). 2017;358.
Whiting P, Savović J, Higgins JPT, Caldwell DM, Reeves BC, Shea B, et al. ROBIS: a new tool to assess risk of bias in systematic reviews was developed. J Clin Epidemiol. 2016;69:225–34.
Relevo R, Balshem H. Finding evidence for comparing medical interventions: AHRQ and the effective health care program. J Clin Epidemiol. 2011;64(11):1168–77.
Institute of Medicine. Standards for systematic reviews. 2011. Available from: http://www.nationalacademies.org/hmd/Reports/2011/Finding-What-Works-in-Health-Care-Standards-for-Systematic-Reviews/Standards.aspx.
CADTH. Resources. 2018.
Acknowledgements
CC acknowledges the supervision offered by Professor Chris Hyde.
This publication forms a part of CC’s PhD. CC’s PhD was funded through the National Institute for Health Research (NIHR) Health Technology Assessment (HTA) Programme (Project Number 16/54/11). The open access fee for this publication was paid for by Exeter Medical School.
RG and NB were partially supported by the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care South West Peninsula.
The views expressed are those of the author(s) and not necessarily those of the NHS, the NIHR or the Department of Health.
Author information
Authors and affiliations.
Institute of Health Research, University of Exeter Medical School, Exeter, UK
Chris Cooper & Jo Varley-Campbell
HEDS, School of Health and Related Research (ScHARR), University of Sheffield, Sheffield, UK
Andrew Booth
Nicky Britten
European Centre for Environment and Human Health, University of Exeter Medical School, Truro, UK
Ruth Garside
Contributions
CC conceived the idea for this study and wrote the first draft of the manuscript. CC discussed this publication in PhD supervision with AB and separately with JVC. CC revised the publication with input and comments from AB, JVC, RG and NB. All authors revised the manuscript prior to submission. All authors read and approved the final manuscript.
Corresponding author
Correspondence to Chris Cooper .
Ethics declarations
Ethics approval and consent to participate
Consent for publication
Competing interests
The authors declare that they have no competing interests.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Additional file
Additional file 1:.
Appendix tables and PubMed search strategy. Key studies used for pearl growing per key stage, working data extraction tables and the PubMed search strategy. (DOCX 30 kb)
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License ( http://creativecommons.org/licenses/by/4.0/ ), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated.
About this article
Cite this article.
Cooper, C., Booth, A., Varley-Campbell, J. et al. Defining the process to literature searching in systematic reviews: a literature review of guidance and supporting studies. BMC Med Res Methodol 18 , 85 (2018). https://doi.org/10.1186/s12874-018-0545-3
Received : 20 September 2017
Accepted : 06 August 2018
Published : 14 August 2018
- Literature Search Process
- Citation Chasing
- Tacit Models
- Unique Guidance
- Information Specialists
BMC Medical Research Methodology
ISSN: 1471-2288

J Med Libr Assoc. 2018 Oct;106(4).
A systematic approach to searching: an efficient and complete method to develop literature searches
Creating search strategies for systematic reviews, finding the best balance between sensitivity and specificity, and translating search strategies between databases is challenging. Several methods describe standards for systematic search strategies, but a consistent approach for creating an exhaustive search strategy has not yet been described in enough detail to be replicable. The authors have established a method that describes, step by step, the process of developing a systematic search strategy as required for a systematic review. This method describes how single-line search strategies can be prepared in a text document by typing the search syntax (such as field codes, parentheses, and Boolean operators) before copying and pasting in the search terms (keywords and free-text synonyms) that are found in the thesaurus. To help ensure term completeness, we developed a novel optimization technique that is mainly based on comparing the results retrieved by thesaurus terms with those retrieved by the free-text search words to identify potentially relevant candidate search terms. Macros in Microsoft Word have been developed to convert syntaxes between databases and interfaces almost automatically. This method helps information specialists to develop librarian-mediated searches for systematic reviews, as well as medical and health care practitioners who are searching for evidence to answer clinical questions. The described method can be used to create complex and comprehensive search strategies for different databases and interfaces, such as those needed when searching for relevant references for systematic reviews, and will assist both information specialists and practitioners when they are searching the biomedical literature.
INTRODUCTION
Librarians and information specialists are often involved in the process of preparing and completing systematic reviews (SRs), where one of their main tasks is to identify relevant references to include in the review [ 1 ]. Although several recommendations for the process of searching have been published [ 2 – 6 ], none describe the development of a systematic search strategy from start to finish.
Traditional methods of SR search strategy development and execution are highly time consuming, reportedly requiring on the order of 100 hours or more [7, 8]. The authors wanted to develop systematic and exhaustive search strategies more efficiently, while preserving the high sensitivity that SR search strategies necessitate. In this article, we describe the method developed at Erasmus University Medical Center (MC) and demonstrate its use through an example search. The efficiency of the search method and the outcomes of 73 searches that have resulted in published reviews are described in a separate article [9].
As we aimed to describe the creation of systematic searches in full detail, the method starts at a basic level with the analysis of the research question and the creation of search terms. Readers who are new to SR searching are advised to follow all steps described. More experienced searchers can consider the basic steps to be existing knowledge that will already be part of their normal workflow, although step 4 probably differs from general practice. Experienced searchers will gain the most from reading about the novelties in the method as described in steps 10–13 and comparing the examples given in the supplementary appendix to their own practice.
CREATING A SYSTEMATIC SEARCH STRATEGY
Our methodology for planning and creating a multi-database search strategy consists of the following steps:
1. Determine a clear and focused question
2. Describe the articles that can answer the question
3. Decide which key concepts address the different elements of the question
4. Decide which elements should be used for the best results
5. Choose an appropriate database and interface to start with
6. Document the search process in a text document
7. Identify appropriate index terms in the thesaurus of the first database
8. Identify synonyms in the thesaurus
9. Add variations in search terms
10. Use database-appropriate syntax, with parentheses, Boolean operators, and field codes
11. Optimize the search
12. Evaluate the initial results
13. Check for errors
14. Translate to other databases
15. Test and reiterate
Each step in the process is reflected by an example search described in the supplementary appendix .
1. Determine a clear and focused question
A systematic search can best be applied to a well-defined and precise research or clinical question. Questions that are too broad or too vague cannot be answered easily in a systematic way and will generally result in an overwhelming number of search results. On the other hand, a question that is too specific will result in too few or even zero search results. Various papers describe this process in more detail [10–12].
2. Describe the articles that can answer the question
Although not all clinical or research questions can be answered in the literature, the next step is to presume that the answer can indeed be found in published studies. A good starting point for a search is hypothesizing what the research that can answer the question would look like. These hypothetical (when possible, combined with known) articles can be used as guidance for constructing the search strategy.
3. Decide which key concepts address the different elements of the question
Key concepts are the topics or components that the desired articles should address, such as diseases or conditions, actions, substances, settings, domains (e.g., therapy, diagnosis, etiology), or study types. Key concepts from the research question can be grouped to create elements in the search strategy.
Elements in a search strategy do not necessarily follow the patient, intervention, comparison, outcome (PICO) structure or any other related structure. Using the PICO or another similar framework as guidance can be helpful to consider, especially in the inclusion and exclusion review stage of the SR, but this is not necessary for good search strategy development [ 13 – 15 ]. Sometimes concepts from different parts of the PICO structure can be grouped together into one search element, such as when the desired outcome is frequently described in a certain study type.
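As a brief sketch (using the breastfeeding question that reappears under "Bias in elements" below; the grouping is illustrative, not prescriptive), the key concepts might be grouped into elements as follows:

Question: does prolonged breastfeeding improve intelligence outcomes in children?
Element 1: breastfeeding (the exposure)
Element 2: intelligence or cognitive outcomes
Concepts probably not needed as separate elements: children (largely implied by breastfeeding) and duration of breastfeeding (a likely source of bias; see step 4)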
4. Decide which elements should be used for the best results
Not all elements of a research question should necessarily be used in the search strategy. Some elements are less important than others or may unnecessarily complicate or restrict a search strategy. Adding an element to a search strategy increases the chance of missing relevant references. Therefore, the number of elements in a search strategy should remain as low as possible to optimize recall.
Using the schema in Figure 1, elements can be ordered by their specificity and importance to determine the best search approach. Whether an element is more specific or more general can be measured objectively by the number of hits retrieved in a database when searching for a key term representing that element. Depending on the research question, certain elements are more important than others. If articles (hypothetical or known) exist that can answer the question but lack a certain element in their titles, abstracts, or keywords, that element is unimportant to the question. An element can also be unimportant because of expected bias or because it overlaps with another element.

Figure 1. Schema for determining the optimal order of elements
Bias in elements
The choice of elements in a search strategy can introduce bias through use of overly specific terminology or terms often associated with positive outcomes. For the question “does prolonged breastfeeding improve intelligence outcomes in children?,” searching specifically for the element of duration will introduce bias, as articles that find a positive effect of prolonged breastfeeding will be much more likely to mention time factors in their titles or abstracts.
Overlapping elements
Elements in a question sometimes overlap in their meaning. Sometimes certain therapies are interventions for one specific disease. The Lichtenstein technique, for example, is a repair method for inguinal hernias. There is no need to include an element of “inguinal hernias” to a search for the effectiveness of the Lichtenstein therapy. Likewise, sometimes certain diseases are only found in certain populations. Adding such an overlapping element could lead to missing relevant references.
The elements to use in a search strategy can be found in the plot of elements in Figure 1 , by following the top row from left to right. For this method, we recommend starting with the most important and specific elements. Then, continue with more general and important elements until the number of results is acceptable for screening. Determining how many results are acceptable for screening is often a matter of negotiation with the SR team.
5. Choose an appropriate database and interface to start with
Important factors for choosing databases are their coverage and the presence of a thesaurus. For medically oriented searches, the coverage and recall of Embase, which includes the MEDLINE database, are superior to those of MEDLINE [16]. Each of these two databases has its own thesaurus with its own unique definitions and structure. Because the Embase thesaurus, Emtree, contains many more specific thesaurus terms than the MEDLINE Medical Subject Headings (MeSH) thesaurus, translation from Emtree to MeSH is easier than the other way around. Therefore, we recommend starting in Embase.
MEDLINE and Embase are available through many different vendors and interfaces. The choice of an interface and primary database is often determined by the searcher’s accessibility. For our method, an interface that allows searching with proximity operators is desirable, and full functionality of the thesaurus, including explosion of narrower terms, is crucial. We recommend developing a personal workflow that always starts with one specific database and interface.
6. Document the search process in a text document
We advise designing and creating the complete search strategies in a log document, instead of directly in the database itself, to register the steps taken and to make searches accountable and reproducible. The developed search strategies can be copied and pasted into the desired databases from the log document. This way, the searcher is in control of the whole process. Any change to the search strategy should be done in the log document, assuring that the search strategy in the log is always the most recent.
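As an illustration (all values hypothetical), a log entry for one iteration of a search might record:

Date: 12 March 2018
Database and interface: Embase via Embase.com
Strategy (version 3): the full single-line query, pasted verbatim
Number of results: 1,245
Notes: synonyms added after optimization of one of the elements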
7. Identify appropriate index terms in the thesaurus of the first database
Searches should start by identifying appropriate thesaurus terms for the desired elements. The thesaurus of the database is searched for matching index terms for each key concept. We advise restricting the initial terms to the most important and most relevant terms. Later in the process, more general terms can be added in the optimization process, in which the effect on the number of hits, and thus the desirability of adding these terms, can be evaluated more easily.
Several factors can complicate the identification of thesaurus terms. Sometimes, one thesaurus term is found that exactly describes a specific element. In contrast, especially in more general elements, multiple thesaurus terms can be found to describe one element. If no relevant thesaurus terms have been found for an element, free-text terms can be used, and possible thesaurus terms found in the resulting references can be added later (step 11).
Sometimes, no distinct thesaurus term is available for a specific key concept that describes the concept in enough detail. In Emtree, one thesaurus term often combines two or more elements. The easiest solution for combining these terms for a sensitive search is to use such a thesaurus term in all elements where it is relevant. Examples are given in the supplementary appendix .
8. Identify synonyms in the thesaurus
Most thesauri offer a list of synonyms on their term details page (named Synonyms in Emtree and Entry Terms in MeSH). To create a sensitive search strategy for SRs, these terms need to be searched as free-text keywords in the title and abstract fields, in addition to searching their associated thesaurus terms.
The Emtree thesaurus contains more synonyms (300,000) than MeSH does (220,000) [ 17 ]. The difference in number of terms is even higher considering that many synonyms in MeSH are permuted terms (i.e., inversions of phrases using commas).
Thesaurus terms are ordered in a tree structure. When searching for a more general thesaurus term, the more specific (narrower) terms in the branches below that term will also be searched (this is frequently referred to as “exploding” a thesaurus term). However, to perform a sensitive search, all relevant variations of the narrower terms must be searched as free-text keywords in the title or abstract, in addition to relying on the exploded thesaurus term. Thus, all articles that describe a certain narrower topic in their titles and abstracts will already be retrieved before MeSH terms are added.
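As an example of how thesaurus terms and their synonyms combine into one element (Embase.com syntax; we assume here that 'heart infarction' is the relevant Emtree term, which should be verified in the thesaurus), the exploded term is searched alongside free-text synonyms in the title and abstract fields:

'heart infarction'/exp OR 'heart infarction':ab,ti OR 'myocardial infarction':ab,ti OR 'cardiac infarction':ab,ti OR 'heart attack':ab,ti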
9. Add variations in search terms (e.g., truncation, spelling differences, abbreviations, opposites)
Truncation allows a searcher to search for words beginning with the same word stem. A search for therap* will, thus, retrieve therapy, therapies, therapeutic, and all other words starting with “therap.” Do not truncate a word stem that is too short. Also, limitations of interfaces should be taken into account, especially in PubMed, where the number of search term variations that can be found by truncation is limited to 600.
Databases contain references to articles using both standard British and American English spellings. Both need to be searched as free-text terms in the title and abstract. Alternatively, many interfaces offer a certain code to replace zero or one characters, allowing a search for “pediatric” or “paediatric” as “p?ediatric.” Table 1 provides a detailed description of the syntax for different interfaces.
Table 1. Field codes in the five most used interfaces for biomedical literature searching
Searching for abbreviations can identify extra, relevant references and retrieve more irrelevant ones. The search can be more focused by combining the abbreviation with an important word that is relevant to its meaning or by using the Boolean “NOT” to exclude frequently observed, clearly irrelevant results. We advise that searchers do not exclude all possible irrelevant meanings, as it is very time consuming to identify all the variations, it will result in unnecessarily complicated search strategies, and it may lead to erroneously narrowing the search and, thereby, reduce recall.
Searching partial abbreviations can be useful for retrieving relevant references. For example, it is very likely that an article would mention osteoarthritis (OA) early in the abstract, replacing all further occurrences of osteoarthritis with OA . Therefore, it may not contain the phrase “hip osteoarthritis” but only “hip oa.”
It is also important to search for the opposites of search terms to avoid bias. When searching for “disease recurrence,” articles about “disease free” may be relevant as well. When the desired outcome is survival , articles about mortality may be relevant.
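Drawing on the examples above, the free-text portion of an element can be broadened with truncation, abbreviations, and opposites (Embase.com syntax; the terms are illustrative only):

osteoarthrit*:ab,ti OR coxarthrosis:ab,ti OR 'hip oa':ab,ti
recurren*:ab,ti OR relaps*:ab,ti OR 'disease free':ab,ti
survival:ab,ti OR mortalit*:ab,ti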
10. Use database-appropriate syntax, with parentheses, Boolean operators, and field codes
Different interfaces require different syntaxes, the special set of rules and symbols unique to each database that define how a correctly constructed search operates. Common syntax components include parentheses and Boolean operators such as "AND," "OR," and "NOT," which are available in all major interfaces. An overview of the different syntaxes for five major interfaces for bibliographic medical databases (PubMed, Ovid, EBSCOhost, Embase.com, and ProQuest) is shown in Table 1.
Creating the appropriate syntax for each database, in combination with the selected terms as described in steps 7–9, can be challenging. Following the method outlined below simplifies the process:
- Create single-line queries in a text document (not combining multiple record sets), which allows immediate checking of the relevance of retrieved references and efficient optimization.
- Type the syntax (Boolean operators, parentheses, and field codes) before adding terms, which reduces the chance that errors are made in the syntax, especially in the number of parentheses.
- Use predefined proximity structures including parentheses, such as (() ADJ3 ()) in Ovid, that can be reused in the query when necessary.
- Use thesaurus terms separately from free-text terms of each element. Start an element with all thesaurus terms (using “OR”) and follow with the free-text terms. This allows the unique optimization methods as described in step 11.
- When adding terms to an existing search strategy, pay close attention to the position of the cursor. Make sure to place it appropriately either in the thesaurus terms section, in the title/abstract section, or as an addition (broadening) to an existing proximity search.
The supplementary appendix explains the method of building a query in more detail, step by step for different interfaces: PubMed, Ovid, EBSCOhost, Embase.com, and ProQuest. This method results in a basic search strategy designed to retrieve some relevant references upon which a more thorough search strategy can be built with optimization such as described in step 11.
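As a compact sketch of the resulting structure (Embase.com syntax, reusing the breastfeeding question from the earlier steps; the thesaurus and free-text terms shown are assumptions to be checked against Emtree), each element opens with its thesaurus terms, follows with its free-text terms, and the elements are combined with "AND" on a single line:

('breast feeding'/exp OR breastfeed*:ab,ti OR 'breast feeding':ab,ti OR 'breast fed':ab,ti) AND ('intelligence'/exp OR intelligen*:ab,ti OR iq:ab,ti OR 'cognitive development':ab,ti)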
11. Optimize the search
The most important question when performing a systematic search is whether all (or most) potentially relevant articles have been retrieved by the search strategy. This is also the most difficult question to answer, since it is unknown which and how many articles are relevant. It is, therefore, wise first to broaden the initial search strategy, making the search more sensitive, and then check if new relevant articles are found by comparing the set results (i.e., search for Strategy #2 NOT Strategy #1 to see the unique results).
A search strategy should be tested for completeness. Therefore, it is necessary to identify extra, possibly relevant search terms and add them to the test search in an OR relationship with the already used search terms. A good place to start, and a well-known strategy, is scanning the top retrieved articles when sorted by relevance, looking for additional relevant synonyms that could be added to the search strategy.
We have developed a unique optimization method that has not been described before in the literature. This method often adds valuable extra terms to our search strategy and, therefore, extra relevant references to our search results. Extra synonyms can be found in articles that have been assigned the relevant thesaurus terms but whose titles and abstracts contain none of the free-text terms already present in the current search strategy. Searching for thesaurus terms NOT free-text terms will help identify missed free-text terms in the title or abstract. Searching for free-text terms NOT thesaurus terms will help identify missed thesaurus terms. If this is done repeatedly for each element, leaving the rest of the query unchanged, numerous relevant terms can be added to the query. These steps are explained in detail for five different search platforms in the supplementary appendix.
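For example (an illustrative Ovid-style fragment, not taken from the article), if an element consists of a thesaurus part and a free-text part, the two checks might look like this, where “[rest of query]” stands for the unchanged remainder of the single-line search:

(exp Osteoarthritis, Hip/ NOT (hip ADJ3 osteoarthr*).ab,ti.) AND [rest of query]
((hip ADJ3 osteoarthr*).ab,ti. NOT exp Osteoarthritis, Hip/) AND [rest of query]

The first line retrieves records that were indexed with the thesaurus term but contain none of the current free-text terms in the title or abstract; scanning them suggests missed free-text synonyms. The second line does the reverse and suggests missed thesaurus terms.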
12. Evaluate the initial results
The results should now contain relevant references. If the interface allows relevance ranking, use that in the evaluation. If you know some relevant references that should be included in the research, search for those references specifically; for example, combine a specific (first) author name with a page number and the publication year. Check whether those references are retrieved by the search. If the known relevant references are not retrieved by the search, adapt the search so that they are. If it is unclear which element should be adapted to retrieve a certain article, combine that article with each element separately.
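For example, in PubMed a known relevant article could be located with a combination like the one below (the author, year, and page number are made up for illustration) and then ANDed with the full strategy, or with each element in turn, to see where it drops out:

jansen[au] AND 2016[dp] AND 1123[pg]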
Different outcomes are desired for different types of research questions. For clinical question answering, for instance, the researcher will not be satisfied with a large result set containing many irrelevant references. A clinical search should be rather specific and is allowed to miss the occasional relevant reference. For an SR, by contrast, the researchers do not want to miss any relevant reference and are willing to handle many irrelevant references to achieve that. The search for references to include in an SR should therefore be very sensitive: no potentially relevant reference should be missed. A search that is too specific or too sensitive for the intended goal can be adapted to become more sensitive or specific. Steps to increase the sensitivity or specificity of a search strategy can be found in the supplementary appendix.
13. Check for errors
Errors might not be easily detected. Sometimes clues can be found in the number of results, either when the number of results is much higher or lower than expected or when many retrieved references are not relevant. However, the number expected is often unknown, and very sensitive search strategies will always retrieve many irrelevant articles. Each query should, therefore, be checked for errors.
One of the most frequently occurring errors is a missing Boolean operator “OR.” When no “OR” is added between two search terms, many interfaces automatically add an “AND,” which unintentionally reduces the number of results and likely misses relevant references. A good way to identify missing “OR”s is to go to the web page containing the full search strategy, as translated by the database, and use Ctrl-F to search for “AND.” Check whether each occurrence of the “AND” operator is deliberate.
Ideally, search strategies should be checked by other information specialists [18]. The Peer Review of Electronic Search Strategies (PRESS) checklist offers good guidance for this process [4]. Apart from the syntax (especially Boolean operators and field codes) of the search strategy, it is wise to have the search terms checked by a clinician or researcher familiar with the topic. At Erasmus MC, researchers and clinicians are involved during the complete process of structuring and optimizing the search strategy. Each word is added after a combined decision of the searcher and the researcher, with the possibility of directly comparing results with and without the new term.
14. Translate to other databases
To retrieve as many relevant references as possible, one has to search multiple databases. Translation of complex and exhaustive queries between different databases can be very time consuming and cumbersome. The single-line search strategy approach detailed above allows quick translations using the find and replace method in Microsoft Word (<Ctrl-H>).
At Erasmus MC, macros based on the find-and-replace method in Microsoft Word have been developed for easy and fast translation between the most used databases for biomedical and health sciences questions. The schema that is followed for the translation between databases is shown in Figure 2 . Most databases simply follow the structure set by the Embase.com search strategy. The translation from Emtree terms to MeSH terms for MEDLINE in Ovid often identifies new terms that need to be added to the Embase.com search strategy before the translation to other databases.

Figure 2. Schematic representation of translation between databases used at Erasmus University Medical Center. Dotted lines represent databases that are used in less than 80% of the searches.
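As a sketch of what the find-and-replace translation changes (an illustrative fragment, not the actual macro output), an element written in Embase.com syntax, such as:

'osteoarthritis'/exp OR (osteoarthr* NEAR/3 (hip OR coxa*)):ab,ti

might become the following in Ovid MEDLINE once field codes and proximity operators are replaced and the Emtree term is manually mapped to its MeSH counterpart:

exp Osteoarthritis/ OR (osteoarthr* ADJ3 (hip OR coxa*)).ab,ti.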
Using five different macros, a thoroughly optimized query in Embase.com can be relatively quickly translated into eight major databases. Basic search strategies will be created to use in many, mostly smaller, databases, because such niche databases often do not have extensive thesauri or advanced syntax options. Also, there is not much need to use extensive syntax because the number of hits and, therefore, the amount of noise in these databases is generally low. In MEDLINE (Ovid), PsycINFO (Ovid), and CINAHL (EBSCOhost), the thesaurus terms must be adapted manually, as each database has its own custom thesaurus. These macros and instructions for their installation, use, and adaptation are available at bit.ly/databasemacros.
15. Test and reiterate
Ideally, exhaustive search strategies should retrieve all references that are covered in a specific database. For SR search strategies, checking searches for their recall is advised. This can be done after the included references have been determined by the authors of the systematic review. If additional papers have been identified through other, non-database methods (e.g., checking the references of included studies), results that were not identified by the database searches should be examined. If these results were available in the databases but not located by the search strategy, the search strategy should be adapted to try to retrieve them, as they may contain terms that were omitted from the original search strategies. This may enable the identification of additional relevant results.
A methodology for creating exhaustive search strategies has been created that describes all steps of the search process, starting with a question and resulting in thorough search strategies in multiple databases. Many of the steps described are not new, but together they form a strong method for creating high-quality, robust searches in a relatively short time frame.
Our methodology is intended to make literature searches more thorough. The optimization method described in step 11 identifies missed synonyms and thesaurus terms, unlike other methods, which largely depend on predetermined keywords and synonyms. Using this method also results in a much quicker search process than traditional methods, especially because of the easier translation between databases and interfaces (step 14). The method is not a guarantee of speed, since speed depends on many factors, including experience. However, by following the steps and using the tools described above, searchers can gain confidence first and increase speed through practice.
What is new?
This method encourages searchers to start the search development process by typing the empty syntax first and adding the thesaurus terms and free-text synonyms afterwards. We feel this helps the searcher to focus on the search terms instead of on the structure of the search query. The optimization method in which new terms are found in the already retrieved articles is used in some other institutes as well but has, to our knowledge, not been described in the literature. The macros to translate search strategies between interfaces are unique to this method.
What is different compared to common practice?
Traditionally, librarians and information specialists have focused on creating complex, multi-line (also called line-by-line) search strategies, consisting of multiple record sets, and this method is frequently advised in the literature and handbooks [2, 19–21]. Our method, instead, uses single-line searches, which is critical to its success. Single-line search strategies can easily be adapted by adding or dropping a term, without having to renumber record sets as would be necessary in multi-line searches. They can easily be saved in a text document and repeated by copying and pasting for search updates. Single-line search strategies also allow easy translation to other syntaxes, using find-and-replace technology to update field codes and other syntax elements or using macros (step 14).
When constructing a search strategy, the searcher may notice that certain parentheses in the syntax appear unnecessary: for example, the parentheses around the title/abstract terms when there is only one such term, or the double parentheses in a proximity statement when one of the word groups contains only one word. One might be tempted to omit those parentheses for ease of reading and management. However, during the optimization process the searcher is likely to find extra synonyms, which might consist of a single word. Adding those terms to the first query (with reduced parentheses) requires adding extra parentheses, meticulously placing and counting them, whereas in the latter search it only requires placing those terms correctly.
Many search methods depend heavily on the PICO framework. Research shows that PICO or PICOS is not suitable for every question [22, 23]. There are other acronyms besides PICO, such as sample, phenomenon of interest, design, evaluation, research type (SPIDER) [24], but each is just a variant. In our method, the most important and specific elements of a question are analyzed to build the best search strategy.
Though it is generally recommended that searchers search both MEDLINE and Embase, most use MEDLINE as the starting point. It is considered the gold standard for biomedical searching, partially due to historical reasons, since it was the first of its kind, and more so now that it is freely available via the PubMed interface. Our method can be used with any database as a starting point, but we use Embase instead of MEDLINE or another database for a number of reasons. First, Embase provides both unique content and the complete content of MEDLINE. Therefore, searching Embase will be, by definition, more complete than searching MEDLINE only. Second, the number of terms in Emtree (the Embase thesaurus) is three times as high as that of MeSH (the MEDLINE thesaurus). It is easier to find MeSH terms after all relevant Emtree terms have been identified than to start with MeSH and translate to Emtree.
At Erasmus MC, the researchers sit next to the information specialist during most of the search strategy design process. This way, the researchers can deliver immediate feedback on the relevance of proposed search terms and retrieved references. The search team then combines knowledge about databases with knowledge about the research topic, which is an important condition to create the highest quality searches.
Limitations of the method
One disadvantage of single-line searches compared to multi-line search strategies is that errors are harder to recognize. However, with the methods for optimization as described (step 11), errors are recognized easily because missed synonyms and spelling errors will be identified during the process. Also problematic is that more parentheses are needed, making it more difficult for the searcher and others to assess the logic of the search strategy. However, as parentheses and field codes are typed before the search terms are added (step 10), errors in parentheses can be prevented.
Our methodology works best in an interface that allows proximity searching. Searchers with access to an interface with proximity searching capabilities should select one of those as the initial database in which to develop and optimize the search strategy. Because the PubMed interface does not allow proximity searches, phrases or Boolean “AND” combinations are required there. Phrase searching complicates the process and is more specific, with a higher risk of missing relevant articles, while Boolean “AND” combinations increase sensitivity, often at a large cost in specificity. Because some searchers lack access to expensive databases or interfaces, the freely available PubMed interface may be the only option, though it should never be the sole database used for an SR [2, 16, 25]. A limitation of our method is therefore that it works best with subscription-based and licensed resources.
Another limitation is the customization of the macros to a specific institution’s resources. The macros for the translation between different database interfaces only work between the interfaces as described. To mitigate this, we recommend using the find-and-replace functionality of text editors like Microsoft Word to ease the translation of syntaxes between other databases. Depending on one’s institutional resources, custom macros can be developed using similar methods.
Results of the method
Whether this method results in exhaustive searches in which no important article is missed is difficult to determine, because the number of relevant articles is unknown for any topic. A comparison of several parameters of 73 published reviews based on searches developed with this method against 258 reviews that acknowledged information specialists from other Dutch academic hospitals shows that the performance of searches following our method is comparable to that of searches performed in other institutes, but that the time needed to develop the search strategies was much shorter than the time reported for the other reviews [9].
CONCLUSIONS
With the described method, searchers can gain confidence in their search strategies by finding many relevant words and creating exhaustive search strategies quickly. The approach can be used when performing SR searches or for other purposes such as answering clinical questions, with different expectations of the search’s precision and recall. This method, with practice, provides a stepwise approach that facilitates the search strategy development process from question clarification to final iteration and beyond.
SUPPLEMENTAL FILE
Acknowledgments.
We highly appreciate the work that was done by our former colleague Louis Volkers, who in his twenty years as an information specialist in Erasmus MC laid the basis for our method. We thank Professor Oscar Franco for reviewing earlier drafts of this article.
How to write a search strategy for your systematic review
Practical tips to write a search strategy for your systematic review
With a great review question and a clear set of eligibility criteria already mapped out, it’s now time to plan the search strategy. The medical literature is vast. Your team plans a thorough and methodical search, but you also know that resources and interest in the project are finite. At this stage it might feel like you have a mountain to climb.
The bottom line? You will have to sift through some irrelevant search results to find the studies that you need for your review. Capturing a proportion of irrelevant records in your search is necessary to ensure that it identifies as many relevant records as possible. This is the trade-off of precision versus sensitivity and, because systematic reviews aim to be as comprehensive as possible, it is best to favour sensitivity – more is more.
By now, the size of this task might be sounding alarm bells. The good news is that a range of techniques and web-based tools can help to make searching more efficient and save you time. We’ll look at some of them as we walk through the four main steps of searching for studies:
- Decide where to search
- Write and refine the search
- Run and record the search
- Manage the search results
Searching is a specialist discipline and the information given here is not intended to replace the advice of a skilled professional. Before we look at each of the steps in turn, the most important systematic reviewer pro-tip for searching is:
Pro Tip – Talk to your librarian and do it early!
1. Decide where to search
It’s important to come up with a comprehensive list of sources to search so that you don’t miss anything potentially relevant. In clinical medicine, your first stop will likely be the databases MEDLINE , Embase , and CENTRAL . Depending on the subject of the review, it might also be appropriate to run the search in databases that cover specific geographical regions or specialist areas, such as traditional Chinese medicine.
In addition to these databases, you’ll also search for grey literature (essentially, research that was not published in journals). That’s because your search of bibliographic databases will not find relevant information if it is part of, for example:
- a trials register
- a study that is ongoing
- a thesis or dissertation
- a conference abstract.
Over-reliance on published data introduces bias in favour of positive results. Studies with positive results are more likely to be submitted to journals, published in journals, and therefore indexed in databases. This is publication bias and systematic reviews seek to minimise its effects by searching for grey literature.
2. Write and refine the search
Search terms are derived from key concepts in the review question and from the inclusion and exclusion criteria that are specified in the protocol or research plan.
Keywords will be searched for in the title or abstract of the records in the database. They are often truncated (for example, a search for therap* to find therapy, therapies, therapist). They might also use wildcards to allow for spelling variants and plurals (for example, wom#n to find woman and women). The symbols used to perform truncation and wildcard searches vary by database.
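For example, a keyword fragment that combines truncation, a wildcard, and an exact phrase might look like the line below (the symbols are illustrative; check your database's help pages for the ones it actually uses):

(therap* OR counsel* OR "talking therapy") AND (wom#n OR female*)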
Index terms
Using index terms such as MeSH and Emtree in a search can improve its performance. Indexers with subject area expertise work through databases and tag each record with subject terms from a prespecified controlled vocabulary.
This indexing can save review teams a lot of time that would otherwise be spent sifting through irrelevant records. Using index terms in your search, for example, can help you find the records that are actually about the topic of interest (tagged with the index term) but ignore those that contain only a brief mention of it (not tagged with the index term).
Indexers assign terms based on a careful read of each study, rather than whether or not the study contains certain words. So the index terms enable the retrieval of relevant records that cannot be captured by a simple search for the keyword or phrase.
Use a combination
Relying solely on index terms is not advisable. Doing so could miss a relevant record that for some reason (indexer’s judgment, time lag between a record being listed in a database and being indexed) has not been tagged with an index term that would enable you to retrieve it. Good search strategies include both index terms and keywords.

Let’s see how this works in a real review! Figure 2 shows the search strategy for the review ‘Wheat flour fortification with iron and other micronutrients for reducing anaemia and improving iron status in populations’ (a sketch of its general shape follows the list below). This strategy combines index terms and keywords using the Boolean operators AND, OR, and NOT. OR is used first to reach as many records as possible before AND and NOT are used to narrow them down.
- Lines 1 and 2: contain MeSH terms (denoted by the initial capitals and the slash at the end).
- Line 3: contains truncated keywords (‘tw’ in this context is an instruction to search the title and abstract fields of the record).
- Line 4: combines the three previous lines using Boolean OR to broaden the search.
- Line 11: combines previous lines using Boolean AND to narrow the search.
- Lines 12 and 13: further narrow the search using Boolean NOT to exclude records of studies with no human subjects.
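Because the figure itself is not reproduced here, the lines below sketch the general shape of such a strategy in Ovid MEDLINE-style syntax. The terms and line numbers are illustrative, not the published strategy:

1 Flour/
2 Food, Fortified/
3 (flour* or fortif*).tw.
4 1 or 2 or 3
(lines 5-10: the other concepts, built in the same way)
11 4 and 10
12 exp Animals/ not Humans/
13 11 not 12

Here lines 1-4 gather one concept (index terms plus keywords) with OR, line 11 intersects the concept sets with AND, and lines 12-13 use NOT to remove animal-only records.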

Writing a search strategy is an iterative process. A good plan is to try out a new strategy and check that it has picked up the key studies that you would expect it to find based on your existing knowledge of the topic area. If it hasn’t, you can explore the reasons for this, revise the strategy, check it for errors, and try it again!
3. Run and record the search
Because of the different ways that individual databases are structured and indexed, a separate search strategy is needed for each database. This adds complexity to the search process, and it is important to keep a careful record of each search strategy as you run it. Search strategies can often be saved in the databases themselves, but it is a good idea to keep an offline copy as a back-up; Covidence allows you to store your search strategies online in your review settings.
The reporting of the search will be included in the methods section of your review and should follow the PRISMA guidelines. You can download a flow diagram from PRISMA’s website to help you log the number of records retrieved from the search and the subsequent decisions about the inclusion or exclusion of studies. The PRISMA-S extension provides guidance on reporting literature searches.

It is very important that search strategies are reproduced in their entirety (preferably using copy and paste to avoid typos) as part of the published review so that they can be studied and replicated by other researchers. Search strategies are often made available as an appendix because they are long and might otherwise interrupt the flow of the text in the methods section.
4. Manage the search results
Once the search is done and you have recorded the process in enough detail to write up a thorough description in the methods section, you will move on to screening the results. This is an exciting stage in any review because it’s the first glimpse of what the search strategies have found. A large volume of results may be daunting but your search is very likely to have captured some irrelevant studies because of its high sensitivity, as we have already seen. Fortunately, it will be possible to exclude many of these irrelevant studies at the screening stage on the basis of the title and abstract alone 😅.
Search results from multiple databases can be collated in a single spreadsheet for screening. To benefit from process efficiencies, time-saving and easy collaboration with your team, you can import search results into a specialist tool such as Covidence. A key benefit of Covidence is that you can track decisions made about the inclusion or exclusion of studies in a simple workflow and resolve conflicting decisions quickly and transparently. Covidence currently supports three formats for file imports of search results:
- EndNote XML
- PubMed text format
- RIS text format
If you’d like to try this feature of Covidence but don’t have any data yet, you can download some ready-made sample data.
And you’re done!
There is a lot to think about when planning a search strategy. With practice, expert help, and the right tools your team can complete the search process with confidence.
This blog post is part of the Covidence series on how to write a systematic review.
How to Write a Literature Search Strategy
So, you have worked your tail off to dig deep into the literature to find what you hope will fill over 40 pages of your monstrous Chapter 2. Now what? Well, ideally, you will begin crafting a clear and concise synthesis of the literature. However, oftentimes dissertation candidates struggle with putting it all together. A good place to start (and an easy box to check off for your chapter) is with writing the Literature Search Strategy.

This particular section in your Chapter 2 is really a plug-and-play piece. There are two main things that you focus on in this section. The first one is the databases that you used to locate the articles that you found, and the other is the search terms that you utilized. A secondary, minor, element to this section is the date range of articles searched. Most schools have a five year limitation for non-seminal pieces. Once you have these elements, you can craft something like this:
The search for current, 2013-2018, peer-reviewed articles was conducted via the online library. These databases included Academic OneFile, Academic Search Complete, ERIC, Gale, JSTOR, Sage Journals, and PsycNet. Google Scholar was also utilized to locate open access articles. The following search terms were used to locate articles specific to this study: drug abuse, drug treatment, and so on. Variations of these terms were used to ensure exhaustive search results.
Using the template above, you should be able to quickly and easily knock out this section of your Chapter 2!

Literature search strategy
Sometimes you are required to explain the literature search strategy used in your research. Even when you are not officially required to do so, including an explanation of your literature search strategy in the literature review chapter can boost your marks considerably.
Keeping a literature search diary to record your search activities is a good way of keeping track of your literature review progress. The diary can be in paper format, a Microsoft Word file, or an Excel spreadsheet, and should include the following:
- Names of sources
- Search terms used
- The numbers of search results generated from each source.
Generally, you can conduct your literature search strategy in the following stages:
1. Identification of search terms. For example, for a study entitled “An investigation into the impacts of management practices on the levels of employee motivation at Coca-Cola USA”, search terms can be specified as management, management style, motivation, employee morale, leadership, satisfaction, work-life balance, and others.
Your search strategy for the relevant literature should also consider synonyms of keywords. In the example above, the search term employee motivation might be referred to elsewhere as employee morale or employee willingness.
2. Finding an initial pool of online and offline resources according to the search terms. Equipped with search terms, you can generate a vast pool of relevant literature from a wide range of sources. The most effective secondary data sources include the following:
- Bibliographic databases such as Emerald and Google Scholar
- Online libraries such as Questia
- Conference proceedings
- Key industry journals and magazines
3. Filtering the literature according to the credentials of authors. Due to the word limits imposed on the literature review chapter, as well as other chapters of the dissertation, it is neither possible nor desirable to discuss all of the sources you have found in this chapter. Only the works of the most noteworthy scholars and authors need to be included in the literature review.
Scholars with the highest credentials usually publish their articles in peer-reviewed journals and respectable magazines, rather than newspapers and online blogs. You should take this into account when devising and applying your literature search strategy.
4. Further filtering the remaining literature according to the contribution of the text to the development of the research area. Regardless of the selected research area, the literature review will identify many works completed by respected authorities in the area. Due to word limits, only the most important contributions to the research area need to be mentioned in the literature review.
For example, within the research area of organizational culture, such contributions include Harrison’s Model of Culture (1972), the Competing Values Framework by Cameron and Quinn (1999), Geert Hofstede’s Cultural Dimensions, Trompenaars’ Cultural Dimensions, and others.
5. Filtering the remaining literature according to date of publication. Your search strategy needs to give preference to recent publications. Apart from the inclusion of major models and theoretical frameworks, you have to focus on the latest developments in the research area. It is therefore important to critically analyze up-to-date sources, and the majority of the literature discussed in this chapter should have been published during the last five years.
The literature review chapter of your dissertation should include the literature that remains after applying all five search stages discussed above.

Systematic Reviews: Search Strategy
Validated Search Filters
Depending on your topic, you may be able to save time in constructing your search by using specific search filters (also called "hedges") developed & validated by researchers in the Health Information Research Unit (HiRU) of McMaster University, under contract from the National Library of Medicine. These filters can be found on
- PubMed’s Clinical Queries & Health Services Research Queries pages
- Ovid Medline’s Clinical Queries filters
- Embase & PsycINFO
- EBSCOhost’s main search page for CINAHL (Clinical Queries category)
- HiRU’s Nephrology Filters page
- American University of Beirut, especially for “humans” filters
- Countway Library of Medicine methodology filters
- InterTASC Information Specialists' Sub-Group Search Filter Resource
- SIGN (Scottish Intercollegiate Guidelines Network) filters page
Why Create a Sensitive Search?
In many literature reviews, you try to balance the sensitivity of the search (how many of the potentially relevant articles it finds) & its specificity (how many of the articles it finds are definitely relevant), realizing that you will miss some. In a systematic review, you want a very sensitive search: you are trying to find any potentially relevant article. A systematic review search will:
- contain many synonyms & variants of search terms
- use care in adding search filters
- search multiple resources, databases & grey literature, such as reports & clinical trials
PICO is a good framework to help clarify your systematic review question.
P - Patient, Population or Problem: What are the important characteristics of the patients &/or problem?
I - Intervention: What do you plan to do for the patient or problem?
C - Comparison: What, if anything, is the alternative to the intervention?
O - Outcome: What is the outcome that you would like to measure?
- Beyond PICO: the SPIDER tool for qualitative evidence synthesis.
- 5-SPICE: the application of an original framework for community health worker program design, quality improvement and research agenda setting.
A well constructed search strategy is the core of your systematic review and will be reported on in the methods section of your paper. The search strategy retrieves the majority of the studies you will assess for eligibility & inclusion. The quality of the search strategy also affects what items may have been missed. Informationists can be partners in this process.
For a systematic review, it is important to broaden your search to maximize the retrieval of relevant results.
Use keywords: how might other people describe the topic?
Identify the appropriate index terms (subject headings) for your topic.
- Index terms differ by database (MeSH, or Medical Subject Headings; Emtree terms; subject headings) and are assigned by experts based on the article's content.
- Check the indexing of sentinel articles (3-6 articles that are fundamental to your topic). Sentinel articles can also be used to test your search results.
Include spelling variations (e.g., behavior, behaviour).
Both types of search terms are useful & both should be used in your search.
Keywords help to broaden your results. They will be searched for at least in journal titles, author names, article titles, & article abstracts. They can also be tagged to search all text.
Index/subject terms help to focus your search appropriately, looking for items that have had a specific term applied by an indexer.
Boolean operators let you combine search terms in specific ways to broaden or narrow your results.
An example of a search string for one concept in a systematic review.

In this example from a PubMed search, [mh] = MeSH & [tiab] = Title/Abstract, a more focused version of a keyword search.
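Since the image is not shown here, a string of that shape might look like the following (an illustrative example, not the original figure):

"Anemia"[mh] OR anemi*[tiab] OR anaemi*[tiab] OR "iron deficiency"[tiab]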
A typical database search limit allows you to narrow results so that you retrieve articles that are most relevant to your research question. Limit types vary by database & include:
- Article/publication type
- Publication dates
In a systematic review search, you should use care when applying limits, as you may lose articles inadvertently. For more information, particularly regarding language & format limits, see Cochrane 2008, section 6.4.9.
