
Computer Search Report Checklist (CSRC): Other Assessment Tools

First look - how the CSRC relates to AMSTAR, PRISMA, PRESS

The chart below (see the CSRC Comparison link) is taken from a poster/paper presented on August 8, 2015 at the 2015 Annual Convention of the American Psychological Association in Toronto, Canada. Comparisons to other tools or approaches for assessing computer searches may follow. Comparisons in other studies have examined PRISMA and AMSTAR, as well as AMSTAR and AMSTAR-R. The poster/paper is here.

Also, this chart is one recent attempt at such a comparison (August 2015)*. The focus has been on evaluating the comprehensiveness and reproducibility of computer searches. Further work in this area should be, and is being, pursued.

Clicking on the link below should open the screenshot of the chart in a separate browser tab or window.

* Update on the chart (9/2/2015): depending on perspective, it seems the number of items for comprehensiveness in PRESS could be anywhere from 4 to 20 out of 30. The chart indicates 4/30.

Additional discussion of these three tools has been developed for use in a paper to be submitted that examines the reproducibility of the electronic searches (computer searches) used for systematic reviews (12/21/16).

Other tools

1. PRISMA - Search (PRISMA-S) Extension to PRISMA

Proposal: A project and tool that seeks to "create an extension to the PRISMA Statement, using a consensus-based approach common to other reporting guideline development processes.(20) The Delphi process and a formal consensus conference will produce a checklist that will form the basis of the PRISMA extension. The checklist and its accompanying elaboration and explanation document will enable information specialists and systematic reviewers to have concrete items to report, with detailed examples of preferred methods of reporting each item." See the proposal.

UPDATE: In the spring of 2019 (March 4), development materials were placed on the OSF: PRISMA-S PRISMA Search Reporting Extension. https://doi.org/10.17605/OSF.IO/YGN9W

More comments on this new tool appear below.

2. An instrument for evaluating searches for systematic reviews: The SRS-checklist

From the poster: The SRS-checklist for evaluating search strategies in systematic reviews was compiled from several other lists (e.g., the Cochrane Handbook, PRISMA, PRESS). Several elements were grouped and sometimes merged, and some were rephrased or removed entirely. The checklist consists of 23 questions, equivalent to 23 binary variables, with nine pertaining to reproducibility and fourteen pertaining to the quality of the search strategy.


3. Documentation and Appraisal Review Tool (DART)

A tool that seeks to "assess and document the quality of systematic reviews." See this article.


Comments on PRISMA-S

Comments: As a librarian, I am very much looking forward to using this resource to support students and faculty in their research. This resource has had considerable input.

I also had (and still have) these thoughts (November 1); see a. and b. below.

a. The discussion/explanation in the PRISMA-S documents emphasized targeting what is "most essential for reporting the literature search component...so that it is reproducible".

I have been fixated on "reproducibility". I have also been focused on *methods for searching. More specifically, I have been pushing to understand what a reader needs to see in order to "exactly reproduce" what I have been viewing as a "computer search". And I have recently found the definition of Goodman et al. useful: "Methods reproducibility refers to the provision of enough detail about study procedures and data so the same procedures could, in theory or in actuality, be exactly repeated." Goodman, S. N., Fanelli, D., & Ioannidis, J. P. A. (2016). What does research reproducibility mean? Science Translational Medicine, 8(341), 341ps12. https://doi.org/10.1126/scitranslmed.aaf5027
  
So... coming from a framework and set of activities I have pursued on "computer searches" since 2008, I think that, as far as reproducibility goes, the CSRC focus has been, and is, narrower than that of PRISMA-S. PRISMA-S seems to have a broader focus on reproducibility of the "literature search". This broader focus on the literature search can be seen in the drafts of the PRISMA-S checklists that have been available since March of 2019, as noted above.

b. Also, there are items that seem necessary for "exact replication" of the electronic searches (e.g., name of resource, item 1A; full search strategy used for each database, item 6; etc.). There are also items that seem very desirable for *confidence in reproducibility. That is, there seems to be a difference between exact reproducibility and the *confidence that readers can have in the *possibility of reproducing electronic searches. Some of the PRISMA-S items whose content seems desirable for *confidence in reproducibility (but not necessary for "exact reproducible searches" based on reporting) are items 9, 10, and 11. There may be other items and/or content bearing on confidence in, versus "exact", reproduction of the searches (electronic searches). So, it seems that a report of an electronic/computer search might provide what is needed for "exact reproduction" of the search without providing the additional information that would support (greater?) confidence in the reproducibility of the search.