Toward a Framework for the Design, Implementation, and Reporting of Methodology Scoping Reviews

Abstract

Background and Objective

In view of the growth in the number of published articles, there is an increasing need for studies that summarize scientific research. An increasingly common type of review is the "methodology scoping review," which provides a summary of existing analytical methods, techniques and software that have been proposed or applied in research articles to address an analytical problem or further an analytical approach. However, guidelines for their design, implementation, and reporting are limited.

Methods

Drawing on the experiences of the authors, which were consolidated through a series of face-to-face workshops, we summarize the challenges inherent in conducting a methodology scoping review and offer suggestions for best practice to promote future guideline development.

Results

We identified three challenges of conducting a methodology scoping review. First, the identification of search terms: these usually cannot be defined a priori, and the language used for a particular method can vary across the literature. Second, the scope of the review requires careful consideration because new methodology is often not described (in full) within abstracts. Third, many new methods are motivated by a specific clinical question, and the methodology may only be documented in supplementary materials. We formulated several recommendations that build upon existing review guidelines, ranging from an iterative approach to defining search terms through to the screening and data extraction processes.

Conclusion

Although methodology scoping reviews are an important aspect of research, there is currently a lack of guidelines to standardize their design, implementation, and reporting. We recommend a wider discussion on this topic.

Publication
Journal of Clinical Epidemiology
Dr. Michael Barrowman, PhD