Timely knowledge synthesis during infodemics: two-week systematic reviews (2weekSR)
THE INFODEMIC
Evidence about the effectiveness of therapies, procedures, tests and public health interventions is being generated at an accelerating pace. In the late 1970s, 14 trials were published per day; by 2010, 75 trials and 11 systematic reviews were published per day.1 The growth in trials and systematic reviews has continued, with the COVID-19 pandemic most recently having unleashed ‘a deluge of publications’.2(para.1)
Some commentators, including the WHO Director-General Tedros Adhanom Ghebreyesus, have observed that we are now in an era of the infodemic.3 The term ‘infodemic’ was coined by Gunther Eysenbach to indicate a situation where there is an excess of information (both accurate and inaccurate), which causes confusion and makes identifying solutions to a problem more difficult.4 Whilst this poses a considerable challenge to decision-makers, Eysenbach has subsequently proposed that infodemics can be managed. The four pillars of infodemic management are: 1) information monitoring; 2) building health and science literacy; 3) encouraging knowledge refinement and quality improvement processes (e.g., fact checking, peer review); and 4) accurate and timely knowledge translation.4,5
THE NEED FOR ACCURATE AND TIMELY KNOWLEDGE SYNTHESES (SYSTEMATIC REVIEWS)
A systematic review combines the results from multiple clinical trials (or other types of studies) and involves searching for and identifying the relevant trials, assessing their quality (or validity) and meta-analysing (i.e., statistically combining) their results.6 Because systematic reviews use systematic, reproducible and transparent methods, they are considered the highest-quality evidence by clinicians, policymakers and decision-makers.7 Systematic reviews are used to inform a variety of decisions, ranging from which therapy to prescribe in a one-on-one clinician-patient encounter, to multimillion dollar decisions by policymakers about subsidising new treatments, diagnostic and screening tests, surgical procedures and other therapies utilised by healthcare systems.
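To make the ‘statistically combining’ step concrete, the sketch below shows inverse-variance fixed-effect pooling, one common meta-analytic approach. It is an illustration only: the function name and example numbers are hypothetical, and this is not the specific model or software used in any particular review discussed here.

```python
import math

def fixed_effect_meta(estimates, std_errors):
    """Pool study effect estimates by inverse-variance weighting.

    Each study's weight is 1 / SE^2, so more precise studies
    (smaller standard errors) contribute more to the pooled result.
    """
    weights = [1 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    # 95% confidence interval under a normal approximation
    ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
    return pooled, pooled_se, ci

# Illustrative use: two hypothetical trials with effect estimates
# 0.5 (SE 0.1) and 0.3 (SE 0.2); the pooled estimate sits closer
# to the more precise first trial.
pooled, se, ci = fixed_effect_meta([0.5, 0.3], [0.1, 0.2])
```

In practice, review teams use dedicated software and often random-effects models that allow for between-study variation, but the weighting idea above is the common core.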
However, systematic reviews are very time consuming to produce. The time that is required varies, depending on the size of the team, type and complexity of the question that needs to be answered and the volume of the existing evidence (e.g., the number of trials). On average, a systematic review will require 67 weeks (approximately one year and four months) to complete, but may take up to 186 weeks (3½ years).8
This is a barrier for decision-makers, who often require evidence in much shorter timeframes. This has been especially true during the COVID-19 pandemic, when decisions (about pharmaceutical treatments, public health interventions, etc.) have often needed to be made urgently.
2weekSRs (TWO-WEEK SYSTEMATIC REVIEWS)
The 2weekSR team, based at the Institute for Evidence-Based Healthcare at Bond University, Australia, has designed an innovative methodology to address the fourth of Eysenbach’s four pillars of infodemic management: accurate and timely knowledge translation. The methodology, called 2weekSR, shortens the time required to conduct a systematic review from the average of 67 weeks identified by Borah and colleagues8 to approximately two weeks, depending on the complexity of the question.
The first 2weekSR was conducted in 2019. It reviewed the evidence from randomised controlled trials on the impact of increased fluid intake (e.g., drinking extra water or juice) for preventing urinary tract infections (i.e., it asked an intervention question). The review involved screening approximately 1400 references, included eight studies, and was completed in nine work days (i.e., two calendar weeks, hence the name).
Four key elements allowed us to successfully complete the first 2weekSR in two weeks: 1) protected time for the project (i.e., we worked mostly on just this project); 2) a review team with complementary skills (clinicians, a methodologist and a search specialist); 3) use of agile methodology; and 4) use of systematic review automation tools (many of which were developed at our Institute and are freely available for anyone to use on the Systematic Review Accelerator (SRA) website). The methodology was published in more detail in 2020.9
SUBSEQUENT 2weekSRs – INCLUDING COVID
When the COVID-19 pandemic began, we, like many other researchers, shifted some of our work focus to systematically reviewing the evidence on COVID-related topics. In 2020, we conducted four 2weekSRs on COVID-related topics:
Downsides of face masks and possible mitigation strategies: a systematic review and meta-analysis: 6 people, screened ~5500 studies, included 37 studies, required 12 workdays.
Comparison of seroprevalence of SARS-CoV-2 infections with cumulative and imputed COVID-19 cases: systematic review: 7 people, screened ~2200 studies, included 17 studies, required 13 workdays.
Estimating the extent of asymptomatic COVID-19 and its potential for community transmission: a systematic review and meta-analysis: 6 people, screened ~2500 studies, included 13 studies, required 13 workdays.
Impact of COVID-19 pandemic on utilisation of healthcare services: a systematic review: 14 people, screened ~3100 studies, included 81 studies, required 18 workdays.
LESSONS LEARNED AND NEXT STEPS
The conditions under which the first 2weekSR was conducted were very unusual: the team was small (four people) and highly experienced; we worked in close physical proximity, in real time and face-to-face; and we were all already familiar with the automation tools. These conditions do not reflect those under which most systematic reviews are conducted.
Therefore, in subsequent 2weekSRs (including the ones focused on COVID, mentioned previously), we applied the methodology to different types of questions (e.g., questions about prevalence or about harms and adverse events) and conducted them under more complex conditions (with less experienced team members, larger teams, a mix of working remotely and face-to-face, a greater variety of study types included in the review, etc.). We have found that the 2weekSRs, in practice, range from a 1weekSR (which included six studies) to a 4weekSR (which included 81 studies). In other words, the time to complete the review increases for larger and more complex systematic reviews, but remains considerably shorter than the 67 week average.
Encouraged by this success, we aim to test the feasibility of extending the 2weekSR methodology to updating and maintaining completed reviews, to ‘living systematic reviews’ (systematic reviews that are continually updated as new trials become available)10 and to scoping reviews (which ask broader questions than systematic reviews), thereby continuing to contribute to Eysenbach’s fourth pillar for managing the infodemic.
References
1. Bastian H, Glasziou P, Chalmers I. Seventy-five trials and eleven systematic reviews a day: how will we ever keep up? PLoS Med. 2010 Sep 21;7(9):e1000326.
2. Gianola S, Jesus TS, Bargeri S, Castellini G. Characteristics of academic publications, preprints, and registered clinical trials on the COVID-19 pandemic. PLoS One. 2020;15(10):e0240123.
3. Tangcharoensathien V, Calleja N, Nguyen T, Purnat T, D’Agostino M, Garcia-Saiso S, et al. Framework for managing the COVID-19 infodemic: methods and results of an online, crowdsourced WHO technical consultation. J Med Internet Res. 2020;22(6):e19659.
4. Eysenbach G. Infodemiology and infoveillance: framework for an emerging set of public health informatics methods to analyze search, communication and publication behavior on the Internet. J Med Internet Res. 2009;11(1):e11.
5. Eysenbach G. How to fight an infodemic: the four pillars of infodemic management. J Med Internet Res. 2020;22(6):e21820.
6. Higgins JPT, Thomas J, Chandler J, Cumpston M, Li T, Page MJ, Welch VA, editors. Cochrane Handbook for Systematic Reviews of Interventions version 6.2 (updated February 2021). Cochrane; 2021 [cited YYYY MMM DD]. Available from: www.training.cochrane.org/handbook.
7. Bero LA, Jadad AR. How consumers and policymakers can use systematic reviews for decision making. Ann Intern Med. 1997;127(1):37-42.
8. Borah R, Brown AW, Capers PL, Kaiser KA. Analysis of the time and workers needed to conduct systematic reviews of medical interventions using data from the PROSPERO registry. BMJ Open. 2017;7(2):e012545.
9. Clark J, Glasziou P, Del Mar C, Bannach-Brown A, Stehlik P, Scott AM. A full systematic review was completed in 2 weeks using automation tools: a case study. J Clin Epidemiol. 2020;121:81-90.
10. Millard T, Synnot A, Elliott J, Green S, McDonald S, Turner T. Feasibility and acceptability of living systematic reviews: results from a mixed-methods evaluation. Syst Rev. 2019;8(1):325.
Authors
Anna Mae Scott1
Paul Glasziou1
1Institute for Evidence-Based Healthcare, Bond University, Gold Coast, QLD, Australia
Disclaimer
The views expressed in this World EBHC Day Blog, as well as any errors or omissions, are the sole responsibility of the authors and do not represent the views of the World EBHC Day Steering Committee, Official Partners or Sponsors; nor do they imply endorsement by the aforementioned parties.