Variation observed in consensus judgments between pairs of reviewers when assessing the risk of bias due to missing evidence in a sample of published meta-analyses of nutrition research

Raju Kanukula, Joanne E. McKenzie, Aidan G. Cashin, Elizabeth Korevaar, Sally McDonald, Arthur T. Mello, Phi Yen Nguyen, Ian J. Saldanha, Michael A. Wewege, Matthew J. Page (Leading Author)

Research output: Contribution to journal › Article › Research › peer-review

1 Citation (Scopus)

Abstract

Objectives: To evaluate the risk of bias due to missing evidence in a sample of published meta-analyses of nutrition research using the Risk Of Bias due to Missing Evidence (ROB-ME) tool and to determine inter-rater agreement in assessments.

Study Design and Setting: We assembled a random sample of 42 meta-analyses of nutrition research. Eight assessors were randomly assigned to one of four pairs. Each pair assessed 21 randomly assigned meta-analyses, and each meta-analysis was assessed by two pairs. We calculated raw percentage agreement and chance-corrected agreement, using Gwet's Agreement Coefficient (AC), in consensus judgments between pairs.

Results: Across the eight signaling questions in the ROB-ME tool, raw percentage agreement ranged from 52% to 100%, and Gwet's AC ranged from 0.39 to 0.76. For the overall risk-of-bias judgment, raw percentage agreement was 76% (95% confidence interval 60% to 92%) and Gwet's AC was 0.47 (95% confidence interval 0.14 to 0.80). In seven (17%) meta-analyses, either one or both pairs judged the risk of bias due to missing evidence as “low risk”.

Conclusion: Our findings indicated substantial variation in consensus judgments between pairs, both for the signaling questions and for the overall risk-of-bias judgments. More tutorials and training are needed to help researchers apply the ROB-ME tool more consistently.
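The two agreement statistics reported above can be illustrated with a minimal sketch. This is not the authors' analysis code (they report multi-category judgments with confidence intervals); it is a simplified Python example, assuming two raters and a binary coding of each judgment (e.g., 1 = "low risk", 0 = otherwise), showing how raw percentage agreement and Gwet's AC1 are computed.

```python
def raw_agreement(r1, r2):
    """Proportion of items on which the two raters give the same judgment."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def gwet_ac1(r1, r2):
    """Gwet's AC1 chance-corrected agreement for two raters, binary categories.

    AC1 = (pa - pe) / (1 - pe), where pe = 2 * pi * (1 - pi) and pi is the
    average proportion of items placed in category 1 across the two raters.
    """
    n = len(r1)
    pa = raw_agreement(r1, r2)
    pi1 = (sum(r1) + sum(r2)) / (2 * n)   # mean proportion in category 1
    pe = 2 * pi1 * (1 - pi1)              # chance-agreement probability
    return (pa - pe) / (1 - pe)

# Hypothetical judgments from two reviewer pairs on six meta-analyses
pair_a = [1, 1, 0, 1, 0, 1]
pair_b = [1, 1, 0, 0, 0, 1]
print(raw_agreement(pair_a, pair_b))  # 5/6 ≈ 0.833
print(gwet_ac1(pair_a, pair_b))       # 25/37 ≈ 0.676
```

Unlike Cohen's kappa, AC1's chance-correction term stays stable when one category dominates, which is one reason it is often preferred for risk-of-bias agreement studies where most judgments fall in a single category.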

Original language: English
Article number: 111244
Number of pages: 10
Journal: Journal of Clinical Epidemiology
Volume: 166
DOIs
Publication status: Published - Feb 2024

Keywords

  • Bias
  • Meta-analysis
  • Nutritional sciences
  • Reliability
  • Reporting bias
  • Systematic review
