Inter-rater agreement on assessment of outcome within a trauma registry

C. L. Ekegren, M. J. Hart, A. Brown, B. J. Gabbe

Research output: Contribution to journal › Article › Research › peer-review

14 Citations (Scopus)

Abstract

Introduction: To better evaluate the degree of ongoing disability in trauma patients, it has been recommended that trauma registries introduce routine long-term outcome measurement. One of the measures recommended for use is the Extended Glasgow Outcome Scale (GOS-E). However, few registries have adopted this measure, and further research is required to determine its reliability in trauma populations. This study aimed to evaluate the inter-rater agreement of GOS-E scoring between an expert rater and trauma registry follow-up staff, using a sample of detailed trauma case scenarios.

Methods: Sixteen trauma registry telephone interviewers participated in the study. They were provided with a written summary of 15 theoretical adult trauma cases covering a spectrum of disability and asked to rate each case using the structured GOS-E interview. Their ratings were compared with those of an expert rater in order to calculate the inter-rater agreement for each individual rater-expert rater pair. Agreement was reported as the percentage of agreement, the kappa statistic, and weighted kappa. A multi-rater kappa value was also calculated for agreement between the 16 raters. 
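The agreement statistics described above follow their standard definitions. As an illustration only (the ratings below are invented, not the study's data), the following sketch computes percent agreement, Cohen's kappa, and linearly weighted kappa for one hypothetical rater-expert pair on 15 cases scored on the 8-level GOS-E:

```python
# Sketch (not the authors' code): standard inter-rater agreement statistics
# for ordinal scores such as the 8-level GOS-E. All ratings below are
# hypothetical illustrative data.

def agreement_stats(rater, expert, categories, weighted=False):
    """Return (percent agreement, kappa). With weighted=True, kappa is
    linearly weighted: disagreements are penalised by ordinal distance."""
    n = len(rater)
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    # Observed joint distribution of (rater, expert) scores.
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(rater, expert):
        obs[idx[a]][idx[b]] += 1.0 / n
    row = [sum(obs[i]) for i in range(k)]                        # rater marginals
    col = [sum(obs[i][j] for i in range(k)) for j in range(k)]   # expert marginals
    # Disagreement weights: 0/1 for plain kappa, |i - j| / (k - 1) if weighted.
    w = [[(abs(i - j) / (k - 1)) if weighted else float(i != j)
          for j in range(k)] for i in range(k)]
    wo = sum(w[i][j] * obs[i][j] for i in range(k) for j in range(k))         # observed
    we = sum(w[i][j] * row[i] * col[j] for i in range(k) for j in range(k))   # chance-expected
    percent = sum(obs[i][i] for i in range(k))
    return percent, 1.0 - wo / we

# Hypothetical GOS-E ratings (1 = dead ... 8 = upper good recovery)
# for 15 cases: one follow-up rater vs. the expert rater.
rater  = [3, 5, 5, 7, 8, 2, 4, 6, 6, 7, 3, 5, 8, 4, 6]
expert = [3, 5, 6, 7, 8, 2, 4, 6, 5, 7, 3, 5, 8, 4, 6]
cats = list(range(1, 9))

pct, kappa = agreement_stats(rater, expert, cats)
_, wkappa = agreement_stats(rater, expert, cats, weighted=True)
print(f"agreement = {pct:.0%}, kappa = {kappa:.2f}, weighted kappa = {wkappa:.2f}")
```

Note that the two disagreements in this toy data are each only one GOS-E level apart, so the weighted kappa is higher than the unweighted value; this is why weighted kappa is often preferred for ordinal scales like the GOS-E.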

Results: Across the 15 cases, the percentage of agreement between individual raters and the expert ranged from 63% to 100%. Across the 16 raters, the percentage of agreement with the expert rater ranged from 73% to 100% (mean = 90%). Kappa values ranged from 0.65 to 1.00 across raters (mean = 0.86), and weighted kappa values ranged from 0.73 to 1.00 (mean = 0.89). The multi-rater kappa value was 0.78 (95% CI: 0.66, 0.89).

Conclusions: Sixteen follow-up staff achieved 'substantial' to 'almost perfect' agreement with an expert rater using the GOS-E outcome measure to score 15 sample trauma cases. The results of this study lend support to the use of the GOS-E within trauma populations and highlight the importance of ongoing training where multiple raters are involved to ensure reliable outcome reporting. It is also recommended that the structured GOS-E interview guide be used to achieve better agreement between raters. Ensuring the reliability of trauma outcome scores will enable more accurate evaluation of patient outcomes, and ultimately, more targeted trauma care.

Original language: English
Pages (from-to): 130-134
Number of pages: 5
Journal: Injury
Volume: 47
Issue number: 1
DOIs
Publication status: Published - 1 Jan 2016

Keywords

  • Extended Glasgow Outcome Scale
  • Inter-rater agreement
  • Outcome measures
  • Registry
  • Reliability
  • Trauma
  • Validity
