Frameworks and quality measures used for debriefing in team-based simulation

A systematic review

Ruth Endacott, Thomas Gale, Anita O'Connor, Samantha Dix

Research output: Contribution to journal › Review Article › Research › peer-review

1 Citation (Scopus)

Abstract

Objectives: The skill of the debriefer is known to be the strongest independent predictor of the quality of simulation encounters, yet educators feel underprepared for this role. The aim of this review was to identify frameworks used for debriefing team-based simulations and measures used to assess debriefing quality.

Methods: We systematically searched the PubMed, CINAHL, MEDLINE and Embase databases for simulation studies that evaluated a debriefing framework. Two reviewers evaluated study quality and retrieved information regarding study methods, debriefing framework, outcome measures and debriefing quality.

Results: A total of 676 papers published between January 2003 and December 2017 were identified using the search protocol. Following screening of abstracts, 37 full-text articles were assessed for eligibility, 26 studies met inclusion criteria for quality appraisal and 18 achieved a sufficiently high quality score for inclusion in the evidence synthesis. A debriefing framework was used in all studies, mostly tailored to the study. Impact of the debrief was measured using satisfaction surveys (n=11) and/or participant performance (n=18). Three themes emerged from the data synthesis: selection and training of facilitators, debrief model, and debrief assessment. There was little commonality across studies in terms of participants, experience of faculty and measures used.

Conclusions: A range of debriefing frameworks were used in these studies. Some key aspects of debriefing for team-based simulation, such as facilitator training, the inclusion of a reaction phase and the impact of learner characteristics on debrief outcomes, have no or limited evidence and provide opportunities for future research, particularly with interprofessional groups.

Original language: English
Pages (from-to): 61-72
Number of pages: 12
Journal: BMJ Simulation and Technology Enhanced Learning
Volume: 5
Issue number: 2
DOI: 10.1136/bmjstel-2017-000297
Publication status: Published - Apr 2019

Keywords

  • debriefing
  • frameworks
  • interprofessional
  • simulation
  • validity

Cite this

@article{e765bcd48735438fb2c1944f12b70497,
title = "Frameworks and quality measures used for debriefing in team-based simulation: A systematic review",
abstract = "Objectives The skill of the debriefer is known to be the strongest independent predictor of the quality of simulation encounters yet educators feel underprepared for this role. The aim of this review was to identify frameworks used for debriefing team-based simulations and measures used to assess debriefing quality. Methods We systematically searched PubMed, CINAHL, MedLine and Embase databases for simulation studies that evaluated a debriefing framework. Two reviewers evaluated study quality and retrieved information regarding study methods, debriefing framework, outcome measures and debriefing quality. Results A total of 676 papers published between January 2003 and December 2017 were identified using the search protocol. Following screening of abstracts, 37 full-text articles were assessed for eligibility, 26 studies met inclusion criteria for quality appraisal and 18 achieved a sufficiently high-quality score for inclusion in the evidence synthesis. A debriefing framework was used in all studies, mostly tailored to the study. Impact of the debrief was measured using satisfaction surveys (n=11) and/or participant performance (n=18). Three themes emerged from the data synthesis: selection and training of facilitators, debrief model and debrief assessment. There was little commonality across studies in terms of participants, experience of faculty and measures used. Conclusions A range of debriefing frameworks were used in these studies. Some key aspects of debrief for team-based simulation, such as facilitator training, the inclusion of a reaction phase and the impact of learner characteristics on debrief outcomes, have no or limited evidence and provide opportunities for future research particularly with interprofessional groups.",
keywords = "debriefing, frameworks, interprofessional, simulation, validity",
author = "Ruth Endacott and Thomas Gale and Anita O'Connor and Samantha Dix",
year = "2019",
month = "4",
doi = "10.1136/bmjstel-2017-000297",
language = "English",
volume = "5",
pages = "61--72",
journal = "BMJ Simulation and Technology Enhanced Learning",
issn = "2056-6697",
number = "2",
}

Frameworks and quality measures used for debriefing in team-based simulation: A systematic review. / Endacott, Ruth; Gale, Thomas; O'Connor, Anita; Dix, Samantha.

In: BMJ Simulation and Technology Enhanced Learning, Vol. 5, No. 2, 04.2019, p. 61-72.


TY - JOUR

T1 - Frameworks and quality measures used for debriefing in team-based simulation

T2 - A systematic review

AU - Endacott, Ruth

AU - Gale, Thomas

AU - O'Connor, Anita

AU - Dix, Samantha

PY - 2019/4

Y1 - 2019/4

N2 - Objectives The skill of the debriefer is known to be the strongest independent predictor of the quality of simulation encounters yet educators feel underprepared for this role. The aim of this review was to identify frameworks used for debriefing team-based simulations and measures used to assess debriefing quality. Methods We systematically searched PubMed, CINAHL, MedLine and Embase databases for simulation studies that evaluated a debriefing framework. Two reviewers evaluated study quality and retrieved information regarding study methods, debriefing framework, outcome measures and debriefing quality. Results A total of 676 papers published between January 2003 and December 2017 were identified using the search protocol. Following screening of abstracts, 37 full-text articles were assessed for eligibility, 26 studies met inclusion criteria for quality appraisal and 18 achieved a sufficiently high-quality score for inclusion in the evidence synthesis. A debriefing framework was used in all studies, mostly tailored to the study. Impact of the debrief was measured using satisfaction surveys (n=11) and/or participant performance (n=18). Three themes emerged from the data synthesis: selection and training of facilitators, debrief model and debrief assessment. There was little commonality across studies in terms of participants, experience of faculty and measures used. Conclusions A range of debriefing frameworks were used in these studies. Some key aspects of debrief for team-based simulation, such as facilitator training, the inclusion of a reaction phase and the impact of learner characteristics on debrief outcomes, have no or limited evidence and provide opportunities for future research particularly with interprofessional groups.

AB - Objectives The skill of the debriefer is known to be the strongest independent predictor of the quality of simulation encounters yet educators feel underprepared for this role. The aim of this review was to identify frameworks used for debriefing team-based simulations and measures used to assess debriefing quality. Methods We systematically searched PubMed, CINAHL, MedLine and Embase databases for simulation studies that evaluated a debriefing framework. Two reviewers evaluated study quality and retrieved information regarding study methods, debriefing framework, outcome measures and debriefing quality. Results A total of 676 papers published between January 2003 and December 2017 were identified using the search protocol. Following screening of abstracts, 37 full-text articles were assessed for eligibility, 26 studies met inclusion criteria for quality appraisal and 18 achieved a sufficiently high-quality score for inclusion in the evidence synthesis. A debriefing framework was used in all studies, mostly tailored to the study. Impact of the debrief was measured using satisfaction surveys (n=11) and/or participant performance (n=18). Three themes emerged from the data synthesis: selection and training of facilitators, debrief model and debrief assessment. There was little commonality across studies in terms of participants, experience of faculty and measures used. Conclusions A range of debriefing frameworks were used in these studies. Some key aspects of debrief for team-based simulation, such as facilitator training, the inclusion of a reaction phase and the impact of learner characteristics on debrief outcomes, have no or limited evidence and provide opportunities for future research particularly with interprofessional groups.

KW - debriefing

KW - frameworks

KW - interprofessional

KW - simulation

KW - validity

UR - http://www.scopus.com/inward/record.url?scp=85063938695&partnerID=8YFLogxK

U2 - 10.1136/bmjstel-2017-000297

DO - 10.1136/bmjstel-2017-000297

M3 - Review Article

VL - 5

SP - 61

EP - 72

JO - BMJ Simulation and Technology Enhanced Learning

JF - BMJ Simulation and Technology Enhanced Learning

SN - 2056-6697

IS - 2

ER -