A national training program for simulation educators and technicians: evaluation strategy and outcomes

Debra Nestel, Margaret Bearman, Peter Brooks, Dylan Campher, Kirsty Freeman, Jennene Greenhill, Brian C Jolly, Leanne Rogers, Cobie Rudd, Cyle Sprick, Beverley Sutton, Jennifer Meiliana Harlim, Marcus Watson

Research output: Contribution to journal › Article › Research › peer-review

15 Citations (Scopus)

Abstract

Background: Simulation-based education (SBE) has seen a dramatic uptake in health professions education over the last decade. SBE offers learning opportunities that are difficult to access by other methods. Competent faculty is seen as key to high-quality SBE. In 2011, in response to a significant national healthcare issue (the need to enhance the quality and scale of SBE), a group of Australian universities was commissioned to develop a national training program: the Australian Simulation Educator and Technician Training (AusSETT) Program. This paper reports the evaluation of this large-scale initiative.

Methods: The AusSETT Program adopted a train-the-trainer model, which offered up to three days of workshops and between four and eight hours of e-learning. The Program was offered across all professions in all states and territories. Three hundred and three participants attended workshops, with 230 also completing e-learning modules. Topics included: foundational learning theory; orientation to diverse simulation modalities; briefing; and debriefing. A layered, objectives-oriented evaluation strategy was adopted, with multiple stakeholders (participants, external experts), multiple methods of data collection (end-of-module evaluations, workshop observer reports and individual interviews) and multiple data points (immediately and two months later). Descriptive statistics were used to analyse numerical data, while textual data (written comments and transcripts of interviews) underwent content or thematic analysis.

Results: For each module, between 45 and 254 participants completed evaluations. The content and educational methods were rated highly, with items exceeding the pre-established standard. In written evaluations, participants identified strengths (e.g. high-quality facilitation, breadth and depth of content) and areas for development (e.g. electronic portfolio, learning management system) of the Program. Interviews with participants suggested the Program had positively impacted their educational practices. Observers reported a high-quality educational experience for participants, with alignment of content and methods with perceived participant needs.

Conclusions: The AusSETT Program is a significant and enduring learning resource. The development of a national training program to support a competent simulation workforce is feasible. The Program objectives were largely met. Although there are limitations to the study design (e.g. self-report), there are also strengths, such as exploring the impact two months later. The evaluation of the Program informs the next phase of the national strategy for simulation educators and technicians with respect to content and processes, strengths and areas for development.
Original language: English
Article number: 25
Number of pages: 13
Journal: BMC Medical Education
Volume: 16
Issue number: 1
DOIs: https://doi.org/10.1186/s12909-016-0548-x
Publication status: Published - 22 Jan 2016

Keywords

  • Faculty development
  • Health workforce
  • Program evaluation
  • Simulation

Cite this

Nestel, D., Bearman, M., Brooks, P., Campher, D., Freeman, K., Greenhill, J., Jolly, B. C., Rogers, L., Rudd, C., Sprick, C., Sutton, B., Harlim, J. M., & Watson, M. (2016). A national training program for simulation educators and technicians: evaluation strategy and outcomes. BMC Medical Education, 16(1), [25]. https://doi.org/10.1186/s12909-016-0548-x