Reliability of an existing tool of professionalism assessment for pre-clinical medical students.

Research output: Contribution to conference › Abstract › peer-review

Abstract

Background: Medical professionalism is a core competency for medical graduates and correlates significantly with other key competencies associated with medical practice (Kirk, 2007; Larkin, 2003). Professionalism development begins in the pre-clinical training years, giving students time to start forming their professional identity in line with course expectations. Behaviours used as proxies for professionalism assessment at the pre-clinical level include attendance, punctuality, communication, respect, accountability and engagement. Objective measurement of these behaviours is challenging, and a reliable and valid assessment of professionalism is needed.

Aim: To evaluate the internal consistency of professionalism assessment data from pre-clinical medical students.

Methods: Retrospective analysis of data collected from 105 pre-clinical graduate-entry medical students. Data were collected by 18 evaluators during problem-based learning sessions (PBL) and clinical placement (CP). Exploratory factor analysis (EFA) (extraction method: principal component extraction with oblimin rotation and Kaiser normalization) and sensitivity analyses were conducted on these data.

Results: EFA identified PBL and CP as distinct components. The overall Cronbach's alpha (α) of the assessment tool was 0.584. Individually, PBL recorded an α of 0.759 and CP recorded an α of 0.584. Sensitivity analysis revealed that removal of 'accountability' from the CP domain produced the greatest increase in overall alpha (to 0.720). Removal of the CP domain items 'punctuality' and 'communication' also substantially improved α, suggesting that the CP domain and its 'accountability', 'punctuality' and 'communication' items require further development.

Conclusion: PBL and CP are distinct domains of professionalism assessment in pre-clinical medicine. Modification of 'accountability', 'punctuality' and 'communication' is likely to increase assessment reliability. One caveat is that nearly all students received very high scores, limiting differentiation. Further research with a larger data set is recommended, and evaluation of discriminant validity is also required to confirm the psychometric properties of the tool.
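
Illustrative analysis sketch (not the authors' code): the lines below show how the reliability analysis described in the Methods, Cronbach's alpha, the item-deleted sensitivity analysis and a two-component oblimin-rotated EFA, could be reproduced in Python. The column names, the CSV file name and the use of the factor_analyzer package are assumptions for illustration only; the published analysis was presumably run in standard statistical software.

import pandas as pd
from factor_analyzer import FactorAnalyzer  # assumed optional dependency for the EFA step

def cronbach_alpha(items: pd.DataFrame) -> float:
    # Standard formula: alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def alpha_if_deleted(items: pd.DataFrame) -> pd.Series:
    # Sensitivity analysis: recompute alpha with each item removed in turn
    return pd.Series({col: cronbach_alpha(items.drop(columns=col)) for col in items.columns})

# Hypothetical item columns (one row per student); the real instrument's labels may differ
pbl_items = ["pbl_attendance", "pbl_punctuality", "pbl_communication",
             "pbl_respect", "pbl_accountability", "pbl_engagement"]
cp_items = ["cp_attendance", "cp_punctuality", "cp_communication",
            "cp_respect", "cp_accountability", "cp_engagement"]

scores = pd.read_csv("professionalism_scores.csv")  # assumed file layout
all_items = scores[pbl_items + cp_items]

print("overall alpha:", cronbach_alpha(all_items))
print("PBL alpha:", cronbach_alpha(scores[pbl_items]))
print("CP alpha:", cronbach_alpha(scores[cp_items]))
print(alpha_if_deleted(all_items).sort_values(ascending=False))

# Two-component EFA with principal-component extraction and oblimin rotation
fa = FactorAnalyzer(n_factors=2, rotation="oblimin", method="principal")
fa.fit(all_items)
print(pd.DataFrame(fa.loadings_, index=all_items.columns))

Items whose removal raises alpha well above the overall value, as reported for 'accountability', 'punctuality' and 'communication' in the CP domain, would be flagged by the sensitivity step as candidates for revision.
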

Conference

Conference: Australian & New Zealand Association for Health Professional Educators Conference 2021
Abbreviated title: ANZAHPE 2021
City: Virtual conference
Period: 6/07/21 – 17/07/21
Other: ANZAHPE Festival 2021
Theme: Moving forward in ambiguity

Keywords

  • professionalism assessment, professionalism, reliability
