TY - JOUR
T1 - Curtailing marking variation and enhancing feedback in large scale undergraduate chemistry courses through reducing academic judgement: a case study
AU - George-Williams, Stephen
AU - Carroll, Mary-Rose
AU - Ziebell, Angela
AU - Thompson, Christopher
AU - Overton, Tina
PY - 2019/8/18
Y1 - 2019/8/18
N2 - Variation in marks awarded, alongside the quality of feedback, is an issue whenever large-scale assessment is undertaken. In particular, variation between sessional teaching staff has been studied for decades, resulting in many recorded efforts to overcome this issue. Attempts to curtail variation range from moderation meetings and extended training programmes to electronic tools, automated feedback, and even audio/video feedback. Decreased marking variation was observed whenever automated marking was used, potentially because less academic judgment was exercised by the markers. This article focuses on a case study of three interventions undertaken at Monash University that were designed to address concerns around the variability of marking and feedback between sessional teaching staff employed in the chemistry teaching laboratories. The interventions included the use of detailed marking criteria, Excel marking spreadsheets and automatically marked Moodle reports. Results indicated that more detailed marking criteria had no effect, whilst automated processes caused a consistent decrease in variation. This was attributed to a decrease in the academic judgment markers were expected to use. Only the Excel spreadsheet ensured the provision of consistent feedback to students. Sessional teaching staff commented that their marking loads were reduced and that the new methods were easy to use.
AB - Variation in marks awarded, alongside the quality of feedback, is an issue whenever large-scale assessment is undertaken. In particular, variation between sessional teaching staff has been studied for decades, resulting in many recorded efforts to overcome this issue. Attempts to curtail variation range from moderation meetings and extended training programmes to electronic tools, automated feedback, and even audio/video feedback. Decreased marking variation was observed whenever automated marking was used, potentially because less academic judgment was exercised by the markers. This article focuses on a case study of three interventions undertaken at Monash University that were designed to address concerns around the variability of marking and feedback between sessional teaching staff employed in the chemistry teaching laboratories. The interventions included the use of detailed marking criteria, Excel marking spreadsheets and automatically marked Moodle reports. Results indicated that more detailed marking criteria had no effect, whilst automated processes caused a consistent decrease in variation. This was attributed to a decrease in the academic judgment markers were expected to use. Only the Excel spreadsheet ensured the provision of consistent feedback to students. Sessional teaching staff commented that their marking loads were reduced and that the new methods were easy to use.
KW - Electronic marking
KW - large cohorts
KW - marking criteria
KW - sessional teaching staff
UR - http://www.scopus.com/inward/record.url?scp=85059320740&partnerID=8YFLogxK
U2 - 10.1080/02602938.2018.1545897
DO - 10.1080/02602938.2018.1545897
M3 - Article
AN - SCOPUS:85059320740
VL - 44
SP - 881
EP - 893
JO - Assessment & Evaluation in Higher Education
JF - Assessment & Evaluation in Higher Education
SN - 0260-2938
IS - 6
ER -