TY - JOUR
T1 - ‘I will go to my grave fighting for grammar’
T2 - Exploring the ability of language-trained raters to implement a professionally-relevant rating scale for writing
AU - Knoch, Ute
AU - Zhang, Barbara Ying
AU - Elder, Catherine
AU - Flynn, Eleanor
AU - Huisman, Annemiek
AU - Woodward-Kron, Robyn
AU - Manias, Elizabeth
AU - McNamara, Tim
N1 - Funding Information:
This work was supported by the Australian Research Council [Linkage grant number LP130100171]. We would also like to acknowledge the support of the Cambridge Boxhill Language Assessment, the owner of the Occupational English Test (OET), and all the OET raters who participated in the study.
Publisher Copyright:
© 2020 Elsevier Inc.
PY - 2020/10
Y1 - 2020/10
N2 - Researchers have recommended involving domain experts in the design of scoring rubrics for language for specific purposes (LSP) tests by eliciting profession-relevant, indigenous criteria and applying these to test performances (see, e.g., Douglas, 2001; Jacoby, 1998; Pill, 2016). However, these indigenous criteria, derived as they are from people outside the assessment field, may be difficult to apply for the non-domain-expert raters typically employed to rate performances on language tests. This paper addresses this issue with reference to the writing component of the Occupational English Test (OET), a test designed to assess the English communication skills of overseas-trained health professionals. The paper describes the development of a set of professionally-relevant writing descriptors and then explores how well language-trained raters (N = 15) were able to apply these to a set of OET writing samples. All raters were interviewed, and the rating data were analysed statistically. The findings show that while the statistical properties of the score data were generally satisfactory, some raters felt unable to apply the scale confidently because of their perceived lack of medical knowledge. The study has implications for scale design, rater training and the use of professionally-relevant rating scales for LSP testing purposes.
AB - Researchers have recommended involving domain experts in the design of scoring rubrics for language for specific purposes (LSP) tests by eliciting profession-relevant, indigenous criteria and applying these to test performances (see, e.g., Douglas, 2001; Jacoby, 1998; Pill, 2016). However, these indigenous criteria, derived as they are from people outside the assessment field, may be difficult to apply for the non-domain-expert raters typically employed to rate performances on language tests. This paper addresses this issue with reference to the writing component of the Occupational English Test (OET), a test designed to assess the English communication skills of overseas-trained health professionals. The paper describes the development of a set of professionally-relevant writing descriptors and then explores how well language-trained raters (N = 15) were able to apply these to a set of OET writing samples. All raters were interviewed, and the rating data were analysed statistically. The findings show that while the statistical properties of the score data were generally satisfactory, some raters felt unable to apply the scale confidently because of their perceived lack of medical knowledge. The study has implications for scale design, rater training and the use of professionally-relevant rating scales for LSP testing purposes.
KW - Assessing health professionals
KW - ESP assessment
KW - Indigenous criteria
KW - Rating scale development
KW - Rubric design
UR - http://www.scopus.com/inward/record.url?scp=85093653657&partnerID=8YFLogxK
U2 - 10.1016/j.asw.2020.100488
DO - 10.1016/j.asw.2020.100488
M3 - Article
AN - SCOPUS:85093653657
SN - 1075-2935
VL - 46
JO - Assessing Writing
JF - Assessing Writing
M1 - 100488
ER -