Do-it-yourself e-exams

Mathew Hillier, Scott Grant

Research output: Contribution to conference › Paper › peer-review

1 Citation (Scopus)


This paper focuses on a small case study in which we developed and tested a set of spreadsheets as a 'do-it-yourself' e-examination delivery and marking environment. A trial was conducted in a first-year university-level class during 2017 at Monash University, Australia. The approach enabled automatic marking for selected-response questions and semi-automatic marking for short text responses. The system did not require a network or servers to operate, thereby minimising reliance on complex infrastructure. We paid particular attention to the integrity of the assessment process by ensuring separation of the answer key from the response composition environment. Students undertook a practice session followed by an invigilated exam. Students' perceptions of the process were collected using pre-post surveys (n = 16) comprising qualitative comments and Likert items. The data revealed that students were satisfied with the process (4 or above on 5-point scales). Comments revealed that their experience was influenced in part by their level of computer literacy with respect to enabling skills in the subject domain. Overall, the approach was found to be successful, with all students completing the e-exam and administrative efficiencies realised in terms of marking time saved.
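The marking scheme described in the abstract, fully automatic for selected-response questions and semi-automatic for short text answers, can be sketched as follows. This is a minimal illustration of that division of labour, not the authors' actual spreadsheet formulas; the question identifiers, answer key layout, and function names are assumptions for the example.

```python
# Hypothetical answer key, kept separate from the student response
# environment (the abstract notes this separation for assessment integrity).
ANSWER_KEY = {
    "Q1": {"type": "selected", "answer": "B"},
    "Q2": {"type": "selected", "answer": "D"},
    "Q3": {"type": "short_text", "answer": "photosynthesis"},
}

def mark_responses(responses):
    """Return (score, list of items queued for manual review)."""
    score = 0
    review_queue = []
    for qid, key in ANSWER_KEY.items():
        given = responses.get(qid, "").strip()
        if key["type"] == "selected":
            # Fully automatic: exact match on the chosen option.
            if given.upper() == key["answer"].upper():
                score += 1
        else:
            # Semi-automatic: auto-accept exact matches, otherwise
            # flag the response for a human marker to decide.
            if given.lower() == key["answer"].lower():
                score += 1
            else:
                review_queue.append((qid, given))
    return score, review_queue

score, queue = mark_responses({"Q1": "b", "Q2": "A", "Q3": "Photosynthesis"})
# score is 2 (Q1 and Q3 correct); queue is empty since Q3 matched exactly.
```

In a spreadsheet this would correspond to lookup-and-compare formulas against a protected answer-key sheet, with non-matching short-text cells highlighted for the marker rather than scored outright.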

Original language: English
Number of pages: 10
Publication status: Published - 2018
Event: Annual Conference of the Australasian Society for Computers in Learning in Tertiary Education 2018 - Deakin University, Geelong, Australia
Duration: 25 Nov 2018 - 28 Nov 2018
Conference number: 35th (Proceedings)


Conference: Annual Conference of the Australasian Society for Computers in Learning in Tertiary Education 2018
Abbreviated title: ASCILITE 2018


Keywords:
  • Computerised assessment system
  • E-exam
  • Spreadsheets
