Abstract
Existing online forum software supports only limited assessment features. This paper presents an analysis of an assessment model that has been implemented in online discussion forum software. The assessment model aims to automate the assessment of students' participation in online discussion forums. The model is formulated based on four different participation indicators and educators' feedback. The model was tested by a group of students who used the online forum to complete a project. Pearson product-moment correlations were calculated between the scores (performance indicator scores) generated by the model and the actual scores given by five educators. The performance indicator scores generated using the assessment formula were highly correlated with the actual grades assigned by the educators. The results suggest that the assessment model is reliable and can be used to evaluate students' participation in online discussion forums.
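The validation step described above compares model-generated scores with educator-assigned grades using the Pearson product-moment correlation. As an illustrative sketch (the score values below are hypothetical, not data from the study), the computation looks like this:

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length score lists."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    # Covariance numerator and the two standard-deviation terms
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical example: model-generated vs. educator-assigned scores
model_scores = [72, 85, 90, 60, 78]
educator_scores = [70, 88, 92, 58, 80]
print(round(pearson_r(model_scores, educator_scores), 3))
```

A correlation close to 1 indicates that the automated performance indicator scores track the educators' grades closely, which is the reliability criterion the paper reports.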
Original language | English |
---|---|
Pages (from-to) | 121-140 |
Number of pages | 20 |
Journal | Computer Science and Information Systems |
Volume | 8 |
Issue number | 1 |
DOIs | |
Publication status | Published - Jan 2011 |
Externally published | Yes |
Keywords
- Information systems
- Online discussion forums
- Online participation
- Performance indicator
- Student assessment