Abstract
CONTEXT
Assessment tasks are critical to student learning. It is reasonable to assume that the assessment schedule and workload are associated with a student's performance and engagement. Many students report that their "inability to manage workload during the semester" undermines their ability to engage with their subjects. It would therefore be useful to quantify the relationship between assessment types, submission timings, and their associated workload, so that assessments can be spread optimally to reduce student pressure within, and across, the units taken in a teaching period.
PURPOSE OR GOAL
In a previous research paper, we introduced a tool that quantifies how assessment loadings are typically dispersed across the semester. The tool predicts the weekly student workload arising from each assessment in advance: the workload level is computed directly from the unit guides for a given semester and then aggregated by week, either for a single unit or across all the units in a semester. However, the tool's predictions, which are based solely on the unit schedule, need to be validated against an independent measure of student workload distribution. In this paper, we validate the tool's workload predictions against student online activity recorded in the university's LMS.
APPROACH OR METHODOLOGY/METHODS
The data analysis method is based on student views and posts recorded weekly in the university's LMS across the semester. Student views are defined as the number of times a student has viewed resources on the LMS, while student posts are defined as the number of times a student has uploaded information (questions, assignments, or any discussion contribution that counts as an uploaded file) to the LMS.
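The weekly counting described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the event-log format, field names, and semester start date are all assumed for the example.

```python
from collections import Counter
from datetime import date

# Hypothetical LMS event log: (student_id, event_type, date) tuples.
# In practice these records would be exported from the university's LMS.
events = [
    ("s1", "view", date(2020, 3, 2)),
    ("s1", "post", date(2020, 3, 5)),
    ("s2", "view", date(2020, 3, 9)),
    ("s2", "view", date(2020, 3, 10)),
    ("s1", "post", date(2020, 3, 11)),
]

semester_start = date(2020, 3, 2)  # assumed Monday of teaching week 1

def teaching_week(d: date) -> int:
    """Map a calendar date to a 1-based teaching week."""
    return (d - semester_start).days // 7 + 1

# Count views and posts separately, aggregated by teaching week.
weekly_views = Counter(teaching_week(d) for _, kind, d in events if kind == "view")
weekly_posts = Counter(teaching_week(d) for _, kind, d in events if kind == "post")

print(dict(weekly_views))  # {1: 1, 2: 2}
print(dict(weekly_posts))  # {1: 1, 2: 1}
```

The resulting weekly counts can then be plotted against the tool's predicted workload per week to compare the two trendlines.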
ACTUAL OR ANTICIPATED OUTCOMES
A preliminary study was carried out for two sets of four units in a semester in which the courses were actively running through the online LMS. The data obtained using the new method, based on student views and posts, were compared with the data obtained from the quantitative tool, based on the unit guides, to confirm the tool's accuracy. The results suggest that the method can be further developed through analysis of students' interactions with the university's LMS.
CONCLUSIONS/RECOMMENDATIONS/SUMMARY
In conclusion, the results are promising: the workload trendlines from the two methods are relatively similar. However, the contribution of final exams to student workload still needs to be factored into the tool's calculations.
Original language | English |
---|---|
Title of host publication | 31st Annual Conference of the Australasian Association for Engineering Education (AAEE 2020) |
Subtitle of host publication | Disrupting Business as Usual in Engineering Education |
Place of Publication | Barton ACT Australia |
Publisher | Engineers Australia |
Pages | 241-246 |
Number of pages | 6 |
ISBN (Print) | 9781925627541 |
Publication status | Published - 6 Dec 2020 |
Event | AAEE - Annual Conference of Australasian Association for Engineering Education 2020 - University of Technology, Sydney and the University of Sydney, Sydney, Australia. Duration: 6 Dec 2020 → 9 Dec 2020. Conference number: 31st. https://search.informit.org/doi/book/10.3316/informit.9781925627541 (published proceedings), https://aaee.net.au/conferences/ |
Conference
Conference | AAEE - Annual Conference of Australasian Association for Engineering Education 2020 |
---|---|
Abbreviated title | AAEE2020 |
Country/Territory | Australia |
City | Sydney |
Period | 6/12/20 → 9/12/20 |
Internet address |
Keywords
- Student workload
- Learning Management System