Multimodal learning analytics to inform learning design: lessons learned from computing education

Katerina Mangaroska, Kshitij Sharma, Dragan Gašević, Michalis Giannakos

Research output: Contribution to journal › Article › Research › peer-review

3 Citations (Scopus)

Abstract

Programming is a complex learning activity that involves the coordination of cognitive processes and affective states. These aspects are often considered individually in computing education research, yielding a limited understanding of how and when students learn best. This issue restricts researchers' ability to contextualize evidence-driven outcomes when learning behaviour deviates from pedagogical intentions. Multimodal learning analytics (MMLA) captures data essential for measuring constructs (e.g., cognitive load, confusion) that the learning sciences posit as important for learning and that cannot be measured effectively with programming process data (IDE-log data) alone. Thus, we augmented IDE-log data with physiological data (e.g., gaze data) and participants' facial expressions, collected during a debugging learning activity. The findings emphasize the need for learning analytics that are consequential for learning, rather than merely easy and convenient to collect. In that regard, our paper aims to provoke productive reflections and conversations among the community of educators about the potential of MMLA to expand and advance the synergy of learning analytics and learning design, shifting from a post-evaluation, design-aware process to a permanent process of monitoring and adaptation.

Original language: English
Pages (from-to): 79-97
Number of pages: 19
Journal: Journal of Learning Analytics
Volume: 7
Issue number: 3
DOIs
Publication status: Published - 17 Dec 2020

Keywords

  • Debugging
  • Learning design
  • Multimodal learning analytics
  • Physiological measures
  • Predictive modelling
