TY - JOUR
T1 - HiTSKT: A hierarchical transformer model for session-aware knowledge tracing
AU - Ke, Fucai
AU - Wang, Weiqing
AU - Tan, Weicong
AU - Du, Lan
AU - Jin, Yuan
AU - Huang, Yujin
AU - Yin, Hongzhi
N1 - Publisher Copyright:
© 2023 The Author(s)
PY - 2024/1/25
Y1 - 2024/1/25
N2 - Knowledge tracing (KT) aims to leverage students’ learning histories to estimate their mastery of a set of pre-defined skills, based on which their future performance can be accurately predicted. In practice, a student’s learning history comprises answers to sets of massed questions, each set known as a session, rather than merely a sequence of independent answers. Students’ learning dynamics can differ markedly within and across these sessions, so effectively modelling the dynamics of students’ knowledge states at both levels is crucial for handling the KT problem. Most existing KT models treat a student’s learning records as a single continuous sequence, failing to capture the session-wise shifts in the student’s knowledge state. To address this issue, we propose a novel hierarchical transformer model, named HiTSKT, which comprises an interaction(-level) encoder that captures the knowledge a student acquires within a session and a session(-level) encoder that summarizes the knowledge acquired across past sessions. To predict an interaction in the current session, a knowledge retriever integrates the summarized past-session knowledge with information from the previous interactions into proper knowledge representations, which are then used to compute the student’s current knowledge state. Additionally, to model a student’s long-term forgetting behaviour across sessions, a power-law-decay attention mechanism is designed and deployed in the session encoder, allowing it to place greater emphasis on recent sessions. Extensive experiments on four public datasets demonstrate that HiTSKT achieves new state-of-the-art performance on all four datasets, outperforming seven strong existing KT models.
AB - Knowledge tracing (KT) aims to leverage students’ learning histories to estimate their mastery of a set of pre-defined skills, based on which their future performance can be accurately predicted. In practice, a student’s learning history comprises answers to sets of massed questions, each set known as a session, rather than merely a sequence of independent answers. Students’ learning dynamics can differ markedly within and across these sessions, so effectively modelling the dynamics of students’ knowledge states at both levels is crucial for handling the KT problem. Most existing KT models treat a student’s learning records as a single continuous sequence, failing to capture the session-wise shifts in the student’s knowledge state. To address this issue, we propose a novel hierarchical transformer model, named HiTSKT, which comprises an interaction(-level) encoder that captures the knowledge a student acquires within a session and a session(-level) encoder that summarizes the knowledge acquired across past sessions. To predict an interaction in the current session, a knowledge retriever integrates the summarized past-session knowledge with information from the previous interactions into proper knowledge representations, which are then used to compute the student’s current knowledge state. Additionally, to model a student’s long-term forgetting behaviour across sessions, a power-law-decay attention mechanism is designed and deployed in the session encoder, allowing it to place greater emphasis on recent sessions. Extensive experiments on four public datasets demonstrate that HiTSKT achieves new state-of-the-art performance on all four datasets, outperforming seven strong existing KT models.
KW - Educational data mining
KW - Hierarchical transformer
KW - Knowledge tracing
KW - Learner modelling
KW - User behaviour modelling
UR - http://www.scopus.com/inward/record.url?scp=85180536062&partnerID=8YFLogxK
U2 - 10.1016/j.knosys.2023.111300
DO - 10.1016/j.knosys.2023.111300
M3 - Article
AN - SCOPUS:85180536062
SN - 0950-7051
VL - 284
JO - Knowledge-Based Systems
JF - Knowledge-Based Systems
M1 - 111300
ER -