Pooling
Computer science
Tracing
Encoder
Encoding
Sequence (biology)
Convolution (computer science)
Block (permutation group theory)
Encoding (memory)
Term (time)
Artificial intelligence
Theoretical computer science
Artificial neural network
Mathematics
Biology
Gene
Operating system
Geometry
Physics
Quantum mechanics
Biochemistry
Genetics
Chemistry
Authors
Lingmei Ai, Xiaoying Zhang, Ximing Hu
Identifier
DOI:10.1145/3565387.3565420
Abstract
Over the past two years, COVID-19 has driven a widespread shift to online education, and knowledge tracing has been deployed on many educational platforms. However, most existing knowledge tracing models still struggle with long-term dependencies. To address this problem, we propose Multi-head ProbSparse Self-Attention for Knowledge Tracing (MPSKT). First, a temporal convolutional network encodes the position information of the input sequence. Then, Multi-head ProbSparse Self-Attention in the encoder and decoder blocks captures the relationships within the input sequences, while convolution and pooling layers in the encoder block shorten the input sequence, which greatly reduces the time complexity of the model and better mitigates its long-term dependency problem. Finally, experimental results on three public online education datasets demonstrate the effectiveness of the proposed model.
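The abstract does not spell out how ProbSparse self-attention reduces cost, so the following is a minimal single-head NumPy sketch of the general ProbSparse idea (as introduced by the Informer architecture, which this model appears to build on): each query gets a sparsity score (max minus mean of its scaled dot products with all keys), only the top-u "active" queries attend fully, and the remaining "lazy" queries fall back to the mean of the values. All function and variable names here are illustrative, not from the paper.

```python
import numpy as np

def probsparse_attention(Q, K, V, u):
    """Single-head ProbSparse self-attention sketch.

    Q, K, V: arrays of shape (L, d). Only the u queries with the
    highest sparsity score receive full softmax attention; the rest
    are approximated by the mean of V, cutting the query-side cost
    from O(L^2) toward O(u * L).
    """
    L, d = Q.shape
    scores = Q @ K.T / np.sqrt(d)            # (L, L) scaled dot products
    # Sparsity measure: max-over-keys minus mean-over-keys per query.
    M = scores.max(axis=1) - scores.mean(axis=1)
    top = np.argsort(M)[-u:]                 # indices of the u active queries
    out = np.tile(V.mean(axis=0), (L, 1))    # lazy queries -> mean of values
    active = scores[top]                     # (u, L) rows for active queries
    w = np.exp(active - active.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)        # numerically stable softmax
    out[top] = w @ V                         # full attention for active rows
    return out
```

In the full model this would run per head inside the multi-head layer, and (in Informer-style practice) the sparsity scores are themselves estimated from a sampled subset of keys rather than the full score matrix, which this sketch omits for clarity.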