Keywords: Computer science, Tracing, Distillation, Artificial intelligence, Human-computer interaction, Chromatography, Programming language, Chemistry
Authors
Yan Yu,Zheng Guan,Xue Wang,Yanyou Wei,Zhijun Yang
Identifiers
DOI: 10.1109/eebda60612.2024.10486040
Abstract
Knowledge tracing models capture a student's knowledge state from learning records and predict the student's future performance. Currently popular deep knowledge tracing models are supervised only by the correctness of a student's answer, ignoring information such as scores, question difficulty, and individual ability; using hard 0 and 1 labels is too absolute. To address this problem, we propose a knowledge tracing model (KDKT) that uses knowledge distillation to provide soft labels. The model employs an IRT model as the teacher to provide interpretable parameters, then applies Rasch measurement to compute each student's ability value on the corresponding question as a soft label. Secondly, question information is labeled with difficulty for self-attention pre-training of the embeddings, which are then combined with answer information and fed into a long short-term memory (LSTM) network for prediction. This enriches the question embeddings while ensuring that sequential information is not compromised. Extensive experiments on three public benchmark datasets show that our model outperforms other classical knowledge tracing models.
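The core idea above, replacing hard 0/1 correctness labels with teacher-derived soft labels, can be illustrated with a short sketch. The following Python (PyTorch) code is a minimal illustration under our own assumptions, not the authors' implementation: the function names (`rasch_soft_label`, `distillation_loss`), the mixing weight `alpha`, and the toy values are all hypothetical, and turning the Rasch ability/difficulty gap into a probability is one plausible reading of the abstract.

```python
# Minimal sketch (assumed, not the authors' code) of Rasch-based soft labels
# and a distillation-style training loss for knowledge tracing.
import torch
import torch.nn.functional as F

def rasch_soft_label(theta: torch.Tensor, beta: torch.Tensor) -> torch.Tensor:
    """Rasch (1PL IRT) model: P(correct) = sigmoid(ability - difficulty)."""
    return torch.sigmoid(theta - beta)

def distillation_loss(pred, hard_label, soft_label, alpha: float = 0.5):
    """Blend hard-label BCE with BCE against the teacher's soft label.
    alpha is an assumed mixing weight, not a value from the paper."""
    hard = F.binary_cross_entropy(pred, hard_label)
    soft = F.binary_cross_entropy(pred, soft_label)
    return alpha * hard + (1.0 - alpha) * soft

# Toy usage: three interactions.
theta = torch.tensor([0.8, -0.2, 1.5])   # abilities fitted by the IRT teacher
beta = torch.tensor([0.5, 0.5, 2.0])     # question difficulties
soft = rasch_soft_label(theta, beta)     # graded targets instead of 0/1
pred = torch.tensor([0.6, 0.4, 0.5])     # student-network predictions
hard = torch.tensor([1.0, 0.0, 0.0])     # observed correctness
loss = distillation_loss(pred, hard, soft)
```

The second component, question embeddings pre-trained on difficulty and then combined with answer information in an LSTM, might look roughly like the module below. Again, all layer sizes and names are assumptions, and the self-attention pre-training step is only indicated in a comment.

```python
import torch
import torch.nn as nn

class KTPredictor(nn.Module):
    """Sketch of the prediction pipeline: a question embedding (assumed to be
    pre-trained with self-attention on difficulty labels) is concatenated
    with an answer embedding and fed through an LSTM. Shapes and sizes are
    illustrative assumptions."""
    def __init__(self, num_questions: int, dim: int = 64):
        super().__init__()
        self.q_embed = nn.Embedding(num_questions, dim)  # would be loaded from pre-training
        self.a_embed = nn.Embedding(2, dim)              # 0 = wrong, 1 = right
        self.lstm = nn.LSTM(2 * dim, dim, batch_first=True)
        self.out = nn.Linear(dim, 1)

    def forward(self, q_ids, answers):
        x = torch.cat([self.q_embed(q_ids), self.a_embed(answers)], dim=-1)
        h, _ = self.lstm(x)                             # sequential information preserved
        return torch.sigmoid(self.out(h)).squeeze(-1)   # per-step P(correct)

model = KTPredictor(num_questions=100)
q = torch.randint(0, 100, (4, 20))   # 4 students, 20 interactions each
a = torch.randint(0, 2, (4, 20))
p = model(q, a)                      # (4, 20) predicted correctness probabilities
```

Blending the observed label with the Rasch probability gives the network graded targets (for example, roughly 0.38 rather than 0 on a hard question), which matches the abstract's motivation for avoiding absolute 0/1 labels.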