EBERT: A lightweight expression-enhanced large-scale pre-trained language model for mathematics education

Tags: expression (computer science) · scale (ratio) · mathematics education · computer science · psychology · programming languages · geography · cartography
Authors
Zhiyi Duan, Hengnian Gu, Ke Yuan, Dongdai Zhou
Source
Journal: Knowledge-Based Systems [Elsevier]
Volume: 300, Article 112118
Identifier
DOI: 10.1016/j.knosys.2024.112118
Abstract

Within the realm of mathematics education, educators and researchers face several challenging supervised tasks, such as question difficulty prediction and mathematical expression understanding. To address these challenges, researchers have introduced unsupervised pre-trained models specifically tailored for mathematics education, yielding promising outcomes. However, the existing literature fails to consider the domain-specific characteristics of mathematics, particularly the structural features of the pre-training corpora and their extensive expressions, which makes these models costly and time-consuming. To tackle this problem, we propose a lightweight expression-enhanced large-scale pre-trained language model, called EBERT, for mathematics education. Specifically, we select a large number of expression-enriched exercises to further pre-train the original BERT. To capture the structural features inherent in expressions, we first construct an Operator Tree for each expression. Each exercise is then transformed into a corresponding Question&Answer tree (QAT) to serve as the model input. Notably, to preserve semantic integrity within the QAT, a specialized Expression Enhanced Matrix is devised to confine the visibility of individual tokens. Additionally, a new pre-training task, referred to as Question&Answer Matching, is introduced to capture exercise-related structural information at the semantic level. Through three downstream tasks in mathematics education, we demonstrate that EBERT outperforms several state-of-the-art baselines (such as MathBERT and GPT-3) in terms of accuracy (ACC) and F1-score.
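The abstract names the Operator Tree, the QAT linearization, and the Expression Enhanced Matrix without defining them, so the following Python sketch is an illustrative guess at the general idea rather than the authors' method: it parses a toy expression into an operator tree, linearizes it depth-first, and builds a 0/1 token-visibility mask in which each expression token can only attend to its ancestors and descendants in the tree. The function names, the use of Python's ast module as a stand-in expression grammar, and the ancestor/descendant visibility rule are all assumptions made for this sketch.

```python
import ast

def operator_tree(expression: str):
    """Parse an infix expression into a nested (label, children) tuple.

    Python's own `ast` parser serves as a stand-in grammar here; the
    paper's Operator Tree construction may differ.
    """
    op_names = {ast.Add: "+", ast.Sub: "-", ast.Mult: "*",
                ast.Div: "/", ast.Pow: "^"}

    def walk(node):
        if isinstance(node, ast.BinOp):
            return (op_names[type(node.op)], [walk(node.left), walk(node.right)])
        if isinstance(node, ast.Name):
            return (node.id, [])
        if isinstance(node, ast.Constant):
            return (str(node.value), [])
        raise ValueError(f"unsupported node: {node!r}")

    return walk(ast.parse(expression, mode="eval").body)

def linearize(tree, tokens=None, parent=None, edges=None):
    """Depth-first linearization: returns the token list and, for each
    token, the index of its parent in the tree (None for the root)."""
    if tokens is None:
        tokens, edges = [], []
    label, children = tree
    idx = len(tokens)
    tokens.append(label)
    edges.append(parent)
    for child in children:
        linearize(child, tokens, idx, edges)
    return tokens, edges

def visibility_matrix(tokens, edges):
    """Each token sees only its ancestors and descendants along the tree
    (its own root-to-leaf path); sibling subtrees are masked out."""
    n = len(tokens)

    def ancestors(i):
        path = set()
        while i is not None:
            path.add(i)
            i = edges[i]
        return path

    anc = [ancestors(i) for i in range(n)]
    # visible[i][j] = 1 iff j is an ancestor or a descendant of i
    return [[1 if (j in anc[i] or i in anc[j]) else 0 for j in range(n)]
            for i in range(n)]

if __name__ == "__main__":
    tree = operator_tree("(a + b) * c")
    tokens, edges = linearize(tree)
    print(tokens)  # ['*', '+', 'a', 'b', 'c']
    for row in visibility_matrix(tokens, edges):
        print(row)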