EBERT: A lightweight expression-enhanced large-scale pre-trained language model for mathematics education

Keywords
Expression (computer science), Scale (ratio), Mathematics education, Computer science, Psychology, Programming language, Geography, Cartography
Authors
Zhiyi Duan, Hengnian Gu, Ke Yuan, Dongdai Zhou
Source
Journal: Knowledge-Based Systems [Elsevier]
Volume/Issue: 300: 112118-112118  Cited by: 1
Identifier
DOI:10.1016/j.knosys.2024.112118
Abstract

Within the realm of mathematics education, educators and researchers encounter several challenging supervised tasks, such as question difficulty prediction and mathematical expression understanding. To address these challenges, researchers have introduced unsupervised pre-trained models specifically tailored for mathematics education, yielding promising outcomes. However, the existing literature fails to consider the domain-specific characteristics of mathematics, particularly the structural features in pre-trained corpora and extensive expressions, which makes these models costly and time-consuming. To tackle this problem, we propose a lightweight expression-enhanced large-scale pre-trained language model, called EBERT, for mathematics education. Specifically, we select a large number of expression-enriched exercises to further pre-train the original BERT. To capture the structural features inherent in expressions, the first step is to build an Operator Tree for each expression. Each exercise is then transformed into a corresponding Question&Answer tree (QAT) to serve as the model input. Notably, to preserve semantic integrity within the QAT, a specialized Expression Enhanced Matrix is devised to confine the visibility of individual tokens. Additionally, a new pre-training task, referred to as Question&Answer Matching, is introduced to capture exercise-related structural information at the semantic level. Through three downstream tasks in mathematics education, we show that EBERT outperforms several state-of-the-art baselines (such as MathBERT and GPT-3) in terms of ACC and F1-score.
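The abstract describes its key mechanisms (the Operator Tree, the QAT input, and the Expression Enhanced Matrix that restricts token visibility) only at a high level. The Python sketch below illustrates one plausible reading of two of those ideas under stated assumptions: it is not the authors' implementation, and the parsing rules, the masking policy, and all function names (`operator_tree`, `linearize`, `visibility_mask`) are hypothetical, introduced purely for illustration.

```python
# Minimal sketch (not the paper's code): turn an arithmetic expression into an
# operator tree and derive a token-visibility mask in the spirit of the
# Expression Enhanced Matrix. The masking rule used here (expression tokens
# attend only within their own expression; plain-text tokens attend globally)
# is an assumption, not the paper's exact formulation.
import ast
import numpy as np

def operator_tree(expr: str):
    """Parse an arithmetic expression into nested (label, children) tuples."""
    node = ast.parse(expr, mode="eval").body

    def walk(n):
        if isinstance(n, ast.BinOp):
            op = {ast.Add: "+", ast.Sub: "-", ast.Mult: "*",
                  ast.Div: "/", ast.Pow: "^"}[type(n.op)]
            return (op, [walk(n.left), walk(n.right)])
        if isinstance(n, ast.Constant):
            return (str(n.value), [])
        if isinstance(n, ast.Name):
            return (n.id, [])
        raise ValueError(f"unsupported node: {ast.dump(n)}")

    return walk(node)

def linearize(tree):
    """Pre-order traversal: operator tree -> token sequence."""
    label, children = tree
    tokens = [label]
    for child in children:
        tokens += linearize(child)
    return tokens

def visibility_mask(segments):
    """Build an n x n 0/1 visibility matrix.

    segments[i] is None for a plain-text token, or an expression id for an
    expression token. Expression tokens see only tokens of the same
    expression (assumed rule); plain-text tokens see everything.
    """
    n = len(segments)
    mask = np.ones((n, n), dtype=int)
    for i, si in enumerate(segments):
        for j, sj in enumerate(segments):
            if si is not None and (sj is None or sj != si):
                mask[i, j] = 0  # expression token cannot see outside its expression
    return mask

if __name__ == "__main__":
    tree = operator_tree("(x + 3) * x")
    expr_tokens = linearize(tree)              # ['*', '+', 'x', '3', 'x']
    text_tokens = ["solve", "for", "x"]
    segments = [None] * len(text_tokens) + [0] * len(expr_tokens)
    print(text_tokens + expr_tokens)
    print(visibility_mask(segments))
```

In a BERT-style encoder, a 0/1 mask of this kind would typically be converted into a large negative additive bias on the self-attention logits, so that masked positions receive effectively zero attention weight after the softmax.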