
EBERT: A lightweight expression-enhanced large-scale pre-trained language model for mathematics education

Keywords: Expression (computer science), Scale (ratio), Mathematics education, Computer science, Psychology, Programming language, Geography, Cartography

Authors
Zhiyi Duan, Hengnian Gu, Ke Yuan, Dongdai Zhou

Source
Journal: Knowledge-Based Systems [Elsevier BV]
Volume: 300, Article 112118. Cited by: 1

Identifier
DOI: 10.1016/j.knosys.2024.112118

Abstract
Within the realm of mathematics education, educators and researchers encounter several challenging supervised tasks, such as question difficulty prediction and mathematical expression understanding. To address these challenges, researchers have introduced unsupervised pre-trained models specifically tailored for mathematics education, yielding promising outcomes. However, the existing literature fails to consider the domain-specific characteristics of mathematics, particularly the structural features in pre-training corpora and extensive expressions, which makes these models costly and time-consuming. To tackle this problem, we propose a lightweight expression-enhanced large-scale pre-trained language model, called EBERT, for mathematics education. Specifically, we select a large number of expression-enriched exercises to further pre-train the original BERT. To capture the inherent structural features of expressions, the first step is to construct an Operator Tree for each expression. Each exercise is then transformed into a corresponding Question&Answer Tree (QAT) to serve as the model input. Notably, to preserve semantic integrity within the QAT, a specialized Expression Enhanced Matrix is devised to confine the visibility of individual tokens. Additionally, a new pre-training task, referred to as Question&Answer Matching, is introduced to capture exercise-related structural information at the semantic level. Through three downstream tasks in mathematics education, we show that EBERT outperforms several state-of-the-art baselines (such as MathBERT and GPT-3) in terms of ACC and F1-score.
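The Operator Tree idea described in the abstract can be illustrated with a minimal sketch: each expression is parsed so that operators become internal nodes and operands become leaves. This is an assumption-laden illustration, not the paper's actual procedure; the helper `to_operator_tree` and its `(operator, children)` node format are hypothetical, built on Python's standard `ast` module for simple arithmetic expressions only.

```python
import ast

def to_operator_tree(expr: str):
    """Parse an arithmetic expression into a nested (operator, children) tree.

    Illustrative only: operator labels are Python ast class names
    ('Add', 'Mult', 'Pow', ...), not the node vocabulary used by EBERT.
    """
    def walk(node):
        if isinstance(node, ast.BinOp):
            # Binary operator becomes an internal node with two children.
            op = type(node.op).__name__
            return (op, [walk(node.left), walk(node.right)])
        if isinstance(node, ast.UnaryOp):
            # Unary operator (e.g. negation) has a single child.
            return (type(node.op).__name__, [walk(node.operand)])
        if isinstance(node, ast.Name):
            # Variables are leaves.
            return node.id
        if isinstance(node, ast.Constant):
            # Numeric literals are leaves.
            return node.value
        raise ValueError(f"unsupported expression node: {ast.dump(node)}")

    return walk(ast.parse(expr, mode="eval").body)

tree = to_operator_tree("x**2 + 3*x + 2")
# Nested tuples mirror the expression's operator structure:
# ('Add', [('Add', [('Pow', ['x', 2]), ('Mult', [3, 'x'])]), 2])
```

In a pipeline like the one the abstract outlines, such a tree would then be linearized into the Question&Answer Tree input, with a visibility mask (the Expression Enhanced Matrix) restricting which tokens of the expression can attend to each other.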