Multi-layer features ablation of BERT model and its application in stock trend prediction

Keywords: Computer science; Sliding window; Encoder; Transformer; Artificial intelligence; Stock; Language model; Machine learning; Text classification
Authors
Feng Zhao,Xinning Li,Yating Gao,Ying Li,Zhiquan Feng,Caiming Zhang
Source
Journal: Expert Systems With Applications [Elsevier BV]
Volume 207, Article 117958
Identifier
DOI:10.1016/j.eswa.2022.117958
Abstract

Stock comments published by experts are important references for accurate stock trend prediction. Comprehensively and accurately capturing the topics of expert stock comments is therefore an important text-classification problem. The Bidirectional Encoder Representations from Transformers (BERT) pretrained language model is widely used for text classification due to its high accuracy. However, BERT has two limitations. First, it only accepts fixed-length input, leading to suboptimal performance on long texts. Second, it relies only on the features extracted from the last layer, so the classification features are not comprehensive. To tackle these issues, we propose a multi-layer features ablation study of the BERT model for accurate identification of the themes of stock comments. Specifically, we first divide the original text with a sliding window so that each segment meets the length requirement of the BERT model. This enlarges the sample size, which helps reduce over-fitting. At the same time, by dividing a long text into multiple short texts, all of the long text's information can be captured comprehensively by synthesizing the topic information of the short texts. In addition, we extract the output features of each layer of the BERT model and apply an ablation strategy to select the most effective of these features. Experimental results demonstrate that, compared with non-intercepted comments, topic recognition accuracy is improved by intercepting stock comments with the sliding window, which shows that intercepting text improves text-classification performance. Compared with BERT, the multi-layer features ablation study presented in this paper further improves topic recognition of stock comments and can serve as a reference for investors.
By recognizing the topics of stock comments, our approach achieves better performance and practicability in stock trend prediction.
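The sliding-window splitting described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name is hypothetical, and the default window of 510 tokens (BERT's 512-token limit minus the [CLS] and [SEP] markers) with a half-window stride are assumed settings.

```python
def sliding_window_split(tokens, window_size=510, stride=255):
    """Split a long token sequence into overlapping fixed-length windows.

    Each window fits BERT's input limit; overlapping strides keep topic
    information that would otherwise be cut at window boundaries, and the
    multiple short texts per comment also enlarge the training sample size.
    """
    if len(tokens) <= window_size:
        return [tokens]
    windows = []
    start = 0
    while start < len(tokens):
        windows.append(tokens[start:start + window_size])
        if start + window_size >= len(tokens):
            break  # the final window reaches the end of the text
        start += stride
    return windows
```

A comment's overall topic can then be predicted by classifying each window and synthesizing the window-level results, as the abstract describes.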
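The multi-layer feature ablation can likewise be sketched in a few lines. Assumptions: the per-layer [CLS] vectors would in practice come from a BERT forward pass with hidden states exposed (e.g. `output_hidden_states=True` in the Hugging Face `transformers` API); the function names, the retained layer indices, and mean-pooling over windows are all illustrative choices, not the paper's exact method.

```python
def ablate_layer_features(layer_cls_vectors, keep_layers):
    """Concatenate the [CLS] vectors of the layers retained by the ablation.

    layer_cls_vectors: one feature vector (list of floats) per BERT layer.
    keep_layers: indices of the layers the ablation study found effective.
    """
    combined = []
    for i in keep_layers:
        combined.extend(layer_cls_vectors[i])
    return combined

def pool_windows(window_features):
    """Element-wise mean over the feature vectors of a comment's windows,
    merging the short-text segments back into one comment-level feature."""
    n = len(window_features)
    dim = len(window_features[0])
    return [sum(w[d] for w in window_features) / n for d in range(dim)]
```

The pooled, layer-ablated vector would then feed a standard classification head to predict the comment's topic.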