LOGIC: LLM-originated guidance for internal cognitive improvement of small language models in stance detection

Keywords: Computer science · Inference · Task (project management) · Language model · Artificial intelligence · Cognition · Process (computing) · Machine learning · Natural language processing · Psychology · Engineering · Programming language · Systems engineering · Neuroscience
Authors
Woojin Lee,J.-J. Lee,Harksoo Kim
Source
Journal: PeerJ Computer Science [PeerJ, Inc.]
Volume/Article no.: 10: e2585
Identifier
DOI:10.7717/peerj-cs.2585
Abstract

Stance detection is a critical task in natural language processing that determines an author’s viewpoint toward a specific target, playing a pivotal role in social science research and various applications. Traditional approaches incorporating Wikipedia-sourced data into small language models (SLMs) to compensate for limited target knowledge often suffer from inconsistencies in article quality and length due to the diverse pool of Wikipedia contributors. To address these limitations, we utilize large language models (LLMs) pretrained on expansive datasets to generate accurate and contextually relevant target knowledge. By providing concise, real-world insights tailored to the stance detection task, this approach surpasses the limitations of Wikipedia-based information. Despite their superior reasoning capabilities, LLMs are computationally intensive and challenging to deploy on smaller devices. To mitigate these drawbacks, we introduce a reasoning distillation methodology that transfers the reasoning capabilities of LLMs to more compact SLMs, enhancing their efficiency while maintaining robust performance. Our stance detection model, LOGIC (LLM-Originated Guidance for Internal Cognitive improvement of small language models in stance detection), is built on Bidirectional and Auto-Regressive Transformer (BART) and fine-tuned with auxiliary learning tasks, including reasoning distillation. By incorporating LLM-generated target knowledge into the inference process, LOGIC achieves state-of-the-art performance on the VAried Stance Topics (VAST) dataset, outperforming advanced models like GPT-3.5 Turbo and GPT-4 Turbo in stance detection tasks.
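The abstract describes reasoning distillation as fine-tuning a BART-based SLM so that it reproduces LLM-generated target knowledge and reasoning alongside the stance label. A minimal sketch of how such a training example might be assembled is shown below; the function name, prompt layout, and field names are illustrative assumptions, not the paper's actual code.

```python
def build_distillation_example(document, target, llm_knowledge, llm_rationale, stance):
    """Pack LLM-generated target knowledge into the SLM's input sequence,
    and the LLM's rationale plus the gold stance into the output sequence,
    so a seq2seq model (e.g., BART) learns to reason before labeling."""
    source = (
        f"Target: {target}\n"
        f"Knowledge: {llm_knowledge}\n"   # concise, LLM-generated background
        f"Document: {document}"
    )
    # Auxiliary objective: generate the rationale first, then the stance.
    label = f"Reasoning: {llm_rationale}\nStance: {stance}"
    return {"source": source, "target": label}

example = build_distillation_example(
    document="Wind farms ruin the landscape and harm birds.",
    target="renewable energy",
    llm_knowledge="Renewable energy includes wind, solar, and hydro power.",
    llm_rationale="The author criticizes wind farms, a form of renewable energy.",
    stance="con",
)
```

Pairs of this form could then be fed to any standard seq2seq fine-tuning loop; the key design choice is that the compact model is supervised on the LLM's intermediate reasoning, not only on the final stance label.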