Evaluating the Zero-shot Robustness of Instruction-tuned Language Models

Authors
Jiuding Sun, Chantal Shaib, Byron Wallace
Source
Journal: Cornell University - arXiv · Cited by: 2
Identifier
DOI: 10.48550/arxiv.2306.11270
Abstract

Instruction fine-tuning has recently emerged as a promising approach for improving the zero-shot capabilities of Large Language Models (LLMs) on new tasks. This technique has shown particular strength in improving the performance of modestly sized LLMs, sometimes inducing performance competitive with much larger model variants. In this paper we ask two questions: (1) How sensitive are instruction-tuned models to the particular phrasings of instructions, and (2) How can we make them more robust to such natural language variation? To answer the former, we collect a set of 319 instructions manually written by NLP practitioners for over 80 unique tasks included in widely used benchmarks, and we evaluate the variance and average performance of these instructions as compared to instruction phrasings observed during instruction fine-tuning. We find that using novel (unobserved) but appropriate instruction phrasings consistently degrades model performance, sometimes substantially so. Further, such natural instructions yield a wide variance in downstream performance, despite their semantic equivalence. Put another way, instruction-tuned models are not especially robust to instruction re-phrasings. We propose a simple method to mitigate this issue by introducing "soft prompt" embedding parameters and optimizing these to maximize the similarity between representations of semantically equivalent instructions. We show that this method consistently improves the robustness of instruction-tuned models.
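The abstract's proposed fix optimizes soft prompt parameters so that representations of semantically equivalent instruction phrasings become more similar. As a minimal, stdlib-only sketch of what such an alignment objective might look like (function names and toy vectors are illustrative assumptions, not the paper's actual implementation, which operates on model hidden states with gradient updates):

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def instruction_alignment_loss(reps):
    """Average pairwise (1 - cosine similarity) over representations
    of semantically equivalent instruction phrasings; minimizing this
    pushes the representations together."""
    total, pairs = 0.0, 0
    for i in range(len(reps)):
        for j in range(i + 1, len(reps)):
            total += 1.0 - cosine_similarity(reps[i], reps[j])
            pairs += 1
    return total / pairs

# Toy representations of three paraphrases of the same instruction:
# a lower loss means the phrasings are encoded more consistently.
reps = [
    [0.90, 0.10, 0.30],
    [0.80, 0.20, 0.25],
    [0.85, 0.15, 0.35],
]
loss = instruction_alignment_loss(reps)
```

In the paper's setting, this kind of loss would be driven through the soft prompt embeddings by gradient descent rather than computed on fixed vectors as above.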
