Agent Attention: On the Integration of Softmax and Linear Attention

Authors
Dongchen Han,Tianzhu Ye,Yizeng Han,Zhuofan Xia,Shiji Song,Gao Huang
Source
Journal: Cornell University - arXiv · Cited by: 4
Identifier
DOI: 10.48550/arXiv.2312.08874
Abstract

The attention module is the key component in Transformers. While the global attention mechanism offers high expressiveness, its excessive computational cost restricts its applicability in various scenarios. In this paper, we propose a novel attention paradigm, Agent Attention, to strike a favorable balance between computational efficiency and representation power. Specifically, the Agent Attention, denoted as a quadruple $(Q, A, K, V)$, introduces an additional set of agent tokens $A$ into the conventional attention module. The agent tokens first act as the agent for the query tokens $Q$ to aggregate information from $K$ and $V$, and then broadcast the information back to $Q$. Given that the number of agent tokens can be designed to be much smaller than the number of query tokens, the agent attention is significantly more efficient than the widely adopted Softmax attention, while preserving global context modelling capability. Interestingly, we show that the proposed agent attention is equivalent to a generalized form of linear attention. Therefore, agent attention seamlessly integrates the powerful Softmax attention and the highly efficient linear attention. Extensive experiments demonstrate the effectiveness of agent attention with various vision Transformers and across diverse vision tasks, including image classification, object detection, semantic segmentation and image generation. Notably, agent attention has shown remarkable performance in high-resolution scenarios, owing to its linear attention nature. For instance, when applied to Stable Diffusion, our agent attention accelerates generation and substantially enhances image generation quality without any additional training. Code is available at https://github.com/LeapLabTHU/Agent-Attention.
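The aggregate-then-broadcast scheme described in the abstract can be illustrated in a few lines. Below is a minimal, single-head sketch based solely on the abstract's description: `agent_attention` is a hypothetical helper name, how the agent tokens are produced is left open, and the refinements in the official repository (multi-head handling, agent-token derivation, additional modules) are omitted.

```python
import torch
import torch.nn.functional as F

def agent_attention(q, k, v, agent):
    """Two-step agent attention sketch (single head).

    q, k, v : (B, N, d) query/key/value tokens
    agent   : (B, n, d) agent tokens, with n << N
    Cost is O(N * n * d) rather than the O(N^2 * d) of full Softmax attention.
    """
    d = q.shape[-1]
    scale = d ** -0.5
    # Step 1 -- agent aggregation: agent tokens act as queries and gather
    # global context from K and V via ordinary Softmax attention.
    agent_ctx = F.softmax(agent @ k.transpose(-2, -1) * scale, dim=-1) @ v      # (B, n, d)
    # Step 2 -- agent broadcast: the original queries attend to the small set
    # of agent tokens, which redistribute the pooled context back to Q.
    out = F.softmax(q @ agent.transpose(-2, -1) * scale, dim=-1) @ agent_ctx    # (B, N, d)
    return out

# Example with assumed sizes: 196 image tokens, 49 agent tokens, dim 64.
B, N, n, d = 2, 196, 49, 64
q, k, v = (torch.randn(B, N, d) for _ in range(3))
agent = torch.randn(B, n, d)
print(agent_attention(q, k, v, agent).shape)  # torch.Size([2, 196, 64])
```

Composing the two Softmax attentions through $n$ agent tokens factorizes the $N \times N$ attention map through a rank-$n$ bottleneck, which is why the mechanism behaves like a generalized linear attention and why its cost scales linearly in $N$.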