
CIMFormer: A Systolic CIM-Array-Based Transformer Accelerator With Token-Pruning-Aware Attention Reformulating and Principal Possibility Gathering

Authors
Ruiqi Guo, X.L. Chen, Lei Wang, Yang Wang, Hao Sun, Jingchuan Wei, Huiming Han, Leibo Liu, Shaojun Wei, Yang Hu, Shouyi Yin
Source
Journal: IEEE Journal of Solid-State Circuits [Institute of Electrical and Electronics Engineers]
Pages: 1-13
Identifier
DOI:10.1109/jssc.2024.3402174
Abstract

Transformer models have achieved impressive performance in various artificial intelligence (AI) applications. However, their high computational cost and memory footprint make inference inefficient. Although digital compute-in-memory (CIM) is a promising hardware architecture with high accuracy, the Transformer's attention mechanism raises three challenges in CIM access and computation: 1) the attention computation involving Query and Key results in massive data movement and under-utilization of CIM macros; 2) the attention computation involving Possibility and Value exhibits abundant dynamic bit-level sparsity, resulting in redundant bit-serial CIM operations; and 3) the restricted data-reload bandwidth of CIM macros significantly degrades performance on large Transformer models. To address these challenges, we design a CIM accelerator called CIM Transformer (CIMFormer) with three corresponding features. First, token-pruning-aware attention reformulation (TPAR) adjusts attention computations according to the token-pruning ratio; this reformulation reduces real-time access to, and under-utilization of, CIM macros. Second, the principal possibility gather-scatter scheduler (PPGSS) gathers the possibilities with greater effective bit-width as concurrent inputs to CIM macros, enhancing the efficiency of bit-serial CIM operations. Third, the systolic X $\mid$ W-CIM macro array efficiently handles the execution of large Transformer models that exceed the storage capacity of the on-chip CIM macros. Fabricated in a 28-nm technology, CIMFormer achieves a peak energy efficiency of 15.71 TOPS/W, an over 1.46$\times$ improvement over the state-of-the-art Transformer accelerator under equivalent conditions.
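
To make the PPGSS idea concrete, the following is a minimal software sketch, not taken from the paper: the lane count, the value distribution, and the cycle-count model are illustrative assumptions. It models a bit-serial CIM macro whose latency per group of concurrent inputs is set by the widest input in that group, so gathering possibilities with similar (large) effective bit-width into the same group lowers the total number of bit-serial cycles.

```python
import numpy as np

def effective_bitwidth(x: np.ndarray) -> np.ndarray:
    """Significant bits per unsigned value (0 for zero entries)."""
    x = x.astype(np.int64)
    return np.where(x > 0, np.floor(np.log2(np.maximum(x, 1))).astype(np.int64) + 1, 0)

def bit_serial_cycles(values: np.ndarray, lanes: int) -> int:
    """Cycles for a bit-serial macro taking `lanes` inputs concurrently:
    each group of concurrent inputs runs for as many bit-serial cycles
    as its widest member."""
    total = 0
    for i in range(0, len(values), lanes):
        group = values[i:i + lanes]
        total += int(effective_bitwidth(group).max())
    return total

rng = np.random.default_rng(0)
# Toy 8-bit "possibility" vector: a few dominant (principal) entries and
# many near-zero ones, i.e. strong dynamic bit-level sparsity.
poss = rng.zipf(2.0, size=256).clip(max=255).astype(np.uint8)

LANES = 16  # assumed number of concurrent CIM input lanes
cycles_arrival = bit_serial_cycles(poss, LANES)                   # arrival order
cycles_gathered = bit_serial_cycles(np.sort(poss)[::-1], LANES)   # gathered by magnitude
print(cycles_arrival, cycles_gathered)
```

In this toy model the gathered schedule needs noticeably fewer cycles than arrival order, which is the effect PPGSS exploits; the real scheduler must also scatter the partial results back to their original token positions, which the sketch omits.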
