SecFormer: Towards Fast and Accurate Privacy-Preserving Inference for Large Language Models

Authors
Jinglong Luo,Yehong Zhang,Jiaqi Zhang,Xin Mu,Hui Wang,Yue Yu,Zenglin Xu
Source
Journal: Cornell University - arXiv
Identifier
DOI: 10.48550/arXiv.2401.00793
Abstract

With the growing use of large language models hosted on cloud platforms to offer inference services, privacy concerns are escalating, especially concerning sensitive data like investment plans and bank account details. Secure Multi-Party Computation (SMPC) emerges as a promising solution to protect the privacy of inference data and model parameters. However, the application of SMPC in Privacy-Preserving Inference (PPI) for large language models, particularly those based on the Transformer architecture, often leads to considerable slowdowns or declines in performance. This is largely due to the multitude of nonlinear operations in the Transformer architecture, which are not well-suited to SMPC and are difficult to circumvent or optimize effectively. To address this concern, we introduce an advanced optimization framework called SecFormer to achieve fast and accurate PPI for Transformer models. By implementing model design optimization, we successfully eliminate the high-cost exponential and maximum operations in PPI without sacrificing model performance. Additionally, we have developed a suite of efficient SMPC protocols that utilize segmented polynomials, Fourier series, and Goldschmidt's method to handle other complex nonlinear functions within PPI, such as GeLU, LayerNorm, and Softmax. Our extensive experiments reveal that SecFormer outperforms MPCFormer in performance, showing improvements of $5.6\%$ and $24.2\%$ for BERT$_{\text{BASE}}$ and BERT$_{\text{LARGE}}$, respectively. In terms of efficiency, SecFormer is 3.56 and 3.58 times faster than Puma for BERT$_{\text{BASE}}$ and BERT$_{\text{LARGE}}$, demonstrating its effectiveness and speed.
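The abstract cites Goldschmidt's method as one of the tools used for SMPC-unfriendly nonlinearities such as the divisions inside Softmax and LayerNorm. The appeal is that the iteration replaces division with a short sequence of multiplications and subtractions, which are cheap under secret sharing. Below is a minimal plaintext sketch of Goldschmidt division — an illustration of the general method only, not SecFormer's actual secure protocol (which operates on secret-shared fixed-point values); the function name and scaling step are our own assumptions for the demo.

```python
import math

def goldschmidt_div(a, b, iters=5):
    # Goldschmidt's method: compute a / b using only multiplication
    # and subtraction, operations that are inexpensive in SMPC
    # compared to a direct secure division.
    # Assumes b > 0. First scale b into (0.5, 1] so the iteration converges.
    scale = 2.0 ** math.ceil(math.log2(b))  # power of two >= b
    x, y = a / scale, b / scale             # now y is in (0.5, 1]
    for _ in range(iters):
        f = 2.0 - y  # correction factor; drives y toward 1
        x *= f       # x converges to a / b as y -> 1
        y *= f
    return x
```

Convergence is quadratic: the error in `y` roughly squares each round, so a handful of iterations suffices at fixed-point precision. In a real secure protocol the power-of-two scaling itself must also be computed obliviously, which is part of what the paper's protocol design addresses.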