FedDAT: An Approach for Foundation Model Finetuning in Multi-Modal Heterogeneous Federated Learning

Authors
Haokun Chen,Yao Zhang,Denis Krompaß,Jindong Gu,Volker Tresp
Source
Journal: Proceedings of the ... AAAI Conference on Artificial Intelligence [Association for the Advancement of Artificial Intelligence (AAAI)]
Volume/Issue: 38 (10): 11285-11293 | Citations: 3
Identifier
DOI: 10.1609/aaai.v38i10.29007
Abstract

Recently, foundation models have exhibited remarkable advancements in multi-modal learning. These models, equipped with millions (or billions) of parameters, typically require a substantial amount of data for finetuning. However, collecting and centralizing training data from diverse sectors becomes challenging due to distinct privacy regulations. Federated Learning (FL) emerges as a promising solution, enabling multiple clients to collaboratively train neural networks without centralizing their local data. To alleviate client computation burdens and communication overheads, previous works have adapted Parameter-efficient Finetuning (PEFT) methods for FL. In this setting, only a small fraction of the model parameters are optimized and communicated during federated communications. Nevertheless, most previous works have focused on a single modality and neglected one common phenomenon, i.e., the presence of data heterogeneity across the clients. Therefore, in this work, we propose a finetuning framework tailored to heterogeneous multi-modal FL, called Federated Dual-Adapter Teacher (FedDAT). Specifically, our approach leverages a Dual-Adapter Teacher (DAT) to address data heterogeneity by regularizing the client local updates and applying Mutual Knowledge Distillation (MKD) for an efficient knowledge transfer. FedDAT is the first approach that enables an efficient distributed finetuning of foundation models for a variety of heterogeneous Vision-Language tasks. To demonstrate its effectiveness, we conduct extensive experiments on four multi-modality FL benchmarks with different types of data heterogeneity, where FedDAT substantially outperforms the existing centralized PEFT methods adapted for FL.
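The core idea the abstract describes — clients optimize and communicate only small adapter modules while the foundation-model backbone stays frozen — can be illustrated with a minimal sketch. This is not the FedDAT algorithm itself (which additionally uses a Dual-Adapter Teacher and Mutual Knowledge Distillation); it only shows the underlying PEFT-in-FL pattern of FedAvg applied to adapter parameters, with toy scalar weights and hypothetical function names.

```python
# Minimal sketch of parameter-efficient federated finetuning:
# each client takes a local gradient step on its small adapter only,
# and the server averages the adapter parameters (FedAvg).
# The frozen backbone is omitted; names and values are illustrative.

def local_update(adapter, grad, lr=0.1):
    """One toy gradient step on the client's adapter parameters."""
    return [w - lr * g for w, g in zip(adapter, grad)]

def server_aggregate(client_adapters):
    """FedAvg: element-wise mean of the clients' adapter weights."""
    n = len(client_adapters)
    return [sum(ws) / n for ws in zip(*client_adapters)]

# Two clients with heterogeneous local gradients (data heterogeneity).
global_adapter = [0.0, 0.0]
client_grads = [[1.0, -1.0], [3.0, 1.0]]

for _ in range(5):  # communication rounds
    local_adapters = [local_update(list(global_adapter), g)
                      for g in client_grads]
    global_adapter = server_aggregate(local_adapters)

print(global_adapter)  # → [-1.0, 0.0]
```

Because only the adapter vectors cross the network, communication cost scales with the adapter size rather than the full model size; FedDAT's contribution is in how the local updates are regularized under client data heterogeneity, not in this aggregation loop.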
