FedDAT: An Approach for Foundation Model Finetuning in Multi-Modal Heterogeneous Federated Learning

Authors
Haokun Chen, Yao Zhang, Denis Krompaß, Jindong Gu, Volker Tresp
Source
Journal: Proceedings of the ... AAAI Conference on Artificial Intelligence [Association for the Advancement of Artificial Intelligence (AAAI)]
Volume/Issue: 38 (10): 11285-11293 · Cited by: 3
Identifier
DOI: 10.1609/aaai.v38i10.29007
Abstract

Recently, foundation models have exhibited remarkable advancements in multi-modal learning. These models, equipped with millions (or billions) of parameters, typically require substantial amounts of data for finetuning. However, collecting and centralizing training data from diverse sectors is challenging due to distinct privacy regulations. Federated Learning (FL) emerges as a promising solution, enabling multiple clients to collaboratively train neural networks without centralizing their local data. To alleviate client computation burdens and communication overhead, previous works have adapted Parameter-Efficient Finetuning (PEFT) methods for FL, in which only a small fraction of the model parameters is optimized and communicated during federated rounds. Nevertheless, most previous works focus on a single modality and neglect a common phenomenon: data heterogeneity across clients. Therefore, in this work, we propose a finetuning framework tailored to heterogeneous multi-modal FL, called Federated Dual-Adapter Teacher (FedDAT). Specifically, our approach leverages a Dual-Adapter Teacher (DAT) to address data heterogeneity by regularizing the client local updates and applying Mutual Knowledge Distillation (MKD) for efficient knowledge transfer. FedDAT is the first approach enabling efficient distributed finetuning of foundation models for a variety of heterogeneous Vision-Language tasks. To demonstrate its effectiveness, we conduct extensive experiments on four multi-modal FL benchmarks with different types of data heterogeneity, where FedDAT substantially outperforms existing centralized PEFT methods adapted for FL.
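The Mutual Knowledge Distillation mentioned in the abstract is, in its common form, a symmetric distillation loss in which two models (here, two adapters) each learn from the other's softened predictions. The sketch below is a minimal, self-contained illustration of such a symmetric KL loss; the temperature value, function names, and the exact loss form are illustrative assumptions, not the paper's actual implementation.

```python
import math

def softmax(logits, temperature=1.0):
    """Convert raw logits to a probability distribution,
    softened by the distillation temperature."""
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q):
    """KL(p || q) for two discrete distributions over the same classes."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_kd_loss(logits_a, logits_b, temperature=2.0):
    """Hypothetical symmetric MKD loss: each model distills from the
    other's softened output, so knowledge flows in both directions."""
    p = softmax(logits_a, temperature)
    q = softmax(logits_b, temperature)
    return kl_divergence(p, q) + kl_divergence(q, p)

# Identical predictions incur zero loss; disagreement is penalized.
print(mutual_kd_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # ~0.0
print(mutual_kd_loss([1.0, 2.0, 3.0], [3.0, 2.0, 1.0]))  # > 0
```

In practice such a loss would be computed on mini-batch logits from the two adapter branches and added to the task loss during each client's local update; gradients then pull the two branches toward agreement.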
