FedDAT: An Approach for Foundation Model Finetuning in Multi-Modal Heterogeneous Federated Learning

Authors
Haokun Chen, Yao Zhang, Denis Krompaß, Jindong Gu, Volker Tresp
Source
Journal: Proceedings of the AAAI Conference on Artificial Intelligence (Association for the Advancement of Artificial Intelligence, AAAI)
Volume/Issue: 38 (10): 11285-11293. Cited by: 3
Identifier
DOI: 10.1609/aaai.v38i10.29007
Abstract

Recently, foundation models have exhibited remarkable advancements in multi-modal learning. These models, equipped with millions (or billions) of parameters, typically require a substantial amount of data for finetuning. However, collecting and centralizing training data from diverse sectors becomes challenging due to distinct privacy regulations. Federated Learning (FL) emerges as a promising solution, enabling multiple clients to collaboratively train neural networks without centralizing their local data. To alleviate client computation burdens and communication overheads, previous works have adapted Parameter-efficient Finetuning (PEFT) methods for FL. In this setting, only a small fraction of the model parameters is optimized and communicated during federated communication rounds. Nevertheless, most previous works have focused on a single modality and neglected a common phenomenon, i.e., the presence of data heterogeneity across clients. Therefore, in this work, we propose a finetuning framework tailored to heterogeneous multi-modal FL, called Federated Dual-Adapter Teacher (FedDAT). Specifically, our approach leverages a Dual-Adapter Teacher (DAT) to address data heterogeneity by regularizing the client local updates and applies Mutual Knowledge Distillation (MKD) for efficient knowledge transfer. FedDAT is the first approach that enables efficient distributed finetuning of foundation models for a variety of heterogeneous Vision-Language tasks. To demonstrate its effectiveness, we conduct extensive experiments on four multi-modal FL benchmarks with different types of data heterogeneity, where FedDAT substantially outperforms existing centralized PEFT methods adapted for FL.
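The Mutual Knowledge Distillation step mentioned in the abstract can be sketched as a symmetric KL-divergence objective between the softened predictions of two heads (e.g., the dual adapters), each acting as the other's teacher. This is a minimal illustrative sketch: the function names, the temperature value, and the equal 0.5/0.5 weighting are assumptions for exposition, not the paper's exact loss.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax over a list of raw logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)                      # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kl_div(p, q):
    """KL(p || q) between two discrete probability distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_kd_loss(logits_a, logits_b, temperature=2.0):
    """Mutual (bidirectional) distillation: each head's softened
    predictions teach the other; the two KL terms are averaged.
    The T^2 factor keeps gradient magnitudes comparable across
    temperatures, as is conventional in distillation losses."""
    p = softmax(logits_a, temperature)
    q = softmax(logits_b, temperature)
    return 0.5 * (kl_div(p, q) + kl_div(q, p)) * temperature ** 2
```

The loss is zero when the two heads agree exactly and grows as their predictive distributions diverge, which is what regularizes each adapter toward the other's knowledge during client-side updates.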
