FedDAT: An Approach for Foundation Model Finetuning in Multi-Modal Heterogeneous Federated Learning

Authors
Haokun Chen, Yao Zhang, Denis Krompaß, Jindong Gu, Volker Tresp
Venue
Journal: Proceedings of the AAAI Conference on Artificial Intelligence (Association for the Advancement of Artificial Intelligence, AAAI)
Volume/Issue: 38 (10): 11285-11293  Cited by: 3
Identifier
DOI: 10.1609/aaai.v38i10.29007
Abstract

Recently, foundation models have exhibited remarkable advancements in multi-modal learning. These models, equipped with millions (or billions) of parameters, typically require a substantial amount of data for finetuning. However, collecting and centralizing training data from diverse sectors is challenging due to distinct privacy regulations. Federated Learning (FL) emerges as a promising solution, enabling multiple clients to collaboratively train neural networks without centralizing their local data. To alleviate client computation burdens and communication overhead, previous works have adapted Parameter-Efficient Finetuning (PEFT) methods to FL, so that only a small fraction of the model parameters is optimized and communicated during federated rounds. Nevertheless, most prior work focuses on a single modality and neglects a common phenomenon: data heterogeneity across clients. In this work, we therefore propose a finetuning framework tailored to heterogeneous multi-modal FL, called Federated Dual-Adapter Teacher (FedDAT). Specifically, our approach leverages a Dual-Adapter Teacher (DAT) to address data heterogeneity by regularizing the client local updates and applying Mutual Knowledge Distillation (MKD) for efficient knowledge transfer. FedDAT is the first approach that enables efficient distributed finetuning of foundation models for a variety of heterogeneous Vision-Language tasks. To demonstrate its effectiveness, we conduct extensive experiments on four multi-modal FL benchmarks with different types of data heterogeneity, where FedDAT substantially outperforms existing centralized PEFT methods adapted to FL.
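The abstract names Mutual Knowledge Distillation (MKD) as the mechanism for transferring knowledge between the two adapter branches. The paper's exact loss, adapter placement, and hyperparameters are not reproduced here; the following is only a minimal NumPy sketch of the generic mutual-distillation objective, assuming a symmetric temperature-scaled KL divergence between the two branches' logits (the function names and temperature value are illustrative):

```python
import numpy as np

def softened_probs(logits, T):
    """Temperature-scaled softmax, computed stably via max subtraction."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def mutual_kd_loss(logits_a, logits_b, T=2.0):
    """Symmetric KL divergence between two branches' softened predictions.

    Each branch acts as teacher for the other; the T**2 factor keeps
    gradient magnitudes comparable across temperatures, as is standard
    in distillation objectives.
    """
    p = softened_probs(logits_a, T)
    q = softened_probs(logits_b, T)
    kl_pq = np.sum(p * (np.log(p) - np.log(q)), axis=-1).mean()
    kl_qp = np.sum(q * (np.log(q) - np.log(p)), axis=-1).mean()
    return (T ** 2) * 0.5 * (kl_pq + kl_qp)
```

In a training loop, this term would be added to each branch's task loss so the two adapters regularize one another; when both branches agree exactly, the loss vanishes.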
