FedDAT: An Approach for Foundation Model Finetuning in Multi-Modal Heterogeneous Federated Learning

Authors
Haokun Chen,Yao Zhang,Denis Krompaß,Jindong Gu,Volker Tresp
Source
Journal: Proceedings of the ... AAAI Conference on Artificial Intelligence [Association for the Advancement of Artificial Intelligence (AAAI)]
Volume/Issue: 38 (10): 11285-11293 · Cited by: 3
Identifier
DOI:10.1609/aaai.v38i10.29007
Abstract

Recently, foundation models have exhibited remarkable advancements in multi-modal learning. These models, equipped with millions (or billions) of parameters, typically require a substantial amount of data for finetuning. However, collecting and centralizing training data from diverse sectors is challenging due to distinct privacy regulations. Federated Learning (FL) emerges as a promising solution, enabling multiple clients to collaboratively train neural networks without centralizing their local data. To alleviate client computation burdens and communication overheads, previous works have adapted Parameter-efficient Finetuning (PEFT) methods to FL, so that only a small fraction of the model parameters is optimized and communicated during federated rounds. Nevertheless, most prior work has focused on a single modality and neglected a common phenomenon, i.e., data heterogeneity across clients. Therefore, in this work, we propose a finetuning framework tailored to heterogeneous multi-modal FL, called Federated Dual-Adapter Teacher (FedDAT). Specifically, our approach leverages a Dual-Adapter Teacher (DAT) to address data heterogeneity by regularizing the client local updates, and applies Mutual Knowledge Distillation (MKD) for efficient knowledge transfer. FedDAT is the first approach that enables efficient distributed finetuning of foundation models for a variety of heterogeneous Vision-Language tasks. To demonstrate its effectiveness, we conduct extensive experiments on four multi-modal FL benchmarks with different types of data heterogeneity, where FedDAT substantially outperforms existing centralized PEFT methods adapted for FL.
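The abstract does not spell out how Mutual Knowledge Distillation is computed; as a minimal illustrative sketch (not the authors' implementation), MKD between two branches is commonly realized as a symmetric KL divergence between their temperature-softened output distributions. All function names below are hypothetical, and NumPy stands in for a deep-learning framework:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax over the last axis."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) per sample, summed over the class axis."""
    return np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)

def mutual_kd_loss(logits_a, logits_b, temperature=2.0):
    """Symmetric (mutual) distillation loss between two branches.

    Each branch is trained to match the other's softened
    prediction: KL(p_b || p_a) pushes branch A toward B,
    and KL(p_a || p_b) pushes B toward A.
    """
    p_a = softmax(logits_a, temperature)
    p_b = softmax(logits_b, temperature)
    return kl_divergence(p_b, p_a).mean() + kl_divergence(p_a, p_b).mean()
```

The loss is zero when both branches agree exactly and grows as their softened predictions diverge; in a framework with autodiff, each KL term would typically be backpropagated only into the branch being regularized.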
