
Ensemble Attention Distillation for Privacy-Preserving Federated Learning

Keywords: Computer Science · Machine Learning · Node (physics) · Artificial Intelligence · Distillation · Data Mining · Structural Engineering · Engineering · Organic Chemistry · Chemistry
Authors
Xuan Gong,Abhishek Sharma,Srikrishna Karanam,Ziyan Wu,Terrence Chen,David Doermann,Arun Innanje
Identifier
DOI:10.1109/iccv48922.2021.01480
Abstract

We consider the problem of Federated Learning (FL), where numerous decentralized computational nodes collaborate to train a centralized machine learning model without explicitly sharing their local data samples. Such decentralized training naturally leads to imbalanced or differing data distributions among the local models and to challenges in fusing them into a central model. Existing FL methods deal with these issues by either sharing local parameters or fusing models via online distillation. However, such a design leads to multiple rounds of inter-node communication, resulting in substantial bandwidth consumption, while also increasing the risk of data leakage and consequent privacy issues. To address these problems, we propose a new distillation-based FL framework that preserves privacy by design while consuming substantially less network communication than current methods. Our framework engages in inter-node communication using only publicly available and approved datasets, thereby giving explicit privacy control to the user. To distill knowledge among the various local models, our framework involves a novel ensemble distillation algorithm that uses both final predictions and model attention. This algorithm explicitly considers the diversity among the local nodes while also seeking consensus among them, yielding a comprehensive technique for distilling knowledge from decentralized nodes. We demonstrate the various aspects and associated benefits of our FL framework through extensive experiments that produce state-of-the-art results on both classification and segmentation tasks on natural and medical images.
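The paper's full algorithm is not reproduced here, but the core idea the abstract describes can be sketched: local models run inference on a shared public dataset, the central student is trained toward their averaged softened predictions, and an attention-matching term aligns the student's spatial attention maps with the ensemble's. The following is a minimal numpy sketch under stated assumptions; the function names (`ensemble_distill_targets`, `distill_loss`), the activation-based attention map, the temperature value, and the simple unweighted mean over nodes are illustrative choices, not the authors' exact formulation (which additionally weights nodes by diversity).

```python
import numpy as np

def softmax(z, t=1.0):
    """Temperature-softened softmax over the last axis."""
    z = np.asarray(z, dtype=float) / t
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def ensemble_distill_targets(local_logits, temperature=2.0):
    """Consensus soft label for one public-dataset sample.

    local_logits: (num_nodes, num_classes) logits produced by each
    local model. A simple mean is used here; the paper's method also
    accounts for inter-node diversity when combining them.
    """
    return softmax(local_logits, t=temperature).mean(axis=0)

def attention_map(features):
    """Activation-based spatial attention: channel-wise sum of squared
    activations, L2-normalised. features: (C, H, W) from one layer."""
    a = (np.asarray(features, dtype=float) ** 2).sum(axis=0).ravel()
    return a / (np.linalg.norm(a) + 1e-8)

def distill_loss(student_logits, student_feats, local_logits, local_feats,
                 temperature=2.0, beta=1.0):
    """KL(ensemble targets || student) + beta * attention-matching term."""
    target = ensemble_distill_targets(local_logits, temperature)
    p = softmax(student_logits, t=temperature)
    kl = float((target * (np.log(target + 1e-8) - np.log(p + 1e-8))).sum())
    # Match the student's attention map to the mean of local attention maps.
    att_teacher = np.mean([attention_map(f) for f in local_feats], axis=0)
    att_student = attention_map(student_feats)
    att_loss = float(np.linalg.norm(att_student - att_teacher) ** 2)
    return kl + beta * att_loss
```

Because the loss is computed only on public, approved inputs, no raw local data or raw model parameters ever leave a node; each node communicates only its outputs on the shared dataset, which is what gives the framework its privacy-by-design property.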