Ensemble Attention Distillation for Privacy-Preserving Federated Learning

Authors
Xuan Gong, Abhishek Sharma, Srikrishna Karanam, Ziyan Wu, Terrence Chen, David Doermann, Arun Innanje
Identifier
DOI:10.1109/iccv48922.2021.01480
Abstract

We consider the problem of Federated Learning (FL) where numerous decentralized computational nodes collaborate with each other to train a centralized machine learning model without explicitly sharing their local data samples. Such decentralized training naturally leads to issues of imbalanced or differing data distributions among the local models and challenges in fusing them into a central model. Existing FL methods deal with these issues by either sharing local parameters or fusing models via online distillation. However, such a design leads to multiple rounds of inter-node communication resulting in substantial bandwidth consumption, while also increasing the risk of data leakage and consequent privacy issues. To address these problems, we propose a new distillation-based FL framework that can preserve privacy by design, while also consuming substantially less network communication resources when compared to the current methods. Our framework engages in inter-node communication using only publicly available and approved datasets, thereby giving explicit privacy control to the user. To distill knowledge among the various local models, our framework involves a novel ensemble distillation algorithm that uses both final prediction as well as model attention. This algorithm explicitly considers the diversity among various local nodes while also seeking consensus among them. This results in a comprehensive technique to distill knowledge from various decentralized nodes. We demonstrate the various aspects and the associated benefits of our FL framework through extensive experiments that produce state-of-the-art results on both classification and segmentation tasks on natural and medical images.
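
The ensemble distillation idea described in the abstract can be illustrated with a short sketch. The following is a minimal, illustrative example (not the authors' released code) of server-side distillation on a public dataset: frozen local models produce softened predictions and spatial attention maps, which are averaged into ensemble targets and matched by the central model via a KL loss and an MSE loss. The model interface (each model returning `(logits, feature_map)`), the attention-map formula, the simple averaging of ensemble targets, and the hyperparameters `T` and `beta` are assumptions made here for illustration; the paper additionally weighs node diversity and consensus when forming the ensemble targets.

```python
# Minimal sketch (assumptions noted above) of ensemble attention distillation
# from frozen local models to a central model, using only public data.
import torch
import torch.nn.functional as F

def attention_map(feat):
    """Spatial attention from a feature map: channel-wise mean of squared
    activations, normalized per sample (a common choice; the paper's exact
    formulation may differ)."""
    a = feat.pow(2).mean(dim=1)               # (B, H, W)
    return F.normalize(a.flatten(1), dim=1)   # (B, H*W)

def distill_step(central, local_models, public_images, optimizer, T=4.0, beta=1.0):
    """One distillation step on a batch of public/approved images: match the
    central model's softened predictions and attention maps to the averaged
    targets from the frozen local models."""
    with torch.no_grad():
        local_logits, local_attn = [], []
        for m in local_models:                # local models stay frozen
            logits, feat = m(public_images)   # assumed interface: (logits, feature map)
            local_logits.append(logits)
            local_attn.append(attention_map(feat))
        # Simple consensus: average the ensemble targets (the paper also
        # accounts explicitly for diversity among nodes).
        t_logits = torch.stack(local_logits).mean(0)
        t_attn = torch.stack(local_attn).mean(0)

    s_logits, s_feat = central(public_images)
    loss_pred = F.kl_div(F.log_softmax(s_logits / T, dim=1),
                         F.softmax(t_logits / T, dim=1),
                         reduction="batchmean") * T * T
    loss_attn = F.mse_loss(attention_map(s_feat), t_attn)
    loss = loss_pred + beta * loss_attn

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because only a public, approved dataset flows through `distill_step`, neither raw local data nor local model parameters need to leave their nodes, which is the privacy-by-design property the abstract emphasizes.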