
Ensemble Attention Distillation for Privacy-Preserving Federated Learning

Computer Science, Machine Learning, Node (physics), Artificial Intelligence, Distillation, Data Mining, Chemistry, Structural Engineering, Organic Chemistry, Engineering
Authors
Xuan Gong,Abhishek Sharma,Srikrishna Karanam,Ziyan Wu,Terrence Chen,David Doermann,Arun Innanje
Identifier
DOI:10.1109/iccv48922.2021.01480
Abstract

We consider the problem of Federated Learning (FL), where numerous decentralized computational nodes collaborate with each other to train a centralized machine learning model without explicitly sharing their local data samples. Such decentralized training naturally leads to issues of imbalanced or differing data distributions among the local models and challenges in fusing them into a central model. Existing FL methods deal with these issues by either sharing local parameters or fusing models via online distillation. However, such a design leads to multiple rounds of inter-node communication resulting in substantial bandwidth consumption, while also increasing the risk of data leakage and consequent privacy issues. To address these problems, we propose a new distillation-based FL framework that can preserve privacy by design, while also consuming substantially less network communication resources when compared to the current methods. Our framework engages in inter-node communication using only publicly available and approved datasets, thereby giving explicit privacy control to the user. To distill knowledge among the various local models, our framework involves a novel ensemble distillation algorithm that uses both final predictions and model attention. This algorithm explicitly considers the diversity among various local nodes while also seeking consensus among them. This results in a comprehensive technique to distill knowledge from various decentralized nodes. We demonstrate the various aspects and the associated benefits of our FL framework through extensive experiments that produce state-of-the-art results on both classification and segmentation tasks on natural and medical images.
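To make the core idea concrete, here is a minimal sketch of distillation from an ensemble of local models into a central model using only a public (proxy) batch, matching both the ensemble's soft predictions and an activation-based attention map. This is an illustration under stated assumptions, not the authors' exact algorithm (which additionally models diversity and consensus among nodes); the model, the attention definition, and all names (TinyCNN, attention_map, distill_step) are hypothetical.

```python
# Hedged sketch of ensemble distillation on public data with prediction + attention matching.
# NOT the paper's exact method; a toy illustration with stand-in models and random "public" data.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyCNN(nn.Module):
    """Toy classifier whose intermediate feature map we treat as the 'attention' source."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.head = nn.Linear(32, num_classes)

    def forward(self, x):
        feat = self.features(x)                    # B x 32 x H x W
        logits = self.head(feat.mean(dim=(2, 3)))  # global average pooling
        return logits, feat

def attention_map(feat):
    # Activation-based spatial attention: channel-wise mean of squared features,
    # L2-normalized per sample (in the spirit of attention-transfer distillation).
    att = feat.pow(2).mean(dim=1, keepdim=True)    # B x 1 x H x W
    return F.normalize(att.flatten(1), dim=1)

def distill_step(server, clients, public_x, opt, T=3.0, lam=1.0):
    """One distillation step on a public batch: match ensemble logits and attention."""
    with torch.no_grad():
        outs = [c(public_x) for c in clients]
        ens_logits = torch.stack([o[0] for o in outs]).mean(0)
        ens_att = torch.stack([attention_map(o[1]) for o in outs]).mean(0)

    s_logits, s_feat = server(public_x)
    kd_loss = F.kl_div(F.log_softmax(s_logits / T, dim=1),
                       F.softmax(ens_logits / T, dim=1),
                       reduction="batchmean") * T * T
    att_loss = F.mse_loss(attention_map(s_feat), ens_att)
    loss = kd_loss + lam * att_loss

    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

if __name__ == "__main__":
    torch.manual_seed(0)
    clients = [TinyCNN() for _ in range(3)]        # stand-ins for trained local models
    server = TinyCNN()                             # central model to be distilled
    opt = torch.optim.Adam(server.parameters(), lr=1e-3)
    public_batch = torch.randn(8, 3, 32, 32)       # stand-in for an approved public dataset batch
    for step in range(5):
        print(f"step {step}: loss = {distill_step(server, clients, public_batch, opt):.4f}")
```

Because only outputs on public data cross node boundaries, no raw local samples or model parameters need to be exchanged, which is the privacy and bandwidth argument the abstract makes.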
