
Data-Free Knowledge Distillation via Generator-Free Data Generation for Non-IID Federated Learning

Keywords
Computer science, Generator (circuit theory), Utilization, Distillation, Data mining, Artificial intelligence, Machine learning, Power (physics), Chemistry, Physics, Computer security, Organic chemistry, Quantum mechanics
Authors
Siran Zhao,Tianchi Liao,Lele Fu,Chuan Chen,Jing Bian,Zibin Zheng
Source
Journal: Research Square
Identifier
DOI:10.21203/rs.3.rs-3364332/v1
Abstract

Data heterogeneity (Non-IID) in Federated Learning (FL) is a widely recognized problem that leads to local model drift and performance degradation. Owing to its advantages, knowledge distillation has been explored in recent work to refine the global model. However, these approaches rely on a proxy dataset or a data generator. First, in many FL scenarios, a proxy dataset does not necessarily exist on the server. Second, the quality of data produced by a generator is unstable, and the generator depends on the computing resources of the server. In this work, we propose a novel data-Free knowledge distillation approach via generator-Free Data Generation for Non-IID FL, dubbed FedF2DG. Specifically, FedF2DG requires only local models to generate a pseudo dataset for each client, and can generate hard samples by adding an additional regularization term that exploits disagreements between the local model and the global model. Meanwhile, FedF2DG enables flexible utilization of computational resources by generating the pseudo dataset either locally or on the server. To address the label distribution shift in Non-IID FL, we propose a Data Generation Principle that adaptively controls the label distribution and size of the pseudo dataset based on the client's current state, which allows more client knowledge to be extracted. Knowledge distillation is then performed to transfer the knowledge in the local models to the global model. Extensive experiments demonstrate that the proposed method significantly outperforms state-of-the-art FL methods and can serve as a plugin for existing Federated Learning methods such as FedAvg and FedProx, improving their performance.