Computer science
Machine learning
Distillation
Artificial intelligence
Generalization
Generative model
Federated learning
Generative grammar
Proxy (statistics)
Benchmark (surveying)
Mutual information
Data mining
Mathematical analysis
Organic chemistry
Chemistry
Mathematics
Geography
Geodesy
Authors
Chao Peng, Yiming Guo, Yao Chen, Qilin Rui, Zhengfeng Yang, Chenyang Xu
Identifier
DOI:10.1007/978-3-031-39698-4_23
Abstract
Federated learning is a distributed machine learning paradigm in which models are trained locally and then aggregated on a server, so that raw data never leave the clients and privacy is preserved. However, user heterogeneity poses a challenge to federated learning. Recent work has proposed addressing this issue with knowledge distillation, but applying knowledge distillation in federated learning depends on a proxy dataset, which can be difficult to obtain in practice. Moreover, simple averaging of model parameters may fail to produce a global model with good generalization performance and may also lead to potential privacy breaches. To tackle these issues, we propose FedGM, a data-free federated knowledge distillation method that combines generative learning with mutual distillation. FedGM addresses user heterogeneity while also protecting user privacy. We use a conditional generator to extract global knowledge that guides local model training, and we build a proxy dataset on the server side to perform mutual distillation. Extensive experiments on benchmark datasets show that FedGM outperforms state-of-the-art approaches in generalization performance and privacy protection.
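The abstract only summarizes the mechanism, so the sketch below illustrates the two ingredients it names: a conditional generator that synthesizes pseudo-samples from class labels (replacing a real proxy dataset), and a mutual-distillation loss in which uploaded client models teach each other via KL divergence on softened predictions. This is a minimal sketch under stated assumptions, not the authors' implementation: all names (ConditionalGenerator, mutual_distillation_loss, server_distillation_round) and design details (feature-space generation, temperature, peer averaging) are illustrative choices.

```python
# Minimal sketch of data-free server-side mutual distillation.
# Assumptions (not from the paper): the generator produces pseudo-samples
# in a shared feature space, and client models map that space to logits.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConditionalGenerator(nn.Module):
    """Maps (noise, class label) to a pseudo-sample; stands in for real proxy data."""
    def __init__(self, noise_dim=64, num_classes=10, feat_dim=128):
        super().__init__()
        self.embed = nn.Embedding(num_classes, noise_dim)
        self.net = nn.Sequential(
            nn.Linear(noise_dim * 2, 256), nn.ReLU(),
            nn.Linear(256, feat_dim),
        )

    def forward(self, z, y):
        return self.net(torch.cat([z, self.embed(y)], dim=1))

def mutual_distillation_loss(logits_list, T=3.0):
    """Each model distills from the averaged soft labels of its peers.

    Expects two or more models; peers are detached so each model's
    gradient flows only through its own logits.
    """
    loss = 0.0
    for i, logits in enumerate(logits_list):
        peers = [l.detach() for j, l in enumerate(logits_list) if j != i]
        peer_soft = torch.softmax(torch.stack(peers).mean(0) / T, dim=1)
        loss = loss + F.kl_div(
            F.log_softmax(logits / T, dim=1), peer_soft,
            reduction="batchmean") * T * T
    return loss / len(logits_list)

def server_distillation_round(generator, client_models, num_classes=10,
                              batch=64, noise_dim=64, steps=10, lr=1e-3):
    """One server round: synthesize a proxy batch, then mutually distill."""
    params = [p for m in client_models for p in m.parameters()]
    opt = torch.optim.Adam(params, lr=lr)
    for _ in range(steps):
        z = torch.randn(batch, noise_dim)
        y = torch.randint(0, num_classes, (batch,))
        pseudo = generator(z, y).detach()      # data-free proxy batch
        logits = [m(pseudo) for m in client_models]
        # Anchor each model to the sampled labels, plus peer agreement.
        ce = sum(F.cross_entropy(l, y) for l in logits) / len(logits)
        loss = ce + mutual_distillation_loss(logits)
        opt.zero_grad(); loss.backward(); opt.step()
```

Detaching the generator output keeps this sketch focused on distilling the client models; in a full pipeline the generator itself would also be trained to carry the ensemble's global knowledge.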