Federated learning
Authors
Donglin Jiang, Shan Chen, Zhihui Zhang
Identifier
DOI: 10.1109/icaice51518.2020.00038
Abstract
Federated learning is a new scheme of distributed machine learning that enables a large number of edge computing devices to jointly learn a shared model without sharing private data. Because nodes synchronize only their locally trained models rather than their raw data, federated learning provides a guarantee of privacy and security. However, federated learning faces two heterogeneity challenges: (1) heterogeneous model architectures across devices, and (2) statistical heterogeneity in real federated datasets, which are not independent and identically distributed (non-IID); both lead to poor performance of traditional federated learning algorithms. To address these problems, this paper proposes FedDistill, a new distributed training method based on knowledge distillation. By introducing a personalized model on each device, FedDistill improves local performance even when the global model fails to adapt to the local dataset, thereby improving the capability and robustness of the global model. The improvement in local performance comes from knowledge distillation, which guides the improvement of the global model through knowledge transfer between heterogeneous networks. Experiments show that FedDistill significantly improves the accuracy of classification tasks and meets the needs of heterogeneous users.
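To make the distillation idea in the abstract concrete, below is a minimal, hypothetical sketch of a per-device training step in which a personalized local model fits its own (possibly non-IID) data while also matching the softened predictions of the shared global model. The function name, the temperature, and the loss weighting are illustrative assumptions, not the authors' FedDistill implementation.

```python
# Sketch of one local knowledge-distillation step (assumed setup, PyTorch).
import torch
import torch.nn as nn
import torch.nn.functional as F

def local_distillation_step(personal_model: nn.Module,
                            global_model: nn.Module,
                            x: torch.Tensor,
                            y: torch.Tensor,
                            optimizer: torch.optim.Optimizer,
                            temperature: float = 2.0,
                            alpha: float = 0.5) -> float:
    """One device-side step: cross-entropy on local labels plus a KL term
    that distills knowledge from the global model's soft predictions."""
    global_model.eval()
    personal_model.train()

    with torch.no_grad():
        teacher_logits = global_model(x)          # soft targets from the global model

    student_logits = personal_model(x)

    ce_loss = F.cross_entropy(student_logits, y)  # fit the local data
    kd_loss = F.kl_div(                           # match softened global predictions
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)

    loss = alpha * ce_loss + (1.0 - alpha) * kd_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because the distillation term only requires the teacher's output logits, the personalized and global models may have different architectures, which is how knowledge transfer between heterogeneous networks is possible in this setting.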