Computer science
Artificial intelligence
Convergence (economics)
Machine learning
Economics
Economic growth
Authors
Dezhong Yao,Wanning Pan,Yutong Dai,Yao Wan,Xiaofeng Ding,Yu Chen,Hai Jin,Zheng Xu,Lichao Sun
Identifier
DOI:10.1109/tc.2023.3315066
Abstract
Federated learning, as one enabling technology of edge intelligence, has gained substantial attention due to its efficacy in training deep learning models without data privacy and network bandwidth concerns. However, due to the heterogeneity of the edge computing system and data, many methods suffer from the "client-drift" issue that could considerably impede the convergence of global model training: local models on clients can drift apart, and the aggregated model can be different from the global optimum. To tackle this issue, one intuitive idea is to guide the local model training by global teachers, i.e., past global models, where each client learns the global knowledge from past global models via adaptive knowledge distillation techniques. Inspired by these insights, we propose a novel approach for heterogeneous federated learning, FedGKD, which fuses the knowledge from historical global models and guides local training to alleviate the "client-drift" issue. In this paper, we evaluate FedGKD through extensive experiments across various CV and NLP datasets (i.e., CIFAR-10/100, Tiny-ImageNet, AG News, SST5) under different heterogeneous settings. The proposed method is guaranteed to converge under common assumptions and outperforms the state-of-the-art baselines in the non-IID federated setting.
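To illustrate the idea described in the abstract, the following is a minimal sketch (not the authors' implementation) of a client-side update in which local training is regularized by knowledge distilled from past global models. It assumes a PyTorch-style setup; names such as `historical_globals`, `kd_weight`, and `temperature` are hypothetical and chosen for readability.

```python
# Illustrative sketch of local training guided by distillation from
# historical global models ("global teachers"); not the authors' code.
import copy
import torch
import torch.nn.functional as F

def local_update(model, historical_globals, loader, kd_weight=0.2,
                 temperature=1.0, lr=0.01, epochs=1, device="cpu"):
    """One client's local round.

    model:              local model, initialized from the current global model
    historical_globals: list of frozen past global models acting as teachers
    kd_weight:          weight of the distillation term relative to the task loss
    """
    model.to(device).train()
    teachers = [copy.deepcopy(m).to(device).eval() for m in historical_globals]
    opt = torch.optim.SGD(model.parameters(), lr=lr)

    for _ in range(epochs):
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            logits = model(x)
            task_loss = F.cross_entropy(logits, y)

            # Average the teachers' outputs to form the "global knowledge".
            with torch.no_grad():
                teacher_logits = torch.stack([t(x) for t in teachers]).mean(dim=0)

            # KL term pulls the local model toward the historical global models,
            # which is the stated intuition for mitigating client drift.
            kd_loss = F.kl_div(
                F.log_softmax(logits / temperature, dim=1),
                F.softmax(teacher_logits / temperature, dim=1),
                reduction="batchmean",
            ) * (temperature ** 2)

            loss = task_loss + kd_weight * kd_loss
            opt.zero_grad()
            loss.backward()
            opt.step()

    return model.state_dict()
```

In a full federated round, the server would broadcast the current global model, each client would run a local update like the one above, and the server would aggregate the returned weights; the sketch only covers the distillation-guided local step.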