Topics
Computer science; Convergence (economics); Set (abstract data type); Federated learning; Tracking (psycholinguistics); Mainstream; Artificial intelligence; Algorithm; Theoretical computer science; Law; Political science; Linguistics; Economic growth; Philosophy; Economics; Programming language
Authors
Lei Tan, Xiaoxi Zhang, Yipeng Zhou, Xinkai Che, Miao Hu, Xu Chen, Di Wu
Source
Journal: IEEE Transactions on Network Science and Engineering
[Institute of Electrical and Electronics Engineers]
Date: 2022-04-22
Volume/Issue: 9 (4): 2708-2720
Citations: 11
Identifier
DOI: 10.1109/tnse.2022.3168969
Abstract
Federated learning (FL) has become one of the mainstream paradigms for multi-party collaborative learning with privacy protection. Since it is difficult to guarantee that all FL devices are active simultaneously, a common approach is to let only a subset of devices participate in each round of model training. However, such partial device participation may introduce significant bias into the trained model. In this paper, we first conduct a theoretical analysis to investigate the negative impact of biased device participation and derive the convergence rate of FedAvg, the most well-known FL algorithm, under biased device participation. We further propose an optimized participation-aware federated learning algorithm called AdaFed, which adaptively tunes the aggregation weight of each device based on its historical participation records and removes the bias introduced by partial device participation. We also formally prove the convergence guarantee of AdaFed. Finally, we conduct trace-driven experiments to validate the effectiveness of our proposed algorithm. The experimental results are consistent with our theoretical analysis and show that AdaFed improves global model accuracy and converges much faster than state-of-the-art FL algorithms by eliminating the negative effect of biased device participation.
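The abstract only sketches the mechanism, so the following is a minimal illustrative sketch of the general idea of participation-aware aggregation, not the paper's actual AdaFed update: it assumes aggregation weights inversely proportional to each device's historical participation count, and all function names and the toy training loop are assumptions introduced here for clarity.

```python
import random
from typing import Dict, List

import numpy as np

# Illustrative sketch only: the exact AdaFed weighting rule is defined in the
# paper. Here we assume weights inversely proportional to each device's
# historical participation count, so rarely selected devices are up-weighted
# to offset the bias caused by partial device participation.

def adaptive_weights(participation_counts: Dict[int, int],
                     selected: List[int]) -> Dict[int, float]:
    """Normalized aggregation weights for the selected devices (assumed rule)."""
    raw = {k: 1.0 / max(participation_counts[k], 1) for k in selected}
    total = sum(raw.values())
    return {k: v / total for k, v in raw.items()}

def aggregate(local_models: Dict[int, np.ndarray],
              weights: Dict[int, float]) -> np.ndarray:
    """FedAvg-style weighted average of the selected devices' local models."""
    return sum(weights[k] * local_models[k] for k in local_models)

# Toy usage: 10 devices, 5 sampled per round, a small vector as the "model".
num_devices, per_round, dim = 10, 5, 4
counts = {k: 0 for k in range(num_devices)}
global_model = np.zeros(dim)
for rnd in range(3):
    selected = random.sample(range(num_devices), per_round)
    for k in selected:
        counts[k] += 1
    # Stand-in for local training: each device slightly perturbs the global model.
    local_models = {k: global_model + 0.1 * np.random.randn(dim) for k in selected}
    global_model = aggregate(local_models, adaptive_weights(counts, selected))
```

The intent of such a reweighting is that, in expectation over rounds, the aggregate behaves more like full device participation; the paper derives the precise correction used by AdaFed and its convergence guarantee.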