Tags: asynchronous communication, computer science, symbol, convergence (economics), computer networks, mathematics, economic growth, arithmetic, economics
Authors
Yu Zhang, Duo Liu, Moming Duan, Li Li, Xianzhang Chen, Ao Ren, Yujuan Tan, Chengliang Wang
Source
Journal: IEEE Transactions on Parallel and Distributed Systems
[Institute of Electrical and Electronics Engineers]
Date: 2023-03-01
Volume/Issue: 34(3): 1007-1019
Cited by: 10
Identifier
DOI: 10.1109/tpds.2023.3237752
Abstract
Federated learning (FL) is an emerging distributed machine learning paradigm that protects privacy and tackles the problem of isolated data islands. At present, there are two main communication strategies in FL: synchronous FL and asynchronous FL. The advantages of synchronous FL are high model precision and easy convergence. However, this synchronous communication strategy carries the risk of the straggler effect. Asynchronous FL has a natural advantage in mitigating the straggler effect, but it faces the threats of model quality degradation and server crashes. In this paper, we propose a model discrepancy-aware semi-asynchronous clustered FL framework, FedMDS, which alleviates the straggler effect by (1) a clustering strategy based on the delay and direction of model updates and (2) a synchronous trigger mechanism that limits model staleness. FedMDS leverages the clustering algorithm to reschedule the clients. Each group of clients performs asynchronous updates until the synchronous update mechanism based on the model discrepancy is triggered. We evaluate FedMDS on four typical federated datasets in a non-IID setting and compare FedMDS to the baselines. The experimental results show that FedMDS significantly improves average test accuracy by more than $+9.2\%$ on the four datasets compared to TA-FedAvg. In particular, FedMDS improves absolute Top-1 test accuracy by $+37.6\%$ on FEMNIST compared to TA-FedAvg. The frequency of synchronization waiting in FedMDS is significantly lower than in TA-FedAvg on all datasets. Moreover, FedMDS improves accuracy while alleviating the straggler effect.
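The abstract describes FedMDS only at a high level. Below is a minimal toy sketch, not the authors' implementation, of the general semi-asynchronous clustered FL pattern it outlines: clients are grouped by update delay, each group applies updates asynchronously, and a synchronous aggregation is forced once staleness or model discrepancy crosses a threshold. The median-delay clustering, the thresholds, and all names (`cluster_by_delay`, `STALENESS_LIMIT`, `DISCREPANCY_TAU`) are illustrative assumptions, not details taken from the paper.

```python
# Toy sketch of a semi-asynchronous clustered FL round; all constants and the
# clustering rule are assumptions made for illustration only.
import numpy as np

rng = np.random.default_rng(0)

DIM = 10                # toy model dimension
NUM_CLIENTS = 8
ROUNDS = 5
STALENESS_LIMIT = 2     # assumed staleness threshold before a forced sync
DISCREPANCY_TAU = 0.15  # assumed model-discrepancy threshold before a forced sync


def local_update(global_model, client_id):
    """Toy local training step; the simulated delay makes some clients stragglers."""
    delay = 1 + client_id % 4
    direction = rng.normal(size=global_model.shape)
    return 0.1 * direction / np.linalg.norm(direction), delay


def cluster_by_delay(delays, num_clusters=2):
    """Stand-in for clustering on update delay and direction:
    clients are simply split at the median delay."""
    median = np.median(delays)
    fast = [i for i, d in enumerate(delays) if d <= median]
    slow = [i for i, d in enumerate(delays) if d > median]
    return [g for g in (fast, slow) if g][:num_clusters]


global_model = np.zeros(DIM)
staleness = np.zeros(NUM_CLIENTS, dtype=int)

for rnd in range(ROUNDS):
    updates, delays = zip(*(local_update(global_model, c) for c in range(NUM_CLIENTS)))
    clusters = cluster_by_delay(delays)

    for group in clusters:
        # Asynchronous phase: apply updates in arrival (delay) order within the cluster.
        for c in sorted(group, key=lambda i: delays[i]):
            global_model = global_model + updates[c]
            staleness[c] = 0
        staleness[[c for c in range(NUM_CLIENTS) if c not in group]] += 1

        # Synchronous trigger: force a sync when staleness or the pairwise
        # discrepancy between client updates grows too large.
        discrepancy = max(
            (np.linalg.norm(updates[i] - updates[j]) for i in group for j in group),
            default=0.0,
        )
        if staleness.max() >= STALENESS_LIMIT or discrepancy >= DISCREPANCY_TAU:
            staleness[:] = 0
            print(f"round {rnd}: sync triggered for cluster {group}, "
                  f"discrepancy={discrepancy:.2f}")

print("final global model norm:", np.linalg.norm(global_model))
```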