Convergence (economics)
Computer science
Variance reduction
Scaffold
Variance (accounting)
Similarity (geometry)
Rate of convergence
Sampling (signal processing)
Distributed computing
Algorithm
Mathematical optimization
Artificial intelligence
Mathematics
Computer network
Telecommunications
Database
Channel (broadcasting)
Business
Accounting
Detector
Economics
Image (mathematics)
Economic growth
Authors
Sai Praneeth Karimireddy, Satyen Kale, Mehryar Mohri, Sashank J. Reddi, Sebastian U. Stich, Ananda Theertha Suresh
Abstract
Federated Averaging (FedAvg) has emerged as the algorithm of choice for federated learning due to its simplicity and low communication cost. However, in spite of recent research efforts, its performance is not fully understood. We obtain tight convergence rates for FedAvg and prove that it suffers from 'client-drift' when the data is heterogeneous (non-iid), resulting in unstable and slow convergence. As a solution, we propose a new algorithm (SCAFFOLD) which uses control variates (variance reduction) to correct for the 'client-drift' in its local updates. We prove that SCAFFOLD requires significantly fewer communication rounds and is not affected by data heterogeneity or client sampling. Further, we show that (for quadratics) SCAFFOLD can take advantage of similarity in the clients' data, yielding even faster convergence. The latter is the first result to quantify the usefulness of local steps in distributed optimization.
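To make the abstract's control-variate correction concrete, here is a minimal sketch of a SCAFFOLD-style round on toy quadratic client objectives f_i(x) = 0.5 * ||x - b_i||^2, using the paper's publicly described "Option II" control-variate refresh. It is not the authors' implementation; all names (local_update, server_round, targets, c_locals) and the toy setup are illustrative assumptions.

import numpy as np

def local_update(x, c, c_i, b_i, lr=0.1, steps=10):
    # Run K local gradient steps; each step corrects the local gradient
    # by (c - c_i), counteracting client-drift on heterogeneous data.
    y = x.copy()
    for _ in range(steps):
        grad = y - b_i                        # gradient of 0.5 * ||y - b_i||^2
        y = y - lr * (grad - c_i + c)
    # "Option II" refresh: c_i_new = c_i - c + (x - y) / (K * lr)
    c_i_new = c_i - c + (x - y) / (steps * lr)
    return y - x, c_i_new - c_i               # model delta, control delta

def server_round(x, c, c_locals, targets, sampled, lr_g=1.0):
    # Aggregate the sampled clients' deltas into the server model x
    # and the server control variate c.
    dy, dc = [], []
    for i in sampled:
        delta_y, delta_c = local_update(x, c, c_locals[i], targets[i])
        c_locals[i] = c_locals[i] + delta_c
        dy.append(delta_y)
        dc.append(delta_c)
    x = x + lr_g * np.mean(dy, axis=0)
    c = c + (len(sampled) / len(targets)) * np.mean(dc, axis=0)
    return x, c

# Toy run: 4 clients whose optima b_i disagree (a stand-in for non-iid data).
rng = np.random.default_rng(0)
targets = rng.normal(size=(4, 5))              # client optima b_i
x, c = np.zeros(5), np.zeros(5)
c_locals = [np.zeros(5) for _ in targets]
for _ in range(50):
    x, c = server_round(x, c, c_locals, targets, sampled=[0, 1, 2, 3])
print("distance to global optimum:", np.linalg.norm(x - targets.mean(axis=0)))

In this toy setting the global optimum is the mean of the b_i; without the (c - c_i) correction term, plain FedAvg's local steps would drift toward each client's own b_i, which is exactly the instability the abstract describes.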