MNIST database
Convergence (economics)
Computer science
Machine learning
Artificial intelligence
Federated learning
Server
Distributed computing
Deep learning
Cloud computing
Edge computing
Data mining
Computer network
Economic growth
Economics
Authors
Jianhang Xiao, Chunhui Du, Zijing Duan, Wei Guo
Source
Venue: International Symposium on Parallel and Distributed Computing
Date: 2021-07-28
Citations: 11
Identifier
DOI: 10.1109/ispdc52870.2021.9521631
Abstract
Federated learning has become a promising distributed machine learning approach in fields such as e-economics, autonomous driving, and medical imaging because of its privacy-aware manner. However, researchers have found that the performance of traditional federated learning approaches such as Federated Averaging (FedAvg) degrades severely under Non-Independent and Identically Distributed (Non-IID) conditions. We observe that part of the reason is the improper server-side aggregation method of traditional federated learning. The contributions of clients in federated learning can be distinguished by the validation accuracies of their trained models. Based on this observation, we propose a new federated learning algorithm, Accuracy Based Averaging (ABAVG), which improves the server-side aggregation of traditional federated learning and thereby accelerates its convergence in Non-IID situations. We extensively evaluate the proposed algorithm against FedAvg as a baseline under various Non-IID conditions to demonstrate its robustness. Experimental results show that ABAVG increases convergence speed by an average of 47% on the MNIST dataset, 59% on Fashion-MNIST, and 33% on CIFAR-10 across different data distributions.
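The abstract contrasts FedAvg's data-size-weighted aggregation with ABAVG's accuracy-weighted aggregation. The sketch below illustrates that contrast on flat parameter vectors; it is an assumption-laden reading of the abstract, not the paper's exact algorithm, and all function and variable names here are illustrative.

```python
# Illustrative sketch (not the paper's code): server-side aggregation of
# client parameter vectors, first FedAvg-style (weights proportional to
# client data size), then ABAVG-style as the abstract describes it
# (weights proportional to each client model's validation accuracy).

def fedavg_aggregate(client_weights, client_sizes):
    """Baseline FedAvg: average parameters weighted by client data size."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    agg = [0.0] * dim
    for w, n in zip(client_weights, client_sizes):
        for i in range(dim):
            agg[i] += (n / total) * w[i]
    return agg

def abavg_aggregate(client_weights, client_accuracies):
    """ABAVG-style: average parameters weighted by validation accuracy,
    so clients whose models validate better contribute more."""
    total = sum(client_accuracies)
    dim = len(client_weights[0])
    agg = [0.0] * dim
    for w, a in zip(client_weights, client_accuracies):
        for i in range(dim):
            agg[i] += (a / total) * w[i]
    return agg
```

Under Non-IID data, a client trained on a skewed label distribution may produce a poorly generalizing model; accuracy weighting down-weights such a client even if it holds many samples, which is the intuition the abstract gives for faster convergence.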