Computer science
Robustness (evolution)
Federated learning
Data modeling
Distributed computing
Latency (audio)
Data transmission
Process (computing)
Server
Raw data
Artificial intelligence
Machine learning
Computer network
Database
Operating system
Gene
Chemistry
Programming language
Telecommunications
Biochemistry
Authors
Yawen Xu, Xiaojun Li, Zeyu Yang, Hengjie J. Song
Abstract
Federated learning is an emerging machine learning setting in which a shared model is trained on large amounts of decentralized data while preserving data privacy. However, the communication cost of federated learning is high, especially for mobile devices with high latency and low throughput. Although several algorithms have been proposed to reduce the communication cost, they are extremely sensitive to the data distribution and can even be inapplicable to real-world Non-IID client data. In this paper, we propose an effective communication strategy for federated learning called FedSAA, which improves test performance on Non-IID data by introducing a self-attention mechanism. Our paper presents two major innovations. First, we use a self-attention mechanism to optimize both the server-to-client and the client-to-client parameter divergence during model aggregation, which improves the model's robustness to Non-IID data. Second, we adopt a sign compression operator to reduce the cost of data transmission between nodes. The experimental results demonstrate that, on Non-IID data, the model accuracy of our communication-efficient strategy is superior to that of other communication-efficient algorithms.
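To make the two ideas in the abstract concrete, here is a minimal NumPy sketch of one aggregation round, assuming flattened parameter vectors. It is not the paper's implementation: the names (sign_compress, attention_weights, aggregate) are illustrative, and the "self-attention" step is simplified to a softmax over negative server-to-client parameter divergence, which only approximates the mechanism FedSAA describes.

```python
import numpy as np

def sign_compress(update):
    # 1-bit sign compression: send only the sign of each coordinate
    # plus a single scalar scale (the mean magnitude).
    scale = np.mean(np.abs(update))
    return np.sign(update).astype(np.int8), scale

def sign_decompress(signs, scale):
    # Reconstruct an approximate dense update from signs and the scale.
    return signs.astype(np.float32) * scale

def attention_weights(server_model, client_models, temperature=1.0):
    # Softmax over negative parameter divergence: clients whose models
    # stay closer to the server model get larger aggregation weights,
    # damping the influence of divergent (Non-IID) clients.
    div = np.array([np.linalg.norm(c - server_model) for c in client_models])
    scores = -div / temperature
    scores -= scores.max()  # numerical stability
    w = np.exp(scores)
    return w / w.sum()

def aggregate(server_model, compressed_updates, lr=1.0):
    # One round: decompress sign-compressed client updates, weight them
    # by attention, and apply the weighted average to the server model.
    updates = [sign_decompress(s, a) for s, a in compressed_updates]
    client_models = [server_model + u for u in updates]
    w = attention_weights(server_model, client_models)
    avg_update = sum(wi * u for wi, u in zip(w, updates))
    return server_model + lr * avg_update

# Usage: three clients send compressed updates for a 4-parameter model;
# the third client is deliberately divergent, so it is down-weighted.
server = np.zeros(4, dtype=np.float32)
raw_updates = [np.array([0.2, -0.1, 0.05, 0.3]),
               np.array([0.1, -0.2, 0.00, 0.2]),
               np.array([5.0, 4.0, -3.0, 6.0])]
compressed = [sign_compress(u) for u in raw_updates]
server = aggregate(server, compressed)
print(server)
```

Note the communication saving this sketch illustrates: each client transmits one int8 per parameter plus one float scale, roughly a 32x reduction over sending float32 updates, at the cost of a coarser update direction.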