MNIST database
Computer science
Dimension (graph theory)
Algorithm
Communication system
Artificial neural network
Theoretical computer science
Artificial intelligence
Mathematics
Computer network
Pure mathematics
Authors
Runmeng Du, Daojing He, Zikang Ding, Miao Wang, Sammy Chan, Xuru Li
Source
Journal: IEEE Internet of Things Journal
[Institute of Electrical and Electronics Engineers]
Date: 2024-05-20
Volume/Issue: 11 (17): 28253-28266
Identifier
DOI:10.1109/jiot.2024.3403178
Abstract
This paper addresses the challenge of communication efficiency in federated learning through a proposed algorithm called global sparsification with adaptive aggregated stochastic gradients (GSASG). GSASG combines the advantages of local sparse communication, global sparsification communication, and adaptive aggregated gradients. More specifically, we devise an efficient global top-k' sparsification operator. By applying this operator to the aggregated gradients obtained from top-k sparsification, the global model parameter is sparsified to reduce the downloaded transmitted bits from O(dMT) to O(k'MT), where d is the dimension of the gradient, M is the number of workers, T is the total number of epochs, and k' ≤ k.
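The abstract describes a two-stage scheme: each worker uploads a top-k sparsified gradient, and the server applies a coarser global top-k' (with k' ≤ k) to the aggregate before broadcasting. The sketch below illustrates that idea only; it is not the paper's implementation, and all names, shapes, and the mean-aggregation step are assumptions for illustration.

```python
import numpy as np

def top_k_sparsify(vec, k):
    """Keep the k largest-magnitude entries of vec; zero out the rest."""
    out = np.zeros_like(vec)
    if k <= 0:
        return out
    idx = np.argpartition(np.abs(vec), -k)[-k:]  # indices of top-k magnitudes
    out[idx] = vec[idx]
    return out

# Hypothetical round: M workers, gradient dimension d, upload budget k,
# download budget k' <= k (all values chosen only for this sketch).
rng = np.random.default_rng(0)
d, M, k, k_prime = 10, 3, 4, 2

local_grads = [rng.normal(size=d) for _ in range(M)]
sparse_uploads = [top_k_sparsify(g, k) for g in local_grads]   # upload: <= k entries each
aggregated = np.mean(sparse_uploads, axis=0)                   # server-side aggregation
global_update = top_k_sparsify(aggregated, k_prime)            # download: <= k' entries
```

Because the broadcast update carries at most k' nonzero entries per round instead of d, the download cost drops from O(dMT) to O(k'MT) over T epochs, matching the bound stated in the abstract.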