Computer science
Bottleneck
MNIST database
Federated learning
Lossy compression
Residual
Bandwidth (computing)
Software deployment
Wireless
Computer network
Compression ratio
Scheme (mathematics)
Distributed computing
Machine learning
Data mining
Artificial intelligence
Deep learning
Algorithm
Telecommunications
Embedded system
Mathematical analysis
Mathematics
Automotive engineering
Engineering
Internal combustion engine
Operating system
Authors
Rui Song, Liguo Zhou, Lingjuan Lyu, Andreas Festag, Alois Knoll
Source
Journal: IEEE Internet of Things Journal [Institute of Electrical and Electronics Engineers]
Date: 2023-10-12
Volume/Issue: 1-1
Citations: 3
Identifier
DOI: 10.1109/jiot.2023.3324079
Abstract
Federated learning allows for cooperative training among distributed clients by sharing their locally learned model parameters, such as weights or gradients. However, as model size increases, the communication bandwidth required for deployment in wireless networks becomes a bottleneck. To address this, we propose a residual-based federated learning framework (ResFed) that transmits residuals instead of gradients or weights over the network. By predicting model updates at both the clients and the server, residuals are calculated as the difference between the updated and predicted models and carry denser information than weights or gradients. We find that residuals are less sensitive to an increasing compression ratio than other parameters, and hence apply lossy compression to the residuals to improve communication efficiency when training in federated settings. At the same compression ratio, ResFed outperforms current weight- or gradient-based federated learning methods by over 1.4× in client-to-server communication on federated datasets including MNIST, FashionMNIST, SVHN, CIFAR-10, CIFAR-100, and FEMNIST, and can also be applied to reduce communication costs in server-to-client communication.
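To make the residual mechanism concrete, here is a minimal sketch of one communication round. The abstract does not specify ResFed's predictor or its lossy codec, so this sketch assumes a simple linear-extrapolation predictor (last synchronized state plus last update) and top-k sparsification as a stand-in compressor; the function names (`client_encode`, `server_decode`, `top_k_sparsify`) are illustrative, not from the paper.

```python
import numpy as np

def predict_model(prev_model, prev_update):
    # Assumed predictor, shared by client and server: linear
    # extrapolation from the last synchronized state. The paper's
    # actual prediction rule may differ.
    return prev_model + prev_update

def top_k_sparsify(x, ratio):
    # Stand-in lossy compressor: keep the largest-magnitude
    # fraction `ratio` of entries and zero out the rest.
    k = max(1, int(ratio * x.size))
    flat = x.ravel().copy()
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    out = np.zeros_like(flat)
    out[idx] = flat[idx]
    return out.reshape(x.shape)

def client_encode(updated_model, prev_model, prev_update, ratio=0.01):
    # Client side: after local training, send the compressed residual
    # (updated model minus predicted model) instead of weights/gradients.
    predicted = predict_model(prev_model, prev_update)
    residual = updated_model - predicted
    return top_k_sparsify(residual, ratio)

def server_decode(compressed_residual, prev_model, prev_update):
    # Server side: rebuild the same prediction and add the residual.
    predicted = predict_model(prev_model, prev_update)
    return predicted + compressed_residual

# Toy round with synthetic weights, to show the encode/decode flow.
rng = np.random.default_rng(0)
prev_model = rng.normal(size=10_000)
prev_update = 0.1 * rng.normal(size=10_000)
updated_model = prev_model + prev_update + 0.01 * rng.normal(size=10_000)

payload = client_encode(updated_model, prev_model, prev_update, ratio=0.05)
recovered = server_decode(payload, prev_model, prev_update)
err = np.linalg.norm(recovered - updated_model) / np.linalg.norm(updated_model)
```

Because both sides compute the same prediction from already-shared state, the only reconstruction error comes from the lossy codec; when the predictor tracks training well, the residual is small and concentrated, which is why it tolerates aggressive compression better than raw weights or gradients.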