Keywords: Computer science, Upload, Quantization (signal processing), Edge device, Machine learning, Overhead (engineering), Artificial intelligence, Computer engineering, Cloud computing, Algorithm, Operating system
Authors
Zhong Long, Yuling Chen, Hui Dou, Yangwen Zhang, Yao Chen
Source
Journal: IEEE Transactions on Consumer Electronics
[Institute of Electrical and Electronics Engineers]
Date: 2024-01-01
Volume/Issue: 1-1
Identifiers
DOI: 10.1109/tce.2024.3352432
Abstract
In this paper, we explore an emerging consumer application, federated learning in Mobile Edge Computing (MEC). Federated learning is a machine learning technique that preserves data privacy and avoids data silos. However, high-performance federated learning tasks often involve large model parameters and high communication costs. To address this challenge, this paper presents FedSQ (Federated Learning with Sparsity and Quantization), a novel sparsity and quantization strategy that reduces communication overhead and enhances model convergence. FedSQ selectively uploads the parameters with large gradient change ratios and reuses the previous round's values for the parameters with small gradient change ratios. FedSQ also incorporates an error compensation mechanism with correction to mitigate the compression loss and balance the model parameters between the current and previous rounds. We conduct experiments on standard datasets and demonstrate that FedSQ achieves a high compression ratio without significantly sacrificing model accuracy in various federated learning tasks. Across all communication rounds, FedSQ achieved a highest accuracy improvement of 25.74% and a highest compression rate of 97.98%. Our research provides a novel intelligent data-driven model for consumer electronics, which can improve the communication efficiency and model convergence of client applications in MEC.
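The abstract describes three ideas: upload only the parameters whose gradient change ratio is large, let the server reuse the previous round's values for the rest, and feed the resulting compression error back for correction in the next round. The sketch below illustrates that combination on a single client update; it is an assumption-laden reconstruction, not the paper's implementation — the top-k selection rule, the `keep_ratio`, and the uniform 8-bit quantizer are illustrative choices not given in the abstract.

```python
import numpy as np

def fedsq_style_compress(curr, prev, residual, keep_ratio=0.1, bits=8):
    """Hypothetical FedSQ-style client-side compression step.

    Only the abstract's ideas are modeled: sparsify by gradient-change
    magnitude, quantize what is uploaded, and carry an error-feedback
    residual so the compression loss is compensated next round.
    """
    # Error compensation: fold last round's residual into this update.
    corrected = curr + residual

    # Rank parameters by how much they changed since the previous round.
    change = np.abs(corrected - prev)
    k = max(1, int(keep_ratio * change.size))
    idx = np.argsort(change)[-k:]          # indices actually uploaded

    # Uniform symmetric quantization of the selected values (illustrative).
    vals = corrected[idx]
    scale = np.abs(vals).max() / (2 ** (bits - 1) - 1)
    scale = scale if scale > 0 else 1.0
    qvals = np.round(vals / scale).astype(np.int8)

    # Server-side reconstruction: keep old values everywhere else,
    # dequantize the uploaded top-k entries.
    recon = prev.copy()
    recon[idx] = qvals.astype(np.float64) * scale

    # What was lost to sparsification + quantization feeds the next round.
    new_residual = corrected - recon
    return idx, qvals, scale, recon, new_residual

rng = np.random.default_rng(0)
prev = rng.normal(size=100)
curr = prev + rng.normal(scale=0.1, size=100)
idx, qvals, scale, recon, res = fedsq_style_compress(curr, prev, np.zeros(100))
print(f"uploaded {len(idx)}/{curr.size} parameters")
```

With `keep_ratio=0.1` only 10 of 100 values (plus their indices and one scale) travel to the server, which matches the abstract's framing of trading a small reconstruction error, later corrected by the residual, for a large reduction in uploaded bytes.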