Computer science
Encryption
Scheme (mathematics)
Flexibility (engineering)
Distributed computing
Artificial intelligence
Machine learning
Data mining
Computer network
Mathematical analysis
Statistics
Mathematics
Authors
Qian Wang, Siguang Chen, Meng Wu
Source
Journal: IEEE Transactions on Network and Service Management
[Institute of Electrical and Electronics Engineers]
Date: 2024-01-01
Volume/Issue: 1-1
Identifier
DOI:10.1109/tnsm.2023.3323129
Abstract
Federated learning (FL) has gained significant momentum and is widely applied to train models in distributed scenarios. However, high communication cost, poor performance on heterogeneous datasets and models, and the risk of privacy leakage remain major problems in FL. In this paper, we propose a communication-efficient, privacy-preserving personalized FL scheme. First, we develop personalized FL with feature-fusion-based mutual learning, which achieves communication-efficient and personalized learning by training a shared model, a private model, and a fusion model reciprocally on each client. Specifically, only the shared model is exchanged with the global model, which reduces communication cost; the private model can be personalized; and the fusion model adaptively fuses local and global knowledge at different training stages. Second, to further reduce communication cost and protect gradient privacy, we design a privacy-preserving gradient-compression method. In this method, we construct a chaotic encrypted cyclic measurement matrix, which provides strong privacy protection together with lightweight compression. Moreover, we present a sparsity-based adaptive iterative hard-thresholding algorithm to improve flexibility and reconstruction performance. Finally, we perform extensive experiments on different datasets and models, and the results show that our scheme achieves more competitive results than other benchmarks in both model performance and privacy.
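The three-model mutual-learning idea described in the abstract can be illustrated with a toy loss computation. The sketch below follows the general deep-mutual-learning pattern (a task loss per model plus KL terms pulling each model toward its peers' soft predictions); the exact loss weights, KL directions, and feature-fusion mechanism of the paper are not specified in the abstract, so all function names and the particular loss arrangement here are illustrative assumptions.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax over class logits."""
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def kl(p, q):
    """Mean KL divergence KL(p || q) between batches of distributions."""
    return np.mean(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=1))

def cross_entropy(p, labels):
    """Mean cross-entropy of predicted distributions against integer labels."""
    return -np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-12))

def mutual_learning_losses(logits_shared, logits_private, logits_fusion, labels):
    """Per-model losses for one batch: each model fits the labels and is
    additionally pulled toward the others' soft predictions (illustrative
    arrangement, not the paper's exact objective)."""
    ps, pp, pf = map(softmax, (logits_shared, logits_private, logits_fusion))
    loss_shared = cross_entropy(ps, labels) + kl(pf, ps)
    loss_private = cross_entropy(pp, labels) + kl(pf, pp)
    loss_fusion = cross_entropy(pf, labels) + 0.5 * (kl(ps, pf) + kl(pp, pf))
    return loss_shared, loss_private, loss_fusion

# Toy batch: 4 samples, 3 classes, random logits for all three models.
rng = np.random.default_rng(0)
labels = np.array([0, 1, 2, 1])
losses = mutual_learning_losses(rng.normal(size=(4, 3)),
                                rng.normal(size=(4, 3)),
                                rng.normal(size=(4, 3)),
                                labels)
```

In this arrangement only the shared model's parameters would be uploaded to the server, while the private and fusion models stay on the client, matching the communication pattern the abstract describes.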
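The gradient-compression pipeline can be sketched with standard compressed-sensing primitives. Below, a logistic-map chaotic sequence seeds a circulant ±1 measurement matrix, a stand-in for the paper's chaotic encrypted cyclic measurement matrix, and a basic iterative hard-thresholding (IHT) loop reconstructs a sparse gradient from its compressed measurements. The specific chaotic map, its parameters, and the paper's sparsity-adaptive thresholding schedule are assumptions here, not the published construction.

```python
import numpy as np

def logistic_sequence(seed, length, mu=3.99):
    """Chaotic logistic-map sequence; an illustrative stand-in for the
    paper's chaotic encrypted construction."""
    x, out = seed, np.empty(length)
    for i in range(length):
        x = mu * x * (1.0 - x)
        out[i] = x
    return out

def circulant_measurement_matrix(m, n, seed=0.37):
    """m x n matrix whose rows are cyclic shifts of one chaotic +/-1 row;
    only the scalar seed would need to be shared to regenerate it."""
    row = np.sign(logistic_sequence(seed, n) - 0.5)
    Phi = np.empty((m, n))
    for i in range(m):
        Phi[i] = np.roll(row, i)
    return Phi / np.sqrt(m)

def iht(Phi, y, k, iters=300):
    """Basic iterative hard thresholding: gradient step on ||y - Phi x||^2,
    then keep the k largest-magnitude coefficients."""
    step = 1.0 / np.linalg.norm(Phi, 2) ** 2
    x = np.zeros(Phi.shape[1])
    for _ in range(iters):
        x = x + step * Phi.T @ (y - Phi @ x)
        idx = np.argpartition(np.abs(x), -k)[:-k]  # indices of the n-k smallest
        x[idx] = 0.0
    return x

# Toy example: compress a k-sparse "gradient" of length n to m measurements.
rng = np.random.default_rng(0)
n, m, k = 256, 96, 8
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
Phi = circulant_measurement_matrix(m, n)
y = Phi @ x_true          # the client transmits m floats instead of n
x_hat = iht(Phi, y, k)    # the server reconstructs a k-sparse estimate
```

The privacy intuition is that an eavesdropper who sees only `y` cannot invert the compression without the chaotic seed that generates `Phi`; the paper's adaptive IHT variant additionally estimates the sparsity level `k` rather than fixing it as above.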