Keywords
Computer science, Overhead (engineering), Server, Collusion, Scheme (mathematics), Encryption, Cryptography, Computer network, Hash function, Lossless compression, Information privacy, Computer security, Distributed computing, Data compression, Algorithm, Operating system, Mathematical analysis, Mathematics, Economy, Microeconomics
Authors
Liangyu Zhong, Lulu Wang, Lei Zhang, Josep Domingo-Ferrer, Lin Xu, Changti Wu, Rui Zhang
Source
Journal: IEEE Transactions on Network and Service Management
[Institute of Electrical and Electronics Engineers]
Date: 2024-05-10
Volume/Issue: 21 (4): 4787-4800
Citations: 1
Identifiers
DOI: 10.1109/tnsm.2024.3399534
Abstract
Federated learning (FL) allows multiple users to collaboratively train global machine learning models while keeping their data sets local. However, existing privacy-preserving FL schemes suffer from several limitations, e.g., loss of accuracy, high communication/computation costs, failure to support dynamic users, and insecurity against collusion attacks. To address these limitations, we propose a lightweight privacy-preserving FL scheme based on a dual-server architecture. Our scheme involves only lightweight cryptographic operations, i.e., hashing and symmetric encryption, and it has low communication overhead; thus, it is computationally lightweight and round-efficient. Further, it allows users to join or quit an FL task, and it incurs no loss of accuracy. We formally prove that our scheme remains secure even under collusion attacks; in particular, even if an attacker colludes with one of the servers and with all but two of the users participating in an FL task, the privacy of user gradients is preserved. The reported experimental results demonstrate that our scheme incurs only a marginal increase in total communication overhead compared to an FL scheme without any privacy protection. In terms of computation overhead, the per-user cost remains stable as the number of users grows, while the server cost is comparable to that of an FL scheme without privacy protection.
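The abstract does not spell out the protocol, so the following is only a minimal, hypothetical sketch of a generic secure-aggregation idea consistent with its ingredients (hash functions, cheap symmetric primitives, masked gradients that sum to the true aggregate). It is not the authors' actual dual-server scheme. It illustrates hash-derived pairwise additive masking: each pair of users expands a shared seed into a mask, one adds it and the other subtracts it, so the server's aggregate equals the plaintext sum exactly (hence no accuracy loss) while each individual upload looks pseudorandom. All names (mask_stream, masked_gradient) and the seed-agreement step are illustrative assumptions.

```python
import hashlib
import numpy as np

def mask_stream(seed: bytes, length: int) -> np.ndarray:
    """Expand a shared seed into `length` pseudorandom floats via
    counter-mode SHA-256. Illustrative only; a real scheme would use
    a vetted PRG/KDF."""
    words, counter = [], 0
    while len(words) < length:
        digest = hashlib.sha256(seed + counter.to_bytes(4, "big")).digest()
        words.extend(np.frombuffer(digest, dtype="<u4"))  # 8 words/digest
        counter += 1
    # Map 32-bit words into [-0.5, 0.5) so masks stay numerically tame.
    return (np.array(words[:length], dtype=np.float64) / 2**32) - 0.5

def masked_gradient(uid, grad, seeds):
    """Add +mask for each peer with a larger id and -mask for each
    smaller one, so every pairwise mask cancels in the aggregate."""
    g = grad.astype(np.float64).copy()
    for peer, seed in seeds[uid].items():
        sign = 1.0 if uid < peer else -1.0
        g += sign * mask_stream(seed, g.size).reshape(grad.shape)
    return g

# Toy run: 3 users with pairwise seeds (assumed agreed out of band).
rng = np.random.default_rng(0)
grads = {u: rng.normal(size=4) for u in range(3)}
seeds = {u: {} for u in range(3)}
for i in range(3):
    for j in range(i + 1, 3):
        s = hashlib.sha256(f"pair-{i}-{j}".encode()).digest()
        seeds[i][j] = s
        seeds[j][i] = s

aggregate = sum(masked_gradient(u, grads[u], seeds) for u in range(3))
assert np.allclose(aggregate, sum(grads.values()))  # masks cancel
```

In this toy construction, a coalition recovers a user's gradient only if it learns all of that user's pairwise seeds, which loosely mirrors the abstract's guarantee that privacy survives collusion between one server and all but two users; the paper's actual mechanism, round structure, and dynamic join/quit handling are not derivable from the abstract alone.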