Topics: Computer science, Homomorphic encryption, Scalability, Server, Collusion, Single point of failure, Distributed computing, Federated learning, Encryption, Information privacy, Scheme (mathematics), Distributed learning, Computer network, Computer security, Database, Psychology, Mathematical analysis, Pedagogy, Mathematics, Economics, Microeconomics
Authors
Mengxue Shang, Dandan Zhang, Fengyin Li
Identifier
DOI:10.1109/dspp58763.2023.10405290
Abstract
Federated learning is a distributed machine learning paradigm that has been widely studied and applied in a variety of scenarios. Because conventional federated learning relies on a single central server to receive model updates from all clients, it imposes extremely high network bandwidth requirements and carries risks of single point of failure and privacy leakage. To prevent data leakage, this paper proposes a local data aggregation scheme based on xMK-CKKS. To provide decentralized service, it proposes a global model aggregation scheme based on RingAllreduce. Building on these, a decentralized distributed federated learning scheme based on multi-key homomorphic encryption is proposed, realizing decentralized hierarchical federated learning with privacy protection. Security and performance analyses show that the proposed scheme scales to larger federated learning scenarios while ensuring data security, and remains robust against collusion among up to $k \lt N-1$ clients and distributed servers.
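The RingAllreduce pattern mentioned above removes the central aggregation server: each of the $N$ participants splits its model update into $N$ chunks and, in $2(N-1)$ ring steps (a scatter-reduce phase followed by an all-gather phase), every participant ends up holding the sum of all updates. The sketch below simulates this in plain Python on plaintext vectors; the function name and the single-process simulation are illustrative assumptions, not the paper's implementation, and the real scheme would run this over encrypted (xMK-CKKS) updates across networked nodes.

```python
# Illustrative simulation of RingAllreduce (not the paper's code).
# updates[i] is node i's model update, a vector of length n split into
# n single-element "chunks". After 2*(n-1) ring steps, every node holds
# the elementwise sum of all updates -- no central server involved.
def ring_allreduce(updates):
    n = len(updates)
    chunks = [list(u) for u in updates]  # chunks[i][j]: node i's j-th chunk

    # Phase 1: scatter-reduce. In step s, node i-1 sends chunk (i-1-s) mod n
    # to node i, which adds it to its own copy of that chunk. After n-1
    # steps, node i holds the fully reduced chunk (i+1) mod n.
    for s in range(n - 1):
        incoming = [chunks[(i - 1) % n][(i - 1 - s) % n] for i in range(n)]
        for i in range(n):
            j = (i - 1 - s) % n
            chunks[i][j] = chunks[i][j] + incoming[i]

    # Phase 2: all-gather. Each node forwards a fully reduced chunk to its
    # successor, which overwrites its stale copy; after n-1 steps every
    # node has all n reduced chunks.
    for s in range(n - 1):
        incoming = [chunks[(i - 1) % n][(i - s) % n] for i in range(n)]
        for i in range(n):
            chunks[i][(i - s) % n] = incoming[i]

    return chunks  # chunks[i] == elementwise sum of all updates, for every i

# Each node ends with the global sum [111, 222, 333]:
result = ring_allreduce([[1, 2, 3], [10, 20, 30], [100, 200, 300]])
```

Each node sends only its own chunks to one neighbor per step, so per-node bandwidth stays constant as $N$ grows, which is the scalability advantage the abstract claims over single-server aggregation.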