Computer science
Interoperability
Federated learning
Cluster analysis
Protocol (science)
Distributed computing
Construct (Python library)
Partition (number theory)
Data aggregator
Data mining
Machine learning
Computer network
Wireless sensor network
Mathematics
Medicine
Operating system
Combinatorics
Pathology
Alternative medicine
Authors
Ziyao Liu, Jiale Guo, Wenzhuo Yang, Jiani Fan, Kwok-Yan Lam, Jun Zhao
Source
Journal: IEEE Transactions on Dependable and Secure Computing (Institute of Electrical and Electronics Engineers)
Date: 2024-01-01
Pages: 1-12
Identifier
DOI: 10.1109/tdsc.2024.3355458
Abstract
With the wider adoption of machine learning and increasing concern about data privacy, federated learning (FL) has received tremendous attention. FL schemes typically enable a set of participants, i.e., data owners, to individually train a machine learning model on their local data; these local models are then aggregated, under the coordination of a central server, to construct a global FL model. Improvements upon standard FL include (i) reducing the communication overhead of gradient transmission by utilizing gradient sparsification and (ii) enhancing the security of aggregation by adopting privacy-preserving aggregation (PPAgg) protocols. However, state-of-the-art PPAgg protocols do not interoperate easily with gradient sparsification because users' sparsified gradient vectors are heterogeneous, i.e., different users retain different sets of gradient coordinates. To resolve this issue, we propose a Dynamic User Clustering (DUC) approach with a set of supporting protocols that partitions users into clusters based on the nature of the PPAgg protocol and the gradient sparsification technique, providing both security guarantees and communication efficiency. Experimental results show that DUC-FL significantly reduces communication overhead and achieves model accuracy similar to the baselines. The simplicity of the proposed protocol makes it attractive both for implementation and for further improvement.
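To make the interoperability problem concrete, below is a minimal Python sketch, not the paper's actual DUC protocol: it combines top-k gradient sparsification with a toy pairwise additive-masking scheme (a simplified stand-in for PPAgg) and groups users whose sparsified index sets coincide, so that within each cluster the masked vectors are aligned coordinate-by-coordinate and the masks cancel on summation. All names (top_k_sparsify, pairwise_masks) and parameter values are illustrative assumptions.

```python
import numpy as np
from collections import defaultdict

DIM, K, N_USERS = 8, 3, 6  # model size, sparsity level, number of users (toy values)

def top_k_sparsify(grad, k):
    """Keep only the k largest-magnitude coordinates of a gradient vector."""
    idx = np.sort(np.argsort(np.abs(grad))[-k:])
    return tuple(int(i) for i in idx), grad[idx]

def pairwise_masks(users, length):
    """Toy pairwise masking: for each user pair (i, j) with i < j, user i adds
    +m_ij and user j adds -m_ij, so every mask cancels in the sum over the
    cluster. Real PPAgg protocols derive m_ij from agreed PRG seeds and handle
    user dropout; none of that is modeled here."""
    masks = {u: np.zeros(length) for u in users}
    for a, i in enumerate(users):
        for j in users[a + 1:]:
            m = np.random.default_rng(1_000_003 * i + j).normal(size=length)
            masks[i] += m
            masks[j] -= m
    return masks

rng = np.random.default_rng(0)
grads = {u: rng.normal(size=DIM) for u in range(N_USERS)}

# Each user sparsifies locally; index sets generally differ across users,
# which is exactly what breaks naive secure aggregation of sparse vectors.
sparse = {u: top_k_sparsify(g, K) for u, g in grads.items()}

# Cluster users by identical index sets so their masked value vectors are
# aligned coordinate-by-coordinate and the pairwise masks can cancel.
clusters = defaultdict(list)
for u, (idx, _) in sparse.items():
    clusters[idx].append(u)

# The server sees only masked per-user uploads; summing a cluster removes the masks.
global_grad = np.zeros(DIM)
for idx, users in clusters.items():
    masks = pairwise_masks(users, K)
    masked_sum = sum(sparse[u][1] + masks[u] for u in users)
    global_grad[list(idx)] += masked_sum

# Sanity check: the secure aggregate matches the plain sum of sparsified gradients.
expected = np.zeros(DIM)
for idx, vals in sparse.values():
    expected[list(idx)] += vals
assert np.allclose(global_grad, expected)
print("clusters:", dict(clusters))
```

Note that a singleton cluster in this toy aggregates correctly but carries no masking at all, which would expose that user's sparsified gradient; guarding against such cases is precisely the kind of security guarantee the paper's supporting protocols are stated to provide, and is omitted here.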