Computer science
Location
Overhead (engineering)
Regret
Machine learning
Artificial intelligence
Enhanced Data Rates for GSM Evolution (EDGE)
Edge device
Selection (genetic algorithm)
Locality-sensitive hashing
Distributed computing
Hash function
Data mining
Hash table
Cloud computing
Philosophy
Linguistics
Computer security
Operating system
Authors
Qiying Pan, H Cao, Yifei Zhu, Jiangchuan Liu, Bo Li
Identifier
DOI:10.1109/tmc.2023.3323645
Abstract
Federated learning (FL) has emerged as a prominent distributed learning paradigm, enabling collaborative training of neural network models across local devices while keeping raw data local. However, FL systems often encounter significant challenges due to data heterogeneity. Specifically, non-IID datasets in FL systems substantially slow down the convergence speed during training and adversely impact the accuracy of the final model. In this paper, we introduce a novel client selection framework that judiciously leverages correlations across local datasets to accelerate training. Our framework first employs a lightweight locality-sensitive hashing algorithm to extract client features while respecting data privacy and incurring minimal overhead. We then design a novel Neural Contextual Combinatorial Bandit (NCCB) algorithm to establish relationships between client features and rewards, enabling intelligent selection of client combinations. We theoretically prove that the proposed NCCB has a bounded regret. Extensive experiments on real-world datasets further demonstrate that our framework surpasses state-of-the-art solutions, achieving a 50% reduction in training time and a 17% increase in final model accuracy, approaching the performance of the ideal IID case.
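The abstract mentions a lightweight locality-sensitive hashing step for extracting client features without exposing raw data. The sketch below is not the paper's algorithm; it is a minimal illustration, under assumptions, of the general idea using random-hyperplane (SimHash-style) signatures computed over a per-client summary statistic. The function names (`lsh_signature`, `hamming_similarity`) and the choice of the mean feature vector as the summary are hypothetical, introduced only for illustration.

```python
import numpy as np

def lsh_signature(client_data, n_planes=32, dim=None, seed=0):
    """Random-hyperplane (SimHash-style) LSH signature for one client's
    local dataset. Only a coarse summary statistic is hashed, so raw
    samples never leave the device. (Illustrative sketch, not the
    paper's method.)"""
    rng = np.random.default_rng(seed)            # shared seed -> identical hyperplanes on every client
    dim = dim or client_data.shape[1]
    planes = rng.standard_normal((n_planes, dim))
    summary = client_data.mean(axis=0)           # hypothetical summary: per-feature mean
    return (planes @ summary > 0).astype(np.uint8)  # one sign bit per hyperplane

def hamming_similarity(sig_a, sig_b):
    """Fraction of matching signature bits; a cheap proxy for how similar
    two clients' data distributions are."""
    return float(np.mean(sig_a == sig_b))

# Usage example: clients with similar data yield similar signatures,
# which a selection policy could use as a context/feature vector.
rng = np.random.default_rng(1)
client_a = rng.normal(0.0, 1.0, size=(500, 16))
client_b = rng.normal(0.1, 1.0, size=(500, 16))
sig_a = lsh_signature(client_a, dim=16)
sig_b = lsh_signature(client_b, dim=16)
print(hamming_similarity(sig_a, sig_b))
```

In a bandit-based selection framework such as the NCCB described above, signatures like these could serve as the per-client context; the exact feature construction used by the authors is specified in the paper itself.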