Computer science
MNIST database
Baseline
Raw data
Convergence
Selection
Federated learning
Layer
Machine learning
Distributed computing
Artificial intelligence
Data mining
Artificial neural network
Authors
Yu Qiao, Md. Shirajum Munir, Apurba Adhikary, Avi Deb Raha, Choong Seon Hong
Identifier
DOI: 10.1109/noms56928.2023.10154273
Abstract
Federated learning (FL) allows local clients to train a global model in cooperation with a server while ensuring that their raw data is never revealed. However, most existing works choose clients at random, regardless of their capabilities and contributions to training. Moreover, existing FL client selection mechanisms typically address only one of the two major challenges: system heterogeneity or statistical heterogeneity. This paper aims to manage both the system and statistical heterogeneity of distributed clients in the network. First, to handle system heterogeneity, an optimization objective is proposed that maximizes the number of clients with similar capabilities, such as storage, computational, and communication capabilities. Second, a network framework with a logical layer is proposed to group similar clients logically by checking their capabilities. Finally, to handle statistical heterogeneity among clients, a novel Contribution-based Dynamic Federated training strategy, called CDFed, is designed to dynamically adjust each client's selection probability based on Shapley values in every global round. Experimental results on two baseline datasets, MNIST and FMNIST, demonstrate that the proposal achieves a faster convergence rate (about 50%) and a higher average test accuracy (at least 1%) than the baselines in most cases.
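The abstract does not specify how CDFed maps Shapley values to selection probabilities, so the following is only a minimal sketch of the general idea: per-client contribution scores (e.g., approximate Shapley values from the previous global round) are converted into selection probabilities, and clients are then sampled without replacement. The softmax mapping, the `temperature` parameter, and the function names are assumptions for illustration, not the paper's actual method.

```python
import math
import random

def selection_probabilities(contributions, temperature=1.0):
    """Turn per-client contribution scores into selection probabilities.

    A softmax is assumed here: clients with higher contribution scores
    receive a proportionally higher chance of being chosen.
    """
    exps = {cid: math.exp(score / temperature)
            for cid, score in contributions.items()}
    total = sum(exps.values())
    return {cid: e / total for cid, e in exps.items()}

def select_clients(probs, k, seed=0):
    """Sample k distinct clients, weighted by their probabilities."""
    rng = random.Random(seed)
    remaining = dict(probs)
    chosen = []
    for _ in range(k):
        cids = list(remaining)
        weights = [remaining[c] for c in cids]
        pick = rng.choices(cids, weights=weights)[0]
        chosen.append(pick)
        del remaining[pick]  # sample without replacement
    return chosen

# Hypothetical example: client "c" contributed most in the last round,
# so it is the most likely to be selected for the next one.
shapley = {"a": 0.1, "b": 0.3, "c": 0.6}
probs = selection_probabilities(shapley)
print(select_clients(probs, k=2))
```

Recomputing the scores and probabilities every global round is what would make the selection "dynamic" in the sense the abstract describes: a client's chance of participating rises or falls with its measured contribution.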