Computer science
Hyperparameter
Artificial intelligence
Robustness (evolution)
Convergence (economics)
Pareto principle
Machine learning
Information retrieval
Mathematics
Mathematical optimization
Biochemistry
Chemistry
Economics
Gene
Economic growth
Authors
Zeou Hu,Kiarash Shaloudegi,Guojun Zhang,Yaoliang Yu
Source
Journal: IEEE Transactions on Network Science and Engineering
[Institute of Electrical and Electronics Engineers]
Date: 2022-07-01
Volume/Issue: 9 (4): 2039-2051
Citations: 36
Identifier
DOI: 10.1109/tnse.2022.3169117
Abstract
Federated learning has emerged as a promising, massively distributed way to train a joint deep model over large numbers of edge devices while keeping private user data strictly on device. In this work, motivated by ensuring fairness among users and robustness against malicious adversaries, we formulate federated learning as multi-objective optimization and propose a new algorithm, FedMGDA+, that is guaranteed to converge to Pareto stationary solutions. FedMGDA+ is simple to implement, has fewer hyperparameters to tune, and refrains from sacrificing the performance of any participating user. We establish the convergence properties of FedMGDA+ and point out its connections to existing approaches. Extensive experiments on a variety of datasets confirm that FedMGDA+ compares favorably against the state of the art.
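The multi-objective view treats each user's loss as a separate objective. The classical multiple-gradient descent algorithm (MGDA), which FedMGDA+ builds on, finds a common descent direction by solving a min-norm problem over the simplex: minimize ||Σ_i λ_i g_i||² subject to λ_i ≥ 0, Σ_i λ_i = 1, where g_i is user i's gradient. The sketch below solves this subproblem with Frank–Wolfe iterations; it is a minimal illustration of the MGDA step, not the paper's FedMGDA+ implementation, and the function name and step count are this sketch's own choices.

```python
def mgda_direction(grads, iters=200):
    """Frank-Wolfe solver for the MGDA subproblem:
        min_{lam in simplex} || sum_i lam_i * g_i ||^2
    Returns the simplex weights and the common descent direction
    d = -sum_i lam_i * g_i, which (at a non-Pareto-stationary point)
    decreases every user's loss simultaneously.
    `grads` is a list of per-user gradient vectors (plain lists of floats).
    """
    m, dim = len(grads), len(grads[0])
    lam = [1.0 / m] * m  # start at the simplex center
    for t in range(iters):
        # current convex combination v = sum_i lam_i * g_i
        v = [sum(lam[i] * grads[i][k] for i in range(m)) for k in range(dim)]
        # linearized objective: pick the vertex (user) with smallest <g_i, v>
        scores = [sum(g[k] * v[k] for k in range(dim)) for g in grads]
        j = scores.index(min(scores))
        step = 2.0 / (t + 2.0)  # standard Frank-Wolfe step size
        lam = [(1.0 - step) * x for x in lam]
        lam[j] += step
    v = [sum(lam[i] * grads[i][k] for i in range(m)) for k in range(dim)]
    return lam, [-x for x in v]

# Toy check: two conflicting per-user gradients.
g1, g2 = [1.0, 0.0], [0.0, 1.0]
lam, d = mgda_direction([g1, g2])
```

For the symmetric toy case the solver balances both users (λ ≈ (0.5, 0.5)), and the returned direction d has negative inner product with each g_i, i.e. a step along d reduces both users' losses, which is the fairness mechanism the abstract alludes to.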