Topics
Computer science
Task (project management)
Artificial neural network
Probabilistic logic
Key (lock)
Representation (politics)
Artificial intelligence
Machine learning
Federated learning
Distribution (mathematics)
Bayesian probability
Data mining
Mathematics
Mathematical analysis
Computer security
Management
Politics
Political science
Law
Economics
Authors
Xueyang Wu, Hengguan Huang, Youlong Ding, Hao Wang, Ye Wang, Qian Xu
Source
Journal: Proceedings of the ... AAAI Conference on Artificial Intelligence
[Association for the Advancement of Artificial Intelligence (AAAI)]
Date: 2023-06-26
Volume/Issue: 37 (9): 10399-10407
Citations: 2
Identifiers
DOI:10.1609/aaai.v37i9.26237
Abstract
Traditional federated learning (FL) algorithms, such as FedAvg, fail to handle non-i.i.d. data because they learn a global model by simply averaging biased local models that are trained on non-i.i.d. local data, and therefore fail to model the global data distribution. In this paper, we present a novel Bayesian FL algorithm that successfully handles such a non-i.i.d. FL setting by enhancing the local training task with an auxiliary task that explicitly estimates the global data distribution. One key challenge in estimating the global data distribution is that the data are partitioned in FL, and therefore the ground-truth global data distribution is inaccessible. To address this challenge, we propose an expectation-propagation-inspired probabilistic neural network, dubbed federated neural propagation (FedNP), which efficiently estimates the global data distribution given non-i.i.d. data partitions. Our algorithm is sampling-free and end-to-end differentiable, can be applied with any conventional FL framework, and learns a richer global data representation. Experiments on both image classification tasks with synthetic non-i.i.d. image data partitions and real-world non-i.i.d. speech recognition tasks demonstrate that our framework effectively alleviates the performance deterioration caused by non-i.i.d. data.
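The abstract contrasts FedNP with the FedAvg baseline, whose failure mode is that the server simply averages local models trained on non-i.i.d. partitions. The following is a minimal sketch of that baseline averaging step only, not the authors' FedNP algorithm; the function names (`local_update`, `fedavg_round`), the toy logistic-regression clients, and the covariate-shift setup are all hypothetical illustrations.

```python
# Minimal sketch of FedAvg-style parameter averaging (the baseline the abstract
# criticizes), NOT the authors' FedNP method. Names and toy data are hypothetical.
import numpy as np

def local_update(global_params, X, y, lr=0.1, epochs=5):
    """One client's local training: logistic regression via gradient descent."""
    w = global_params.copy()
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))       # sigmoid predictions
        w -= lr * X.T @ (p - y) / len(y)       # gradient step on local data only
    return w

def fedavg_round(global_params, clients):
    """Server step: average the biased local models, weighted by data size."""
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    local = [local_update(global_params, X, y) for X, y in clients]
    return np.average(local, axis=0, weights=sizes / sizes.sum())

# Two clients with non-i.i.d. (covariate-shifted) data partitions.
rng = np.random.default_rng(0)
d = 5
true_w = rng.normal(size=d)
clients = []
for shift in (-2.0, 2.0):                      # each client sees a shifted region
    X = rng.normal(loc=shift, size=(200, d))
    y = (X @ true_w > 0).astype(float)
    clients.append((X, y))

w = np.zeros(d)
for _ in range(10):
    w = fedavg_round(w, clients)
print("global weights after 10 rounds:", np.round(w, 3))
```

In this sketch the averaged global model never sees the pooled data distribution; FedNP, as described in the abstract, would instead add an auxiliary task at each client that estimates the global data distribution during local training.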