Topics: MNIST database, Computer science, Context (archaeology), Artificial intelligence, Enhanced Data Rates for GSM Evolution (EDGE), Machine learning, Data mining, Data modeling, Artificial neural network, Theoretical computer science, Algorithm, Database, Biology, Paleontology
Authors
Jun Bai, Atul Sajjanhar, Yong Xiang, Xiaojun Tong, Shan Zeng
Identifiers
DOI: 10.1109/ijcnn55064.2022.9892851
Abstract
Federated Learning (FL) offers a distributed machine learning paradigm in which a global model is collaboratively learned through edge devices without violating data privacy. However, intrinsic data heterogeneity in the federated network can induce model heterogeneity, posing a great challenge to server-side model aggregation. Existing FL algorithms widely adopt model-wise weighted averaging of client models to generate the new global model, which weights each client model as a whole and ignores the differing importance of individual parameters across client models. In this paper, we propose a novel parameter-wise elastic weighted averaging aggregation approach to realize the rapid fusion of heterogeneous client models. Specifically, each client evaluates the importance of the model's internal parameters during the model update and obtains a corresponding parameter importance coefficient vector; the server then performs weighted averaging for each parameter individually, based on these importance coefficient vectors, to aggregate a new global model. Extensive experiments on the MNIST and CIFAR-10 datasets with diverse network architectures and hyper-parameter combinations show that our proposed algorithm outperforms existing state-of-the-art FL algorithms in heterogeneous model fusion.
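The abstract does not spell out the aggregation rule, so the following is only a minimal sketch of what the server-side parameter-wise weighted averaging step could look like, contrasted with ordinary model-wise averaging. The function and variable names (aggregate_parameter_wise, importance_vectors) are hypothetical, and how each client actually derives its importance coefficients is the paper's contribution and is not reproduced here.

import numpy as np

def aggregate_parameter_wise(client_params, importance_vectors, eps=1e-12):
    # client_params:      list of 1-D arrays, flattened parameters of each client model
    # importance_vectors: list of 1-D arrays, one (hypothetical) importance
    #                     coefficient per parameter, as reported by each client
    params = np.stack(client_params)           # shape (n_clients, n_params)
    weights = np.stack(importance_vectors)     # shape (n_clients, n_params)
    # Normalize importance per parameter so each column's weights sum to 1,
    # then average each parameter individually across clients.
    weights = weights / (weights.sum(axis=0, keepdims=True) + eps)
    return (weights * params).sum(axis=0)

# Toy usage: two clients, three parameters. Parameters where a client reports
# higher importance pull the global value toward that client's estimate.
clients = [np.array([1.0, 2.0, 3.0]), np.array([3.0, 0.0, 1.0])]
importance = [np.array([0.9, 0.1, 0.5]), np.array([0.1, 0.9, 0.5])]
print(aggregate_parameter_wise(clients, importance))  # -> approximately [1.2, 0.2, 2.0]

By contrast, model-wise averaging would apply a single scalar weight per client to all of its parameters; the sketch above differs only in letting that weight vary per parameter.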