Federated learning
Computer science
Inference
Synchronization (communication)
Ensemble learning
Machine learning
Autonomy
Artificial intelligence
Data mining
Telecommunications
Channel (broadcasting)
Political science
Law
Authors
Meng Chen, Hengzhu Liu, Huanhuan Chi, Ping Xiong
Source
Journal: IEEE Transactions on Sustainable Computing
[Institute of Electrical and Electronics Engineers]
Date: 2024-01-05
Volume/Issue: 9 (4): 591-601
Identifier
DOI: 10.1109/tsusc.2024.3350040
Abstract
Multi-party collaborative learning has become a paradigm for large-scale knowledge discovery in the era of big data. As a typical form of collaborative learning, federated learning (FL) has received widespread research attention in recent years. In practice, however, FL faces a range of challenges, such as objective inconsistency and communication and synchronization issues, arising from the heterogeneity of the clients' local datasets and devices. In this paper, we propose EnsembleFed, a novel ensemble framework for heterogeneous FL. The proposed framework first allows each client to train a local model with full autonomy, without having to account for the heterogeneity of local datasets. The confidence scores of training samples output by each local model are then perturbed to defend against membership inference attacks, after which they are submitted to the server for use in constructing the global model. We apply a GAN-based method to generate calibrated noise for confidence perturbation. Benefiting from the ensemble framework, EnsembleFed is freed from the requirement of real-time synchronization and achieves collaborative learning with lower communication costs than traditional FL. Experiments on real-world datasets demonstrate that the proposed EnsembleFed can significantly improve the performance of the global model while also effectively defending against membership inference attacks.
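The workflow described in the abstract (autonomous local training, confidence perturbation, server-side ensembling) can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's implementation: the GAN-calibrated noise is replaced here by simple Gaussian noise as a labeled stand-in, and the client scores, function names, and noise scale are all hypothetical.

```python
import math
import random

def softmax(scores):
    # Convert a client's raw class scores into confidence probabilities.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def perturb(confidences, rng, scale=0.05):
    # Stand-in for the paper's GAN-generated calibrated noise: add small
    # Gaussian noise and re-normalize (illustrative assumption only).
    noisy = [max(c + rng.gauss(0.0, scale), 1e-6) for c in confidences]
    total = sum(noisy)
    return [n / total for n in noisy]

def server_ensemble(client_confidences):
    # The server averages the perturbed per-class confidences across
    # clients and predicts the class with the highest mean confidence.
    n_clients = len(client_confidences)
    n_classes = len(client_confidences[0])
    avg = [sum(c[k] for c in client_confidences) / n_clients
           for k in range(n_classes)]
    return max(range(n_classes), key=lambda k: avg[k]), avg

# Three hypothetical clients score one sample over three classes.
rng = random.Random(0)  # fixed seed for a reproducible sketch
raw = [[2.0, 0.5, 0.1], [1.8, 0.7, 0.2], [2.2, 0.4, 0.3]]
perturbed = [perturb(softmax(r), rng) for r in raw]
label, avg_conf = server_ensemble(perturbed)
print(label)
```

Because only perturbed confidence vectors travel to the server, clients never exchange model parameters, which is what lets the framework drop real-time synchronization and reduce communication relative to gradient-exchanging FL.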