Authors
Jingke Tu, Jiaming Huang, Lei Yang, Wan‐Yu Lin
Source
Journal: ACM Transactions on Knowledge Discovery From Data
[Association for Computing Machinery]
Date: 2024-02-13
Volume/Issue: 18 (4): 1-21
Abstract
Federated learning enables multiple clients to collaboratively train machine learning models in a privacy-preserving manner. In real-world scenarios, however, a key challenge in federated learning is the statistical heterogeneity among clients. Existing work mainly focuses on a single global model shared across the clients, which struggles to generalize well to all clients because of the large discrepancies in their data distributions. To address this challenge, we propose pFedLT, a novel approach that adapts the single global model to different data distributions. Specifically, we propose to perform a pluggable layer-wise transformation during the local update phase, based on scaling and shifting operations that are learned with a meta-learning strategy. By doing so, pFedLT captures the diversity of data distributions among clients and therefore generalizes well even when the data distributions exhibit high statistical heterogeneity. We conduct extensive experiments on synthetic and real-world datasets (MNIST, Fashion-MNIST, CIFAR-10, and Office+Caltech10) under different non-IID settings. Experimental results demonstrate that pFedLT significantly improves model accuracy by up to 11.67% and reduces communication costs compared with state-of-the-art approaches.
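To make the core idea concrete, the sketch below illustrates a layer-wise scale-and-shift transformation of the kind the abstract describes: a per-unit scale and shift applied to a frozen global layer's activations, which a client can adapt locally. This is a hypothetical minimal illustration in numpy; the function and variable names, the single gradient step, and the least-squares objective are our own illustrative assumptions, not the authors' pFedLT implementation (which learns these operations with a meta-learning strategy).

```python
import numpy as np

def transform(h, gamma, beta):
    # Element-wise scaling and shifting of one layer's activations
    # (illustrative stand-in for a pluggable layer-wise transformation).
    return gamma * h + beta

rng = np.random.default_rng(0)
h = rng.normal(size=(4, 3))   # activations from a frozen global layer (hypothetical)
gamma = np.ones(3)            # per-unit scale, initialized to the identity
beta = np.zeros(3)            # per-unit shift, initialized to the identity

# With identity initialization, the global model's behavior is unchanged.
assert np.allclose(transform(h, gamma, beta), h)

# A client adapts only (gamma, beta) to its local data; here a single
# gradient-style step on a toy least-squares objective toward target t
# stands in for the meta-learned local update.
t = rng.normal(size=(4, 3))
lr = 0.1
loss_before = np.mean((transform(h, gamma, beta) - t) ** 2)
err = transform(h, gamma, beta) - t
gamma -= lr * (err * h).mean(axis=0)
beta -= lr * err.mean(axis=0)
loss_after = np.mean((transform(h, gamma, beta) - t) ** 2)
```

Because only the small `(gamma, beta)` parameters are client-specific while the global layer weights stay shared, such a transformation can personalize the model without communicating full per-client models.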