MNIST database
Differential privacy
Computer science
Selection (genetic algorithm)
Process (computing)
Federated learning
Machine learning
Layer (electronics)
Artificial intelligence
Data mining
Deep learning
Operating system
Organic chemistry
Chemistry
Author
Zhuotao Lian,Weizheng Wang,Chunhua Su
Source
Venue: International Conference on Communications
Date: 2021-06-01
Citations: 23
Identifier
DOI:10.1109/icc42927.2021.9500632
Abstract
Federated learning can collaboratively train a global model without gathering clients' private data. Many works reduce communication cost by designing client selection methods or averaging algorithms, but these only decide whether a client participates at all, so training time cannot be reduced: the size of each client's update is unchanged. We propose COFEL, a novel federated learning system that both reduces communication time through layer-based parameter selection and strengthens privacy protection by applying a local differential privacy mechanism to the selected parameters. We present the COFEL-AVG algorithm for global aggregation and design a layer-based parameter selection method that picks the most valuable parameters for aggregation, optimizing the communication and training process; since only the selected part is transferred, the update size shrinks. We compare COFEL with a traditional federated learning system and with CMFL, which also applies parameter selection but at the model level, running experiments on MNIST, Fashion-MNIST, and CIFAR-10 to verify COFEL's effectiveness. The results show that COFEL improves accuracy by up to 22.8% over CMFL on CIFAR-10, and reduces the training time needed to reach 0.85 accuracy by around 20% and 48% compared with traditional FL and CMFL, respectively, on Fashion-MNIST.
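The abstract's core idea, layer-based parameter selection with local differential privacy on the selected layers, can be sketched as below. This is a minimal illustration, not the paper's implementation: the layer-scoring rule (largest update norm), the Laplace perturbation, and the helper names (`select_layers`, `ldp_perturb`, `cofel_avg`) are all assumptions made for the sketch.

```python
import numpy as np

def select_layers(update, k):
    """Pick the k layers whose updates have the largest L2 norm.
    Illustrative criterion -- the paper's actual scoring rule may differ."""
    norms = {name: np.linalg.norm(delta) for name, delta in update.items()}
    return sorted(norms, key=norms.get, reverse=True)[:k]

def ldp_perturb(delta, clip=1.0, epsilon=1.0):
    """Clip a layer update and add Laplace noise for local DP (assumed mechanism)."""
    norm = np.linalg.norm(delta)
    if norm > clip:
        delta = delta * (clip / norm)
    scale = 2.0 * clip / epsilon  # Laplace scale for sensitivity 2*clip
    return delta + np.random.laplace(0.0, scale, size=delta.shape)

def client_update(update, k=2, epsilon=1.0):
    """Upload only the k selected layers, each perturbed under LDP,
    so the transferred message is smaller than the full model update."""
    chosen = select_layers(update, k)
    return {name: ldp_perturb(update[name], epsilon=epsilon) for name in chosen}

def cofel_avg(client_msgs, global_params):
    """Layer-wise aggregation sketch: each layer is averaged over the
    clients that actually uploaded it; untouched layers keep their values."""
    new_params = {name: p.copy() for name, p in global_params.items()}
    for name in global_params:
        uploads = [msg[name] for msg in client_msgs if name in msg]
        if uploads:
            new_params[name] = global_params[name] + np.mean(uploads, axis=0)
    return new_params
```

A client would call `client_update` on its local model delta and send the result; the server collects these partial, noised updates and applies `cofel_avg`, which mirrors how transferring only selected layers cuts the per-round update size.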